Nvidia 30 Series unveiled - RTX 3080 2x faster than 2080 for $699

illegalwater
15 minutes ago, Tech Enthusiast said:

Well, if the performance increase is irrelevant, but the NAMES on the products are important to you... then I guess we have different ways of judging GPUs.

 

Just because a GPU is called 3090 and not Titan means THAT is the top dog? Even though Jensen said it is the Titan replacement? If it had been called Titan, you would be all happy, because then the top dog would have been the $700 3080?

 

Basically, you dislike the situation because the thing is not called Titan? 

You're missing part of my original comment.

 

The performance difference between the 'Top dog' gaming card and the Titan of that generation was always under 10%, damn near identical in most cases.

 

From Nvidia's announcement, the 3090 is not an insignificant amount faster than the 3080.

And as already explained very plainly....

 

IF the 3090 is indeed a Titan, and thus there should be no 'Titan' released this generation, then we are still not seeing the real top dog GPU from Nvidia, because the 3080 is not close enough in performance to the 'Titan' (3090).

 

Alternatively.

 

The 3090 is NOT a Titan and is the top dog gaming card, and there will be an actual Titan released in the future that will be closer to the 3090 in performance than the current 3080 is. BUT at that point the 3090 is, just like the 2080 Ti, way overpriced.

 

The names matter because they help place a GPU in the appropriate tier, and the tier matters because it dictates price. The performance should always be higher than the previous generation's card in that tier.

 

I can't explain it any better than this, so I'll leave it here.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


1 hour ago, SolarNova said:

I see a lot of comments about how 'well' the 30 series is priced, and many YouTubers commenting on how much performance gain there is 'for the price'.

 

-snip-

Does nobody remember how expensive graphics cards have been, especially when you take inflation into account? Plenty of cards would be in the $700-1000 range today. RTX 20 pricing wasn't great, but it shouldn't have been a surprise given how much cards have cost deep into the past, to say nothing of the fact that you can thank AMD for the lack of competition.



Just now, AlwaysFSX said:

Does nobody remember how expensive graphics cards have been, especially when you take inflation into account? Plenty of cards would be in the $700-1000 range today. RTX 20 pricing wasn't great, but it shouldn't have been a surprise given how much cards have cost deep into the past, to say nothing of the fact that you can thank AMD for the lack of competition.

Inflation doesn't result in a sudden price jump from $700 to $1200+.

IIRC in 2006 the 8800 GTX (of which I owned 2) cost ~$500 each.

Even if you go by inflation, that results in ~$665 today. Not $1000+.

Same goes for the GTX 480, which was ~$500 in 2010; that's ~$600 today.

The 780 Ti of 2013, at a higher launch price point of $700, is ~$770 today. Not $1000+.

Inflation over the past 20 years doesn't account for top tier (top dog) GPUs being priced over $1000.
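For anyone who wants to check the arithmetic, it's just a ratio of CPI levels. A minimal sketch; the CPI values below are approximate US annual averages (assumptions, and the exact results depend on which CPI series and end year you use):

```python
# Approximate US CPI-U annual averages (assumed, rounded values).
CPI = {2006: 201.6, 2010: 218.1, 2013: 233.0, 2020: 258.8}

def adjust(price, year_from, year_to=2020):
    """Scale a historical launch price by the ratio of CPI levels."""
    return price * CPI[year_to] / CPI[year_from]

print(round(adjust(500, 2006)))  # 8800 GTX: ~$642
print(round(adjust(500, 2010)))  # GTX 480:  ~$593
print(round(adjust(700, 2013)))  # 780 Ti:   ~$777
```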



2 minutes ago, SolarNova said:

Inflation doesn't result in a sudden price jump from $700 to $1200+.

-snip-

https://www.idgcdn.com.au/products/image/3636/angle/8/614x480/125270/



11 minutes ago, AlwaysFSX said:

https://www.idgcdn.com.au/products/image/3636/angle/8/614x480/125270/

Yes, I know what that is :P

 

So what? You bring up the 8800 Ultra; there's a reason it was laughed at back then. It was a pre-overclocked 8800 GTX costing something like $800 at launch, and its value tanked quickly.

It is not a representative example.



15 minutes ago, SolarNova said:

Inflation doesn't result in a sudden price jump from $700 to $1200+.

-snip-

Compare phones and you will see the same trend: top tier products are supposed to be out of the majority's reach unless they splurge.


23 hours ago, porina said:

 

DLSS 2.0 - we've already seen this claimed in past presentations. I said it then and will say it again. Game benchmarks I think will be complicated going forwards as there will be quality and performance comparisons to do. NV 4k DLSS vs NV 4k native vs AMD 4k native vs AMD 4k + whatever they might have.

 

 

The problem is that DLSS is very easy to optimize against a benchmark, so I would suggest any benchmark run with the card test three configurations: DLSS off with RT on, RT and DLSS both off, and both on, as sketched below.
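A minimal sketch of that test matrix; the config names and the run_game callable are illustrative placeholders, not any real benchmark API:

```python
# Illustrative benchmark matrix for the three suggested configurations.
CONFIGS = [
    {"rt": True,  "dlss": False},  # RT cost in isolation, no upscaling
    {"rt": False, "dlss": False},  # pure raster baseline
    {"rt": True,  "dlss": True},   # vendor best case; image quality must be compared separately
]

def run_matrix(run_game):
    """Return (config, average FPS) pairs from a benchmark callable."""
    return [(cfg, run_game(**cfg)) for cfg in CONFIGS]
```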


5 hours ago, porina said:

It'll come down to the use case. Having more of something if you don't need/use it is kinda pointless. I think until games are routinely using more VRAM, that'll be under-utilised by most. Bandwidth might have more of a short term performance impact.

RAM requirements are expected to rise significantly soon, so it's something to keep an eye out for; we should know whether it will happen or not in a year's time.

5 hours ago, Kilrah said:

People are just worried 10GB might not be enough anymore 2 years down the line. 

Yeah, with games like Star Citizen and those coming from the new consoles having so much more that needs to be loaded at the same time, we need faster IO, and until we have it, more VRAM/RAM will be the solution.

4 hours ago, NE0XY said:

Any chance ASUS Poseidon returns this generation? 

Nvidia has been restricting AIBs more and more, so those kinds of cards are getting rarer. They used to be more than simply a card with a fancy cooler: they were a binned die on a fancy PCB with extra power, TDP limits could be removed, voltage could be increased significantly over stock, etc. Kingpin seems to be able to do it for some reason, though, so there's that.

2 hours ago, spartaman64 said:

there were rumors that big navi is 40-50% faster than 2080 ti

And there is evidence of at least 30% over the 2080 Ti in an OpenVR benchmark, though you have to have the "game" to see the result.

1 hour ago, Tech Enthusiast said:

Kinda strange assertion, don't you think?

 

I mean the reason we (the customers) want competition is to get better products for better prices.

Now you are claiming NVidia offering a great product for a great price is.... bad for competition? That does not check out for me.

 

We can be pleased that NVidia is not abusing their monopoly as much as they could. Instead, they beat themselves up by basically deleting every reason to buy a Turing GPU. That is something we want from AMD... but why do we need them if NVidia is doing it themselves? 😉

They aren't doing it themselves; they are responding to the consoles and to what the rumors point Big Navi to be. Nvidia has a habit of doing this all the time: just before AMD launches a card, they find a way to counteract it the best they can, with things like the Super series; the first xx80 Ti was a response to the 290X, etc.

1 hour ago, porina said:

Big navi doesn't (necessarily) get any major uplift in process compared to existing navi. They will have to focus more on architecture. Let's see what they bring.

 

I'll have to test it out some time.

 

Apologies for my writing style. I was saying that the only system I know of where I had passive conversion from DP to HDMI was a Dell desktop, which had a pair of DP++ ports on it. Kinda annoying, as the only displays I had around me were native DVI and HDMI. This was in a corporate environment.

 

Traditional TV and related content (VHS, DVD) were not mastered to be shown in full. There could be unwanted stuff on the edges so TVs overscanned to hide them. 

They should have an edge over Nvidia though, as Samsung's node is behind, and judging from Xbox Series X performance and power, AMD's claim of 50% better perf/W seems to have panned out.


58 minutes ago, SolarNova said:

The performance difference between the 'top dog' gaming card and the Titan of that generation was always under 10%, damn near identical in most cases.

-snip-

 

 

NVIDIA lists the 3090 as supporting Studio Drivers, like the Titans and Quadros do. Looking at shaders alone, the 3090 should be around 20% faster than the 3080.

I expect there might be a 3080 Super or 3080 Ti to slot in between the two and give more than 10GB of VRAM.
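As a rough sanity check on that 20% figure, here is the naive shader-count scaling (CUDA core counts from Nvidia's spec pages; this ignores clocks and memory bandwidth, so real games won't scale this cleanly):

```python
# Naive shader-count scaling between the two announced cards.
cores_3090 = 10496  # RTX 3090 CUDA cores
cores_3080 = 8704   # RTX 3080 CUDA cores

uplift = cores_3090 / cores_3080 - 1
print(f"~{uplift:.1%} more shaders")  # ~20.6%
```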

https://www.nvidia.com/en-gb/geforce/graphics-cards/30-series/rtx-3090/


 

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


On 9/1/2020 at 10:03 AM, nathanyule said:

My first thoughts when I saw the cards.

I'd throw a water block on that thing. Still have my Hydro Copper 1080; need to upgrade.

Main Rig: Cpu: AMD Ryzen 9 5950x @ 4.60Ghz 1.2V | Motherboard: ASUS CROSSHAIR VIII FORMULA | RAM: 32GB Corsair VENGEANCE DDR4 2133Mhz | GPU: Powercolor RADEON RX6900XT Liquid Devil  | Case: XFORMA MBX MKII | Storage: Samsung 840 256GB/Samsung 860 EVO 1TB | PSU: Corsair Hx850i | Cooling: Custom loop with gentle typhoons | 

 

So hype aside, when do we expect the *real* cards to hit, i.e. aftermarket models and the Super/Ti ones?

 

Because surely NV knows these low VRAM configurations won't cut it anymore once the new consoles are released...

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


2 hours ago, Mark Kaine said:

So hype aside, when do we expect the *real* cards to hit, i.e. aftermarket models and the Super/Ti ones?

Because surely NV knows these low VRAM configurations won't cut it anymore once the new consoles are released...

 

 

I'm just hoping that AMD competes directly with these cards and undercuts the pricing, especially in the mid-range category, so Nvidia has to respond by either giving more performance at the same price (Super lineup) or reducing the price to compete with AMD's.

 

This probably won't happen because AMD can't seem to get their head in the game for GPUs, but I am hoping that I'm jinxing it. 


4 hours ago, porina said:

 

Traditional TV and related content (VHS, DVD) were not mastered to be shown in full. There could be unwanted stuff on the edges so TVs overscanned to hide them. 

The reverse is actually true. CRTs were never edge-to-edge in the first place, so VHS, DVD, video game consoles, and the like were typically designed not to put anything important in the overscan area.

 

From NESDEV:

[Safe_areas.png: safe-area diagram]

 

 

https://eks.tv/title-safe-still-matters/

title safe and action safe

 

Basically, anything important in a video you produce needs to be within the "title safe" area to prevent it from being cropped by overscan or pillar-boxing (4:3 on a 16:9 screen).

 

Mobile phones don't all have 16:9 screens (like the iPhone), and tablets are rarely 16:9 because they're too darn awkward to hold that way. Tablets are designed with an aspect ratio closer to that of printed paper (the iPad Pro is larger than an A4 page) because they're intended to be used like that.

 

In the case of video game consoles, you usually need all the UI stuff within the title safe area, because if you let the game put stuff outside it, it might not be visible whenever overscan is on (which it is by default on HDMI TVs, and some GPUs even assume this).
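As a rough illustration, safe areas are just centered fractions of the frame. The classic analog rules of thumb were about 90% for action safe and 80% for title safe; those percentages are assumptions here, and HD standards use larger fractions:

```python
def safe_area(width, height, fraction):
    """Return (x, y, w, h) of a centered rectangle covering `fraction` of each axis."""
    w, h = int(width * fraction), int(height * fraction)
    return ((width - w) // 2, (height - h) // 2, w, h)

# Assumed classic analog percentages, applied to a 1080p frame.
print(safe_area(1920, 1080, 0.90))  # action safe: (96, 54, 1728, 972)
print(safe_area(1920, 1080, 0.80))  # title safe:  (192, 108, 1536, 864)
```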

 


Just saw the 3070 Ti info; will there be a 3080 Ti soon as well?

Desktop: 7800x3d @ stock, 64gb ddr4 @ 6000, 3080Ti, x670 Asus Strix

 

Laptop: Dell G3 15 - i7-8750h @ stock, 16gb ddr4 @ 2666, 1050Ti 


The RTX 3070 has an interesting fan design. You can see in its reflection that the rear fan is open too, like the others, but the front fan is located on the same side as the rear fan. So either the 3070 is using the traditional way of cooling, like all cards now, or the rear fan is pull and the front fan is push, with the front fan pushing air out through the card's exhaust and the rear fan helped by bottom intake fans at the front of the case.

[attached images: RTX 3070 and RTX 3080]

 

@porina
Jensen did kind of emphasize "The more you buy, the more you save", but this time with RTX. :D

Intel Xeon E5 1650 v3 @ 3.5GHz 6C:12T / CM212 Evo / Asus X99 Deluxe / 16GB (4x4GB) DDR4 3000 Trident-Z / Samsung 850 Pro 256GB / Intel 335 240GB / WD Red 2 & 3TB / Antec 850w / RTX 2070 / Win10 Pro x64

HP Envy X360 15: Intel Core i5 8250U @ 1.6GHz 4C:8T / 8GB DDR4 / Intel UHD620 + Nvidia GeForce MX150 4GB / Intel 120GB SSD / Win10 Pro x64

 

HP Envy x360 BP series Intel 8th gen

AMD ThreadRipper 2!

5820K & 6800K 3-way SLI mobo support list

 


Those cards have the power connector smack in the center. Bye bye, cable management and aesthetic finishes, because now you'll have a cable running straight down the center of your tempered glass side panel!


 


On 9/1/2020 at 11:57 AM, porina said:

Just saw on the Asus cards, two HDMI connectors. YES! What took them so long? Too many DP before.

My Asus RTX 2070 has 2x HDMI


 


31 minutes ago, NumLock21 said:

Those cards have the power connector smack in the center. Bye bye, cable management and aesthetic finishes, because now you'll have a cable running straight down the center of your tempered glass side panel!

Yup, that's pretty dumb...

 

Btw do we know if those are 2 or 3 slots now (besides the 3090)?

 

I don't think I can fit a 3 slot card in my PC...  

 

I mean I could but then I would lose my super awesome extra GPU cooling fan :(

 

 

[attached photo: case interior]



6 minutes ago, Mark Kaine said:

-snip-

Btw do we know if those are 2 or 3 slots now (besides the 3090)?

-snip-

I see you have an mATX case, so that's 4 slots in total. A triple slot card leaves you with a 1 slot gap between that bottom case fan and the fans on the GPU.


 


The RTX 3090 has 2 extra connectors on top, similar to AMD's original CrossFire, but they never mentioned what they're for; I assume it's still SLI. The RTX 3070 and 3080 don't have those connectors, or they never bothered to show them.


 


11 minutes ago, NumLock21 said:

I see you have an mATX case, so that's 4 slots in total. A triple slot card leaves you with a 1 slot gap between that bottom case fan and the fans on the GPU.

Yeah, I figured, but often the fans seem to extend this by a large margin. So yeah, I think it would fit, but it's cutting it close, and I'd lose my bottom intake fan (it's only there to cool the GPU, because 61°C is too hot for me; I prefer 55-59°C), but I suppose it cools the rest of the PC a little bit as well.

I'm just saying, I'd really prefer a 2 slot card, but how realistic that is for a 3070/80/Ti/Super is the question.



Just now, Mark Kaine said:

-snip-

I'm just saying, I'd really prefer a 2 slot card, but how realistic that is for a 3070/80/Ti/Super is the question.

You can wait for AIBs and hope they make a dual slot 3090.


 


1 hour ago, Kisai said:

The reverse is actually true. CRTs were never edge-to-edge in the first place, so VHS, DVD, video game consoles, and the like were typically designed not to put anything important in the overscan area.

I'm confused. You're describing what I was saying. Maybe we're saying the same thing from different perspectives.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


15 hours ago, F.E.A.R. said:

It wasn't worth it if you had a GTX 1080 Ti; the RTX 2080 Ti is around 23% faster than the GTX 1080 Ti. But I get that there are people who don't care about value and will buy it anyway. The RTX 2000 series was just Nvidia experimenting with their RT tech and overpricing the cards because of no competition. And yeah, the RTX 3090 is overpriced as hell. Even if it's a behemoth of a card, we can't justify its price.

You mean you can't justify the price. If the 3080 can run at 4K 144Hz no problem, even in games like Cyberpunk 2077, then sure, the 3090 probably isn't worth it to me. But if it requires a 3090 to run at 4K 144Hz, then yeah, I will pick it up, and the cost is justified because it allows me to play at the resolution and framerates I desire.

