
NVIDIA GeForce 3070/3080/3080 Ti (Ampere): RTX Has No Perf Hit & x80 Ti Card 50% Faster in 4K! (Update 6 ~ Specs / Overview / Details)

23 hours ago, RejZoR said:

Well, 30% on top of the RTX 2080 Ti isn't all that bad. What will be bad is the price. There is no way it'll be cheaper than what it's replacing, because NVIDIA knows they'll sell them regardless of price. As for price/performance, NVIDIA makes them, just not at the top end. The RTX 3060 and RTX 3070 are where the value will be. It has always been this way.

Sure, the performance is good. But Nvidia could easily make a GPU that is 2x faster as well; they just don't need to, so they don't. Considering how LONG ago the 20 series launched, it's ridiculous! Two years and no innovation! A 30% performance increase in 2+ years is actually BAD. Or at least it used to be.

 

The real price/performance used to be the 1060/1070. But even the prices of the xx60 and xx70 have been creeping up. The xx60 used to be 200-250 USD and the xx70 was 350. Now the xx60 is becoming 350 and the xx70 450+. Maybe the 3060 will be $399 and the 3070 $499. I would not be surprised in the least! That used to be close to xx80 Ti pricing!!

 

The real value is the RX 5700 series, in my opinion. Nvidia no longer offers value products, just expensive and more expensive. Sadly, their "experiment" of selling the 2080 Ti at 1200 USD showed them that people will pay whatever they ask. All the people who bought 2080 Tis should be ashamed.


16 minutes ago, maartendc said:

Sure, the performance is good. But Nvidia could easily make a GPU that is 2x faster as well; they just don't need to, so they don't. Considering how LONG ago the 20 series launched, it's ridiculous! Two years and no innovation! A 30% performance increase in 2+ years is actually BAD. Or at least it used to be.

 

The real price/performance used to be the 1060/1070. But even the prices of the xx60 and xx70 have been creeping up. The xx60 used to be 200-250 USD and the xx70 was 350. Now the xx60 is becoming 350 and the xx70 450+. Maybe the 3060 will be $399 and the 3070 $499. I would not be surprised in the least! That used to be close to xx80 Ti pricing!!

 

The real value is the RX 5700 series, in my opinion. Nvidia no longer offers value products, just expensive and more expensive. Sadly, their "experiment" of selling the 2080 Ti at 1200 USD showed them that people will pay whatever they ask. All the people who bought 2080 Tis should be ashamed.

Back in the day I had a plan to buy a GTX 1080 to replace my GTX 980, which was having problems. Then I realized the GTX 1080 Ti offered a massive performance boost for like 200€ more, IIRC, so I thought: why not shell out the extra 200€ now and keep the GTX 1080 Ti longer instead of changing cards in between? The GTX 1080 Ti launched in March 2017; I bought it around May/June the same year. I still have it 3 years later and feel absolutely no need to upgrade, because a) RTX is hardly used in any game and b) the performance uplift in regular non-RTX games is basically insignificant. The only card that even rivals my aging 1080 Ti is the 2080 Ti, and even there the uplift is so small I can't be bothered.

 

Now, things might change with Cyberpunk 2077, which might be the only game in many years that could make me want to buy a card with ray tracing support. It'll be a slower-paced game where you can really appreciate all the visual details, and it might be the only reason to buy something ray-tracing capable. But outside of that, there's no real incentive, especially not at the prices they're selling them for. Who knows, maybe AMD will pull a rabbit out of their hat with the RX 6000 series and I'll go with AMD again. They'll both present their new cards at the end of July or in August, and we'll see.


36 minutes ago, RejZoR said:

Back in the day I had a plan to buy a GTX 1080 to replace my GTX 980, which was having problems. Then I realized the GTX 1080 Ti offered a massive performance boost for like 200€ more, IIRC, so I thought: why not shell out the extra 200€ now and keep the GTX 1080 Ti longer instead of changing cards in between? The GTX 1080 Ti launched in March 2017; I bought it around May/June the same year. I still have it 3 years later and feel absolutely no need to upgrade, because a) RTX is hardly used in any game and b) the performance uplift in regular non-RTX games is basically insignificant. The only card that even rivals my aging 1080 Ti is the 2080 Ti, and even there the uplift is so small I can't be bothered.

I hear you. The 1080Ti is a beast.

 

I am still running a 980 Ti, which is still roughly equivalent to an RTX 2060. It can still play everything at 1440p 60 fps, except Red Dead Redemption 2, which gets around 45 fps at 1440p. Only in the very newest, most demanding games is it starting to show its age.

 

I might actually buy a used 1080ti or 2080Ti when the 3000 series launches or when Cyberpunk comes out.


3 hours ago, maartendc said:

I hear you. The 1080Ti is a beast.

 

I am still running a 980 Ti, which is still roughly equivalent to an RTX 2060. It can still play everything at 1440p 60 fps, except Red Dead Redemption 2, which gets around 45 fps at 1440p. Only in the very newest, most demanding games is it starting to show its age.

 

I might actually buy a used 1080ti or 2080Ti when the 3000 series launches or when Cyberpunk comes out.

Two things:

 

1 - The 980 Ti's equivalent isn't a 2060. The 980 Ti is roughly equivalent to the 1070, and the 2060 is ~15% faster than the 1070.

 

2 - I highly doubt you are getting 60 FPS @ 1440p in any modern game unless you have all the settings at their lowest.


7 hours ago, maartendc said:

I hear you. The 1080Ti is a beast.

 

I am still running a 980 Ti, which is still roughly equivalent to an RTX 2060. It can still play everything at 1440p 60 fps, except Red Dead Redemption 2, which gets around 45 fps at 1440p. Only in the very newest, most demanding games is it starting to show its age.

 

I might actually buy a used 1080ti or 2080Ti when the 3000 series launches or when Cyberpunk comes out.

hm yeah, agree with the above. you do know the 980ti is basically a 1070, right? well, a bit worse. and that the 2060 is about a 1080, right?

and i strongly agree with the guy up top: even older games will not get you 60 fps at 1440p unless it's indie titles or whatever lol
why buy a used 1080ti when you can get a new 2070 super with around the same performance? with warranty and all. that makes no sense.

also.. i don't get the complaints about prices.

here in the netherlands the 1070 at launch cost about 420-460 ish.
and now the 2070 super goes for around 530, which is around the perf of a 1080ti, which launched around here for 700-750 bucks. so 230 bucks cheaper for the same performance. wanna talk about your card? the 980ti was 680-740 bucks here at launch, and you could get the equivalent or, well, slightly better 1070 for a lot cheaper.

So FFS stop it with the price complaining. It's stupid.
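for anyone who wants to check that arithmetic, here's a rough sketch in python. all prices are my ballpark dutch launch prices from memory (midpoints of the ranges above), and the relative-performance numbers are rules of thumb (2070 super ~ 1080 ti class, 1070 ~ 70% of one), not benchmarks:

```python
# Back-of-the-envelope price/performance check for the comparison above.
# Prices are rough Dutch launch prices in EUR; relative performance is a
# rule of thumb with the 1080 Ti pinned at 1.0.
cards = {
    "GTX 1080 Ti (2017)":    (725, 1.00),
    "RTX 2070 Super (2019)": (530, 1.00),  # assumed roughly 1080 Ti class
    "GTX 1070 (2016)":       (440, 0.70),  # assumed ~70% of a 1080 Ti
}

for name, (price_eur, rel_perf) in cards.items():
    print(f"{name}: {price_eur / rel_perf:.0f} EUR per unit of 1080 Ti perf")
```

on those (rough) numbers the 2070 super comes out roughly 200 EUR cheaper than a launch 1080 ti for the same class of performance, which is the whole point.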

PC:
MSI B450 Gaming Pro Carbon AC (motherboard) | ASRock Radeon RX 6950 XT Phantom Gaming D 16G (GPU)
Ryzen 7 5800X3D (CPU) | LG 32GK650F 2560x1440 144Hz (monitor)
Arctic Liquid Freezer II 240 A-RGB (CPU cooler) | Seasonic Focus Plus Gold 850W (PSU)
Cooler Master MasterBox MB511 RGB (case) | Kingston Fury Beast 32GB (2x16GB) DDR4-3600 (memory)
Corsair K95 RGB Platinum (keyboard) | Razer Viper Ultimate (mouse)


1 hour ago, hollyh88 said:

So FFS stop it with the price complaining. It's stupid.

Prices for the Super refresh have been decent, nothing amazing but good enough. What was pretty awful were the standard RTX launch prices. Those were really disappointing. A 2070 was over 500€ at launch, the same price as a 1080 at the time, for ~10% more performance. Or the regular 2080, priced the same as a 1080 Ti for, again, ~10% better performance.
And yes, I have a 2070; that's why I remember just how bad the launch pricing was. Thank god AMD forced the Super refresh with Navi.

People fear that the 3000 series will repeat this, and I really hope it won't.


16 minutes ago, Medicate said:

Prices for the Super refresh have been decent, nothing amazing but good enough. What was pretty awful were the standard RTX launch prices. Those were really disappointing. A 2070 was over 500€ at launch, the same price as a 1080 at the time, for ~10% more performance. Or the regular 2080, priced the same as a 1080 Ti for, again, ~10% better performance.

Something we're going to have to get used to more and more going forward is that a simple brute-force performance comparison doesn't tell the whole story. We got the new RTX features. Sure, at the start hardly anything supported them, but support is growing. With AMD trying to be fashionably late to the party, it will become more of a standard feature going into next year. Are we going to only use raw performance in non-RT games? Then we have RT performance, possibly with or without DLSS. As a gamer, I don't care so much how those technologies work, as long as they do work and the end result is a better experience.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


If people wouldn't pay for the overpriced shit, then the prices would drop.

No cpu mobo or ram atm

2tb wd black gen 4 nvme 

2tb seagate hdd

Corsair rm750x 

Be quiet 500dx 

Gigabyte m34wq 3440x1440

Xbox series x


12 hours ago, Medicate said:

Prices for the Super refresh have been decent, nothing amazing but good enough. What was pretty awful were the standard RTX launch prices. Those were really disappointing. A 2070 was over 500€ at launch, the same price as a 1080 at the time, for ~10% more performance. Or the regular 2080, priced the same as a 1080 Ti for, again, ~10% better performance.
And yes, I have a 2070; that's why I remember just how bad the launch pricing was. Thank god AMD forced the Super refresh with Navi.

People fear that the 3000 series will repeat this, and I really hope it won't.

again.. did even the launch prices suck for anything under the 2080 ti? no. the 1080 launched here for about 580-620 ish, which is again way more expensive than the better-performing 2070. yes, the 2070 was more expensive than the 1070, but not by that much, and with more performance to boot.

so again, i see no reason to complain about pricing. the 2080 super/2080 ti are expensive tho, but when i see people complain about a 2070/2070 super/2080 i always need to take a few seconds.

and yes, amd brought in good competitive cards for cheaper, especially the 5700 xt, but it launched with piss-poor driver support, and no extra features to boot, which is why i went with nvidia again.

i dont think the price hike will be too big until the 3080 range. i dont even think it will change at all, or maybe a tiny bit. but would you complain if a 3070 launched around 450 and a 3070 ti around 520? i dont think so. maybe some would, but as rumoured the 3070 ti might be very close to 2080 ti perf, so i dont think people will complain as much anymore.

lets just all hope amd got their gpu drivers sorted this time. nothing to complain about their cpus tho, just the gpu side is kinda bad these days. (yes, im aware they fixed the drivers, but it was very late...)



2 hours ago, Gohardgrandpa said:

If people wouldn't pay for the overpriced shit, then the prices would drop.

 

That, or at some point in the next 10 years any* dedicated graphics card will be able to run games fully ray-traced with 8K texture streaming at 4K 240 fps.

 

*Yes, even an RTX 9020TI.


18 hours ago, Orfieus said:

2 - I highly doubt you are getting 60 FPS @ 1440p in any modern game unless you have all the settings at their lowest.

You're wrong.

 

Look up benchmarks for a GTX 1070 at 1440p if you don't know what you are talking about.


15 hours ago, hollyh88 said:

hm yeah, agree with the above. you do know the 980ti is basically a 1070, right? well, a bit worse. and that the 2060 is about a 1080, right?

and i strongly agree with the guy up top: even older games will not get you 60 fps at 1440p unless it's indie titles or whatever lol
why buy a used 1080ti when you can get a new 2070 super with around the same performance? with warranty and all. that makes no sense.

A 980 Ti does 70 fps at 1440p Ultra settings in Battlefield V. Not exactly an "indie title".

Hitman 2, same thing: 60 fps at high or ultra at 1440p.

Red Dead Redemption 2 gets 45 fps at 1440p Very High settings (the only title I have found to date that cannot do 1440p/60).

Look up benchmarks for the GTX 1070, which is indeed equivalent to the 980 Ti, in current games at 1440p if you don't know what you are talking about.

 

And no, the RTX 2060 is slower than the GTX 1080 and slightly faster than the 1070. It is closer to the 1070 than the 1080, at least when it came out; perhaps with more recent titles/drivers it has edged closer to the 1080.

 

I am just stating facts here.

 

If you don't know what you are talking about, go educate yourself instead of posting rubbish.


18 hours ago, porina said:

Something we're going to have to get used to more and more going forward, is that a simple brute force performance comparison doesn't tell the whole story. We got the new RTX features. Sure, at the start, hardly anything supported it, but it is growing. With AMD trying to be fashionably late to the party, it will become more of a feature going into next year. Are we going to only use raw performance in non-RT games? Then we have RT performance, possibly with or without DLSS. As a gamer, I don't so much care how those technologies work, as long as they do work, and the end result is a better experience for it. 

The DLSS 2.0 comparisons would be interesting. It seems that, with the technology proving itself since the 2.0 overhaul, newer games are slowly implementing it, and adoption may accelerate over time.

 

The curious thing is how much this affects Turing versus Ampere.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


2 hours ago, D13H4RD said:

The DLSS 2.0 comparisons would be interesting. Seems that with the technology now proving itself since its 2.0 overhaul, newer games are slowly implementing it, and adoption may accelerate over time.

 

The curious thing is how much this affects Turing versus Ampere.

 

Apparently the extra die space that could have gone to a few thousand more CUDA cores is being used to give us double or triple the RT performance. I'm guessing a combo of better RT capacity and DLSS 3.0 will mostly mitigate the frame drops. Maybe 10-15 fps max.

 

Source: Moore's Law is Dead on YouTube


3 hours ago, Results45 said:

Apparently the extra die space that could have gone to a few thousand more CUDA cores is being used to give us double or triple the RT performance. I'm guessing a combo of better RT capacity and DLSS 3.0 will mostly mitigate the frame drops. Maybe 10-15 fps max.

 

Source: Moore's Law is Dead on YouTube

For Ampere? Sounds interesting



I'd say the 20-series pricing was the result of Nvidia trying to cash in on the mining craze. They saw people paying upwards of 1500 USD for their GPUs and decided to mark up their cards accordingly. Since the mining craze died down and they were left with unsold cards (possibly the reason for the Super line), I would not be surprised if the price of the new xx80 Ti were between 800 and 1000 USD again, maybe with the xx90 or Titan in the 1200-1500 range.

 

That's my thoughts, anyways.

Spoiler

CPU: Intel i7 6850K

GPU: nVidia GTX 1080Ti (ZoTaC AMP! Extreme)

Motherboard: Gigabyte X99-UltraGaming

RAM: 16GB (2x 8GB) 3000Mhz EVGA SuperSC DDR4

Case: RaidMax Delta I

PSU: ThermalTake DPS-G 750W 80+ Gold

Monitor: Samsung 32" UJ590 UHD

Keyboard: Corsair K70

Mouse: Corsair Scimitar

Audio: Logitech Z200 (desktop); Roland RH-300 (headphones)

 


3 minutes ago, The1Dickens said:

I'd say the 20-series pricing was the result of Nvidia trying to cash in on the mining craze. They saw people paying upwards of 1500 USD for their GPUs and decided to mark up their cards accordingly. Since the mining craze died down and they were left with unsold cards (possibly the reason for the Super line), I would not be surprised if the price of the new xx80 Ti were between 800 and 1000 USD again, maybe with the xx90 or Titan in the 1200-1500 range.

 

That's my thoughts, anyways.

One can hope.  

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, The1Dickens said:

I'd say the 20-series pricing was the result of Nvidia trying to cash in on the mining craze. They saw people paying upwards of 1500 USD for their GPUs and decided to mark up their cards accordingly. Since the mining craze died down and they were left with unsold cards (possibly the reason for the Super line)

A lot of things were blamed on mining, but I don't feel 20-series pricing really falls into that. Mass mining was practically over by the Pascal era. Even if we assume Nvidia priced high because they thought there was more mining demand than there was, they could still have lowered pricing or improved value if they wanted to: maybe not so common these days, but there are rebates, or bundles with more games. Super was more a response they had to make to AMD's releases at the time. IMO the price was where they wanted it.



On 7/9/2020 at 9:03 AM, Orfieus said:

Two things:

 

1 - The 980 Ti's equivalent isn't a 2060. The 980 Ti is roughly equivalent to the 1070, and the 2060 is ~15% faster than the 1070.

 

2 - I highly doubt you are getting 60 FPS @ 1440p in any modern game unless you have all the settings at their lowest.

My personal experience begs to differ. 1440p 60 FPS is no problem with my 980 Ti in most modern titles on high settings.


6 hours ago, BabaGanuche said:

My personal experience begs to differ. 1440p 60 FPS is no problem with my 980 Ti in most modern titles on high settings.

Not including Destiny 2, what modern games are we talking about?


On 7/9/2020 at 8:03 AM, Orfieus said:

Two things:

 

1 - The 980 Ti's equivalent isn't a 2060. The 980 Ti is roughly equivalent to the 1070, and the 2060 is ~15% faster than the 1070.

 

2 - I highly doubt you are getting 60 FPS @ 1440p in any modern game unless you have all the settings at their lowest.

How does a 980 Ti compare to a 580? I've been given an opportunity to swap to one. They go for about the same money, it seems. I'm hoping it's old enough to play nice with my big monitor.



17 minutes ago, Bombastinator said:

How does a 980 Ti compare to a 580? I've been given an opportunity to swap to one. They go for about the same money, it seems. I'm hoping it's old enough to play nice with my big monitor.

The second to last digit in an nVidia card indicates the performance tier. Each generation of difference tends to bump the previous down.

 

So 

RTX 2060/GTX 1660 = GTX 1070

GTX 1060 = GTX 970

GTX 960 = GTX 770

In theory.

 

In practice they tend to scale this way because the difference between an x60 and an x70 is that the x70 has twice the cores, and an x80 has twice the cores of the x70. So when you do a die shrink from 14nm to 7nm, you get the same kind of jump because the same die area now fits roughly 4x the cores. Going from an RTX 2080 to an RTX 3060 on a 14nm-to-7nm shrink would in fact do this, assuming a 1:1 scale-down.
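That idealized scaling is just the square of the linear feature-size ratio. A quick sketch (ideal case only; real shrinks never scale 1:1, so treat this as an upper bound):

```python
def naive_density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal transistor-density gain from a die shrink: density scales
    with the square of the linear feature-size ratio. Real processes
    fall well short of this ideal."""
    return (old_nm / new_nm) ** 2

# Ideal 14nm -> 7nm shrink: 4x the transistors in the same die area.
print(naive_density_gain(14, 7))
```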

 

However, binning gives you the Ti or Super parts, and these tend to slot in between their base part and the next tier up. So a 980 Ti would be better than a 1060, but not a 1070. AMD doesn't have this kind of linear scale, and upgrading across the same performance tier doesn't always offer the same benefit.

 

I assume you're talking about an AMD RX 580, not an Nvidia GTX 580.

https://www.videocardbenchmark.net/compare/Radeon-RX-580-vs-GeForce-GTX-980-Ti/3736vs3218

The 980Ti is about 50% better? In 3D performance at least. 


1 minute ago, Kisai said:

The second to last digit in an nVidia card indicates the performance tier. Each generation of difference tends to bump the previous down.

 

So 

RTX 2060/GTX 1660 = GTX 1070

GTX 1060 = GTX 970

GTX 960 = GTX 770

In theory.

 

In practice they tend to scale this way because the difference between an x60 and an x70 is that the x70 has twice the cores, and an x80 has twice the cores of the x70. So when you do a die shrink from 14nm to 7nm, you get the same kind of jump because the same die area now fits roughly 4x the cores. Going from an RTX 2080 to an RTX 3060 on a 14nm-to-7nm shrink would in fact do this, assuming a 1:1 scale-down.

 

However, binning gives you the Ti or Super parts, and these tend to slot in between their base part and the next tier up. So a 980 Ti would be better than a 1060, but not a 1070. AMD doesn't have this kind of linear scale, and upgrading across the same performance tier doesn't always offer the same benefit.

 

I assume you're talking about an AMD RX 580, not an Nvidia GTX 580.

https://www.videocardbenchmark.net/compare/Radeon-RX-580-vs-GeForce-GTX-980-Ti/3736vs3218

The 980Ti is about 50% better? In 3D performance at least. 

Kewl. Sounds like I should do it then.  Thx.



8 minutes ago, illegalwater said:

Supposedly Nvidia will soon be discontinuing the 2080 Ti, 2080 Super, and 2070 Super/Non Super.

https://videocardz.com/newz/nvidia-rumored-to-retire-geforce-rtx-2080-ti-super-and-geforce-rtx-2070-super-graphics-cards-soon

Interesting. This implies to me that they feel the new 30 series is weak, because there will be new cards that are no faster than the old ones. AMD killed the 590 in the same way just before releasing its lower-end 55 and 56 stuff.


