
New official 4080 16GB/12GB benchmarks released by Nvidia (4080 12GB up to 30% slower in raster than 4080 16GB)

Deadpool2onBlu-Ray

Seems obvious to me. Nvidia is milking the 4090 through market segmentation, by making their lower tiers deliver reduced value. They will recalibrate back to normalcy by injecting Ti models in between their existing lineup to compete against AMD and Intel.

 

Be patient. Nvidia isn't above market forces. Their hand will be forced to adjust like everyone else's.


No amount of 're-calibrating' is going to bring a future 4080ti down to ~$750.

 

Face it, Nvidia has given 'us' the finger big time this generation (and the last 2).

The signs have been there since the 20 series, with the 2080 being no better than the 1080ti at launch but costing the same, and the 2080ti being better but costing over $1000. Then the 30 series continued on from that, and now the 40 series has gone even further by not only sticking with the price increases but also lowering the quality of the silicon being used.

 

Not that those who defend the 40 series will be interested, but if you look at the history of GPU dies, you can see how far Nvidia has dropped the tiers and increased the prices.

 

Using the 4090 specs as a baseline, as it is 'nearly' a full-fat die, the theoretical specs of a 4080 should be in the region of 11,600 cores at a price of around $575. (Historically the core jump from x80 to Titan/x90 is on average ~40%.)

What we have is a 'GPU' with 9728 cores and a price of $1200.

With that core count, that 'should' be a 4070ti, at a price of ~$475.

The $900 GPU with 7680 cores is more akin to a 4060ti, which should be ~$325.

 

A $1600 4090 is perfectly fine; the 700, 900 and 10 series all had equivalents. But a 4080ti should be coming in at ~5% lower core count for substantially less cost, ~$750. That's not going to happen, given the current listed prices of the so-called '4080s' (really a 4060ti and a 4070ti) and the precedent of the previous two generations.
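To make the tier arithmetic above easy to check, here is a minimal sketch using the published Ada core counts; the ~40% x80-to-x90 gap and the dollar targets are the poster's own historical estimates, not official figures.

```python
# Sketch of the tier math argued above. Core counts are the published Ada
# Lovelace figures; the 40% x80 -> x90 gap and the "should cost" prices are
# the poster's historical estimates, not anything official from Nvidia.

RTX_4090_CORES = 16384          # near-full AD102 die
X80_TO_X90_CORE_JUMP = 1.40     # claimed historical average gap

implied_true_4080_cores = RTX_4090_CORES / X80_TO_X90_CORE_JUMP
print(f"Implied 'true' 4080 core count: {implied_true_4080_cores:,.0f}")   # ~11,700

actual_cards = {"'4080 16GB'": 9728, "'4080 12GB'": 7680}
for name, cores in actual_cards.items():
    gap = RTX_4090_CORES / cores - 1
    print(f"4090 has {gap:.0%} more cores than the {name}")   # ~68% and ~113%
```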

 

This generation is totally FUBAR, and those still buying into it are screwing over not only themselves in the long term but the rest of the community as well, by rewarding Nvidia for this behavior.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


1 minute ago, SolarNova said:

No amount of 're-calibrating' is going to bring a future 4080ti down to ~$750.

 

Face it, Nvidia has given 'us' the finger big time this generation (and the last 2).

The signs have been there since the 20 series, with the 2080 being no better than the 1080ti at launch but costing the same, and the 2080ti being better but costing over $1000. Then the 30 series continued on from that, and now the 40 series has gone even further by not only sticking with the price increases but also lowering the quality of the silicon being used.

 

Not that those who defend the 40 series will be interested, but if you look at the history of GPU dies, you can see how far Nvidia has dropped the tiers and increased the prices.

 

Using the 4090 specs as a baseline, as it is 'nearly' a full-fat die, the theoretical specs of a 4080 should be in the region of 11,600 cores at a price of around $575. (Historically the core jump from x80 to Titan/x90 is on average ~40%.)

What we have is a 'GPU' with 9728 cores and a price of $1200.

With that core count, that 'should' be a 4070ti, at a price of ~$475.

The $900 GPU with 7680 cores is more akin to a 4060ti, which should be ~$325.

 

A $1600 4090 is perfectly fine; the 700, 900 and 10 series all had equivalents. But a 4080ti should be coming in at ~5% lower core count for substantially less cost, ~$750. That's not going to happen, given the current listed prices of the so-called '4080s' (really a 4060ti and a 4070ti) and the precedent of the previous two generations.

 

This generation is totally FUBAR, and those still buying into it are screwing over not only themselves in the long term but the rest of the community as well, by rewarding Nvidia for this behavior.

I agree 100%. They are saving face a bit though.

 

The 4080 12GB is cancelled

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


4 minutes ago, CHICKSLAYA said:

I agree 100%. They are saving face a bit though.

 

The 4080 12GB is cancelled

So they have.

Still..... they are a whole tier askew.

The '4080 16GB' is too far removed from the 4090 to be a 4080.

A 68% core-count difference is closer to the gap between an x70 and an x90 than between an x80 and an x90 (which is ~40%).

 

If things stay as they are, a future 4080ti will not only be stupidly expensive once again, it also may not be within the usual 0-9% core difference (5% on average), and thus performance, that previous x80ti cards have had relative to the Titan/x90.
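For reference, a quick sketch of what that claimed 0-9% (5% average) gap would imply for a future 4080ti's core count, taking the 4090's 16384 cores as the baseline; the percentages are the poster's historical averages, not a leak or an official spec.

```python
# Hypothetical 4080 Ti core counts under the historical 0-9% (avg ~5%) gap
# to the Titan/x90 described in the post above. Purely illustrative numbers.

RTX_4090_CORES = 16384

for gap in (0.00, 0.05, 0.09):
    print(f"{gap:.0%} below the 4090 -> ~{RTX_4090_CORES * (1 - gap):,.0f} cores")
# 0% -> 16,384   5% -> ~15,565   9% -> ~14,909
```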

 

Which brings me back to ....

13 minutes ago, SolarNova said:

No amount of 're-calibrating' is going to bring a future 4080ti down to ~$750

 



1 minute ago, SolarNova said:

So they have.

Still..... they are a whole tier askew.

The '4080 16GB' is too far removed from the 4090 to be a 4080.

A 68% core-count difference is closer to the gap between an x70 and an x90 than between an x80 and an x90 (which is ~40%).

 

If things stay as they are, a future 4080ti will not only be stupidly expensive once again, it also may not be within the usual 0-9% core difference (5% on average), and thus performance, that previous x80ti cards have had relative to the Titan/x90.

 

Which brings me back to ....

 

Yup. The pricing is still a whole tier off



People who buy the wickedly overpriced 4080 and the 4070 cosplaying as the 4080 "12GB" are just encouraging the normalization of these insane prices. Nvidia is laughing all the way to the bank with this.

 

Shame.

MAIN SYSTEM: Intel i9 10850K | 32GB Corsair Vengeance Pro DDR4-3600C16 | RTX 3070 FE | MSI Z490 Gaming Carbon WIFI | Corsair H100i Pro 240mm AIO | 500GB Samsung 850 Evo + 500GB Samsung 970 Evo Plus SSDs | EVGA SuperNova 850 P2 | Fractal Design Meshify C | Razer Cynosa V2 | Corsair Scimitar Elite | Gigabyte G27Q

 

Other Devices: iPhone 12 128GB | Nintendo Switch | Surface Pro 7+ (work device)


5 hours ago, CTR640 said:

nGreedia has removed the LHR in their latest driver lol!

https://videocardz.com/newz/nvidia-reportedly-removes-lite-hash-rate-limiter-with-the-latest-driver

 

This is to increase sales of the RTX 30 series. They've gotten really used to the insane sales to miners and scalpers, so they refuse to lower RTX 30 prices.

Didn't Nvidia say LHR  was a hardware limiter?  It seems like they're trying to look good even though mining isn't profitable.


6 minutes ago, Blademaster91 said:

Didn't Nvidia say LHR  was a hardware limiter?  It seems like they're trying to look good even though mining isn't profitable.

That has been on my mind all along! This is more proof that nGreedia is straight up lying. So all the "LHR" stuff is just complete bullshit?

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


Maybe this is out of pocket, but is it possible to boycott Nvidia and make videos/ads shaming them without consequences (such as lawsuits, etc.)?

 

 


On 10/12/2022 at 3:15 PM, agatong55 said:

So that means a 4070 will be worse than a 3070?? If so, Nvidia screwed up. But, and this is a HUGE but, I think this is fake and is just trying to make AMD feel good about themselves; the numbers for RDNA 3 will be interesting.

If they can do the same as the "4080 12GB" but with a higher bus width and GDDR6 instead of GDDR6X, reduce cooling/power, and only drop about 10% in performance, it would be a great 4070, depending on the price.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


6 minutes ago, CTR640 said:

That has been on my mind all along! This is more proof that nGreedia is straight up lying. So all the "LHR" stuff is just complete bullshit?

It was a combination. At the beginning you could get past LHR just by using the pre-release beta driver. I believe they "fixed" that with later hardware revisions.



2 hours ago, Blademaster91 said:

Didn't Nvidia say LHR  was a hardware limiter?  It seems like they're trying to look good even though mining isn't profitable.

 

2 hours ago, CTR640 said:

That has been on my mind all along! This is more proof that nGreedia is straight up lying. So all the "LHR" stuff is just complete bullshit?

It was supposed to be both: an in-silicon or microcode/firmware component, with the limiter activated via drivers. What the truth actually is, only Nvidia knows.


15 hours ago, CTR640 said:

That has been on my mind all along! This is more proof that nGreedia is straight up lying. So all the "LHR" stuff is just complete bullshit?

Dude, calm down.

This is not proof of Nvidia lying about anything. LHR was not bullshit either. It was a real thing.

 

 

LHR was at first a software lock. However, people bypassed it very quickly, so in a later revision of the cards they modified something else to make it harder to bypass. It was probably some firmware check verifying the driver, to make sure the software limit was on.

Seems like they have now removed the limit in their driver in a way that doesn't trip the hardware check (they probably just need correctly signed drivers).
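Since the mechanism is only being guessed at here, the following is a purely conceptual sketch of how a "firmware verifies the driver, driver enforces the cap" scheme could behave; every name in it is hypothetical and none of it reflects Nvidia's actual implementation.

```python
# Conceptual model of the speculated LHR scheme, not real Nvidia code.
# All names are hypothetical.

def effective_hashrate(full_rate_mh: float, driver_is_signed: bool,
                       driver_enforces_cap: bool) -> float:
    if not driver_is_signed:
        # Later card revisions: firmware falls back to the capped rate when it
        # cannot verify the driver, closing the early beta-driver bypass.
        return full_rate_mh * 0.5
    # With a verified driver, the cap is whatever that driver enforces, which
    # is why an official driver with the limiter removed lifts it entirely.
    return full_rate_mh * (0.5 if driver_enforces_cap else 1.0)

print(effective_hashrate(120.0, driver_is_signed=True, driver_enforces_cap=False))  # 120.0
```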

 

 

If you ask me, this is great news. The whole "LHR" thing was really stupid to begin with. Companies should not tell their customers what they are or aren't allowed to do with the hardware they bought.


1 hour ago, HumdrumPenguin said:

Unless you get a 4090, stay away from the 4000 series for now (it's the only one available, but you get the point)

While I hate its price, I have to agree.

Despite it being the Titan of the 40 series, along with the associated price tag, it is the only GPU that makes sense.

The only remaining so-called 4080, the 16GB model, just doesn't make any sense for a 'reasonable' person to buy; it's way overpriced and way under-specced.

Even if the now 'unreleased' 4080 12GB gets rebranded, I highly .. HIGHLY... doubt they will rebrand it as a 4060ti (with a proper price tag of ~$325), which is what it's specced as. More likely it will be a 4070 or 4070ti, a whole tier, or two, higher than it should be, along with an equally stupid price.

 

I'm not going to lie and say this generation is a total wash; it's clear Nvidia has the performance available. It's just how they are segmenting and pricing these GPUs that's making it such a bad generation for consumers.

 

What's worse, rumor has it that AMD won't be able to compete at the high end in raw performance, and even if their top card only reaches the so-called 4080 16GB's level of performance, they won't undercut Nvidia enough to bring the price down from the eye-wateringly stupid $1200 to the normal (based on historic averages with inflation) price of ~$575-600 for an x80-tier card.

I'd love to be wrong. AMD would obliterate Nvidia's market share, even if they can't match a 4090, if they brought out cards matching the 4080 16GB but at pre-20-series pricing... that would REALLY kick Nvidia in the balls for being so complacent. But alas, it's never going to happen.



On 10/14/2022 at 6:54 PM, SolarNova said:

So they have.

Still..... they are a whole tier askew.

The '4080 16GB' is too far removed from the 4090 to be a 4080.

A 68% core-count difference is closer to the gap between an x70 and an x90 than between an x80 and an x90 (which is ~40%).

 

If things stay as they are, a future 4080ti will not only be stupidly expensive once again, it also may not be within the usual 0-9% core difference (5% on average), and thus performance, that previous x80ti cards have had relative to the Titan/x90.

 

Which brings me back to ....

 

I see this more as something that makes the XX90 model actually worth buying this time, relatively speaking. Going from a 3080 to a 3090 was more than a doubling in MSRP, from $699 to $1499, for not nearly that much of a performance increase as far as I remember. Now it's a 33% price increase, from $1199 to $1599, for a similar increase in performance, looking at these benchmarks. That sounds like a much better deal to me. These are still very expensive cards overall, but at least relatively speaking the pricing seems to make more sense from these limited results.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


18 minutes ago, tikker said:

That sounds like a much better deal to me.

I have a bridge to sell you 😛

 

 



7 minutes ago, SolarNova said:

I have a bridge to sell you 😛

 

 

I didn't say I liked the absolute pricing, just that the relative pricing makes sense when comparing the price increase from the 4080 16 GB to the 4090 with the performance increase.



3 minutes ago, tikker said:

I didn't say I liked the absolute pricing, just that the relative pricing makes sense when comparing the price increase from the 4080 16 GB to the 4090 with the performance increase.

The worst thing any of us could do is encourage price increases based on performance increases.

There have already been rumors/leaks that that's exactly what AMD and Nvidia are planning. If that is what's happening, and what will continue to happen, you can completely forget prices ever returning to 'normal'; home PCs will end up as expensive as your 'average' family car, with GPU prices going from hundreds to thousands, maybe even tens of thousands.

We've got over 20 years of pricing for what we would call the modern 'graphics card' to go by, and we have never had price increases based on performance increases; if we had, by now we would be paying tens of thousands for a mid-range card. Go back 20 years, add on inflation, and then add on a price based on the performance increase of every subsequent generation.. it's insane and should never be humored as something we would ever want or tolerate.
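As a rough illustration of that point, here is what "price scales with performance" compounds into; the $400 starting price, ~40% per-generation uplift, and ten generations are assumptions picked for the example, not historical data.

```python
# What GPU prices would look like if they had tracked performance gains,
# before even adding inflation. All inputs are illustrative assumptions.

start_price = 400             # assumed x80-class price ~20 years ago
uplift_per_generation = 1.40  # assumed average generational performance gain
generations = 10

price_if_scaled_with_perf = start_price * uplift_per_generation ** generations
print(f"~${price_if_scaled_with_perf:,.0f}")   # ~$11,570 for a single card
```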

 

So no, one can't consider the price of a 4080 as 'making sense' based on the relative performance increase to the 4090.



13 minutes ago, SolarNova said:

The worst thing any of us could do is encourage price increases based on performance increases.

There have already been rumors/leaks that that's exactly what AMD and Nvidia are planning. If that is what's happening, and what will continue to happen, you can completely forget prices ever returning to 'normal'; home PCs will end up as expensive as your 'average' family car, with GPU prices going from hundreds to thousands, maybe even tens of thousands.

We've got over 20 years of pricing for what we would call the modern 'graphics card' to go by, and we have never had price increases based on performance increases; if we had, by now we would be paying tens of thousands for a mid-range card. Go back 20 years, add on inflation, and then add on a price based on the performance increase of every subsequent generation.. it's insane and should never be humored as something we would ever want or tolerate.

 

So no, one can't consider the price of a 4080 as 'making sense' based on the relative performance increase to the 4090.

I think we're talking past each other a bit. I am not defending the high prices themselves nor am I saying the price increase with respect to the 3000 series makes sense.

 

I am merely contesting the point that the 4080 allegedly doesn't make sense compared to the 4090 because it is "too far from the 4090". The 4090 is 33% more expensive than the 4080 16GB and yields around that same increase in performance. Paying double the price of an XX80 for that same kind of performance jump, as with the 3080 to 3090, is a worse deal, no? We all loved to complain about how the 3090 didn't make sense price-wise compared to the 3080. Relatively speaking, within this generation we have a better situation, where the 4090 costs about as much more as the extra performance it delivers over the 4080.
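The comparison being made here is just two ratios; a quick sketch with the launch MSRPs quoted in this thread (the ~33% performance gap comes from Nvidia's own slides discussed above, so treat it as approximate):

```python
# Price premium of the top card over the x80 in each generation, using the
# launch MSRPs quoted in this thread.

def premium(x80_price: float, x90_price: float) -> float:
    return x90_price / x80_price - 1

print(f"3080 ($699)  -> 3090 ($1499): {premium(699, 1499):.0%} more expensive")   # ~114%
print(f"4080 ($1199) -> 4090 ($1599): {premium(1199, 1599):.0%} more expensive")  # ~33%
```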

 

 

The prices themselves are still a mess, and while there does seem to be a slowly rising general trend, these cards are indeed well above it.

https://www.reddit.com/r/nvidia/comments/jg816j/continuing_the_trend_inflation_adjusted_history/

[Image: inflation-adjusted GPU launch price chart from the linked Reddit post]
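For anyone wanting to reproduce a chart like the one linked above, the adjustment is just a CPI ratio; the CPI values below are approximate annual averages used for illustration, not the exact series that Reddit post used.

```python
# Inflation-adjusting a launch MSRP: scale by the ratio of today's CPI to the
# CPI of the launch year. CPI figures here are approximate annual averages.

def adjust_to_2022_dollars(launch_price: float, cpi_launch_year: float,
                           cpi_2022: float = 292.7) -> float:
    return launch_price * cpi_2022 / cpi_launch_year

# e.g. a $599 launch in 2006 (8800 GTX territory), CPI ~201.6 that year
print(f"${adjust_to_2022_dollars(599, 201.6):,.0f} in 2022 dollars")   # ~$870
```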

 



On 10/15/2022 at 11:17 AM, SolarNova said:

While I hate its price, I have to agree.

Despite it being the Titan of the 40 series, along with the associated price tag, it is the only GPU that makes sense.

The only remaining so-called 4080, the 16GB model, just doesn't make any sense for a 'reasonable' person to buy; it's way overpriced and way under-specced.

Even if the now 'unreleased' 4080 12GB gets rebranded, I highly .. HIGHLY... doubt they will rebrand it as a 4060ti (with a proper price tag of ~$325), which is what it's specced as. More likely it will be a 4070 or 4070ti, a whole tier, or two, higher than it should be, along with an equally stupid price.

 

I'm not going to lie and say this generation is a total wash; it's clear Nvidia has the performance available. It's just how they are segmenting and pricing these GPUs that's making it such a bad generation for consumers.

 

What's worse, rumor has it that AMD won't be able to compete at the high end in raw performance, and even if their top card only reaches the so-called 4080 16GB's level of performance, they won't undercut Nvidia enough to bring the price down from the eye-wateringly stupid $1200 to the normal (based on historic averages with inflation) price of ~$575-600 for an x80-tier card.

I'd love to be wrong. AMD would obliterate Nvidia's market share, even if they can't match a 4090, if they brought out cards matching the 4080 16GB but at pre-20-series pricing... that would REALLY kick Nvidia in the balls for being so complacent. But alas, it's never going to happen.

Kinda sad indeed. I was on a 2080 Ti for a long while, and wanted a 3080 Ti at MSRP, but it was never available. I stopped searching at some point, and when they did show up the 4000 series were about to come out.


I don't buy PC components used. Never did, and never will, for a number of reasons, so I didn't want to pay a lot for something that would soon be heavily outmatched (albeit at a bit over double the current price here in Canada). The only problem is that none of the cards aside from the 4090 made any sense to me when compared to previous generations. Just look at the graphs. I didn't think I would ever buy an xx90 card due to the cash that goes on unused VRAM, but in these circumstances that's what I ended up doing anyway. I might as well update the X34P monitor at this rate. Starfield and Diablo IV would thank me.


4 hours ago, HumdrumPenguin said:

Kinda sad indeed. I was on a 2080 Ti for a long while, and wanted a 3080 Ti at MSRP, but it was never available. I stopped searching at some point, and when they did show up the 4000 series were about to come out.


I don't buy PC components used. Never did, and never will, for a number of reasons, so I didn't want to pay a lot for something that would soon be heavily outmatched (albeit at a bit over double the current price here in Canada). The only problem is that none of the cards aside from the 4090 made any sense to me when compared to previous generations. Just look at the graphs. I didn't think I would ever buy an xx90 card due to the cash that goes on unused VRAM, but in these circumstances that's what I ended up doing anyway. I might as well update the X34P monitor at this rate. Starfield and Diablo IV would thank me.

Buying used FTW 



  • 2 weeks later...

Now Nvidia might be pushing a new, lower-specced version of the 3060 or 3060ti (with the same name).

