New official 4080 16GB/12GB benchmarks released by Nvidia (4080 12GB up to 30% slower in raster than 4080 16GB)

Deadpool2onBlu-Ray
4 minutes ago, Zando_ said:

Wait for more 3rd party reviews and broader benchmarks before going all doom and gloom. 

 

And if more testing proves the same numbers, then... I still wouldn't be so dramatic. Just skip 4000 series and wait for next gen, or see what AMD brings to the table. Especially as OP has a 3080, there isn't a need to upgrade cards every generation. I can see how it may suck for people coming from 2+ generations back if they're slapped in the face with shit pricing, but I'll wait for the full line to be released, AMD's new cards, and pricing to settle after that, before definitively making that call. 

I was never going to buy 40 series. I am just worried for a few years down the line. Will the 5080/6080 be $1400, $1600, $1800? PC building is turning into a trust fund kid hobby, and quickly at that.

 

I literally bought my 3080 for $550 on the day of the Ada Lovelace reveal/price-hike shock.

 

Nothing is going to touch that price/performance for a while. Idc about used; some people might.


As always, NVIDIA wants to hook people on their newest and shiniest cards with exclusive new features.

 

Want DLSS3? Want AV1 encoding? Well, the 4080 is for youuuuu! It doesn't matter that you can get the same performance for cheaper with the 3000 series... you'd be missing out on these new features that are barely usable now because of little to no adoption! Get it now and enjoy the full potential of DLSS3 right around the 5000 series launch, when we'll introduce DLSS4, exclusive to the 5000 series!

 

The 4090 really is a monster, though. But I still think NVIDIA's marketing machinery made it look waaaay better just because the 3090 and 3090 Ti were so bad compared to the 3080. The 4080 16GB seems like what the 3090/3090 Ti should have been. Yes, the 4090 would be impressive even then, but less so.


1 minute ago, agatong55 said:

So that means a 4070 will be worse than a 3070??

Hmm, I hadn't thought about that, but yes: what they'll be calling a 4070 will maybe be on par with or slower than a 3070, and on par with or barely faster than a 3070 Ti. This makes no sense, and I think it comes from the ridiculous fact that the 4090 has freakin' 60% more CUDA cores than the 4080. Wth, Nvidia??
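For a quick sanity check on that gap, a rough sketch (core counts are Nvidia's announced specs; the exact percentage depends on which 4080 you compare, and cores alone don't predict real performance):

```python
# Rough CUDA core comparison across the announced Ada SKUs.
# Core counts are Nvidia's announced specs; the percentages are
# illustrative only -- cores alone don't predict real performance.
cards = {
    "RTX 4090": 16384,
    "RTX 4080 16GB": 9728,
    "RTX 4080 12GB": 7680,
}

base = cards["RTX 4080 16GB"]
for name, cores in cards.items():
    delta = (cores / base - 1) * 100
    print(f"{name}: {cores} cores ({delta:+.0f}% vs 4080 16GB)")
# The 4090 lands at roughly +68% over the 4080 16GB, i.e. the
# "60%+ more cores" gap complained about above.
```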


3 minutes ago, dizmo said:

Well yes. People that already have hardware are fine. There will always be thousands, if not tens of thousands of people looking to buy new.

You need to realize that last-gen cards sell out fast once the new ones arrive, and they quickly stop being an option for those looking to buy new.

What's not?

I agree that the 12GB is a bit of a farce, but I'm not going to go crying about it until I see how the complete product stack looks.

 

You can scream it as much as you want, it doesn't make you right.

 

You're so focused on the 12GB that you ignore the 16GB. You also can't seem to get past the fact that this is only three titles, and we don't know how the rest will look.

 

It's also never really made a ton of sense to upgrade generation to generation.

I can see that point of view. The real issue is the pricing. If the 16GB was called the 4080 and priced around $799-899 (fair pricing accounting for cost increases/inflation, etc.), and the 12GB was $549-649 and named a 4060 Ti/4070, I would have zero issue. They increased the prices by like $500 for no reason.
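As a rough sanity check on that "fair pricing" range: a minimal sketch assuming the 3080's $699 launch MSRP and roughly 14% cumulative US inflation between the September 2020 and late 2022 launches (the inflation figure is an approximation for illustration, not an official number):

```python
# Inflation-adjusted "fair" x80 price, as a rough illustration.
# ASSUMPTION: ~14% cumulative US inflation, Sept 2020 -> late 2022.
AMPERE_X80_MSRP = 699        # RTX 3080 10GB launch MSRP (USD)
CUMULATIVE_INFLATION = 0.14  # approximate, for illustration only

fair_price = AMPERE_X80_MSRP * (1 + CUMULATIVE_INFLATION)
print(f"Inflation-adjusted x80 MSRP: ~${fair_price:.0f}")
# ~$797: right at the low end of the $799-899 range suggested above,
# versus the $1199 Nvidia is actually asking for the 4080 16GB.
```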


9 minutes ago, ZetZet said:

But that really isn't true. The 3080 was never available for $700 on release; it was $1200. You can blame scalpers, miners, whatever: people bought them, including A LOT of gamers.

If you go by online availability, sure, but I got my 3080 under MSRP at MC (after a month). It wasn't nearly impossible for about 6 months after launch. Took me a month for the 3080 and day 1 for the 4090.


1 minute ago, ewitte said:

If you go by online availability, sure, but I got my 3080 under MSRP at MC (after a month). It wasn't nearly impossible for about 6 months after launch. Took me a month for the 3080 and day 1 for the 4090.

Yup. I rolled with an FE 3060 Ti at $399 for the past few years. The $399 card this time will be the 4060, and without DLSS3 it will perform worse than the 3060 Ti.


1 minute ago, CHICKSLAYA said:

I can see that point of view. The real issue is the pricing. If the 16GB was called the 4080 and priced around $799-899 (fair pricing accounting for cost increases/inflation, etc.), and the 12GB was $549-649 and named a 4060 Ti/4070, I would have zero issue. They increased the prices by like $500 for no reason.

I mean, there is absolutely a reason. You just won't like hearing it, and you'll say it's bullshit.

People will pay it. So, they price it accordingly.

They're simply doing what any good, successful business does.

 

If you were selling something, you wouldn't price it significantly below what the market will accept just to be nice.


4 minutes ago, ewitte said:

If you go by online availability, I got my 3080 under MSRP at MC. (after a month).  It wasn't nearly impossible for about 6 months after launch.  Took me a month for the 3080 and day 1 for 4090.

Yeah, Microcenter and ordering directly from Nvidia are not available to a lot of people.


1 minute ago, CHICKSLAYA said:

I was never going to buy 40 series. I am just worried for a few years down the line. Will the 5080/6080 be $1400, $1600, $1800? 

IIRC that's the "slippery slope" logical fallacy. 

2 minutes ago, CHICKSLAYA said:

PC building is turning into a trust fund kid hobby, and quickly at that.

No. 

 

The high end is, sure. It always has been, though (perhaps not to this scale, but top-end PCs have always cost significantly more than a midrange build), so I'm not sure what changed. The performance floor has been brought way up; an XX60 or XX70 card is all most people need, even for 1440p. You seem to be ignoring that and acting under the assumption that people need an XX80/80 Ti or higher for general gaming, which was last the case with, like... Pascal and Maxwell, if you played at 1440p or higher. Now you only need high-SKU cards for more niche stuff: usually 1440p high refresh rate with high settings (usually people who want high refresh run lower "competitive" settings anyway, so it's a weird niche) and 4K gaming if you want to run high settings or don't have access to DLSS.


So the 4080 12GB is barely an improvement over the 3080; a 15-20% increase isn't worth it in titles like F1 22 or MS Flight Simulator, especially at the $900 Nvidia is asking for a cut-down 80-tier card.
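To put a rough number on that: a sketch of performance per dollar using the announced MSRPs and the middle of that 15-20% range (the uplift comes from Nvidia's own charts, not independent testing):

```python
# Performance per dollar, 3080 10GB vs 4080 12GB, as a rough sketch.
# ASSUMPTIONS: launch MSRPs as announced; ~17% average raster uplift
# taken from the 15-20% range above (Nvidia's numbers, not reviews).
msrp_3080 = 699
msrp_4080_12gb = 899
uplift = 1.17  # 4080 12GB raster performance relative to 3080 (=1.0)

perf_per_dollar_3080 = 1.0 / msrp_3080
perf_per_dollar_4080 = uplift / msrp_4080_12gb

change = perf_per_dollar_4080 / perf_per_dollar_3080 - 1
print(f"Raster perf/$ vs 3080: {change:+.0%}")
# ~ -9%: at these prices the newer card is a worse raster value,
# which is exactly the complaint.
```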


7 minutes ago, Zando_ said:

IIRC that's the "slippery slope" logical fallacy. 

No. 

 

The high end is, sure. It always has been, though (perhaps not to this scale, but top-end PCs have always cost significantly more than a midrange build), so I'm not sure what changed. The performance floor has been brought way up; an XX60 or XX70 card is all most people need, even for 1440p. You seem to be ignoring that and acting under the assumption that people need an XX80/80 Ti or higher for general gaming, which was last the case with, like... Pascal and Maxwell, if you played at 1440p or higher. Now you only need high-SKU cards for more niche stuff: usually 1440p high refresh rate with high settings (usually people who want high refresh run lower "competitive" settings anyway, so it's a weird niche) and 4K gaming if you want to run high settings or don't have access to DLSS.

The issue is the high end has gotten significantly more expensive, and there isn't anything worth buying at midrange unless you settle for a 2-year-old card or something used. While the performance floor has gone up, so has the pricing floor: an x80 card used to be $600 and Nvidia brought the x70 tier up to that level, and a 3060 Ti isn't going to be that useful if you want RTX on at 1440p. Nvidia wants people buying the x80 or x90 cards, and it's very obvious this gen with the 4090 "only" being $1600.


1 minute ago, Blademaster91 said:

The issue is the high end has gotten significantly more expensive, and there isn't anything worth buying at midrange unless you settle for a 2-year-old card or something used. While the performance floor has gone up, so has the pricing floor: an x80 card used to be $600 and Nvidia brought the x70 tier up to that level, and a 3060 Ti isn't going to be that useful if you want RTX on at 1440p. Nvidia wants people buying the x80 or x90 cards, and it's very obvious this gen with the 4090 "only" being $1600.

Yup. The midrange this gen is going to be dogshit. Better off getting a last-gen 3080 or 6900 XT for cheap. RDNA3 had better impress, or we are doomed for this gen at least.


9 minutes ago, Blademaster91 said:

The issue is the high end has gotten significantly more expensive, and there isn't anything worth buying at midrange unless you settle for a 2-year-old card or something used.

Can you explain how there's nothing worth buying at midrange? Most people build at midrange and find plenty worth buying. For the 40 series, the midrange isn't even announced yet, so we have no data on whether it is or isn't worth buying, and why.

11 minutes ago, Blademaster91 said:

While the performance floor has gone up, so has the pricing floor: an x80 card used to be $600 and Nvidia brought the x70 tier up to that level

Performance has gone up more than pricing; you can get a good experience at 1440p (or even 4K under certain conditions) with a much lower-tier card than you used to. And for 1080p, a near-potato-tier card manages that fine.

13 minutes ago, Blademaster91 said:

Nvidia brought the x70 tier up to that level, and a 3060 Ti isn't going to be that useful if you want RTX on at 1440p.

Entirely game dependent. Something like Cyberpunk 2077, where RTX absolutely tanks fps, it might struggle, yeah. A beautiful implementation like Metro Exodus Enhanced Edition would be fine, because most games that include RTX also include DLSS, which makes running them at good-looking settings far easier (your screenshots may suffer a bit, but in-game it looks great). Tis how I manage to run stuff at 4K with a 2060 Super. Metro Exodus EE ran like butter at very nice settings; gorgeous game. Cyberpunk's RT implementation is far too performance-heavy, so I run that with RT disabled and a blend of medium/mostly-high settings. DLSS Performance for both, so it renders internally at 1080p and then upscales, which makes games far easier to run without being crunched to hell (Ultra Performance tries to upscale a 720p internal frame to 4K, and that looks pretty shit).
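For reference, the internal resolutions here follow from DLSS 2's commonly documented per-axis scale factors; a minimal sketch (the factors are approximations from public documentation, and exact behavior is up to each game's integration):

```python
# Internal render resolution per DLSS 2 mode at a 4K output.
# Scale factors are the commonly documented per-axis ratios
# (approximate; exact behavior is up to each game's integration).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in DLSS_SCALE.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{mode:>17}: renders {w}x{h}, upscales to {out_w}x{out_h}")
# Performance -> 1920x1080 internal (the "render at 1080p" above);
# Ultra Performance -> 1280x720, hence the soft 720p-to-4K result.
```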

 

Basically, RTX is still in a weird spot where whether it runs like shit or not is decided just as much by the game engine/dev work as by the card itself.

17 minutes ago, Blademaster91 said:

Nvidia wants people buying the x80 or x90 cards, and it's very obvious this gen with the 4090 "only" being $1600.

That's no revelation; companies want people to buy their higher-margin products.


9 hours ago, Zando_ said:

IIRC that's the "slippery slope" logical fallacy. 

Not a logical fallacy if it's coming true before our very eyes.

 

Nvidia over the past 6 years has progressively offered significantly more performance at a significantly worse price-to-performance ratio.

 

Inflation, whilst technically a valid reason for increasing prices, is not responsible for anywhere near the price increases seen.

 

Same for wafer prices increasing. Yes, it contributes to higher prices, but it is nowhere near enough to explain the scale of the increases seen.

 

Nvidia is going back to using midrange-sized small dies for $500+ GPUs.

 

In the 40 series, at least so far, the only die I wouldn't consider a small die is AD102. AD104 is frankly the kind of die one might expect on an RTX 4060 or 4060 Ti; maybe it could be fully enabled on an RTX 4070 if we're being really charitable.
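For context, a rough comparison of commonly cited die sizes across the last few generations (figures are approximate, pulled from public die-size listings, and included only to illustrate the point):

```python
# Approximate die sizes (mm^2) by generation, from public listings.
# Figures are approximate and included only to illustrate the point.
die_mm2 = {
    "Pascal": {"GP102": 471, "GP104": 314, "GP106": 200},
    "Turing": {"TU102": 754, "TU104": 545, "TU106": 445},
    "Ampere": {"GA102": 628, "GA104": 392, "GA106": 276},
    "Ada":    {"AD102": 608, "AD103": 379, "AD104": 295},
}

for gen, dies in die_mm2.items():
    row = ", ".join(f"{die} ~{size}mm^2" for die, size in dies.items())
    print(f"{gen}: {row}")
# AD104 (~295mm^2) sits between Ampere's x60-class GA106 and
# x70-class GA104 in size, which is the "midrange die" point above.
```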

 

Nvidia should have just come out and said the 4080 12GB is the 4070, or better yet a 4060 Ti, and offered it for $399. It would be a bargain at that price; who wouldn't want one? Basically 3080 performance for, in theory, $300 less.

 

Instead we've got a 4080 12GB that barely outperforms a 3080 10GB and shouldn't be called a 4080, as well as the 4080 16GB, the real 4080, which is overpriced and should cost what the 4080 12GB costs or arguably less.

9 hours ago, Zando_ said:

No. 

 

The high end is, sure. It always has been, though (perhaps not to this scale, but top-end PCs have always cost significantly more than a midrange build), so I'm not sure what changed. The performance floor has been brought way up; an XX60 or XX70 card is all most people need, even for 1440p. You seem to be ignoring that and acting under the assumption that people need an XX80/80 Ti or higher for general gaming, which was last the case with, like... Pascal and Maxwell, if you played at 1440p or higher. Now you only need high-SKU cards for more niche stuff: usually 1440p high refresh rate with high settings (usually people who want high refresh run lower "competitive" settings anyway, so it's a weird niche)

Respectfully disagree, though I happen to be part of that niche. AMD has been advertising x700 XT cards for it since RDNA1, though arguably you need a better-tier card for it. And Nvidia has been pushing their x70 cards for it since Pascal or Turing, and pushing their x80 and x80 Ti cards for 4K since Turing.

9 hours ago, Zando_ said:

and 4K gaming if you want to run high settings or don't have access to DLSS. 

 

Judge a product on its own merits AND the company that made it.


1 hour ago, Dracarris said:

Hmm, I hadn't thought about that, but yes: what they'll be calling a 4070 will maybe be on par with or slower than a 3070, and on par with or barely faster than a 3070 Ti. This makes no sense, and I think it comes from the ridiculous fact that the 4090 has freakin' 60% more CUDA cores than the 4080. Wth, Nvidia??

Hear me out: they release them, barely produce them, and then when AMD drops their cards, there will magically be Super variants that are what these should have been all along. Probably for the same price these retail at, and they'll say "SEE! Look at how amazing we are, giving you 20% more performance for the same price," and everyone will just quietly forget about the base models.


6 minutes ago, AluminiumTech said:

Not a logical fallacy if it's coming true before our very eyes.

 

Nvidia over the past 6 years has progressively offered significantly more performance at a significantly worse price-to-performance ratio.

Oh, I don't go off FPS/$, if that's how you're measuring price/performance; I go off end-user experience. A modern midrange PC is as fast as or faster than one from a few years back, even in modern titles that push a lot more graphical work, at the same price point. DLSS (and AFAIK FSR is getting good on AMD's side) is definitely a big part of that, but raw raster is plenty fine on both Nvidia and AMD.

12 minutes ago, AluminiumTech said:

Nvidia is going back to using midrange-sized small dies for $500+ GPUs.

 

In the 40 series, at least so far, the only die I wouldn't consider a small die is AD102. AD104 is frankly the kind of die one might expect on an RTX 4060 or 4060 Ti; maybe it could be fully enabled on an RTX 4070 if we're being really charitable.

 

Nvidia should have just come out and said the 4080 12GB is the 4070, or better yet a 4060 Ti, and offered it for $399. It would be a bargain at that price; who wouldn't want one? Basically 3080 performance for, in theory, $300 less.

 

I don't know why dies are brought up. I assume because they influence performance, but unless you know the performance of each respective die, that's kind of useless information (as I have been saying, we haven't seen the midrange 40 series yet, so this is a lot of moaning over something we don't even have hard data on).

13 minutes ago, AluminiumTech said:

Instead we've got a 4080 12GB that barely outperforms a 3080 10GB and shouldn't be called a 4080, as well as the 4080 16GB, the real 4080, which is overpriced and should cost what the 4080 12GB costs or arguably less.

The 1060 3GB or 6GB shouldn't have been called a 1060, nor the 3080 10GB and 12GB, and if you wanna get technical, a lot of the Titans were just "80 Ti with more VRAM and like 5% better core performance" as well. Nvidia has been doing this for years, so I don't know why people get so offended by it every time there's a new generation. I think I got grumpy over the 20 series as well, but I've sort of given up on corporations ever naming anything clearly (the only ones to actually bother me now are the repeats of previous CPUs).

16 minutes ago, AluminiumTech said:

Respectfully disagree, though I happen to be part of that niche.

I am in it as well; I usually play at 4K.

17 minutes ago, AluminiumTech said:

AMD has been advertising x700 XT cards for it since RDNA1, though arguably you need a better-tier card for it. And Nvidia has been pushing their x70 cards for it since Pascal or Turing, and pushing their x80 and x80 Ti cards for 4K since Turing.

Nah, longer. The 980 Ti was pitched as a 4K card... the Fury X might have been as well, but I'm not as familiar with Radeon's marketing. Can't remember if Vega was pushed for 4K or not; I think the RVII was a bit, but it was mostly pushed for prosumer stuff.


1 hour ago, CHICKSLAYA said:

Yup. The midrange this gen is going to be dogshit. Better off getting a last-gen 3080 or 6900 XT for cheap. RDNA3 had better impress, or we are doomed for this gen at least.

I'm expecting the midrange to be pretty bad, at least on price-to-performance; Nvidia really hasn't focused on or cared about the midrange since the GTX 1060. The RTX 2060 was a good card, but RTX was too much of a performance hit in most games. A 3080 or 6900 XT is a much better deal, and either one should last years; it's just kind of annoying that Nvidia only allows DLSS3 and AV1 encoding on the 40 series.

The rumors on RDNA3 are that it's going to offer better performance per dollar in rasterization than the RTX 40 series, with 50% better ray tracing than RDNA2. That's maybe not nearly as good as the RTX 40 series at ray tracing, but IMO good enough, considering there aren't that many games utilizing ray tracing, or at least utilizing it well enough to be worth it.

2 hours ago, HW100 said:


Which means they rely on people with more money than sense (that's at least how I read it).
 

I think that's how it's been on the high end, with people buying the x90 Ti or Titan card, but it's a problem now with Nvidia pushing people to buy the 4090.


1 hour ago, Blademaster91 said:

The issue is the high end has gotten significantly more expensive, and there isn't anything worth buying at midrange unless you settle for a 2-year-old card or something used. While the performance floor has gone up, so has the pricing floor: an x80 card used to be $600 and Nvidia brought the x70 tier up to that level, and a 3060 Ti isn't going to be that useful if you want RTX on at 1440p. Nvidia wants people buying the x80 or x90 cards, and it's very obvious this gen with the 4090 "only" being $1600.

RTX for the last two gens hasn't been a mid-tier option; it's a high-tier option. It'll probably reach mid-tier this gen, since the RT cores are significantly stronger. You're arguing for features that weren't meant for midrange systems.


1 hour ago, Zando_ said:

Can you explain how there's nothing worth buying at midrange? Most people build at midrange and find plenty worth buying. For the 40 series, the midrange isn't even announced yet, so we have no data on whether it is or isn't worth buying, and why.

Because the midrange has jumped up in price over the past 5-6 years to what the high end was, and the midrange 30 series cards are terrible on price-to-performance compared to the higher-end cards.

And we don't know if there will even be a 40 series midrange, since Nvidia wants to dump their stock of 30 series cards.

1 hour ago, Zando_ said:

Performance has gone up more than pricing; you can get a good experience at 1440p (or even 4K under certain conditions) with a much lower-tier card than you used to. And for 1080p, a near-potato-tier card manages that fine.

That doesn't excuse the pricing doubling over the previous x80-tier card; people would be so pissed if companies did that with CPUs. And having to buy a lower tier means needing to replace it sooner, which is probably what Nvidia wants though.

1 hour ago, Zando_ said:

Entirely game dependent. Something like Cyberpunk 2077, where RTX absolutely tanks fps, it might struggle, yeah. A beautiful implementation like Metro Exodus Enhanced Edition would be fine, because most games that include RTX also include DLSS, which makes running them at good-looking settings far easier (your screenshots may suffer a bit, but in-game it looks great). Tis how I manage to run stuff at 4K with a 2060 Super. Metro Exodus EE ran like butter at very nice settings; gorgeous game. Cyberpunk's RT implementation is far too performance-heavy, so I run that with RT disabled and a blend of medium/mostly-high settings. DLSS Performance for both, so it renders internally at 1080p and then upscales, which makes games far easier to run without being crunched to hell (Ultra Performance tries to upscale a 720p internal frame to 4K, and that looks pretty shit).

 

Basically, RTX is still in a weird spot where whether it runs like shit or not is decided just as much by the game engine/dev work as by the card itself.

RTX being so game-dependent is why I still don't care much for the feature, and DLSS being required to run a game acceptably doesn't make much sense; I don't see the need to lower image quality to make things shiny. IMO, RTX shouldn't still be in such a state when Nvidia has had the tech out for a while now. They either need to implement it decently in midrange cards, or sell midrange cards at reasonable prices, because RTX isn't going to get much adoption or good optimization from game devs unless it's usable for most people.

1 hour ago, Zando_ said:

That's no revelation; companies want people to buy their higher-margin products.

Nvidia is getting too greedy when they're pushing people to buy an x80-tier card that really starts at $1200, while the 3080 launched at $700.

Also, it's Nvidia making those margins while AIBs take a loss on the higher-end cards, which makes it seem even more scummy; but it's not a surprise EVGA left, with how massive and power-hungry the 4090 is.


2 minutes ago, Blademaster91 said:

Because the midrange has jumped up in price over the past 5-6 years to what the high end was, and the midrange 30 series cards are terrible on price-to-performance compared to the higher-end cards.

I guess that's fair. Midrange PCs now are $1000-1500, so they still fit those cards; not sure what midrange was during Maxwell, but from Pascal to now it's been pretty similar. I think there was a stretch of time where you could get something pretty capable for $800, but IIRC that leveraged a pretty good used market in the US as well.

3 minutes ago, Blademaster91 said:

And we don't know if there will even be a 40 series midrange, since Nvidia wants to dump their stock of 30 series cards.

I doubt they'd just not make 'em at all. But yeah, they could hold them for a while to try to sell off 30 series cards. Guess we wait and see.

6 minutes ago, Blademaster91 said:

That doesn't excuse the pricing doubling over the previous x80-tier card;

Weren't 3080s going for that price anyway, as others have pointed out, regardless of the actual MSRP? Though fair, the MSRP hike is spicy; I wouldn't be surprised if they slash it once (or if) they have actual competition.

4 minutes ago, Blademaster91 said:

and having to buy a lower tier means needing to replace it sooner, which is probably what Nvidia wants though.

Yeah, the way a lot of modern publicly traded companies operate, they want repeat customers over making the best product they can. That's not a new thing though, tbf; older midrange cards have not aged gracefully either. I think the legendary 1060 is still hanging on in the Steam hardware surveys, but that's mostly it.

8 minutes ago, Blademaster91 said:

RTX being so game-dependent is why I still don't care much for the feature, and DLSS being required to run a game acceptably doesn't make much sense; I don't see the need to lower image quality to make things shiny. IMO, RTX shouldn't still be in such a state when Nvidia has had the tech out for a while now. They either need to implement it decently in midrange cards, or sell midrange cards at reasonable prices, because RTX isn't going to get much adoption or good optimization from game devs unless it's usable for most people.

It is implemented decently in most midrange cards; it just depends on dev implementation, plus being inherently rough as we brute-force a solution to more realistic lighting. It doesn't always lower game quality, btw: I much prefer Metro Exodus with the ray-traced lighting engine over a native internal render resolution that may look sharper in some scenarios, or in screenshots. The reason DLSS is often required with RTX is just inherent to RT though, AFAIK: the RT process adds a lot of time per frame, and cutting to a lower internal render res lets that work go more quickly and get a frame out in a lower ms count. Tis why the refinements in Ampere and Ada Lovelace have focused on reducing that latency addition from RT processing, or finding ways to do other work while waiting on it, bringing down the overall time to render a frame. But yeah, until they've refined it to the point an XX60 will run RT playably at native res, or very well with DLSS, it won't be a main selling feature.
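To make that concrete: a toy frame-time model of why cutting internal resolution pairs so well with RT (all numbers below are invented for illustration, not measurements from any real game or card):

```python
# Toy frame-time model: why DLSS pairs naturally with ray tracing.
# ALL numbers are invented for illustration; real costs vary per game.
def frame_time_ms(raster_ms_native, rt_ms_native, res_scale):
    """GPU frame time, assuming raster and RT cost scale with pixel
    count (res_scale is the per-axis DLSS factor, so pixels scale by
    res_scale**2), plus ~1 ms for the upscale pass when scaling."""
    pixels = res_scale ** 2
    upscale_cost = 1.0 if res_scale < 1 else 0.0
    return (raster_ms_native + rt_ms_native) * pixels + upscale_cost

native = frame_time_ms(12.0, 10.0, 1.0)     # native 4K + RT
dlss_perf = frame_time_ms(12.0, 10.0, 0.5)  # DLSS Performance + RT
print(f"native 4K + RT: {native:.1f} ms (~{1000 / native:.0f} fps)")
print(f"DLSS Perf + RT: {dlss_perf:.1f} ms (~{1000 / dlss_perf:.0f} fps)")
# 22 ms vs ~6.5 ms of GPU work per frame in this toy model: cutting
# the internal resolution claws back most of the RT cost.
```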

 

On that note though, DLSS 3.0 does look... kinda useless, tbh. The frame-injection fuckery it does seems similar to what a lot of "120Hz" TVs do/used to do, which gamers usually avoided on purpose. Guess I need to find a deep dive into it in slower single-player titles, if there's a video like that out yet; I doubt it'll ever be useful in fast-paced games. Does feel like a niche feature pushed to the forefront to inflate performance numbers.

15 minutes ago, Blademaster91 said:

Nvidia is getting too greedy when they're pushing people to buy an x80-tier card that really starts at $1200, while the 3080 launched at $700.

Also, it's Nvidia making those margins while AIBs take a loss on the higher-end cards, which makes it seem even more scummy; but it's not a surprise EVGA left, with how massive and power-hungry the 4090 is.

Nvidia's a publicly traded corporation; they've always been greedy, same with their competitors. See how Intel sat on their ass until actually challenged, or how, once they became a true challenger, AMD hiked their prices and basically murdered TRX40 HEDT, even after promising better support when they launched it (I believe WS/EPYC stuff just makes them so much more money). If people can't afford the hardware (or refuse to buy it in large enough numbers), it won't sell, and they'll reassess their strategy.

 


22 minutes ago, Blademaster91 said:

Because the midrange has jumped up in price over the past 5-6 years to what the high end was, and the midrange 30 series cards are terrible on price-to-performance compared to the higher-end cards.

And we don't know if there will even be a 40 series midrange, since Nvidia wants to dump their stock of 30 series cards.

That doesn't excuse the pricing doubling over the previous x80-tier card; people would be so pissed if companies did that with CPUs. And having to buy a lower tier means needing to replace it sooner, which is probably what Nvidia wants though.

RTX being so game-dependent is why I still don't care much for the feature, and DLSS being required to run a game acceptably doesn't make much sense; I don't see the need to lower image quality to make things shiny. IMO, RTX shouldn't still be in such a state when Nvidia has had the tech out for a while now. They either need to implement it decently in midrange cards, or sell midrange cards at reasonable prices, because RTX isn't going to get much adoption or good optimization from game devs unless it's usable for most people.

Nvidia is getting too greedy when they're pushing people to buy an x80-tier card that really starts at $1200, while the 3080 launched at $700.

Also, it's Nvidia making those margins while AIBs take a loss on the higher-end cards, which makes it seem even more scummy; but it's not a surprise EVGA left, with how massive and power-hungry the 4090 is.

The 4080 starts at $900. No amount of make-believe entitlement makes it start at $1200. Don't just make shit up.

I do think Nvidia screwed up pricing the 4080 at $900, because they could sell it for $800 and still make a profit. A 4070 at $550 sounds reasonable to me, but that's a massive gap between $900 and $550.


Love the comments based on just pure raster performance, it's hilarious.

 

More RT and Tensor cores

SER (Shader Execution Reordering)

Optical Flow Accelerator

AV1 encoding

And many other things

 

It's like you guys can't handle change and the direction GPUs are going to go because of old-school raster.

 

Let's compare what they are capable of completely, seriously.

 

Also, price-wise I have no complaints considering the other features and included hardware, but I will say the little 4080 is a huge screw-up on Nvidia's part, really anti-consumer.


Damn, someone sure likes to kiss, lick, and smell Jensen's feet while he's wearing a leather jacket in his rendered kitchen lol.

 

A performance gap of 30% between a 4080 and a 4080 is quite big. A big screw-up and very anti-consumer.

