
NVIDIA Project Beyond GTC Keynote with CEO Jensen Huang: RTX 4090 + RTX 4080 Revealed

3 minutes ago, starsmine said:

Just so people are aware of GPU costs:
TSMC N7 is ~$9k a wafer
TSMC N4 is closer to $16k per wafer

For these 500mm²+ sized chips, THAT'S HUGE.
With AIBs already unable to make money AT MSRP on the 30 series, that price increase has to exist.

 

Also, remember that the 3000 series was built on Samsung's 8nm, which was even cheaper (and worse) than TSMC's 7nm, making this generation's cost jump even bigger.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


10 minutes ago, starsmine said:

Just so people are aware of GPU costs:
TSMC N7 is ~$9k a wafer
TSMC N4 is closer to $16k per wafer

For these 500mm²+ sized chips, THAT'S HUGE.
With AIBs already unable to make money AT MSRP on the 30 series, that price increase has to exist.

 

I doubt AIBs are making any money even with the price increases; the leaks so far show AIB cards with massive 4-slot coolers, and Nvidia can still undercut them with FE cards.

9 minutes ago, cj09beira said:

No card below 900 dollars, to help sell the 3000 series cards, no doubt.

Not too surprising; Nvidia still wants to make crypto-mining levels of profit since people bought cards at inflated prices. The sad thing is that the 4080 12GB isn't a real x80 card, as the specs are cut down a lot compared to the 4080 16GB, so the 4070 will probably be more like an x60 card.

I'm more excited for AMD RDNA3 cards at this point, especially with EVGA out of the market; hopefully those use regular 8-pin and 6-pin power connectors.


10 minutes ago, Quackers101 said:

Oh god, the gameplay sections for DLSS looked awful; either something is wrong or the resolution was just that bad.

The DLSS versions look blurrier than current DLSS, or like god-awful performance mode.

- 1080p

- live

- YT

 

oMg iT lOoKs LIkE sHiT

 

🤦‍♂️

 

 

 

 


I'm buried in meetings all morning and haven't had a moment to read up or watch the vid: has Nvidia said whether the 4090 is using 1GB or 2GB chips for the VRAM? IOW: will the RAM be on one side of the card or both?

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


11 minutes ago, Dracarris said:

how many (functioning) GPU dies do you roughly get out of one N4 wafer?

It can be estimated, but we have many unknowns. For a 300mm wafer, assuming a defect density of 0.1/cm² for a 25x20mm die (500mm²) gives 62% yield, or 66 perfect dies per wafer. Yield could be improved by deactivating affected parts of the die for cut-down models, and this doesn't include further binning for performance reasons. The 0.1/cm² figure was given for the N5 process 2 years ago, so it might have improved since then, and N4 is an evolution of N5, as opposed to a more radical change like N3.
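A minimal sketch of that estimate, using the common dies-per-wafer approximation plus Murphy's yield model (calculator tools differ a little in how they count edge dies, so treat the exact count loosely):

```python
# Rough die-count estimate: standard dies-per-wafer approximation plus
# Murphy's defect-density yield model. Inputs match the assumptions above.
import math

def dies_per_wafer(wafer_d_mm: float, die_area_mm2: float) -> float:
    # pi*R^2/A minus an edge-loss correction term
    return (math.pi * (wafer_d_mm / 2) ** 2 / die_area_mm2
            - math.pi * wafer_d_mm / math.sqrt(2 * die_area_mm2))

def murphy_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    ad = (die_area_mm2 / 100) * defects_per_cm2  # expected defects per die
    return ((1 - math.exp(-ad)) / ad) ** 2

area = 25 * 20                                 # 500 mm^2 die
candidates = dies_per_wafer(300, area)
good = candidates * murphy_yield(area, 0.1)    # 0.1 defects/cm^2
print(f"~{candidates:.0f} candidate dies, {murphy_yield(area, 0.1):.0%} yield, ~{good:.0f} good dies")
# Prints roughly 112 candidates, 62% yield, ~69 good dies; calculators that
# place rectangular dies explicitly land closer to 66.
```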

 

3 minutes ago, Blademaster91 said:

Nvidia still wants to make crypto-mining levels of profit since people bought cards at inflated prices

The real test will be RDNA3 pricing. It may well be that this level of pricing is required to be reasonably profitable, and AMD will price similarly. If AMD undercuts Nvidia, then the question becomes how much Nvidia adjusts.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


6 minutes ago, porina said:

It can be estimated, but we have many unknowns. For a 300mm wafer, assuming a defect density of 0.1/cm² for a 25x20mm die (500mm²) gives 62% yield, or 66 perfect dies per wafer. Yield could be improved by deactivating affected parts of the die for cut-down models, and this doesn't include further binning for performance reasons. The 0.1/cm² figure was given for the N5 process 2 years ago, so it might have improved since then, and N4 is an evolution of N5, as opposed to a more radical change like N3.

Okay, that would amount to roughly $250 per perfect die, bringing the total BoM cost for a card to somewhere in the $500 region. That leaves $700 to $1100 of headroom for the combined profits of Nvidia, AIBs and distributors on the two high-end cards. Not bad, and still wiggle room for price reductions.

 

Edit: They also need to recoup the mask costs for ES and production runs, and one mask set is AFAIK in the 7 figures with these ultra-scaled processes. However, at the volumes they are moving, this is more or less a cashflow problem, which does not exist for Nvidia.
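For anyone following along, here is the arithmetic behind those numbers; every input is an assumption from this thread, not a confirmed figure:

```python
# Back-of-the-envelope headroom estimate. All inputs are thread assumptions:
# wafer price, good dies per wafer, and a guessed total BoM per card.
WAFER_PRICE_USD = 16_000   # assumed TSMC N4 wafer price
GOOD_DIES = 66             # perfect dies per wafer from the yield estimate
ASSUMED_BOM_USD = 500      # guessed bill of materials per card

die_cost = WAFER_PRICE_USD / GOOD_DIES
print(f"die cost: ~${die_cost:.0f}")   # ~$242 per good die

for card, msrp in {"RTX 4080 16GB": 1199, "RTX 4090": 1599}.items():
    print(f"{card}: ~${msrp - ASSUMED_BOM_USD} headroom over the assumed BoM")
```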


41 minutes ago, Senzelian said:

- 1080p

- live

- YT

Nvm, it didn't look as bad. Also, that wasn't really the issue; both videos were compressed.

It was more about how the videos differed in noise and in how edges and text were handled; you can notice some changes or differences depending on how you compare them. Of course it's nothing like a real review or hands-on with the tech.

 

Also note, it's not all DLSS's fault; some of it could be cheaper, lower-quality ray tracing, and it also depends on what AA they use.

 

Judging by their Cyberpunk video, it seems like there are still some ghosting issues, but those could maybe be reduced with updates?


"quantum leap forward" - so the smallest discrete step imaginable. It's beyond me how things like this can end up in the keynote of a multi-billion dollar company.

But kudos to Nvidia for not being buttheads (like some other companies) and allowing people to jump back to any point of the stream, rewatching it or starting late with 1.5x the speed (in 2022 this should be taken for granted but sadly it's not a matter of course).

 

On first glance, we are beyond 1080p and even 1440p gaming and the 4090 only makes sense for 2160p gaming or multi-monitor setups.

I don't like the 4080 segregation. Just call the middle child a 4080 Ti, especially if not only the VRAM configuration is different, but also the compute performance. This is unnecessary confusing for costumers. And how are retailers supposed to communicate a $300 price difference for the very same model to their costumers? WTH, Nvidia?


12 minutes ago, Dracarris said:

Okay, that would amount to roughly $250 per perfect die, bringing the total BoM cost for a card to somewhere in the $500 region. That leaves $700 to $1100 of headroom for the combined profits of Nvidia, AIBs and distributors on the two high-end cards. Not bad, and still wiggle room for price reductions.

Edit: They also need to recoup the mask costs for ES and production runs, and one mask set is AFAIK in the 7 figures with these ultra-scaled processes. However, at the volumes they are moving, this is more or less a cashflow problem, which does not exist for Nvidia.

No, they aren't getting close to that profit.
$16k was also out-of-date data; TSMC has increased prices over 10% since then due to demand, and the exact current number just isn't known.
Your total BoM is also a massive underestimation: GDDR6X is still around $13 per gigabyte.
Nvidia tries to get 50-60% margins, so they buy a ~$275 die from TSMC and then sell it to the AIB for ~$440. Memory will be another ~$300 for the AIB.
Cooler costs are up because of 450W, shipping costs are up because metal is HEAVY and the boxes are bigger, and VRM costs are up because of 450W.
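Put roughly into numbers (all of these are estimates from this thread, not a confirmed BoM):

```python
# Sketch of the cost stack described above for a 4090-class board.
# Every figure is a rough estimate; "50-60% margins" is read as markup here.
die_cost_to_nvidia = 275                  # USD, estimated AD102 die cost
nvidia_markup = 0.60
die_price_to_aib = die_cost_to_nvidia * (1 + nvidia_markup)   # ~$440

gddr6x_per_gb = 13                        # USD per GB, estimate
memory_cost = gddr6x_per_gb * 24          # 24 GB on a 4090 -> ~$312

print(f"die to AIB: ~${die_price_to_aib:.0f}, memory: ~${memory_cost}")
print(f"die + VRAM alone: ~${die_price_to_aib + memory_cost:.0f}")
# Cooler, PCB, VRM and shipping for a 450 W card all come on top of this.
```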


4 minutes ago, HenrySalayne said:

"quantum leap forward" - so the smallest discrete step imaginable. It's beyond me how things like this can end up in the keynote of a multi-billion dollar company.

But kudos to Nvidia for not being buttheads (like some other companies) and allowing people to jump back to any point of the stream, rewatching it or starting late with 1.5x the speed (in 2022 this should be taken for granted but sadly it's not a matter of course).

 

On first glance, we are beyond 1080p and even 1440p gaming and the 4090 only makes sense for 2160p gaming or multi-monitor setups.

I don't like the 4080 segregation. Just call the middle child a 4080 Ti, especially if not only the VRAM configuration is different, but also the compute performance. This is unnecessary confusing for costumers. And how are retailers supposed to communicate a $300 price difference for the very same model to their costumers? WTH, Nvidia?

I took it as something like "the leap quantum computers made vs. normal computers", although even then we still have no usable quantum computers, so...


So we start to get synthetic frames, as in frames not rendered but guessed, and Nvidia is running out of numbers. I would love to see official betting on how many RTX 4050s we're going to get, and how many of those could actually even be called RTX cards 🤣


k then, Nvidia keep your secrets.

 

[screenshot attached]

 

9 minutes ago, starsmine said:

Your total BoM is also a massive underestimation: GDDR6X is still around $13 per gigabyte.

Holy shizz, I didn't realize GDDR6X was that expensive. That makes a ~$600 BoM for the GPU die and VRAM alone on a 4090. Add the cooler, PCB and all the other components (which include many expensive parts like VRM controllers, inductors, larger caps, and quite a bit of electromechanical stuff) and we are probably at $800.

So yeah, roughly 50% total margin.
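Sanity-checking that with the thread's own guesses for a $1599 4090:

```python
# All figures below are guesses carried over from the posts above.
die_cost = 275                 # USD, at Nvidia's cost
memory_cost = 13 * 24          # GDDR6X at ~$13/GB, 24 GB -> ~$312
other_parts = 200              # assumed cooler, PCB, VRM, misc.

bom = die_cost + memory_cost + other_parts   # ~$787
msrp = 1599
print(f"BoM ~${bom}, gross margin ~{(msrp - bom) / msrp:.0%}")   # ~51%
```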


What a joke. The 4080 12GB is clearly a mid-range card rebranded as "80-class", and they're charging $900+ for it.
 

How is a 192-bit card anything other than a 60-class card, looking at it historically? We've not even seen 70-class cards at 192-bit. Are they doing some sort of Infinity Cache-style deal to compensate? Otherwise I fail to see how this 4080 won't be problematic at higher resolutions with such a narrow memory bus.
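For context on why the bus width matters: peak memory bandwidth is just bus width times per-pin data rate. The memory speeds below are assumptions for illustration, not confirmed specs:

```python
# Peak bandwidth = (bus width in bits / 8) * data rate per pin (Gbps) -> GB/s
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = {
    "4080 12GB (192-bit, assumed 21 Gbps)": (192, 21.0),
    "4080 16GB (256-bit, assumed 22.4 Gbps)": (256, 22.4),
    "3080 10GB (320-bit, 19 Gbps)": (320, 19.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# A much larger L2 cache (an Infinity Cache-style approach) would offset part
# of the raw-bandwidth gap; presumably that's the bet here.
```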

Zen 3 Daily Rig (2022 - Present): AMD Ryzen 9 5900X + Optimus Foundations AM4 | Nvidia RTX 3080 Ti FE + Alphacool Eisblock 3080 FE | G.Skill Trident Z Neo 32GB DDR4-3600 (@3733 c14) | ASUS Crosshair VIII Dark Hero | 2x Samsung 970 Evo Plus 2TB | Crucial MX500 1TB | Corsair RM1000x | Lian Li O11 Dynamic | LG 48" C1 | EK Quantum Kinetic TBE 200 w/ D5 | HWLabs GTX360 and GTS360 | Bitspower True Brass 14mm | Corsair 14mm White PMMA | ModMyMods Mod Water Clear | 9x BeQuiet Silent Wings 3 120mm PWM High Speed | Aquacomputer Highflow NEXT | Aquacomputer Octo

 

Test Bench: 

CPUs: Intel Core 2 Duo E8400, Core i5-2400, Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS

Motherboards: ASUS Z97-Deluxe, EVGA Z490 Dark, EVGA Z790 Dark Kingpin

GPUs: GTX 275 (RIP), 2x GTX 560, GTX 570, 2x GTX 650 Ti Boost, GTX 980, Titan X (Maxwell), 2x HD 6850

Bench: Cooler Master Masterframe 700 (bench mode)

Cooling: Heatkiller IV Pro Pure Copper | Koolance GPU-210 | HWLabs L-Series 360 | XSPC EX360 | Aquacomputer D5 | Bitspower Water Tank Z-Multi 250 | Monsoon Free Center Compressions | Mayhems UltraClear | 9x Arctic P12 120mm PWM PST


It is only 30% faster in rasterization compared to the 3090 Ti

 

fail.

Yeah, we're all just a bunch of idiots experiencing nothing more than the placebo effect.

Here are the various RTX 40 series AIB cards:

 

[Card images attached: an overview render, followed by cards from Gigabyte, MSI, Palit, Gainward, PNY, Colorful, Inno3D, Zotac (turns out the leak was very accurate), Galax and ASUS]

 

https://www.techpowerup.com/299057/gigabyte-launches-its-latest-aorus-graphics-cards-based-on-nvidia-geforce-rtx-40-series

https://videocardz.com/newz/nvidia-introduces-geforce-rtx-4090-4080-series-rtx-4090-launches-october-12th-for-1599-usd

https://www.techpowerup.com/299064/colorful-introduces-next-generation-geforce-rtx-4090-and-rtx-4080-graphics-cards

https://www.techpowerup.com/299076/asus-launches-geforce-rtx-40-series-rog-strix-and-tuf-gaming-graphics-cards

https://www.techpowerup.com/299068/zotac-announces-next-generation-geforce-rtx-40-series-graphics-cards

https://www.techpowerup.com/299062/msi-unveils-its-first-custom-nvidia-geforce-rtx-40-series-graphics-cards

https://www.techpowerup.com/299060/palit-announces-geforce-rtx-40-gamerock-and-gamingpro-series

https://www.techpowerup.com/299063/pny-introduces-next-evolution-nvidia-geforce-rtx-40-series-gpus

https://www.techpowerup.com/299075/galax-introduces-its-geforce-rtx-40-series-graphics-card-family

https://www.techpowerup.com/299065/inno3d-geforce-rtx-40-series-is-here-rtx-4090-and-rtx-4080

https://www.techpowerup.com/299059/msi-unveils-first-custom-nvidia-geforce-rtx-40-series

https://www.techpowerup.com/299061/gainward-releases-its-rtx-40-series-graphics-cards


5 minutes ago, GoodBytes said:

If anyone wonders:

God almighty, that is a thicc card. Thankfully the I/O is single-height, so once it's got a waterblock on, it's not a problem.

Our Grace. The Feathered One. He shows us the way. His bob is majestic and shows us the path. Follow unto his guidance and His example. He knows the one true path. Our Saviour. Our Grace. Our Father Birb has taught us with His humble heart and gentle wing the way of the bob. Let us show Him our reverence and follow in His example. The True Path of the Feathered One. ~ Dimboble-dubabob III


33 minutes ago, rcarlos243 said:

It is only 30% faster in rasterization compared to the 3090 Ti

 

fail.

Source? 

I am fairly sure Nvidia stated the 4090 has roughly twice the rasterization performance of the 3090 Ti, and 4 times the performance in ray tracing.


2 hours ago, BiG StroOnZ said:

Shader Execution Reordering (SER) that improves execution efficiency by rescheduling shading workloads on the fly to better utilize the GPU’s resources.

OOO execution means speculative execution. Speculative execution means we now get to worry about Spectre-type vulnerabilities on our GPUs too. What fun.


2 hours ago, schwellmo92 said:

RTX 4080 12GB should really be the RTX 4070 but is called the 4080 so they can say the 4080 starts at $899 and make it look like prices haven’t gone up much. Change my mind.

I am beyond furious; it is the same crap they pulled with the GT 730 and the GT 1030 DDR4 vs GDDR5 versions. It is a totally different die, but they gave it the same name to engage in anti-consumer behavior, because Nvidia.

AD103 vs AD104

9728 vs 7680 CUDA cores

256-bit vs 192-bit memory bus

THEY ARE ACTUALLY CHARGING $899 FOR AN xx70-TIER DIE, amazing.

I am scared of what die they could even put in the 4070 now that they have already used AD104. Truly the worst new GPU generation I have seen. The 4080 12GB costs 1129€ in Europe, OVER 370€ MORE than the 3080 launched for. Nvidia really didn't expect mining to crash so soon, eh? Not to even talk about the recession and massive inflation; I don't know how Nvidia missed that.

I only see your reply if you @ me.

This reply/comment was generated by AI.


2 minutes ago, Origami Cactus said:

Truly the worst new GPU generation I have seen. The 4080 12GB costs 1129€ in Europe, OVER 370€ MORE than the 3080 launched for. Nvidia really didn't expect mining to crash so soon, eh? Not to even talk about the recession and massive inflation; I don't know how Nvidia missed that.

Let's just hope market pressure/AMD will show them their place and they'll be unable to move GPUs anywhere near those prices.

