NVIDIA GeForce RTX 4090 Benchmark Leak Shows A 65% Gain Over RTX 3090 Ti (Updated)

Summary

Somebody has tested the RTX 4090 with the Geekbench CUDA benchmark. They ran it at least twice, and the two results were very close: 417,713 points and 424,332 points. This particular RTX 4090 was tested on AMD's latest X670E platform equipped with a Ryzen 9 7950X and 32GB of DDR5-6000 memory.

 

[Image: RTX 4090 Geekbench CUDA result]

 

[Image: RTX 3090 Ti Geekbench CUDA result]

 

 

Quotes

Quote

Today's leak comes courtesy of the tireless retina of the Benchleaks bot on Twitter.

 

Comparing this result to the RTX 4090 gives us roughly a 65% uplift over the previous-generation RTX 3090 Ti and 78% uplift over the RTX 3090. That's an absolutely crazy uplift in performance, and good news for folks that use CUDA-based software for production work.

 

Geekbench CUDA is of course a measurement of heavyweight (data-center-style) workloads, so professional users will be more keen on these results than gamers. But it's still interesting for all parties to see the relative power. There are some who believe this is a good pointer towards the ballpark performance we will get for standard gaming – meaning rasterized, as opposed to ray-traced.

 

My thoughts

I think this is a good showing of RTX 4090 performance: about 1.8x faster than an RTX 3090 and about 1.6x faster than a 3090 Ti. While this workload is heavily compute-related, it still gives us a rough idea of raster performance. The uplift looks about average for a generational jump, except that here a 3090 Ti is being compared to a 4090. If the 3090 Ti were compared to a 4090 Ti with the full-fat AD102 die, the increase would probably be around 90% (if the rumored specs for full-fat AD102 are correct).

We have 10 days until the RTX 4090 officially launches, so I'm expecting better leaks to come out before then that give us a clearer idea of gaming performance. Some might be disappointed by these numbers in light of NVIDIA's own figures, but this is a very specific benchmark that doesn't directly indicate gaming performance. Some of these sources include a chart of current NVIDIA CUDA API Geekbench ratings featuring the 3080 Ti, 3090, A100, and 3090 Ti, and the 4090's result trumps them all.
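As a quick sanity check on those multipliers, here's the arithmetic behind them. A minimal sketch in Python; the 4090 score is the leaked result above, while the Ampere baseline scores are ballpark assumptions inferred from the reported uplifts, not values taken from the linked pages:

    # RTX 4090 leaked Geekbench CUDA score (the lower of the two runs)
    rtx_4090 = 417_713
    # Baselines are assumed ballpark figures, not confirmed scores
    rtx_3090_ti = 252_000   # assumed RTX 3090 Ti score
    rtx_3090 = 234_000      # assumed RTX 3090 score

    print(f"vs 3090 Ti: {rtx_4090 / rtx_3090_ti:.2f}x")  # ~1.66x, i.e. ~65% uplift
    print(f"vs 3090:    {rtx_4090 / rtx_3090:.2f}x")     # ~1.79x, i.e. ~78% uplift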

 

Sources

https://www.guru3d.com/news-story/nvidia-geforce-rtx-4090-tested-in-benchmark-60-faster-than-rtx-3090-ti.html

https://www.techradar.com/news/nvidia-rtx-4090-gpu-benchmark-leak-has-some-folks-disappointed

https://videocardz.com/newz/geforce-rtx-4090-is-60-faster-than-rtx-3090-ti-in-geekbench-cuda-test

https://hothardware.com/news/rtx-4090-benchmark-leak-60-percent-over-3090-ti

https://browser.geekbench.com/v5/compute/5596628

https://browser.geekbench.com/v5/compute/5094052

 

Small update to this story:

 

Summary

3DCenter.org released an article examining the scaling between this Geekbench benchmark and gaming performance at 4K. They compared the 2080 Ti to the 3090 and found that the gains in the Geekbench results scale very similarly to the gains in 4K gaming results.

 

Quotes

Quote

[Chart: 3DCenter's comparison of Geekbench CUDA scores and 4K gaming performance]

 

The benchmark itself may not be relevant for most users, but interestingly, the performance scaling - at least in the previous nVidia generation - is quite close to that seen in gaming benchmarks.

 

The CUDA benchmark for the GeForce RTX 3090 used here dates from September 2020 - but it maps exactly onto the current situation of the GeForce RTX 4090. In the meantime, the performance of the GeForce RTX 3090 has increased slightly both in Geekbench and in games, but the performance ratio has remained the same - the Geekbench result is still surprisingly close to the gaming performance at 4K resolution.

 

This Geekbench result can therefore be interpreted as a further indication that the basic performance of the GeForce RTX 4090 is around +60% higher than that of the GeForce RTX 3090 Ti. It remains open whether the greatly enlarged L2 cache helps the Ada graphics chips more in games than in theoretical benchmarks such as Geekbench.

 

My thoughts

This is interesting to see, as the scaling between gaming performance and the Geekbench result is quite close. That means it's quite possible the pure raster performance of the 4090 really will be close to 60-65% higher than a 3090 Ti's. And the increased L2 cache in Ada might push performance further still.

 

Sources

https://www.3dcenter.org/news/news-des-12-oktober-2022


If this scaling holds throughout the whole RTX 4xxx series, and for example an RTX 4060 is 1.6x more powerful than an RTX 3060, it might mean that lower-end cards will no longer be 1080p cards but 1440p ones, and even 4K cards for the Ti versions!

 

Example: the RTX 3060 averages 41 fps in Elden Ring at 4K Ultra, so a 4060 would achieve over 60 fps at 4K Ultra (41 × 1.6 ≈ 66 fps).

 

It's interesting to see where we are headed. Gaming on a 4K 60Hz TV on an entry-level card would be amazing.


It's just too bad this doesn't carry over to the rest of the 4000 series. Purposely gimped performance for the 4080, just so they can later make a 4080 Ti out of what should have been the 4080 in the first place, but here we are. Don't expect this to translate to other cards; maybe the Ti models, but I just don't see it working with what they've announced so far for everything BUT the 4090.


42 minutes ago, Brandi93 said:

If this scaling holds throughout the whole RTX 4xxx series, and for example an RTX 4060 is 1.6x more powerful than an RTX 3060, it might mean that lower-end cards will no longer be 1080p cards but 1440p ones, and even 4K cards for the Ti versions!

 

Example: the RTX 3060 averages 41 fps in Elden Ring at 4K Ultra, so a 4060 would achieve over 60 fps at 4K Ultra (41 × 1.6 ≈ 66 fps).

 

It's interesting to see where we are headed. Gaming on a 4K 60Hz TV on an entry-level card would be amazing.

The 4080 8GB will probably be double the price of the 3060 though.


Are those 3090 and Ti scores stock? Or are those the fastest ones in the world, with like LN2?


3 minutes ago, leadeater said:

Stock, if you believed the rumors of the RTX 4090 being 2x-4x faster than last gen then you needed to lay off breathing in the LN2 🙃 lol

But it smells soooooo good lol.


Wow, I'm extremely impressed. If they can keep them in stock and I can get one within a year of launch, I might be changing my tune on these cards.


14 hours ago, Brandi93 said:

If this scaling holds throughout the whole RTX 4xxx series, and for example an RTX 4060 is 1.6x more powerful than an RTX 3060, it might mean that lower-end cards will no longer be 1080p cards but 1440p ones, and even 4K cards for the Ti versions!

 

Example: the RTX 3060 averages 41 fps in Elden Ring at 4K Ultra, so a 4060 would achieve over 60 fps at 4K Ultra (41 × 1.6 ≈ 66 fps).

 

It's interesting to see where we are headed. Gaming on a 4K 60Hz TV on an entry-level card would be amazing.

Tbh ultra is stupid at 4K. You're better off using high settings, which usually look very close to ultra and are hard to tell apart from it, while performing much better. Honestly, ultra has made zero sense at basically any resolution, except when you are so CPU-bottlenecked that ultra vs high makes no difference in terms of performance.


1 hour ago, Brooksie359 said:

Tbh ultra is stupid at 4K. You're better off using high settings, which usually look very close to ultra and are hard to tell apart from it, while performing much better. Honestly, ultra has made zero sense at basically any resolution, except when you are so CPU-bottlenecked that ultra vs high makes no difference in terms of performance.

Oh I totally agree; I myself play on high rather than ultra, even at 1080p. I was just arguing that a 4060 might push 4K60 even at ultra 😉


16 hours ago, BiG StroOnZ said:

I think this is a good showing of RTX 4090 performance: about 1.8x faster than an RTX 3090 and about 1.6x faster than a 3090 Ti.

What a coincidence - the 4090 is also roughly 65% more expensive than a 3090 Ti on the European market.

Take this offer for example: a water-cooled 3090 Ti for 1199€, while the 4090 (FE) is supposed to launch at 1949€.

[Screenshot: water-cooled RTX 3090 Ti listed for 1199€]

 

 


Be very careful trying to compare compute results across generations and relating them to gaming. Since I started paying attention around Maxwell, every generation has brought with it a significant uplift in compute performance relative to gaming performance. I've seen this across several generations:

980 Ti and 1070 are about same gaming performance, but 1070 is much faster compute.

2070 is faster than 1080 Ti in compute, but not in raster gaming.

2080 Ti and 3070, similar gaming perf, but in compute 3070 is way faster.

 

I don't know if Geekbench behaves like the apps I used in the past, though, so try looking for generation-on-generation changes in it and see if what I observed holds there as well.


To be completely honest, I'm not buying a 4090. I've heard that during testing the power connector started to melt. Plus it's more expensive than AMD's response, so it looks like team red might win this one.


To be fair, with how the card market currently is, I'm not sure I'd want to give them my money even if it were 3x the performance.


48 minutes ago, HenrySalayne said:

What a coincidence - the 4090 is also roughly 65% more expensive than a 3090 Ti on the European market.

 

Take this offer for example: a water-cooled 3090 Ti for 1199€, while the 4090 (FE) is supposed to launch at 1949€.

 

[Screenshot: water-cooled RTX 3090 Ti listed for 1199€]

 

Yeah, but in the American market it is only 39% more expensive. So probably just a coincidence (probably). 


35 minutes ago, BiG StroOnZ said:

Yeah, but in the American market it is only 39% more expensive. So probably just a coincidence (probably). 

I didn't mean it in a conspiracy-theory way, but rather that the performance/€ is identical, if not worse, for the new generation.

That's a hard pill to swallow. I hope we'll see price drops quite early into the new generation.


43 minutes ago, BiG StroOnZ said:

Yeah, but in the American market it is only 39% more expensive. So probably just a coincidence (probably). 

Probably. EU VAT is usually around 21%, and as it happens 1.39 × 1.21 ≈ 1.68. The 3090 Ti launched at $1999/€2249 here, so in terms of launch price it's not that far off.


6 minutes ago, tikker said:

Probably. EU VAT is usually around 21%, and as it happens 1.39 × 1.21 ≈ 1.68.

Firstly, there is no single EU VAT; it varies from country to country.

Secondly, your maths is completely off. You can't just apply VAT on top of a percentage difference. I compared the 1199€ for a 3090 Ti (including VAT, the current price you actually pay) to the 4090's launch price of 1949€ (including VAT). The price difference between these two models is identical whether you calculate it with or without VAT.
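That invariance is easy to check. A minimal sketch in Python, assuming Germany's 19% VAT purely for illustration (the exact rate doesn't matter - it cancels out of the ratio):

    # Both prices from this thread include VAT
    gross_3090_ti = 1199
    gross_4090 = 1949
    vat = 1.19  # assumed German VAT of 19%; any rate cancels out

    print(f"{gross_4090 / gross_3090_ti:.2f}")                  # ~1.63 incl. VAT
    print(f"{(gross_4090 / vat) / (gross_3090_ti / vat):.2f}")  # ~1.63 excl. VAT, identical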


16 minutes ago, HenrySalayne said:

Firstly, there is no single EU VAT; it varies from country to country.

Yeah, it varies by country; I should have worded that better. I meant that it's generally around that percentage.

16 minutes ago, HenrySalayne said:

Secondly, your maths is completely off. You can't just apply VAT on top of a percentage difference.

There's a multiplication sign there. I'm not adding anything.

16 minutes ago, HenrySalayne said:

I compared the 1199€ for a 3090 Ti (including VAT, the current price you actually pay) to the 4090's launch price of 1949€ (including VAT).

I know you did. That's what one needs to look at when buying, of course, but from a pure launch-price perspective I think it is fairer to compare launch price to launch price.

16 minutes ago, HenrySalayne said:

The price difference between these two models is identical whether you calculate it with or without VAT.

I am aware, because VAT is just another multiplication.

 

[Edit] I see what you mean now. I had assumed a tax-free US price being compared against the 65% difference relative to your current 3090 Ti price, where it would have been pure coincidence that the mentioned 39% increase times my VAT lands near 68% - but that was not the right comparison.


1 hour ago, tikker said:

I think it is fairer to compare launch price to launch price.

Not at all. The 3080 launched for 699€ and the 3090 Ti launched for 1999€, yet the 3090 Ti now sits at 1199€ while the 3080 is 799€ here in Germany (in the official Nvidia store the 3080 FE goes for 759€ while the 3090 Ti is 1329€). The two years of GPU shortage in particular showed that availability and actual market price are more important for an educated buying decision than the launch MSRP.

If the 4090 launches for over 2000€ while the 3090 Ti stays at 1200€, you might actually get more bang for the buck with the 3090 Ti, which might even end up cheaper than the 4080s while offering the same level of performance.


4 hours ago, porina said:

Be very careful trying to compare compute results across generations and relating them to gaming. Since I started paying attention around Maxwell, every generation has brought with it a significant uplift in compute performance relative to gaming performance. I've seen this across several generations:

980 Ti and 1070 are about same gaming performance, but 1070 is much faster compute.

2070 is faster than 1080 Ti in compute, but not in raster gaming.

2080 Ti and 3070, similar gaming perf, but in compute 3070 is way faster.

 

I don't know if Geekbench behaves like the apps I used in the past, though, so try looking for generation-on-generation changes in it and see if what I observed holds there as well.

 

3DCenter released this article today:

 

Quote

[Chart: 3DCenter's comparison of Geekbench CUDA scores and 4K gaming performance]

 

The benchmark itself may not be relevant for most users, but interestingly, the performance scaling - at least in the previous nVidia generation - is quite close to that seen in gaming benchmarks.

 

The CUDA benchmark for the GeForce RTX 3090 used here dates from September 2020 - but it maps exactly onto the current situation of the GeForce RTX 4090. In the meantime, the performance of the GeForce RTX 3090 has increased slightly both in Geekbench and in games, but the performance ratio has remained the same - the Geekbench result is still surprisingly close to the gaming performance at 4K resolution.

 

This Geekbench result can therefore be interpreted as a further indication that the basic performance of the GeForce RTX 4090 is around +60% higher than that of the GeForce RTX 3090 Ti. It remains open whether the greatly enlarged L2 cache helps the Ada graphics chips more in games than in theoretical benchmarks such as Geekbench.

 

https://www.3dcenter.org/news/news-des-12-oktober-2022

 

It might give you a slightly better idea of how Geekbench scales to gaming performance. I think I'm going to add this to the OP.


52 minutes ago, HenrySalayne said:

The two years of GPU shortage in particular showed that availability and actual market price are more important for an educated buying decision than the launch MSRP.

Two years of GPU shortage showed that? Common sense should've shown that 30 years ago.

Yawn. Completely skipping this gen because of Nvidia's BS anyway. The "4080" 16GB should really be a 4070 and the "4080" 12GB should really be a 4060. Look at how stripped down the 4080 16GB is compared to the 4090. It's not even on the 102 die like the 3080/3090 were.

COMPLETE

BS

 

Avoid, lol. This whole gen screams SCAM to me. It's worse than people think. The "4080" 16GB keeps being called the "real 4080" and the 12GB "4080" is being called the "fake 4080" or a 4070. It's worse than that, though. The 12GB one has only a 192-bit bus, and the 16GB one is way more stripped down from the 4090 than the 3080 was from the 3090. The x80-tier SKU should always come from the top die. Let alone the fact that the 3080 had something like 80% as many CUDA cores as the 3090. What does the 4080 16GB have? Maybe 50% as many...

 

I haven't even added insult to injury yet and discussed price. The 4090 is relatively fairly priced. But lord, those "4080"s...

 

Given record inflation, the move back to TSMC, and the overall complexity of these cards plus the development cost of DLSS 3 and so on, of course they weren't going to launch at the previous prices. But these prices are just absurd.

 

I'll be generous and call the "4080" 16GB a 4070 Ti (even though it's far more cut down from the 4090 than the 3070 Ti was from the 3090).

 

The 3070 Ti launched at $599. A fair price would be $699-$749.

 

It's priced at $1199.

 

I'll be generous and call the "4080" 12GB a 4060 Ti, even though it's more like a 4060.

 

The 3060 Ti launched at $399. A fair price for this card would really be something like $499-$549, and even that is pushing it.

 

This Gen is ABSURD.

 

Leather jacket man has lost his mind.

 

VOTE

 

WITH

 

YOUR 

 

WALLETS

 

In a vacuum, the 4090 is impressive based on performance alone; that's my only contribution to the actual topic lol.


21 hours ago, Brandi93 said:

If this scaling holds throughout the whole RTX 4xxx series, and for example an RTX 4060 is 1.6x more powerful than an RTX 3060, it might mean that lower-end cards will no longer be 1080p cards but 1440p ones, and even 4K cards for the Ti versions!

 

Example: the RTX 3060 averages 41 fps in Elden Ring at 4K Ultra, so a 4060 would achieve over 60 fps at 4K Ultra (41 × 1.6 ≈ 66 fps).

 

It's interesting to see where we are headed. Gaming on a 4K 60Hz TV on an entry-level card would be amazing.

 

I don't think that's the outcome you'll actually see. What we've always seen is that the new generation's tier-1 product equals the last gen's tier-0 product, e.g. GeForce GTX 1080 = RTX 2070 = RTX 3060. And since I know I've had this discussion with people on this forum before, I'm not going to repeat it.

 

Nvidia isn't suddenly going to make an xx40- or xx30-tier product that performs well at a low cost; it would undermine the higher-margin cards. It's been pretty clear that the last generation (the 30 series) and the new one have been making some of their "large" gains by moving the power target upwards, when we should really be looking at the performance of the parts we would previously have called Quadro, because those parts are still 2-slot cards. The RTX 6000 (Ada generation) is a 300W card with 48GB and 18,176 CUDA cores, and also the highest-end card you can get for a workstation. The RTX 4090 is a 24GB card with 16,384 CUDA cores. Where is the logic in a "smaller die" consuming 50% more power?

 

That's because Intel and AMD have also been doing this with their CPUs, basically going "well, the overclockers are going to do this anyway, so let's leave as little overclocking headroom as possible" (and removing any "OC edition" value from AIBs in the process), even though, as someone else posted before, you can likely push down the power limit on these GPUs by about 30% and lose only about 10% of the performance.

 

Also I'm thinking that some of the environmental lawmakers in California are not having it.

https://www.energy.ca.gov/publications/2019/plug-loads-game-changer-computer-gaming-energy-efficiency-without-performance

 

[Excerpt from the California Energy Commission report on plug loads]

Remember, this report was from 2019.

[Second excerpt from the same report]

Read the last line in particular. We're paying large amounts of money for GPUs that have been engineered out of the box to "go as fast as possible within a generous thermal solution", when really we should maybe start considering underclocking parts to aim for the best performance per watt rather than the most performance at any power draw.

 

https://www.igorslab.de/en/cooler-breaker-station-fusion-reactor-when-the-geforce-rtx-3090-ti-with-300-watt-choke-sets-the-efficiency-list-on-its-head-and-beats-the-radeons/

[Chart: Igor's Lab FPS comparison of the RTX 3090 Ti at 450 W vs 300 W]

So going from 450W to 300W results in only an 11 fps difference.
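For anyone who wants to try power-limiting themselves, the limit can be set from software. A minimal sketch using the pynvml bindings (pip install nvidia-ml-py); the 300 W figure simply mirrors Igor's test, changing the limit needs admin rights, and the value must stay within the board's supported range:

    from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                        nvmlDeviceGetPowerManagementLimit,
                        nvmlDeviceSetPowerManagementLimit)

    nvmlInit()
    gpu = nvmlDeviceGetHandleByIndex(0)              # first GPU in the system
    print(nvmlDeviceGetPowerManagementLimit(gpu))    # current limit, in milliwatts
    nvmlDeviceSetPowerManagementLimit(gpu, 300_000)  # cap the board at 300 W
    nvmlShutdown()

(The nvidia-smi command-line tool exposes the same control through its -pl option.)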

 

So it'll be interesting to see someone repeat this with the 40-series cards. If the same "x70 part equals last gen's x80 part" rule holds true, then Nvidia has pretty much left the budget market and isn't even pretending to make affordable parts. They may simply end up not producing x50 and x60 parts at all, because they know people likely only need that much GPU for their setup, and they would rather people buy more than they need.

 

But that gets into marketing more than anything. My wish here is that Nvidia and the AIBs would make the standard version of each card a 2-slot one with a removable cooler, so you could "upgrade" to a 3-slot or 4-slot cooler and raise the power ceiling depending on how large a cooler your case can fit. Intel and AMD already figured that out when they stopped bundling coolers with their "9" CPU parts.


6 minutes ago, Kisai said:

I don't think that's the outcome you'll actually see. What we've always seen is that the new generation's tier-1 product equals the last gen's tier-0 product, e.g. GeForce GTX 1080 = RTX 2070 = RTX 3060. And since I know I've had this discussion with people on this forum before, I'm not going to repeat it.

 

Not to take away from the rest of your post, but I'd like to point out that the 2070 and 3060 are much closer in performance to a 1080 Ti than to a 1080:

 

[Chart: relative performance of the GTX 1080, GTX 1080 Ti, RTX 2070, and RTX 3060]

 

The 1080 is about 16% slower than the 2070 and 3060, while the 1080 Ti is only about 6% faster than them.

