
Alleged NVIDIA GeForce RTX 4080 16GB benchmarks have been leaked: about 25% faster than an RTX 3090 Ti

Summary

Benchmark results for an alleged NVIDIA GeForce RTX 4080 16GB have now been posted on the Chiphell forums. Chiphell member 'panzerlied' shared benchmarks for 3DMark, Shadow of the Tomb Raider and Red Dead Redemption 2, which give us an idea of how this upcoming GPU performs. The Red Dead Redemption 2 figures include both a native-resolution run and a run with DLSS enabled, while the Shadow of the Tomb Raider benchmark was captured with DLSS in Quality mode.

 

[Attached screenshots: three 3DMark results and three in-game benchmark results for the alleged RTX 4080 16GB, plus an image via VideoCardz]

 

Quotes

Quote

First thing that should be mentioned is that we are not certain how someone could obtain a driver that supports the RTX 4080 16GB this soon. This card is not expected to launch until next month and, as far as we are aware, there are no drivers for it yet.

 

At 4K resolution the RTX 4080 supposedly achieved 17,465 in 3DMark Fire Strike Ultra and 13,977 in Time Spy Extreme. Looking at comparative scores for the RTX 3080 (12GB), that makes the RTX 4080 about 50% faster on both counts. A further 3DMark Port Royal result was also highlighted, with the RTX 4080 hitting 17,607, which is about 45% faster, so again pretty much the same ballpark. The results are quite good; in fact, the GPU already sits at the top of the 3DMark ranking.

 

Since I’ve recently benchmarked Red Dead Redemption 2, we can get an idea of the performance uplift that the RTX 4080 16GB offers. In native 4K, I was getting an average of 61fps on the RTX 3080, while without DLSS the RTX 4080 16GB offers an average of 87fps. So basically, we’re looking at a 42% performance boost. Performance naturally improves once Nvidia DLSS is enabled, climbing up to 115fps in ‘Quality’ mode.

 

The GeForce RTX 4080 16 GB managed an average of 128 frames per second in Shadow of the Tomb Raider at 4K with DLSS set to "Quality".

 

My thoughts

First thing that should be mentioned is, as many of the news outlets pointed out, the card isn't coming out for another month; therefore how this individual obtained a driver that supports the RTX 4080 16GB is quite a conundrum. Definitely take these results with a grain of salt; they could be fake.

Now, taking your skeptical glasses off: the RTX 4080 16GB here is 25% faster than an RTX 3090 Ti in Fire Strike Ultra, 31% faster in Time Spy Extreme, and 19% faster in Port Royal. I tried finding comparisons for some of the gaming benchmarks that weren't present, but wasn't having much luck. If you have an RTX 3080, 3080 Ti, 3090 or 3090 Ti and can test Shadow of the Tomb Raider with the same settings, or RDR2 with DLSS enabled, then please post in this thread.

As TechRadar mentions, if you are disappointed by these results, performance can definitely improve: there are clearly some driver issues going on here, and once official driver support arrives, the 4080 16GB should surely be faster. Lastly, it should be noted that the OP of this leak on Chiphell recently deleted the post; it now reads, translated, "Okay, everyone knows, you can delete it".
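The percentage comparisons above (and in the quoted articles) all come down to the same ratio arithmetic. A minimal sketch, using the RDR2 figures TechRadar quoted (87fps on the alleged 4080 16GB vs 61fps on a 3080):

```python
# Percent uplift between two benchmark results: (new / old - 1) * 100.
def uplift(new: float, old: float) -> float:
    """Percentage by which `new` outperforms `old`."""
    return (new / old - 1) * 100

# RDR2 at native 4K, figures quoted above:
# 87fps (alleged RTX 4080 16GB) vs 61fps (RTX 3080).
print(f"{uplift(87, 61):.1f}%")  # -> 42.6%, the quoted "42% performance boost"
```

The same function reproduces the 3DMark comparisons once you plug in the relevant baseline scores.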

 

Sources

https://www.notebookcheck.net/Nvidia-GeForce-RTX-4080-16-GB-gaming-and-synthetic-benchmarks-leak-online.659990.0.html

https://www.guru3d.com/news_story/assumed_nvidia_geforce_rtx_4080_16gb_3dmark_performance_benchmarks_leaked.html

https://www.techradar.com/news/nvidia-rtx-4080-gpu-leak-disappoints-some-gamers-but-lets-not-get-carried-away

https://videocardz.com/newz/alleged-nvidia-geforce-rtx-4080-16gb-3dmark-benchmarks-have-been-leaked

https://www.dsogaming.com/news/first-gaming-benchmarks-leaked-for-nvidia-geforce-rtx-4080-16gb/

https://www.pcgamesn.com/nvidia/rtx-4080-16gb-benchmarks-performance-glimpse

https://www.chiphell.com/thread-2448461-1-1.html


I also have my doubts. As with the EVGA story, they didn't even get proper drivers till almost launch day, so how this person has them is strange.

 

Unless it's internal.

 

However, this is strange for a 4080. Usually the x80 is the card that performs a little better than last gen's full die, by about 10% (that used to be the Titans; now it's the 3090 Ti).

 

So that also makes me doubt how legit this is.

 

Either way, for $1200 it's a horrible deal for that card.


57 minutes ago, Shimejii said:

If only they didn't make it $1200; then it would actually be worth it at $700-900. But nope.

4 minutes ago, jaslion said:

Either way, for $1200 it's a horrible deal for that card.

The 3080 Ti was above that price for almost the entire time and you couldn't even get one. The 4080 16GB will completely sell out; that being said, I think Nvidia will artificially limit supply for a couple of months to sell 3000-series cards to those who can't wait.

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.

Spoiler

12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


12 minutes ago, ZetZet said:

The 3080 Ti was above that price for almost the entire time and you couldn't even get one.

Oh I know, an even worse deal.


6 minutes ago, jaslion said:

Oh I know, an even worse deal.

Maybe we are just too poor. I am seriously hoping AMD comes out swinging with something like a $700 7800 XT, because the 6800 XT is already sitting at $500 and they do not seem to have as much leftover stock as Nvidia, but these last couple of years disappointment seems to be what actually happens.



3 hours ago, BiG StroOnZ said:

skeptical glasses

[reaction GIF]

"A high ideal missed by a little is far better than a low ideal that is achievable, yet far less effective"

 

If you think I'm wrong, correct me. If I've offended you in some way, tell me what it is and how I can correct it. I want to learn, and along the way one can make mistakes; being wrong helps you learn what's right.


3 hours ago, ZetZet said:

Maybe we are just too poor. I am seriously hoping AMD comes out swinging with something like a $700 7800 XT, because the 6800 XT is already sitting at $500 and they do not seem to have as much leftover stock as Nvidia, but these last couple of years disappointment seems to be what actually happens.

No, it really is just a case of pricing people out of the market.

 

Just look at consoles. The PS4 launched at $400, and a good gaming PC back then (something like a K-series i5 plus an Nvidia x70 or AMD 79xx-series card) was around $700.

 

The PS5 launched at $500, while a gaming PC of the same relative class as that 2013 build costs at least $1200 now. (Keep in mind that the PS5 is a WAY better console relative to its time than the PS4 was at launch: the PS4 was already a low-end PC with a lower-mid-range GPU in 2013, whereas the PS5/Xbox Series X are solid upper-mid-range systems with good GPUs. So even that comparison is generous.)


8 hours ago, BiG StroOnZ said:

First thing that should be mentioned is, as many of the news outlets pointed out, the card isn't coming out for another month; therefore how this individual obtained a driver that supports the RTX 4080 16GB is quite a conundrum.

The 4090 drivers *may* support the 4080. Also, I'm unsure of how easy Nvidia is to fool. I once made my 3080 look like a Radeon card fairly easily; not in a graphical way, but it was enough to "verify" I had a Radeon card. It's quite possible to edit the drivers to make them think the card is a 4090, possibly even a 3000-series card.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


Curious about the performance of the 4080 12GB *cough cough 4070*

DAC/AMPs:

Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: DDR3 16GB G.Skill, PC Case: Fractal Design R4 Black non-iglass, Monitor: BenQ GW2280


11 hours ago, jaslion said:

I also have my doubts. As with the EVGA story, they didn't even get proper drivers till almost launch day, so how this person has them is strange. […] Either way, for $1200 it's a horrible deal for that card.

How in the world is 25% more performance on bad drivers a horrible deal??? You're still not living in the real world, where inflation is a thing.


1 minute ago, Fasterthannothing said:

How in the world is 25% more performance on bad drivers a horrible deal??? You're still not living in the real world, where inflation is a thing.

I don't believe this is accurate. I am very much living in the real world; inflation sucks, but these cards are priced far beyond what inflation alone would make them.

 

That, and GPUs haven't always followed inflation. The 8800 GTX was $600 in 2006; adjusted for inflation, it should be around $800 now, tops.
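The inflation point above can be sanity-checked with simple compounding. The ~1.8% average annual rate is my own assumed figure for the 2006-2022 stretch, not something stated in the thread:

```python
# Compound a 2006 launch price forward at an assumed average inflation rate.
launch_price = 600             # 8800 GTX launch MSRP (2006)
years = 2022 - 2006            # 16 years
assumed_avg_inflation = 0.018  # assumption: ~1.8% per year on average

adjusted = launch_price * (1 + assumed_avg_inflation) ** years
print(round(adjusted))  # -> 798, roughly the "$800 now, tops" figure
```

A higher assumed rate obviously pushes the figure up, but nowhere near the 4080 16GB's $1200 MSRP.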


On 10/7/2022 at 1:47 AM, Shimejii said:

If only they didn't make it $1200; then it would actually be worth it at $700-900. But nope.

Why should they not price it at $1200? I mean, if you guys all blindly buy NVIDIA literally every day, then why shouldn't they? I honestly hope they increase the price last minute and double the prices on all items. The funny thing is that they would lose zero sales from it, because you guys are genuinely hopeless. Saw a YT comment earlier where a guy was bragging that he will buy a 4090 because "he can afford it", as if there are people who don't have a couple of thousand in their bank account. That's gamers for you. Keep it up, guys. Well deserved.

 

<removed>

Edited by SansVarnic
Removed content.

8 hours ago, Fasterthannothing said:

How in the world is 25% more performance on bad drivers a horrible deal??? Your still not living in the real world where inflation is a thing 

People (I think subconsciously) like to cling to this idea that the price of a product should be dictated by its name. That the tier that the GPU is allocated - 80/70/60 etc - should have some magical impact on the price of the product, completely independent of any actual details about the product. They then get angry when the price of the GPU of that tier goes up - I regularly see comments along the lines of "remember when the 80-tier GPU cost $XYZ" - as if the fact that the GPU has an '80' on the end of its name inherently means something.

 

Spoiler: It doesn't. At all.


The 4080 16GB costs 70% more than the 3080. In what universe does that remotely matter? It's not as if the 4080 going up in price means that there isn't going to be a GPU being sold at the old price point. There most definitely will be - it'll just be a 60 or 70 tier card. And that distinction - that difference in name - means nothing. The designation of 60/70/80 etc. is nothing but a construct made up by a marketing department with the sole purpose of making more money. It's literally marketing peer pressure - they want you to feel bad about dropping down to a "lower-tier" GPU when you upgrade. They are trying to pressure you into buying a higher-end card with "big number better" mentality.

 

Stop looking at GPU marketing tier and start looking at GPU price tier. Instead of comparing the old 80-tier GPU with the new 80-tier GPU, compare the old $700 GPU with the new $700 GPU. Look past the marketing bullshit and consider what really matters: price to performance.

 

Give yourself a fixed budget - say $500 - and look at what GPU you can get for that price each generation. You will find that the performance on offer goes up every single generation. Yes the "tier" of the GPU you are buying will change - you might go from a 1070 to a 2060 for example - but each generation's GPU will beat the previous gen's GPU at the same price point, despite any "tier" differences that may arise. Some generations that performance difference is bigger than others, but it's always there and it always favours the newer product.

 

The 4080 16GB is a $1200 card. One that (supposedly) performs 25% faster than the previous gen's $2000 card. That (according to some rough napkin maths) therefore likely performs ~35% faster than the last gen $1200 card.

 

35% faster for the same price? That doesn't sound too bad to me.

 

Or let's consider the flipside: what if it weren't called the 4080 16GB, and was instead called the 4080 Ti?

 

"Introducing the 4080 Ti. It's the same price as the 3080 Ti, but now comes with 33% more VRAM and provides ~35% more performance."

 

Would this card have gotten such a bad rap if it had been introduced like that? I doubt it.
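The "rough napkin maths" above can be written out explicitly. The ~8% gap between the 3090 Ti and 3080 Ti at 4K is my own assumed figure (roughly in line with typical review averages), not a number from this thread:

```python
# Chained relative performance: if A = x * B and B = y * C, then A = x*y * C.
leak_4080_vs_3090ti = 1.25       # leaked claim: 25% faster than the 3090 Ti
assumed_3090ti_vs_3080ti = 1.08  # assumption: 3090 Ti ~8% ahead of 3080 Ti at 4K

est_4080_vs_3080ti = leak_4080_vs_3090ti * assumed_3090ti_vs_3080ti
print(f"{(est_4080_vs_3080ti - 1) * 100:.0f}%")  # -> 35%
```

Shrink or grow that assumed 8% gap and the estimate moves a few points either way, which is why "~35%" is the honest way to state it.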

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


3 hours ago, tim0901 said:

People (I think subconsciously) like to cling to this idea that the price of a product should be dictated by its name. […] 35% faster for the same price? That doesn't sound too bad to me. […] Would this card have gotten such a bad rap if it had been introduced like that? I doubt it.

This I agree with. I was vocal about it (names being meaningless marketing fluff) and I got roasted as if I were being blasphemous.


On 10/7/2022 at 5:04 PM, CTR640 said:

Curious about the performance of the 4080 12GB *cough cough 4070*

*cough* 4060 ti *cough*

