
Rumor? nVidia RTX 2080 & 2080 Ti Reviewer's Guide Leak? Lots of 4K

WMGroomAK

Videocardz has an article up that supposedly shares some of the reference performance numbers from nVidia's Reviewer's Guide for the 2080 & 2080 Ti.  Based on the graphs assembled from nVidia's supplied numbers, in nVidia's recommended titles the 2080 should be expected to perform approximately 5% to 20% better than the 1080 Ti, with the 2080 Ti being substantially higher.

https://videocardz.com/77983/nvidia-geforce-rtx-2080-ti-and-rtx-2080-official-performance-unveiled

Quote

The data we are sharing with you today comes from the official Reviewers' Guide. The numbers in this guide are only a reference for further benchmarking. It is probably important to say that those numbers should not be taken very seriously. Each reviewer has a different testing methodology (different scenario, different testing equipment, a different list of games).

 

The graphs are based on values (framerate/scores) provided by NVIDIA for their recommended titles. Yes, the word recommended is rather important here too.

 

In the GeForce RTX reviewer's guide, NVIDIA is not using any resolution other than 4K. So all benchmarks (except VRMark Cyan Room) were performed at 3840×2160. In fact, the RTX 2080 series was 'designed for 4K', as the document claims.

 

NVIDIA's reference system includes: X299 Rampage VI Apex, Core i9-7900X @ 3.3 GHz, 16GB Corsair DDR4 (no frequency specified), Windows 10 (v1803), NVIDIA 411.38 drivers.

I would definitely take any of these numbers with a large serving of salt, but they do look like a good starting point... In most of the titles, it appears that the 2080 can possibly hit 60 FPS, with a couple of them lagging behind, but not far behind, like ME:A, Shadow of War & Shadow of the Tomb Raider.  Also, I originally saw this posted over on HardOCP.

 

Below are a couple of the graphs from the Videocardz site:

 

[Graph: Rainbow Six Siege, RTX 2080 Ti / RTX 2080]

[Graph: Star Wars Battlefront II, RTX 2080 Ti / RTX 2080]

[Graph: The Witcher 3, RTX 2080 Ti / RTX 2080]

[Graph: Shadow of the Tomb Raider, RTX 2080 Ti / RTX 2080]

 

For the more numerically inclined, here is a spreadsheet of the numbers provided on the charts, along with the MSRP for the cards and the average increase in performance & price from 1080 to 2080, 1080 Ti to 2080 Ti, and 1080 Ti to 2080.  I'll admit that this is my personal plotting, so I may not have everything correct, and my stats can always be wrong, as well as the fact that we are basing this off of best-case nVidia numbers...

 

[Image: spreadsheet of values]


Why would videocardz put their name on graphs assembled with other people's data?

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


$150 more than a 1080 Ti with 3GB less VRAM is still kind of disappointing. How much can it overclock? That's what I'm waiting to find out.

Honestly, Nvidia probably could've done better. 

The 1070 ($400) was about even with the 980 Ti.

The 970 was about even with the 780 Ti.

Now the 2080 is a little better [than the 1080 Ti] and costs more. WTF Nvidia

 


1 minute ago, Firewrath9 said:

now the 2080 is a little better and costs more. WTF Nvidia

Are you upset because it supposedly performs better but costs more because of it?


3 minutes ago, mr moose said:

Why would videocardz put their name on graphs assembled with other peoples data?   

 

My best guess is that nVidia didn't supply the actual graphs, just the numbers/scores, and Videocardz assembled those into the graphs.  Could be wrong though...

 

Quote

The graphs are based on values (framerate/scores) provided by NVIDIA for their recommended titles. Yes, the word recommended is rather important here too.

 


It's a fairly consistent increase across the board.

 

Quote

In GeForce RTX reviewer’s guide, NVIDIA is not using any other resolution than 4K

because that's the resolution that most people would be buying these cards for


I don't see a performance increase that justifies the price difference. Only ray tracing does, if there are games using it and you care about said games.


1 minute ago, WMGroomAK said:

My best guess is that nVidia didn't supply the actual graphs, just the numbers/scores, and Videocardz assembled those into the graphs.  Could be wrong though...

 

 

I got that from the OP, but they didn't have to put their name on them; it looks like official Videocardz testing results.  I bet this thread ends up with people assuming they are independent numbers from Videocardz.


4 minutes ago, asus killer said:

i don't see performance increase that justifies the price difference. Only raytracing does, if there is games using it and you care for said games

Wouldn't surprise me to see a release a few months to a year down the track that will make the 20 series lineup look like:

 

RTX2080ti

GTX2080ti

RTX2080

GTX2080

RTX2070ti

GTX2070ti

RTX2070

GTX2070

GTX2060

GTX2050ti

GTX2050

 

given how many different variants of each card they like to make


14 minutes ago, Arika S said:

Wouldn't surprise me to see a release a few months to a year down the track that will make the 20 series lineup look like:

 

RTX2080ti

GTX2080ti

RTX2080

GTX2080

RTX2070ti

GTX2070ti

RTX2070

GTX2070

GTX2060

GTX2050ti

GTX2050

 

given how many different variants of each card they like to make

gotta add at least two confusing VRAM variants for the 2050 through the 2060

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


4 minutes ago, Dan Castellaneta said:

gotta add at least two confusing VRAM variants for the 2050 through the 2060

He did:

 

22 minutes ago, Arika S said:

Wouldn't surprise me to see a release a few months to a year down the track that will make the 20 series lineup look like:

 

RTX2080ti

GTX2080ti

RTX2080

GTX2080

RTX2070ti

GTX2070ti

RTX2070

GTX2070

GTX2060

GTX2050ti

GTX2050

 

given how many different variants of each card they like to make

The bold part is 4 different cards, each with 2 variants and 3 different RAM configurations. xD


Just now, geo3 said:

The performance increase and cost increase aren't even remotely in line with each other. This is what most people are upset over.

But we only have marketing material to base those claims on.   We barely have any hard numbers let alone any independent tests.  Why get upset over something we don't yet know?


1 minute ago, mr moose said:

But we only have marketing material to base those claims on.

A company's own marketing materials are often exaggerated or absolute best-case scenarios. Typically, real-world results lag behind. If the performance of these cards was so amazing, don't you think that would have been part of the presentation?


5 minutes ago, geo3 said:

The performance increase and cost increase aren't even remotely in line with each other. This is what most people are upset over.

I just did a quick spreadsheet of the numbers that Videocardz provided on their graphs as being from nVidia and compared them to the price increase; the big issue really is with the Ti card...  The average increase in performance for the 2080 over the 1080 is around 49%, and for the 2080 Ti over the 1080 Ti it's about 45%.  The average cost increase, on the other hand, is 45% from 1080 to 2080 and 71% from 1080 Ti to 2080 Ti.  So, if we actually put any sort of trust in these numbers, I don't see an issue with the 2080 price.  I just wouldn't look at the 2080 Ti yet...
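For anyone wanting to sanity-check those percentages, here's a minimal Python sketch of the calculation. The Founders Edition prices used below ($549 / $699 / $799 / $1,199) are my assumption, since those are the figures that reproduce the ~45% and ~71% cost increases mentioned; they're not from the reviewer's guide itself.

```python
def pct_increase(old, new):
    """Percent increase going from old to new."""
    return (new - old) / old * 100

# Assumed Founders Edition pricing (not from the leaked guide)
prices = {"GTX 1080": 549, "GTX 1080 Ti": 699,
          "RTX 2080": 799, "RTX 2080 Ti": 1199}

print(f"1080 -> 2080:       {pct_increase(prices['GTX 1080'], prices['RTX 2080']):.1f}%")
print(f"1080 Ti -> 2080 Ti: {pct_increase(prices['GTX 1080 Ti'], prices['RTX 2080 Ti']):.1f}%")
# roughly 45.5% and 71.5%, in line with the 45%/71% quoted above
```

The same helper works for the performance side too, once you plug in per-game framerates from the charts.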


2 minutes ago, geo3 said:

A company's own marketing materials are often exaggerated or absolute best-case scenarios. Typically, real-world results lag behind. If the performance of these cards was so amazing, don't you think that would have been part of the presentation?

We can also argue that cards increase in performance as further improvements and optimizations are made to drivers/games etc.  You're still trying to make absolute claims based on marketing material.  Right now we have neither independent testing nor a crystal ball to show us the actual numbers.  Right now this is all conjecture and personal assumptions.

 

 


4 minutes ago, mr moose said:

We can also argue that cards increase in performance as further improvements and optimizations are made to drivers/games etc.  You're still trying to make absolute claims based on marketing material.  Right now we have neither independent testing nor a crystal ball to show us the actual numbers.  Right now this is all conjecture and personal assumptions.

 

 

Was anything stated about DLSS too? That's another aspect to consider.


1 minute ago, pas008 said:

Was anything stated about DLSS too? That's another aspect to consider.

Exactly, the 20 series is so different from what we have had before that we need a few in-depth benchmarks that take all this into consideration and break down the differences; until then there is just no way to know how good or bad the performance really is.

 

 


12 minutes ago, mr moose said:

Exactly, the 20 series is so different from what we have had before that we need a few in-depth benchmarks that take all this into consideration and break down the differences; until then there is just no way to know how good or bad the performance really is.

 

 

Actually, it might take a while, considering visual quality too.

Like 4K everything maxed with no AA but DLSS on, vs 4K everything maxed with all AA on.

And mixing those factors? If they even matter; once the DGX does its thing and passes the info to the user, the user's tensor cores might just handle all AA completely. That could be the boost we are seeing.


8 minutes ago, pas008 said:

Actually, it might take a while, considering visual quality too.

Like 4K everything maxed with no AA but DLSS on, vs 4K everything maxed with all AA on.

And mixing those factors? If they even matter.

Does AA even matter at 4K?

 

The worst thing about all this is the fanboy fights that are going to ensue because people can't let some things just be.  It's going to be very difficult to discuss the legitimacy of AA or the effects of DLSS at lower resolutions versus higher resolutions and so on.  It's already happening with people complaining about the cost, and we don't even have any unified numbers to argue over.


1 minute ago, mr moose said:

Does AA even matter at 4K?

 

The worst thing about all this is the fanboy fights that are going to ensue because people can't let some things just be.  It's going to be very difficult to discuss the legitimacy of AA or the effects of DLSS at lower resolutions versus higher resolutions and so on.  It's already happening with people complaining about the cost, and we don't even have any unified numbers to argue over.

I did edit, also.

I agree completely.


1 hour ago, Arika S said:

because that's the resolution that most people would be buying these cards for

Not really, 1440p@165Hz is a lot more graphically taxing than 4K@60Hz...


So about 40-45% without DLSS factored in.

  • Won't be enough for people chasing strict price-to-performance... then again, no one in their right mind would go for top-of-the-line GPUs when aiming at price-to-performance, or expect the top-end card to offer the same value as lower-tier cards. (No idea why people suddenly demand it, tbh; it's never been that way.)
  • Won't be enough for people who ignore all the added bells and whistles and just look at raster performance and price.
  • Will make people happy who factor in DLSS, or who care about better-looking graphics.

 

So,... just as expected! Very happy here!

Feeling bad for those who just wanted more FPS at a lower price, but you can't always expect companies to do exactly what you want.


11 minutes ago, JediFragger said:

Not really, 1440p@165Hz is a lot more graphically taxing than 4K@60Hz...

Correct, but that's not what I said. 4K is more appetizing to consumers than 1440p@165; more people are talking about getting 4K monitors than 2K with a high refresh rate.


Just now, Arika S said:

Correct, but that's not what I said. 4K is more appetizing to consumers than 1440p@165; more people are talking about getting 4K monitors than 2K with a high refresh rate.

Yeah, most people don't seem to actually see a difference with high refresh rates. Including me; I just don't notice it.

On the other hand, higher resolution is VERY visible; even my 81-year-old mother noticed it right away when I got a new screen (demanding to get one as well, obviously...).

Maybe some games make high refresh rates worth it. I hear CS:GO loves frames, but I hate shooter games, so that may be why.

