
Nvidia GeForce RTX 2080 confirmed at up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

Quote

We saw DLSS in action and a Nvidia RTX 2080 Ti was able to render Epic Infiltrator at a steady 85 frames per second (fps). Right next to the Turing rig was a Nvidia GTX 1080 Ti-powered system that struggled to keep the same experience running near 45 fps with temporal anti-aliasing and supersampling turned on.

So the 1080 Ti running the demo was using TXAA and SSAA vs. just DLSS on the 2080 Ti? That's probably why.
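For what it's worth, the quoted demo figures already put a number on the headline claim. A quick sanity check (the 85 and 45 fps values come from the quote above; note the comparison is apples-to-oranges, since the two cards were running different AA workloads):

```python
# Sanity check on the "up to 2x" headline using the demo figures
# quoted above: 85 fps on the 2080 Ti vs 45 fps on the 1080 Ti.
def speedup(new_fps: float, old_fps: float) -> float:
    """Relative speedup of one fps measurement over another."""
    return new_fps / old_fps

ratio = speedup(85, 45)
print(f"{ratio:.2f}x")  # 1.89x: close to, but short of, a full 2x
```

So even taking Nvidia's own staged demo at face value, the gap is just under 2x, not 2x.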

 

You pretty much don't need AA if you're already running SSAA. Sure, more is better, but running both is suicide for your framerate.

 

I should point out that SSAA alone is usually suicide for framerate: you're literally asking the GPU to render 2x or 4x the number of pixels (for 2x and 4x supersampling, respectively) and then downsample to the output resolution.
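A back-of-the-envelope sketch of that cost, following the post's convention that "2x"/"4x" SSAA multiplies the total pixel count (some implementations instead scale each axis, which quadruples pixels at "2x"):

```python
# Pixel cost of supersampling (SSAA): the GPU renders at a multiple
# of the output resolution, then downsamples to the output size.
def ssaa_pixels(width: int, height: int, factor: int) -> int:
    """Total pixels rendered per frame with a given SSAA factor."""
    return width * height * factor

native = ssaa_pixels(1920, 1080, 1)   # 2,073,600 pixels at native 1080p
print(ssaa_pixels(1920, 1080, 4))     # 8,294,400 pixels with 4x SSAA
```

Four times the pixels shaded per frame is roughly four times the GPU work, which is why stacking SSAA on top of other AA tanks framerates.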

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


1 hour ago, S w a t s o n said:

So the 1080 Ti running the demo was using TXAA and SSAA vs. just DLSS on the 2080 Ti? That's probably why.

 

Also, Nvidia was probably doing something funky with the resolution (as usual): rendering a 1080p image, upscaling it with DLSS, and passing it off as 4K (or something similar).
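If that suspicion were right, the incentive is easy to see from pixel counts alone. A minimal sketch, assuming render cost scales with pixels (a naive assumption, but good enough for the point):

```python
# How much cheaper a 1080p internal render would be compared with
# native 4K, if frame cost scaled purely with pixel count.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_4k = 3840 * 2160      # 8,294,400 pixels

fraction = res_1080p / res_4k
print(fraction)  # 0.25: a 1080p frame is only a quarter of the pixels of 4K
```

Rendering a quarter of the pixels and letting DLSS fill in the rest would leave a lot of headroom for impressive-looking fps numbers.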


5 hours ago, mr moose said:

I kinda just want them to explain their logic, because for the life of me I can't see it.

But you didn't bring up older gens as evidence that Nvidia was charging too much.

Even if the 2080 were a direct replacement for the 1080 and the 1080 were already out of stock, following a trend is not evidence that a product is overpriced, especially when it has already been shown to have far more features, let alone the prospective performance improvements.

 

"It has always been like that" is a moot argument when you have not only a large change in the product but also large changes in the market, i.e. RAM prices, mining, and consumer demand.

Yep, it definitely has to do with Nvidia overproducing last-gen cards and having too much inventory when the mining craze crashed.


On 22.8.2018 at 8:11 PM, Rattenmann said:

That's misleading. Look at the graphs again. What is DLSS??

 

Because look here:

https://www.tweakpc.de/news/42674/nvidia-geforce-rtx-2080-performance-nur-etwa-20-mehr/

"Hell is full of good meanings, but Heaven is full of good works"


3 hours ago, Stefan Payne said:

That's misleading. Look at the graphs again. What is DLSS??

 

Because look here:

https://www.tweakpc.de/news/42674/nvidia-geforce-rtx-2080-performance-nur-etwa-20-mehr/

Deep Learning Super Sampling. Are you referring to AdoredTV videos? 

 

I hope those tensor cores in RTX cards can help translate that website to English in real time, so I'll have a legit reason to buy one.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


1 hour ago, Stefan Payne said:

That's misleading. Look at the graphs again. What is DLSS??

 

Because look here:

https://www.tweakpc.de/news/42674/nvidia-geforce-rtx-2080-performance-nur-etwa-20-mehr/

 

32 minutes ago, xAcid9 said:

Deep Learning Super Sampling. Are you referring to AdoredTV videos? 

 

I hope those tensor cores in RTX cards can help translate that website to English in real time, so I'll have a legit reason to buy one.

Visuals are performance; that's why we grade audio, graphics, and live concerts.

People need to understand that a graphical increase is performance, even if it's proprietary.

You can call it GimpWorks, but it still enhances the performance.

Frames might be lost for some, but graphically they see it.

Likewise, fps smoothness is performance.

For some, it's color accuracy.

Shall I continue on what performance is?

 

 

 


5 hours ago, spartaman64 said:

yep it definitely has to do with nvidia over producing last gen cards and having too much inventory when the mining craze crashed

Mining sent the price of cards up, and they haven't really come all the way down. Blame whoever you want, but that's just a fact of the market; no company is going to undervalue its product and take reduced revenue just to make you happy.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


5 hours ago, Stefan Payne said:

That's misleading. Look at the graphs again. What is DLSS??

 

Because look here:

https://www.tweakpc.de/news/42674/nvidia-geforce-rtx-2080-performance-nur-etwa-20-mehr/

Not gonna read the whole article; it starts off with claims that are based not on what we have but on what they guess.

So, can you quote the part where it says it is misleading?


This is bullshit CEO math. The real gain is maybe 20%, from the new arch, faster GDDR6 VRAM, and the 12nm process, so please stop hyping these cards when you know very well there won't be any 2x performance gain.

And you can't compare previous cards to the new ones in ray tracing / AI tasks; old cards can't even run ray tracing, so there isn't even a comparison to be made here.


2 minutes ago, yian88 said:

This is bullshit CEO math. The real gain is maybe 20%, from the new arch, faster GDDR6 VRAM, and the 12nm process, so please stop hyping these cards when you know very well there won't be any 2x performance gain.

And you can't compare previous cards to the new ones in ray tracing / AI tasks; old cards can't even run ray tracing, so there isn't even a comparison to be made here.

So you don't want people to talk about the new features or specs or anything like that, because they didn't exist before and we don't know for sure how well they will perform? Not going to happen.



1 hour ago, yian88 said:

This is bullshit CEO math. The real gain is maybe 20%, from the new arch, faster GDDR6 VRAM, and the 12nm process, so please stop hyping these cards when you know very well there won't be any 2x performance gain.

And you can't compare previous cards to the new ones in ray tracing / AI tasks; old cards can't even run ray tracing, so there isn't even a comparison to be made here.

You do realize that the dark green bars represent a 1:1 comparison that ignores ALL the added new features, right? The WORST CASE is about a 35% gain, and the best case around 60%.

 

Again that is without even looking at all the new stuff.

So, where does your 20% figure come from? Do you have a source the rest of us don't have? Well, share it!


6 minutes ago, Rattenmann said:

You do realize that the dark green bars represent a 1:1 comparison that ignores ALL the added new features, right? The WORST CASE is about a 35% gain, and the best case around 60%.

 

Again that is without even looking at all the new stuff.

So, where does your 20% figure come from? Do you have a source the rest of us don't have? Well, share it!

There’s some good analysis by AdoredTV in his latest video. It’s worth noting that the benchmarks provided (by the manufacturer, with almost no detail) are in 4K, and several also in HDR.


I also didn't get how turning DLSS on increases (blank space). Shouldn't it increase quality and decrease fps, if that is even what they are measuring? Are they measuring picture quality, like it's two times more beautiful?

But I really don't know how DLSS works; it just seems counterintuitive.



3 minutes ago, Morgan Everett said:

There’s some good analysis by AdoredTV in his latest video. It’s worth noting that the benchmarks provided (by the manufacturer, with almost no detail) are in 4K, and several also in HDR.

All the "analysis" I have seen up to now spews out random figures based on nothing but guessing.

That is helping no one: neither the ones trying hard to hate nor the ones trying to back up their purchase decision.

What we need is less speculation based on guesses and more based on what we really know.

Yes, we only know what they want us to see, but in the past Nvidia has proven time and time again that its marketing is actually spot on. They even lowballed quite a bit: Jensen said the Infiltrator demo would run at 78fps; the press saw 85fps. Lowballed again.

I understand not wanting to speculate based on marketing material, but it is all we have. It has proven correct in the past, and it is clearly better than useless speculation based on pure guesswork. If you don't want to use what we have, just don't speculate at all. It is just useless clickbait in either direction: the 250% claims as well as the 10% claims and pretty much everything in between.


Those RTX cards will do great once you get applications for them, but for now the tech is too new. I'll stick with my system until that tech goes mainstream; it's still good for now with my i5 7500 and GTX 1060 6GB.


Just now, Rattenmann said:

All the "analysis" I have seen up to now spews out random figures based on nothing but guessing.

That is helping no one: neither the ones trying hard to hate nor the ones trying to back up their purchase decision.

What we need is less speculation based on guesses and more based on what we really know.

Yes, we only know what they want us to see, but in the past Nvidia has proven time and time again that its marketing is actually spot on. They even lowballed quite a bit: Jensen said the Infiltrator demo would run at 78fps; the press saw 85fps. Lowballed again.

I understand not wanting to speculate based on marketing material, but it is all we have. It has proven correct in the past, and it is clearly better than useless speculation based on pure guesswork. If you don't want to use what we have, just don't speculate at all. It is just useless clickbait in either direction: the 250% claims as well as the 10% claims and pretty much everything in between.

But we don’t need to treat that marketing material uncritically; we can, rather, interrogate it. The video I mentioned makes an attempt at that, and I’d recommend it. Nvidia doesn’t need to falsify data; it can simply emphasise and de-emphasise as needed. 


1 hour ago, mr moose said:

So you don't want people to talk about the new features or specs or anything like that, because they didn't exist before and we don't know for sure how well they will perform? Not going to happen.

Yeah, sometimes when I see those comments I get the feeling that the people making them would like to ignore everything in the spirit of misguided fairness.

 

An owner of an RTX card is not going to ignore the additional performance from DLSS, because he has access to it and is interested only in the bottom line, i.e. how to make his system faster. He will not turn it off because it's unfair to Pascal, which does not support it.

 

3 minutes ago, asus killer said:

I also didn't get how turning DLSS on increases (blank space). Shouldn't it increase quality and decrease fps, if that is even what they are measuring? Are they measuring picture quality, like it's two times more beautiful?

But I really don't know how DLSS works; it just seems counterintuitive.

The way I see it, it's a comparison only with AA turned on. You need to imagine another set of bars on the graph without any form of AA.

 

9 minutes ago, Morgan Everett said:

There’s some good analysis by Adored TV, in his latest video

Let's forget for a moment that Adored has seemed to have it in for Nvidia for some time now, and that he now has two sets of predictions for the new GPUs' performance (one from his leak and one of his own), so he can't possibly be wrong in any scenario.

 

Why not test it in 4K? You can't test it in 1080p, because at that point the bottleneck from the CPU side will be severe. You can argue about 1440p, but I think Adored has missed the underlying message of that graph: that you can get over 60fps at 4K with a single 2080. You don't need a 2080 Ti; you can do it with a 2080.

 

He talks about memory bandwidth etc., and to me it all sounds like he wants to say: "omg, you are testing at too high a resolution! Ignore the fact that the 2080 can get you a smooth 4K 60fps; you need to test at 1440p, because otherwise the older GPU can't handle it! Not fair!"

 

I mean, isn't the point of this to show that the 2080 can do something the 1080 isn't able to? That's the bottom line 2080 owners are interested in.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


4 minutes ago, Lathlaer said:

Why not test it in 4k? You can't test it in 1080p because at this point the bottleneck from CPU side will be severe. You can argue about 1440p but I think Adored has missed the underlying message of that graph: that you can get over 60fps in 4k with a single 2080. You don't need a 2080ti. You can do it with a 2080.

 

He talks about memory bandwidth etc., and to me it all sounds like he wants to say: "omg, you are testing at too high a resolution! Ignore the fact that the 2080 can get you a smooth 4K 60fps; you need to test at 1440p, because otherwise the older GPU can't handle it! Not fair!"

 

I mean, isn't the point of this to show that 2080 can do something 1080 isn't able to? That's the bottom line the 2080 owners are interested in.

I think the point to take from the video was that Nvidia has chosen a testing scenario that shows its new card in the best possible light. There’s nothing surprising about that, but it’s a worthwhile reminder that under other, common scenarios, such as playing at 1440p or in SDR, the increase may be less impressive than the marketing suggests.


4 minutes ago, Lathlaer said:

I mean, isn't the point of this to show that 2080 can do something 1080 isn't able to? That's the bottom line the 2080 owners are interested in.

 

Very good analysis.

 

AdoredTV does not really sound like he wants to be neutral at all. The video literally starts with words to the effect of "we see 50-60%, but I personally think it is 20% anyway" (not a direct quote). I stopped watching right there, to be honest. Starting from "let's ignore all we have and pull random numbers out of our asses" is not something that can end in any valuable information.

 

Testing at 4K is exactly what people always wanted, but now we are complaining about it?

I get people complaining about 1080p-only numbers; after all, they don't tell you whether 1440p or 4K will work. But 4K numbers give you information for all three major resolutions:

60fps at 4K means you get more at 1440p and even more at 1080p, so you will definitely be above 60fps. Not rocket science.
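That reasoning can be sketched with a naive fill-rate model that scales fps by pixel count. This is a rough upper-bound estimate only: in practice, CPU and other bottlenecks make real gains at lower resolutions smaller than this.

```python
# Naive estimate: fps inversely proportional to pixels rendered
# (i.e. a purely GPU-bound workload).
RESOLUTIONS = {
    "4k": 3840 * 2160,
    "1440p": 2560 * 1440,
    "1080p": 1920 * 1080,
}

def estimate_fps(measured_fps: float, measured_res: str, target_res: str) -> float:
    """Scale a measured fps by the ratio of total pixel counts."""
    return measured_fps * RESOLUTIONS[measured_res] / RESOLUTIONS[target_res]

print(round(estimate_fps(60, "4k", "1440p")))  # 135
print(round(estimate_fps(60, "4k", "1080p")))  # 240
```

So a card that holds 60fps at 4K should, under this GPU-bound assumption, clear 60fps at the lower resolutions with plenty of margin.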

 

Again:

If you want analysis, then take what we have and go from there. Don't ignore what we have and substitute random guesses; that is not helping either side of the fence, and most definitely not the ones who just want accurate information. If you don't trust what we have, tough luck: then don't speculate and act like the results are facts or of any value whatsoever.

 

 


8 minutes ago, Morgan Everett said:

but it’s a worthwhile reminder that under other, common scenarios, such as playing at 1440p or in SDR, the increase may be less impressive than the marketing suggests.

 

While I agree that everyone should be aware of the possibility: we don't need any improvements at 1080p or 1440p. Pascal already crushes that performance bracket. What we still don't have is a solid 4K setup. And if you are thinking about spending 1k+ on new GPUs, you are likely trying to get something you don't already have access to.

 

So the stuff shown is, in my eyes, the best possible thing to show: tailored to the audience that is likely going to spend that money, not to the audience that does not need it.


26 minutes ago, Morgan Everett said:

There’s nothing surprising about that, but it’s a worthwhile reminder that under other, common scenarios, such as playing at 1440p or in SDR, the increase may be less impressive than the marketing suggests.

Yeah, sure, but I'm arguing the bottom line here and real-world applications.

Answer me this: why would someone who has a 1080 or a 1080 Ti be concerned about a) 1080p or b) 1440p performance? It is widely acknowledged that these cards offer enough for those resolutions. Meaning, someone with those two cards has absolutely no reason to upgrade if all they do is play at 1440p. They already have cards that let them max out performance.

 

4K is so important because it's the only mainstream resolution that has not been widely acknowledged as conquered.

 

Who should be interested in a 2080 or 2080 Ti?

 

a) people with 4k monitors

b) people with 1440p monitors who use DSR to improve the visuals beyond the native monitor

c) people with 3440x1440 ultrawides (see above)

d) people interested in ray tracing technology

 

That graph is for those four kinds of people. If someone has a 1440p monitor and a 1080 Ti or a 1080, they don't need an upgrade unless they want to push their system further, in which case all those things that Adored disregards (memory bandwidth) are essential.

 

They see that the 2080 smokes the 1080 at 4K; do you think they care that the percentage advantage is lower at a resolution they are not interested in playing at? Should they also be concerned about 1080p, because it's even lower there?



18 minutes ago, Rattenmann said:

While I agree that everyone should be aware of the possibility: we don't need any improvements at 1080p or 1440p. Pascal already crushes that performance bracket. What we still don't have is a solid 4K setup. And if you are thinking about spending 1k+ on new GPUs, you are likely trying to get something you don't already have access to.

 

So the stuff shown is, in my eyes, the best possible thing to show: tailored to the audience that is likely going to spend that money, not to the audience that does not need it.

 

It may be, in your eyes; but this doesn’t go to the point made.


1 minute ago, Lathlaer said:

Yeah sure but I'm arguing bottom line here and real world applications.

 

Answer me this: why would someone who has a 1080 or a 1080 Ti be concerned about a) 1080p or b) 1440p performance? It is widely acknowledged that these cards offer enough for those resolutions. Meaning, someone with those two cards has absolutely no reason to upgrade if all they do is play at 1440p. They already have cards that let them max out performance.

 

4k is so important because it's the only mainstream resolution that has not been widely acknowledged as conquered.

 

Who should be interested in 2080 or 2080ti?

 

a) people with 4k monitors

b) people with 1440p monitors who use DSR to improve the visuals beyond the native monitor

c) people with 3440x1440 ultrawides (see above)

d) people interested in ray tracing technology

 

That graph is for those four kinds of people. If someone has a 1440p monitor and a 1080 Ti or a 1080, they don't need an upgrade unless they want to push their system further, in which case all those things that Adored disregards (memory bandwidth) are essential.

 

They see that the 2080 smokes the 1080 at 4K; do you think they care that the percentage advantage is lower at a resolution they are not interested in playing at? Should they also be concerned about 1080p, because it's even lower there?

I think plenty of people not playing in 4K or HDR will be interested in these cards; they are, as I say, very common scenarios, including among the 2080’s potential customer base.


3 hours ago, Rattenmann said:

Not gonna read the whole article; it starts off with claims that are based not on what we have but on what they guess.

So, can you quote the part where it says it is misleading?

Because it's more like 20% more, and the values you quoted are with this supersampling shit.

So, far from the "double performance" you guessed.

And that is also based on the values Nvidia quoted.

 

 

Oh, and have you watched the WAN Show?

Even Linus is kinda sceptical, because we do not have any performance values from Nvidia! And that could mean the performance increase is lower than ever before...

 

 

See also:

A type of anti-aliasing similar to TAA.

Thank you!

I suspected as much. So the real value is the darker, shorter bar, not the other one. And that is kinda misleading...


