
Nvidia GeForce RTX 2080 confirmed up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

Sources:

TechRadar

videocardz.com

ComputerBase (German)

HardwareLuxx (German)

 

Quote

In some newly released presentation slides, Nvidia claims the Nvidia RTX 2080 hovers around being 1.5 times more performant than the last-generation Nvidia GTX 1080 in a multitude of games, including Epic Infiltrator, Wolfenstein II and Shadow of War (HDR).

Quote

Furthermore, if users activate a new Nvidia Turing feature known as Deep Learning Super-Sampling (DLSS), they can expect up to two-times better performance in titles such as Final Fantasy XV (HDR), Ark: Survival Evolved and JX3, according to Nvidia.

 

[Slide: Nvidia RTX 2080 vs. GTX 1080 performance chart]

 

My take on the articles

First, we have to look at what this chart actually represents, using a hybrid car analogy:

  • You can compare the gas engine alone vs. other cars (CUDA only - dark green bars).
  • You can compare the full hybrid, gas plus electric motor, vs. other cars (CUDA + Tensor cores - light green bars).
  • And you can look at the complete package with everything engaged (CUDA + Tensor + RT cores - not represented in this chart at all).

 

So the first takeaway here is:

You can't easily compare them and be fair to everyone at the same time. If we want a 100% apples-to-apples comparison, we are out of luck.

Instead, we need all three metrics to make an informed decision about whether this new GPU lineup is worth our time and money.

 

The source articles talk about a 40-50% performance jump from the 1080 to the 2080.

Even without DLSS (Tensor cores) and RT factored in, Turing seems to mop the floor with Pascal.

Nvidia claimed "the biggest jump in performance in any generation", and this most definitely backs that claim.

 

If we ignore the bad chart title (which I mistook for an SLI setup before thinking more about it), we can still see how big DLSS is for performance.

Basically, everyone who has a 20-series card will enable DLSS as a no-brainer in every game that supports it.

 

Important to note:

This is all without any RT whatsoever. This is not a chart meant to show how the new cards are better with RT; that would be a 10x factor, not 2x.

"Prove" of this is pretty much common sense. Neither Hitman 2 or Mass Effect (or any of those older titles) would be able to just plug in RT tech into their game, just for the sake of this chart.

 

 

Some more evidence of 2x performance from the 2080 Ti

Quote

We saw DLSS in action and a Nvidia RTX 2080 Ti was able to render Epic Infiltrator at a steady 85 frames per second (fps). Right next to the Turing rig was a Nvidia GTX 1080 Ti-powered system that struggled to keep the same experience running near 45 fps with temporal anti-aliasing and supersampling turned on.

 

Keep in mind that the Epic Infiltrator demo does not include any RT technology.

Many people seem to believe the 20-series cards can only be this fast with RT enabled. The demo does not use RT at all and dates from 2015, a few years back.

This is an important consideration: the performance shown is before factoring in any RT madness, so RT will be a bonus ON TOP of these amazing performance figures.

 

 

My take on performance shown on stage

I personally calculated the 2080 Ti performance beforehand, based on an estimation of their Infiltrator demo, and came to the conclusion that 2x performance is actually pretty likely. You can check my Google Sheet for this here --> Google Sheet, performance comparison

 

A 20% overclocked 2080 Ti would reach a 3DMark score of about 60,500 and be a 331% improvement over the 1070 (see the sheet for details and assumptions).

The performance per money (euro prices in this sheet, so with tax included) would be around 48. A 1080 Ti only reaches 37.1, and Vega 64 runs around 35.1.
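If you want to sanity-check the sheet without opening it, here is a minimal sketch of the kind of calculation behind those ratios, assuming the ratio is simply the estimated 3DMark score divided by the euro price (tax included). The scores and prices below are placeholders chosen to roughly reproduce the ratios above, not the sheet's exact inputs:

```python
# Minimal sketch of the performance-per-money metric, assuming
# ratio = estimated 3DMark score / price in EUR (tax included).
# Scores and prices are illustrative placeholders, NOT the sheet's
# exact inputs -- see the linked Google Sheet for the real data.

def perf_per_euro(score: float, price_eur: float) -> float:
    """3DMark points per euro spent."""
    return score / price_eur

cards = {
    "RTX 2080 Ti (+20% OC, estimated)": (60_500, 1_260),  # placeholder price
    "GTX 1080 Ti": (28_000, 755),                         # placeholder values
    "Vega 64": (22_500, 641),                             # placeholder values
}

for name, (score, price) in cards.items():
    print(f"{name}: {perf_per_euro(score, price):.1f} points per euro")
```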

 

Note:

The performance comparison for the 2080 Ti was made with the 78 FPS claim from Jensen on stage. As shown above, the press actually saw a steady 85 FPS behind the curtain. This means an even bigger jump for the 2080 Ti, as Jensen actually lowballed its performance.

 

 

Conclusion

If these slides hold true, we may have a crazy upgrade right here.

Even ignoring the RT possibilities, we are looking at a 2x gain, and possibly even more for the 2080 Ti. Then we can add RT goodness on top to make buying decisions easier.

 

Also, this may be a confirmation that the 20xx lineup is actually an EXTENSION of the 10xx lineup and not a replacement.

They simply added more to the upper end of the spectrum, and the prices reflect that. With this performance, the prices may actually be fine.

As you can see in the linked Google Sheet, an RTX 2080 Ti may even have a better performance/price ratio than the Pascal lineup, which is impressive given the price.

 

We can only wait for benchmarks and hope they back up these slides.


I'll believe it once the reviewers can confirm this kind of performance. If it had this kind of performance, wouldn't they have said at least something about it in the presentation? All we got to see was 'gigarays/s', which doesn't tell us anything about non-RTX performance.

We also don't know what the Y-axis means so it could be anything.

 

Still, would be very nice if it was true.


Boy am I hoping this is true. I was worried when nvidia was hiding exact performance benchmarks. I wanna know if I am keeping my GTX 1080 for a few more years, or should start saving for that RTX 2080... hmmmm

Main Rig: cpu: Intel 6600k OC @ 4.5Ghz; gpu: Gigabyte Gaming OC RTX 2080 (OC'd); mb: Gigabyte GA-Z170X-UD3; ram: 16 GB (2x8GB) 3000 G.Skill Ripjaws V; psu: EVGA 650BQ; storage: 500GB Samsung 850 evo, 2TB WD Black; case: Cooler Master HAF 912; cooling: Cooler Master Hyper 212 Evo, Lots of fans, Air!; display: 4k Samsung 42" TV, Asus MX259H 1080p audio: Schiit Audio Magni Amp w/ Audio Technica M50x


1 minute ago, lvh1 said:

I'll believe it once the reviewers can confirm this kind of performance. If it had this kind of performance, wouldn't they have said at least something about it in the presentation? All we got to see was 'gigarays/s', which doesn't tell us anything about non-RTX performance.

I also think that just switching from TFLOPS to gigarays as a measure of compute power is very odd, since they are not at all comparable. How many TFLOPS does it take to make a gigaray? Or maybe I'm an idiot.

Linux Daily Driver:

CPU: R5 2400G

Motherboard: MSI B350M Mortar

RAM: 32GB Corsair Vengeance LPX DDR4

HDD: 1TB POS HDD from an old Dell

SSD: 256GB WD Black NVMe M.2

Case: Phanteks Mini XL DS

PSU: 1200W Corsair HX1200

 

Gaming Rig:

CPU: i7 6700K @ 4.4GHz

Motherboard: Gigabyte Z270-N Wi-Fi ITX

RAM: 16GB Corsair Vengeance LPX DDR4

GPU: Asus Turbo GTX 1070 @ 2GHz

HDD: 3TB Toshiba something or other

SSD: 512GB WD Black NVMe M.2

Case: Shared with Daily - Phanteks Mini XL DS

PSU: Shared with Daily - 1200W Corsair HX1200

 

Server

CPU: Ryzen7 1700

Motherboard: MSI X370 SLI Plus

RAM: 8GB Corsair Vengeance LPX DDR4

GPU: Nvidia GT 710

HDD: 1X 10TB Seagate ironwolf NAS Drive.  4X 3TB WD Red NAS Drive.

SSD: Adata 128GB

Case: NZXT Source 210 (white)

PSU: EVGA 650 G2 80Plus Gold


2 minutes ago, lvh1 said:

I'll believe it once the reviewers can confirm this kind of performance. If it had this kind of performance, wouldn't they have said at least something about it in the presentation? All we got to see was 'gigarays/s', which doesn't tell us anything about non-RTX performance.

What's funny is people are laughing at NVIDIA for not showing performance comparisons, but the moment one comes out they go "it's from NVIDIA, don't believe it."

 

Damned if you do, damned if you don't.


Guys, please note: they did not really hide this information. They just did not talk MUCH about it.

 

He said the Infiltrator demo was running at 78 fps.

You can go on Reddit and find that users with a maxed-out 1080 Ti reach about 45 fps.

Just do the math and add some minor OC to the 2080 Ti, and you get your 2-2.5x figure as well.
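For anyone who wants the math spelled out, here is a tiny sketch. The 78 fps and 45 fps figures are the ones quoted above; the overclock percentages are assumptions on my part, not measured values:

```python
# Back-of-the-envelope math behind the figure above, using the 78 fps
# (RTX 2080 Ti on stage) and 45 fps (maxed-out GTX 1080 Ti, per Reddit
# reports) numbers. The overclock percentages are assumptions, not
# measurements; swapping in the 85 fps the press reportedly saw pushes
# the ratio even higher.

fps_2080ti = 78       # Jensen's on-stage figure
fps_1080ti_oc = 45    # community-reported, maxed-out 1080 Ti

base_ratio = fps_2080ti / fps_1080ti_oc            # ~1.73x at stock
for oc in (0.10, 0.15, 0.20):                      # assumed OC headroom
    print(f"+{oc:.0%} OC: ~{base_ratio * (1 + oc):.2f}x a maxed-out 1080 Ti")
```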

 

Again, he said that on stage. It was just ignored by a lot of people.


Just now, M.Yurizaki said:

What's funny is people are laughing at NVIDIA for not showing performance comparisons, but the moment one comes out they go "it's from NVIDIA, don't believe it."

 

Damned if you do, damned if you don't.

Part of the problem is that Nvidia is only going to show results that are favorable to them.

 

If they said "here are all the applications where our shit is amazing, and this is all the stuff that's not gone so well", we might be more open to their data.


Was this also at 4K, like their other performance specs page?

 

Edit: never mind, I see it at the bottom of the page.


 

1 minute ago, Rattenmann said:

Guys, please note: they did not really hide this information. They just did not talk MUCH about it.

 

He said the Infiltrator demo was running at 78 fps.

You can go on Reddit and find that users with a maxed-out 1080 Ti reach about 45 fps.

Just do the math and add some minor OC to the 2080 Ti, and you get your 2-2.5x figure as well.

 

Again, he said that on stage. It was just ignored by a lot of people.

 

That would be nice if true, but I really don't trust Nvidia and their sketchy graphs :P


1 minute ago, Rattenmann said:

Guys, please note: they did not really hide this information. They just did not talk MUCH about it.

That is what I meant when I said hide... I should have said "avoided sharing that information". I mean, I'm all for a 40-50% gain or 2x GTX 1080 performance lol


3 minutes ago, Rattenmann said:

Guys, please note: they did not really hide this information. They just did not talk MUCH about it.

 

He said the Infiltrator demo was running at 78 fps.

You can go on Reddit and find that users with a maxed-out 1080 Ti reach about 45 fps.

Just do the math and add some minor OC to the 2080 Ti, and you get your 2-2.5x figure as well.

 

Again, he said that on stage. It was just ignored by a lot of people.

It wasn’t ignored; it was just very limited information. The slide in the OP adds significantly to that. 


2 minutes ago, Morgan Everett said:

If accurate, then it looks like a 40-50% bump. That’s about what I would expect, but makes the 1080 Ti tempting, with price drops. 

40-50% jump without Tensor cores and vs 1080 SLI (unsure if the slides are labeled wrong, or the source text is labeled wrong tho). 

I will update the OP to make this more clear.


1 minute ago, Rattenmann said:

40-50% jump without Tensor cores and vs 1080 SLI (unsure if the slides are labeled wrong, or the source text is labeled wrong tho). 

I will update the OP to make this more clear.

Yeah, the labels are not very clear. I think they mean 2x the performance of a 1080, but we'll find out when people get their hands on them.


4 minutes ago, MedievalMatt said:

Part of the problem is that Nvidia is only going to show results that are favorable to them.

 

If they said "here are all the applications where our shit is amazing, and this is all the stuff that's not gone so well", we might be more open to their data.

But that's marketing's job: to make the product they're trying to sell sound better than it really is.


I still don't believe any numbers until reviewers get their hands on them, and I don't think it says what settings those games were measured at, either.


12 minutes ago, lvh1 said:

I'll believe it once the reviewers can confirm this kind of performance. If it had this kind of performance wouldn't they have said at least anything about it in the presentation? All we got to see was 'gigarays/s' which doesn't tell us anything about non-rtx performance.

We also don't know what the Y-axis means so it could be anything.

 

Still, would be very nice if it was true.

Maybe it's frame times ???

 

2080TI is half the performance of a 1080!!!

 

/s

I spent $2500 on building my PC and all i do with it is play no games atm & watch anime at 1080p(finally) watch YT and write essays...  nothing, it just sits there collecting dust...

Builds:

The Toaster Project! Northern Bee!

 

The original LAN PC build log! (Old, dead and replaced by The Toaster Project & 5.0)


"Here is some advice that might have gotten lost somewhere along the way in your life. 

 

#1. Treat others as you would like to be treated.

#2. It's best to keep your mouth shut; and appear to be stupid, rather than open it and remove all doubt.

#3. There is nothing "wrong" with being wrong. Learning from a mistake can be more valuable than not making one in the first place.

 

Follow these simple rules in life, and I promise you, things magically get easier. " - MageTank 31-10-2016

 

 


I'll believe it when I see it, but if it's true then Turing may not be the failfest I was anticipating.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


5 minutes ago, Rattenmann said:

40-50% jump without Tensor cores and vs 1080 SLI (unsure if the slides are labeled wrong, or the source text is labeled wrong tho). 

I will update the OP to make this more clear.

They just mean that the new card can double the performance of the old one. 


Waiting for more details

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


18 minutes ago, Rattenmann said:

It is worth noting, though, that the slides themselves show RTX 2080 vs 2x (!) 1080.

We may end up seeing an even bigger jump. This kind of jump vs 1080 SLI is crazy. And all that without factoring any RT or Tensor into the mix.

I don't think that means SLI, but two times (they always like to say "2X" in their press releases); it's also just the title of the slide.

 

I would suspect this refers to ray tracing ON, but that's just a guess.


So the picture in the article shows 84 FPS in BF1 at 4K (assuming ultra settings). I'm getting around 55-60 FPS with a GTX 1080 at 4K ultra, so 60 -> 84 is about a 40% gain, which I could believe. That'll probably tank to 60 again once you turn on ray tracing, but I'd be okay with that. Not at the current prices, though.
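A quick sanity check on that math (the GTX 1080 numbers are my own in-game observations, not a controlled benchmark):

```python
# Quick sanity check: the slide's BF1 figure vs. what I see on a GTX 1080
# at 4K ultra (55-60 fps from my own play time, not a controlled benchmark).

slide_fps = 84
gtx1080_fps = 60  # best case from my own experience

print(f"~{slide_fps / gtx1080_fps - 1:.0%} faster")  # prints "~40% faster"
```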


2 minutes ago, PopReference said:

I don't think that means SLI, but two times (they always like to say "2X" in their press releases); it's also just the title of the slide.

 

It is a pretty misleading title for a slide if it does not reflect the actual slide itself, but we can only guess here. If 2x means two times a single 1080 and not SLI, it would be even more impressive, though. After all, an SLI setup would not be the same as just doubling a card's performance.

 

Looking forward to further details.


10 minutes ago, M.Yurizaki said:

But that's marketing's job: to make the product they're trying to sell sound better than it really is.

Right, and I get that... it makes perfect sense. But I think it was AMD who, with Threadripper recently, actually published data showing Intel with a lead in some areas.

 

Mad props to them for doing that, and it adds a ton of credibility to what they are saying.

 

Otherwise we take this data as if it came from videocardz... massive overdose of salt, with some salt thrown in for good measure.


8 minutes ago, MedievalMatt said:

Right, and I get that... it makes perfect sense. But I think it was AMD who, with Threadripper recently, actually published data showing Intel with a lead in some areas.

 

Mad props to them for doing that, and it adds a ton of credibility to what they are saying.

 

Otherwise we take this data as if it came from videocardz... massive overdose of salt, with some salt thrown in for good measure.

Well, I would like to see this because last I heard, AMD kept emphasizing how Threadripper was "50% faster" than Intel, maybe with the "up to" caveat or not. Sure it was, in like one benchmark that was multithreaded to begin with.

