
NVIDIA GeForce RTX 2080 confirmed at up to 2x GTX 1080 performance. Evidence for the 2080 Ti as well.

Updated the OP with two more sources for the German audience.

 

Also updated my take on the chart title.

I do think they just picked a really silly title, and it actually means "a 2080 is like two 1080s". 

 

Added a quote showcasing potential 2080 Ti performance, along with a note that the calculated 2080 Ti performance is based on a lower FPS claim from Jensen than what the press confirmed seeing for themselves.


This would be very good news if the RTX 2080 were priced to compete with the GTX 1080, but it's not; it's priced against the GTX 1080 Ti. So, taking NVIDIA at face value, I'd reckon we're looking at a 10-20% gain in normal titles. 

 

NVIDIA is making some very big bets on ray tracing, and I'm not convinced yet, not without a lot more developer support.

if you have to insist you think for yourself, i'm not going to believe you.


Since they were heavy on the ray tracing in the presentation, I am wondering if this graph is a comparison of a 1080 with ray tracing vs a 2080 with ray tracing.

 

Again going to need more details on this.


Just now, exetras said:

Since they were heavy on the ray tracing in the presentation, I am wondering if this graph is a comparison of a 1080 with ray tracing vs a 2080 with ray tracing.

 

Again going to need more details on this.

They already "verified" it was over twice as fast in ray tracing at the event, something like 30 vs 78 FPS for the 1080 Ti vs the 2080 Ti. That's about all we have, but it lines up best with the chart. 
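
For what it's worth, a quick back-of-the-envelope check of that claim, using only the FPS figures quoted above (a rough sketch, not official numbers):

    # Rough sanity check using only the FPS figures quoted above (not official data)
    fps_1080ti = 30   # claimed 1080 Ti result in the demo
    fps_2080ti = 78   # claimed 2080 Ti result in the demo
    speedup = fps_2080ti / fps_1080ti
    print(f"{speedup:.2f}x")  # ~2.6x, i.e. "over twice as fast"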


Just now, exetras said:

Since they were heavy on the ray tracing in the presentation, I am wondering if this graph is a comparison of a 1080 with ray tracing vs a 2080 with ray tracing.

 

Again going to need more details on this.

It most definitely is not with RT.

 

1. RT is not included in most of the shown games.

2. They said a 1080 Ti would be around 1.21 Gigarays, so a 1080 would be below 1 Gigaray. Compared to 8 Gigarays for a 2080, the difference would be MUCH, MUCH bigger (rough math at the end of this post).

 

So no, these charts don't include RT and we can be about 99.99% sure of that. 

There remains a 0.01% chance that all of those games suddenly enabled RT, just for the sake of this chart. I guess we can safely ignore this chance hehe.
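
As a rough sketch of why the gap would be far bigger with RT on, taking the Gigaray figures quoted above at face value (the plain 1080 value is an assumption, since only the 1080 Ti figure was quoted):

    # Sketch only: Gigaray figures as quoted in this thread; the plain 1080 value is assumed
    gigarays_1080 = 1.0   # assumed "sub 1 Gigaray", just below the 1080 Ti's ~1.21
    gigarays_2080 = 8.0   # NVIDIA's quoted figure for the RTX 2080
    ratio = gigarays_2080 / gigarays_1080
    print(f"~{ratio:.0f}x in ray throughput")  # ~8x, nowhere near the ~2x shown on the chart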


1 minute ago, mynameisjuan said:

They already "verified" it was over twice as fast in ray tracing at the event, something like 30 vs 78 FPS for the 1080 Ti vs the 2080 Ti. That's about all we have, but it lines up best with the chart. 

Please don't take information out of context and present it as fact.

 

The verified demo you mention is from 2015 and includes zero RT. It was a pure test of non-RT performance.

We cannot compare any non-RT cards to Turing in ray tracing, because it would basically show a 10x gap. It would be neither fair nor helpful.


Okay, they have my attention. Eagerly awaiting independent benchmarks but damn if that thing checks out.

CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Woo Audio WA8 Eclipse + Focal Utopia Headphones


1 hour ago, M.Yurizaki said:

What's funny is people are laughing at NVIDIA for not showing performance comparisons, but the moment one comes out they go "it's from NVIDIA, don't believe it."

 

Damned if you do, damned if you don't.

No, the point is not that they finally showed something and people are complaining. The point is that they weren't showing anything about real-world performance outside of ray tracing, and now AFTER the fact they seem to be backpedaling to show us how good it is. If they had shown it in the first place, I would be less suspicious.

 

On top of that, why are they only showing 4K resolutions in all their examples?


Updated the OP to make it clear that no RT is involved in either the slides or the quote about 2080 Ti performance.

This seems to be a pretty widespread misunderstanding and is important to note.

 

RT is just on top of the already impressive numbers. It is kinda like having your cake and being able to eat it too.

Even if you don't care about RT at all (I don't understand how you would not, given the huge jump in graphical quality), the performance gain still stands on its own.


2 minutes ago, Underwrought said:

If they had shown it in the first place, I would be less suspicious.

 

They showed the holy grail of graphics. 

A technology many companies have collaborated on for decades before it finally showed up.

 

I can personally understand that they wanted to focus on that and not diminish it by showing unrelated numbers.

I can, however, also understand where you are coming from. If you don't care about better graphics, just about more performance, the presentation was meh.


2 minutes ago, Underwrought said:

On top of that, why are they only showing 4K resolutions in all their examples?

Because those are the purest GPU comparisons you can get, as little CPU-bound as possible.



8 minutes ago, Underwrought said:

No, the point is not that they finally showed something and people are complaining. The point is that they weren't showing anything about real-world performance outside of ray tracing, and now AFTER the fact they seem to be backpedaling to show us how good it is. If they had shown it in the first place, I would be less suspicious.

 

On top of that, why are they only showing 4K resolutions in all their examples?

And people are still suspicious because it's a first-party source, and thus marketing takes over by showing you only what they want to show you.

 

The other question is how were these slides released? Did NVIDIA have some other presentation that didn't get the same flair as the keynote speech? And why does the keynote speech have to show some sort of performance chart? This is the first time NVIDIA is showing this GPU, and people other than gamers are interested in it. If anything, the presentation was probably geared more towards those in the game development industry, and I'm pretty sure those in the game development industry don't care about performance as much as they care about what new sparkling features they can use in their games.

 

As for only showing 4K, it's because that's a GPU-centric load. It takes CPU performance out of the picture as much as possible.


3 minutes ago, Lathlaer said:

Because those are the purest GPU comparisons you can get, as little CPU-bound as possible.

Touché.


5 minutes ago, Rattenmann said:

They showed the holy grail of graphics. 

A technology many companies have collaborated on for decades before it finally showed up.

 

I can personally understand that they wanted to focus on that and not diminish it by showing unrelated numbers.

I can, however, also understand where you are coming from. If you don't care about better graphics, just about more performance, the presentation was meh.

 

LMAO, wut, the holy grail of graphics? Says who, NVIDIA's marketing department? And you clearly don't understand how marketing works if you're trying to say that showing performance numbers would have 'diminished' the presentation, lmao. Also even more laughable considering they tried to pass off RTX/ray tracing performance numbers as regular FPS performance numbers in a very deliberate and deceitful way.

 

2 minutes ago, M.Yurizaki said:

And people are still suspicious because it's a first-party source, and thus marketing takes over by showing you only what they want to show you.

 

The other question is how were these slides released? Did NVIDIA have some other presentation that didn't get the same flair as the keynote speech? And why does the keynote speech have to show some sort of performance chart? This is the first time NVIDIA is showing this GPU, and people other than gamers are interested in it. If anything, the presentation was probably geared more towards those in the game development industry, and I'm pretty sure those in the game development industry don't care about performance as much as they care about what new sparkling features they can use in their games.

 

As for only showing 4K, it's because that's a GPU-centric load. It takes CPU performance out of the picture as much as possible.

You really think game devs don't care much about performance? LOL.

 

That's why most games are sold on low-resolution/quality consoles, right?

 

Also, yeah, the FOR THE GAMERS keynote event wasn't REALLY for gamers, right?


The graph is intentionally misleading. 1x means both

  • Performance without AA
  • Performance with AA

despite those two numbers being wildly different from each other.

 

The card is 30-40% better on average than the 1080, unless you're religiously using ray tracing and heavy AA.

 

Additionally, Ark runs at around 30 FPS tops on a 1080, and I'm sure PUBG runs about the same. Such poorly optimized games shouldn't be used as benchmarks.

PLEASE QUOTE ME IF YOU ARE REPLYING TO ME

Desktop Build: Ryzen 7 2700X @ 4.0GHz, AsRock Fatal1ty X370 Professional Gaming, 48GB Corsair DDR4 @ 3000MHz, RX5700 XT 8GB Sapphire Nitro+, Benq XL2730 1440p 144Hz FS

Retro Build: Intel Pentium III @ 500 MHz, Dell Optiplex G1 Full AT Tower, 768MB SDRAM @ 133MHz, Integrated Graphics, Generic 1024x768 60Hz Monitor


 


Looking at the graph again 

 

Am I the only one still trying to wrap my head around what the actual performance figures are? 

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


43 minutes ago, Rattenmann said:

Please don't take information out of context and present it as fact.

 

The verified demo you mention is from 2015 and includes zero RT. It was a pure test of non-RT performance.

We cannot compare any non-RT cards to Turing in ray tracing, because it would basically show a 10x gap. It would be neither fair nor helpful.

What the hell are you talking about? The demo was 100% ray tracing.


8 minutes ago, D13H4RD2L1V3 said:

Looking at the graph again 

 

Am I the only one still trying to wrap my head around what the actual performance figures are? 

Looking to be the typical ~35% gain across the board for next gen, until you factor in DLSS.
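
Rough numbers behind that read of the chart; the exact percentages are guesses from eyeballing the bars, just to illustrate how DLSS pushes it towards the "2x" headline:

    # Hypothetical bar values, only to illustrate the reading above (1080 baseline = 1.0x)
    raster_gain = 1.35               # assumed ~35% uplift from the dark green bars alone
    dlss_factor = 1.45               # assumed extra ~45% from the light green DLSS portion
    with_dlss = raster_gain * dlss_factor
    print(f"{raster_gain:.2f}x without DLSS, ~{with_dlss:.2f}x with DLSS")  # ~2x only once DLSS is counted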


8 minutes ago, rcmaehl said:

The graph is intentionally misleading. 1x means both

  • Performance without AA
  • Performance with AA

despite those two numbers being wildly different from each other.


Where does the graph mention not using AA?

The green bar replaces AA with the new AI-based AA, that is all. And since only 20-series cards support it, the 1080 can't switch to that, right? 

 

5 minutes ago, D13H4RD2L1V3 said:

Looking at the graph again 

 

Am I the only one still trying to wrap my head around what the actual performance figures are? 

Unsure what is unclear. Can you go into more detail?

 

The grey bars are baseline 1080 values for the game running at 4K with common AA on.

The dark green bars are 2080 values for the game running at 4K with common AA on.

The light green bars are the additional gain for 2080 cards when swapping common AA for the Tensor-core-based AA. 

 

If it makes it easier to understand, imagine the light green bars as a third bar placed next to the dark green ones. They are just two pieces of information in one bar, which is the usual way to do it when you can, to save space and reduce clutter. I understand why it can actually end up more confusing though. ;-)
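
If it helps, here is the same reading in pseudo-numbers; the values are made up, only the structure (grey vs dark green vs light green) mirrors the chart:

    # Made-up values, only to show how one stacked bar splits into the two data points
    baseline_1080   = 1.0   # grey bar: GTX 1080 at 4K with common AA
    dark_green_2080 = 1.5   # dark green: RTX 2080 at 4K with the same common AA
    light_green_add = 0.5   # light green: extra gain when common AA is swapped for DLSS
    total_with_dlss = dark_green_2080 + light_green_add
    print(f"{dark_green_2080 / baseline_1080:.1f}x without DLSS, "
          f"{total_with_dlss / baseline_1080:.1f}x with DLSS")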


1 minute ago, Rattenmann said:

Where does the graph mention not using AA?

The green bar replaces AA with the new AI-based AA, that is all. And since only 20-series cards support it, the 1080 can't switch to that, right? 

+DLSS means the performance was compared with AA enabled



3 minutes ago, mynameisjuan said:

What the hell are you talking about? The demo was 100% ray tracing.

Please post evidence for that, if you really believe it.

The Infiltrator demo does not include any RT code whatsoever. You can verify this by downloading it yourself.


23 minutes ago, Underwrought said:

You really think game devs don't care much about performance? LOL.

Not to the extent that they'll go "oh my god, the RTX 2080 doesn't get 60 FPS in everything, we shouldn't support it!" But if you want to find articles interviewing developers who think these new GPUs suck, feel free to link them to me.

 

Quote

That's why most games are sold on low-resolution/quality consoles, right?

Yes, because the goal of making a product is to make money, which often means you need to sell as much as possible. Targeting only the high end limits your customer base so severely that you'll be making financial flop after financial flop with the budgets being spent on AAA development.

 

Quote

Also, yeah, the FOR THE GAMERS keynote event wasn't REALLY for gamers, right?

If that was the title of their keynote event, then I could twist the wording around to get an interpretation of "things we're doing for the gamers" rather than "an address to the gamers." The phrase is ambiguous at best.

 

But otherwise, people concerned about performance figures, people who likely identify as PC gamers, are telling me that NVIDIA didn't cater this presentation to them. Almost nobody here is talking about what any of this new technology could bring. Graphics isn't all about trying to piss out the highest frame rate; otherwise we should've stopped development with DirectX 7 and catered to just churning out as many frames as we can.

 

Besides that, almost every time some supposed "game changer" for GPUs comes out, it performs worse than people hoped (except maybe in one or two scenarios): 3D accelerators (S3 ViRGE), hardware transform and lighting (GeForce 256), fully programmable shaders (GeForce 3/Radeon 8500), unified shaders (GeForce 8 and Radeon HD 2000), DirectX 11 cards (Radeon HD 5970), and now DirectX 12/Vulkan hardware.

 

I mean performance is nice and all, but I expect graphics hardware to appreciably improve either overall performance or image quality. I don't expect them to appreciably improve both at the same time, but it's a nice bonus.


2 hours ago, MedievalMatt said:

I also think that just switching from TFLOPS to Gigarays as a measure of compute power is very odd, since they are not at all comparable. How many TFLOPS does it take to make a Gigaray? Or maybe I'm an idiot.

You need 1.21 Gigarays to match the speed of 88 1080's

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


2 minutes ago, M.Yurizaki said:

I mean performance is nice and all, but I expect graphics hardware to appreciably improve either overall performance or image quality. I don't expect them to appreciably improve both at the same time, but it's a nice bonus.

 

For me, this is the actual kicker here.

More performance plus better graphics.

Sad that people just see the price and go into instant hate mode, without even checking some key facts. Glad NVIDIA tried to improve graphics for the first time in what feels like forever. And from what we were shown, I am actually impressed.

 

Sure it won't be in every game right away. But should we ignore new stuff, just because it won't be everywhere out of the box? That would be quite sad!

 

And at least the Tensor-core-based AA is really easy to implement. It is basically a checkbox in the engine. 


They're comparing different anti-aliasing methods. That says more about the anti-aliasing methods than about the cards themselves.

Going all-in on deep learning will bite them in the ass in the long run. It's cheap, but it'll create image quality issues in the end.

