WMGroomAK

nVidia building up for RTX release hype: 9 more DLSS games & more 4k perf comparisons (Plus a GN Video with details)

Recommended Posts

Posted (edited) · Original Poster

At GTC Japan today, nVidia is still building hype for the upcoming RTX 2080 & 2080 Ti by releasing comparison graphs of Turing vs Pascal & Maxwell performance against a 4K 60 FPS target, showing the 2080 and 2080 Ti hitting or exceeding this benchmark while both Pascal & Maxwell fall behind.  They also indicate that enabling DLSS pushes '4K 60 FPS' performance well beyond what the cards manage without it.  

 

4k 60 FPS w/o DLSS:


 

4k 60 FPS w/ DLSS


 

https://www.overclock3d.net/news/gpu_displays/nvidia_reveals_rtx_2080_and_rtx_2080_ti_performance_data_at_gtc_japan_2018/1

Quote

At GTC Japan, Nvidia released two new marketing slides for their RTX 2080 and RTX 2080 Ti graphics cards, showcasing the relative performance of both graphics cards when compared to Nvidia's GTX 1080 and GTX 1080 Ti. 

 

In the slide below, Nvidia claimed that their RTX 2080 is good enough for 4K 60FPS gameplay, placing their RTX 2080 above their GTX 1080 Ti in terms of raw performance, with the company's RTX 2080 Ti delivering even higher performance levels. 

 

Looking at the slide, Nvidia presents the performance jump between the GTX 10 series and the RTX 20 series as similar to the performance leap when moving from the GTX 900 series to the GTX 10 series, a significant performance uptick even without the use of RTX features like Ray Tracing and DLSS (Deep Learning Super Sampling). The only issue here is that Nvidia's new 20 series products also ship with significantly higher MSRPs than their predecessors, a fact that is not considered in this "performance-centric" graph. 

 

When Deep Learning Super Sampling is added into the mix, the performance gap between the RTX 20 series and the GTX 10 series gets significantly wider, delivering much higher performance levels. 

 

So far, Nvidia has confirmed that DLSS will be added to 25 games, utilising Nvidia's AI performance to deliver similar levels of image quality to a native resolution presentation, but with significantly lower computational requirements, providing higher framerates. More information about DLSS and its supported games is available to read here. 

nVidia has also added 9 more games to the DLSS support list, bringing the total to at least 25 games that will have DLSS support.  New additions include Darksiders III, Overkill's The Walking Dead, Fear the Wolves & Hellblade.  

 

https://www.overclock3d.net/news/software/nvidia_builds_dlss_momentum_reveals_9_new_rtx_enabled_games/1

Quote

In effect, DLSS allows Nvidia RTX graphics cards to play games at higher resolutions with lower amounts of computational performance, upscaling images with an AI-made algorithm to offer image quality levels that are similar to a native resolution presentation. In short, this technology allows supported games to run faster on Nvidia RTX enabled graphics cards, making high framerate gameplay at high resolutions more achievable than ever before. 

 

So far Nvidia has announced that sixteen games will support the company's DLSS technology, but now we can add nine more games to the list, including Devolver Digital's SCUM, Overkill's The Walking Dead and Ninja Theory's Hellblade: Senua's Sacrifice. All titles that are new to this list have "Newly Added" listed beside the game's name. No games support DLSS at the time of writing; the titles listed below have agreed to support DLSS, with launch games adding support via future patches/updates. 
 
- Ark: Survival Evolved
- Atomic Heart
- Darksiders III - Newly added
- Dauntless
- Deliver Us The Moon: Fortuna - Newly Added
- Fear The Wolves - Newly Added
- Final Fantasy XV
- Fractured Lands
- Hellblade: Senua's Sacrifice - Newly added
- Hitman 2
- Islands of Nyne
- Justice
- JX3
- KINETIK - Newly added
- Mechwarrior 5: Mercenaries
- Outpost Zero - Newly Added
- Overkill's The Walking Dead - Newly Added
- Player Unknown's Battlegrounds
- Remnant: From the Ashes
- SCUM - Newly Added
- Serious Sam 4: Planet Badass
- Shadow of the Tomb Raider
- Stormdivers - Newly Added
- The Forge Arena
- We Happy Few

Honestly, the graphs provided are extremely disappointing: they are horrible graphs with no axis, no scale, and no relative meaning.  Still, it will be nice to see if they can justify the price increase with consistent 4K 60+ FPS performance...

 

Update:  With today apparently being unboxing day for the new cards, Gamers Nexus has put out their take on an unboxing: a technical dive into the Turing architecture that provides better details on how nVidia is calculating Ray Tracing performance and how it is being implemented, the changes from Pascal to Turing, and details on the TU102 chip.  I would definitely encourage a watch, as this is probably going to be one of the better release videos out there.

 

Some key takeaways:

  • nVidia's new RTX-OPS metric: FP32 × 0.8 + INT32 × 0.28 + RT-OPS × 0.4 + Tensor × 0.2
  • Unified L1 cache and shared memory (SRAM), for a total of 32 KB + 64 KB
  • 2 SMs per TPC instead of a single SM per TPC (better segmentation and memory/cache utilization)
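The RTX-OPS weighting can be sanity-checked with a quick sketch. This is only my reading of the metric from the GN video; the input throughputs below are my approximate RTX 2080 Ti figures, not official inputs.

```python
# Sketch of nVidia's RTX-OPS weighting as described in the GN video.
# The weights are the claimed fraction of frame time each unit is busy.
def rtx_ops(fp32_tflops, int32_tips, rt_ops, tensor_tops):
    return (fp32_tflops * 0.8 + int32_tips * 0.28
            + rt_ops * 0.4 + tensor_tops * 0.2)

# Rough RTX 2080 Ti throughputs (my approximations, not official numbers):
# ~14 TFLOPS FP32, ~14 TIPS INT32, ~100 RT-OPS, ~114 tensor TFLOPS.
total = rtx_ops(14, 14, 100, 114)
print(total)  # lands around 78, in the ballpark of nVidia's quoted figure
```

Plugging in plausible throughputs gives a number close to the marketing figure, which at least suggests the formula is how they derived it.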

 

Comparison of full TU102 and RTX 2080 Ti specs (one correction to the chart: the RTX 2080 Ti has 4352 CUDA cores, not 4532):


 

There is also a better explanation of the Ray Tracing processing at around the 22:20 mark...

 

With the specs listed for the full TU102, I would not be surprised if nVidia decides to release an RTX Titan card sometime in January or so...

 

Just going to add that PCPer also has an article up covering much of the same information on the Turing architecture, for those who would like to see all the pretty graphs and read about it...  They also mention that NVLink will provide 50 GB/s over the single link found on the 2080 and 100 GB/s with the dual-link connection on the 2080 Ti.  It looks like the NVLink bridges will be an additional $79.00 (on top of the second card).  Finally, they have a short discussion of the implementation of DLSS, although it is still not as detailed as I would like...

 

https://www.pcper.com/reviews/Graphics-Cards/Architecture-NVIDIAs-RTX-GPUs-Turing-Explored

 

Quote

DLSS

While NVIDIA has been priming gamers to get ready for Ray Tracing and their RTX API through several announcements over 2018, the more surprising part of the technology suite that NVIDIA is referring to as RTX is DLSS.

Using the Tensor cores found in Turing for its inference capabilities, DLSS is a technology that aims to apply deep learning techniques to accelerate and increase the quality of post-processed anti-aliasing.

To implement DLSS, NVIDIA takes an early build of the given game (which they generally receive anyway for driver optimization) and generates a series of "ground truth" images rendered through 64x Super Sampling.

These extremely high-resolution images are then used to train a neural network which is capable of producing output images that NVIDIA claims are nearly identical to the original 64x Super-Sampled source material.

In this current stage, the neural network model needs to be trained for each specific game title. In the future, NVIDIA might be able to come up with more generic models that can be applied to particular genres of games, or different game engines, but at this point, it requires hand-tuning from NVIDIA.

Regardless, NVIDIA claims that implementing DLSS will cost game developers nothing and that they are committed to scaling their workflow and supercomputers used for training as far as necessary to meet demand.

This neural network model is then distributed via GeForce Experience to end users who have a GPU with tensor cores and have the given game installed. This distribution model is vital, as it allows NVIDIA to silently update the model in the background as they gain experience and come up with better techniques.

Performance-wise, NVIDIA is comparing the performance hit of enabling DLSS to their most recent push of anti-aliasing technology, TAA. While TAA is already a reasonably lightweight method, NVIDIA is claiming performance benefits of 2x when comparing DLSS to TAA. Also, unlike TAA, DLSS is temporally stable, which can help prevent some fast-moving details from becoming blurred on screen.

While DLSS as a whole seems like a worthwhile initiative, with claims of 64x SSAA quality at very little performance cost, the more controversial part is NVIDIA pushing DLSS as a significant performance differentiator from previous GPUs. 

A lot of the early performance claims that NVIDIA has been making about these Turing GPUs go out of their way to also show DLSS as part of the performance increase story of Turing over Pascal.

Even if these performance numbers are indicative of the advantages that DLSS can give over traditional AA techniques, you have to place a lot of faith in this feature being implemented in the games you want to play to take it into account vis-à-vis a purchasing decision.
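The train-on-ground-truth workflow PCPer describes can be miniaturized into a toy sketch. To be clear, this is not DLSS (which trains a deep network on 64x supersampled frames); it is just a 1-D least-squares filter fit against high-resolution ground truth, on synthetic data, to show the same supervised-upscaling idea.

```python
import numpy as np

# Toy version of supervised upscaling: fit a tiny linear "model" on
# high-res ground truth, then compare it to naive nearest-neighbour
# upscaling. All signals here are synthetic sine waves.
rng = np.random.default_rng(0)

# "Ground truth": smooth high-res signals (stand-ins for supersampled frames).
t = np.linspace(0, 2 * np.pi, 64)
hi = np.sin(t + rng.uniform(0, 2 * np.pi, size=(500, 1)))   # (500, 64)
lo = hi.reshape(500, 32, 2).mean(axis=2)                    # 2x downsample

# Features: each low-res sample plus its right neighbour; targets: the two
# high-res samples the left low-res sample covers.
X = np.stack([lo[:, :-1], lo[:, 1:]], axis=2).reshape(-1, 2)
Y = hi[:, :62].reshape(500, 31, 2).reshape(-1, 2)

W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # learned 2-in, 2-out filter
mse_learned = np.mean((X @ W - Y) ** 2)

# Baseline: nearest-neighbour upscaling (repeat each low-res sample).
mse_naive = np.mean((np.repeat(X[:, :1], 2, axis=1) - Y) ** 2)
print(f"learned: {mse_learned:.6f}  naive: {mse_naive:.6f}")
```

Even this trivial learned filter beats naive upscaling on smooth signals, which is the core intuition: a model trained against ground truth reconstructs detail that simple scaling cannot.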

 

Edited by WMGroomAK
Added PCPer article
16 minutes ago, CatTNT said:

nvidia fucking up their graphs SO BADLY.

Pretty much this.

 

Pointless graphs like this are why (among other reasons) I have trust issues.

 

The 980 Ti is fully capable of 4K60 in the proper situation.

 

To Nvidia: more context please (more than none).


System specs:

4790k

GTX 1050

16GB DDR3

Samsung evo SSD

a few HDD's


Settle down guys, it's just marketing. It has nothing to do with engineers or even management.    You know, like ads telling you to use brand X paracetamol for better pain relief: marketing with men in white coats and clipboards telling you it is scientifically proven to work actually works on people.

 

 

 


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.


That graph is just pure marketing 


Please tag me if you need assistance or if you want me to contribute to a topic 

 

ASUS RoG STRIX GL502VM

Intel Core i7 7700HQ | GeForce GTX 1060 6GB | 16GB DDR4-2133 | 128GB SanDisk M.2 SATA SSD + 1TB 7200RPM Hitachi HDD | 15.6" 1080p IPS monitor @ 60Hz w/ G-SYNC | Windows 10 64-bit

 

Samsung Galaxy Note8 SM-N950F

Exynos 8895 (4x Mongoose @ 2.3GHz, 4x Cortex A53 @ 1.7GHz) | ARM Mali G71 MP20 | 6GB LPDDR4 | 64GB Samsung NAND flash w/ UFS 2.1 dual-lane controller + 128GB SanDisk C10 UHS-I microSD | 6.3" 1440p "Infinity Display" AMOLED | Android Nougat 7.1.1 w/ Samsung Experience 8.5


Someone needs to explain this to me.

 

Up until now I have always assumed that DLSS is just another fancy form of AA that leverages new hardware for better performance, implying that a game with DLSS on will never run as well as a game without any AA at all (as it is popular to run 4K games without AA). So I always thought that the DLSS comparison was against TAA.

 

But from what I have been seeing recently it almost sounds like enabling DLSS actually gives you MORE FPS than playing without any form of AA which would be freaking miraculous.

 

Do I have this right or am I missing something?


CPU: i7 6950X  |  Motherboard: Asus Rampage V ed. 10  |  RAM: 32 GB Corsair Dominator Platinum Special Edition 3200 MHz (CL14)  |  GPUs: 2x Asus GTX 1080ti SLI 

Storage: Samsung 960 EVO 1 TB M.2 NVME  |  PSU: In Win SIV 1065W 

Cooling: Custom LC 2 x 360mm EK Radiators | EK D5 Pump | EK 250 Reservoir | EK RVE10 Monoblock | EK GPU Blocks & Backplates | Alphacool Fittings & Connectors | Alphacool Glass Tubing

Case: In Win Tou 2.0  |  Display: Alienware AW3418DW  |  Sound: Audio-GD NFB-11.28 + Focal Elear Headphones

Posted · Original Poster
Just now, Lathlaer said:

But from what I have been seeing recently it almost sounds like enabling DLSS actually gives you MORE FPS than playing without any form of AA which would be freaking miraculous.

My basic understanding of what DLSS does is that it is an upscaling technology: you render at something like 1080p or 1440p framerates and upscale to 4K resolution.
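For a rough sense of why rendering below target resolution buys framerate: shading cost scales roughly with pixel count. The 1440p internal resolution here is purely an assumption for illustration; nVidia has not confirmed what DLSS renders at internally.

```python
# Pixel-budget arithmetic: fragment shading work scales ~linearly with pixels.
native = 3840 * 2160      # native 4K
internal = 2560 * 1440    # assumed DLSS internal render resolution (a guess)
ratio = internal / native
print(f"Internal render shades {ratio:.0%} of native 4K's pixels per frame.")
```

If the tensor cores can upscale the result convincingly, the shader cores only do about 44% of the per-frame pixel work, which is where the FPS headroom would come from.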


Well... I potentially could play 2 of those games. Shame it has to be implemented, though I have no real intention of upgrading to rtx any time soon

Edited by RotoCoreOne
Missing comma

Horrible graphs


PC Specs: i7 6EiGht00K (4.4ghz), Asus DeLuxe X99A II, GTX1080 Zotac Amp ExTrEme),64Gb DOminator PlatinUm, EVGA G2 seven5zeroWatt, Phanteks Enthoo Primo, 3TB WD Black, 500gb 850 Evo, H100iGTX, Windows 10, K70 RGB, G502, HyperX Cloud 2s, Asus MX34. SAMSUNG 960 EVO

Just keeping this here as a backup: 980tiZotacStockBIOS.zip

55 minutes ago, Lathlaer said:

Someone needs to explain this to me.

 

Up until now I have always assumed that DLSS is just another fancy form of AA that leverages new hardware for better performance, implying that a game with DLSS on will never run as well as a game without any AA at all (as it is popular to run 4K games without AA). So I always thought that the DLSS comparison was against TAA.

 

But from what I have been seeing recently it almost sounds like enabling DLSS actually gives you MORE FPS than playing without any form of AA which would be freaking miraculous.

 

Do I have this right or am I missing something?

I actually have exactly the same question in my head right now.

Was 100% sure I got what DLSS was until I saw these graphs.

 

Apart from being crap without context, it looks like DLSS does indeed improve FPS. The 4k60 line does not change,... 

I realize you can make graphs look funky in order to semi-fake stuff. But nVidia never actually LIED in graphs; they only presented them selectively / misleadingly.

 

So, let's just assume they are not flat out lying here for a second: How can this graph be interpreted?

What do you have to do to this graph to make sense of it... any statistics guys around who know what is happening here?

 

Edit:

My best guess: the first graph is indeed with AA on, and they took a game at ultra settings, so the 1080 Ti does not reach 4K60.

That would explain the DLSS uplift in the second screen. This kinda has to be the case, as I wholeheartedly doubt DLSS actually improves FPS from a baseline without AA on.


That being said, if the graphs are indeed a game on Ultra, so the 1080ti does not hit 4k60, then those 2080 (non ti) values do look pretty crazy good tbh.

I mean it does indeed have more horses than a 1080ti, but with DLSS support that's a kinda impressive jump.

 

Really can't wait to see what DLSS really is and how all this was measured. With so many games adding support, it seems to not be all that hard for developers.

1 hour ago, Tech Enthusiast said:

That being said, if the graphs are indeed a game on Ultra, so the 1080ti does not hit 4k60, then those 2080 (non ti) values do look pretty crazy good tbh.

I mean it does indeed have more horses than a 1080ti, but with DLSS support that's a kinda impressive jump.

 

Really can't wait to see what DLSS really is and how all this was measured. With so many games adding support, it seems to not be all that hard for developers.

It looks like they send their code to nVidia to run on their DGX supercomputer to get support.

6 hours ago, Lathlaer said:

Someone needs to explain this to me.

 

Up until now I have always assumed that DLSS is just another fancy form of AA that leverages new hardware for better performance, implying that a game with DLSS on will never run as well as a game without any AA at all (as it is popular to run 4K games without AA). So I always thought that the DLSS comparison was against TAA.

 

But from what I have been seeing recently it almost sounds like enabling DLSS actually gives you MORE FPS than playing without any form of AA which would be freaking miraculous.

 

Do I have this right or am I missing something?

If I understand it correctly, the DLSS is being processed by another chip? So when you use DLSS it turns off AA which basically free some performance of GPU that can be used to render more frames (lightly said in very non-technical way :D)

 

Just imagine that you are using one card to render game without AA and you have a second card that takes care of the AA.


CPU: AMD Phenom II X4 945 | Mobo: MSI 790GX-G65 | RAM: Corsair Twinx 2GB+2GB | GPU: Gigabyte HD 6850 | PSU: Seasonic 500W | Storage: 250GB + 500GB Seagate | Cooling: Thermaltake Frio Silent 12 | Case: C-Tech Radiant

CPU: Intel i5-4690K | Mobo: MSI Z97 Gaming 3 | RAM: Kingston Savage 4GB+4GB | GPU: Asus Strix GTX970 | PSU: Seasonic M12II-620 Evo | Storage: MX100 128GB + WD Blue 1TB | Cooling: CM Hyper 212 evo | Case: NZXT H440
53 minutes ago, Dezz said:

If I understand it correctly, the DLSS is being processed by another chip? So when you use DLSS it turns off AA which basically free some performance of GPU that can be used to render more frames (lightly said in very non-technical way :D)

 

Just imagine that you are using one card to render game without AA and you have a second card that takes care of the AA.

That was how I interpreted it too: the DLSS is being done by the new tensor cores, which frees up the CUDA cores to do all the FPS work without having to apply any AA or scaling as well (even though they have fewer CUDA cores than the 10 series, they are doing less work).

 

I just want to wait for independent benchmarks to confirm/debunk this before getting too involved in a discussion about how good/bad it really is.




Those are bullshit graphs. I still maintain that the performance increase is meaningless when it comes with an equivalent price increase. The 2080 should be directly compared to the 1080ti because that's what the price brackets look like. And the 2080 is slightly faster than the 1080ti without much TDP decrease. Yet people still preorder, all it's going to do is allow Nvidia (and perhaps AMD) to increase their prices permanently
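To put that complaint in rough numbers: MSRPs below are the launch list prices, but the relative-performance figures are purely hypothetical placeholders, not benchmarks.

```python
# Back-of-envelope perf-per-dollar. MSRPs are launch list prices; the
# relative-performance numbers are hypothetical placeholders, not benchmarks.
cards = {
    "GTX 1080 Ti": (699, 1.00),    # baseline
    "RTX 2080":    (799, 1.05),    # assumed ~5% faster than a 1080 Ti
    "RTX 2080 Ti": (1199, 1.35),   # assumed ~35% faster than a 1080 Ti
}
ppd = {name: perf / msrp * 1000 for name, (msrp, perf) in cards.items()}
for name, value in ppd.items():
    print(f"{name}: {value:.2f} relative perf per $1000")
```

With these placeholder uplifts, perf per dollar actually drops generation-on-generation, which is exactly the point being made: the performance gain is absorbed by the price increase.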


If I'm honest I spend more time playing with the hardware that I do playing on the hardware

 

-Rig Specs in Profile

 

 

 

11 minutes ago, Froody129 said:

Those are bullshit graphs. I still maintain that the performance increase is meaningless when it comes with an equivalent price increase. The 2080 should be directly compared to the 1080ti because that's what the price brackets look like. And the 2080 is slightly faster than the 1080ti without much TDP decrease. Yet people still preorder, all it's going to do is allow Nvidia (and perhaps AMD) to increase their prices permanently

It's almost like we are seeing this in the phone market.

 

Still, though, those are pretty bad performance graphs. At least phone manufacturers step it up a notch and remove stuff in the name of progress to justify it. (Yes, I know, RTX, though at the price of another GPU it's kinda meh.)

11 minutes ago, GoldenLag said:

at least phone manufacturers step it up a notch

I see what you did there


Processor: i7 7700k@Stock GPU: GTX 1080 MSI Armor OC  Mobo: Asus Prime Z270-A RAM: Corsair LPX 16GB 3200 MHz CPU Cooler: be quiet! Dark Rock 3 SSD: Sandisk Ultra 2 960GB Case: Phanteks P400s PSU: Gigabyte B700H Monitor: AOC G2460PF 1080p 144Hz Mouse: Logitech G900 Keyboard: Corsair Strafe w/ MX Blues


Without a vertical axis on that graph, we know precisely dick. There is a huge space in the DLSS picture that would suggest a huge improvement, but that could mean +20 FPS or +10 FPS. Imagine if the 1080 Ti under the 4K 60 FPS line means something like 58 FPS xD

 

1 hour ago, Dezz said:

If I understand it correctly, the DLSS is being processed by another chip? So when you use DLSS it turns off AA which basically free some performance of GPU that can be used to render more frames (lightly said in very non-technical way :D)

 

Just imagine that you are using one card to render game without AA and you have a second card that takes care of the AA.

 

59 minutes ago, mr moose said:

That was how I interpreted it too: the DLSS is being done by the new tensor cores, which frees up the CUDA cores to do all the FPS work without having to apply any AA or scaling as well (even though they have fewer CUDA cores than the 10 series, they are doing less work).

Yes, that was my initial interpretation as well, meaning that no matter how you look at it, DLSS will never be as good as running a game without AA. It would be nice of them to note in the graph that all non-DLSS values are with TAA on, because otherwise we arrive at the weird conclusion that enabling DLSS actually improves your FPS over running no AA at all.



1 minute ago, Lathlaer said:

Without vertical axis on that graph we know precisely dick. There is a huge space in the DLSS picture that would suggest huge improvement but that could mean +20 FPS or +10 FPS. Imagine that 1080ti under the 4K 60FPS means something like 58 FPS xD

 

 

Yes, that was my initial interpretation as well, meaning that no matter how you look at it, DLSS will never be as good as running a game without AA. It would be nice of them to note in the graph that all non-DLSS values are with TAA on, because otherwise we arrive at the weird conclusion that enabling DLSS actually improves your FPS over running no AA at all.

It may actually do that, though: it is possible that the 2080 with DLSS might perform on par with or better than the 1080 Ti without AA.  We are just going to have to wait for independent testing for all the qualifiers and down-to-earth numbers, which is why I have been reluctant to join in what is essentially speculation about how good this DLSS really is.  It seems people have all gone out and pre-ordered the thing to death, and to be honest, there have been so few GPUs that weren't worth the asking price on release that I can hardly blame them for assuming this will be no different.

 

I think it has been so long since we have seen such a different approach to GPU design that, if I had to make a prediction, I would say it will take a few updates to iron the kinks out of the drivers/games before we see the real performance values. Also, because of that difference, synthetic tests will likely not be reflective of real-world gaming results; basically, if that happens, synthetic tests will be useless for RTX comparisons.




**80 cards the equivalent of **80ti cards.....yeah no.


"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

