nVidia building up for RTX release hype: 9 more DLSS games & more 4K perf comparisons (Plus a GN Video with details)

WMGroomAK
48 minutes ago, VegetableStu said:

I'm not sure how to write this into tech news, so for everyone's info quickly....

now I see where Nvidia's 6x claim came from

maybe if something uses it all lol, probably a 25% increase across the board normally

and with other shit like DLSS maybe more, but hopefully some reviews actually take nice screenshots to see if there are any visual differences between DLSS and non-DLSS

23 minutes ago, pas008 said:

now I see where Nvidia's 6x claim came from

maybe if something uses it all lol, probably a 25% increase across the board normally

and with other shit like DLSS maybe more, but hopefully some reviews actually take nice screenshots to see if there are any visual differences between DLSS and non-DLSS

Nvidia wasn't lying, but it was way too much marketing stuff. "It's 6x better at something no one does!" isn't exactly useful.

 

Weird thing is I think these are going to be pretty good tier-for-tier upgrades, in the 30-35% range, with the 2080's better 4K edge case since the 1080 just didn't have the memory bandwidth to handle 4K/60 at Ultra settings. The Ray Tracing stuff is "what your games will use in 5 years" technology, so I can see Nvidia being happy about bringing it forward, but someone clearly wasn't on point in the marketing of this launch.
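For context, here's a quick back-of-the-envelope on that bandwidth gap (a Python sketch; the spec numbers are from memory, so treat them as approximate):

# Memory bandwidth ~= data rate per pin (Gbps) x bus width (bits) / 8
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

gtx_1080 = bandwidth_gb_s(10, 256)  # GDDR5X -> 320 GB/s
rtx_2080 = bandwidth_gb_s(14, 256)  # GDDR6  -> 448 GB/s
print(rtx_2080 / gtx_1080)          # ~1.4, i.e. a 40% jump on the same bus width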

 

@leadeater, looks like Turing is a Volta-Pascal hybrid. They dialed back Volta's structure to make it more streamlined (which makes sense for a mostly consumer-focused design). We're going to see some legitimate edge case games that really love what Nvidia did, even on a per-CUDA-core basis. I don't get why they've been sandbagging so badly on the information, unless they're just trying to hide the sticker shock, since this stuff isn't going to be cheap.

3 minutes ago, Taf the Ghost said:

Nvidia wasn't lying, but it was way too much marketing stuff. "It's 6x better at something no one does!" isn't exactly useful.

 

Weird thing is I think these are going to be pretty good tier-for-tier upgrades, in the 30-35% range, with the 2080's better 4K edge case since the 1080 just didn't have the memory bandwidth to handle 4K/60 at Ultra settings. The Ray Tracing stuff is "what your games will use in 5 years" technology, so I can see Nvidia being happy about bringing it forward, but someone clearly wasn't on point in the marketing of this launch.

 

@leadeater, looks like Turing is a Volta-Pascal hybrid. They dialed back Volta's structure to make it more streamlined (which makes sense for a mostly consumer-focused design). We're going to see some legitimate edge case games that really love what Nvidia did, even on a per-CUDA-core basis. I don't get why they've been sandbagging so badly on the information, unless they're just trying to hide the sticker shock, since this stuff isn't going to be cheap.

You watched this yet?

 

 

I was right about the doubling of SMs for Turing :). Really interesting video; I'm surprised they're allowed to give out that much detail right now.

 

Edit:

Nvm, vid already posted; removed the link

17 minutes ago, leadeater said:

I was right about the doubling of SMs for Turing :). Really interesting video; I'm surprised they're allowed to give out that much detail right now.

Can't watch the vid right now. I've seen other architecture articles appear today, so presumably an embargo is up. Although with GN, I don't know if they signed up, or if they took other routes to get the info and are keeping roughly in sync out of respect for others who did sign up.

 

6 minutes ago, leadeater said:

Graphs * 0.8 + Better Graphs * 0.28 + Number of Bars on Graphs * 0.4 + Raytracing * 0.2 = Best GPU performance ever!

There's a simpler way to tell what performance is better. Green = good. Red = bad. 

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

5 minutes ago, leadeater said:

Graphs * 0.8 + Better Graphs * 0.28 + Number of Bars on Graphs * 0.4 + Raytracing * 0.2 = Best GPU performance ever!

You could see how much Steve hated trying to even figure out what they were up to with that. The sad part is Nvidia really isn't lying. They're just that guy who really wants to tell you about his stamp collection. I think the problem is Jensen has become his own meme. He almost "no sells" the really cool new stuff, and it just comes off as uninteresting.

Just now, porina said:

Can't watch the vid right now. I've seen other architecture articles appear today, so presumably an embargo is up. Although with GN, I don't know if they signed up, or if they took other routes to get the info and are keeping roughly in sync out of respect for others who did sign up.

 

There's a simpler way to tell what performance is better. Green = good. Red = bad. 

At the end of the video, Nvidia apparently sent out a late email saying not to do teardowns. However, Hardware Unboxed, being Aussies, have their videos set to auto-post most of the time. Looks like it went live around midnight in Australia with a teardown of the PCB, though not much analysis of the parts. 8+2 VRM setup, though.

57 minutes ago, Taf the Ghost said:

Nvidia wasn't lying, but it was way too much marketing stuff. "It's 6x better at something no one does!" isn't exactly useful.

 

Weird thing is I think these are going to be pretty good tier-for-tier upgrades, in the 30-35% range, with the 2080's better 4K edge case since the 1080 just didn't have the memory bandwidth to handle 4K/60 at Ultra settings. The Ray Tracing stuff is "what your games will use in 5 years" technology, so I can see Nvidia being happy about bringing it forward, but someone clearly wasn't on point in the marketing of this launch.

 

@leadeater, looks like Turing is a Volta-Pascal hybrid. They dialed back Volta's structure to make it more streamlined (which makes sense for a mostly consumer-focused design). We're going to see some legitimate edge case games that really love what Nvidia did, even on a per-CUDA-core basis. I don't get why they've been sandbagging so badly on the information, unless they're just trying to hide the sticker shock, since this stuff isn't going to be cheap.

I don't think ray tracing will take that long, considering many wanted this long ago when Larrabee was trying to do it

https://www.extremetech.com/gaming/135788-investigating-ray-tracing-the-next-big-thing-in-gaming-graphics

I'm not sure, but I think it would save time for devs, considering it does most of the work for them instead of them handling each object/area/etc.

 

lol, love this quote below now that both have their techniques in place

think Intel actually announcing an attempt to get back into GPUs might have forced this more too

 

Intel announced Larrabee in 2007 and proclaimed that the chip would deliver a digital Holy Grail — real-time ray tracing (RTRT) at playable framerates in modern games. Much was written. AMD and Nvidia actually agreed on something for the first time in living memory and publicly denounced the idea (at Nvision 2008, one analyst declared that Larrabee was "like a GPU from 2006").

 

20 minutes ago, pas008 said:

I don't think ray tracing will take that long, considering many wanted this long ago when Larrabee was trying to do it

https://www.extremetech.com/gaming/135788-investigating-ray-tracing-the-next-big-thing-in-gaming-graphics

I'm not sure, but I think it would save time for devs, considering it does most of the work for them instead of them handling each object/area/etc.

 

lol, love this quote below now that both have their techniques in place

think Intel actually announcing an attempt to get back into GPUs might have forced this more too

 

Intel announced Larrabee in 2007 and proclaimed that the chip would deliver a digital Holy Grail — real-time ray tracing (RTRT) at playable framerates in modern games. Much was written. AMD and Nvidia actually agreed on something for the first time in living memory and publicly denounced the idea (at Nvision 2008, one analyst declared that Larrabee was "like a GPU from 2006").

 

Raytracing needs dedicated hardware, since its functions don't help with much else. Part of the whole "10 years in development" by Nvidia on RT is actually about CUDA and using a GPU as a processing unit. Without that framework, there's no profitable reason to have such specific hardware in a GPU die.

11 hours ago, Arokhantos said:

When one has 59 fps and the other one 60 fps and is a mile ahead on the graph even though there's like no difference?

the scale could even be in microns xD


16 hours ago, Lathlaer said:

Someone needs to explain this to me.

 

Up until now I have always assumed that DLSS is just another fancy form of AA that is leveraging new hardware for better performance, implying that a game with DLSS "on" will never run as well as a game without any AA at all (as it is popular to run 4K games without AA). So I always thought that the DLSS comparison was against TAA.

 

But from what I have been seeing recently, it almost sounds like enabling DLSS actually gives you MORE FPS than playing without any form of AA, which would be freaking miraculous.

 

Do I have this right or am I missing something?

+1

 

they lost me too, so they turn DLSS on and get better framerates??!!


1 minute ago, asus killer said:

+1

 

they lost me too, so they turn DLSS on and get better framerates??!!

DLSS should replace the normal AA functions, offloading them to otherwise unused hardware on the GPU. That's where the FPS jump comes from, as the normal rendering doesn't have to do AA + other functions. However, we have zero clue what it does for visuals. At some level, it looks like what games would look like going from Ultra to Medium settings. We'll have to wait for the second round of videos to see how bad the visual quality is.

 

If this were as massive a bump as they've been showing, at roughly the same visual quality, Nvidia would be giving us new information about it each day. (Realistically, this is the way things will eventually go, with AA moved into the compute section of GPUs to process in parallel, but I doubt the visual quality is anywhere close at the current gen.)
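If I had to sketch where DLSS sits in the frame loop versus TAA, it would be something like this (illustrative Python only; every function here is a made-up placeholder, not Nvidia's actual API):

# Illustrative only: DLSS as a post-process that replaces TAA.
# All functions are hypothetical stand-ins, not a real renderer.

def rasterize(width, height):
    # Stand-in for shading width x height pixels on the CUDA cores.
    return {"shaded_pixels": width * height}

def taa_resolve(frame):
    # TAA runs on the same shader cores, on top of full-res shading.
    frame["extra_shader_work"] = frame["shaded_pixels"] * 0.1  # assumed cost
    return frame

def dlss_infer(frame, out_w, out_h):
    # DLSS runs on the tensor cores, so it adds ~zero shader-core work.
    frame["output_pixels"] = out_w * out_h
    return frame

frame_taa = taa_resolve(rasterize(3840, 2160))              # shade 8.3M px + AA
frame_dlss = dlss_infer(rasterize(2560, 1440), 3840, 2160)  # shade only 3.7M px
print(frame_taa, frame_dlss)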

I know everyone's gotten on Nvidia for the graphs, but what the actual fuck do they even mean realistically?

I've always found it funny when companies and sites put out graphs that mean literally nothing without any extra context, and this is yet another case of that.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.

3 hours ago, VegetableStu said:

I'm not sure how to write this into tech news, so for everyone's info quickly....

This is actually the best 'unboxing day' video that I've seen yet!  Would be great to have @LinusTech and GamersNexus do a short series of videos (maybe Techquickie?) to go over the details of nVidia's RTX, ray tracing (the history of it and the modern implementation), and DLSS.  I'll try to see if I can type up some of the relevant points from this into the thread...

57 minutes ago, VegetableStu said:

imagine rendering 1440p but only drawing 1080p and estimating the rest

(still want comparison pics though)

 

57 minutes ago, Taf the Ghost said:

DLSS should replace the normal AA functions, offloading them to otherwise unused hardware on the GPU. That's where the FPS jump comes from, as the normal rendering doesn't have to do AA + other functions.

 

1 hour ago, asus killer said:

+1

 

they lost me too, so they turn DLSS on and get better framerates??!!

So PCPer has an article out detailing the Turing architecture in the same vein as the Gamers Nexus video...  Within the article is a small section detailing DLSS, although still without the detail I would like on its black-box workings.

 

https://www.pcper.com/reviews/Graphics-Cards/Architecture-NVIDIAs-RTX-GPUs-Turing-Explored/RTX-Features-Ray-Tracing-and-DL

Quote

Using the Tensor cores found in Turing for its inference capabilities, DLSS is a technology that aims to apply deep learning techniques to accelerate and increase the quality of post-processed anti-aliasing.

To implement DLSS, NVIDIA takes an early build of the given game (which they generally receive anyway for driver optimization) and generates a series of "ground truth" images rendered through 64x Super Sampling.


These extremely high-resolution images are then used to train a neural network which is capable of producing output images that NVIDIA claims are nearly identical to the original 64x super-sampled source material.

In this current stage, the neural network model needs to be trained for each specific game title. In the future, NVIDIA might be able to come up with more generic models that can be applied to particular genres of games, or different game engines, but at this point, it requires hand-tuning from NVIDIA.

Regardless, NVIDIA claims that implementing DLSS will cost game developers nothing and that they are committed to scaling their workflow and supercomputers used for training as far as necessary to meet demand.

This neural network model is then distributed via GeForce Experience to end users who have a GPU with tensor cores and have the given game installed. This distribution model is vital as it allows NVIDIA to silently update the model in the background as they come up with improvements as they get more experience and come up with better techniques.


Performance-wise, NVIDIA is comparing the performance hit of enabling DLSS to their most recent push of anti-aliasing technology, TAA. While TAA is already a reasonably lightweight method, NVIDIA is claiming performance benefits of 2x when comparing DLSS to TAA. Also, unlike TAA, DLSS is temporally stable, which can help prevent some fast-moving details from becoming blurred on screen.

While DLSS as a whole seems like a worthwhile initiative, with claims of 64x SSAA quality with very little of a performance hit, the more controversial part is NVIDIA pushing DLSS as a significant performance differentiator from previous GPUs. 

 

Kind of wish they were rendering better scenes with more contrast via this method...  Maybe something not so dark on a black slide?
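The training step PCPer describes is really just supervised learning against super-sampled targets. A toy sketch of the idea (assuming PyTorch purely for illustration, with random tensors standing in for real frames; this is not Nvidia's actual model or pipeline):

import torch
import torch.nn as nn

# Toy stand-in for the per-game model: aliased frame in, cleaned frame out.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    aliased = torch.rand(4, 3, 64, 64)       # stand-in for raw game frames
    ground_truth = torch.rand(4, 3, 64, 64)  # stand-in for 64x SS "ground truth"
    loss = loss_fn(model(aliased), ground_truth)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

The trained weights would then be what GeForce Experience ships per game, which is how Nvidia could silently improve results in the background without a game patch.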

5 minutes ago, VegetableStu said:

ahh yes, the "dark scenes are easier to CGI" problem ._.

 

1 hour ago, asus killer said:

+1

 

they lost me too, so they turn DLSS on and get better framerates??!!

Someone over on HardOCP posted a link to nVidia's whitepaper on the Turing architecture, so I'll link it here...

 

https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf

 

The section on DLSS is pages 35 through 37 (42 through 44 in the PDF); it also details more of the other shading and infilling techniques they are implementing, some of which look familiar from the last several years of image testing they've released online.

28 minutes ago, WMGroomAK said:

 

 

So PCPer has an article out detailing the Turing architecture in the same vein as the Gamers Nexus video...  Within the article is a small section detailing DLSS, although still without the detail I would like on its black-box workings.

 

https://www.pcper.com/reviews/Graphics-Cards/Architecture-NVIDIAs-RTX-GPUs-Turing-Explored/RTX-Features-Ray-Tracing-and-DL

Kind of wish they were rendering better scenes with more contrast via this method...  Maybe something not so dark on a black slide?

If some of the discussion is correct, basically they'd be downsampling a larger image and then using "machine-learned", specialized filters, which are basically anti-sharpening effects if you're downsampling. That pretty much means the games are running at Medium (with Ultra textures) plus specialized filters.

 

The funny part is I won't give Nvidia crap for that. So much of the massive resource usage for modern effects actually produces very little noticeable image quality improvement; you have to turn all of them on to really get a noticeable stepwise improvement. Though it's custom per game, this actually has really good potential for 1440p, high-refresh gaming.
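To make the "specialized filters" idea concrete, it's basically a cheap upscale followed by a small convolution pass. A toy pure-Python version (the 3x3 kernel here is hand-picked; in DLSS the weights would come from training, and this is only my reading of the mechanism):

def upscale_2x(img):
    # Nearest-neighbour upscale: each pixel becomes a 2x2 block.
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

def filter_3x3(img, kernel):
    # Apply a 3x3 kernel to interior pixels (edges left at zero).
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(kernel[j][i] * img[y + j - 1][x + i - 1]
                            for j in range(3) for i in range(3))
    return out

smooth = [[1/16, 2/16, 1/16], [2/16, 4/16, 2/16], [1/16, 2/16, 1/16]]
tiny = [[0.0, 1.0], [1.0, 0.0]]
print(filter_3x3(upscale_2x(tiny), smooth))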

Quote

Whereas TAA renders at the final target resolution and then combines frames, subtracting detail, DLSS allows faster rendering at a lower input sample count, and then infers a result that at target resolution is similar quality to the TAA result, but with roughly half the shading work.

From the white paper.

 

The interesting stuff is actually below it, on the new shader abilities. This is really more of a major refinement of Pascal. I fully expect Ampere to be 85% just a die shrink of Turing.
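That "half the shading work" line checks out on raw pixel counts alone, if you assume something like a 1440p internal render for a 4K output (my guess at the ratio they mean, not a confirmed figure):

# Shaded-pixel counts: 4K native vs a lower internal resolution for DLSS.
native = 3840 * 2160      # 8,294,400 pixels shaded per frame
internal = 2560 * 1440    # 3,686,400 pixels shaded per frame
print(internal / native)  # ~0.44 -> "roughly half the shading work"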

 

1 hour ago, VegetableStu said:

Not sure if I'd look forward to a Turing Titan, being a potential successor to the $3000 Volta Titan V o_o

It appears that the full TU102 is in the Quadro RTX 6000, which is being priced at $6,300.00...

 

The main difference being that the Quadro card has:

  • 2 additional TPCs,
  • 4 additional SMs,
  • 256 additional CUDA cores,
  • 32 additional Tensor cores,
  • 4 additional RT cores,
  • a higher base clock at 1455 MHz,
  • a higher boost clock at 1770 MHz,
  • more memory at 24 GB,
  • 8 additional ROPs &
  • 14 additional texture units.

Of course, this is a Quadro card, so that probably accounts for about $4,000.00 right away, but it would be interesting to know if the 2080 Ti would perform reasonably similarly in some of the Quadro workloads...
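Quick napkin math on where the full die lands versus the cut-down 2080 Ti (the 2080 Ti figures are reference-spec from memory, so treat them as approximate):

# FP32 throughput ~= CUDA cores x 2 ops/clock (FMA) x clock speed
def tflops(cores, boost_mhz):
    return cores * 2 * boost_mhz / 1e6

quadro_rtx_6000 = tflops(4608, 1770)  # full TU102 at its rated boost
rtx_2080_ti = tflops(4352, 1545)      # 4608 - 256 cores, reference boost clock
print(round(quadro_rtx_6000, 1), round(rtx_2080_ti, 1))  # ~16.3 vs ~13.4 TFLOPS

So on paper they're only about 20% apart in raw shader throughput, which suggests the price gap is mostly about drivers, certification, and the extra memory.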

3 hours ago, Taf the Ghost said:

At the end of the video, Nvidia apparently sent out a late email saying not to do teardowns. However, Hardware Unboxed, being Aussies, have their videos set to auto-post most of the time. Looks like it went live around midnight in Australia with a teardown of the PCB, though not much analysis of the parts. 8+2 VRM setup, though.

Well, they decided to post their teardown video...

21 hours ago, CatTNT said:

nvidia fucking up their graphs SO BADLY.

 

BASIC FUCKING MATH SKILLS = EXCELLENT GRAPHS ARE A MUST

Basic math yes, but for marketing they don't give 2 shits lol. They know exactly what they're doin. xP

- Fresher than a fruit salad.

Well, my RX 480 can do 4K60 in some games, so I dunno, is it really that big of a deal?

 

Also, I just don't get the hype behind DLSS and the ray-tracing stuff. Like, yes, it looks good, but there are a LOT of games that also look good, and being "realistic" isn't the only way to look good. I really like the look of Borderlands, for example: not realistic, but it looks good! Same goes for Overwatch.

 

For me, for the last few years new games have looked different, not better or worse, just different, but that's probably me.

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon

1 hour ago, samcool55 said:

 

Also, I just don't get the hype behind DLSS

 

If it works in the real world the way Nvidia is marketing it, then the hype is about gaining stupid amounts of processing power without any drop in visual quality.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  
