
Wolfenstein II Gets Turing Support

matrix07012


Quote

[Image: Turing variable-rate shading grid sizes]

To let shader power go where it's needed most, VRS allows developers to subdivide a 3D application's screen space into a grid of 16x16 pixel regions (among other possible division techniques, ranging from as fine as a per-triangle division or, more logically, per-object). According to Nvidia's SIGGRAPH presentation on the topic, each of those 16x16-pixel regions can, in turn, have one of several shading rates applied to it, from the naive 1x1 grid that would typically be used to shade every pixel in a scene today, to 2x2 groups of pixels for 1/4-rate shading, all the way out to a quite-coarse set of 4x4 grids for 1/16-rate shading. According to that presentation, then, the parts of a scene that need the most detail can get it via 1x1 shading, while those that are the most uniform or the most likely to be in motion can be shaded at the lowest rates to save shader horsepower for the parts of the scene that need it.
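For the curious, here is a minimal sketch (mine, not from the article) of how a developer selects those rates through NVIDIA's VK_NV_shading_rate_image Vulkan extension, which exposes exactly this mechanism: you bind an image holding one palette index per screen region, and a palette maps those indices to the rates the quote describes. The function name and the pre-existing command buffer and rate-image view are assumptions.

```c
#include <vulkan/vulkan.h>

/* Minimal sketch: wiring the coarse/fine rates from the quote through
 * VK_NV_shading_rate_image. A "shading rate image" stores, per screen
 * region, an index into the palette below. */
static void bind_vrs_palette(VkCommandBuffer cmd, VkImageView rateImage)
{
    /* Index 0 = full 1x1 shading, 1 = 1/4 rate (2x2), 2 = 1/16 rate (4x4). */
    const VkShadingRatePaletteEntryNV entries[3] = {
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_PIXEL_NV,
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_2X2_PIXELS_NV,
        VK_SHADING_RATE_PALETTE_ENTRY_1_INVOCATION_PER_4X4_PIXELS_NV,
    };
    const VkShadingRatePaletteNV palette = {
        .shadingRatePaletteEntryCount = 3,
        .pShadingRatePaletteEntries   = entries,
    };

    /* Both commands come from the extension, so real code must fetch
     * them with vkGetDeviceProcAddr first. */
    vkCmdBindShadingRateImageNV(cmd, rateImage,
                                VK_IMAGE_LAYOUT_SHADING_RATE_OPTIMAL_NV);
    vkCmdSetViewportShadingRatePaletteNV(cmd, 0, 1, &palette);
}
```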

Quote

Wolfenstein II: The New Colossus is the first game to implement CAS, and it is in fact the poster child for how the technique's analysis step works in the aforementioned white paper. Nvidia provides a visualized example of the result of its post-processing analysis on a frame taken from the good guys' submarine in the game, Eva's Hammer.
[Image: CAS shading-rate analysis visualization of an Eva's Hammer frame]

[Image: CAS on/off image-quality comparison]
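To make that analysis step a little more concrete: the idea is to measure how much detail each tile of the previous frame actually contains and lower the shading rate where a coarser rate won't be noticed. The sketch below is hypothetical C with a made-up contrast metric and thresholds, not the white paper's actual error measure; it just shows the shape of the idea.

```c
#include <math.h>
#include <stdint.h>

/* Hypothetical sketch of a content-adaptive analysis pass: estimate the
 * detail in a 16x16 tile of the previous frame via the mean absolute
 * luminance difference between horizontal neighbours, and pick a coarser
 * rate for flat tiles. Thresholds are arbitrary illustration values.
 * Returns a palette index as in the VRS sketch above
 * (0 = 1x1, 1 = 2x2, 2 = 4x4). */
static uint8_t pick_tile_rate(const float *luma, int frame_width,
                              int tile_x, int tile_y)
{
    float contrast = 0.0f;
    for (int y = tile_y; y < tile_y + 16; ++y)
        for (int x = tile_x; x < tile_x + 15; ++x)
            contrast += fabsf(luma[y * frame_width + x + 1] -
                              luma[y * frame_width + x]);
    contrast /= 16.0f * 15.0f;   /* average over all sampled pixel pairs */

    if (contrast < 0.01f) return 2;   /* flat tile: shade at 1/16 rate */
    if (contrast < 0.05f) return 1;   /* mild detail: shade at 1/4 rate */
    return 0;                         /* detailed tile: full 1x1 rate */
}
```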

Quote

Performance:

[Chart: Wolfenstein II, 3840x2160, average FPS]

[Chart: Wolfenstein II, 3840x2160, 99th-percentile FPS]

Oh look, another proprietary tech. That's what everyone wants.

The performance gains seem pretty nice, but obviously it depends on how well the devs implement it, since techniques like this can generate some pretty bad artifacts.

 

Sauce: https://techreport.com/review/34269/testing-turing-content-adaptive-shading-with-wolfenstein-ii-the-new-colossus / https://archive.fo/jxyu8


6 fps isn't really a massive gain.



29 minutes ago, huilun02 said:

Don't know if Nvidia even realizes that all their proprietary jank only amounts to boastful marketing. None of it will become mainstream, because it's all exclusive to Nvidia's expensive stuff anyway. The majority of game sales are on consoles that use Radeon graphics, so getting developers to do stuff that more than half of all customers can't use is kind of a waste of resources.

 

Yeah, I know you're going to say "but it's technological progress."

Making stuff proprietary and black-boxing it to the point where it impedes its proliferation is not technological progress.

This isn't how the graphics industry works, as much as consumers like to believe it is.

 

NVIDIA implements these kinds of effects as Vulkan and OpenGL extensions, which are public API specifications that any IHV is free to implement.

NV_shading_rate_image

VK_NV_shading_rate_image
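For anyone wondering what "free to implement" means in practice: an application simply probes the driver's extension list at runtime and falls back when the extension is absent, regardless of which vendor wrote the driver. A minimal sketch using standard Vulkan calls (the helper name is mine):

```c
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

/* Minimal sketch: probe a physical device for an extension by name.
 * Any IHV that implements the published spec shows up the same way. */
static bool device_has_extension(VkPhysicalDevice gpu, const char *name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, NULL);

    VkExtensionProperties *props = malloc(count * sizeof *props);
    if (!props)
        return false;
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, props);

    bool found = false;
    for (uint32_t i = 0; i < count; ++i) {
        if (strcmp(props[i].extensionName, name) == 0) {
            found = true;
            break;
        }
    }
    free(props);
    return found;
}

/* Usage: only request VRS when the driver actually exposes it, e.g.
 * if (device_has_extension(gpu, VK_NV_SHADING_RATE_IMAGE_EXTENSION_NAME))
 *     ... add it to VkDeviceCreateInfo::ppEnabledExtensionNames ... */
```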

 

What usually happens when AMD, Intel, or NVIDIA creates an extension like this is that they publish the spec to Khronos for everyone to look at and read. The other IHVs then go ahead and create their own versions, either based on that specification or with their own platform improvements in mind. If the specification is popular, vendors tend to work together to have it revised into an "EXT" extension, which is basically a more vendor-agnostic version of the spec (even if it's just a cosmetic name change).

 

If the extension is really popular and is something lots of GPUs support, it can be revised further into an "ARB" extension, named for the Khronos Architecture Review Board. If the feature becomes pedestrian in graphics rendering, it stops being an extension entirely and gets promoted into the core OpenGL/Vulkan specification, and IHVs are then required to implement it to be certified for the latest Vulkan/GL version.

 

During all of this, AMD, Intel, and NVIDIA (plus ARM, Apple, Google, Samsung, and so on) are free to implement each other's vendor-specific extensions. As an example, NVIDIA GPUs and AMD GPUs each support a handful of the other's vendor extensions.

 

Take a look at these non-NVIDIA extensions that NVIDIA has implemented on a GTX 980:

[Image: driver report listing non-NVIDIA vendor extensions exposed by a GTX 980]

 

SIGGRAPH is an event that's all about the graphics industry moving forward, so yes, this is all about technological progress. That's especially obvious when the technique is published as a public Vulkan/OpenGL extension specification that anyone is welcome to implement or copy. Proprietary stuff does not get published (that goes against the "Open" in "OpenGL").


2 hours ago, System32.exe said:

It is at 4K.

Not when the frame rate is over 100 it's not!


2 hours ago, huilun02 said:

Don't know if Nvidia even realizes that all their proprietary jank only amounts to boastful marketing. None of it will become mainstream, because it's all exclusive to Nvidia's expensive stuff anyway. The majority of game sales are on consoles that use Radeon graphics, so getting developers to do stuff that more than half of all customers can't use is kind of a waste of resources.

 

Yeah, I know you're going to say "but it's technological progress."

Making stuff proprietary and black-boxing it to the point where it impedes its proliferation is not technological progress.

This is a very poor way of looking at it.

 

By making their own tech they put pressure on others to make feature equivalents, even if it's proprietary (which it rarely is, actually... CPU PhysX is in almost every single game these days), and those equivalents are generally less locked down. Do you honestly believe FreeSync would have come out as soon, or as widely, if G-Sync hadn't come out first?

 


52 minutes ago, grayperview said:

Not when the frame rate is over 100 it's not!

Uhm, no idea what's up for debate here... just look at the picture. It's right there: 4K resolution and over 100 fps.


4 minutes ago, Tech Enthusiast said:

Uhm, no idea what is up for debate here,... just look at the picture. It is right there. 4k resolution and over 100fps.

It was over 100 with CAS off too. CAS is worth a massive 6 fps improvement over the 106 it was already running at; hardly game-changing!


1 hour ago, grayperview said:

Not when the frame rate is over 100 it's not!

And neither is it when the frame rate is at 30 fps, because that's still too low whether you add 6 fps or not...


1 hour ago, grayperview said:

Not when the frame rate is over 100 it's not!

At 4K you take every frame you can get.

 

Plus only the 2080 Ti is over 100 FPS. The framerate increase is nice on the 2080 as it brings it closer to 100 FPS. And it's impactful on the 2070 as it was closer to 60 FPS than 100.


2 hours ago, huilun02 said:

Don't know if Nvidia even realizes that all their proprietary jank only amounts to boastful marketing. None of it will become mainstream, because it's all exclusive to Nvidia's expensive stuff anyway. The majority of game sales are on consoles that use Radeon graphics, so getting developers to do stuff that more than half of all customers can't use is kind of a waste of resources.

 

Yeah, I know you're going to say "but it's technological progress."

Making stuff proprietary and black-boxing it to the point where it impedes its proliferation is not technological progress.

By your logic it is a waste to invest in PC gaming at all, since the vast majority plays on console and that is where the bulk of sales is. Or what about top-tier GPUs, or high-end hardware in general: should we just stop making those and only have midrange? How many people on Steam game on a 2080 (Ti)? Well, less than 5%. We could apply this to almost any real-life analogy and fit it to support your argument or mine. The thing here is that the "tech progress" argument you're trying to knock down is actually a pretty valid one. If we had disregarded most new techs because they weren't widely supported yet, we would have a lot less of what we actually see nowadays.

 

DLSS will become mainstream pretty fast. I even dare say that the combination of ray tracing with normal rasterization will become mainstream for top-tier gamers faster than you think. Does this mean next month? No... of course not.

 

A funny fact, since we're talking about resolutions in this thread: under 2% of gamers on Steam use a 4K monitor, under 7% use 1440p, and almost all the rest use 1080p and will do so for many years. This is because quality and new tech are expensive; not everyone will be able to afford them, but some always will.

 

 


3 hours ago, huilun02 said:

 

I wouldn't go outright and make that claim just yet. 

All new tech is expensive at the beginning, and I'm sure they wouldn't just abandon ray tracing because the gaming market doesn't care for it.

 

The big money lies on the business end; if they want it, then we shall have it. Don't forget, being able to ray trace in real time is a huge boon for animated movies that used to require days, or even months on cheap hardware, to render.

 

I can already see the company behind "Avatar" laughing their heads off at the time they will save in production.


45 minutes ago, System32.exe said:

At 4K you take every frame you can get.

 

Plus only the 2080 Ti is over 100 FPS. The framerate increase is nice on the 2080 as it brings it closer to 100 FPS. And it's impactful on the 2070 as it was closer to 60 FPS than 100.

It scales; it's about 3.5%, nothing worth getting excited about.


9 minutes ago, grayperview said:

It scales; it's about 3.5%, nothing worth getting excited about.

Like I said, you take everything you can get at 4K. It's not exciting but it's nothing to be disappointed over.


55 minutes ago, grayperview said:

It scales; it's about 3.5%, nothing worth getting excited about.

By that logic, where is the threshold to start caring? And who defines it?

Free performance is always nice to have. You may not need more fps, but your GPU will run cooler... or you may actually hit 60 fps, depending on your GPU.

No matter how you put it, it is a clear and obvious win for everyone, and a big fat slap in the face of lazy devs. Wolfenstein II has amazing performance optimizations. Kinda sad I don't care about the game, but I like what their programmers do.


Who even plays Wolfenstein II anymore?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites

2 hours ago, System32.exe said:

At 4K you take every frame you can get.

 

Plus only the 2080 Ti is over 100 FPS. The framerate increase is nice on the 2080 as it brings it closer to 100 FPS. And it's impactful on the 2070 as it was closer to 60 FPS than 100.

4K isn't as big of a deal as it was 5 years ago.


27 minutes ago, RorzNZ said:

4K isn't as big of a deal as it was 5 years ago.

On what grounds? Because 4K is only getting more popular, not less. And hardware-wise, it still takes some of the fastest GPUs on the market to reliably get 60 FPS in every game.


1 minute ago, System32.exe said:

On what grounds? Because 4K is only getting more popular, not less. And hardware-wise, it still takes some of the fastest GPUs on the market to reliably get 60 FPS in every game.

Certainly on my Steam account. Most new graphics cards can get above 60 fps at 4K just fine, and games are much more optimised.


5 minutes ago, RorzNZ said:

Certainly on my Steam account. Most new graphics cards can get above 60 fps at 4K just fine, and games are much more optimised.

You mean the RTX lineup? The 2070 can't even get to 60 FPS in a lot of games at 4K, let alone above it. And it's a high-end card, so it's not within the overwhelming majority of people's budgets.


2 minutes ago, System32.exe said:

You mean the RTX lineup? The 2070 can't even get to 60 FPS in a lot of games at 4K, let alone above it. And it's a high-end card, so it's not within the overwhelming majority of people's budgets.

Literally a 1060 and up can get nice frame rates at 4K.


1 minute ago, RorzNZ said:

Literally a 1060 and up can get nice frame rates at 4K.

If by nice frame rates you mean 20 FPS, sure.


Just now, System32.exe said:

If by nice frame rates you mean 20 FPS, sure.

Not everyone plays AAA titles, dude, or even cares. The majority of people like e-sports titles, which are fine at 4K on most graphics cards.


12 minutes ago, RorzNZ said:

Not everyone plays AAA titles, dude, or even cares. The majority of people like e-sports titles, which are fine at 4K on most graphics cards.

That's totally wrong. AAA games sell millions of copies on PC; they dominate the top-seller list on Steam.

Yeah, sure, the 1060 will run League at 4K just fine. But to get a playable experience in fast-paced games like Overwatch, Fortnite, or PUBG at 4K on a 1060, you need to play on the lowest settings, which most people aren't willing to do. And that's just a playable experience, mind you; forget 144 FPS, as the only affordable 4K monitors are all 60 Hz, and the 1060 isn't going to get 144 FPS in any of those three games at 4K.

