
World's first fully ray traced game (mod) released

Humbug
Solved by straight_stewie
21 minutes ago, CarlBar said:

It's been confirmed they're not using them. What AnandTech has to say about how RT works and what DICE are actually doing are two different things. 

Every article I read about it, either from DICE, from interested bloggers, or from interviews with DICE, says that they are using DXR for their ray tracing.

DXR is the name of the part of the DX12 API that supports ray tracing, which runs on whatever implementation the device drivers are providing. In Nvidia's case, that is RT cores when available.

So please, point me to where you are getting your information, because not only can I not find it, I am finding an overwhelming amount of the exact opposite of what you are saying.
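
For what it's worth, whether the driver exposes a DXR implementation at all is something an application can query at runtime. A minimal sketch using the standard D3D12 capability check (nothing BFV-specific, just the public API):

```cpp
// Minimal DXR capability check (standard D3D12 API; link with d3d12.lib).
// Whether rays then run on RT cores or on a shader-based fallback is the driver's business.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("Driver exposes a DXR implementation.\n");
    } else {
        std::printf("No DXR support on this device/driver.\n");
    }
    return 0;
}
```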

16 minutes ago, S w a t s o n said:

Pretty complete; looks to be fully ray traced global illumination, ambient occlusion, and shadows, I think.

I don't think full GI like this can be done without RT cores, but I don't think there's any confirmed word.

 

Daaamm, that is sweet looking. Hopefully when it comes out we'll get some nice benchmarks and comparisons that will better show off the effect quality and performance capabilities of RTX.


5 hours ago, leadeater said:

DXR (that's a given) with the Tensor cores doing the BVH calculations and the denoising being done on shader cores

I'm gonna need sources to be turned over. Everything I read just says they are using DXR, and everything I read about DXR says that it's really up to the driver implementors. I can't imagine that nvidia wouldn't use RT cores for the appropriate parts.

They do have the option to do it however they want, and I can find information that they are using Tensor cores for parts of the BFV performance update, but I can't find anything more in depth than that. Can you point me towards your sources?



This is my two cents on the whole “ew, ray tracing” thing.

  • RTX hardware is not required for ray tracing. Intel’s Larrabee, which aimed to bring ray tracing to games, ran purely in software (it was essentially a bunch of simple in-order x86 cores stitched together). Even NVIDIA dabbled in ray tracing back in the GeForce 400 days. All the RTX hardware, specifically the RT cores, is doing is accelerating that work.
  • Traditional shading is basically hacks upon hacks for figuring out lighting. At some point those hacks are going to cost as much work, both from a hardware-usage perspective and from the artist’s perspective of tweaking how it looks, as getting the same quality from ray tracing. To get a realistically lit image without RT, you need at least a half dozen steps, and a few of them, like shadowing, are expensive. Ray tracing can resolve most of these steps, if not all of them, in a single operation (see the sketch after this list).
  • No one’s forcing you to use these features. If you think the current lighting models are good enough and you care solely about performance, then don’t use ray tracing. If you think it’s a waste of silicon to realize features you don’t think are important, look at your current setup and ask yourself whether you’re using every feature it has.
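
To make the "single operation" point concrete, here's a toy sketch (a hypothetical one-sphere scene, nowhere near a production renderer): shadowing and reflected/indirect light both come from calling the same trace function again, rather than from separate shadow-map, cubemap, and baked-lighting passes.

```cpp
// Toy illustration only: one hard-coded sphere and a point light.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <optional>

struct Vec3 { double x, y, z; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 norm(Vec3 v) { return v * (1.0 / std::sqrt(dot(v, v))); }

struct Hit { double t; Vec3 point, normal; double reflectivity; };

// Intersect a ray with the one sphere in the "scene": center (0,1,5), radius 1.
std::optional<Hit> intersect(Vec3 origin, Vec3 dir) {
    Vec3 c{0, 1, 5}; double radius = 1.0;
    Vec3 oc = origin - c;
    double b = dot(oc, dir), disc = b * b - (dot(oc, oc) - radius * radius);
    if (disc < 0) return std::nullopt;
    double t = -b - std::sqrt(disc);
    if (t < 1e-4) return std::nullopt;
    Vec3 p = origin + dir * t;
    return Hit{t, p, norm(p - c), 0.3};
}

Vec3 lightPos{5, 5, 0};

// Shadows and reflections both fall out of re-using the same trace() call.
Vec3 trace(Vec3 origin, Vec3 dir, int depth) {
    auto hit = intersect(origin, dir);
    if (!hit) return {0.2, 0.3, 0.5};                    // "sky" color
    Vec3 toLight = norm(lightPos - hit->point);
    // Shadow test: just another ray, aimed at the light.
    bool shadowed = intersect(hit->point + hit->normal * 1e-3, toLight).has_value();
    double diffuse = shadowed ? 0.0 : std::max(0.0, dot(hit->normal, toLight));
    Vec3 color = Vec3{1, 0.6, 0.6} * diffuse;
    // Reflection / indirect light: recurse with a bounced ray.
    if (depth > 0) {
        Vec3 r = dir - hit->normal * (2 * dot(dir, hit->normal));
        color = color + trace(hit->point + hit->normal * 1e-3, norm(r), depth - 1) * hit->reflectivity;
    }
    return color;
}

int main() {
    Vec3 c = trace({0, 1, 0}, norm({0, 0, 1}), 2);       // one primary ray
    std::printf("colour of that pixel: %.2f %.2f %.2f\n", c.x, c.y, c.z);
}
```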

3 hours ago, straight_stewie said:

I'm gonna need sources to be turned over. Everything I read just says they are using DXR, and everything I read about DXR says that it's really up to the driver implementors. I can't imagine that nvidia wouldn't use RT cores for the appropriate parts.

They do have the option to do it however they want, and I can find information that they are using Tensor cores for parts of the BFV performance update, but I can't find anything more in depth than that. Can you point me towards your sources?

Did you not read my post?

DICE literally said they're not using the RT cores.

 

 

It's not like DXR just says to the card, "OK, trace me some rays," and the RT cores turn on. It has to be supported in the engine/game as well. BFV's ray tracing was not designed when RT cores existed; DICE only had Titan Vs, with no RT cores. This is why the Titan V can ray trace similarly to RTX cards in Battlefield V despite having no RT cores.
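
To put that in rough code terms, this is the sort of extra host-side plumbing DXR asks of an engine. It's a generic outline, not DICE's actual integration (which isn't public), and the function and parameter names are made up for illustration:

```cpp
// Sketch of the engine-side work DXR requires before a single ray is traced
// (generic outline, not BFV's implementation). Assumes the acceleration structures
// and the ray tracing pipeline state object were created earlier in the frame.
#include <d3d12.h>

void dispatchRays(ID3D12GraphicsCommandList4* cmdList,   // DXR-capable command list
                  ID3D12StateObject* rtPipeline,          // ray gen / miss / hit shaders
                  const D3D12_DISPATCH_RAYS_DESC& desc)   // shader tables + launch size
{
    // 1. The engine must have built bottom- and top-level acceleration structures for its
    //    geometry (BuildRaytracingAccelerationStructure) -- a rasterizer never needed these.
    // 2. It must have compiled dedicated ray generation / miss / hit-group shaders into an
    //    RT pipeline state object (CreateStateObject).
    // 3. Only then can it launch rays. Where those rays actually execute -- RT cores, or a
    //    shader-based path as on Titan V -- is decided by the driver behind this call.
    cmdList->SetPipelineState1(rtPipeline);
    cmdList->DispatchRays(&desc);
}
```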


 


11 hours ago, Mira Yurizaki said:

This is my two cents on the whole “ew, ray tracing” thing.

  • RTX hardware is not required for ray tracing. Intel’s Larrabee, which aimed to bring ray tracing to games, ran purely in software (it was essentially a bunch of simple in-order x86 cores stitched together). Even NVIDIA dabbled in ray tracing back in the GeForce 400 days. All the RTX hardware, specifically the RT cores, is doing is accelerating that work.
  • Traditional shading is basically hacks upon hacks for figuring out lighting. At some point those hacks are going to cost as much work, both from a hardware-usage perspective and from the artist’s perspective of tweaking how it looks, as getting the same quality from ray tracing. To get a realistically lit image without RT, you need at least a half dozen steps, and a few of them, like shadowing, are expensive. Ray tracing can resolve most of these steps, if not all of them, in a single operation.
  • No one’s forcing you to use these features. If you think the current lighting models are good enough and you care solely about performance, then don’t use ray tracing. If you think it’s a waste of silicon to realize features you don’t think are important, look at your current setup and ask yourself whether you’re using every feature it has.

One of the sillier arguments I've heard against RTX is the claim that it's not possible to buy a high-end GPU without RTX silicon now, even though it appears you can't buy an equivalent AMD card for less. It's not like the rumored 11/16/1X series is about due or anything. I don't see the same arguments being made about HBM at the moment, if we're talking about tech that costs lots of money but has little use.

 

8 hours ago, S w a t s o n said:

Did you not read my post?

DICE literally said they're not using the RT cores.

 

 

It's not like DXR just says to the card, "OK, trace me some rays," and the RT cores turn on. It has to be supported in the engine/game as well. BFV's ray tracing was not designed when RT cores existed; DICE only had Titan Vs, with no RT cores. This is why the Titan V can ray trace similarly to RTX cards in Battlefield V despite having no RT cores.

 

I have to admit I was given the distinct impression earlier on that RTX was easy to implement retrospectively. I'm not sure where I got that idea, but I dare say it's common. I have to say it's a shame, though, because imagine if you could.



40 minutes ago, mr moose said:

One of the sillier arguments I've heard against RTX is the claim that it's not possible to buy a high-end GPU without RTX silicon now, even though it appears you can't buy an equivalent AMD card for less. It's not like the rumored 11/16/1X series is about due or anything. I don't see the same arguments being made about HBM at the moment, if we're talking about tech that costs lots of money but has little use.

 

 

I have to admit I was given the distinct impression earlier on that RTX was easy to implement retrospectively. I'm not sure where I got that idea, but I dare say it's common. I have to say it's a shame, though, because imagine if you could.

Early tech is rarely easy to implement. Give id 2-4 years, then it will be. The main thing is that the understanding of culling is still far too immature. Which is really the whole issue with RTX: it was way too early to be hyping DXR as the leading tech of the new generation. Nvidia really should have taken the angle of "faster today and better looking tomorrow" or "the future is today, join us". "It just works" is one of the worst directions you can go.

 

As for DXR, the RT cores are these interesting bits of tech that I have yet to see well explained in terms of what they're really up to. I should probably dig into that more, but in the end it's still just a version of async compute. DXR will just use whatever paths are available, which is why it'll be interesting to see if it can be activated on any of the Vega cards. 


1 hour ago, Taf the Ghost said:

"It just works" is one of the worse directions you can go.

Unfortunately that's how most things are done. And arguably, I would rather have this approach because it provides a baseline: prove that your thing works first, then work on optimizations. If you can't prove your thing works, there's no point in optimizing it, because you don't know whether it still works.

 

And this isn't just computers. Many things are done this way.


5 minutes ago, Mira Yurizaki said:

Unfortunately that's how most things are done. And arguably, I would rather have this approach because it provides a baseline: prove that your thing works first, then work on optimizations. If you can't prove your thing works, there's no point in optimizing it, because you don't know whether it still works.

 

And this isn't just computers. Many things are done this way.

That and making promises about future optimizations isn't the wisest of PR language.



That looks pretty fantastic. With those OLED screens coming out, ray tracing would look pretty nice.  


Saw the DF video on this, really interesting stuff. I have to say the Quake mod really sells what ray or path tracing can bring; I thought it looked fantastic, even on such an old game. 

They also mentioned that the lad who made it said shading the surfaces hit by a ray was much more expensive than casting the rays themselves. Interesting stuff. 



11 hours ago, mr moose said:

I don't see the same arguments being made about HBM at the moment, if we're talking about tech that costs lots of money but has little use.

Memory, I feel, is trickier, since it depends on the GPU design. AMD's GPUs are for some reason more bandwidth hungry (possibly because of the larger scheduling unit size?). And in the case of gaming, I wouldn't even count higher screen resolutions or the like as an advantage, because driving them also takes GPU power.


13 hours ago, mr moose said:

I have to admit I was given the distinct impression earlier on that RTX was easy to implement retrospectively. I'm not sure where I got that idea, but I dare say it's common. I have to say it's a shame, though, because imagine if you could.

They did say that, and the devs of Atomic Heart said it was relatively easy, but they still had to get an experimental Unreal build and help directly from Nvidia. DICE got help from Nvidia too, but they were working with Titan Vs and were probably much further into development, possibly making it not as easy for them as it will be for others going forward.


 


On 1/27/2019 at 5:16 AM, Nowak said:

Perhaps now is not the time for a ray tracing Skyrim mod, since the 2080 Ti would still be too slow for that, but we all know that someone's gonna mod ray tracing into Skyrim eventually. If it can be done for Quake 2, then it can be done in just about any game that supports mods.

Can you do something like this without access to source code? 

I am not an expert but personally I don't see how you can. These guys had to go in and rewrite the Quake 2 graphics pipeline. It was possible because John Carmack released the source code back in the day.

 

Can Skyrim modders make a similar ray tracer with their regular modding tools?


2 minutes ago, Humbug said:

Can you do something like this without access to source code? 

I am not an expert but personally I don't see how you can. These guys had to go in and rewrite the Quake 2 graphics pipeline. It was possible because John Carmack released the source code back in the day.

 

Can Skyrim modders make a similar ray tracer with their regular modding tools?

Wellllllllllllll, there are existing lighting mods, without source code access. It might be possible? Maybe.

 

Wicked complicated but maybe.


22 minutes ago, Humbug said:

Can you do something like this without access to source code? 

I am not an expert but personally I don't see how you can. These guys had to go in and rewrite the Quake 2 graphics pipeline. It was possible because John Carmack released the source code back in the day.

 

Can Skyrim modders make a similar ray tracer with their regular modding tools?

I don't think so, since most graphics mods for Skyrim are post-processing, change the map data itself to use a different lighting setting, or tweak the game engine's internal values. Ray tracing has to be baked into the graphics rendering system because it's one of the first steps in rendering.


3 hours ago, Mira Yurizaki said:

Memory, I feel, is trickier, since it depends on the GPU design. AMD's GPUs are for some reason more bandwidth hungry (possibly because of the larger scheduling unit size?). And in the case of gaming, I wouldn't even count higher screen resolutions or the like as an advantage, because driving them also takes GPU power.

It is, I know. My point was that just because one person doesn't want to pay for a certain tech doesn't mean a device shouldn't have it. I don't do anything that HBM would be an advantage for, but that doesn't mean it shouldn't be there. 



2 minutes ago, mr moose said:

It is, I know. My point was that just because one person doesn't want to pay for a certain tech doesn't mean a device shouldn't have it. I don't do anything that HBM would be an advantage for, but that doesn't mean it shouldn't be there. 

I think a better example is Intel's iGPUs on higher end CPUs.

 

Show of hands who actually uses it for anything (if they have a video card)


5 minutes ago, Mira Yurizaki said:

I think a better example is Intel's iGPUs on higher end CPUs.

 

Show of hands who actually uses it for anything (if they have a video card)

I do. For one monitor. 


3 minutes ago, Mira Yurizaki said:

I think a better example is Intel's iGPUs on higher end CPUs.

 

Show of hands who actually uses it for anything (if they have a video card)

Except it isn't there with the intention of bringing improvements to a whole section of computing. I was trying to find an example that does have an actual use, like RTX and HBM, but that not everyone wants to pay for or needs.



Just now, mr moose said:

Except it isn't there with the intention of bringing improvements to a whole section of computing. I was trying to find an example that does have an actual use, like RTX and HBM, but that not everyone wants to pay for or needs.

Maybe not whole sections of computing, but Intel does have an API to let you use it as a compute accelerator. So much like RTX, it has a specialized purpose (outside of providing a video interface) that's likely not going to be used >90% of the time.
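
For example, assuming the route in question is OpenCL (which Intel iGPUs do expose), this is roughly what enumerating the iGPU as a compute device looks like:

```cpp
// List the GPU compute devices the installed OpenCL runtimes expose; an Intel iGPU
// shows up here alongside any discrete card. Link against OpenCL (e.g. -lOpenCL).
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    if (numPlatforms == 0) return 0;                     // no OpenCL runtime installed
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id platform : platforms) {
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices) != CL_SUCCESS)
            continue;                                    // this platform has no GPU devices
        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id device : devices) {
            char name[256] = {};
            clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("GPU compute device: %s\n", name);
        }
    }
    return 0;
}
```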


15 hours ago, mr moose said:

One of the sillier arguments I've heard against RTX is the claim that it's not possible to buy a high-end GPU without RTX silicon now, even though it appears you can't buy an equivalent AMD card for less. It's not like the rumored 11/16/1X series is about due or anything. I don't see the same arguments being made about HBM at the moment, if we're talking about tech that costs lots of money but has little use.

 

 

To be fair, whilst I do have a bit of sympathy on this point (all the extra stuff on there costs money to develop, and I'm certain that plays a part in the increased cost of RTX cards, even if NVIDIA is certainly price gouging), this is at least semi-true any time a new feature comes along that requires improved hardware. RT is just unfortunate in that it requires hardware improvements that don't benefit conventional rendering. I mean, seriously, between the Tensor cores and the RT cores, an RTX 2080 probably has several times the raw processing power of a 1080 Ti built into it, but outside of RTX most of that extra power is dormant.

 

That said, even if I fully agreed with those people, HBM2 is different. Even for conventional rasterization work, GDDR is starting to push the limit in terms of how much high-bandwidth capacity you can fit on a card. Based on a few different data points, current-generation GDDR6 tops out at 32 GB, but doing so requires a significantly more expensive PCB and 2 GB GDDR6 chips that aren't currently in volume production (I've only read of proof-of-concept manufacturing), for a total bandwidth of 896 GB/s. The upper limit for a similar theoretical HBM2 setup is at least 192 GB and 2 TB/s of bandwidth.
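
For a rough sense of where numbers in that ballpark come from, here's a back-of-the-envelope sketch; the bus widths, pin speeds, and stack sizes below are illustrative assumptions, not quoted specs:

```cpp
// Back-of-the-envelope memory capacity/bandwidth arithmetic (illustrative assumptions:
// GDDR6 = 16 chips x 32-bit bus x 14 Gbps/pin, 2 GB per chip;
// HBM2(E)-class = 8 stacks x 1024-bit x 2 Gbps/pin, 24 GB per stack at the top end).
#include <cstdio>

int main() {
    constexpr double gddr6_bus_bits  = 16 * 32.0;                                   // 512-bit total
    constexpr double gddr6_gbps_pin  = 14.0;                                        // per-pin data rate
    constexpr double gddr6_capacity  = 16 * 2.0;                                    // GB
    constexpr double gddr6_bandwidth = gddr6_bus_bits * gddr6_gbps_pin / 8;         // GB/s

    constexpr double hbm_stacks      = 8;
    constexpr double hbm_bus_bits    = 1024.0;                                      // per stack
    constexpr double hbm_gbps_pin    = 2.0;
    constexpr double hbm_capacity    = hbm_stacks * 24.0;                           // GB
    constexpr double hbm_bandwidth   = hbm_stacks * hbm_bus_bits * hbm_gbps_pin / 8; // GB/s

    std::printf("GDDR6: %.0f GB, %.0f GB/s\n", gddr6_capacity, gddr6_bandwidth);    // 32 GB, 896 GB/s
    std::printf("HBM:   %.0f GB, %.0f GB/s\n", hbm_capacity, hbm_bandwidth);        // 192 GB, 2048 GB/s
}
```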

 

HBM's issue is that it arrived a little early for consumer usage; GDDR still has enough legs left to soldier on for a bit longer, and as an established technology it's cheaper as a consequence.


44 minutes ago, CarlBar said:

RT is just unfortunate in that it requires hardware improvements that don't benefit conventional rendering. I mean, seriously, between the Tensor cores and the RT cores, an RTX 2080 probably has several times the raw processing power of a 1080 Ti built into it, but outside of RTX most of that extra power is dormant.

This is the case with almost any radical new GPU feature. When hardware T&L came out, any non-DX7 game couldn't use it. When programmable shaders came out, any non-DX8 game couldn't use the new hardware features. When tessellation came out (which required a special-function unit to achieve acceptable results), any non-DX11 game couldn't use it.

 

The only radical change to GPU design that didn't do anything that egregious was the move to a unified shader design, though even that had performance issues. If I recall, the first generation of midrange unified-shader GPUs couldn't convincingly outperform the midrange GPUs of the previous generation.

 

If "conventional rendering" is what was done previously, then all of these features that we take for granted now on modern GPUs are in the same boat as DXR.


1 hour ago, Mira Yurizaki said:

This is the case with almost any radical new GPU feature. When hardware T&L came out, any non-DX7 game couldn't use it. When programmable shaders came out, any non-DX8 game couldn't use the new hardware features. When tessellation came out (which required a special-function unit to achieve acceptable results), any non-DX11 game couldn't use it.

 

The only radical change to GPU design that didn't do anything that egregious was the move to a unified shader design, though even that had performance issues. If I recall, the first generation of midrange unified-shader GPUs couldn't convincingly outperform the midrange GPUs of the previous generation.

 

If "conventional rendering" is what was done previously, then all of these features that we take for granted now on modern GPUs are in the same boat as DXR.

 

DirectX 9 was released in 2002. That's 17 years ago. In the last 17 years, including DXR, there have been two features that relied heavily on dedicated extra hardware. Meanwhile, the Battlefield series has gone from this:

 

[Screenshot: Battlefield 1942]

 

To This:

 

[Screenshot: a modern Battlefield title]

 

There have been a lot of graphical enhancements over the last two decades, and only a few have relied on specific hardware rather than just ever-increasing raw rendering power.


1 hour ago, CarlBar said:

DirectX 9 was released in 2002. That's 17 years ago. In the last 17 years, including DXR, there have been two features that relied heavily on dedicated extra hardware. Meanwhile, the Battlefield series has gone from this:

 

There have been a lot of graphical enhancements over the last two decades, and only a few have relied on specific hardware rather than just ever-increasing raw rendering power.

Everything I mentioned required hardware realization of those features to reach "acceptable" levels of performance with the feature enabled.

 

This is what happens when a GPU without hardware transform and lighting tries to run software that supports it (note the V3-2000 versus the CL-GeForce 256 (K7-650) scores):

[Benchmark chart: Quake 3 demo1 frame rates]

 

This is what happens when a GPU that doesn't support programmable shaders tries to run software that uses them (GeForce 3 vs. everything else):

[Benchmark charts: 3DMark results]

 

Before you go "wait, the GeForce 3 was really powerful anyway":

(None of these games had programmable shader support.)

[Benchmark charts: Serious Sam, Unreal Tournament, and other non-shader titles at 1024x768]

 

 

Tessellation is the only feature where DirectX 10 hardware couldn't run the DirectX 11 rendering path at all. Apparently the amount of processing power needed for tessellation necessitated a special fixed-function unit in hardware, because general-purpose geometry processing isn't fast enough: https://www.anandtech.com/show/2716/7

 

And funny thing: this bit right here gives a real sense of déjà vu:

Quote

We do still have the problem that all the transistors put into the tessellator are worthless unless developers take advantage of the hardware. But the reason it makes sense is that the ROI (return on investment: what you get for what you put in) on those transistors is huge if developers do take advantage of the hardware: it's much easier to get huge tessellation performance out of a fixed function tessellator than to put the necessary resources into the Geometry Shader to allow it to be capable of the same tessellation performance programmatically.

 

