
World's first fully ray traced game (mod) released

Humbug

People go on and on about how ray tracing is a gimmick and not worth it, but they forget that this is just a stepping stone for the future. Yes, it's in its infancy, the hardware is expensive, it's not optimized, and it needs time to mature. But we HAVE to start somewhere; we can't just do something and then toss it away because it isn't PERFECT the first time we try it.


3 hours ago, straight_stewie said:

Sure. There are many, many ways to do ray tracing ahead of play time. As of right now, there is only one way to do ray tracing that begins to approach real time on reasonable hardware, and that is to leverage Nvidia's RT cores. Is it the best solution possible? We don't know. But right now it is the only solution available, and it is a disservice to computer science as a whole to dismiss it and demand that no one so much as publicly experiment with the technology, which is what everyone who doesn't like RTX seems to be doing.

 

 

Random point. BF5 does not use the RT cores; it also does not use the Tensor cores. Its version of ray tracing is being done entirely in the normal render pipeline.

 

2 hours ago, laminutederire said:

I never said they didn't do anything. I said they're not the experts you make them out to be, solely because other people have sunk 20 years into this when they didn't, so you can't expect them to be world leaders. And that fact seemed to irritate you, since you took your time ignoring that I was just saying they're far from real ray tracing and that their released approach is limited.

 

Wait, seriously? You've admitted that NVIDIA developed and ships (and apparently still ships) a key piece of software in ray tracing implementations (OptiX) over a fair span of time, and you're trying to claim they're not experts?

 

You don't develop something that's even a partial industry standard without being experts. It's like saying Einstein wasn't an expert in physics because the only theory he's known for is relativity.

 

It's possible to have more than one expert in something, and even to be experts in different aspects of a given thing.

 

Also, path tracing, whilst related to ray tracing, is not in fact ray tracing; it's a separate (but again, related) technique.


@Humbug hey man, would you mind adding the following to your post?

 

Quake Wars: Ray-Traced

QW:RT site

 

They also did the following games with ray tracing -

Quake 3

Quake 4 (and)

Wolfenstein

You can bark like a dog, but that won't make you a dog.

You can act like someone you're not, but that won't change who you are.

 

Finished Crysis without a discrete GPU, 15 FPS average, and a lot of heart

 

How I plan my builds -

Spoiler

For me I start with the "There's no way I'm not gonna spend $1,000 on a system."

Followed by the "Wow I need to buy the OS for a $100!?"

Then "Let's start with the 'best budget GPU' and 'best budget CPU' that actually fits what I think is my budget."

Realizing my budget is a lot less, I work my way to "I think these new games will run on a cheap ass CPU."

Then end with "The new parts launching next year is probably gonna be better and faster for the same price so I'll just buy next year."

 


15 minutes ago, CarlBar said:

Random point. BF5 does not use the RT cores; it also does not use the Tensor cores. Its version of ray tracing is being done entirely in the normal render pipeline.

DirectX 12 uses the RT cores when they are available dude.

Direct from AnandTech:

Quote

But like Microsoft’s other DirectX APIs it’s important to note that the company isn’t defining how the hardware should work, only that the hardware needs to support certain features. Past that, it’s up to the individual hardware vendors to create their own backends for executing DXR commands. As a result – and especially as this is so early – everyone from Microsoft to hardware vendors are being intentionally vague about how hardware acceleration is going to work.

And this fancy table with what they knew about vendor specific hardware support:

Quote
DirectX Raytracing Planned Support
Vendor           | Support
AMD              | Indeterminate - Driver Due Soon
NVIDIA Volta     | Hardware + Software (RTX)
NVIDIA Pre-Volta | Software


There is a "fallback layer" so that it can be used without hardware support, but Microsoft recommends this only for early development before hardware support is available:

Quote

It’s not the fastest option, but it lets developers immediately try out the API and begin writing software to take advantage of it while everyone waits for newer hardware to become more prevalent. However the fallback layer is not limited to just developers – it’s also a catch-all to ensure that all DirectX 12 hardware can support ray tracing – and talking with hardware developers it sounds like some game studios may try to include DXR-driven effects as soon as late this year, if only as an early technical showcase to demonstrate what DXR can do.



Gee, it's almost like AnandTech did their research on that one or something.

ENCRYPTION IS NOT A CRIME


6 minutes ago, straight_stewie said:

DirectX 12 uses the RT cores when they are available dude.

Direct from AnandTech:

And this fancy table with what they knew about vendor specific hardware support:


There is a "fallback layer" so that it can be used without hardware support, but Microsoft recommends this only for early development before hardware support is available:



Gee, it's almost like AnandTech did their research on that one or something.

 

It's been confirmed they're not using them. What AnandTech has to say about how RT works and what DICE are actually doing are two different things.


21 minutes ago, CarlBar said:

It's been confirmed they're not using them. What AnandTech has to say about how RT works and what DICE are actually doing are two different things.

Every article I read about it, either from DICE, from interested bloggers, or from interviews with DICE, says that they are using DXR for their ray tracing.

DXR is the name of the part of the DX12 API that supports ray tracing, which runs on whatever implementation the device drivers are providing. In Nvidias case, that is RT cores when available.

So please, point me to where you are getting your information, because not only can I not find it, I am finding an overwhelming amount of the exact opposite of what you are saying.

ENCRYPTION IS NOT A CRIME


3 hours ago, Mihle said:

Software-wise, Nvidia haven't done that much when it comes to RT, but hardware-wise, they have the best product I know of.

We don't know that as we haven't tested "the other side"...

But there was an interview with someone on the other side citing Luxmark as being 1.62 times faster than on nVidia.

 

And what I really don't get is why people believe the shit coming from nVidia without any critical thinking...

It wouldn't be the first time they've made something out to be bigger than it deserves, and bigger than they deserve.

 

How about this theory: they heard from a source that AMD was working with Microsoft on ray tracing for the next-gen consoles. As soon as they heard that, they had to do something and invest in ray tracing as well.

And it just works. And they had their Tensor cores, which they had for other stuff, so they figured they could use those for ray tracing. It wouldn't be awesome, but it works. And they could steal the fame from the other side, the same thing they did with G-Sync. (And as we know, AMD and nVidia both sit on VESA; "FreeSync" is just the marketing name for the (initially) VESA standard.)

 

So what the hell is the problem with being sceptical of what nVidia says? Do you remember "this is Fermi", with the wood screws (we call them Spax) and the sawed-off PCB?

"Hell is full of good meanings, but Heaven is full of good works"


1 hour ago, CarlBar said:

Random point. BF5 does not use the RT cores; it also does not use the Tensor cores. Its version of ray tracing is being done entirely in the normal render pipeline.

Is it nVidia only? Or does it work with "the other side" as well? And on non-RTX cores?

 

Because that is basically the goal of the RTX stuff: to close off the ray tracing stuff from competitors and own it, so that gamers have to buy the nVidia stuff. Kinda like PhysX at the time...

"Hell is full of good meanings, but Heaven is full of good works"


5 hours ago, Stefan Payne said:

Is it nVidia only? Or does it work with "the other side" as well? And on non-RTX cores?

 

Because that is basically the goal of the RTX stuff: to close off the ray tracing stuff from competitors and own it, so that gamers have to buy the nVidia stuff. Kinda like PhysX at the time...

That's not the goal at all. DXR is not Nvidia-only, end of story. When are you going to stop spouting lies about anything that isn't AMD?

 

5 hours ago, Stefan Payne said:

We don't know that as we haven't tested "the other side"...

But there was an interview with someone on the other side citing Luxmark as being 1.62 times faster than on nVidia.

 

And what I really don't get is why people believe the shit coming from nVidia without any critical thinking...

It wouldn't be the first time they've made something out to be bigger than it deserves, and bigger than they deserve.

 

How about this theory: they heard from a source that AMD was working with Microsoft on ray tracing for the next-gen consoles. As soon as they heard that, they had to do something and invest in ray tracing as well.

And it just works. And they had their Tensor cores, which they had for other stuff, so they figured they could use those for ray tracing. It wouldn't be awesome, but it works. And they could steal the fame from the other side, the same thing they did with G-Sync. (And as we know, AMD and nVidia both sit on VESA; "FreeSync" is just the marketing name for the (initially) VESA standard.)

 

So what the hell is the problem with being sceptical of what nVidia says? Do you remember "this is Fermi", with the wood screws (we call them Spax) and the sawed-off PCB?

The problem is that every time there is a discussion about Nvidia or Intel, you and a half dozen others pop in making comments like this and the ones above, trying to convince everyone that Nvidia is rubbish. No one is saying the things you think they are; you're just oversensitive and can't accept anything that isn't "all hail AMD". You are the very problem you make others out to be; there are hardly any comments in this thread that conclude what you claim.

 

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


6 hours ago, Stefan Payne said:

Is it nVidia only?

No. The way it works is as follows:

  1. Microsoft is developing DX12
  2. As part of DX12, Microsoft has developed a ray tracing API called DXR
  3. As part of DXR, Microsoft has specified what functions the device drivers must implement in order to be DX12 compliant.
  4. Device makers get the option of either implementing the functions in hardware, in software, in a combination of both, or not being DX12 compliant.

Nvidia was just first to market, and calls their DXR implementation RTX. AMD will eventually ship an implementation as well, most likely when the market is a little more mature and the cost of developing such devices comes down.
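The layering in the numbered list above can be sketched as a toy model. To be clear, these class and function names are hypothetical illustrations, not the real D3D12/DXR interfaces: the point is only that the API defines a contract, and each vendor's driver supplies whichever backend it has (hardware, software fallback, or a mix).

```python
# Toy model of the DXR layering described above -- hypothetical names,
# NOT the actual D3D12/DXR API. The API layer defines a contract; each
# vendor's driver decides how rays actually get traced.

class Driver:
    """A hypothetical vendor driver advertising its capabilities."""
    def __init__(self, name, has_rt_hardware):
        self.name = name
        self.has_rt_hardware = has_rt_hardware  # e.g. RT cores present?

    def dispatch_rays(self):
        # The contract is identical either way; only the backend differs.
        # A vendor could also implement a mix of the two paths.
        if self.has_rt_hardware:
            return f"{self.name}: hardware-accelerated ray dispatch"
        return f"{self.name}: software fallback on shader cores"

def trace_scene(driver):
    """The 'API' layer: callers never care which backend runs the rays."""
    return driver.dispatch_rays()

print(trace_scene(Driver("vendor-a", True)))
print(trace_scene(Driver("vendor-b", False)))
```

This is why the same DXR-using game can run on a Titan V, a Turing card, or (in principle) a future AMD card: the game only ever talks to the top layer.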

An interesting side note is that Nvidia will probably fully expose the RT cores to CUDA developers in a future version of CUDA. This would have implications for improvements in everything from wave propagation simulation to particle motion physics. However, this likely won't happen until AMD has released their DXR implementation, as releasing a general-purpose API for the RT cores more or less requires telling everyone fairly in-depth information about how they work.

So, right at the moment what we should be doing, instead of bashing Nvidia for being DX12 compliant, is urging AMD to be DX12 compliant, as that will allow Nvidia to release the knowledge, and provide a benefit to the scientific community, and thus the world as a whole.

Save the world, support DXR.

ENCRYPTION IS NOT A CRIME


8 hours ago, straight_stewie said:

Every article I read about it, either from DICE, from interested bloggers, or from interviews with DICE, says that they are using DXR for their ray tracing.

DXR is the name of the part of the DX12 API that supports ray tracing, which runs on whatever implementation the device drivers are providing. In Nvidias case, that is RT cores when available.

So please, point me to where you are getting your information, because not only can I not find it, I am finding an overwhelming amount of the exact opposite of what you are saying.

 

@leadeater I think it was you who gave me confirmation on this. That, or you were in the same thread; I can't seem to find it (too many threads to go through, and my memory for this kind of thing sucks :().

 

@straight_stewie I remember being surprised too. I increasingly get the impression that DICE's implementation of ray tracing is a huge kludge.


27 minutes ago, CarlBar said:

 

@leadeater I think it was you who gave me confirmation on this. That, or you were in the same thread; I can't seem to find it (too many threads to go through, and my memory for this kind of thing sucks :().

 

@straight_stewie I remember being surprised too. I increasingly get the impression that DICE's implementation of ray tracing is a huge kludge.

Commenting having barely read any of the conversation chain, but from my understanding BFV is using DXR (that's a given), with the Tensor cores doing the BVH calculations and the denoising being done on shader cores; the RT cores are not being used. DXR supports hardware-accelerated paths, and RT and Tensor cores can be used for hardware acceleration, but that doesn't mean they are being used or automatically will be.

 

Honestly though, I purge the BFV ray tracing stuff from my mind after every conversation about it, because BFV is not a complete implementation of Nvidia RTX; it's nearly worthless for technical analysis, so I just don't care. @S w a t s o n would be the better person to ask, from memory.


1 hour ago, leadeater said:

Honestly though, I purge the BFV ray tracing stuff from my mind after every conversation about it, because BFV is not a complete implementation of Nvidia RTX; it's nearly worthless for technical analysis, so I just don't care. @S w a t s o n would be the better person to ask, from memory.

By "not complete" I assume you mean the game is traditionally rasterized and just uses ray tracing for reflections?

Well, that is what the current (and next few) generations of hardware will be capable of in real time.

 

11 hours ago, Stefan Payne said:

Is it nVidia only? Or does it work with "the other side" as well? And on non-RTX cores?

  

Because that is basically the goal of the RTX stuff: to close off the ray tracing stuff from competitors and own it, so that gamers have to buy the nVidia stuff. Kinda like PhysX at the time...

From HardOCP:

DXR (which stands for Microsoft DirectX Ray Tracing) is inherent to Microsoft’s DirectX. It is tied into DirectX 12 and anytime you hear DXR it refers to the API or game supporting Microsoft Ray Tracing in the API. It only runs under DX12. DXR does not have anything to do with NVIDIA specifically. It is a game feature you can toggle on or off if there is hardware support when/if we see hardware support from either AMD or Intel.

https://www.hardocp.com/article/2018/12/17/battlefield_v_nvidia_ray_tracing_rtx_2070_performance/1

 

Basically, the DXR API is a subset of DirectX 12. Intel and AMD can update their own graphics drivers to interface with DXR and run ray tracing. The driver then has to assign the workload either to the shader cores or to dedicated hardware (if available).


18 minutes ago, Humbug said:

By "not complete" I assume you mean the game is traditionally rasterized and just uses ray tracing for reflections?

BFV doesn't utilize the RT cores, so in relation to the full capabilities of Nvidia RTX, software- and hardware-wise, it's not a complete implementation.


3 minutes ago, leadeater said:

BFV doesn't utilize the RT cores, so in relation to the full capabilities of Nvidia RTX, software- and hardware-wise, it's not a complete implementation.

Weird. If BFV talks to the DXR API, then isn't it the Nvidia graphics driver's job after that to decide whether to put the workload on the RT cores or not?


Just now, Humbug said:

Weird. If BFV talks to the DXR API, then isn't it the Nvidia graphics driver's job after that to decide whether to put the workload on the RT cores or not?

That comes under "it depends": how are the APIs exposed, how can they be used, etc. You can execute general compute on the Tensor cores, so there is nothing stopping you from running the BVH math on those.
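For context, the "BVH math" being discussed is largely ray-vs-bounding-box tests: a BVH is a tree of axis-aligned boxes, and traversal repeatedly asks whether a ray hits a node's box before descending. A minimal Python sketch of that core test (the standard slab method) purely to illustrate the arithmetic that RT cores, Tensor cores, or plain shaders would be grinding through; it is not tied to any vendor's implementation:

```python
def ray_aabb_hit(origin, direction, box_min, box_max, eps=1e-12):
    """Slab test: does the ray origin + t*direction (t >= 0) hit the
    axis-aligned box [box_min, box_max]? This is the per-node test a
    BVH traversal runs at every level of the tree."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < eps:
            # Ray parallel to this slab: it must already lie inside it.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        # Shrink the interval of t where the ray is inside every slab.
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# A ray shot down +z from in front of the unit box hits it;
# the same ray shifted well off to the side misses.
print(ray_aabb_hit((0, 0, -5), (0, 0, 1), (-1, -1, -1), (1, 1, 1)))
print(ray_aabb_hit((5, 0, -5), (0, 0, 1), (-1, -1, -1), (1, 1, 1)))
```

RT cores implement this kind of traversal in fixed-function hardware; doing it on Tensor or shader cores means expressing the same arithmetic as general compute.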

 

The essential issue was that the hardware did not exist during BFV development, and they were using Titan Vs, which also support DXR. If you remember, that Star Wars tech demo was originally run on 4 Titan Vs.


Since this topic has gone in the direction of talking about DX12, it's worth reiterating at this point that this Quake 2 mod actually uses Vulkan for ray tracing, and thus works across Windows 7-10 and Linux as well. Nvidia were the ones who proposed the Vulkan ray tracing API as a set of vendor extensions, and they say it's very similar to DXR.

http://on-demand.gputechconf.com/gtc/2018/presentation/s8521-advanced-graphics-extensions-for-vulkan.pdf

 

It was introduced in Vulkan 1.1.85 before the launch of Turing and will stabilize as the other vendors accept it.


1 hour ago, Humbug said:

Weird. If BFV talks to the DXR API, then isn't it the Nvidia graphics driver's job after that to decide whether to put the workload on the RT cores or not?

Yes and no. The RT cores actually traverse the BVH using dedicated math hardware. In the Tensor-core-only method of RTX (the Star Wars demo, any RTX game running on a Titan V), the shaders manually trace something like 1 ray per pixel and let the Tensor cores denoise that up to something like 4k-rays-per-pixel quality. BFV is extra special, because the developers decided to use either incorrect terminology or a half-custom variant of the Tensor-core-only method: they state they are NOT using RT cores and are NOT using Tensor core denoising, having built their own "denoiser" from a highly customized version of temporal filtering. My guess is they are in fact using the Tensor core method, but are counting that as the ray trace and not the denoise, because otherwise it would be way too noisy for the temporal filter at only 1 ray per pixel.
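For intuition, the temporal-filtering idea mentioned above boils down to reusing samples across frames: blend each new, noisy 1-sample-per-pixel estimate into a running per-pixel history so the variance shrinks over time. A bare-bones sketch (a plain exponential moving average on a single simulated pixel, vastly simpler than anything DICE actually ships):

```python
import random

def temporal_accumulate(history, noisy_frame, alpha=0.1):
    # Blend the new noisy estimate into the per-pixel history.
    # Smaller alpha = more smoothing (less noise, but more lag/ghosting).
    return [(1 - alpha) * h + alpha * n
            for h, n in zip(history, noisy_frame)]

# One pixel whose true radiance is 1.0, estimated each frame with a
# very noisy single sample -- roughly what 1 ray per pixel looks like.
random.seed(42)
pixel = [0.0]
for _ in range(300):
    noisy = [1.0 + random.uniform(-0.9, 0.9)]
    pixel = temporal_accumulate(pixel, noisy)
# After enough frames the accumulated value settles near the true 1.0.
```

The classic downside of leaning on history like this is lag and ghosting when the scene changes, which is exactly the trade a low-sample-count real-time tracer has to accept.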

 

There is another option: they could possibly be using the shader-only fallback method microshit made to test cards with no hardware acceleration, but that existed only to help develop DXR and is no longer supported or updated. However, you must consider that the game would then be rasterizing the scene, simulating lots of ray tracing on the CPU and shaders, and then denoising (fancy AA) on shaders (not including normal AA for rasterized effects). That's effectively 3x the work on the shaders compared to the Star Wars demo's combination of rasterizing, 1 ray per pixel of actual tracing with Tensor core denoise, and DLSS (no need for AA at all).

 

I'm not a developer, so I don't know the specifics, but recently another game with RTX, Atomic Heart, revealed they had to get a special experimental build of Unreal that supported full ray tracing, plus help from Nvidia directly, and they will surely be using RT cores. So it could be that the engine build used in BFV does not actually support that method.

 

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


 

1 minute ago, S w a t s o n said:

Yes and no. The RT cores actually traverse the BVH using dedicated math hardware. In the Tensor-core-only method of RTX (the Star Wars demo, any RTX game running on a Titan V), the shaders manually trace something like 1 ray per pixel and let the Tensor cores denoise that up to something like 4k-rays-per-pixel quality. BFV is extra special, because the developers decided to use either incorrect terminology or a half-custom variant of the Tensor-core-only method: they state they are NOT using RT cores and are NOT using Tensor core denoising, having built their own "denoiser" from a highly customized version of temporal filtering. My guess is they are in fact using the Tensor core method, but are counting that as the ray trace and not the denoise, because otherwise it would be way too noisy for the temporal filter at only 1 ray per pixel.

 

There is another option: they could possibly be using the shader-only fallback method microshit made to test cards with no hardware acceleration, but that existed only to help develop DXR and is no longer supported or updated. However, you must consider that the game would then be rasterizing the scene, simulating lots of ray tracing on the CPU and shaders, and then denoising (fancy AA) on shaders. That's effectively 3x the work on the shaders compared to the Star Wars demo's combination of rasterizing, 1 ray per pixel of actual tracing with Tensor core denoise, and DLSS (no need for AA at all).

 

I'm not a developer, so I don't know the specifics, but recently another game with RTX, Atomic Heart, revealed they had to get a special experimental build of Unreal that supported full ray tracing, and they will surely be using RT cores, so it could be that the engine build used in BFV does not actually support that method.

 

 

Thanks for chiming in on all that. And yeah, there's a lot of messy info around BFV's RTX implementation; any way you cut it, it sounds like a massive kludge of a solution.

 

Port Royal is a bit more useful, as we know it's using the RT cores. But since it doesn't use the Tensor cores, it's still not representative of what RTX GPUs are capable of with a full implementation.

 

I have to wonder if the reason SotTR seems to have dropped adding RTX support isn't the same "it's a kludge" reasoning.

 

Anyone know what the next game due for release that tries to implement RTX is, and when it will arrive? Atomic Heart is late 2019, I know; not sure about anything else at this point.


2 minutes ago, CarlBar said:

Anyone know what the next game due for release that tries to implement RTX is, and when it will arrive? Atomic Heart is late 2019, I know; not sure about anything else at this point.

Metro Exodus: Feb 15, 2019

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


1 minute ago, S w a t s o n said:

Metro Exodus

 

Thanks :). So half a month or so till we see something more. Any news yet on how complete the implementation is, or are we pissing into the wind on that kind of info ATM?


2 minutes ago, CarlBar said:

 

Thanks :). So half a month or so till we see something more. Any news yet on how complete the implementation is, or are we pissing into the wind on that kind of info ATM?

Pretty complete; it looks to be fully ray-traced global illumination, ambient occlusion, and shadows, I think.

I don't think full GI like this can be done without RT cores, but I don't think there's any confirmed word.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 

