Stupid ideas about ray-tracing

Greetings forum!

 

I just had a weird idea, and since I love people explaining why it would not work, I thought about posting it here, since a lot of people here know infinitely more about technology than I do.

 

So, the new RTX GPUs are pretty nice, but, at least in Battlefield 5, they suffer serious performance hits. This time it is not like other nVidia technology such as HairWorks or PhysX, where nVidia designed the tech themselves, made driver optimisations and then helped game devs make their games look nice on nVidia GPUs, while AMD had no response because the tech was not open source. This time with RTX, it is a feature of DirectX, so AMD can theoretically have ray tracing functionality on their GPUs. Sure, it would run like sh*t, but it would in theory work. But then it struck me: most consumer CPUs have low-power iGPUs. So I wonder, would it be possible to have something like a Ryzen 5 2400G in your system, dedicated to RTX, and then your main GPU, let's say a Vega 64, dedicated to regular rasterisation? Could someone explain to me whether this would not work, or whether it would work in theory? I am looking for some kind of dev project (just something to tinker with) and this sounds fun.


3 minutes ago, VictorN2990WX said:

Greetings forum!

 

I just had a weird idea, and since I love people explaining why it would not work, I thought about posting it here, since a lot of people here know infinitely more about technology than I do.

 

So, the new RTX GPUs are pretty nice, but, at least in Battlefield 5, they suffer serious performance hits. This time it is not like other nVidia technology such as HairWorks or PhysX, where nVidia designed the tech themselves, made driver optimisations and then helped game devs make their games look nice on nVidia GPUs, while AMD had no response because the tech was not open source. This time with RTX, it is a feature of DirectX, so AMD can theoretically have ray tracing functionality on their GPUs. Sure, it would run like sh*t, but it would in theory work. But then it struck me: most consumer CPUs have low-power iGPUs. So I wonder, would it be possible to have something like a Ryzen 5 2400G in your system, dedicated to RTX, and then your main GPU, let's say a Vega 64, dedicated to regular rasterisation? Could someone explain to me whether this would not work, or whether it would work in theory? I am looking for some kind of dev project (just something to tinker with) and this sounds fun.

You can use the iGPU for ray tracing, but someone needs to write a driver for it. And the RTX DirectX implementation is not open source; AMD has to write its own implementation and API, and then it can add that to DirectX. AMD does have a ray tracing API, by the way. I believe it's called ProRender or something. According to their presentation, it will run on consumer GPUs too, and it is open source.


Yes, but you then need to get them to play nicely together, which is going to be very difficult. Take a page out of how Turing handles ray tracing and rasterization.

You would probably be better off having two Vegas and then trying to get them to work together.

It should work fine to some degree if you lock the framerate and get each of them to schedule and merge the data at the correct timing.

Honestly, it is going to be a pain to attempt. I'd rather see how many Vegas you need to merge together to do path tracing, or wait for someone to make an ASIC.


8 minutes ago, VictorN2990WX said:

So I wonder, would it be possible to have something like a Ryzen 5 2400G in your system, dedicated to RTX

No, as RTX isn't part of the DirectX specification. RTX is an Nvidia proprietary API that makes use of DirectX Ray Tracing. It's an extension.

PC Specs - AMD Ryzen 7 5800X3D - MSI B550M Mortar - 32GB Corsair Vengeance RGB DDR4-3600 @ CL16 - ASRock RX7800XT - 660p 1TB & Crucial P5 1TB - Fractal Define Mini C - CM V750v2 - Windows 11 Pro

 


I had a similar idea: have a normal rasterization GPU and buy a separate ray tracing card (like PhysX cards were), then have them run in "CrossFire/SLI".

This is probably way better, since people who just want normal rasterization performance can ignore ray tracing, and enthusiasts can get more powerful ray tracing cards.

MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB


The general consensus seems to be: yeah, if there were no compatibility issues due to Nvidia being Nvidia.


Integrated graphics typically have a limited number of processing "cores", and they run at a lower frequency (for example, the 2400G has 11 compute units / 704 unified shaders, which is less than a third of an RX 580's 36 compute units / 2304 shaders, and they run at roughly 1.1 GHz vs ~1.35 GHz).
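(For a rough sense of that gap, here is a back-of-envelope peak FP32 comparison using the shader counts above. It assumes one fused multiply-add, i.e. 2 FLOPs, per shader per clock and the approximate clocks quoted; a sketch only, real throughput will be lower.)

```cpp
#include <cstdio>

// Rough peak FP32 throughput in GFLOPS: shaders * clock (GHz) * 2 (one FMA per clock).
// Illustration only -- real performance also depends on occupancy, memory and power limits.
static double peak_gflops(int shaders, double clock_ghz) {
    return shaders * clock_ghz * 2.0;
}

int main() {
    const double vega11 = peak_gflops(704, 1.10);  // Ryzen 5 2400G iGPU (Vega 11)
    const double rx580  = peak_gflops(2304, 1.35); // RX 580 for comparison
    std::printf("Vega 11 iGPU: ~%.0f GFLOPS\n", vega11);                   // ~1549
    std::printf("RX 580:       ~%.0f GFLOPS\n", rx580);                    // ~6221
    std::printf("iGPU is ~%.0f%% of an RX 580\n", 100.0 * vega11 / rx580); // ~25%
    return 0;
}
```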

 

Further, if you abuse the integrated graphics, chances are you'll hit the 65 W TDP (or whatever the value is for that CPU), and you may cause the actual CPU cores to reduce their frequencies so that the chip stays within its power budget. When you have only four cores, as you do with a 2400G, it would make more sense to give the CPU part priority.

 

So basically: too little processing power to matter, and too much difficulty making a game engine work with multiple cards, uploading data to several places, waiting for cards to complete jobs, synchronizing everything... imagine the frame stutter and all that. I don't see it happening.

 

 


As a lot of people have said, we indeed won't get RTX on AMD, but AMD can create a ray tracing extension for DirectX and Vulkan just like NVIDIA did, and if they (AMD and the game developers) do it right, it might even be compatible without modifying the game!

But that depends on how games utilize ray tracing: if they only use DirectX function calls, then AMD can easily make it compatible, but if a game decides to talk to the card directly, then it's a no.
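To illustrate what "only DirectX function calls" means in practice: a game can ask the D3D12 driver whether it exposes DXR at all, without knowing or caring whose GPU is underneath. A minimal sketch using the standard D3D12 feature check (error handling trimmed; assumes a DXR-capable Windows SDK):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // Ask the driver which DXR tier it supports -- this is the vendor-agnostic path.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED) {
        std::puts("DXR supported: the game can call DispatchRays() regardless of vendor.");
    } else {
        std::puts("No DXR support exposed by this driver.");
    }
    return 0;
}
```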


DXR itself is an open standard; whether it stays vendor-neutral depends on whether Nvidia pushes hard enough to get exclusives, but proprietary software takes a lot of convincing, and developers usually develop for the lowest common denominator.

RTX is too slow to be a usable technology right now; it probably needs about five years to mature.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


The far bigger issue to me is that ray tracing does not look better than the trickery that game developers already do. At least not noticeably.


36 minutes ago, NeuesTestament said:

The far bigger issue to me is that ray tracing does not look better than the trickery that game developers already do. At least not noticeably.

Just today I noticed in Killing Floor 2 how the zeds (the monsters) reflected off a subway entrance stairway wall as they walked past. A detail I had never noticed before, and I've played that map quite often. I don't know how they do it in Unreal Engine 3.x, but the reflections look incredibly convincing and seem to be actual real-time reflections, not some vague fake reflection-mimicking thing like in most games, where looking at them from the "wrong" perspective shows that they're not really reflecting the actual content. But here in Killing Floor 2 it actually does. It seems to do this only for a limited distance, but the reflection content appears to be a genuine reflection.

 

@VictorN2990WX

The only problem with the iGPU idea is that iGPUs are generally too crappy even for "faked" rasterized graphics. Expecting them to do ray tracing is literally expecting them to do the impossible.

 

Maybe we will get ray tracing accelerators (similar to the 3D accelerators at the beginning of 3D gaming), but I find that a very unlikely scenario, as it works better when it's part of the rasterizer chip (the GPU).


3 hours ago, NeuesTestament said:

The far bigger issue to me is that ray tracing does not look better than the trickery that game developers already do. At least not noticeably.

 

2 hours ago, RejZoR said:

Just today I noticed in Killing Floor 2 how the zeds (the monsters) reflected off a subway entrance stairway wall as they walked past. A detail I had never noticed before, and I've played that map quite often. I don't know how they do it in Unreal Engine 3.x, but the reflections look incredibly convincing and seem to be actual real-time reflections, not some vague fake reflection-mimicking thing like in most games, where looking at them from the "wrong" perspective shows that they're not really reflecting the actual content. But here in Killing Floor 2 it actually does. It seems to do this only for a limited distance, but the reflection content appears to be a genuine reflection.

 

@VictorN2990WX

The only problem with the iGPU idea is that iGPUs are generally too crappy even for "faked" rasterized graphics. Expecting them to do ray tracing is literally expecting them to do the impossible.

 

Maybe we will get ray tracing accelerators (similar to the 3D accelerators at the beginning of 3D gaming), but I find that a very unlikely scenario, as it works better when it's part of the rasterizer chip (the GPU).

 

I went over these points broadly in the tech news BFV DXR thread, but any time you see a reflection in a game without RTX in play, it's the result of one or usually more hand-created and hand-positioned art assets. Every reflection you see in game is the result of an individually created solution for that specific reflection (and possibly that specific viewing angle; I'm not 100% sure on that, though).

That means the quality of the reflections is going to be directly proportional to how much work, and thus money, they throw at that aspect of things.

 

Ray Tracing provides a simpler solution to very high quality reflections because you just have to create your baseline art assets in a slightly different way.

 

On top of that, reflections are the one of the three things RTX can do that we can already fake most convincingly. Lighting and shadows are much more constrained and will see much greater gains.

Lastly, BFV isn't the best game for showing off reflections; it has too much muck to really show them off. Good reflections become increasingly noticeable the more of them there are, and water aside, there aren't a huge number of highly reflective surfaces in BFV.


I've never seen things like inverted reflections (view-space reflections) like in some of the examples that are mostly around to promote RTX. The reflections in Killing Floor 2 look so real you could easily fool people into thinking they're ray traced. I don't know the specifics of how UE 3.x does it, but it looks so convincing I don't even care if it's faked.

What people don't understand is that ray tracing is great when it can actually be done properly. I was working with offline ray tracers that took hours on a quad core, and I used 64 rays per pixel (I tried 256 rays once, but it would have taken days, so I stopped the render). And the results often weren't as perfect as I wanted. NVIDIA is using 1 (ONE!) ray per pixel.

In all honesty, RTX is a faked effect as well, because they are cutting corners just to make it work in real time. Frankly, I don't see much value in it. It's a beginning, sure, but of little worth at the moment. We need to increase the number of rays per pixel to at least 16 to get somewhat realistic results, and it has to be used on a wider range of objects: glasses on tables, windows, metallic objects, and especially weapons in first person, where you have them in view 99% of the time.


You'd have massive latency between the normal GPU and the iGPU, which is why the RT cores and GPU cores are on the same die.


23 hours ago, VictorN2990WX said:

Could someone explain to me whether this would not work, or whether it would work in theory? I am looking for some kind of dev project (just something to tinker with) and this sounds fun.

With DirectX 12, in theory two different GPUs can work in tandem. The problem is that the ray tracing step must be done before any shading can take place. So you're not going to get better performance, because what's causing performance to tank when DXR is enabled is very likely that the RT cores are taxed. If the multi-GPU system in DX12 lets you split the workload between two GPUs, then you might be able to get a bit of a performance boost, since DXR allows a fallback to the equivalent of a software RT resolve. But this is just more work than it's really worth at this point.
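For reference, DX12's explicit multi-adapter model does expose every GPU in the system (dGPU and iGPU alike), but the application has to create a device per adapter, schedule work on each, and copy results between them itself. A minimal sketch of just the enumeration step, using standard DXGI calls:

```cpp
#include <dxgi.h>
#include <wrl/client.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // With explicit multi-adapter, the engine sees every GPU and must manage each one
    // explicitly -- separate D3D12 devices, separate command queues, manual copies.
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the WARP rasterizer
        std::wprintf(L"Adapter %u: %ls (%zu MB dedicated VRAM)\n",
                     i, desc.Description, desc.DedicatedVideoMemory >> 20);
    }
    return 0;
}
```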

 

12 hours ago, RejZoR said:

I've never seen things like inverted reflections (view-space reflections) like in some of the examples that are mostly around to promote RTX. The reflections in Killing Floor 2 look so real you could easily fool people into thinking they're ray traced. I don't know the specifics of how UE 3.x does it, but it looks so convincing I don't even care if it's faked.

It's likely rendering another viewport to a texture, like how Half-Life 2 does its "TV broadcasts". The problem is that you can't really use this method extensively unless you're willing to sacrifice quality somewhere.

 

EDIT: It looks like Killing Floor 2 just uses screen space reflections. The thing with screen space reflections is that they can only reflect what you're actually seeing; they can't reflect anything you can't directly see, because it was likely clipped from the viewport. An example where screen space reflections fail:

 

[screenshot: ffxiv_11142018_065936.png]

 

The reflection off the water cuts off too early and too sharply because the viewport isn't rendering the rest of the scenery.
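To make that failure mode concrete, here is a rough CPU-side sketch of the idea behind screen space reflections: the reflected ray is marched through the already-rendered depth buffer, so as soon as it leaves the screen there is simply nothing left to sample. The function and names here are illustrative only, not any particular engine's shader:

```cpp
#include <vector>
#include <optional>

struct Vec2 { float x, y; };

// March a reflected ray across the depth buffer in screen space.
// Returns the pixel to sample as the reflection, or nothing if the ray leaves the
// screen first -- the case where SSR visibly cuts off, as in the screenshot above.
std::optional<Vec2> trace_ssr(const std::vector<float>& depth, int w, int h,
                              Vec2 p, Vec2 step, float ray_depth, float depth_step) {
    for (int i = 0; i < 256; ++i) {
        p.x += step.x;
        p.y += step.y;
        ray_depth += depth_step;
        // Off-screen: whatever should be reflected here was never rendered,
        // so there is no data to return. This is the hard limit of SSR.
        if (p.x < 0 || p.y < 0 || p.x >= w || p.y >= h)
            return std::nullopt;
        // The ray has passed behind the geometry stored in the depth buffer: hit found.
        if (ray_depth >= depth[int(p.y) * w + int(p.x)])
            return p;
    }
    return std::nullopt; // no intersection within the step budget
}

int main() {
    // Flat depth buffer for a 64x64 "screen"; the ray starts mid-screen heading for
    // the right edge, so it exits the view before ever finding anything to reflect.
    std::vector<float> depth(64 * 64, 1.0f);
    auto hit = trace_ssr(depth, 64, 64, {32, 32}, {1, 0}, 0.0f, 0.001f);
    return hit ? 0 : 1; // returns 1 here: nothing to reflect once the ray leaves the screen
}
```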

 

Quote

What people don't understand is that ray tracing is great when it can actually be done properly. I was working with offline ray tracers that took hours on a quad core, and I used 64 rays per pixel (I tried 256 rays once, but it would have taken days, so I stopped the render). And the results often weren't as perfect as I wanted. NVIDIA is using 1 (ONE!) ray per pixel.

At 1080p and 60 FPS, that's only about a percent of the claimed performance of the RTX 2080 Ti: 1920 * 1080 * 60 = 124,416,000 rays/sec vs. the 10,000,000,000 rays/sec NVIDIA claims. So what happened to the other 99%?

 

You're also comparing the performance of a CPU to a GPU. Even if I assumed that was an i7-7700K, SiSoftware Sandra benchmarks peg it at about 53 GFLOPS of single-precision performance, compared to the ~11,000 GFLOPS of single-precision performance the 2080 Ti can do.
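Running the same arithmetic in both directions makes the gap easier to see; a quick sketch, treating NVIDIA's 10 Gigarays/s purely as a number:

```cpp
#include <cstdio>

int main() {
    const double width = 1920, height = 1080, fps = 60;
    const double rays_per_pixel = 1.0;        // roughly what games cast today
    const double claimed_rays_per_sec = 10e9; // NVIDIA's "10 Gigarays/s" headline figure

    const double rays_needed = width * height * fps * rays_per_pixel;
    std::printf("1080p60 at 1 ray/pixel: %.0f rays/s (%.1f%% of the claim)\n",
                rays_needed, 100.0 * rays_needed / claimed_rays_per_sec);
    // ~124,416,000 rays/s, i.e. ~1.2% of 10 Gigarays/s.

    std::printf("Per-pixel budget implied by the claim: ~%.0f rays/pixel\n",
                claimed_rays_per_sec / (width * height * fps));
    // ~80 rays per pixel in theory -- nowhere near what shows up in practice.
    return 0;
}
```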

 

Quote

In all honesty, RTX is a faked effect as well, because they are cutting corners just to make it work in real time. Frankly, I don't see much value in it. It's a beginning, sure, but of little worth at the moment. We need to increase the number of rays per pixel to at least 16 to get somewhat realistic results, and it has to be used on a wider range of objects: glasses on tables, windows, metallic objects, and especially weapons in first person, where you have them in view 99% of the time.

How is it "faked"? Either it's using a ray tracing algorithm or it's not.


I was addressing the quality issues with a single ray being used, and the mention of CPU rendering was just an example where I got artefacting even with 16 rays per pixel.

 

As for the faking part: see, you're not understanding ray tracing correctly. Just because it's happening, that doesn't mean it's absolutely correct. And they are faking it, but not in the sense everyone means when they call standard rasterization "faking". Faking here doesn't mean the output isn't accurate to an expected degree; it's about how they obtain it. What they are doing is basically rendering it through a viewport, blasting rays from the viewport at things and then getting back the result (I must check how they take light sources into account, though). Whereas in real ray tracing, the rays originate from the light sources (every muzzle flash, every bulb, every fire, every spark), hit objects, bounce around multiple times, and then the ones hitting the viewport (your actual view) are captured and output to the user. It's so wasteful because a lot of rays never actually hit your viewport (or your eyes), but they have to be shot from the light sources anyway, because you're moving around and so is the light. But that's how real light behaves: photons are bouncing all over the place even when they aren't hitting your retina. So I'm not blaming NVIDIA for reversing the process to obtain reasonable performance from it. It's natural to optimize things to make them work at acceptable rates. But it's not how actual light behaves. Your eyes are not projecting the view; it's the light coming into your eyes doing that.

 

Basically, they are using an IR camera approach (the way we get night vision). The IR blaster on the camera emits IR light and blasts it at the scenery; a camera capable of "seeing" IR then records the scenery, and we get an image even in absolute darkness. That's basically how NVIDIA is doing ray tracing: the emitter is in your viewport, shooting out rays of light, and the viewport then captures them back, just like the IR camera.
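To make the direction of travel concrete, here is a bare-bones sketch of the camera-first ("backward") approach being described: one ray per pixel, shot from the viewpoint into the scene, so every ray traced contributes to the image. A light-traced ("forward") renderer would instead loop over rays leaving each light source and keep only the tiny fraction that happens to reach the camera. The scene and shading below are placeholders; this is illustrative C++, not NVIDIA's pipeline:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };

// Stand-in for a real intersection + shading routine; an actual renderer would
// traverse a BVH here and evaluate materials and light sources at the hit point.
static Vec3 shade(const Ray& r) {
    float t = 0.5f * (r.dir.y + 1.0f);  // simple gradient, just to return something
    return { 1.0f - 0.5f * t, 1.0f - 0.3f * t, 1.0f };
}

int main() {
    const int w = 4, h = 3; // a tiny "image", just to show the loop structure

    // Backward ray tracing: iterate over *pixels* and shoot one ray per pixel from the
    // camera into the scene. Every ray contributes to the final image, which is why
    // real-time implementations start from the eye rather than from the lights.
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            Ray primary;
            primary.origin = {0.0f, 0.0f, 0.0f};            // camera at the origin
            primary.dir    = {(x + 0.5f) / w - 0.5f,        // through the pixel centre
                              (y + 0.5f) / h - 0.5f,
                              1.0f};
            Vec3 c = shade(primary);
            std::printf("pixel (%d,%d) -> %.2f %.2f %.2f\n", x, y, c.x, c.y, c.z);
        }
    }
    return 0;
}
```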

 

Frankly, doing real ray tracing without having reflections, global illumination and shadows as a package is no ray tracing to me, as these things are interconnected. They shouldn't exist without one another, because only when you have all of them in your view is the outcome convincing enough to say "wow, that looks realistic".


37 minutes ago, M.Yurizaki said:

 

How is it "faked"? Either it's using a ray tracing algorithm or it's not.

 

Maybe this helps a bit:

[embedded videos]


7 hours ago, RejZoR said:

So I'm not blaming NVIDIA for reversing the process to obtain reasonable performance from it. It's natural to optimize things to make them work at acceptable rates. But it's not how actual light behaves. Your eyes are not projecting the view; it's the light coming into your eyes doing that.

Ray tracing is generally understood to be the method where rays come from the viewpoint, not from the light source, precisely because of the problem you described. A lot of academic articles I've seen on the subject also describe ray tracing using this method. Besides, if we generalize the algorithm, it should be about following the path of a ray and its interaction with objects, regardless of its origin.

 

7 hours ago, RejZoR said:

Frankly, doing real ray tracing without having reflections, global illumination and shadows as a package is no ray tracing to me, as these things are interconnected. They shouldn't exist without one another, because only when you have all of them in your view is the outcome convincing enough to say "wow, that looks realistic".

It's fine to say what you could call realistic looking ray tracing or whatever, but regardless, if the lighting algorithm is using ray tracing, it's using ray tracing.

 

7 hours ago, mariushm said:

Maybe this helps a bit:

Neither of these answers the question of whether ray tracing, when RTX is on, is being "faked". If anything, they point out the issues with screen space reflections and some things that were apparently missed.


This is actually a very nice idea. If it does come true, I can see it being a huge punch in NVIDIA's face.

"Hey look, we got ray tracing working on an existing piece of hardware, take that!"


2 hours ago, Bcat00 said:

This is actually a very nice idea. If it does come true, I can see it being a huge punch in NVIDIA's face.

"Hey look, we got ray tracing working on an existing piece of hardware, take that!"

Getting it to work on existing hardware wouldn't be that exciting. It's the dedicated hardware that makes it possible to do in real time. In another thread there's an example of a mobile chip with dedicated ray tracing hardware showing performance on par with a Maxwell GPU.

Crystal: CPU: i7 7700K | Motherboard: Asus ROG Strix Z270F | RAM: GSkill 16 GB@3200MHz | GPU: Nvidia GTX 1080 Ti FE | Case: Corsair Crystal 570X (black) | PSU: EVGA Supernova G2 1000W | Monitor: Asus VG248QE 24"

Laptop: Dell XPS 13 9370 | CPU: i5 10510U | RAM: 16 GB

Server: CPU: i5 4690k | RAM: 16 GB | Case: Corsair Graphite 760T White | Storage: 19 TB


On 11/22/2018 at 4:42 PM, RejZoR said:

Just today I noticed in Killing Floor 2 how the zeds (the monsters) reflected off a subway entrance stairway wall as they walked past. A detail I had never noticed before, and I've played that map quite often. I don't know how they do it in Unreal Engine 3.x, but the reflections look incredibly convincing and seem to be actual real-time reflections, not some vague fake reflection-mimicking thing like in most games, where looking at them from the "wrong" perspective shows that they're not really reflecting the actual content. But here in Killing Floor 2 it actually does. It seems to do this only for a limited distance, but the reflection content appears to be a genuine reflection.

 

@VictorN2990WX

The only problem with the iGPU idea is that iGPUs are generally too crappy even for "faked" rasterized graphics. Expecting them to do ray tracing is literally expecting them to do the impossible.

 

Maybe we will get ray tracing accelerators (similar to the 3D accelerators at the beginning of 3D gaming), but I find that a very unlikely scenario, as it works better when it's part of the rasterizer chip (the GPU).

Actually, they are a perfect fit: AMD's Vega iGPUs (APUs) would be well suited to handling ray tracing.

On 11/22/2018 at 7:29 PM, CarlBar said:

 

 

I went over these points broadly in the tech news BFV DXR thread, but any time you see a reflection in a game without RTX in play, it's the result of one or usually more hand-created and hand-positioned art assets. Every reflection you see in game is the result of an individually created solution for that specific reflection (and possibly that specific viewing angle; I'm not 100% sure on that, though).

That means the quality of the reflections is going to be directly proportional to how much work, and thus money, they throw at that aspect of things.

 

Ray Tracing provides a simpler solution to very high quality reflections because you just have to create your baseline art assets in a slightly different way.

 

On top of that, reflections are the one of the three things RTX can do that we can already fake most convincingly. Lighting and shadows are much more constrained and will see much greater gains.

Lastly, BFV isn't the best game for showing off reflections; it has too much muck to really show them off. Good reflections become increasingly noticeable the more of them there are, and water aside, there aren't a huge number of highly reflective surfaces in BFV.

Reflections aren't hand-drawn; reflections in games are mostly just in-game cameras that render a texture, which is then drawn onto an object as the reflection (called reflection captures in UE4).

Lighting and shadows are still mostly done in shaders (non-RTX) because, believe me, they are realistic enough already.

On 11/22/2018 at 9:48 PM, Amazonsucks said:

You'd have massive latency between the normal GPU and the iGPU, which is why the RT cores and GPU cores are on the same die.

The GPU and the iGPU don't need to communicate with each other; the game needs to communicate with both of them, and because the iGPU usually sits on the same chip as the CPU cores, it has lower latency to the CPU than your normal GPU does.


You would still have to composite the ray traced lighting with the graphics that the GPU renders for every frame though, thus adding latency to the rendering pipeline, no?


On 11/22/2018 at 2:14 AM, VictorN2990WX said:

This time with RTX, it is a feature of DirectX, so AMD can theoretically have ray tracing functionality on their GPUs.

Not quite. RTX GPUs have special hardware to accelerate ray tracing. Ray tracing is not new; in fact, it is one of the oldest lighting and shadow rendering techniques. Real-time ray tracing is new, and it's possible because of that special hardware.

 

On 11/22/2018 at 2:14 AM, VictorN2990WX said:

So I wonder, would it be possible to have something like a Ryzen 5 2400G in your system, dedicated to RTX, and then your main GPU, let's say a Vega 64, dedicated to regular rasterisation

Yes, multi-GPU computing is possible, and it's something I would like to see more of. The problems are twofold: unless you are using CUDA-enabled Nvidia devices, it's very hard for a software developer to support heterogeneous architectures without knowing exactly which GPU models are going to be used (which is impossible for game studios); and it would rule out the use of current graphics libraries, requiring developers to create their own that can take advantage of multi-card GPGPU computing.

Those two problems are among the reasons I would like nVidia to make CUDA and NVLink open specifications. Doing that would allow developers of real-world applications to leverage the benefits of the CUDA platform, and would probably advance graphics libraries five to ten years, because it would allow other hardware vendors to include the technology in their GPUs.

ENCRYPTION IS NOT A CRIME


1 hour ago, timl132 said:

Reflections aren't hand-drawn; reflections in games are mostly just in-game cameras that render a texture, which is then drawn onto an object as the reflection (called reflection captures in UE4).

Lighting and shadows are still mostly done in shaders (non-RTX) because, believe me, they are realistic enough already.

 

Maybe the explanation I read was bad, but my impression is that cubemaps are functionally art assets that are only visible in reflections. Gonna go off and do some more reading, I guess.

