
What is fake rt?

Fat Cat11997
Solved by Mark Kaine

Simple answer: you don't need "RT cores" for ray tracing; you never did and never will. The technique dates back to the 1970s.

 

Edit: if anything, "RT cores" are the "fake" part, not the other way round. Sure, they help, but you don't *need* them at all.
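To illustrate the point that ray tracing is just math and needs no special hardware, here is a minimal CPU-only sketch: one ray, one sphere, an analytic intersection test. The scene values and function names are made up for illustration, not taken from any particular renderer.

```python
# Minimal sketch of software ray tracing: pure CPU math, no GPU or RT cores.
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic in t).
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# Camera at the origin, shooting a ray straight down -z at a unit sphere.
hit = intersect_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # prints 4.0: distance to the sphere surface
```

A full renderer just repeats this test for every pixel, every bounce, and every object, which is where the compute cost comes from.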

I've been trying out some Minecraft Java shaders that look like ray tracing but don't actually use RT cores, since Java Edition doesn't support RT. I've been curious: what are they actually doing to achieve this "fake RT"?





Stop thinking about it as "fake vs. real" ray tracing; think of it as hardware-accelerated vs. software ray tracing.

 

https://blogs.nvidia.com/blog/whats-the-difference-between-hardware-and-software-accelerated-ray-tracing/

 

Take the RT cores Nvidia has been packing into its cards as an example:

 

Quote

Real-time ray tracing, however, is possible without dedicated hardware. That’s because — while ray tracing has been around since the 1970s — the real trend is much newer: GPU-accelerated ray tracing with dedicated cores.

 


As the others have said, the concept of ray tracing has been around for a long time. Movies have used it for special effects, and sometimes for entire films, for decades. However, ray tracing is very compute-intensive, which is why it can't really be done on a CPU if you want playable frame rates. For example, for Toy Story 3 it took an average of 7 hours to render a single frame. Just as graphics cards have accelerated raster graphics for ages, they are now doing the same for ray tracing.
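A quick back-of-the-envelope calculation shows why real time is so much harder than offline rendering. All the numbers below (resolution, rays per pixel, frame rate) are illustrative assumptions, not measurements from any game:

```python
# Rough sketch of the ray budget a real-time ray tracer has to meet.
width, height = 1920, 1080   # Full HD output (assumed)
rays_per_pixel = 4           # primary ray plus a few shadow/bounce rays (assumed)
target_fps = 60

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * target_fps

print(f"{rays_per_frame:,} rays per frame")          # 8,294,400 rays per frame
print(f"{rays_per_second / 1e9:.2f} billion rays/s")  # ~0.50 billion rays/s
```

An offline film renderer can spend hours on one frame; a game has roughly 16 ms, which is why every shortcut helps.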

 

Games like Descent (1994) and Quake (1996) already used 3D raster graphics. However, you could only play them at low resolutions, e.g. 320 x 200, because CPUs simply weren't fast enough to run them at a sufficient frame rate (~30 fps) otherwise. That's where GPUs came in: they provided hardware acceleration for things like texturing, allowing much higher resolutions and frame rates. You could suddenly play these games at 640 x 480 @ 60 fps, while also using much better-looking textures.

 

These "fake RT" shaders do one of two things: they either use raster techniques that approximate ray tracing as closely as possible, or they do real ray tracing at much lower resolution and/or precision. Just as games in the 1990s had to compromise on a lot of things to reach 30 fps, these shaders compromise to reach good-enough performance. RT cores simply let you do the same work much faster, at higher output resolutions and with more precise algorithms.
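One common family of such approximations is screen-space ray marching: instead of testing rays against real geometry, the shader steps a ray through a depth/height buffer it already has. Here is a toy 1D sketch of that idea; the heightfield, step count, and step size are made-up illustration values, not from any actual shader pack:

```python
# Toy sketch of screen-space-style ray marching against a height buffer.
heightfield = [0.0, 0.1, 0.3, 0.8, 1.5, 2.0]  # stand-in for a depth buffer

def march_ray(x0, y0, dx, dy, steps=32, step_size=0.25):
    """Step along a 2D ray and return the first cell where it dips below
    the heightfield (an approximate 'hit'), or None if it never does."""
    x, y = x0, y0
    for _ in range(steps):
        x += dx * step_size
        y += dy * step_size
        ix = int(x)
        if not (0 <= ix < len(heightfield)):
            return None  # ray left the buffer: no information, so no hit
        if y <= heightfield[ix]:
            return ix    # approximate hit cell, limited by buffer resolution
    return None

print(march_ray(0.0, 1.0, 1.0, 0.0))  # flat ray at height 1.0 hits where the terrain rises
```

The fixed step size and the finite buffer are exactly where the "fake" artifacts come from: thin features can be stepped over, and anything off-screen simply cannot be hit.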


