
Yep it’s true, Indiana Jones and the Great Circle is the first game that was developed from the ground up to use Ray Tracing… not just as an OPTION to make the lighting a little more realistic… but as a core - REQUIRED - component of the game. Does that mean every game’s FPS is gonna suffer? Or are we going to enter a golden age of Ray Traced Bliss where we can no longer tell the difference between reality and the metaverse? Do we all need NEW Graphics Cards?

 

 

Link to comment
https://linustechtips.com/topic/1596499-ray-tracing-is-mandatory-now/
Share on other sites


I am no computer graphics engineer, but I do believe that traditional rasterization has the potential to look as good as, if not better than, real-time ray tracing if enough effort is put in, which is obviously the main problem with game studios.

 

Just like with baked-in lighting, I think we can either pre-render the map under ray-traced lights (isn't that basically what computing baked-in lighting is?), hand-author the lighting, or use clever fakes.

 

Honestly, beyond the direct effect of ray tracing on screen-space reflections and shading and all, there are many places in modern games where there seems to be no effort put into traditional rasterization. For example, light coming through a door into a dimly lit room: I am pretty sure you can bake in the ray-traced lighting there, because everything is static.
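To make the "everything is static, so bake it" idea concrete, here's a toy sketch of the light-through-a-doorway case. None of this is engine code; the geometry and numbers are made up purely for illustration. At build time, one shadow ray per floor sample decides whether the doorway blocks the light, and the results are stored in a 1D "lightmap":

```python
# Toy bake of static direct lighting: a light shining through a doorway
# into a dark room. Illustrative only; not from any real engine.

def bake_floor_lightmap(light, wall_y, door_x0, door_x1, floor_xs):
    """For each floor sample (x, 0), cast one shadow ray toward the light.
    The ray is blocked by the wall at y=wall_y unless it passes through
    the door gap [door_x0, door_x1]. Returns 1.0 (lit) or 0.0 per sample."""
    lx, ly = light
    baked = []
    for px in floor_xs:
        t = wall_y / ly                    # where the ray crosses the wall plane
        x_at_wall = px + t * (lx - px)     # ray's x-coordinate at the wall
        lit = door_x0 <= x_at_wall <= door_x1
        baked.append(1.0 if lit else 0.0)
    return baked

# Bake once, offline: samples in line with the doorway come out lit,
# the rest stay dark.
lightmap = bake_floor_lightmap(light=(5.0, 5.0), wall_y=3.0,
                               door_x0=4.0, door_x1=6.0,
                               floor_xs=[float(x) for x in range(11)])
```

The expensive ray casting happens once, offline; at runtime the game just samples the array. That's exactly the trade baked lighting makes, and why it only works when the scene is static.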

 

And there are times when I would actually prefer the rasterized image. Ray tracing doesn't always mean better graphics, though mostly, yeah, it does.

 

Not to mention all the blurry mess we have to deal with, which is just not a problem with rasterization.

 

Proper shading and map design should be emphasized more in games, not ray tracing.

Microsoft owns my soul.

 

Also, Dell is evil, but HP kinda nice.


Good, opportunistic thumbnail, and somewhat informative of the video's topic. Refreshing in a sea of clickbait.

 

1 hour ago, Haswellx86 said:

I am no computer graphics engineer, but I do believe that traditional rasterization has the potential to look as good as, if not better than, real-time ray tracing if enough effort is put in, which is obviously the main problem with game studios.

Requiring effort is not a virtue, though.

 

1 hour ago, Haswellx86 said:

Proper shading and map designing should be more emphasized for games. Not Ray Tracing.

Gameplay. Gameplay should be emphasized. Graphics technology is all a means to an end; as far as I'm concerned, go with whatever gets the job done most easily, so more time and resources are available for what truly matters.


A big part that gets missed is all the wasted energy (i.e., carbon footprint) of ray tracing over and over again versus baking it once and using traditional methods. It's not just the direct cost of the devices needed to get that performance, but all the extra power the newer devices have to guzzle, repeatedly, to reach that reduced performance.

 

My less-important hot take: I'm also firmly in the "no fake frames, no DLSS/AI upscaling" camp. A well-made game doesn't need these things to run. I hope AAA games price themselves even more into irrelevancy.


12 hours ago, Haswellx86 said:

I am no computer graphics engineer, but I do believe that traditional rasterization has the potential to look as good as, if not better than, real-time ray tracing if enough effort is put in, which is obviously the main problem with game studios.

 

 

Here's the thing. We should have moved to ray tracing and caustics back in 2005, but GPUs kept having leapfrog increases in performance, so we never found a graphics-fidelity plateau with which to justify it. Everyone wanted higher resolutions or frame rates, not better lighting, because 4K HDR monitors didn't come out until relatively recently. So there is now a reason to do ray tracing by default.

 

Right now, I'd say "ray tracing optional" will probably remain the status quo until 4K HDR monitors are standard, if they ever become standard (we saw what happened with 3D televisions). Ray tracing doesn't require an HDR monitor, but half the reason for having it is to get realistic lighting through things like glass, reflections, and shadows.

 

Honestly, in terms of the evolution of 3D tech, probably the only thing that ever blew me away was seeing caustics, because it made water and glass look amazing. Very few traditional rasterized games have good-looking glass or water, and usually you can't see through stained glass with traditional rasterization. Water is mostly miss, sometimes hit; only ArcheAge and No Man's Sky ever really looked like they had realistic water. In other games that use UE5 (Satisfactory, for example), the water only looks good until you dive into it, and then you suddenly realize it's flat.

 

This is UE4 in 2020 https://developer.nvidia.com/blog/generating-ray-traced-caustic-effects-in-unreal-engine-4-part-1/ :


 

This is POVRay, 2011 https://www.cs.umd.edu/~mount/Indep/Alisa_Chen/caustics.html :


 

Ray-traced caustics have been in POV-Ray since probably 2001 (I can't remember exactly, but I do remember converting an OBJ model animation sequence into a series of POV-Ray scripts, so I know it had to be around then, back when I was still using a Pentium III).
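For anyone wondering why caustics make glass and water look so striking: each ray is bent by Snell's law as it enters the material, and a curved surface bends neighboring rays toward a common focus, concentrating light into those bright patterns. Here is a minimal sketch of just the refraction-angle step; this is illustrative only, not how POV-Ray actually implements it:

```python
import math

def snell(theta_i, n1, n2):
    """Refraction angle from Snell's law: n1*sin(theta_i) = n2*sin(theta_t).
    Angles in radians; n1/n2 are refractive indices (air ~1.0, glass ~1.5).
    Returns None when total internal reflection occurs."""
    s = n1 * math.sin(theta_i) / n2
    if abs(s) > 1.0:
        return None  # total internal reflection: the ray can't exit
    return math.asin(s)
```

A ray entering glass at 30 degrees bends to roughly 19.5 degrees; a steep ray trying to leave glass into air gets `None` back, which is the total internal reflection a renderer has to handle.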

 

The general idea is that there is a point where gamers stop caring about visual quality. (Just look at people saying not to play on Ultra settings.) Some games want to be "theatrical" grade, and that is only accomplished by putting ray tracing before resolution increases. Unfortunately, that results in "smeary" visuals when the screen resolution doesn't match the render output. Something I can't stand is when a game renders at 720p or 1080p and the 4K monitor turns everything soft and fuzzy. I have to go out of my way to disable non-integer scaling.
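The integer-scaling complaint comes down to simple arithmetic: a render resolution only maps sharply onto a display when each output pixel covers a whole number of input pixels. A quick sketch (the function names are mine, just for illustration):

```python
# When does a render resolution scale to a display by a whole-number
# factor (sharp nearest-neighbor) vs a fractional one (forced
# interpolation, the soft-fuzzy look)?

def scale_factor(render, display):
    """Uniform scale factor from render to display, or None if the
    two resolutions have different aspect ratios."""
    rw, rh = render
    dw, dh = display
    fx, fy = dw / rw, dh / rh
    return fx if fx == fy else None

def is_integer_scale(render, display):
    f = scale_factor(render, display)
    return f is not None and f == int(f)
```

1080p and 720p divide evenly into 2160p (2x and 3x respectively), so each source pixel can become a crisp block of screen pixels; 1440p to 2160p is 1.5x, so every pixel has to be interpolated, which is exactly the softness described above.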

 

If we're going to have mandatory ray tracing, I sure hope it comes with the caveat "best experienced at 4K 60 Hz HDR" and not simply "4K HDR required," because I have yet to see ray tracing in a game that actually impressed me. That may simply be because I haven't played any game with RT on, on the HDR monitor. I just don't see the justification when the average thing RT did in games was make them darker.


At this point, it's even rumored that the Switch 2 will have ray-tracing cores; phones are starting to get them, and even laptop integrated graphics are getting them. The 2000-series RTX cards came out years ago now; there was bound to be a day when developers actually wanted to build games around this technology from the ground up. All the information about light mapping and similar techniques eating a lot of development time basically cinches the deal on ray tracing being used heavily going forward.

 

It does suck for people on previous-generation cards, but honestly unless you have a 1080 Ti, most of those pre-raytracing cards are getting to be very limited on current releases outside of raytracing support.

AMD Ryzen 5900X

T-Force Vulcan Z 3200mhz 2x32GB

EVGA RTX 3060 Ti XC

MSI B450 Gaming Plus

Cooler Master Hyper 212 Evo

Samsung 970 EVO Plus 2TB

WD 5400RPM 2TB

EVGA G3 750W

Corsair Carbide 300R

Arctic Fans 140mm x4 120mm x 1

 


I don't have a problem with games requiring hardware features that came out 6.5 years ago to run. That's normal and expected; it's just that adoption of performant, RT-capable cards has been hindered by all the price and availability issues of the last few years, and the resulting giant slowdown in price-to-performance gains.

 

Like, if we had seen improvements in price-to-performance at the same level over the last 3 generations like we saw during the mid-2010s, the 4060 would be giving 3080-level performance and I think a lot fewer people would be complaining. As a point of comparison, the GTX 1060 is over 50% faster than the GTX 770 and nearly matches the GTX 980, and of course the 1060 had three times as much VRAM as the 770. But the RTX 4060 is only about 20% faster than the RTX 2070 (and obviously it's an even smaller gap from the 2070 Super) while the 3080 is far out ahead of the 4060, and the 4060 still has the same amount of VRAM as the 2070. That's how far the price-to-performance improvements have fallen. In a world where the 4060 was as good as the 1060 in terms of performance and VRAM gains vs. older cards, the 4060 would be easily running The Great Circle at 1440p and high settings without needing upscaling at all.

 

I don't think the game devs should be blamed for this. For example, MachineGames has probably been working on The Great Circle since at least 2019 (that's just speculation, but based on their release history I think it's a well-founded assumption). It would have been completely reasonable for them to assume that in 2024, during their likely release window, a mid-range card would handily beat even the top-end Turing cards that were out at the time, because that's how GPU performance had been trending for years. So I can't blame them if they built the game around that assumption.

 

ALL THAT SAID, I wanted to reply to a couple comments from earlier:

 

 

7 hours ago, Kisai said:

The general idea is that there is a point where gamers stop caring about visual quality. (Just look at people saying not to play on Ultra settings.) Some games want to be "theatrical" grade, and that is only accomplished by putting ray tracing before resolution increases. Unfortunately, that results in "smeary" visuals when the screen resolution doesn't match the render output. Something I can't stand is when a game renders at 720p or 1080p and the 4K monitor turns everything soft and fuzzy. I have to go out of my way to disable non-integer scaling.

 

18 hours ago, SpaceGhostC2C said:

Gameplay. Gameplay should be emphasized. Graphics technology is all a means to an end; as far as I'm concerned, go with whatever gets the job done most easily, so more time and resources are available for what truly matters.

I think these comments really get at the main problem that a lot of gamers have with ray tracing and modern graphics (myself included). We understand why ray tracing is being used in new games - it does make games look more realistic and it does improve a lot of dev workflows.

 

But what it doesn't improve, generally, is how fun the game is!

 

I want new technologies to enhance how much I enjoy games by creating new modes of gameplay and allowing me to have types of experiences that I couldn't have before.

 

It used to be that this was always the case whenever new technologies came out. For example, think of the leap from the SNES to the N64: the rise of full 3D graphics led to entirely new modes of gameplay and entirely new genres of games becoming possible. Or, for example, consider the rise of heavily physics-based shooters like Half-Life 2 and Crysis, which allowed for modes of interaction with the game world that weren't possible with older technology.

 

Over the last couple of console generations, we're seeing less and less in terms of actual new, exciting modes of gameplay. The games look better, but for the most part they don't play any differently. With most 9th-gen games, I look at them and think, "Well, if you downgraded the graphics and added a few loading screens, you could probably just run this game on the Xbox 360 and preserve all of the core gameplay."
 

Until new technologies like RT actually allow for new types of gameplay that make people excited, I think gamers are going to be more frustrated than enthused by them, especially when they're driving up hardware prices.

 

Now obviously there are tons of new, creative games out there, but they're almost never utilizing new technology as a core part of their creativity. When I think about the most creative and novel games that I've played over the last 5-10 years or so, they're almost all indie games with modest graphics, or they're made by Nintendo.

Gaming PC: Ryzen 5 5600 :: Gigabyte RTX 2070 Super Gaming OC :: MSI B550-VC :: WD SN750 :: NH-D15 :: 32GB DDR4-3200 :: Phanteks Enthoo Pro M TG :: Windows 10

 

Laptop: Latitude E5440 (i5-4200U, 8GB DDR3-1600, 500GB Sandisk SSD) :: Linux Mint XFCE

 

Office PC: Optiplex 5090 (i7-10700, 16GB DDR4-2933, Quadro P400, 500GB SSD) :: Windows 10

 

File and Media Server: Precision 3620 (i5-7500, 16GB DDR4-2133, a bunch of old recert HDDs) :: TrueNas Scale

 

Web Server: Raspberry Pi 4 Model B (2GB RAM, 64GB storage) :: Raspberry Pi OS


1 hour ago, Ha-Satan said:

 

Until new technologies like RT actually allow for new types of gameplay that make people excited, I think gamers are going to be more frustrated than enthused by them, especially when they're driving up hardware prices.

 

Now obviously there are tons of new, creative games out there, but they're almost never utilizing new technology as a core part of their creativity. When I think about the most creative and novel games that I've played over the last 5-10 years or so, they're almost all indie games with modest graphics, or they're made by Nintendo.

 

Raytracing doesn't (or maybe "can't") add anything interesting to the game mechanics because you'd pretty much be asking for "Portal 3", except now you have a lot of light puzzles, and the player can't simply turn the brightness and contrast up on their monitor.

 

See episode 169 https://en.wikipedia.org/wiki/List_of_MythBusters_episodes "Let there be light"

 

You'd have to create a game entirely based on rotating mirrors, with opaque and translucent surfaces, and water. That might be fun in a Portal-like sense, but most of Portal's selling point was its writing.
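For what it's worth, the core math such a mirror puzzle would need is tiny; the hard part is making it fun. A 2D sketch of bouncing a light ray off a rotatable mirror (illustrative only):

```python
import math

def reflect(d, n):
    """Reflect direction d about unit mirror normal n: r = d - 2*(d.n)*n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

# A ray travelling right hits a mirror tilted 45 degrees (normal pointing
# up-left); it bounces straight up.
n = (-math.sqrt(0.5), math.sqrt(0.5))
r = reflect((1.0, 0.0), n)
```

Rotating the mirror just means recomputing `n`, so the puzzle logic is one reflection per bounce; everything beyond that is level design.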

 

Like, don't get me wrong, there is nothing wrong with increasing the visual quality, but you can't be 100% eye candy, 0% game mechanics, because then you're just making a film or television show. In which case... why isn't it just a television show on Netflix?

 

 


Yeah, the day you will absolutely need an RT card just to be able to game at all is still far, far away.

 

And no one was forced to pay for it and suffer through the infancy phase; what an odd thing to say, tbh. Well, maybe techtubers, but hey, that's their job.

At least, I can't remember anyone forcing me to buy an expensive Nvidia GPU.

 

Also, 4K gaming is still for a very small minority; only very few care about 4K ray-tracing performance on ultra settings. WQHD isn't even the majority when it comes to resolution. It's still FHD.


9 minutes ago, Kisai said:

 

Raytracing doesn't (or maybe "can't") add anything interesting to the game mechanics because you'd pretty much be asking for "Portal 3", except now you have a lot of light puzzles, and the player can't simply turn the brightness and contrast up on their monitor.

 

See episode 169 https://en.wikipedia.org/wiki/List_of_MythBusters_episodes "Let there be light"

 

You'd have to create a game entirely based on rotating mirrors, with opaque and translucent surfaces, and water. That might be fun in a Portal-like sense, but most of Portal's selling point was its writing.

 

Like, don't get me wrong, there is nothing wrong with increasing the visual quality, but you can't be 100% eye candy, 0% game mechanics, because then you're just making a film or television show. In which case... why isn't it just a television show on Netflix?

 

 

Just better flashlights are a solid way to add interesting game mechanics. SH2 uses RT for its flashlights and it does enhance the experience. It may not be a mechanic in the strictest sense but it very much builds on how you interact with the game. 

 

32 minutes ago, pApA^LeGBa said:

Also, 4K gaming is still for a very small minority; only very few care about 4K ray-tracing performance on ultra settings. WQHD isn't even the majority when it comes to resolution. It's still FHD.

ehhhhhh
How many people have not gotten a 4K TV yet?
Also, using the Steam survey to claim QHD isn't a plurality for PCs is misleading. The Steam survey samples everyone's laptops too.

 

9 hours ago, Kisai said:

The general idea is that there is a point where gamers don't care about the visual quality. (Just look at people saying to not play on Ultra settings.) 

People say don't play on Ultra because Ultra settings are settings for cards that do not exist yet, not because gamers don't care about visual quality. Ultra is so that when you play the game in 5 years on hardware three generations newer, the game does not look as aged. If Ultra settings are playable on hardware that exists when the game comes out, the devs are using Ultra wrong. Which means devs should not be spending any time optimizing the render pipeline for those last cool visual touches that only exist on Ultra; they should be spending their time optimizing the code that runs the low, medium, and high feature sets.


26 minutes ago, starsmine said:

How many people have not gotten a 4k TV yet? 

I'm still using a 1080p TV for all my gaming. I would honestly buy another 1080p TV in the future just to avoid having to worry about upscaling from low res to 4K, but nobody makes 1080p TVs in the size I want with the features I want, like proper full-range VRR.



48 minutes ago, starsmine said:

Just better flashlights are a solid way to add interesting game mechanics. SH2 uses RT for its flashlights and it does enhance the experience. It may not be a mechanic in the strictest sense but it very much builds on how you interact with the game. 

 

ehhhhhh
How many people have not gotten a 4K TV yet?
Also, using the Steam survey to claim QHD isn't a plurality for PCs is misleading. The Steam survey samples everyone's laptops too.

 

People say don't play on Ultra because Ultra settings are settings for cards that do not exist yet, not because gamers don't care about visual quality. Ultra is so that when you play the game in 5 years on hardware three generations newer, the game does not look as aged. If Ultra settings are playable on hardware that exists when the game comes out, the devs are using Ultra wrong. Which means devs should not be spending any time optimizing the render pipeline for those last cool visual touches that only exist on Ultra; they should be spending their time optimizing the code that runs the low, medium, and high feature sets.

 

I know a lot of people who play on PC, and none of them plays on a TV. Most of my social circle doesn't have a 4K TV, or any TV at all for that matter. And seriously, how many people hook their gaming PC up to the TV? That's awful. Yes, I know new TVs don't have input lag, but again, the majority won't buy those right when they come out. And even if they do, many can't just use the TV as a gaming device; I know it's a cliché, but not every gamer is single, and most people don't have room for two big 4K TVs. So yes, I think PC gaming on a 4K TV is a niche. As if the cost of a 4K-capable card weren't enough to tell you that it is far from being used by the majority... Some people are really far removed from reality.

 

In PC gaming, FHD is still the majority. Most gamers aren't tech enthusiasts, and most of them will never discuss anything on tech forums, or on Steam for that matter. The real world is a bit different from the tech-YouTube bubble.


I wonder where the Intel B580 falls in performance. If people are looking for a budget upgrade to get into ray tracing, that card appears to meet the requirements. Everything seems to be focused on Nvidia, but $250 on Intel could offer a feasible upgrade path for those who don't have $500 to drop on a 40-series card.


  • 2 weeks later...
On 1/15/2025 at 11:21 PM, starsmine said:

Just better flashlights are a solid way to add interesting game mechanics. SH2 uses RT for its flashlights and it does enhance the experience. It may not be a mechanic in the strictest sense but it very much builds on how you interact with the game. 

 

ehhhhhh
How many people have not gotten a 4K TV yet?
Also, using the Steam survey to claim QHD isn't a plurality for PCs is misleading. The Steam survey samples everyone's laptops too.

 

People say don't play on Ultra because Ultra settings are settings for cards that do not exist yet, not because gamers don't care about visual quality. Ultra is so that when you play the game in 5 years on hardware three generations newer, the game does not look as aged. If Ultra settings are playable on hardware that exists when the game comes out, the devs are using Ultra wrong. Which means devs should not be spending any time optimizing the render pipeline for those last cool visual touches that only exist on Ultra; they should be spending their time optimizing the code that runs the low, medium, and high feature sets.

Pretty much no one, lol. The cost is just ridiculous, the power consumption is ridiculous, and you have to upgrade your entire setup: your processor, your RAM, your graphics card, your storage (which will also mean a brand-new motherboard), plus the increased electricity consumption from the mind-melting amount of processing power it takes to store, move, and process 4K. 4K is very, very costly, and you'll also have to pay more for every single subscription service you have, forever. The cost just keeps adding up, and at some point it makes you feel like a complete fuckin idiot, and you just start putting things back within their limits. People complain about the cost of living, and yet they pay these ridiculous prices for 4K? And even the ones that do (like me) just don't play on it anymore.

4K TV gaming is almost entirely powered by pure virgin energy, and for everyone who isn't single, gaming on a TV just creates way more problems than it solves. First-hand experience. After a while, you just give up and go back to your old dusty monitor. And buying two ridiculously overpriced 4K TVs that are going to cost twice as much to maintain, and dedicating one to gaming beside your regular monitor: now it just gets petty, even for someone obsessed with gaming. That money is way better spent somewhere else. Things need to have limits. Now, if gaming were a shared hobby, that's another story. But those kinds of people prefer consoles, not PCs.

