
Unreal Engine 5.1 - video game graphics have arrived at their destination

Delicieuxz
13 hours ago, Mihle said:

I disagree, developers should not hold back graphics in games just so that X hardware can run at ultra. They should of course be playable at 60 fps on the most common older hardware (so a 1060), but no more than that.

Again, I'm not saying "that's the only setting." If you can't hit 1080p60, fine, play 720p60. But don't advertise a "minimum requirement" that doesn't even hit parity with the game console at the same resolution. You don't need to list 720p60 requirements any more than you need to list 1440p60 or 8Kp60 requirements. It's assumed that all hardware out there can do 720p60 Ultra, and if for some reason you're trying to run a game on a 10-year-old iGPU, that's on you.

 

To restate: 4Kp60 on "Low" is unacceptable in the same way that 720p on "Low" is. If you're making a compromise, you should change the resolution first, not the tunables. Going from 1080p to 720p cuts the pixel count by more than half. Going from 1080p to 4K quadruples it.

 

8K = 33,177,600 pixels per frame

4K = 8,294,400

1080p = 2,073,600

720p = 921,600

 

That is 36 times the pixels going from 720p to 8K, and that's before we consider frame rate or HDR. The requirements only need to reflect 60 fps. Going from 1080p60 to 4Kp60 carries the same resource increase as going from 1080p60 to 1080p240.
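To put the same arithmetic in one place (a naive sketch that assumes GPU cost scales with pixels rendered per second; real scaling isn't perfectly linear, but it's close enough for this argument):

```python
# Pixels per frame for common 16:9 resolutions (same figures as above).
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def pixels(name: str) -> int:
    w, h = RESOLUTIONS[name]
    return w * h

def relative_load(res_a: str, fps_a: int, res_b: str, fps_b: int) -> float:
    # Naive cost model: work is proportional to pixels rendered per second.
    return (pixels(res_b) * fps_b) / (pixels(res_a) * fps_a)

print(pixels("8K") / pixels("720p"))             # 36.0 -> 720p to 8K is 36x the pixels
print(relative_load("1080p", 60, "4K", 60))      # 4.0
print(relative_load("1080p", 60, "1080p", 240))  # 4.0 -> same jump as 1080p60 -> 4Kp60
```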

 

When we talk about tunables, no single tunable other than antialiasing or DLSS will net you a 75% reduction in GPU requirements. Turning on DLSS? Maybe that dials the requirements of a 4K game back to those of a 1080p one, but it certainly still looks like a 1080p render, only now with some blur.
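To put rough numbers on that: the sketch below assumes a 4K output and uses the commonly cited approximate per-axis scale factors for the DLSS quality modes (treat them as assumptions, not exact figures).

```python
# Sketch of what DLSS-style upscaling changes: the GPU shades a lower internal
# resolution and the upscaler reconstructs the output frame.
OUTPUT = (3840, 2160)  # 4K target
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}  # approx. per-axis scale

for mode, scale in MODES.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    saved = 1 - (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:>11}: renders ~{w}x{h}, ~{saved:.0%} fewer shaded pixels")

# Performance mode shades a 1920x1080 internal frame, which is why a 4K DLSS
# frame can cost roughly what a native 1080p frame does (plus the upscale pass).
```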

 

13 hours ago, Mihle said:

Minimum specs should be whatever can run the game at minimum settings at a stable 1080p 60 fps.

Recommended specs should be 1440p, high settings, 60 fps minimum.

That's not forward-looking. You don't see or buy "1440p" televisions or television broadcasts, so 1440p shouldn't be considered at all. More to the point, the problem with the 1440p screen resolution is that many games don't understand 1440p at all, and when you "full screen" a game that is running at 1080p, you get the crappy screen/monitor upscale, and the GPU can't do an integer upscale. 1440p is the exception, just like the 1200p 16:10 monitors are. Good gawd, the 1200p monitors were such a pain in the behind to get games to use the extra 120 lines that I just played everything letterboxed, or ran it in a window, to avoid the problem entirely.

 

To this very day, I always run games windowed, because I can't trust games to actually run in HiDPI mode, and I know they aren't when my second monitor suddenly changes resolution as well, despite the game not running on it.

 


16 hours ago, Kisai said:

1080p60 Ultra should be what's reflected in the minimum, and 4Kp60 Ultra should be what's reflected in the recommended. E.g., this is the hardware we developed the game on.

No, no it should not.

That's unhinged.

Minimum gets you running at the minimum playable state.
Which would be the LOW bucket, and a steady 40-50 fps depending on genre; less than that could be considered. Stop treating low like it's some kind of curse that should not exist. Ultra should never once be considered. Your obsession with Ultra is just weird. Ultra is literally "the tunables that change visuals to be 1% better than high, but cost 40% more cycles".
Stop thinking about ultra; it's there for two reasons: benchmarking, and playing the game 4 years into the future with a new GPU. Ultra is not there to play the game today, at launch. You are not entitled to run a game at 4K 60 fps ultra with a 3060 Ti, and any game that launched in the last 2 years that can do that fucked up by not sending it harder on the graphical front.

Quote

If the minimum requirements simply reflected "what it will run on", it would literally "run" on anything that supports OpenGL 4.5, like AMD HD 5000, Nvidia 400 series, and 4th-gen Intel iGPUs. But you'll note these minimum requirements reflect two generations later for both GPU vendors. And these minimum requirements don't even hit 30 fps.

That's not how that works.
RE8, for example, does not boot on Nvidia Maxwell; there is no driver support. It's not just hardware features, software features in the stack also need to exist, so DX11 games do not run on all DX11-capable cards.
Intel iGPUs would not even be able to run the menu screen at over 30 fps, if they boot at all, so that's a bad thing to call minimum.

 

17 minutes ago, Kisai said:

Again, I'm not saying "that's the only setting." If you can't hit 1080p60, fine, play 720p60. But don't advertise a "minimum requirement" that doesn't even hit parity with the game console at the same resolution. You don't need to list 720p60 requirements any more than you need to list 1440p60 or 8Kp60 requirements. It's assumed that all hardware out there can do 720p60 Ultra, and if for some reason you're trying to run a game on a 10-year-old iGPU, that's on you.

That isn't true in the slightest. Even new AMD APUs cannot run the newest, hottest games at 720p60, and the new ones have a lot of GPU power, nothing to sneeze at or condescend to.

17 minutes ago, Kisai said:

To restate: 4Kp60 on "Low" is unacceptable in the same way that 720p on "Low" is. If you're making a compromise, you should change the resolution first, not the tunables. Going from 1080p to 720p cuts the pixel count by more than half. Going from 1080p to 4K quadruples it.

 

Stop treating low like it's some kind of leper. No, you should not change resolution first. Resolution should default to native screen resolution; that's why the tunables exist.

17 minutes ago, Kisai said:

 

That is 36 times the pixels going from 720p to 8K, and that's before we consider frame rate or HDR. The requirements only need to reflect 60 fps. Going from 1080p60 to 4Kp60 carries the same resource increase as going from 1080p60 to 1080p240.

 

When we talk about tunables, no single tunable other than antialiasing or DLSS will net you a 75% reduction in GPU requirements. Turning on DLSS? Maybe that dials the requirements of a 4K game back to those of a 1080p one, but it certainly still looks like a 1080p render, only now with some blur.

Shadows... literally shadows. The depth maps needed for shadows that aren't baked into the textures practically force the GPU to draw the scene twice.
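A toy cost model of that point, under the simplifying assumption that every shadow-casting light re-rasterizes the whole scene into its shadow map (real engines cull and cache aggressively, so take the numbers as illustrative):

```python
# Why dynamic shadow maps roughly double geometry work: the scene's triangles are
# rasterized once per shadow-casting light (depth only, from the light's point of
# view) and then once more from the camera, which samples those depth maps.
def frame_geometry_cost(scene_triangles: int, shadow_casting_lights: int) -> int:
    depth_passes = shadow_casting_lights * scene_triangles  # light-view depth renders
    camera_pass = scene_triangles                           # main color render
    return depth_passes + camera_pass

print(frame_geometry_cost(2_000_000, shadow_casting_lights=0))  # 2000000 -> shadows baked or off
print(frame_geometry_cost(2_000_000, shadow_casting_lights=1))  # 4000000 -> scene geometry drawn twice
```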

17 minutes ago, Kisai said:

That's not forward-looking. You don't see or buy "1440p" televisions or television broadcasts, so 1440p shouldn't be considered at all. More to the point, the problem with the 1440p screen resolution is that many games don't understand 1440p at all, and when you "full screen" a game that is running at 1080p, you get the crappy screen/monitor upscale, and the GPU can't do an integer upscale. 1440p is the exception, just like the 1200p 16:10 monitors are. Good gawd, the 1200p monitors were such a pain in the behind to get games to use the extra 120 lines that I just played everything letterboxed, or ran it in a window, to avoid the problem entirely.

1440p on consoles is used as an internal render resolution. Just like how the PS3 and Xbox 360 often rendered games at something like 600p: no TV existed at that resolution, it was a RENDER resolution, and the console then scaled the game to the TV, because TV scalers are shit, always have been shit, and always will be shit.
You can force a PC to do this as well. Many have; that's how supersampling works.
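A minimal sketch of the downscale half of supersampling, assuming NumPy and a plain box filter (real scalers typically use better filters, so this is illustrative only):

```python
import numpy as np

def box_downscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels. `frame` is (height, width,
    channels), rendered at `factor` times the output resolution on each axis."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Render internally at 4K, present on a 1080p display (2x supersampling per axis):
internal_frame = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4K frame
output_frame = box_downscale(internal_frame, 2)
print(output_frame.shape)                        # (1080, 1920, 3)
```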

17 minutes ago, Kisai said:

To this very day, I always run games windowed, because I can't trust games to actually run in HiDPI mode, and I know they aren't when my second monitor suddenly changes resolution as well, despite the game not running on it.

 

That sounds like a you problem. I play games of all ages and all genres, and that is such a rare, uncommon issue that I don't know what to even say.


6 minutes ago, starsmine said:

No, no it should not.

That's unhinged.

Minimum gets you running at the minimum playable state.

Nobody agrees on what "playable" is; CP2077's "minimum" was a joke.

 

To me "minimum" is what is parity with the console. Why would you pay money for the PC version for an inferior experience to the game console? You wouldn't. Recommended should be the development platform, presumably 4Kp60, otherwise why would I trust the requirements to work at 4K. Comprende?

 

If you are going to pay a small fortune for a PC game, yet run it on a PC that is not even at parity with the console, then you are going to make that compromise to run it at 720p, just like you would run a current-generation console at 720p if you wanted to use it with a 10-year-old TV.

 

6 minutes ago, starsmine said:


Which would be the LOW bucket, and a steady 40-50 fps depending on genre; less than that could be considered. Stop treating low like it's some kind of curse that should not exist. Ultra should never once be considered. Your obsession with Ultra is just weird. Ultra is literally "the tunables that change visuals to be 1% better than high, but cost 40% more cycles".

You seem not to understand the issue. "Low" is not there to let you run 4Kp60 instead of 1080p60 on High. "Low" is there because driver bugs exist, which, if you missed the first couple of messages in the thread, is why I said to "profile the damn computer", so you wouldn't have to guess.

 

It has always been "reduce resolution", "turn off antialiasing", "turn off blurs", "turn off shadows", in that order. Once you have to turn off shadows, your game runs worse and looks uglier than the console version.

 

That is why "4kP60" on low is not a thing. Play on Ultra 1080p if you can't do integer scale. But don't tell me you bought a RTX4090 and play games at 1440p120 low because for some reason you prefer a nonsense framerate over visual fidelity. 

 

6 minutes ago, starsmine said:

 

RE8, for example, does not boot on Nvidia Maxwell; there is no driver support. It's not just hardware features, software features in the stack also need to exist, so DX11 games do not run on all DX11-capable cards.
Intel iGPUs would not even be able to run the menu screen at over 30 fps, if they boot at all, so that's a bad thing to call minimum.

What's your point? That some developers make no effort on the PC port? Like Square Enix, which seems to release PC ports of flagship titles with nonsensical requirements that don't align with anything on the console versions. NieR: Automata's PC port, FF14's console ports, various "mobile" ports: it just seems like SE doesn't care whether a game works on anything but the highest-end device/GPU. Hell, the entire reason I upgraded my GPU the previous time (to the 1080) was that the GTX 760 was not going to cut it, at any resolution.

 

My point is that the "minimum requirements" on ALL games reflect no real-world, logical requirements. They range from "this is the minimum it will boot on" for Japanese visual novels to "this is whatever was top of the line last year" for first-person shooters. Again, that can be a performance gap of over 40 times; now compare the pixel count between 720p and 8K, which is 36x. So something that theoretically runs at 720p60 on crappy Nvidia 400 series hardware should also be able to run at 8Kp60 on an RTX 4090. So where do you put the line, huh? Do you say "the minimum is a GT 440, because that's what it will boot on" and then also put "Recommended: RTX 4090" because that's whatever the current hardware is? Because that is no more honest than the existing 'minimum' and 'recommended' specs on Steam pages.

 

The "recommended" spec should really be "the point of no additional returns at 4K Ultra", eg, owning better hardware than this will not gain you anything officially. Unofficially, maybe you can turn RT on, but the game was not actually developed with RT from the beginning.

 

At least CP2077 tried to put meaning behind those requirements; few games even try to do that.

 

 

6 minutes ago, starsmine said:

That isn't true in the slightest. Even new AMD APUs cannot run the newest, hottest games at 720p60, and the new ones have a lot of GPU power, nothing to sneeze at or condescend to.

Stop treating low like it's some kind of leper. No, you should not change resolution first. Resolution should default to native screen resolution; that's why the tunables exist.

Stop. No. Those tunables do not get you a 4x increase in performance. That's why you always go down a resolution until you hit something close to 60 fps before screwing around with the tunables; none of those tunables other than antialiasing, RT or DLSS has any real impact on performance. They may affect bugs in the game or the driver (like how fog or lighting/shadows behave differently on Nvidia and AMD hardware).

 

That's why I said to hide those tunables and quit presenting them as some kind of idiot button. There is no way you're going to get 4Kp60 "Low" to work on the most common GPU, which is still the GTX 1060. That GPU is squarely a 1080p60 GPU, nothing higher. No tunable will get 1440p performance out of it without crippling the game to below the console port.
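A back-of-the-envelope version of that claim, using the same naive pixels-per-second model as earlier in the thread (the exact numbers are illustrative, not benchmarks):

```python
# If a GPU holds 60 fps at 1080p and frame cost scaled purely with pixel count,
# this is roughly where it lands at higher resolutions with identical settings.
PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
fps_at_1080p = 60.0

for res in ("1440p", "4K"):
    est = fps_at_1080p * PIXELS["1080p"] / PIXELS[res]
    print(f"{res}: ~{est:.0f} fps")   # 1440p: ~34 fps, 4K: ~15 fps

# No settings tunable closes a gap like that; only rendering fewer pixels does
# (lower resolution, render scale, or DLSS-style upscaling).
```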


2 hours ago, Kisai said:

Again, I'm not saying "that's the only setting." If you can't hit 1080p60, fine, play 720p60. But don't advertise a "minimum requirement" that doesn't even hit parity with the game console at the same resolution. You don't need to list 720p60 requirements any more than you need to list 1440p60 or 8Kp60 requirements. It's assumed that all hardware out there can do 720p60 Ultra, and if for some reason you're trying to run a game on a 10-year-old iGPU, that's on you.

 

To restate: 4Kp60 on "Low" is unacceptable in the same way that 720p on "Low" is. If you're making a compromise, you should change the resolution first, not the tunables. Going from 1080p to 720p cuts the pixel count by more than half. Going from 1080p to 4K quadruples it.

 

8K = 33,177,600 pixels per frame

4K = 8,294,400

1080p = 2,073,600

720p = 921,600

 

That is 36 times the pixels going from 720p to 8K, and that's before we consider frame rate or HDR. The requirements only need to reflect 60 fps. Going from 1080p60 to 4Kp60 carries the same resource increase as going from 1080p60 to 1080p240.

 

When we talk about tunables, no single tunable other than antialiasing or DLSS will net you a 75% reduction in GPU requirements. Turning on DLSS? Maybe that dials the requirements of a 4K game back to those of a 1080p one, but it certainly still looks like a 1080p render, only now with some blur.

 

That's not forward-looking. You don't see or buy "1440p" televisions or television broadcasts, so 1440p shouldn't be considered at all. More to the point, the problem with the 1440p screen resolution is that many games don't understand 1440p at all, and when you "full screen" a game that is running at 1080p, you get the crappy screen/monitor upscale, and the GPU can't do an integer upscale. 1440p is the exception, just like the 1200p 16:10 monitors are. Good gawd, the 1200p monitors were such a pain in the behind to get games to use the extra 120 lines that I just played everything letterboxed, or ran it in a window, to avoid the problem entirely.

 

To this very day, I always run games windowed, because I can't trust games to actually run in HiDPI mode, and I know they aren't when my second monitor suddenly changes resolution as well, despite the game not running on it.

 

I would much rather play a game at 1080p at medium settings than 720p at ultra. No question about it.

 

I have a 4K monitor; my usual order of operations if I can't run a game at 4K ultra (most of the time):

1. Turn down settings to high

2. Turn the render resolution down to 80% of 4K (3072x1728), preferably with an in-game render-scale option that is separate from the screen resolution if the game has one, as that looks better than changing the screen resolution (see the sketch after this list).

3. Then turn settings down to medium if I need that 
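For what step 2 means in pixel terms (a quick sketch; GPU cost is assumed to be roughly proportional to pixels rendered):

```python
# An 80% render scale is applied per axis, so the pixel count drops to 0.8^2 = 64%.
native = (3840, 2160)
scale = 0.80
internal = (round(native[0] * scale), round(native[1] * scale))
pixel_ratio = (internal[0] * internal[1]) / (native[0] * native[1])

print(internal)               # (3072, 1728)
print(f"{pixel_ratio:.0%}")   # 64% of the native pixel count, i.e. ~36% fewer pixels to shade
```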

 

I had a 16:10 display (1920x1200), and almost all games supported it just fine. Sure, cutscenes and maybe main menus were letterboxed, and there were plenty where the in-game GUI (like the minimap) sat where it would have been on 16:9. But as for the in-game rendering itself, almost all games I played supported it. Far more games supported that than my brother's ultrawide monitor, that's for sure.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


On 11/23/2022 at 6:07 AM, Mark Kaine said:

Also, if what you said was true, well, that makes the vast majority of PC games instantly "problematic", since things like 4x MSAA simply cannot be run on the vast majority of hardware (I think the average PC on the Steam performance list is like GTX 1060 level... so rather low/midrange).

The GTX 1060 is the most common GPU in the Steam hardware survey results, being in 7.39% of respondents' PCs, but the next 3 most common GPUs together represent 16.62% of respondents and have significantly higher average performance than a GTX 1060. And of the next 24 GPUs listed in the survey results, which together represent maybe around 50% of respondents, only a few aren't faster than a GTX 1060. The average performance might be above a GTX 1070, perhaps (though I'm not confident of it) closer to a GTX 1080.

 

https://store.steampowered.com/hwsurvey/
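The kind of share-weighted averaging that estimate implies, as a sketch: only the 7.39% and combined 16.62% shares come from the survey figures above; the performance index for the "next 3" bucket (relative to a GTX 1060 = 1.00) is a made-up placeholder, not a measured number.

```python
# Share-weighted mean performance across the buckets discussed above.
buckets = [
    # (label, share of respondents, performance relative to a GTX 1060 -- assumed)
    ("GTX 1060",           0.0739, 1.00),
    ("next 3 most common", 0.1662, 1.60),   # placeholder index, for illustration only
]

weighted = sum(share * perf for _, share, perf in buckets)
total_share = sum(share for _, share, _ in buckets)
print(f"~{weighted / total_share:.2f}x a GTX 1060 across these {total_share:.1%} of respondents")
```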

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


1 hour ago, Delicieuxz said:

being in 7.39% of respondents' PCs, but the next 3 most common GPUs together represent 16.62% of respondents

Oh, you're right, I looked at this a year or two ago, so my info might have been outdated... However, it's still true that most PCs don't even come close to playing games at "ultra", and that's a good thing IMO... It's basically why consoles exist, for people who can't take it that someone else might have something "better".

 

And I also think that's why PCs are on the rise; people want to get the best performance/visuals possible... Consoles probably won't go away anytime soon, but the locked-down nature is a major turn-off for many. (PS: and for others it's a major draw... good thing we can choose, eh.)

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


It seems like a lot of people are missing the point of this thread. Making improvements to a game engine (especially a popular one) is always a good thing. As an engineer myself, I find what the Unreal team has been able to do super impressive. As a hobbyist game dev, I'll definitely be trying out these updates myself.

CPU: Ryzen 5950X Ram: Corsair Vengeance 32GB DDR4 3600 CL14 | Graphics: GIGABYTE GAMING OC RTX 3090 |  Mobo: GIGABYTE B550 AORUS MASTER | Storage: SEAGATE FIRECUDA 520 2TB PSU: Be Quiet! Dark Power Pro 12 - 1500W | Monitor: Acer Predator XB271HU & LG C1

 


On 11/20/2022 at 2:43 PM, Kisai said:

the entire Cyberpunk fiasco also proves it's entirely possible to misjudge what hardware people will desperately try to play the game on.

No. CP2077 was a mixture of scope creep and a lack of familiarity with densely packed, reactive open worlds. It played just fine on PC; the problem was the HDDs in the consoles and the PS4's fucked up memory management.


43 minutes ago, ravenshrike said:

PS4's fucked up memory management.

I'm sorry but this does not exist... lol

 

I have to reboot my PS4 after using the Crunchyroll App before I can use the Plex App without a TON of interface lag in Plex. Really annoying.


extras^

I guess upgrades to virtual textures (in the game world), other physics/simulations, and added features or helpful tools... somewhere 😛

Also, maybe easier triggering of animations with nodes.


Fortnite has been updated to receive the benefits of UE 5.1. It looks great. Though, I still see some adjustments happening in the grass at times (like the heads of flowers suddenly appearing, and some other things), as a distant area moves closer to the camera. I wonder what that is - still some LoD?

 

Drop Into The Next Generation Of Fortnite Battle Royale, Powered By Unreal Engine 5.1

 

 

 

 



1 hour ago, Delicieuxz said:

Fortnite has been updated to receive the benefits of UE 5.1. It looks great. Though, I still see some adjustments happening in the grass at times as an area moves closer to the player. I wonder what that is.

 

Drop Into The Next Generation Of Fortnite Battle Royale, Powered By Unreal Engine 5.1

Well, that was surprisingly quick; hopefully that bodes well for others and for how easy it is to implement these new features.


1 hour ago, leadeater said:

Well, that was surprisingly quick; hopefully that bodes well for others and for how easy it is to implement these new features.

Fortnite and Unreal are from the same company, so they have probably been working on it for quite some time already. But I hope for the best; it can't be too bad anyway.



I think the next major thing is probably going to be water-based; in this demo, the water was the only thing that looked slightly off.

 


11 hours ago, Delicieuxz said:

Fortnite has been updated to receive the benefits of UE 5.1. It looks great. Though, I still see some adjustments happening in the grass at times (like the heads of flowers suddenly appearing, and some other things), as a distant area moves closer to the camera. I wonder what that is - still some LoD?

 

Drop Into The Next Generation Of Fortnite Battle Royale, Powered By Unreal Engine 5.1

It's not able to drive 144fps @ 4K even with a 4090. Nanite reminds me a lot of ray tracing where turning it on just craters FPS.

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)


9 hours ago, AnonymousGuy said:

It's not able to drive 144fps @ 4K even with a 4090. Nanite reminds me a lot of ray tracing where turning it on just craters FPS.

Oh no? I'd rather have it look nice than hit 144 fps 🤷‍♂️

 

I know, it's an esports game. But still, that's overall how I feel about the tech as a whole and about games.


11 hours ago, leadeater said:

Oh no? I'd rather have it look nice than hit 144 fps 🤷‍♂️

 

I know, it's an esports game. But still, that's overall how I feel about the tech as a whole and about games.

Fortnite, you're gonna end up running at 1080p60 if you want it to look nice.

 

The previous version, even on a 3090, cannot maintain 60 fps at 4K with ray tracing on. That said, even having it on before didn't meaningfully improve anything, because the game itself is over-saturated to begin with. It pretty much just washed out bright spots.

 


On 12/5/2022 at 5:22 PM, AnonymousGuy said:

It's not able to drive 144fps @ 4K even with a 4090. Nanite reminds me a lot of ray tracing where turning it on just craters FPS.

Did you miss the part about how Nanite actually performs better than LODs (what everyone has been using until now)? The low performance in Fortnite is not because it's running the Nanite feature, but because it's running with maxed settings, including ray tracing.
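For anyone unfamiliar with what Nanite replaces, here is a minimal sketch of traditional discrete LOD selection (threshold values are illustrative); the hard distance cut-offs are what cause the visible "pop" mentioned earlier in the thread, whereas Nanite streams cluster detail continuously instead of swapping whole meshes.

```python
# Discrete LOD selection: pick a pre-built mesh variant based on camera distance.
LOD_TABLE = [            # (max distance in metres, mesh variant) -- illustrative values
    (15.0,  "LOD0 (full detail)"),
    (40.0,  "LOD1 (reduced)"),
    (100.0, "LOD2 (coarse)"),
]

def pick_lod(distance: float) -> str:
    for max_dist, mesh in LOD_TABLE:
        if distance <= max_dist:
            return mesh
    return "culled / impostor"

for d in (45.0, 39.0):   # camera closing in: crossing 40 m swaps the whole mesh at once
    print(f"{d:>5} m -> {pick_lod(d)}")
# 45.0 m -> LOD2 (coarse)
# 39.0 m -> LOD1 (reduced)   <- that swap is the "pop" you see on grass and flowers
```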

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


This is what fortnite looks like with Raytracing on (RTX 3090):

image.thumb.png.bb6ffb4b28ee143c2333a4f7cfb8a576.png

 

I grabbed the screenshot somewhere where there was light and dark. When the raytracing is off it normally does not have "black" areas even in dark areas.

 

Also that particular skin changes substantially when raytracing is on/off. When RT is off like in this screenshot (from May of this year), it's a different color entirely:

image.thumb.png.707ee68e081492a4db1ca9c0972aa804.png


19 minutes ago, Kisai said:

This is what fortnite looks like with Raytracing on (RTX 3090):

image.thumb.png.bb6ffb4b28ee143c2333a4f7cfb8a576.png

 

I grabbed the screenshot somewhere where there was light and dark. When the raytracing is off it normally does not have "black" areas even in dark areas.

 

Also that particular skin changes substantially when raytracing is on/off. When RT is off like in this screenshot (from May of this year), it's a different color entirely:

image.thumb.png.707ee68e081492a4db1ca9c0972aa804.png

If that last one has no other difference than RT being off, with other light and shadow settings at max, then that lighting looks way worse than some other games that don't have RT at all.



4 hours ago, Kisai said:

This is what fortnite looks like with Raytracing on (RTX 3090):

image.thumb.png.bb6ffb4b28ee143c2333a4f7cfb8a576.png

 

I grabbed the screenshot somewhere where there was light and dark. When the raytracing is off it normally does not have "black" areas even in dark areas.

 

Also that particular skin changes substantially when raytracing is on/off. When RT is off like in this screenshot (from May of this year), it's a different color entirely:

image.thumb.png.707ee68e081492a4db1ca9c0972aa804.png

interesting skin choice...


4 hours ago, Mihle said:

If that last one has no other difference than RT being off, with other light and shadow settings at max, then that lighting looks way worse than some other games that don't have RT at all.

Yeah, that's kinda the point I was making. Overall I find Fortnite looks worse with RT on, and not just in an "I don't like it" way, but in a "this overall doesn't improve the visuals for the performance tradeoff" way.

 

841621869_Fortnite2022_12.07-05_43_41_10_DVR.mp4_snapshot_00_06_631a.thumb.jpg.369c8cd08c39b60502298cb7c4fb74b7.jpg

 

There are a few places in the game that look better if you're standing still, but this is not a game you stand still in for long. The only difference between RT on and RT off in most of the game is how heavy the shadows and light blow-outs are. When you're not in a brightly lit area, the overall game is darker.


43 minutes ago, Kisai said:

Yeah, that's kinda the point I was making. Overall I find Fortnite looks worse with RT on, and not just in an "I don't like it" way, but in a "this overall doesn't improve the visuals for the performance tradeoff" way.

 

There are a few places in the game that look better if you're standing still, but this is not a game you stand still in for long. The only difference between RT on and RT off in most of the game is how heavy the shadows and light blow-outs are. When you're not in a brightly lit area, the overall game is darker.

The game does look a lot more contrasty with RT stuff on, but it's bad for actual competitiveness because it makes it harder to track players. Like old school, where you would crank gamma settings so people couldn't hide in shadows. FN with RT also has a displeasing "everything coated in plastic" look to it, like they don't have surface reflections set right.

 

A lot of people (myself included) play with graphics settings turned down just to make the gameplay feel less cluttered. Like, I'd rather have grass just be a blob of green than individual moving blades of grass. (Plus, 60 fps in a competitive shooter feels like lag, so anything that drops below 144 fps immediately gets turned off.)

 

I don't know if it's just nostalgia but I feel like Crysis was basically "here" 15 years ago...


