Unreal Engine 5.1 - video game graphics have arrived at their destination

Delicieuxz

So does this mean Tekken will finally not look like shit?

 

#ibelieveitwheniseeit

 

#finalfrontier

 

 


On 11/20/2022 at 4:15 AM, Quackers101 said:

the biggest change was 5.0

5.1 mostly improves upon what 5.0 introduced. Unity is also used for high-quality games.

That's how versioning works: 5.0 is a major release and 5.1 a minor release. Welcome to logical version numbering.


On 11/20/2022 at 8:23 AM, Kisai said:

That misses the point.

 

You should not have to meddle with the settings to create a playable experience. If you are playing on an RTX 3090, a PS5, or a Series X, the output should be exactly the same. If you are playing competitively, you should not have to fiddle with the settings to find the optimal configuration. That's why you'll universally see people playing competitive FPS games with the quality settings at their lowest. If you see someone playing on Ultra, they're certainly not playing competitively (maybe for fun, or with Dixper on a stream).

 

Hell, Fortnite runs like utter trash if you turn on any RT feature, even on a 3090.

 

There should only be two "standard" settings modes for PC games: "I'm playing with others" and "I'm streaming". "With others" profiles all the players' systems and sets the quality/framerate settings to the lowest common denominator (so yes, the player with the potato cripples everyone), while "streaming" turns off features that induce motion sickness and seizures and caps the display framerate at 60 or 120 (if supported by capture tools). If neither is true, the game should run with every feature turned on that maintains the intended frame rate, and the game should decide which features are less necessary (e.g. motion blur, anti-aliasing, bloom, shadows, fog) on a room-by-room basis to keep that frame rate. The only quality settings players should be able to toggle themselves are the screen resolution and accessibility feature groups (e.g. blur/flicker/colorblind).

 

To that end, "I'm playing with others" should benchmark the GPU until it finds a 30fps+ or 60fps+ setting for that map before dropping the player into a multiplayer game using it. With a game console, that is a predictable, known value. With a PC, you will have people with six-year-old CPUs and GPUs trying to play with people on 11th/12th/13th-gen Intel CPUs and 2080/3080/4080+ GPUs who will have an effective advantage. In a private lobby, the hardware requirement is waived if the host has opted to.

 

But again, competitively, the people with the best hardware who live closest to the data center have an advantage. You level the playing field by making sure these settings are, at minimum, identical, and you want to keep performance-affecting features away from being tuned. Also keep in mind that ReShade is a thing, and there will always be players who use such tools to make the quality even worse to gain performance.

 

I don't want to be 'that' guy, but your self-entitled response says more about you than it does about others. All I hear is 'Me, me, me!'

 

Games have graphical settings which you can change to suit your hardware (for the most part). Most games will even check your hardware and adjust settings accordingly (GeForce Experience can do this as well). Beyond that, games generally have presets you can choose from.

 

If this is above your ability, just get a console and stop complaining, as not everything will be built solely around your needs. The world has more people in it than just you.


On 11/19/2022 at 4:53 PM, Kisai said:

That misses the point.

 

You should not have to meddle with the settings to create a playable experience. If you are playing on an RTX 3090, a PS5, or a Series X, the output should be exactly the same. If you are playing competitively, you should not have to fiddle with the settings to find the optimal configuration. That's why you'll universally see people playing competitive FPS games with the quality settings at their lowest. If you see someone playing on Ultra, they're certainly not playing competitively (maybe for fun, or with Dixper on a stream).

 

Hell, Fortnite runs like utter trash if you turn on any RT feature, even on a 3090.

 

There should only be two "standard" settings modes for PC games: "I'm playing with others" and "I'm streaming". "With others" profiles all the players' systems and sets the quality/framerate settings to the lowest common denominator (so yes, the player with the potato cripples everyone), while "streaming" turns off features that induce motion sickness and seizures and caps the display framerate at 60 or 120 (if supported by capture tools). If neither is true, the game should run with every feature turned on that maintains the intended frame rate, and the game should decide which features are less necessary (e.g. motion blur, anti-aliasing, bloom, shadows, fog) on a room-by-room basis to keep that frame rate. The only quality settings players should be able to toggle themselves are the screen resolution and accessibility feature groups (e.g. blur/flicker/colorblind).

 

To that end, "I'm playing with others" should benchmark the GPU until it finds a 30fps+ or 60fps+ setting for that map before dropping the player into a multiplayer game using it. With a game console, that is a predictable, known value. With a PC, you will have people with six-year-old CPUs and GPUs trying to play with people on 11th/12th/13th-gen Intel CPUs and 2080/3080/4080+ GPUs who will have an effective advantage. In a private lobby, the hardware requirement is waived if the host has opted to.

 

But again, competitively, the people with the best hardware who live closest to the data center have an advantage. You level the playing field by making sure these settings are, at minimum, identical, and you want to keep performance-affecting features away from being tuned. Also keep in mind that ReShade is a thing, and there will always be players who use such tools to make the quality even worse to gain performance.

 

This sounds like the most boring, terrible gaming experience possible. "Well, looks like you're playing online, guess your game is going to look like crap because someone joined with a laptop" 🤣 Seriously, what kind of idea is that? Have people already forgotten Crysis? It was the de facto 'can it run it' benchmark forever, and it still is. Why? Because it was developed to absolutely crush GPUs and look dang good. All games should have as many graphics options as possible so they can scale down to the lowest-end GPUs and up beyond the highest-tier ones.

 


Just get a console if you care about a "level playing field". PCs aren't that; they're the complete opposite, and people buy them because they care about options, settings, quality, and indeed mods of all kinds...

 

This is really simple: don't want options or choice? Get a console.

 

Want the best performance, lots of options, and to mod the heck out of your game? Get a PC.

 

 

 

 


6 hours ago, Kisai said:

That's not how it works, and you know it.

 

Any benchmarkable game will show you that "everything turned on" is the setting the developer intended you to play on, and no games are actually intended to be run at any setting below the current-generation console settings, which are the defaults. Games do not profile the hardware to check whether you are running a capable iGPU; they just see "Intel" and go "nope, this game will not work on your toaster, I don't believe it."

No, Ultra is used for benchmarks to limit bottlenecks as much as possible and show how GPUs scale, not because it's the setting developers intend you to play the game on. That has NEVER been the case. Ultra often tanks performance versus High for very LITTLE gain.

 

6 hours ago, Kisai said:

Both Final Fantasy XIV's original version and the entire Cyberpunk fiasco prove it's entirely possible to misjudge what hardware people will desperately try to play the game on. Despite meeting the BS "minimum" requirements, neither game was playable at any setting without top-of-the-line hardware at release. If you try to tune down the game, what you get is a broken, PS2-visual experience that nobody wants to play. I've seen videos of glitches from Cyberpunk that look like glitches I still see in GTA V. When your computer is not as capable as a game console, that's when you stop using your PC to play games.

Yes, FF14 misjudged hardware. It's not just that a pot had more triangles than any character model, or something equally stupid, or that the engine itself was a bear to work with. They changed that, and they change it every major patch, and now in sub-patches too, as they make engine changes under the hood with Endwalker, such as how grass is handled.

Use something well optimized and not bugged out, like Crysis, Far Cry 1, The Witcher 3, or Doom Eternal, as your example instead: games that scaled well onto any hardware at any setting without looking like crap. Low was low detail, sure, but it was a non-issue and could run on any reasonable hardware.

The sign of a good engine, and of good use of the engine on PC, IS the ability to scale, and to scale INTO the future of hardware. If the game you launch today can be run at 4K Ultra at 90fps on a 4090, you screwed up as a developer at taking advantage of what the hardware is capable of. That should be your High setting; Ultra should be for "tomorrow", when the 5090 makes it actually "playable".
A 3060 playing a game launched today on Ultra at 90fps at 1440p is a massively missed opportunity.

There is zero shame in, and nothing wrong with, running the game on Low to Medium at 1440p on a 960 and getting 90fps.

 

 

6 hours ago, Kisai said:

Anyway, you seem to be of the mind that people spend an hour tuning their games. Nobody does that. The average person adjusts nothing and hopes the developer didn't design their game to run on hardware that doesn't yet exist. Short of adjusting for motion sickness, seizures, and color-blind accessibility, they leave everything else alone if it's at 60fps. You just don't know whether you're going to get 60fps consistently, or whether every time you switch locations or walk through a door the lighting shaders will tank performance for a minute.

You have been told, TWICE before you made that post, that that is not what anyone is talking about. You don't spend hours if you don't want to. You let the game auto-detect your hardware, or you let GeForce Experience do its thing (or the AMD equivalent, whose name I don't remember). If you are not happy, you change the bucket to one of four preset options; if you are still not happy, then you ARE the type of person who enjoys going into a game's deep settings, so let them.
You are not spending hours; for 90% of the user base it is less than 5 minutes.
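To make the "bucket" idea concrete, here is a rough Python sketch of what that auto-detect step boils down to; the scores, cut-offs, and preset names are invented for illustration, not how GeForce Experience or any real launcher classifies hardware:

# Illustrative sketch only: map a rough hardware score to one of four presets.
# The scores and cut-offs below are made up, not any real tool's heuristics.

PRESETS = ["Low", "Medium", "High", "Ultra"]

def pick_preset(gpu_score: float) -> str:
    """Return the highest preset whose (arbitrary) cut-off the score clears."""
    cutoffs = [0, 25, 50, 75]  # arbitrary example thresholds
    chosen = PRESETS[0]
    for cutoff, name in zip(cutoffs, PRESETS):
        if gpu_score >= cutoff:
            chosen = name
    return chosen

for score in (10, 40, 60, 90):
    print(score, "->", pick_preset(score))  # Low, Medium, High, Ultra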

You DO hope the developer designs for hardware that does not yet exist, as I just went on about above.

 

 

7 hours ago, Kisai said:

Good grief, I'm still seeing brand-new games and 12-year-old games with the same problem, where an effect used in some places turns an otherwise 60fps game into a 5fps one. The Stanley Parable Deluxe (released this year, made in Unity) does it in one place. Ghostbusters: Spirits Unleashed does it every time you fire. Fortnite does it every time you start a new match; there will be roughly a 10-second lag as the map pops in. The latter two are Unreal Engine 4. The games are running on NVMe SSDs. The GPU is an RTX 3090. The problem is not the hardware.

You are right: there is something not quite correct in the code that causes a bubble in the pipeline, or threads are being synchronized that don't need to be.
The problem isn't the hardware there. But what point are you making? That has nothing to do with graphical fidelity.

 

7 hours ago, Kisai said:

I've seen people tune down or mod "competitive" games to the point that all that's left is geometry, with no textures or shaders. At some point you have to say that the tradeoffs of running a game on a potato or toaster computer are not worth it, and you'd have a better experience on a Nintendo Switch.

Competitive games can generally be run on a potato.

I was running Valorant at 1440p, with an i7-2600K and a GTX 760, and getting 90fps on medium-to-low settings. I could have dropped to 1080p and Low, but at that point who cares? It doesn't matter unless you are Diamond and up; before that you win with game knowledge and aiming. The advantage someone has running the game at 400fps when the server runs at 120 really is not what makes me lose matches. It's because I'm bad, not because of the 3ms of reaction time they have on me in 99% of firefights, given a total end-to-end lag of something like 100ms.


3 hours ago, ouroesa said:

That's how versioning works: 5.0 is a major release and 5.1 a minor release. Welcome to logical version numbering.

I know, and not everyone does it that way. And that wasn't the point of the comment.

 

On another topic:

Spoiler

In visual media like games, there are always lines to be drawn, too. It's a lot about having freedoms, but also some limits.

Go beyond a certain limit and it's no longer the game it was meant to be (though that very much depends). The genre can also make visuals a hot topic, as with multiplayer and competitive games. There should be a lot of options, but not all options are "recommended". It can at times become like watching a movie where something is wrong with the data or the visuals, or the color or resolution isn't good enough. Some things impact the experience more than others, and some are just good to have no matter what. But sometimes the media comes with something wrong by design, and fixing it would make the experience more true to what it wanted to be.

Or something of a rant.

 


21 hours ago, SolarNova said:

Niiiice.

 

One of my biggest gripes about game graphics, since I was a kid even, has been 'pop-in'. I HATE it; I mean I REALLY don't like it. I do everything I possibly can to try to stop it: if the game doesn't have an in-game option for LOD settings, I find a mod or an option file to tweak to increase LOD quality so that pop-in is less noticeable.

 

While I'm sure it will take many years for this 'Nanite' tech to work its way into the majority of games in one form or another, I look forward to it.

There have been many games with big worlds and no noticeable LOD pop-in; The Witcher 3 is one example that comes to mind. But anyway, Nanite seems like an overall better alternative: better performance AND better looks. What's not to like about it?

 

UE 5.1 really seems awesome, but like others said, it all depends on the devs. They still have to choose to use these features.

 

I'm personally really looking forward to UE5 games, to see how the engine works in actual titles. But we're probably still years away. So far The Witcher 4 is probably the first announced UE5 game I'm actually interested in.


11 hours ago, ouroesa said:

I don't want to be 'that' guy, but your self-entitled response says more about you than it does about others. All I hear is 'Me, me, me!'

 

Games have graphical settings which you can change to suit your hardware (for the most part). Most games will even check your hardware and adjust settings accordingly (GeForce Experience can do this as well). Beyond that, games generally have presets you can choose from.

 

If this is above your ability, just get a console and stop complaining, as not everything will be built solely around your needs. The world has more people in it than just you.

^ Agree 100%

 

Also, different people want to make their tradeoffs on different options. Some might not care about anti-aliasing, some might not care about ray tracing. So why shouldn't people be able to decide which settings to drop? Not everything has to be dumbed down to Apple levels of "I know better than you what you want". And if someone doesn't want to learn what the settings mean, pretty much every game still has presets.


16 hours ago, Quackers101 said:

there is still "pop-in", only now it's more a matter of visual noise being generated.

In the linked video from Unreal Sensei there is no visibly noticeable pop-in effect.


Hopefully we won't need 64GB of RAM, an 11900K, and an RTX 3090 to play games made with UE 5.1.

 

As others have pointed out, game devs need to be targeting reasonable specs instead of shooting for the moon and pretending everybody has a 3090 or 4090.


13 hours ago, Mark Kaine said:

Just get a console if you care about a "level playing field". PCs aren't that; they're the complete opposite, and people buy them because they care about options, settings, quality, and indeed mods of all kinds...

 

This is really simple: don't want options or choice? Get a console.

 

Want the best performance, lots of options, and to mod the heck out of your game? Get a PC.

 

 

 

 

Not necessarily.

 

I don't know about COD these days, but COD: Advanced Warfare had a 90fps cap in multiplayer only (yes, even on PC) to stop people from getting an unfair advantage from better hardware.

 

Some devs do that kind of thing to level the playing field.


1 hour ago, AluminiumTech said:

Hopefully we won't need 64GB of RAM, an 11900K, and an RTX 3090 to play games made with UE 5.1.

 

As others have pointed out, game devs need to be targeting reasonable specs instead of shooting for the moon and pretending everybody has a 3090 or 4090.

When have they shot for the moon in the last decade?
What game is out there that you are unable to run?
Stop pretending Ultra is the standard. Stop forgetting High/Medium/Low exist.


15 hours ago, ouroesa said:

I don't want to be 'that' guy, but your self-entitled response says more about you than it does about others. All I hear is 'Me, me, me!'

Wonderful, I found all the condescending PC Master Race people on the forum. It's always fun to shout at the wall that "everyone should be able to play the game", because the performance gap between the absolute garbage iGPU laptops you buy from Best Buy and a top-end 16-core CPU with a 24GB video card is enormous.

 

You just don't understand PCs, and apparently neither do several other people. Games are played by more people than just the basement dwellers living in the US.

 

15 hours ago, ouroesa said:

Games have graphical settings which you can change to suit your hardware (for the most part). Most games will even check your hardware and adjust settings accordingly (GeForce Experience can do this as well). Beyond that, games generally have presets you can choose from.

And having 200 configurable options, when the biggest setting that actually matters is the screen resolution, means you have 200 pointless configuration options that nobody understands or cares about. What is so hard to understand about that?

 

15 hours ago, ouroesa said:

If this is above your ability, just get a console and stop complaining, as not everything will be built solely around your needs. The world has more people in it than just you.

I don't need a game console; you're failing to grasp the point.

 

 

14 hours ago, Fasterthannothing said:

This sounds like the most boring, terrible gaming experience possible. "Well, looks like you're playing online, guess your game is going to look like crap because someone joined with a laptop" 🤣 Seriously, what kind of idea is that? Have people already forgotten Crysis? It was the de facto 'can it run it' benchmark forever, and it still is. Why? Because it was developed to absolutely crush GPUs and look dang good. All games should have as many graphics options as possible so they can scale down to the lowest-end GPUs and up beyond the highest-tier ones.

 

Again, none of those graphics options has any tangible performance impact other than the screen resolution. Profile the damn computer the first time the game loads, and any time you're in a competitive matchmaking lobby. Because I assure you, everyone is running things on low settings because they believe everyone else is. This can go either way: whoever has the weakest system sets the settings for everyone, or whoever has the highest-end system that isn't hitting 60fps gets their settings nerfed until everyone has 60fps. Why is this so hard to understand? Play an MMO sometime; you still end up waiting on the person with the worst hardware to load the map.
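A bare-bones Python sketch of that "lowest common denominator" idea, purely to illustrate the logic; the preset order, target frame rate, and benchmark numbers are all invented, and no engine or matchmaker is known to work exactly this way:

# Illustrative sketch of lobby-wide "lowest common denominator" settings.
# Preset ordering and the benchmark numbers are invented for the example.

PRESET_ORDER = ["Low", "Medium", "High", "Ultra"]  # worst to best

def best_preset_for(fps_by_preset: dict, target_fps: int = 60) -> str:
    """Highest preset (by order) that still hits the target frame rate."""
    best = PRESET_ORDER[0]
    for preset in PRESET_ORDER:
        if fps_by_preset.get(preset, 0) >= target_fps:
            best = preset
    return best

def lobby_preset(player_benchmarks: list, target_fps: int = 60) -> str:
    """Every player gets the weakest player's best sustainable preset."""
    per_player = [best_preset_for(b, target_fps) for b in player_benchmarks]
    return min(per_player, key=PRESET_ORDER.index)

players = [
    {"Low": 240, "Medium": 180, "High": 120, "Ultra": 70},  # strong PC
    {"Low": 90, "Medium": 65, "High": 45, "Ultra": 25},     # mid PC
    {"Low": 70, "Medium": 50, "High": 30, "Ultra": 15},     # the "potato"
]
print(lobby_preset(players))  # -> "Low": the potato sets the ceiling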

 

 

13 hours ago, Mark Kaine said:

Just get a console if you care about a "level playing field". PCs aren't that; they're the complete opposite, and people buy them because they care about options, settings, quality, and indeed mods of all kinds...

Again, you do not understand the issue. None of those 200 tunable options is going to improve the loading time of the person desperately trying to play the game on an iGPU in India. They're going to have to play at 720p if they want to play with the PC Master Race 1080p dudes in America.

 

13 hours ago, Mark Kaine said:

Want the best performance, lots of options, and to mod the heck out of your game? Get a PC.

 

You should not need to mod a game to enjoy it, and the fact that complete trolls do this to MMO games shows you their priority is winning, even if they get banned for hacking.

 

12 hours ago, starsmine said:

No, Ultra is used for benchmarks to limit bottlenecks as much as possible and show how GPUs scale, not because it's the setting developers intend you to play the game on. That has NEVER been the case. Ultra often tanks performance versus High for very LITTLE gain.

Ultra at 8K, maybe. If you are not playing the game on Ultra when you can, then you're throwing away the money you spent on a PC; go buy a console instead. There is no reason why, right now, you cannot play a game at Ultra 1080p60.

 

12 hours ago, starsmine said:

Yes, FF14 misjudged hardware. It's not just that a pot had more triangles than any character model, or something equally stupid, or that the engine itself was a bear to work with. They changed that, and they change it every major patch, and now in sub-patches too, as they make engine changes under the hood with Endwalker, such as how grass is handled.

I used that example because I was there. You can still download the benchmark and see that the crossover point in hardware performance came about four years later. However, the average person wanting to play it on a laptop would never be able to. You can with the current version, but only by playing at 720p; a laptop GTX 1050 Ti is insufficient to run it at 1080p60.

 

12 hours ago, starsmine said:

Use something well optimized and not bugged out, like Crysis, Far Cry 1, The Witcher 3, or Doom Eternal, as your example instead: games that scaled well onto any hardware at any setting without looking like crap. Low was low detail, sure, but it was a non-issue and could run on any reasonable hardware.

Yes, there are games that scale properly on most hardware; I didn't say they don't exist. I said there is no reason for a game engine to offer 200 performance tunables when none of them makes a lick of difference and the only one that matters is the screen resolution. Whether a game pre-compiles its shaders, and whether it is installed on an SSD, has more bearing on janky performance than anything you can ever fine-tune.

 

I may prefer to turn anti-aliasing and motion blur off, but that is because of motion sickness, not for any performance reason. You cannot tell me that anyone ever looks at all those tunables and decides, "Today I'm going to play with no anti-aliasing and texture filtering set to bilinear instead of anisotropic."

 

If anything, these bloody tunables should be set outside the game by software that can compare notes with other people who have the same game and hardware. I don't expect a game developer to test performance on anything below the minimum requirements, but the "minimum requirements" on damn near everything are an absolute lie. The "recommended" spec is the true minimum, and has been since the 90s; "minimum" just means the game will launch, not that it won't suck to play.

 

12 hours ago, starsmine said:

 


The sign of a good engine, and of good use of the engine on PC, IS the ability to scale, and to scale INTO the future of hardware. If the game you launch today can be run at 4K Ultra at 90fps on a 4090, you screwed up as a developer at taking advantage of what the hardware is capable of. That should be your High setting; Ultra should be for "tomorrow", when the 5090 makes it actually "playable".
A 3060 playing a game launched today on Ultra at 90fps at 1440p is a massively missed opportunity.

I disagree, but probably on a different point. "Ultra" should be the equivalent of "uncapped", and if your game can do 90fps at Ultra on a 4090 out of the box, congrats: you succeeded in aiming where the puck is going. If the GPU's video memory and the CPU's system RAM have the capacity, there is no reason not to use as much of them as possible, and yet games routinely do not. Many games just hit the magic 32-bit 4GB wall and stay there.

 

Final Fantasy XV is still the only game that can bring a system to its knees when you "turn everything on", and that's because the NVIDIA libraries leak memory rapidly, and for no other reason.

 

12 hours ago, starsmine said:

There is zero shame in, and nothing wrong with, running the game on Low to Medium at 1440p on a 960 and getting 90fps.

 

There is when you are spending $90 on a game and getting PS2 visuals because the "minimum requirements" are an absolute lie. Cyberpunk took the absolute cake for not scaling.

 

I would rather play a game at 1080p60, 1440p60, or 4K60 windowed with the settings on Ultra than be forced to play a game I paid money for at "4K low settings" because the game can't be bothered to tune for screen size or give a windowed option I can size myself. This is not me complaining about any specific game, but there are specific games, such as DBD and Fortnite, that sit on opposite ends of the "too many tunables" spectrum, with DBD having next to none and Fortnite having many that don't matter at all. In that Ghostbusters game I mentioned, no tunable in the menu will fix the frame rate dropping every time you fire. These are not "stop playing on a toaster" moments; these are game developers who have somehow failed to find appropriate auto-tuning.

 

 

12 hours ago, starsmine said:

You have been told, TWICE before you made that post, that that is not what anyone is talking about. You don't spend hours if you don't want to. You let the game auto-detect your hardware, or you let GeForce Experience do its thing (or the AMD equivalent, whose name I don't remember). If you are not happy, you change the bucket to one of four preset options; if you are still not happy, then you ARE the type of person who enjoys going into a game's deep settings, so let them.

You seem to misunderstand: the point was "why are there 200 settings that don't do anything meaningful" when only the screen resolution actually has an impact on performance. If you have a 3080, 3090, 4090, whatever high-end card, why are you tuning for benchmarks instead of gameplay?

 

People will still try to play a game on a toaster regardless of the minimum requirements, and none of those 200 adjustments is going to give them a playable experience, let alone one that compares to the console version. So why offer this illusion of choice? Get rid of the "minimum requirements" BS and just straight up say, "This game was developed on (HARDWARE); this is what the default settings assume." Just because DX12 is a feature of every GPU since the GT 440 doesn't mean you list the minimum requirement as a GT 440.

 

12 hours ago, starsmine said:


You are not spending hours; for 90% of the user base it is less than 5 minutes.

No, the average person just plays the game exactly how it runs the first time, and they may adjust the screen resolution, but only because the game somehow failed to detect their 4K monitor, which seems to be the case with most games.

 

12 hours ago, starsmine said:

 

You are right: there is something not quite correct in the code that causes a bubble in the pipeline, or threads are being synchronized that don't need to be.
The problem isn't the hardware there. But what point are you making? That has nothing to do with graphical fidelity.

Again, people seem to think there is a magic setting in the video configuration that will suddenly make every game playable on 12-year-old hardware. That's not the case, and it never has been, because the "entry-level" computer has barely moved in 12 years, while the highest-end 4090 is now 50x faster than that DX12 GT 440, a GPU that was never capable of even 30fps. The highest-end NVIDIA GPU is still 15 times faster than the Iris Xe iGPU from two years ago.

 

That gulf only widens. And yet, what do you see in the minimum requirements?

[System Requirements table]

1080p Low still requires a GTX 970. The 4090 is still a solid 4x more powerful than a GTX 970. That Intel Xe GPU? A GTX 970 is itself 3.5x more powerful than the Intel Iris Xe.
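For what it's worth, those multipliers roughly chain together; this is nothing more than back-of-the-envelope Python arithmetic on the post's own figures, not benchmark data:

# Back-of-the-envelope check using the rough ratios quoted above (not benchmarks).
rtx4090_vs_gtx970 = 4.0   # "the 4090 is a solid 4x more powerful than a GTX 970"
gtx970_vs_iris_xe = 3.5   # "a GTX 970 is 3.5x more powerful than the Iris Xe"

print(rtx4090_vs_gtx970 * gtx970_vs_iris_xe)  # 14.0, close to the "15x" figure above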

 

 

What GPU has every laptop shipped with for the last three years? 1050 Ti, 2060, and 3060 "laptop" models. The closest mobile GPU to the desktop 970 was in fact a 1070 Mobile.

 

 

 

12 hours ago, starsmine said:

Competitive games can generally be run on a potato.

I was running Valorant at 1440p, with an i7-2600K and a GTX 760, and getting 90fps on medium-to-low settings. I could have dropped to 1080p and Low, but at that point who cares? It doesn't matter unless you are Diamond and up; before that you win with game knowledge and aiming. The advantage someone has running the game at 400fps when the server runs at 120 really is not what makes me lose matches. It's because I'm bad, not because of the 3ms of reaction time they have on me in 99% of firefights, given a total end-to-end lag of something like 100ms.

That's a separate argument I make in the "nobody can see more than 60fps" threads. The game's netcode is never going to run at 8ms, so a game rendering at 120fps+ is never simulating at 120fps; it's likely operating at 15 updates per second at most. Many games only update movement with motion vectors, while the actual "character is at X/Y/Z facing object Q" packets are sent only every half second, and those "character at" packets are what the game really uses to determine the collision box. The intermediate frames are not; they exist so the client side doesn't see rubber-banding.
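To put rough numbers on that in Python (all the rates below are example values, not any specific game's netcode):

# Illustrative arithmetic: render rate vs. simulation/network update rate.
# All values are example numbers, not measurements from a real game.

render_fps = 120            # what the client draws
net_updates_per_sec = 15    # authoritative updates from the server (example)
keyframe_interval_s = 0.5   # full "character is at X/Y/Z" packets (example)

frame_ms = 1000 / render_fps            # ~8.3 ms between rendered frames
update_ms = 1000 / net_updates_per_sec  # ~66.7 ms between authoritative updates

print(f"{frame_ms:.1f} ms per frame vs {update_ms:.1f} ms per network update")
print(f"~{render_fps // net_updates_per_sec} rendered frames are interpolated per update")
print(f"full position keyframes arrive every {keyframe_interval_s * 1000:.0f} ms")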

 

3 hours ago, Stahlmann said:

^ Agree 100%

 

Also, different people want to make their tradeoffs on different options. Some might not care about anti-aliasing, some might not care about ray tracing. So why shouldn't people be able to decide which settings to drop? Not everything has to be dumbed down to Apple levels of "I know better than you what you want". And if someone doesn't want to learn what the settings mean, pretty much every game still has presets.

Again, the entire point of this half of the argument is that people either never touch the settings or don't care, and they still complain that the game's performance is terrible because the developer didn't rip the bandaid off and say "the PC version requires a GTX 1080, or an RTX 2070 for parity with the console version".

 

Nowhere did I say "you aren't allowed to tune the game". Leveling the playing field was directed at the competitive, sweaty raider types who already do all that hacky crap to the game, officially or through mods. You want to stop penalizing people who have decent hardware; they are already penalized in loading time by the person with the slowest hard drive and slowest internet connection.

 

Good grief, the number of times you play an MMO and have to wait for a console player, let alone a PC player on a crappy connection with a laptop.

 

22 minutes ago, AluminiumTech said:

Hopefully we won't need 64GB of RAM, an 11900K, and an RTX 3090 to play games made with UE 5.1.

 

I hope not, but the magic 8-ball says "nope, sucks to be you"

[Image: UE5 demo system requirements]

https://en.as.com/meristation_en/2022/05/18/news/1652904132_808189.html

That's the UE5 demo requirements.

 

22 minutes ago, AluminiumTech said:

As others have pointed out, game devs need to be targeting reasonable specs instead of shooting for the moon and pretending everybody has a 3090 or 4090.

 

No, what game devs need to do is develop the game around where next-generation hardware will be. Only P2W, microtransaction-hell games aim for 10-year-old hardware, because that's the only way they can keep people playing the same game.

 

In an ideal situation, UE5 would simply divide the world geometry and render smaller tiles on weaker hardware, so you get the same render fidelity regardless of the hardware; it just pulls things out of the draw distance when moving, rather than downgrading the visuals.

 

Something that wasn't touched on at all in this conversation is why a game dev might target next-gen hardware instead of the current gen. You have to decide, in the planning stages, how much geometry your models will have, and if the models are too complex (e.g. the FF14 flower pots), it doesn't matter whether someone is running at 720p on Low: the geometry is just too complicated, and the only way to solve that is to have the game not render it. Which, I suppose, is how CP2077 tried to lower the LOD.


23 hours ago, Salv8 (sam) said:

When these features can be run on a 1070 at 4K, call me, because THEN we've made strides in real-world video game graphics.

I run Satisfactory at 4K Ultra settings with Vsync on; with it off, I get 110 FPS on average. That's on Unreal Engine 4. Unreal Engine 5 adds further efficiency improvements, but it also raises the bar for what Ultra means, so Ultra on UE4 is probably medium-high on UE5. But AFAIK the same features should perform better on the same card with UE5.


3 hours ago, starsmine said:

When have they shot for the moon in the last decade?

Cyberpunk 2077 assumed everybody had insanely high-end specs.

 

The requirements table shown earlier in this thread is only for 30fps.

 

I would consider Flight Simulator 2020 unplayable on anything below a 2070 at 1080p High.

3 hours ago, starsmine said:

What game is out there that you are unable to run?

I don't own Cyberpunk 2077, but if I did, I would be unable to play the game at my monitor's 144Hz refresh rate without compromising its visuals.

 

I don't own Flight Simulator 2020, but if I did, the game would be barely playable at 1440p Medium.

 

In Battlefield 2042 I can't play at over 100fps at 1440p High without dynamic resolution scaling.

 

In plenty of games, the game engine can't deliver the performance the card is capable of.

3 hours ago, starsmine said:

Stop pretending Ultra is the standard. Stop forgetting High/Medium/Low exist.

Low and Medium are usually for people with significantly outdated or insanely low-end hardware, and they do not normally look good at all. High usually looks good, but Ultra is what a game is really meant to look like.

 

If a game isn't playable on modern cards at High settings, then there's obviously a problem with the game.


The "Nanite Foliage" system really impressed me, since LOD issues are some of the most noticeable and distracting graphical limitations in games. Hopefully it is easy to develop and import assets that are compatible with it. Everything else, however, I am less interested in. I am not enthusiastic about games that advertise "photo-realism", since I actually enjoy games that use the opportunity to employ visuals that can't be recreated in real life or that deviate from it.


8 hours ago, Dracarris said:

In the linked video from Unreal Sensei there is no visibly noticeable pop-in effect.

Because this system has a completely different issue.

You can add a lot more, until it hits a sort of "balance point".

It re-introduces some other issues that LODs don't need to have if done perfectly, like noise and detail changes, which you could call a kind of pop-in, especially in movement when a lot of meshes have to "adapt". It won't have full-object pop-in unless an object isn't part of that system, or falls outside of that "balance" in that kind of automatic LOD (e.g. multiplayer objects or players, and what to prioritize).

 

More detail, more objects, more textures: more things to account for, and one of the reasons I hate some of the visual artifacting in newer games, where you have to stand still for a few seconds to make everything look nice, let the textures fully load, or let the quality adjust back to its highest when you're not in action, only to adapt again when you move.

 

Digital Foundry covered texture loading in the recent Spider-Man games. I would think this can be solved, along with the recent DirectStorage 1.1, if that works well with Nanite and all these engines and games.

https://youtu.be/OHiNtg-5qnc?t=887


My opinion on some of the discussion here:

I don't think the average PC gamer tunes every little setting, but I also don't think they just leave everything at the defaults.

 

I think the average PC gamer resorts to changing the resolution and the main graphics preset (the one that changes everything else with it), and maybe turning motion blur off, but nothing else.

 

I personally change more settings than that, but some games have too many settings, while others have too few.


Still waiting on all the Bethesda-killer games that the last Unreal Engine update was supposed to bring.


On 11/21/2022 at 9:55 PM, AluminiumTech said:

Cyberpunk 2077 assumed everybody had insanely high-end specs.

 

The requirements table shown earlier in this thread is only for 30fps.

 

I would consider Flight Simulator 2020 unplayable on anything below a 2070 at 1080p High.

I don't own Cyberpunk 2077, but if I did, I would be unable to play the game at my monitor's 144Hz refresh rate without compromising its visuals.

 

I don't own Flight Simulator 2020, but if I did, the game would be barely playable at 1440p Medium.

 

In Battlefield 2042 I can't play at over 100fps at 1440p High without dynamic resolution scaling.

 

In plenty of games, the game engine can't deliver the performance the card is capable of.

Low and Medium are usually for people with significantly outdated or insanely low-end hardware, and they do not normally look good at all. High usually looks good, but Ultra is what a game is really meant to look like.

 

If a game isn't playable on modern cards at High settings, then there's obviously a problem with the game.

It's kind of the complete opposite: Medium and High are often basically indistinguishable visually, and lots of people with "modern" midrange hardware think they should be able to play everything on "Ultra" on their 3060 Tis...

 

Also, if what you said were true, that would make the vast majority of PC games instantly "problematic", since things like 4x MSAA simply cannot be run on the vast majority of hardware (I think the average PC on the Steam hardware survey is around GTX 1060 level, so rather low/midrange).

 

No, devs put these options in for the very few who actually have insanely high-end hardware (think 4090+), and also for the future; it's cool when games don't instantly become "outdated" just because a new GPU generation is released.

 

See the "can it run Crysis" phenomenon.

 

 

PS: Many games also target 1080p, medium settings, and sometimes *30fps* for the recommended hardware on Steam. People often overlook this little detail, but it really should give them a hint about how this works.

 

 


4 hours ago, Mark Kaine said:

lots of people with "modern" midrange hardware think they should be able to play everything on "Ultra" on their 3060 Tis...

That isn't unreasonable to expect though.

4 hours ago, Mark Kaine said:

Also, if what you said were true, that would make the vast majority of PC games instantly "problematic", since things like 4x MSAA simply cannot be run on the vast majority of hardware (I think the average PC on the Steam hardware survey is around GTX 1060 level, so rather low/midrange).

 

No, devs put these options in for the very few who actually have insanely high-end hardware (think 4090+), and also for the future; it's cool when games don't instantly become "outdated" just because a new GPU generation is released.

It's also cool when GPUs don't instantly become outdated just because a new game is released.

4 hours ago, Mark Kaine said:

PS: Many games also target 1080p, medium settings, and sometimes *30fps* for the recommended hardware on Steam. People often overlook this little detail, but it really should give them a hint about how this works.

30fps should never be the target in system requirements though, IMHO.

 

60fps should be what's reflected in both the minimum and recommended requirements.


6 hours ago, Mark Kaine said:

 

PS: Many games also target 1080p, medium settings, and sometimes *30fps* for the recommended hardware on Steam. People often overlook this little detail, but it really should give them a hint about how this works.

 

 

Doubtful. I've never seen a game that was "playable" at the minimum requirements; it simply means it runs.

 

Quick story: I used to run games on a 386 that "minimum required" a 486, and later even a Pentium. Back in the 90s, games would just list things like this:

[Image: Space Quest 6 system requirements]

That comes from Space Quest 6, a game I played on a 386. The only penalty was that the loading time was substantial.

 

Compare that to today, where games do this:

[Image: DOOM (2016) system requirements]

This is from DOOM (2016). Look at those CPU requirements.

The i5-2400 is half the CPU performance of the i7-3770, but those AMD chips? Less than 6% difference. The GPUs are a different story: the HD 7870 is half the performance of the R9 290, and on the GTX side, the 670 is half the performance of the 970.

 

Now go look at the benchmark.

[Image: DOOM (2016) 1080p GPU benchmark chart]

https://www.techspot.com/review/1173-doom-benchmarks/page2.html

So, 26fps for the HD 7870 "minimum", while the recommended requirement gets you 86fps.
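Quick Python arithmetic on those two figures versus the nominal on-paper gap mentioned above; nothing here beyond the numbers already quoted:

# Simple arithmetic on the figures above (DOOM 2016, 1080p benchmark).
minimum_gpu_fps = 26       # the listed "minimum" GPU (HD 7870)
recommended_gpu_fps = 86   # the listed "recommended" GPU tier
nominal_hw_gap = 2.0       # "the HD 7870 is half the performance of the R9 290"

print(f"{recommended_gpu_fps / minimum_gpu_fps:.1f}x observed fps gap "
      f"vs {nominal_hw_gap:.1f}x nominal hardware gap")
# ~3.3x in delivered frame rate, against a ~2x raw-performance gap on paper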

 

1 hour ago, AluminiumTech said:

 

30 fps should never be the target in system requirements tho imho.

 

60fps should be what's reflected in Minimum and Recommended Requirements.

1080p60 Ultra should be what's reflected in the minimum, and 4K60 Ultra should be what's reflected in the recommended. That is, this is the hardware we developed the game on.

 

If the minimum requirements simply reflected "what it will run on", it would literally "run" on anything that supports OpenGL 4.5: AMD HD 5000 series, NVIDIA 400 series, and 4th-gen Intel iGPUs. But you'll note these minimum requirements reflect GPUs two generations later from both vendors, and even those don't hit 30fps.

 

 


2 hours ago, Kisai said:

Doubtful. I've never seen a game that was "playable" at the minimum requirements; it simply means it runs.

 

Quick story: I used to run games on a 386 that "minimum required" a 486, and later even a Pentium. Back in the 90s, games would just list things like this:

[Image: Space Quest 6 system requirements]

That comes from Space Quest 6, a game I played on a 386. The only penalty was that the loading time was substantial.

 

Compare that to today, where games do this:

[Image: DOOM (2016) system requirements]

This is from DOOM (2016). Look at those CPU requirements.

The i5-2400 is half the CPU performance of the i7-3770, but those AMD chips? Less than 6% difference. The GPUs are a different story: the HD 7870 is half the performance of the R9 290, and on the GTX side, the 670 is half the performance of the 970.

 

Now go look at the benchmark.

[Image: DOOM (2016) 1080p GPU benchmark chart]

https://www.techspot.com/review/1173-doom-benchmarks/page2.html

So, 26fps for the HD 7870 "minimum", while the recommended requirement gets you 86fps.

 

1080p60 Ultra should be what's reflected in the minimum, and 4K60 Ultra should be what's reflected in the recommended. That is, this is the hardware we developed the game on.

 

If the minimum requirements simply reflected "what it will run on", it would literally "run" on anything that supports OpenGL 4.5: AMD HD 5000 series, NVIDIA 400 series, and 4th-gen Intel iGPUs. But you'll note these minimum requirements reflect GPUs two generations later from both vendors, and even those don't hit 30fps.

 

 

I disagree. Developers should not hold a game's graphics back just so that some particular hardware can run it at Ultra. Games should of course be playable at 60fps on the most common older hardware (so a 1060), but beyond that, no.

 

Minimum specs should be whatever can run the game at minimum settings at a stable 1080p 60fps.

Recommended specs should be 1440p, High settings, at 60fps minimum.


But yeah, for the next game you'll need a 4080 to get 60fps at any settings; cough up those 1,000 dollars for games that have to look 10 years older to run decently.

Also, GPU makers and the like sponsor gaming companies: use higher tech, create games for higher tech, sell for higher tech. It's all marketing for GPUs! (jk)

