
Alan Wake 2 won't run on old GPUs due to lack of DX12 Ultimate features

24 minutes ago, Mark Kaine said:

I'm not sure I understand... what do you talk about, if not games? I get there are more subjects than that obviously, but the vast majority of people I play games with, or who are on any of my friends lists... we talk about games, new trends in gaming, etc. *all the time*

What I was referring to is that they don't s**tpost on forums and social media moaning about specs.

Router:  Intel N100 (pfSense) WiFi6: Zyxel NWA210AX (1.7Gbit peak at 160MHz)
WiFi5: Ubiquiti NanoHD OpenWRT (~500Mbit at 80MHz) Switches: Netgear MS510TXUP, MS510TXPP, GS110EMX
ISPs: Zen Full Fibre 900 (~930Mbit down, 115Mbit up) + Three 5G (~800Mbit down, 115Mbit up)
Upgrading Laptop/Desktop CNVIo WiFi 5 cards to PCIe WiFi6e/7


It really wouldn't be a properly playable experience on my 6700 XT despite it supporting DX12 Ultimate, considering I expect to be able to play games at 1440p on it, and the devs recommend 1080p Medium with FSR Performance - i.e. an internal 540p.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


4 hours ago, Alex Atkin UK said:

What I was referring to is that they don't s**tpost on forums and social media moaning about specs.

I wouldn't even know... I don't think about specs... we're all modders for the most part; if performance sucks we'll either mod it or just not play it... seems useless to complain about something like that... but as I said, I dunno for sure lol.

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


10 hours ago, Alex Atkin UK said:

I saw this coming a mile away. I was laughed at when I said that once games start being optimised for the current consoles, PC gamers would need to upgrade as their hardware would not be able to keep up.

 

Mark Cerny was quite clear, for example, that the PS5's hardware decompression block would have taken six of the PS5's CPU cores to match in software. It was clear this would mean PC users would need faster CPUs with more cores to compensate, but PC users just blame it on poor optimisation. Sure, that's an element, but if that optimisation means completely rewriting the engine, then of course there will be compromises where the cost would outweigh releasing it on PC at all. Plus, if you DO make those optimisations, then naturally that's likely to mean DX12 Ultimate. (A rough sketch of what spreading that decompression across CPU threads looks like is below.)
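To illustrate the shape of the problem (a minimal sketch, not anything from an actual engine): without a fixed-function decompressor, a PC port has to fan that work out across CPU worker threads. "decompress_chunk" here is a hypothetical stand-in for a real codec such as zlib or Oodle Kraken.

#include <cstdint>
#include <functional>
#include <future>
#include <vector>

// Hypothetical stand-in for a real codec (zlib inflate, Oodle Kraken, ...).
std::vector<uint8_t> decompress_chunk(const std::vector<uint8_t>& compressed) {
    std::vector<uint8_t> out;
    out.reserve(compressed.size() * 4); // assume roughly a 4:1 ratio
    // ... real codec work would go here ...
    return out;
}

// Fan the chunks out across CPU worker threads. A console's fixed-function
// block does the equivalent of this without burning any CPU cores.
std::vector<std::vector<uint8_t>>
decompress_all(const std::vector<std::vector<uint8_t>>& chunks) {
    std::vector<std::future<std::vector<uint8_t>>> jobs;
    jobs.reserve(chunks.size());
    for (const auto& c : chunks)
        jobs.push_back(std::async(std::launch::async, decompress_chunk, std::cref(c)));

    std::vector<std::vector<uint8_t>> out;
    out.reserve(jobs.size());
    for (auto& j : jobs)
        out.push_back(j.get());
    return out;
}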

 

I'm actually amazed we managed to get Ratchet & Clank: Rift Apart on PC without GPU decompression. Even with GPU decompression, it's inherently less efficient on PC due to bottlenecks moving data between the GPU and system RAM, whereas the PS5 has the benefit of unified memory, so once it's loaded into main memory, the GPU and CPU can both access it.

 

We're returning to a time where consoles can punch above their weight due to tighter integration of their architecture. Yes, PCs still have faster GPUs, but it's a lot harder to utilise that when porting from console, as we have very different bottlenecks. Shuttling data between system RAM and VRAM is expensive.

It's rather like complaining that people should be able to get to their destination just as quickly on the back roads as they can on the motorway/highway. It's completely unrealistic.

Yes - each individual piece of the hardware chain is so optimized now that I think we're running into software efficiency issues on PCs compared to consoles. A PC can't be built as one system where all the hardware talks together in the most efficient way.


21 minutes ago, Fasterthannothing said:

Yes - each individual piece of the hardware chain is so optimized now that I think we're running into software efficiency issues on PCs compared to consoles. A PC can't be built as one system where all the hardware talks together in the most efficient way.

There are actually a lot of hardware features that allow this, but they are either more complex than is probably worth the effort for games, or exclusive to professional hardware. CUDA, for example, has lots of unified memory management and RDMA capabilities - GPU to GPU, GPU to CPU, GPU to PCIe device - but I don't see these getting used for games because it will only work for CUDA. Sure, you can do it in OpenCL/ROCm, but you have to do it all again from scratch, so you end up making the game twice, or maintaining two entirely different game engine development branches - and now we also have Intel GPUs, so three times. (A minimal sketch of the unified memory part follows below.)
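For reference, this is roughly what the unified memory part looks like in CUDA - a minimal sketch, not anything a shipping game does:

#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float* data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float* data = nullptr;

    // One allocation visible to both CPU and GPU; the driver migrates
    // pages on demand instead of the app doing explicit copies.
    cudaMallocManaged(&data, n * sizeof(float));

    for (int i = 0; i < n; ++i) data[i] = 1.0f;      // CPU writes
    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);  // GPU touches the same pointer
    cudaDeviceSynchronize();

    std::printf("data[0] = %f\n", data[0]);          // CPU reads the result back
    cudaFree(data);
    return 0;
}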

 

There's a lot that can be done; it just won't be, because it's impractical.

 

Also keep in mind that when it comes to games, most of the memory on the GPU isn't needed by the CPU, and most of the memory on the CPU isn't needed by the GPU. It's not like HPC/GPGPU compute, where there is much greater reliance and communication between the two. That's also why you might as well load basically everything into GPU memory at the start: you'll need it at some point, and having it already local is vastly more performant than a memory operation later. No matter how efficient, high-bandwidth and low-latency the transfer, you can't beat already there.
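A hedged sketch of that "load it all up front" idea, this time with explicit copies into VRAM ("Asset" and "preload_to_vram" are made up for illustration):

#include <cstdint>
#include <vector>
#include <cuda_runtime.h>

// Hypothetical: every asset for the level, already read and decompressed.
struct Asset { std::vector<uint8_t> bytes; };

// Copy everything into VRAM once at load time; during gameplay the GPU
// only touches local memory and never waits on the PCIe bus.
std::vector<void*> preload_to_vram(const std::vector<Asset>& assets) {
    std::vector<void*> gpu_ptrs;
    gpu_ptrs.reserve(assets.size());
    for (const auto& a : assets) {
        void* d = nullptr;
        cudaMalloc(&d, a.bytes.size());
        cudaMemcpy(d, a.bytes.data(), a.bytes.size(), cudaMemcpyHostToDevice);
        gpu_ptrs.push_back(d);
    }
    return gpu_ptrs;
}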


Really, I'm surprised that DX12 (or Vulkan) still isn't used across the board in games. This one is just using the full latest feature set, and it's been out for a while too.

Another thing is the asterisk-laden spec sheets we get from devs, as well as lazy, badly made ports. We'll see eventually from DF how it actually runs.

Really, I'm looking forward to a new Crysis eventually. I've built a new PC lately, so I'm good for a while.

 

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX Speed Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


2 hours ago, Doobeedoo said:

Really, I'm surprised that DX12 (or Vulkan) still isn't used across the board in games

I think almost a decade ago I read something like "DX12: Microsoft's biggest failure"... and that still seems to be true for the most part.

 

I know DX12 theoretically has better performance, but devs do seem to have huge issues with it...

 

I remember ROTTR (Rise of the Tomb Raider) took years to run properly on DX12... but once it did, that meant 100+ fps in the most demanding levels instead of "maybe sometimes" 50 fps...

 

and something is *still* off with that game, as most people I know run the beta branch because it just runs more stably or something.

 

 

11 hours ago, leadeater said:

also why you might as well load basically everything into GPU memory at the start since you'll need it at some point

that should be mandatory imho - Monster Hunter World says hello!

(mind you, that game might still struggle, but that's because some effect settings are too high or whatever - it *does not* load new assets during gameplay whatsoever. The devs made a big deal out of that back then too, and I really think it paid off!)

 

(but also you had to run like a CRC check bypass to stop the anti-cheat from doing its thing, if you really wanted the *best* performance)

 

^wasn't an issue because the game also openly, while not officially, supported modding (another strength of MT Framework, since it comes by default with a "mod folder" and corresponding structure)

 

ps: the devs said something like: "we don't mind modding, we aren't going to stop mods, but we will not offer support for any issues that come from modding."

 

Seems fair to me! 

 

 


8 minutes ago, Mark Kaine said:

I think almost a decade ago I read something like "DX12: Microsoft's biggest failure"... and that still seems to be true for the most part.

I know DX12 theoretically has better performance, but devs do seem to have huge issues with it...

I remember ROTTR (Rise of the Tomb Raider) took years to run properly on DX12... but once it did, that meant 100+ fps in the most demanding levels instead of "maybe sometimes" 50 fps...

and something is *still* off with that game, as most people I know run the beta branch because it just runs more stably or something.

that should be mandatory imho - Monster Hunter World says hello!

(mind you, that game might still struggle, but that's because some effect settings are too high or whatever - it *does not* load new assets during gameplay whatsoever. The devs made a big deal out of that back then too, and I really think it paid off!)

(but also you had to run like a CRC check bypass to stop the anti-cheat from doing its thing, if you really wanted the *best* performance)

^wasn't an issue because the game also openly, while not officially, supported modding (another strength of MT Framework, since it comes by default with a "mod folder" and corresponding structure)

ps: the devs said something like: "we don't mind modding, we aren't going to stop mods, but we will not offer support for any issues that come from modding."

Seems fair to me!

 

 

I mean, it's around 8 years old, and especially early on many devs just ported to it, when more work needs to be done to actually utilize it properly. It seems that still to this day devs don't want to bother with it - laziness, really. In terms of low-level access it's similar to Vulkan in some ways, but many devs don't even use Vulkan. We still see games today using inferior APIs - games that still fail to work, even.

 

It was half-assed - that's why many early games had issues. They weren't made for DX12 from the start, just ported to it later in a rush.

It has a better feature set and more headroom for performance and optimization, but it needs more manual work - and yeah, if they don't want to put in that work properly... it's on them.

 

Nobody should be using anything other than DX12 or Vulkan anymore. It's stupid and lazy. But yeah, there are even bigger problems with games releasing unfinished and broken regardless of the API used - or being ports by default, since they're derived from console code.


21 hours ago, StDragon said:

UE5 is the gold standard for engines. If only more titles would utilize it. If developers would stop reinventing the game engine (unless it's really that much better than UE5 and Unity) and focus on content and play mechanics, we wouldn't have the shitty launches we have now.

Have we seen any actually graphically impressive UE5 games so far? I feel like many people hold on to the hope that UE5 is a magical fps AND visual boost, without any real proof atm.

 

Fortnite isn't really a very visually impressive game, and I feel that Lords of the Fallen isn't that innovative either, in both visuals and performance. Are there any other currently released UE5 games that I've missed?

 

My point is: at the moment we don't really have any proof of how the engine's advancements actually impact performance. Practically all the demos we've seen have been running on top-tier hardware and without an fps counter, while the uploaded video is only 30 fps.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


I wouldn't say it's all about the consoles; it's more about the tech in general. DX12 is a huge technical leap that requires new hardware - a requirement we really haven't seen since DX9 was released, and back then no one cared that much because it was the norm that new features required new hardware. We've had a couple of decades of gaming being wrapped in cotton wool, and now that DX12 is entering its native state, it's a hard landing back to the reality that your almost decade-old hardware is just too old.

 

DirectStorage won't really change a thing unless there were a huge rearrangement of Windows and all kinds of policy changes in how PC OSes work - or the simple "throw money at the problem" solution of a dedicated game storage drive as fast as or faster than your OS/app drive. Unlike a console OS, which can "hibernate" most of its processes while a game is running, Windows cannot - or at least it hasn't been designed to - so the whole time, every background program is accessing the same storage and eating that bandwidth, and there isn't a magic trick to avoid that without huge changes or money. And that's mainly just loading times, unless the game streams assets or otherwise preloads things in the background - and even then it's pretty much nothing unless you're still running HDDs. If you haven't already, dedicate an NVMe drive to games and you've done more than DirectStorage alone can do.


3 hours ago, Doobeedoo said:

Really, I'm surprised that DX12 (or Vulkan) still isn't used across the board in games. This one is just using the full latest feature set, and it's been out for a while too.

Back at GDC 2016, MS made it clear that D3D11 would continue to live alongside D3D12 for the foreseeable future. Apparently, even they were aware of the difficult path the new API would have to go through before gaining momentum.


5 minutes ago, DuckDodgers said:

Back at GDC 2016, MS made it clear that D3D11 would continue to live alongside D3D12 for the foreseeable future. Apparently, even they were aware of the difficult path the new API would have to go through before gaining momentum.

I'm aware of that, though that was meant primarily for lighter, already-established online games, and other non-AAA games that don't chase new tech. Obviously there's also the transition time and learning curve. But really, it's been quite a while, let's be honest.


23 hours ago, Alex Atkin UK said:

It's like the vocal minority moaning about Spider-Man 2 using RT in all modes. They aren't happy that it runs at "only" 60fps in performance mode.

 

Fact is, unless developers start utilising this newer tech, game development is going to stagnate. They need to be using the latest tech so that more silicon on future GPUs can be dedicated to it, making it more performant, until eventually we move on to 100% RT games - so developers can focus less on lighting, shadow and AO hacks, and more on the actual game mechanics.

 

It's nothing new; people moaned when programmable shaders came out because they were stealing silicon from making fixed-function shaders faster in new GPUs. It's tiring seeing the same arguments again, though granted most of the people moaning probably weren't born during the last major upgrade. They've become complacent with just more of the same, only faster, which we can't do any more as we're hitting technical limitations. So we have to find new shortcuts, which inherently means eventually no longer supporting the old ones - thus new GPUs required.

It's always kind of funny how this argument 180s and 360s every couple of years/decades.

 

If consoles are crappy like the PS4 and XBO, with tablet-level CPUs holding back game development for years, we get the scenario where even decade-old CPUs and 4-5 year old GPUs can run everything. PC becomes a huge value proposition over console, but game tech stagnates.

 

And then we have the current situation. Just like how it was in the late 90s and early 2000s, before console games impacted PC game development, you were expected to replace your entire computer after 1-2 years because it became e-waste and couldn't do programmable shaders like the shiny new GeForce 2... or was it 3?

 

Remember the timeline of DirectX before 9?

[image: DirectX release timeline]


The spec requirements do look punchy for what is a pretty niche game (or am I being unfair here?). But yeah, as the proper PS5/Xbox Series generation of games starts coming out, we should expect a large min-spec bump.

 

It will be interesting to see what the PC version of DirectStorage ends up looking like. Gen 5 NVMe SSDs? Or 32GB of VRAM on our graphics cards?

 

 

 


4 minutes ago, Monkey Dust said:

It will be interesting to see what the PC version of DirectStorage ends up looking like. Gen 5 NVMe SSDs? Or 32GB of VRAM on our graphics cards?

The PC version of DirectStorage is DirectStorage - or am I missing something here? If you're looking at the hardware, decent PCIe 4.0 SSDs are already in the ballpark of the PS5's own. More VRAM could be a workaround for not having the unified memory consoles have. I can't see the desktop going that direction any time soon, but the door is open for non-upgradable laptops.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


On 10/22/2023 at 6:27 PM, Alex Atkin UK said:

Which consoles of the past did you have in mind? Because for the most part, PC has been well ahead of consoles, with brute force that completely offset the differences.

Last generation the consoles were extremely weak in comparison.

The generation before that, the Xbox 360 was the primary target, which again wasn't all that great compared to PC, and was specifically designed to make porting from PC to console easier, rather than the other way around.

It's only the PS3 where a few developers were able to really tap the hardware, and those games never got ported to PC because of it. It's taken emulators nearly 20 years of PC improvements and software hacks to get close to running those games on PC.

The PS2 was similar: games which really optimised for the hardware didn't get PC ports. Games were also much simpler, so making concurrent versions for console and PC was more common.

The current consoles, however, are much more forward-thinking than we've seen for several generations.

 

I think you misunderstood me (I didn't do the best job of explaining, TBH, in hindsight). I was getting at how consoles today are more like PCs than older generations were, and yet developers still ported games from console to PC in those days (and vice versa).

 

The argument that developing for PC and consoles is somehow harder is, in light of that, a bunch of nonsense. If game developers in the past could do it, then modern game developers working with much more similar setups on both sides should find it far easier.

 

And it's not like the consoles are in any way special by high-end gaming PC standards today. The CPU and GPU leap over them is already so large that with any level of optimisation a high-end PC should demolish the consoles. But based on some of the stuff I've heard about this since my last post, it doesn't even run well on more advanced PC hardware, which just tells me they did a bad job of porting it in the first place.

 

Hell, based on what's in the OP about what the feature is used for in this game, it sounds like the developers just couldn't be bothered to implement a lower LoD level, which is basic optimisation stuff.

 

 

And that's the real problem here. Features like this are typically introduced gradually by the gaming industry. Imagine the outrage if, when Cyberpunk and Metro were released, they'd had no non-RTX lighting and shadow option - you had to have RTX to even play. That would not have gone over well. If Alan Wake 2 had said "you have to have DX12 Ultimate support for the highest settings", no one would have batted an eyelid; that's normal for advances in graphical fidelity. What isn't normal is for the new thing to be a requirement to play at all. That's where the anger is coming from. No other major game has even used this feature before, and now it's a minimum requirement. That's not how PC gaming has historically worked; backwards compatibility is a cornerstone of PC gaming.

 

 

@leadeater Like I said above, my whole point was that it was a lot harder to do, but it was still done. That's the other weird thing about this: high GPU prices have slowed adoption of later GPUs, which naturally means a lot of gamers are still on hardware that can't even play this game on low. What's the point of spending the money to port it at all if your sales base is so small? This just feels like a classic bad port job now that I've read some more on the matter.


15 minutes ago, CarlBar said:

 

I think you misunderstood me (I didn't do the best job of explaining, TBH, in hindsight). I was getting at how consoles today are more like PCs than older generations were, and yet developers still ported games from console to PC in those days (and vice versa).

The argument that developing for PC and consoles is somehow harder is, in light of that, a bunch of nonsense. If game developers in the past could do it, then modern game developers working with much more similar setups on both sides should find it far easier.

This is a common misconception. Just because consoles and PC are based on the same architectures does not necessarily make it that much easier to port, as how those parts work together is vastly different - especially on PlayStation, where the APIs are bespoke to the consoles and don't simply translate over to DirectX.

 

The Xbox 360 was by all accounts easy to port between PC and console despite using a completely different CPU architecture, because Microsoft specifically designed Xbox around Windows APIs - it was the whole reason for its name, after all. You largely just told it to compile for the target you wanted and it would run, though not necessarily run well.

 

If you read up on what was required to port Linux to the PS4, you'll find huge parts of how a PC functions are simply missing on consoles, because they aren't designed for backwards compatibility or to run a general-purpose OS. How multitasking is handled on the consoles is completely different to a PC, as it's a controlled environment - heck, on Xbox they use VMs to separate the core OS, front-end and games. While it's written in a way that makes ports from PC to Xbox easier, they won't necessarily run in a playable state without heavy optimisation. So simpler indie games are an easy port, but anything more complex can run poorly.

 

Conversely, you can't simply run a game designed around a unified memory system on a split-memory system like PC. So games written for PC first and then ported to console are a lot easier - although you can also end up with situations like Immortals of Aveum, where they overestimated how far they could optimise for the console, though that wasn't helped by the PC version not being great either.

 

Also, what games are doing today is orders of magnitude more complicated than they used to be, so saying "well, we used to be able to do it" is a false argument to begin with.

 

This is another misconception about what a "port" is. What they used to do is write completely different games from the ground up for each platform, not port the original game. That's simply not possible today, as games are too complex.


1 hour ago, Alex Atkin UK said:

This is a common misconception. Just because consoles and PC are based on the same architectures does not necessarily make it that much easier to port, as how those parts work together is vastly different - especially on PlayStation, where the APIs are bespoke to the consoles and don't simply translate over to DirectX.

The Xbox 360 was by all accounts easy to port between PC and console despite using a completely different CPU architecture, because Microsoft specifically designed Xbox around Windows APIs - it was the whole reason for its name, after all. You largely just told it to compile for the target you wanted and it would run, though not necessarily run well.

If you read up on what was required to port Linux to the PS4, you'll find huge parts of how a PC functions are simply missing on consoles, because they aren't designed for backwards compatibility or to run a general-purpose OS. How multitasking is handled on the consoles is completely different to a PC, as it's a controlled environment - heck, on Xbox they use VMs to separate the core OS, front-end and games. While it's written in a way that makes ports from PC to Xbox easier, they won't necessarily run in a playable state without heavy optimisation. So simpler indie games are an easy port, but anything more complex can run poorly.

Conversely, you can't simply run a game designed around a unified memory system on a split-memory system like PC. So games written for PC first and then ported to console are a lot easier - although you can also end up with situations like Immortals of Aveum, where they overestimated how far they could optimise for the console, though that wasn't helped by the PC version not being great either.

Also, what games are doing today is orders of magnitude more complicated than they used to be, so saying "well, we used to be able to do it" is a false argument to begin with.

This is another misconception about what a "port" is. What they used to do is write completely different games from the ground up for each platform, not port the original game. That's simply not possible today, as games are too complex.

I'm gonna say, being fully aware of the risk of how "wrong" I am... I think a big part of why this is happening is secrecy, weird habits, and especially the need to "encrypt" everything...

 

 

In my experience the best games are the ones with the least secrecy about how they work and no unnecessary compression etc., aka you can see and manipulate everything without the need for specialized "tools".

 

i.e. I'm pretty sure the Wii U version of Fatal Frame 5 has nearly no differences from the PC version, and yes, those games *do* scale automatically - doesn't matter if you give it 5GB or 2, they'll use *all* available resources "as needed".

 

I'm not saying this is the only issue, but I think it's a big part of why "bad ports" exist...

 

 


5 minutes ago, Mark Kaine said:

I'm gonna say, being fully aware of the risk of how "wrong" I am... I think a big part of why this is happening is secrecy, weird habits, and especially the need to "encrypt" everything...

In my experience the best games are the ones with the least secrecy about how they work and no unnecessary compression etc., aka you can see and manipulate everything without the need for specialized "tools".

i.e. I'm pretty sure the Wii U version of Fatal Frame 5 has nearly no differences from the PC version, and yes, those games *do* scale automatically - doesn't matter if you give it 5GB or 2, they'll use *all* available resources "as needed".

I'm not saying this is the only issue, but I think it's a big part of why "bad ports" exist...

DRM can definitely cause stutters and crashes, but I don't think it's a big factor in performance overall (though I object to rootkits on my PC leaving security holes).

 

Compression is mostly twofold:

1) So we are downloading 100GB games instead of 1TB games.

2) It's more efficient to transfer from storage and then decompress in fast memory. We've had texture compression for decades, for example, as it hugely reduces VRAM requirements and VRAM (and by extension, PCIe on PC) bandwidth usage. Compressed textures, as I understand it, are decompressed on the fly by the GPU, so textures take up less space in VRAM and are only decompressed when being rendered out to the scene. Unfortunately not everything can be this efficient, as anything that needs CPU processing has to sit uncompressed in main memory. (A rough sizing example follows below.)
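To put rough numbers on that - my own back-of-the-envelope math using the common BC1 block format, nothing specific to any one game:

#include <cstdio>

int main() {
    const long long w = 4096, h = 4096;

    // Uncompressed RGBA8: 4 bytes per texel.
    const long long rgba8 = w * h * 4;

    // BC1/DXT1: each 4x4 texel block packs into 8 bytes (0.5 bytes per
    // texel), and the GPU decodes blocks on the fly as it samples.
    const long long bc1 = (w / 4) * (h / 4) * 8;

    std::printf("RGBA8: %lld MiB\n", rgba8 / (1024 * 1024)); // 64 MiB
    std::printf("BC1:   %lld MiB\n", bc1 / (1024 * 1024));   // 8 MiB
    return 0;
}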

 

Compression is not the same as DRM/encryption; most (if not all?) games use some compression, and often there are official or unofficial tools to decompress those assets.

I'm no expert though, so I may be slightly wrong on the exact specifics of how this works. But I believe I understand the basic concepts here.


4 minutes ago, Alex Atkin UK said:

DRM can definitely cause stutters and crashes, but I don't think it's a big factor in performance overall (though I object to rootkits on my PC leaving security holes).

Compression is mostly twofold:

1) So we are downloading 100GB games instead of 1TB games.

2) It's more efficient to transfer from storage and then decompress in fast memory. We've had texture compression for decades, for example, as it hugely reduces VRAM requirements and VRAM (and by extension, PCIe on PC) bandwidth usage. Compressed textures, as I understand it, are decompressed on the fly by the GPU, so textures take up less space in VRAM and are only decompressed when being rendered out to the scene. Unfortunately not everything can be this efficient, as anything that needs CPU processing has to sit uncompressed in main memory.

Compression is not the same as DRM/encryption; most (if not all?) games use some compression, and often there are official or unofficial tools to decompress those assets.

I'm no expert though, so I may be slightly wrong on the exact specifics of how this works. But I believe I understand the basic concepts here.

DRM is part of it too, but what I mean is... simply by converting this stuff (call it compiling etc.) there are differences in how things work, and that's often part of what we call "optimization" too... I can't explain it much better, but in broader terms, the more simple - or dare I say elegant - a code/program is, the better it typically runs... or in other words, devs often just don't know any better: they don't know *why* they're compressing things, they don't know *how* to make code that works on multiple platforms... etc.

 

 

here's how Fatal Frame looks... nothing encrypted or whatever, all plain game files (I've not really seen it like this before, even though TBF I think some UE games look similar)

 

[screenshot: Fatal Frame's game directory - plain, browsable files]

you can just access, copy, alter, delete everything as you see fit ~

 

here's Monster Hunter World

 

[screenshot: Monster Hunter World's game directory - packed archives]

 

can't really access anything without decompressing it first, with the exception of the included "mod folder" (marked with !)

 

it's at least something, but a lot less elegant, and of course the whole game is like 10 times as big...

 

not saying it's a bad game, but it's more in line with how most games are structured - just not as elegant or straightforward.

 

 

and... I'm also saying this because while Fatal Frame is smaller in scope, I wouldn't say its graphics are worse - they are comparable at least - yet Fatal Frame uses almost no resources whatsoever. It literally runs on a potato (it's originally a Wii U game, after all).

 

and yes, you can pretty much literally put Wii U code in here and it'll run! the code looks exactly the same (from what I can tell!)

 


1 hour ago, Mark Kaine said:

it's at least something, but a lot less elegant, and of course the whole game is like 10 times as big...

The data size isn't any different - if anything it's smaller, since it's also being compressed. But to your point, it's in a data format that requires processing, which might be more work than if it were in XML format, for example. Also, instead of being in hundreds (or thousands) of XML files, it's in larger asset files, which take longer to read or use more memory to access a small dataset within them.

 

Btw, sometimes these things are done - at least in the past - to make accessing the data more optimal for disc media, since seek and access times for that media type are really bad, so lots of small XML files is actually vastly worse than reading out one larger file. It also helps to a degree for HDDs, since quick access to lots of files isn't great on that media type either, though it's way better than CD/DVD. (A sketch of the idea is below.)
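Roughly the idea, as a minimal sketch with a made-up index format (real engines each have their own archive layouts):

#include <cstdint>
#include <fstream>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical pack index: asset name -> (byte offset, size) in one big file.
struct PackEntry { uint64_t offset; uint64_t size; };

// One open handle, one seek, one sequential read per asset - far friendlier
// to optical discs and HDDs than opening thousands of tiny XML files.
std::vector<char> read_asset(std::ifstream& pack,
                             const std::unordered_map<std::string, PackEntry>& index,
                             const std::string& name) {
    const PackEntry& e = index.at(name);
    std::vector<char> buf(e.size);
    pack.seekg(static_cast<std::streamoff>(e.offset));
    pack.read(buf.data(), static_cast<std::streamsize>(e.size));
    return buf;
}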

 

Here's the thing though: regardless of how it's actually stored on disk and the processing required to access it, once that's done and it's in memory, it doesn't matter anymore. If it's game data that's needed all the time, it's going to stay in memory, so any pros/cons only apply to game loading time.

 

I think we often underestimate the full complexity of games and everything associated with them. What we might think matters a lot could be entirely irrelevant, and it's really hard to know 🤷‍♂️


52 minutes ago, leadeater said:

I think we often underestimate the full complexity of games and everything associated with them. What we might think matters a lot could be entirely irrelevant, and it's really hard to know 🤷‍♂️

that's probably true... but for example assembly code is (iirc) up to 100 times faster than BASIC... so it still does matter how code is written, etc.

 

and I was just saying the more complex something is, the slower it probably is - like if you need 100 lines for something that could be done in 10. Especially because things often do get loaded during runtime; it's not like we'd ideally wish for everything to get loaded at once. A couple of years ago that was the in thing to do - "streaming assets" - and it was absolutely terrible... likewise with Microsoft's "power of the cloud", which was even worse...

 

 

ultimately we often don't know, that's true, but it's still easy to speculate that the more efficiently something is written, the better it'll run... and to me it seems a lot of stuff nowadays is just bloated. As in my example - just from the looks of it - things are done wildly differently, with similar results, but one uses a lot fewer resources than the other.

 

Also, why aren't engines more geared towards multithreading and multiplatform? That should be the norm, but it's rather the exception.

 

i.e. these things can be done, but apparently it's often easier to just "port" stuff instead, with all the associated issues.

 

Maybe it's easier, but it's not smarter lol...


10 hours ago, iSynthMan said:

It's always kind of funny how this argument 180s and 360s every couple of years/decades.

If consoles are crappy like the PS4 and XBO, with tablet-level CPUs holding back game development for years, we get the scenario where even decade-old CPUs and 4-5 year old GPUs can run everything. PC becomes a huge value proposition over console, but game tech stagnates.

And then we have the current situation. Just like how it was in the late 90s and early 2000s, before console games impacted PC game development, you were expected to replace your entire computer after 1-2 years because it became e-waste and couldn't do programmable shaders like the shiny new GeForce 2... or was it 3?

Remember the timeline of DirectX before 9?

[image: DirectX release timeline]

DirectX versions before 5 were often not used, and DirectX versions after 9 were often not used if the game needed to work on iGPUs. The thing is, hardware T&L came out in 1997, and hardware that took advantage of it didn't arrive until the first GeForce model - that was the defining feature of the GeForce. DX7 is when we started moving away from fixed 3D pipelines to programmable ones.

 

DX9 is "programable vertex shaders" DX12 is "programable mesh shaders", which is highly threaded. They do not exist on NVidia hardware before the 16/20 series.

[image: meshlet pipeline comparison, from the NVIDIA blog linked below]

https://developer.nvidia.com/blog/introduction-turing-mesh-shaders/

https://www.khronos.org/blog/mesh-shading-for-vulkan

https://gpuopen.com/wp-content/uploads/slides/AMD_RDNA2_DirectX12_Ultimate_SamplerFeedbackMeshShaders.pdf

 

If you ever needed an excuse to upgrade to a 20/30/40-series GPU, this is it. AMD doesn't actually support mesh shaders natively on RDNA2; it's doing some driver-level mapping, so there are some limitations. Metal 3 (Apple) does support them though. (A sketch of how a PC app checks for mesh shader support is below.)
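For what it's worth, checking whether the GPU and driver expose mesh shaders on PC is a one-struct query in D3D12 - a minimal sketch (needs a recent Windows SDK; adapter enumeration and error handling trimmed):

#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Default adapter at the baseline feature level; real code enumerates adapters.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7)))) {
        std::printf("Mesh shaders: %s\n",
                    opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED
                        ? "supported" : "not supported");
    }
    return 0;
}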

 

To be real, mesh shaders are what everyone is in theory moving to, as Metal 3, Vulkan and DX12 Ultimate are all pushing them. But that means if you have a GPU more than a couple of generations old, you likely won't be able to get the game to run.


I'm wondering if this is a shock because it has been so long since we last had a mandatory hardware feature jump on the GPU side? I was expecting RT to be that jump, but now this has happened, will others follow?

 

I still have a 1080 Ti. It's still at what, 3060 level today? For raster, anyway. I know it won't be competitive for modern games, but I'll remember it as the best consumer-tier GPU of the pre-RT era.

