
Alan Wake 2 won't run on old GPUs due to lack of DX12 Ultimate features

The upcoming sequel to Alan Wake is already famous for its rather outlandish PC system requirements, but that's not all. A large swath of PC systems will simply not be able to run the game at all. 

 


Quote

As you can see, neither NVIDIA's GTX 10 Series nor AMD's Radeon RX 5000 Series are supported. If you're wondering why, Remedy's Lea-Newin revealed that it's because those graphics cards do not feature hardware support for Mesh Shaders.

 

The original tweet was deleted, but DSO Gaming captured a snapshot. Moreover, the Remedy developer subsequently said a talk or blog post about the Mesh Shaders implementation in Alan Wake 2 could be on the way.

 

You could easily be forgiven for thinking this technology is new, given how little time it's been in the spotlight. However, NVIDIA first demonstrated Mesh Shading with the Asteroids demo released in December 2018. Mesh Shading aims to reinvent the geometry pipeline by bringing the benefits of the compute programming model to the front end of the graphics pipeline. Thanks to advanced culling and pre-culling, it can also improve performance and/or allow much higher levels of geometry detail. Mesh Shading was subsequently added to Microsoft's DirectX 12 Ultimate. The first NVIDIA driver certified for Mesh Shading (and the other DX12 Ultimate features) rolled out in April 2020.
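For those curious how a game enforces this: mesh shader support is exposed through the D3D12 feature-check API, so a title can refuse to start when the reported tier comes back unsupported. Presumably Alan Wake 2 does something along these lines; here is a minimal C++ sketch, assuming an already-created ID3D12Device, with error handling trimmed:

```cpp
#include <d3d12.h>

// Returns true if the GPU/driver expose hardware mesh shaders.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
    {
        // A runtime/driver that predates OPTIONS7 has no mesh shader support.
        return false;
    }
    // Pascal (GTX 10) and RDNA1 (RX 5000) report TIER_NOT_SUPPORTED here,
    // which is why the game won't run on them; Turing/RDNA2 and newer report TIER_1.
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}
```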

 

Here's some supplementary material on the expected PC graphics card compatibility for the game:

 

[Image: expected PC graphics card compatibility chart for Alan Wake 2]

 

Source:

https://wccftech.com/alan-wake-2-dev-says-gtx-10-and-5000-gpus-arent-supported-due-to-lack-of-mesh-shaders/


God, remember when just 7 years ago all PC gamers did was bitch and moan about how consoles were holding gaming back?

Now that a game comes out and uses the new DX features as a baseline, they bitch and moan that they are the ones holding gaming back.

Gamers, please note: you don't have to play it when it comes out. The game will be just as valid in a year when you buy a Lovelace Next xx60 part to replace your 8-year-old 1060,

or in 3 years when you buy a Lovelace Next Next xx60 part to replace your 1660.


28 minutes ago, starsmine said:

God, remember when just 7 years ago all PC gamers did was bitch and moan about how consoles were holding gaming back?

Now that a game comes out and uses the new DX features as a baseline, they bitch and moan that they are the ones holding gaming back.

Gamers, please note: you don't have to play it when it comes out. The game will be just as valid in a year when you buy a Lovelace Next xx60 part to replace your 8-year-old 1060,

or in 3 years when you buy a Lovelace Next Next xx60 part to replace your 1660.

I saw this coming a mile away. I was laughed at when I said that once games started being optimised for the current consoles, PC gamers would need to upgrade as their hardware would not be able to keep up.

 

Mark Cerny was quite clear, for example, that the PS5's hardware decompression block would have taken 6 of the PS5's CPU cores to replicate in software. It was clear this would mean PC users would need faster and more cores to compensate, but PC users just blame it on poor optimisation. Which, sure, is an element, but if that optimisation means completely rewriting the engine then of course there will be compromises where the cost would otherwise outweigh releasing it on PC at all. Plus, if you DO make those optimisations, then naturally that's likely to mean DX12 Ultimate.

 

I'm actually amazed we managed to get Ratchet & Clank Rift Apart on PC without GPU decompression. Even with GPU decompression, it's inherently less efficient on PC due to bottlenecks moving data between the GPU and system RAM, whereas the PS5 has the benefit of unified memory, so once it's loaded into main memory, both GPU and CPU can access it.

 

We're returning to a time where consoles can punch above their weight due to more efficient integration of their architecture. Yes, PCs still have faster GPUs, but it's a lot harder to utilise that in a console port as we have very different bottlenecks. Shuttling data between system RAM and VRAM is expensive.

It's rather like complaining that people should be able to get to their destination just as quickly on the back roads as on the motorway/highway. It's completely unrealistic.


Recent PC game launches have landed like a wet fart. Some of them will reach redemption later on and improve with age (Cyberpunk 2077 and Halo Infinite, for example), others not so much. Will Starfield reach redemption without the need for mods? Who knows.

Buying the latest GPU can be viewed as a way to play last-gen PC games that have reached a polished state, and just-launched titles in a shitty quasi-beta state. It's pathetic, really. Console gamers have it good.


2 minutes ago, Alex Atkin UK said:

I saw this coming a mile away. I was laughed at when I said that once games started being optimised for the current consoles, PC gamers would need to upgrade as their hardware would not be able to keep up.

 

Mark Cerny was quite clear, for example, that the PS5's hardware decompression block would have taken 6 of the PS5's CPU cores to replicate in software. It was clear this would mean PC users would need faster and more cores to compensate, but PC users just blame it on poor optimisation. Which, sure, is an element, but if that optimisation means completely rewriting the engine then of course there will be compromises where the cost would otherwise outweigh releasing it on PC at all. Plus, if you DO make those optimisations, then naturally that's likely to mean DX12 Ultimate.

 

I'm actually amazed we managed to get Ratchet & Clank Rift Apart on PC without GPU decompression. Even with GPU decompression, it's inherently less efficient on PC due to bottlenecks moving data between the GPU and system RAM, whereas the PS5 has the benefit of unified memory, so once it's loaded into main memory, both GPU and CPU can access it.

 

We're returning to a time where consoles can punch above their weight due to more efficient integration of their architecture. Yes, PCs still have faster GPUs, but it's a lot harder to utilise that in a console port as we have very different bottlenecks. Shuttling data between system RAM and VRAM is expensive.

 

This isn't about upgrading hardware for optimisation though, it's about upgrading hardware to play at all. PC gaming has generally had a thing where systems significantly behind the curve could still play new games by turning the settings way down. And in this specific instance I think the problem is being made worse because GPU prices have been so elevated for so long.

 

Personally I've never heard of this game, so I've no idea how big of a deal it is amongst gamers, but this is a lot more than just consoles allowing games to push further.


Like it or not, when a game is fully optimised for console, it's not easily scalable to PC. Xbox's main benefit is that developers can skip such optimisation so the game will scale to both, at the cost of not performing as well as it could on Xbox if fully optimised.

With so many developers being absorbed into big conglomerates, the bean counters just don't want to spend the money for the game to be so heavily rewritten around the PC's limitations (compared to console).


2 minutes ago, CarlBar said:

 

This isn't about upgrading hardware for optimisation though, it's about upgrading hardware to play at all. PC gaming has generally had a thing where systems significantly behind the curve could still play new games by turning the settings way down. And in this specific instance I think the problem is being made worse because GPU prices have been so elevated for so long.

 

Personally I've never heard of this game, so I've no idea how big of a deal it is amongst gamers, but this is a lot more than just consoles allowing games to push further.

You're missing my point. I wasn't referring to optimisation in the sense of "it performs better", I was referring to the optimisation needed to make the game run at all on PC if it's designed around hardware-accelerated decompression and a single pool of RAM.

The same code simply won't work on PC at all without major modifications.


8 minutes ago, Alex Atkin UK said:

I'm actually amazed we managed to get Ratchet & Clank Rift Apart on PC without GPU decompression.  Even with GPU decompression, its inherently less efficient on PC due to bottlenecks moving between the GPU and system RAM, whereas the PS5 has the benefit of that unified memory so once its loaded into main memory, GPU and CPU can access it.

It also supports DirectStorage, in which decompression occurs on the GPU. FPS is about the same with and without, but with GPU decompression the 1% lows are vastly improved. Your average gamer probably wouldn't notice.


9 minutes ago, StDragon said:

It also supports DirectStorage, in which decompression occurs on the GPU. FPS is about the same with and without, but with GPU decompression the 1% lows are vastly improved. Your average gamer probably wouldn't notice.

It also will have required a lot of reworking to do that, as the compression the PS5 uses is not supported on PC GPUs. Plus, like I said, GPU decompression on PC still has to move the data between system RAM and VRAM, eating up PCIe bandwidth and adding latency.

There's also the fact it was an early game, clearly not pushing the PS5 to its limits. When we finally get Spider-Man 2, I think the requirements are going to be a LOT higher.

We're also seeing the reverse problem in games using UE5, where the engine is more optimised for how PCs work, so it doesn't scale well to console.

With all the recent layoffs by the bigger conglomerates, this is not boding well. They don't want to spend the money to make good ports. I think the huge migration to UE5 happened because they thought Epic would put in all the effort to automatically scale games between platforms, but so far we're seeing that this is clearly not the case, with even Epic losing money now.


17 minutes ago, Alex Atkin UK said:

It also will have required a lot of reworking to do that, as the compression the PS5 uses is not supported on PC GPUs. Plus, like I said, GPU decompression on PC still has to move the data between system RAM and VRAM, eating up PCIe bandwidth and adding latency.

The issue is with the OS itself. The GPU can't access the NTFS volume directly, especially if it's encrypted with BitLocker. All DirectStorage requests must go through the CPU so the fetching can be done by NTFS.SYS; direct block-level access to a volume isn't supported. Perhaps one day some revision of DirectStorage will allow a dedicated NVMe drive where game assets are cached just for direct GPU-to-block-level access. But for now, DirectStorage in its current form is still useful, primarily because of the massively parallel I/O requests and GPU decompression.

PCs rarely jump to revolutionary paradigms; instead they take the long road of iterating toward them, maintaining compatibility while all the hardware and software in the chain catch up to where things ultimately need to be.

Note: You can install a game that uses DirectStorage on a volume that's encrypted with BitLocker. Just making the point that it all goes through the CPU anyway.
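To make the flow above concrete, here's a hedged C++ sketch of a single DirectStorage read with GPU (GDeflate) decompression. The file name "assets.bin", the sizes, and the pre-created device/buffer/fence are hypothetical placeholders; error handling is omitted:

```cpp
#include <dstorage.h>    // DirectStorage 1.1+, which added GDeflate
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadAsset(ID3D12Device* device, ID3D12Resource* gpuBuffer,
               ID3D12Fence* fence, UINT64 fenceValue,
               UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // One queue can batch many requests before a single Submit();
    // this is where the "massively parallel I/O" benefit comes from.
    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.bin", IID_PPV_ARGS(&file)); // hypothetical asset pack

    // The file read is still serviced by the CPU/NTFS stack (see above);
    // only the decompression runs on the GPU, straight into a D3D12 buffer.
    DSTORAGE_REQUEST request = {};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = compressedSize;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue); // fires once the data is resident
    queue->Submit();
}
```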


I'll just throw in this observation in case people missed it.

 

Minimum = Low

Recommended = Medium

Ultra = High

 

In other words, this game's High preset is its Ultra; the step-down for the masses is Medium, not High.


30 minutes ago, Alex Atkin UK said:

You're missing my point. I wasn't referring to optimisation in the sense of "it performs better", I was referring to the optimisation needed to make the game run at all on PC if it's designed around hardware-accelerated decompression and a single pool of RAM.

 

The same code simply won't work on PC at all without major modifications.

 

And? That's been true to an even greater degree for consoles in the past; it never stopped developers developing and releasing on PC. The whole "it would be more work" argument just doesn't fly when doing that extra work is considered normal for releasing on a platform. Hell, look at all the developers releasing on Switch, that's an even bigger jump.

 

10 minutes ago, porina said:

I'll just throw in this observation in case people missed it.

 

Minimum = Low

Recommended = Medium

Ultra = High

 

In other words, this game's High preset is its Ultra; the step-down for the masses is Medium, not High.

 

The issue seems to be (based on the OP) that, no matter the settings, if you're in the not-compatible bracket it won't run at all, even on low.


Alan Wake wasn't exactly a global shocker that awed the entire world. It was a rather niche game, and while it looked nice (especially the dynamic lights and shadows), it was pretty repetitive. So why give it such insane requirements now? I remember the days when we bought new graphics cards just to run the latest Quake game in all its glory. But those days are long gone, and now hardly anyone buys a new high-end graphics card just to experience one game fully.

Which is why most optimize the engine to run on as many graphics cards, and as fast, as possible. I think one of the best showcases of that is actually Doom 2016, and partially Doom Eternal. They both look really good, but at the same time run on near potato-level hardware. Are we really returning to the days of "it either looks good and runs like poo, or it runs great and looks like poo"? Because most current games look really good even at Low settings, running on ancient stuff like a GTX 1060 or RX 580.


11 minutes ago, CarlBar said:

And? That's been true to an even greater degree for consoles in the past; it never stopped developers developing and releasing on PC. The whole "it would be more work" argument just doesn't fly when doing that extra work is considered normal for releasing on a platform. Hell, look at all the developers releasing on Switch, that's an even bigger jump.

Which consoles of the past did you have in mind? Because for the most part, the PC has been so far ahead of consoles in brute force that it completely offset the differences.

Last generation the consoles were extremely weak in comparison.

The generation before that, the Xbox 360 was the primary target, which again wasn't all that great compared to PC and was specifically designed to make porting from PC to console easier, rather than the other way around.

It's only the PS3 where a few developers were able to really tap the hardware, and those games never got ported to PC because of it. It's taken emulators 20 years of PC improvements and software hacks to get close to running those games on PC.

The PS2 was similar: games which really optimised for the hardware didn't get PC ports. Games were also much simpler, so making concurrent versions for console and PC was more common.

The current consoles, however, are much more forward-thinking than anything we've had for several generations.


If it makes the game look better, I'm all for it.

It's a shame for those that rock older hardware, for sure, but I'm sick of improvements being held back by people holding onto aging technology.

 


It's like the vocal minority moaning about Spider-Man 2 using RT in all modes. They aren't happy that it runs at "only" 60fps in performance mode.

Fact is, unless developers start utilising this newer tech, game development is going to stagnate. They need to be using the latest tech so that more silicon on future GPUs can be dedicated to it, making it more performant, until eventually we move on to 100% RT games and developers can focus less on lighting, shadow and AO hacks, and more on the actual game mechanics.

It's nothing new: people moaned when programmable shaders came out because they were stealing silicon from making fixed-function shaders faster in new GPUs. It's tiring seeing the same arguments again, though granted, most people moaning probably weren't born during the last major upgrade. They've become complacent with just more of the same, only faster, which we can't do any more as we're hitting technical limitations. So we have to find new shortcuts, which inherently means eventually no longer supporting the old ones, and thus new GPUs are required.


21 minutes ago, CarlBar said:

And? That's been true to an even greater degree for consoles in the past; it never stopped developers developing and releasing on PC. The whole "it would be more work" argument just doesn't fly when doing that extra work is considered normal for releasing on a platform. Hell, look at all the developers releasing on Switch, that's an even bigger jump.

Kind of; lots of console games never came to PC, and a lot of them simply couldn't at the time. Square/Square Enix for a long time would make a bespoke game engine specific to that generation's console hardware (PlayStation), and it would literally only ever work on PlayStation. Games of that era that have eventually gotten PC ports essentially run through software emulators and run like garbage for what they are, only made possible through the sheer force of generations-newer performance just to make playable what was possible ~8 years prior on the console.

And that's not because the console of 8 years ago was that much faster than PCs of the time (sometimes yes, compared to the average or even higher end); it's simply down to how the engine and game were designed in the first place.

The bad PC port reputation started because of these issues back before consoles were x86, and the situation has actually improved since the migration to x86 consoles.

So a lot of games and developers were stopped in the past from releasing on PC by these problems. You used to only be able to reliably count on the likes of CoD etc. being available across platforms, and the PC version was at its core different from the console version, which required lots of resources (people, time and money). But also, let's not forget just how massive the gaming market is and how much money can be made. Where the economics support it, games will get support on as many platforms as makes sense no matter the technical challenges; the mighty dollar always wins out eventually (CoD on Wii).


1 minute ago, Alex Atkin UK said:

…though granted, most people moaning probably weren't born during the last major upgrade. They've become complacent with just more of the same, only faster, which we can't do any more as we're hitting technical limitations. So we have to find new shortcuts, which inherently means eventually no longer supporting the old ones, and thus new GPUs are required.

The feature set hasn't moved appreciably in a long time, and even today a lot of games still use DX11. For a decade, gamers didn't have to take heed of features and instead just looked at performance; even when new features were employed, they were optional.
 

This is also part of why VRAM is such a sore spot as games move beyond 8 GB, since it goes against the long-held convention of performance being the only consideration for a card. We've now seen cases where the 12 GB RTX 3060 has outperformed the 8 GB 3060 Ti and even the 3070 when VRAM is pressured.
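As an aside, the VRAM budget an engine actually has to work with is queryable at runtime through DXGI, which is how a game can decide to drop texture quality instead of stuttering when an 8 GB card comes under pressure. A minimal C++ sketch, assuming Windows with DXGI 1.4 and taking the first adapter, error handling omitted:

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // first (usually primary) GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);                 // QueryVideoMemoryInfo lives here

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget = how much VRAM the OS will currently let this process use;
    // CurrentUsage = how much it is using right now.
    printf("VRAM budget: %llu MiB, in use: %llu MiB\n",
           (unsigned long long)(info.Budget / (1024 * 1024)),
           (unsigned long long)(info.CurrentUsage / (1024 * 1024)));
    return 0;
}
```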


1 hour ago, Alex Atkin UK said:

Like it or not, when a game is fully optimised for console, it's not easily scalable to PC.

I'm not sure that's true...

 

I think it's more a design philosophy thing (read: they're lazy af). Also, you're implying the game runs *perfectly* on consoles, and I don't know if it does, but typically these sorts of games don't.

 

13 minutes ago, Alex Atkin UK said:

It's like the vocal minority moaning about Spider-Man 2 using RT in all modes. They aren't happy that it runs at "only" 60fps in performance mode.

That's another, yet similar, thing; I've seen people complain about a "bad port" even though on average the game actually ran better on PC than on consoles...

 

14 minutes ago, Alex Atkin UK said:

It's nothing new: people moaned when programmable shaders came out because they were stealing silicon from making fixed-function shaders faster in new GPUs.

Agreed, but I feel nowadays people are looking for any reason to upgrade at least twice a year, so this sort of "AAA business" is probably a very welcome trend to them.

 

PLUS they get to bitch about it on the internet!  Perfect! 🙂

 

 


7 minutes ago, Zodiark1593 said:

The feature set hasn't moved appreciably in a long time, and even today a lot of games still use DX11. For a decade, gamers didn't have to take heed of features and instead just looked at performance; even when new features were employed, they were optional.
 

This is also part of why VRAM is such a sore spot as games move beyond 8 GB, since it goes against the long-held convention of performance being the only consideration for a card. We've now seen cases where the 12 GB RTX 3060 has outperformed the 8 GB 3060 Ti and even the 3070 when VRAM is pressured.

That's my point: it's been a long time since we've really had a generational shift in render technology BECAUSE they didn't want to force people to upgrade.

However, with the current consoles and UE5 being more forward-thinking, things have to change.

 

4 minutes ago, Mark Kaine said:

Agreed, but I feel nowadays people are looking for any reason to upgrade at least twice a year, so this sort of "AAA business" is probably a very welcome trend to them.

 

PLUS they get to bitch about it on the internet!  Perfect! 🙂

What you have to remember is that the people moaning on the Internet are not the majority (and not just in gaming).

I'm not sure a single person on any of my gaming friends lists has ever said a single thing about gaming; they just get on with it.


I remember when games started requiring GPU hardware T&L support; I bought the latest Medal of Honor game and it wouldn't launch at all, with a prompt saying "Hardware T&L support required".

I'm surprised, and in a way sad, that this doesn't happen more. It literally means lots of generation-specific hardware GPU features aren't being used, or aren't being fully leveraged, so that older hardware can stay supported.

Those that care and find this annoying should just be thankful it doesn't happen more often, which it should.


UE5 is the gold standard for engines; if only more titles would utilize it. If developers stopped reinventing the game engine (unless theirs is really that much better than UE5 and Unity) and focused on content and play mechanics, we wouldn't have the shitty launches we have now.


6 hours ago, StDragon said:

UE5 is the gold standard for engines; if only more titles would utilize it. If developers stopped reinventing the game engine (unless theirs is really that much better than UE5 and Unity) and focused on content and play mechanics, we wouldn't have the shitty launches we have now.

I'll forever support Square Enix doing it haha, they actually do a decent job of it (despite delays from nobody agreeing on what it needs to be).

 

Luminous Engine back in 2015 was so much better than anything else and it took ages for anything to match it, problem was... Square Enix lol


"wAaAaaAaH mY 7 yEaR oLd hArDwArE wOn'T rUn sHiNy nEw tHiNg wAaAaAaAaH!"

 

Dude.  Stop.  That's normal.  That's completely normal.  If your finances don't allow you to upgrade then I'm sorry that's the case.  In the meantime, you really shouldn't be expecting great performance on the latest-greatest AAA games anyway if your hardware is on the older side.

 

This literally reminds me of all the games that came out in 2015-2016, when "current-gen GTX 970 recommended, GTX 660 minimum" was becoming normal. The GTX 660 was two generations & a tier behind the GTX 970. And now, with Alan Wake II, an RTX 2060 is two generations & a tier behind the RTX 4070.

 

Mad at the pricing?  Welcome to the club.  Besides, that's an NVIDIA problem, not an Alan Wake developer problem.


1 hour ago, Alex Atkin UK said:

I'm not sure a single person on any of my gaming friends lists has ever said a single thing about gaming; they just get on with it

I'm not sure I understand... what do you talk about if not games? I get there are more subjects than that, obviously, but with the vast majority of people I play games with, or who are on any of my "friends lists"... we talk about games, and new trends in gaming, etc. *all the time*

 

 

