Alan Wake 2 won't run on old GPUs due to lack of DX12 Ultimate features

7 hours ago, Kisai said:

AMD doesn't actually support Mesh shaders natively on RDNA2

At least RDNA3 does though.

 

Some pretty nice info on RDNA2 for this if anyone wants to read it; it seems not too bad for not having native support (I think, anyway 🤷‍♂️).

https://timur.hu/blog/2022/how-mesh-shaders-are-implemented

https://timur.hu/blog/2022/how-task-shaders-are-implemented
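Both links are (I believe) about RADV, Mesa's open-source Vulkan driver, which is where the emulation lives. From an application's point of view the emulated and native paths look identical; you just query VK_EXT_mesh_shader. A minimal C++ sketch of that query, error handling mostly omitted:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>

// Minimal sketch: ask the first GPU whether it exposes VK_EXT_mesh_shader.
// Whether the driver implements it natively (RDNA3) or by mapping onto
// NGG primitive shaders (RDNA2 on RADV) is invisible at this level.
int main() {
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 1;
    VkPhysicalDevice gpu = VK_NULL_HANDLE;
    vkEnumeratePhysicalDevices(instance, &count, &gpu);  // first GPU only
    if (count == 0) return 1;

    VkPhysicalDeviceMeshShaderFeaturesEXT mesh{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MESH_SHADER_FEATURES_EXT};
    VkPhysicalDeviceFeatures2 features{
        VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2, &mesh};
    vkGetPhysicalDeviceFeatures2(gpu, &features);

    std::printf("meshShader: %u, taskShader: %u\n",
                mesh.meshShader, mesh.taskShader);
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```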

4 hours ago, porina said:

I'm wondering if this is a shock because it has been so long since we last had a mandatory hardware feature jump on the GPU side. I was expecting RT to be that jump, but now that this has happened, will others follow?

 

I still have a 1080 Ti. It's still what, 3060 level today? For raster anyway. I know it won't be competitive for modern games, but I'll still remember it as the best consumer-tier GPU of the pre-RT era.

Well, yeah.

 

It's pretty crazy that cards like the 980 were still very capable until now. Back in, say, 2013, you definitely didn't have cards from 2004 that could run modern games. They physically couldn't do it, not even at an awful framerate. They simply didn't have the technologies necessary to run them.

 

Modern games and GPUs have been really good about that for the past 10 years, even if that's largely due to pretty abysmal console hardware.

 

@Kisai how's mesh shader support on RDNA3 then? Couldn't find any info on it.

5 hours ago, porina said:

I'm wondering if this is a shock because it has been so long since we last had a mandatory hardware feature jump on the GPU side. I was expecting RT to be that jump, but now that this has happened, will others follow?

 

I still have a 1080 Ti. It's still what, 3060 level today? For raster anyway. I know it won't be competitive for modern games, but I'll still remember it as the best consumer-tier GPU of the pre-RT era.

Tessellation was 11 years ago, yeah, but even then many games had non-tessellated fallbacks (not all).
Kinda like how many RT games have baked-lighting fallbacks today.

So yeah, this kind of thing is not in the recent memory of PC gamers.

24 minutes ago, iSynthMan said:

Well, yeah.

 

It's pretty crazy that cards like the 980 were still very capable until now. Back in, say, 2013, you definitely didn't have cards from 2004 that could run modern games. They physically couldn't do it, not even at an awful framerate. They simply didn't have the technologies necessary to run them.

 

Modern games and GPUs have been really good about that for the past 10 years, even if that's largely due to pretty abysmal console hardware.

 

@Kisai how's mesh shader support on RDNA3 then? Couldn't find any info on it.

I'm assuming that since RDNA3 was released AFTER DX12 Ultimate was announced, the support is native. The issue with RDNA2 is that it supports mesh shaders by mapping the feature in the driver, so it isn't capable of running mesh shaders that min-max the feature.

 

[Attached screenshot, presumably an excerpt from the RDNA3 ISA document linked below]

https://www.amd.com/content/dam/amd/en/documents/radeon-tech-docs/instruction-set-architectures/rdna3-shader-instruction-set-architecture-feb-2023_0.pdf
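Whatever the driver does underneath, games only see the tier the card reports through CheckFeatureSupport. For reference, a minimal sketch of that query (Windows/D3D12, error handling omitted):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Minimal sketch: ask the default adapter which mesh shader tier it reports.
// DX12 Ultimate hardware (Turing+, RDNA2+) reports D3D12_MESH_SHADER_TIER_1;
// older cards report D3D12_MESH_SHADER_TIER_NOT_SUPPORTED.
int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7{};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7)))) {
        std::printf("Mesh shader tier: %s\n",
                    opts7.MeshShaderTier == D3D12_MESH_SHADER_TIER_1
                        ? "Tier 1" : "not supported");
    }
    return 0;
}
```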

 

There were also driver bugs initially, and bugs in 3DMark:

https://hardforum.com/threads/rdna-3-driver-23-1-2-fixes-amd-mesh-shader-performance-issue-7900-xt-xtx-update-maybe-not.2025374/

 

 

This console generation upped the bar. If you want to play on a $200-300 graphics card, wait a few more years until those cards are quite a bit faster than the consoles.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator

On 10/22/2023 at 8:51 PM, Techstorm970 said:

Dude. Stop. That's normal. That's completely normal. If your finances don't allow you to upgrade, then I'm sorry that's the case. In the meantime, you really shouldn't be expecting great performance in the latest-and-greatest AAA games anyway if your hardware is on the older side.

Well, at some point there was DX9 and DX11 legacy support in some software; we see less of that now. But of course it's understandable that some features are genuinely needed to make a game run. Not sure how it goes when it comes to Vulkan.

High teens at 1080p native, Low preset, on a GTX 1070 in the city, which is less demanding than the forest.

1080p Low preset in the forest, RTX 2060 - 30 fps
FSR Quality gets 40 fps

1080p Medium preset, RTX 3060 - 40 fps
FSR Quality gets 50 fps
FSR Performance gets 55-60 fps

1440p Medium, RTX 3060, FSR Balanced - 40 fps

1080p Medium, RTX 3070 - 60 fps
1080p DLSS Performance - 80 fps

1440p Medium, RTX 3070 - low to mid 40 fps
1440p DLSS Quality - high 50s
1440p Low, 3070, DLSS Quality - 65 fps

1080p High, RTX 4070 - 70 fps
1440p High, RTX 4070 - 50 fps
1440p High, DLSS Quality - 70 fps
4K High, DLSS Performance - 50 fps

1440p High, RTX 4090 - 100 fps
4K High, RTX 4090 - 60 fps
4K High, RTX 4090, RT High - 25 fps
4K High, RTX 4090, RT High, DLSS Quality - 45 fps
4K High, RTX 4090, RT High, DLSS Balanced - 50 fps
4K High, RTX 4090, RT High, DLSS Performance - 60 fps

Obviously not a proper benchmark, but it looks like the requirements are pretty spot on, and the game looks amazing.

On 10/22/2023 at 4:46 PM, Dabombinable said:

It really wouldn't be a properly playable experience on my 6700XT despite it supporting DX12 Ultimate, considering I expect to be able to play games at 1440p on it, and the devs recommend 1080p Medium w/ FSR Performance - AKA 540p.

People keep saying "oh, your 8-year-old card won't run it, quit bitching" when my 6700XT, which I purchased SPECIFICALLY because my GPU was 8 years old, can't run it either. That's poor optimization if a 2-year-old mid-range GPU can't run the game at 60 without rendering at 540p.
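For anyone wondering where "AKA 540p" comes from: FSR 2's documented per-axis scale factors are 1.5x (Quality), 1.7x (Balanced), 2.0x (Performance) and 3.0x (Ultra Performance), so a 1080p output in Performance mode renders internally at 960x540. A quick sketch of the arithmetic:

```cpp
#include <cstdio>

// Internal render resolution per FSR 2 quality mode, for a 1080p output.
// Per-axis scale factors are from AMD's FSR 2 documentation.
struct Mode { const char* name; float scale; };

int main() {
    const Mode modes[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };
    const int outW = 1920, outH = 1080;
    for (const Mode& m : modes) {
        // Performance mode: 1920/2 x 1080/2 = 960 x 540
        std::printf("%-17s -> %4d x %4d\n", m.name,
                    (int)(outW / m.scale), (int)(outH / m.scale));
    }
    return 0;
}
```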

10 hours ago, DANK_AS_gay said:

People keep saying "oh, your 8-year-old card won't run it, quit bitching" when my 6700XT, which I purchased SPECIFICALLY because my GPU was 8 years old, can't run it either. That's poor optimization if a 2-year-old mid-range GPU can't run the game at 60 without rendering at 540p.

Let's be realistic here for a moment. The GTX 1080 Ti is 7 years old. It's expected to perform less than optimally now. The fact that we've progressed so little in 7 years is not to blame on game developers but on the GRAPHICS CARD industry and its stagnant bullshittery. Just in the last 3 years the price of top-end cards has doubled (800€ used to be top end, now it's 1600€), mid-range cards are over a third more expensive (they used to be 300€, now they're basically 450+ €), and the low end is now 300€ and those cards are literal e-waste. You can't run ANYTHING on them at an acceptable framerate except those ugly "e-sports" games people run at Ultra Low so they have 3000000000000 fps. But somehow we haven't really progressed performance-wise beyond upscaling, which should be a performance bonus, not something we have to rely on just to get an acceptable framerate.

 

So blaming game companies for wanting to excite gamers with awesome graphics is lame. And it's no secret that Remedy always pushes the boundaries. Remember Max Payne? It looked amazing, and they even made a benchmark with it in 3DMark. Alan Wake 1, despite being a rather mediocre game in my opinion, was incredible visually. Dynamic lighting partially integrated into gameplay was pretty sick and it looked amazing. Have people also forgotten about CONTROL? It's pretty demanding with RT, and it was borderline playable for me on an RTX 3080, but I did have it maxed out with RT and all that whizz. Did it look great? Absolutely. Was it running great? Not so much. Should I blame Remedy for it? No.

 

People have become too accustomed to a situation where games just ran at 100 fps on a potato graphics card and looked decent enough. Not gonna complain, that's great, but when I can appreciate amazing graphics from all the features, I'll take that as well. And don't forget that developers pushing such boundaries are actually pissing in their own bowl of cereal: you're exciting owners of top-end hardware but excluding users of low-end hardware. Guess which are the majority. Hint: it ain't the top-end hardware. So you're shrinking the circle of potential buyers of your product because it's too demanding. Kind of a bad business move, but Remedy always goes for the high end. Similar to Crytek. And id Software used to be the same, but hasn't been for a while; I'll give them that Doom 2016 and Doom Eternal look and run amazingly well, which is a rare thing to see.

4 hours ago, RejZoR said:

Let's be realistic here for a moment. The GTX 1080 Ti is 7 years old. It's expected to perform less than optimally now. The fact that we've progressed so little in 7 years is not to blame on game developers but on the GRAPHICS CARD industry and its stagnant bullshittery. Just in the last 3 years the price of top-end cards has doubled (800€ used to be top end, now it's 1600€), mid-range cards are over a third more expensive (they used to be 300€, now they're basically 450+ €), and the low end is now 300€ and those cards are literal e-waste. You can't run ANYTHING on them at an acceptable framerate except those ugly "e-sports" games people run at Ultra Low so they have 3000000000000 fps. But somehow we haven't really progressed performance-wise beyond upscaling, which should be a performance bonus, not something we have to rely on just to get an acceptable framerate.

 

So blaming game companies for wanting to excite gamers with awesome graphics is lame. And it's no secret that Remedy always pushes the boundaries. Remember Max Payne? It looked amazing, and they even made a benchmark with it in 3DMark. Alan Wake 1, despite being a rather mediocre game in my opinion, was incredible visually. Dynamic lighting partially integrated into gameplay was pretty sick and it looked amazing. Have people also forgotten about CONTROL? It's pretty demanding with RT, and it was borderline playable for me on an RTX 3080, but I did have it maxed out with RT and all that whizz. Did it look great? Absolutely. Was it running great? Not so much. Should I blame Remedy for it? No.

 

People have become too accustomed to a situation where games just ran at 100 fps on a potato graphics card and looked decent enough. Not gonna complain, that's great, but when I can appreciate amazing graphics from all the features, I'll take that as well. And don't forget that developers pushing such boundaries are actually pissing in their own bowl of cereal: you're exciting owners of top-end hardware but excluding users of low-end hardware. Guess which are the majority. Hint: it ain't the top-end hardware. So you're shrinking the circle of potential buyers of your product because it's too demanding. Kind of a bad business move, but Remedy always goes for the high end. Similar to Crytek. And id Software used to be the same, but hasn't been for a while; I'll give them that Doom 2016 and Doom Eternal look and run amazingly well, which is a rare thing to see.

The 6700XT is the Xbox Series X GPU (actually it's the 6700 non-XT). Last I checked, the game isn't running at 540p on the Xbox Series X. That comes down to optimization. I could also run Control on my 1070 Ti at roughly 80 fps at medium-low settings. I get that this is a large advancement in gaming, but that doesn't mean one-generation-old GPUs should struggle to run it. There's not a lot of visual improvement for such a massive performance drop. As of right now, the fault is on both the GPU market and the devs.

10 minutes ago, DANK_AS_gay said:

The 6700XT is the Xbox Series X GPU (actually it's the 6700 non-XT). Last I checked, the game isn't running at 540p on the Xbox Series X. That comes down to optimization. I could also run Control on my 1070 Ti at roughly 80 fps at medium-low settings. I get that this is a large advancement in gaming, but that doesn't mean one-generation-old GPUs should struggle to run it. There's not a lot of visual improvement for such a massive performance drop. As of right now, the fault is on both the GPU market and the devs.

As mentioned, RDNA2 doesn't support the full mesh shader feature set natively, so while it works, it likely has to run in a more nerfed state.

 

Targeting a console maybe means not using mesh shaders at the level the PC version does. Who knows. We can only speculate that they have to make different compromises for the lack of unified memory on the PC.
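Purely as speculation, this is the kind of startup split a cross-platform renderer could do: take the mesh shader path when the reported tier allows it, otherwise fall back to the classic vertex pipeline. A hypothetical sketch, not Remedy's actual code (AW2 evidently ships without the fallback, which is why pre-DX12U cards are locked out):

```cpp
#include <d3d12.h>

// Hypothetical geometry-path selection at renderer startup.
// A game that *requires* mesh shaders would simply refuse to start
// when the tier check fails, rather than taking the fallback branch.
enum class GeometryPath { MeshShaders, VertexPipeline };

GeometryPath PickGeometryPath(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7{};
    const bool ok = SUCCEEDED(device->CheckFeatureSupport(
        D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7)));
    if (ok && opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
        return GeometryPath::MeshShaders;   // per-meshlet GPU culling
    return GeometryPath::VertexPipeline;    // classic IA + vertex shader
}
```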

 

Watching now. Note this only covers raster; RT will be covered in a later, separate video.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible
