Forspoken PC Requirements Announced | RTX 4080 Recommended for 4K

AlTech

Summary

 

Square Enix has announced the PC requirements for Forspoken, and many are dismayed by how incredibly high they are.

 

The requirements were shared in image form by the developers:

[Image: official Forspoken PC system requirements chart]

 

The main areas of concern are the 16GB minimum RAM requirement as well as the resolution and frame rate targets shown for each requirement column (720p 30fps for Minimum and 1440p 30fps for Recommended).

 

Requiring an RTX 3070 or RX 6700XT for 1440p 30fps is insane.

 

A 720p 30fps minimum requiring what amounts to RX 580-class performance is also shocking to some.

 

Many have speculated this is due to the game lacking optimization, as the PS5 version is upscaled from 1259p to 4K and fails to maintain 30fps.

 

4K also requiring 32GB RAM is yet another area of concern.

 

Quotes

Quote

 These are surprisingly demanding system requirements; 16GB of RAM is a fairly high total for base specs, considering that the game only displays at 720p/30fps. Even more surprising are the recommended requirements that ask for powerful GPUs, but only promise 1440p/30fps. More alarming, 60fps is the exclusive playground of top-tier PC builds.

 

My thoughts

The requirements seem quite insane but it looks like the game is unoptimized in general. Hopefully the reviews will prove or disprove this.

 

That the review embargo is so close to the game's launch rather than weeks before is another red flag.

 

Sources

 https://www.pcmag.com/news/can-your-pc-run-forspoken

 

https://m.timesofindia.com/gadgets-news/forspoken-pc-requirements-heres-what-you-need-to-play-the-game/articleshow/97099909.cms

 

 


good thing i don't play 4K nor do i buy from SE


The thing that confuses me is the fact that they say both a 6800 XT and a 4080 will do 4K60, those cards aren't really that close in terms of raw performance. Are they horribly unoptimized for Nvidia GPUs specifically, or will this be a game that actually uses a full 16GB of VRAM and they couldn't recommend a 3080 because it only has 12GB at most?


These are the funniest system requirements in my recent memory. The game is going to have horrible performance, mark my words. I'd say no DLSS means I'm not buying it, but who are we kidding, I would never buy something that looks this terrible, from gameplay to everything else, this close to release anyway. I can already imagine the field day Digital Foundry will have with this on release.


Keep in mind that, from time to time, the listed specs for a game are a best guess from people who know little to nothing about PCs outside of general usage. They might have a workstation in the office to base something on and fill in the rest from research on other parts (from online sources that may or may not be accurate); beyond their internal spec, all the other tiers are a crapshoot based on non-real-world usage.

Don't always believe or purchase based on Recommended Specs, or these kinds of images, until the game is actually tested by a wider range of users within the day or week of launch.


38 minutes ago, RONOTHAN## said:

The thing that confuses me is the fact that they say both a 6800 XT and a 4080 will do 4K60, those cards aren't really that close in terms of raw performance. Are they horribly unoptimized for Nvidia GPUs specifically, or will this be a game that actually uses a full 16GB of VRAM and they couldn't recommend a 3080 because it only has 12GB at most?

This was my first thought too. I'm going with the VRAM theory. At the mid range, the 6700XT and 3070 are ballpark comparable in performance. It also presumes 8GB VRAM is sufficient. Low end, 6GB VRAM is OK. It doesn't look like it cares much for CPU either if it scales down to ancient quad cores.

 

So that leaves the Ultra GPU and system RAM requirements as a bit odd. Are they going really asset-heavy on this one? I think that's the only way to explain it if this is accurate.


39 minutes ago, AluminiumTech said:

 

 

Many have speculated this is due to the game lacking optimization, as the PS5 version is upscaled from 1259p to 4K and fails to maintain 30fps.

 

That's the go-to cry of people who play games on a potato. "It's not optimized"

 

I've literally seen this complaint about EVERY Square Enix PC game: FF14, NieR: Automata, FF7 Remake, etc.

 

Please, people who are desperately trying to hypermile their potato PCs: buy a PS5 and quit trying to turn the PC experience into a PS2 experience.

 

With the original FF14, Square Enix overshot the PC requirements at the time and then never actually put the game on the PS3. Version 2.0? 720p60 or 1080p30 was viable on a GTX 760 or a 1050 Ti. The bottom-tier GPU could still play it without making it look like a PS2 experience. But no, with the DX10/DX11 update, people's GPUs started catching fire.

 

Sadly with NieR: Automata, which was originally a PS4 title, there were some oversights on the PC version that maybe dragged the performance down a little, but nope, people on Steam were insisting that their GTX 1060 should be giving them a 4K experience and that their GPU should not be catching fire.

 

(by catching fire, I mean that the GPU driver would crash or their system would BSOD, meaning they likely had an OC GPU card or OC CPU for whatever reason.)

 

The point I'm making here is that claiming that a game is not optimized is a lazy excuse. If you want to play the latest, biggest, baddest game on the minimum requirements, you are not going to enjoy that experience. Either give in and buy something better than the recommended specs (because the recommended specs are always a lie), or buy a PS5 just to play it.

 

This has been the same problem ever since the late MS-DOS/early Win95 era, where a console game comes out with a PC port and the PC port is substantially inferior to the SNES or PS1 version. That was the case with FF7 and FF8, due to most computers having inferior MIDI music hardware, and it would not be rectified until the "remaster" versions 20 years later.

 

A game should be able to dial down "enough" that you could play it, poorly, on whatever hardware is common, but expecting the PS5-equivalent experience on anything below Ultra settings is wishful thinking.


52 minutes ago, Kisai said:

With the original FF14, Square Enix overshot the PC requirements at the time and then never actually put the game on the PS3. Version 2.0? 720p60 or 1080p30 was viable on a GTX 760 or a 1050 Ti. The bottom-tier GPU could still play it without making it look like a PS2 experience. But no, with the DX10/DX11 update, people's GPUs started catching fire.

I find FFXIV (at least around 4.x to 5.x era) runs 1080p 60fps-ish on a 1050 non-Ti laptop. 

 

52 minutes ago, Kisai said:

A game should be able to dial down "enough" that you could play it, poorly, on whatever hardware is common, but expecting the PS5-equivalent experience on anything below Ultra settings is wishful thinking.

And this game does scale down, from what we know anyway. The minimum CPU is a quad core from over 10 years ago. The minimum GPU is mid-range from over 6 years ago. 16GB of RAM is pretty much a given for any gaming system used for more than esports. Don't like the serving suggestions? I'm sure people can come up with alternative settings, balancing actual fps against perceived image quality, if they don't want the console 30fps experience.

 

However, I have to disagree on the last part. PC Ultra settings usually do exceed console defaults. If you don't already, I'd suggest watching Digital Foundry, where they try to find the PC settings that match the console-provided experience. Current-gen console games often have a trade-off between 60+fps and higher graphics settings; don't expect both at the same time. A 6700XT/3070 should be sufficient to meet or exceed current-gen consoles, assuming the rest of the system is also up to it.


(Off for the console-to-PC port discussion, but)

I just miss the old days when innovation in graphics engines by individual PC games was actually a thing (and I just realized The Game Awards doesn't even have a graphics/visuals category now, like WTF?). Graphically innovative games used to routinely target next-gen PC hardware with their Ultra settings, and it was a good thing.

 

This whole "you must play it right at launch" thing around games is just so toxic, I want my games back which worth buying and playing after 1-2 years as they were getting their maxed-out graphics settings available with next-gen and after-next-gen hardware.

 

End of boomer rant; I'll go back to playing Crysis and The Witcher 3...


Anyone else annoyed that they have a mistake in that release?

As in, I have never heard of an "AMD Ryzen 5 5800X".

 

Also, I don't get the storage media requirements. The experience does depend on them to some level, but the actual performance difference is questionable unless they have very aggressive asset streaming, or they have actually built the pipeline around PS5-class storage, in which case running the game on Ultra from an HDD would mean assets couldn't be loaded in the given time and the game wouldn't wait for everything to load correctly.
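
As a rough back-of-the-envelope illustration of why the storage tier could matter for streaming (purely a sketch; the drive speeds and the per-area asset budget below are generic assumed ballpark figures, not anything Square Enix has published):

# Hypothetical streaming budget and assumed typical sequential read speeds.
ASSET_BUDGET_GB = 4            # assumed data to pull in when fast-travelling
DRIVES_MB_PER_S = {
    "7200rpm HDD": 150,
    "SATA SSD": 500,
    "NVMe SSD (Gen4)": 5000,
}

for name, mb_per_s in DRIVES_MB_PER_S.items():
    seconds = ASSET_BUDGET_GB * 1024 / mb_per_s
    print(f"{name:16s}: ~{seconds:5.1f} s to stream {ASSET_BUDGET_GB} GB of assets")

On those assumed numbers the HDD case takes roughly half a minute, which is the kind of gap where a game either stalls, streams in low-detail assets, or simply lists an SSD as required.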

 

Optimization-wise, we have to see what the game is like when it comes out. Usually it is, as @Kisai said, people running the game on potatoes screaming that the game is poorly optimized. But then we have the actually poorly optimized games, which have the telltale marks: poor graphics settings (or even an awful, wholly separate config program), otherwise lacking settings (like a button config where everything is pooled together and you cannot assign actions to different buttons), and then my personal favorite, the "hold to interact" mechanic. Nothing is as #%&#ing frustrating and vomit-inducing as having to hold down a button to do something that isn't in itself a constant interaction. I can stomach it if the character is climbing or doing something else continuous, but "hold to pick up", "hold to press the button", or "hold to speak with" are just horrible shit that should be burned to ash and burned again, just to make sure no one ever makes that shit again.


1 hour ago, porina said:

This was my first thought too. I'm going with the VRAM theory. At the mid range, the 6700XT and 3070 are ballpark comparable in performance. It also presumes 8GB VRAM is sufficient. Low end, 6GB VRAM is OK. It doesn't look like it cares much for CPU either if it scales down to ancient quad cores.

 

So that leaves the Ultra GPU and system RAM requirements as a bit odd. Are they going really asset-heavy on this one? I think that's the only way to explain it if this is accurate.

It's open world, and I'm guessing there's probably some rapid teleport system they're using, because that's a reason you'd leave a lot of textures in memory. So my guess is they're using VRAM in place of the DirectStorage-style streaming tech the consoles have.

 

Though a 3600 / 3070 combo for 1440p / 30 fps is wild. It's also possible the massive particle engine they're using is just a memory hog and there's not much they can do about that.
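
Purely to illustrate the trade-off being guessed at here (keep textures resident in a big memory pool vs. stream them on demand), here is a toy sketch of an LRU texture pool; none of the numbers or names reflect Forspoken's actual engine, and real engines are far more involved:

from collections import OrderedDict

class TexturePool:
    """Toy LRU cache standing in for a VRAM/RAM texture pool (illustrative only)."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()      # texture id -> size in MB
        self.used_mb = 0
        self.uploads = 0                   # simulated disk/PCIe uploads (cache misses)

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:        # already resident: cheap
            self.resident.move_to_end(tex_id)
            return
        self.uploads += 1                  # miss: evict oldest textures until this one fits
        while self.resident and self.used_mb + size_mb > self.budget_mb:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb

# Two regions of ~6 GB of textures each; the player teleports back and forth.
region_a = [(f"a{i}", 100) for i in range(60)]
region_b = [(f"b{i}", 100) for i in range(60)]

for budget_mb in (16000, 8000):            # hypothetical large vs. tight pool
    pool = TexturePool(budget_mb)
    for _ in range(3):
        for tex in region_a + region_b:
            pool.request(*tex)
    print(f"{budget_mb} MB pool -> {pool.uploads} uploads")

With the big pool everything is uploaded once and every teleport afterwards is a cache hit; with the tight pool the two regions keep evicting each other, so either the drive and decompression path have to be very fast (the console approach) or you eat pop-in and stutter.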


1 hour ago, porina said:

I find FFXIV (at least around 4.x to 5.x era) runs 1080p 60fps-ish on a 1050 non-Ti laptop. 

I have a 1050 Ti laptop that I've tried to play it on; it's 45fps, tops. Playable, but not really reasonable unless you're playing with a desktop keyboard and mouse. Playing with the laptop keyboard is insufferable, never mind the trackpad.

 

1 hour ago, porina said:

And this game does scale down, from what we know anyway. The minimum CPU is a quad core from over 10 years ago. The minimum GPU is mid-range from over 6 years ago. 16GB of RAM is pretty much a given for any gaming system used for more than esports. Don't like the serving suggestions? I'm sure people can come up with alternative settings, balancing actual fps against perceived image quality, if they don't want the console 30fps experience.

 

As I mentioned before, "The Minimum" is always a lie. That's simply what it will "run" on, not what you'll have a good experience on. That's been the case as far back as the "100% IBM compatible, MSDOS 3.2, 640K RAM" minimum requirements of '80s and early '90s games. That's what it needs to run, not what it will run well on.

 

Recommended, again, should be parity with the console experience (which is the 1080p60 experience for most, or 4K60 for the smaller portion who bought the deluxe model of the console for a 4K experience).

 

Ultra should be the "uncapped, future-be-damned" point, which means all it is is a reference point where they expect no diminishing returns with the hardware listed.

 

There are certainly ways to drown the capability of a current GPU by turning ray tracing on and DLSS off. My expectation is that DLSS should be turned off EXCEPT when trading off visual quality for framerate, which is actually not that unreasonable compared with changing any other tunable that affects visual quality.
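
For context on why an upscaler buys framerate at all: shading cost scales very roughly with the number of pixels rendered internally. A quick sanity check (the per-axis scale factors are the commonly cited DLSS presets and are an assumption here, not something from Forspoken's spec sheet):

# Pixels shaded per frame for a 4K output, before upscaling to 3840x2160.
native_pixels = 3840 * 2160
internal_modes = {
    "native 4K": (3840, 2160),
    "1440p internal (roughly DLSS Quality, ~0.67x per axis)": (2560, 1440),
    "1080p internal (roughly DLSS Performance, 0.5x per axis)": (1920, 1080),
}

for name, (w, h) in internal_modes.items():
    share = w * h / native_pixels
    print(f"{name}: {share:.0%} of native 4K pixel work")

So the "DLSS off unless trading quality for framerate" stance above is really a question of whether ~44% (or 25%) of the shading work plus reconstruction looks close enough to native for you.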

 

 

 

1 hour ago, porina said:

However I have to disagree on the last part. PC Ultra settings usually do exceed console defaults.

 

As I said, the "Ultra" setting should be equal to or better than the console experience. If people are going to complain that the game doesn't run on their potato PC, they should have bought a PS5. That has been the entire point of playing games on a console: a subsidized, cheaper "computer" that plays overpriced versions of games, but you can at least play ALL of them even if you don't plug it into a 4K screen. The device itself switches to 720p, 1080p or 4K depending on what it's plugged into. There's much less to futz around with, because it should just work out of the box.

 

Your computer, however? It has the massive overhead of being a general-purpose computer, something consoles didn't start having until the Wii/PS3/Xbox 360 era. Since then, all consoles have had an embedded OS, but they don't permit multitasking. If you want to play a different game, the console has to stop, unload what you're doing, and then load the new thing. You can't just stop and watch YouTube for 30 minutes and then switch back like you can on a PC.

 

Which comes back to the entire problem that many of the things you can buy at Best Buy are not that good, would fall below the minimum specs "right now", and don't have an expected lifespan of more than 3 years. A desktop? Just replace the GPU if the game doesn't work well; surely the rest of the system isn't bad. But a laptop? With all the parts soldered to the motherboard, the only option is replacing the laptop.

 


2 minutes ago, Kisai said:

I have a 1050 Ti laptop that I've tried to play it on; it's 45fps, tops. Playable, but not really reasonable unless you're playing with a desktop keyboard and mouse. Playing with the laptop keyboard is insufferable, never mind the trackpad.

Laptop Low setting? Not counting the drops anywhere there are a lot of people, I don't recall 60fps being a problem on my old laptop: a 7300H (4c4t Kaby Lake) and a 1050 of some description.

 

2 minutes ago, Kisai said:

As I said, the "Ultra" setting should be equal to or better than the console experience.

In what I quoted earlier, you said not to expect console parity below Ultra, which is what I disagreed with.


1 minute ago, porina said:

 

In what I quoted earlier, you said not to expect console parity below Ultra, which is what I disagreed with.

At 4K. But yes, sometimes people choose to nitpick a detail that I didn't think needed to be said.

 

At any rate, the minimum and suggested specs are still going to be a lie in the end, and it's more likely that some DLSS-like feature is enabled on the PS5 when it's outputting 4K, because I don't see a 4K experience happening if the "PC" requirement calls for specs that far exceed those of the PS5.


That 1440p 30fps and 4K 60fps is with DLSS 3 enabled... right? 😄

 

 

But seriously, if this is on Ultra with RT, I'm fine with it. If this is Medium/High without RT... uh, I wish them the best of luck, I guess.


3 hours ago, Kisai said:

That's the go-to cry of people who play games on a potato. "It's not optimized"

 

I've literally seen this complaint about EVERY Square Enix PC game: FF14, NieR: Automata, FF7 Remake, etc.

 

Please, people who are desperately trying to hypermile their potato PCs: buy a PS5 and quit trying to turn the PC experience into a PS2 experience.

Except in this case the PS5 gets a substandard experience.

3 hours ago, Kisai said:

With the original FF14, Square Enix overshot the PC requirements at the time and then never actually put the game on the PS3. Version 2.0? 720p60 or 1080p30 was viable on a GTX 760 or a 1050 Ti. The bottom-tier GPU could still play it without making it look like a PS2 experience. But no, with the DX10/DX11 update, people's GPUs started catching fire.

 

Sadly with NieR: Automata, which was originally a PS4 title, there were some oversights on the PC version that maybe dragged the performance down a little, but nope, people on Steam were insisting that their GTX 1060 should be giving them a 4K experience and that their GPU should not be catching fire.

 

(by catching fire, I mean that the GPU driver would crash or their system would BSOD, meaning they likely had an OC GPU card or OC CPU for whatever reason.)

 

The point I'm making here is that claiming that a game is not optimized is a lazy excuse

1) Except in this case it is true.

2) It is not the responsibility of gamers to own high-end gear. It is the responsibility of game developers to target reasonable hardware so that there exists a sufficient customer base to actually purchase their products.

 

Not every gamer will have a 3070 or 6700XT or a 4080 or 6800XT.

 

It is their job to make sure that the performance level they're targeting isn't niche.

3 hours ago, Kisai said:

If you want to play the latest, biggest, baddest game on the minimum requirements, you are not going to enjoy that experience.

Which is why developers should try to optimize their game.

3 hours ago, Kisai said:

Either give in and buy something better than the recommended specs (because the recommended specs are always a lie), or buy a PS5 just to play it.

That's not the fault of gamers. If you spend as much money as a PS5 costs just on a graphics card, you should be able to have a good time.

 

You seem to be defending releasing unoptimized games.

3 hours ago, Kisai said:

This has been the same problem ever since the late MS-DOS/early Win95 era, where a console game comes out with a PC port and the PC port is substantially inferior to the SNES or PS1 version. That was the case with FF7 and FF8, due to most computers having inferior MIDI music hardware, and it would not be rectified until the "remaster" versions 20 years later.

 

A game should be able to dial down "enough" that you could play it, poorly, on whatever hardware is common, but expecting the PS5-equivalent experience on anything below Ultra settings is wishful thinking.

The PS5 is less powerful than a 3070 or 6700XT. So why should those cards not be able to play the game on ultra?


3 minutes ago, AluminiumTech said:

The PS5 is less powerful than a 3070 or 6700XT. So why should those cards not be able to play the game on ultra?

One reason could be that "ultra" on consoles could very well mean low/medium on PC... at least that was the case for many PS3/PS4 games.

 

What they usually do on consoles: lower settings, lower fps, and *more* motion blur to hide the aforementioned things.

 


5 hours ago, AluminiumTech said:

The main areas of concern are the 16GB minimum RAM requirement as well as the resolution and frame rate targets shown for each requirement column (720p 30fps for Minimum and 1440p 30fps for Recommended).

 

4K also requiring 32GB RAM is yet another area of concern.

 

Quotes

 

My thoughts

The requirements seem quite insane but it looks like the game is unoptimized in general. Hopefully the reviews will prove or disprove this.

- For a gaming machine in the last 2-3 years, people *should* have been shooting for 32GB. It surprises me how many still hold out for 16GB. Even if the game takes up 8GB, you're going to have to micromanage closing other programs all the time.

- High system requirements don't mean unoptimized; they just mean high system requirements.

- This is a next-generation title; the consoles have 16GB (shared with the system, but it is not the same as PC RAM since it's ~500GB/s; see the rough arithmetic below). This is the hardware "next gen" titles will be shooting for unless they want to develop cross-generation.
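
Rough arithmetic on the bandwidth gap mentioned above (peak theoretical numbers; the ~500GB/s console figure is from the post, the PC numbers are standard DDR4/DDR5 dual-channel math):

def ddr_bandwidth_gb_s(mt_per_s, channels=2, bus_bits=64):
    """Peak theoretical bandwidth: transfers/s x bytes per transfer x channels."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

print(f"DDR4-3200, dual channel: ~{ddr_bandwidth_gb_s(3200):.0f} GB/s")   # ~51 GB/s
print(f"DDR5-6000, dual channel: ~{ddr_bandwidth_gb_s(6000):.0f} GB/s")   # ~96 GB/s
print("Console unified GDDR6  : ~450-560 GB/s, shared by CPU and GPU")

So even though the capacity number matches, a PC's 16GB of system RAM is roughly an order of magnitude slower than the console's unified pool, which is part of why ports lean on extra system RAM and VRAM capacity instead.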


5 hours ago, RONOTHAN## said:

The thing that confuses me is the fact that they say both a 6800 XT and a 4080 will do 4K60, those cards aren't really that close in terms of raw performance. Are they horribly unoptimized for Nvidia GPUs specifically, or will this be a game that actually uses a full 16GB of VRAM and they couldn't recommend a 3080 because it only has 12GB at most?

From firsthand experience with the 10GB 3080: it was pushing it at 4K last generation, and in several instances it was insufficient without dropping textures. I had one title that would run 4K at 6fps but jumped to 60fps when decreasing texture settings. 12GB isn't much better; it is going to struggle with anything developed for the new consoles.


5 hours ago, Jon-Slow said:

These are the funniest system requirements in my recent memory. The game is going to have horrible performance, mark my words. I'd say no DLSS means I'm not buying it, but who are we kidding, I would never buy something that looks this terrible, from gameplay to everything else, this close to release anyway. I can already imagine the field day Digital Foundry will have with this on release.

I can't say until it's out and reviewed. Definitely not at that price, though; I would wait for end-of-year sales.


5 hours ago, AluminiumTech said:

The requirements seem quite insane but it looks like the game is unoptimized in general. Hopefully the reviews will prove or disprove this.

I'll preface this by saying I don't really know anything about this game, nor am I interested in it; I don't even buy SE products... they're the Japanese EA and get 0 cents from me. But to be totally honest, developers have not been utilizing PC hardware fully for many years, probably a decade. RAM requirements, and especially VRAM, are being held back mostly by Nvidia's arbitrary "8GB limit" even on higher-end cards, and physics is basically nonexistent nowadays (only rudimentary). So if a dev actually ups the ante, I very much welcome it. I can see Capcom specifically doing similar in the near future; they're always pushing for higher graphical fidelity, and they use their own engine rather than the craptastic UE that so many developers prefer (because they're unable to make their own, or are too cheap and follow the 'it's good enough' mantra).


Protip: Don't play games on max settings. I know it's a wild concept, but max settings are usually terrible value anyway. You get like 5% better image quality at the cost of like 20% performance. And this might just be me, but I typically don't really care much about graphics when I play games. Of course graphics matter, but I generally don't enjoy a game more because there are 10 rocks per square inch on the pavement vs 7 rocks per square inch.

 

System requirements are also, as far as my experience goes, usually pretty inaccurate.


A few years ago, Godfall had similarly crazy requirements for 4K. The devs said you need 16GB for 4K and that it's all being used. I never played it, nor do I ever plan to, but their claims made me curious about the actual performance. It turns out the game only uses ~8GB at 4K max settings according to several tests. So who knows how much it actually needs.
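
On the "claimed vs. actually used" point, it's easy to log what the driver reports while a game runs. Here's a minimal sketch using NVIDIA's management library via the pynvml module (from the nvidia-ml-py package); note it reports total VRAM allocated on the GPU, not just by the game, and the one-second polling interval is an arbitrary choice:

import time
import pynvml  # provided by the nvidia-ml-py package

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU

peak_gb = 0.0
try:
    while True:                                   # leave running while you play
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = info.used / 2**30
        peak_gb = max(peak_gb, used_gb)
        print(f"VRAM in use: {used_gb:5.2f} GB (peak {peak_gb:5.2f} GB)", end="\r")
        time.sleep(1.0)
except KeyboardInterrupt:
    print(f"\nPeak VRAM observed: {peak_gb:.2f} GB")
finally:
    pynvml.nvmlShutdown()

Even then, allocation isn't need: engines happily reserve more than they strictly use when memory is available, which is presumably how a dev's "it's all being used" claim and the ~8GB measurements can both be true.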

 

Apart from that, why is it always mediocre games that aren't very interesting that have high requirements like this? Do they need some news coverage to hopefully push sales?

 

41 minutes ago, ewitte said:

12GB isn't much better; it is going to struggle with anything developed for the new consoles.

The only game that actually comes close to the 12GB limit at 4K is Far Cry 6, afaik.


5 hours ago, SkilledRebuilds said:

Keep in mind that, from time to time, the listed specs for a game are a best guess from people who know little to nothing about PCs outside of general usage.

Also this. I've played games where my specs were supposedly only good for 720p 30fps, yet they ran at 1080p 60fps on medium/high settings...

 

What confuses *me* is the RAM requirements... I wonder if there's actually a good reason for them, or if it just ends up being hot air (basically a marketing gag).

 

 

