
Forspoken PC Requirements Announced | RTX 4080 Recommended for 4K

52 minutes ago, AluminiumTech said:

If you spend as much money as a PS5 costs just on a graphics card you should be able to have a good time.

Do we know people won't have a good time with this game on weaker hardware? They may have had a console-parallel target profile in mind when drawing up these suggested specs, but if a user prefers more fps, a balance can likely be struck.

 

52 minutes ago, AluminiumTech said:

The PS5 is less powerful than a 3070 or 6700XT. So why should those cards not be able to play the game on ultra?

Let's see if Digital Foundry covers this after it is released. One of the things they try to do is match PC game settings to the console versions. PC Ultra is generally far in excess of what a current-gen console actually does, unless the game's graphics are relatively simple.

 

6 minutes ago, LAwLz said:

Protip: Don't play games on max settings. I know it's a wild concept, but the max settings are usually terrible anyway.

I wouldn't call it terrible, but certainly in many cases the benefit over "high" isn't much outside of side-by-side pixel comparisons. The difference between high and low is often obvious enough that you don't need a side-by-side comparison to tell them apart. If I can't see it, does it matter?

 

 

Forgot to reply earlier and too lazy to look up who wrote it. About consoles rendering below native, they do that. This gen. Last gen. It's long been a thing.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


7 minutes ago, Montana One-Six said:

 

 

The only game that actually comes close to the 12GB limit at 4K is Far Cry 6, afaik.

That would qualify as last gen. Yep, that is one of the titles that completely tanked on my 3080 (the 10GB card) when I went to 4K.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


22 minutes ago, Montana One-Six said:

The only game that actually comes close to the 12GB limit at 4K is Far Cry 6, afaik.

RE3 can use more than that, but I honestly believe it's a bug... between 6GB and 8GB (the max I could test) there's basically zero visual difference (imo), but with 8GB there's a very high chance of the game insta-crashing, so it's using the VRAM for "something" ¯\_(ツ)_/¯

 

Pretty sure Monster Hunter World can use more too, except that isn't a bug... I can only play with medium/high settings on 8GB, for example; the VRAM is literally the limiting factor...

 

and "4k" isnt an issue at all (with DLSS) but the textures could be pushed higher for sure, it has a 8GB minimum requirement for a reason (the hd texture pack that is)

 

It's a weird claim to me, saying games don't use more than 12GB(?); there are certainly a few that do.

 

 

21 minutes ago, porina said:

About consoles rendering below native, they do that. This gen. Last gen. It's long been a thing.

Off the top of my head:

Monster Hunter World on PC: 1080p 60fps almost maxed is not a problem with a 1060 6GB; on PS4 it's not even a stable 30fps (more like 28-30 with dips) and blurry as hell (looks like 720p native if I had to guess).

Tomb Raider: played on my crappy laptop with a 940MX at 1080p / above 30fps, and the graphics looked dope even at medium... on PS4 I couldn't even play it because of all the motion blur; I could literally barely see anything... 👀

 

Edited by Mark Kaine

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


2 minutes ago, Mark Kaine said:

RE3 can use more than that, but I honestly believe it's a bug... between 6GB and 8GB (the max I could test) there's basically zero visual difference (imo), but with 8GB there's a very high chance of the game insta-crashing, so it's using the VRAM for "something" ¯\_(ツ)_/¯

Haven't played that one, but according to Guru3D and TechPowerUp it doesn't go above 8GB at 4K.

 

5 minutes ago, Mark Kaine said:

Pretty sure Monster Hunter World can use more too, except that isn't a bug... I can only play with medium/high settings on 8GB, for example; the VRAM is literally the limiting factor...

I've never played that either, but the game is fairly old, so I doubt it's using that much VRAM. According to TechPowerUp it uses 3-4GB at 4K.

 

7 minutes ago, Mark Kaine said:

It's a weird claim to me, saying games don't use more than 12GB(?); there are certainly a few that do.

I have been playing at 4K for almost 2 years now, and the only game I have seen get even close to 12GB of actual VRAM usage is Far Cry 6.

 

Are you sure you are looking at actual VRAM usage and not allocated VRAM?
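
For anyone who wants to check the difference on their own machine, here is a minimal sketch using the pynvml (nvidia-ml-py) bindings. It assumes an NVIDIA GPU with recent drivers, and keep in mind that NVML reports memory *allocated* to each process, which is usually higher than what a game actively touches every frame:

```python
# Minimal sketch: total VRAM in use vs. per-process allocations on GPU 0.
# Requires the nvidia-ml-py package ("pip install nvidia-ml-py").
# NVML reports allocations, not the working set a game actually touches.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)  # first GPU

    mem = nvmlDeviceGetMemoryInfo(handle)
    print(f"Device VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")

    # Per-process allocations (may be unavailable on some OS/driver combos).
    for proc in nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        used_str = f"{used / 2**30:.2f} GiB" if used is not None else "n/a"
        print(f"  pid {proc.pid}: {used_str} allocated")
finally:
    nvmlShutdown()
```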

Desktop: i9-10850K [Noctua NH-D15 Chromax.Black] | Asus ROG Strix Z490-E | G.Skill Trident Z 2x16GB 3600Mhz 16-16-16-36 | Asus ROG Strix RTX 3080Ti OC | SeaSonic PRIME Ultra Gold 1000W | Samsung 970 Evo Plus 1TB | Samsung 860 Evo 2TB | CoolerMaster MasterCase H500 ARGB | Win 10

Display: Samsung Odyssey G7A (28" 4K 144Hz)

 

Laptop: Lenovo ThinkBook 16p Gen 4 | i7-13700H | 2x8GB 5200Mhz | RTX 4060 | Linux Mint 21.2 Cinnamon


1 hour ago, Montana One-Six said:

I've never played that either, but the game is fairly old, so I doubt it's using that much VRAM. According to TechPowerUp it uses 3-4GB at 4K.

The minimum requirement is 8GB (with the HD texture pack).

 

TechPowerUp is a very poor source; they don't even get TDP right for GPUs.

 

And that they say 4GB is just further proof of that. Sure, it can run with 4, but it can use far more; I have seen a 3080 almost max out its 12GB of VRAM at 4K, for example.

 

Capcom games in general have very detailed VRAM settings. I can use about 7GB; any more and it starts stuttering at "medium/high-ish" settings, and yes, that's confirmed by Afterburner.

 

MHW at medium/high-ish settings, 1440p: 5GB, and that's because I set it like that; anything above ~7GB and there'll be issues, i.e. memory swapping...

 

[screenshot: MHW VRAM usage at 1440p]

 

I mean, it's always good to check stuff yourself, obviously; the game outright tells you how much VRAM it needs, and it's fairly accurate about that too.
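
If you would rather log a whole play session than watch an overlay, a rough approach is to poll nvidia-smi and keep the peak. This is only a sketch; it assumes nvidia-smi is on the PATH and, again, it reports allocated memory, so treat the peak as an upper bound:

```python
# Sketch: poll nvidia-smi once per second and track the peak VRAM in use.
# Assumes nvidia-smi is on PATH; values are allocations, i.e. an upper bound.
import subprocess
import time

peak_mib = 0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        used_mib = int(out.splitlines()[0])  # first GPU only
        peak_mib = max(peak_mib, used_mib)
        print(f"now {used_mib} MiB, peak {peak_mib} MiB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\npeak VRAM in use: {peak_mib / 1024:.1f} GiB")
```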

 

The funny/sad thing is I had the same conversations years ago ("games don't use more than 8GB") and it was wrong back then too...

 

It's just that a lot of games target sub ~8GB, because generally that's what most people will have (basically it's not that games couldn't use more, rather it's an arbitrary limit set by NVIDIA, as I've said for like a decade now).

 

 

Edit: btw, that's something consoles traditionally do better than PCs... unified memory. Imagine what a PC could do with, say, 64GB of unified, fast RAM... GPUs would be a lot cheaper too...

 

Illusory with the current market situation, but for games it would be *far* better...

 


59 minutes ago, Mark Kaine said:

It's just that a lot of games target sub ~8GB, because generally that's what most people will have (basically it's not that games couldn't use more, rather it's an arbitrary limit set by NVIDIA, as I've said for like a decade now).

We've had different VRAM amounts on different GPUs for a long time, on all sides. When was 8GB ever the standard? 980 Ti: 6GB. 1080 Ti: 11GB. It's all over the place.

 

59 minutes ago, Mark Kaine said:

Edit: btw, that's something consoles traditionally do better than PCs... unified memory. Imagine what a PC could do with, say, 64GB of unified, fast RAM... GPUs would be a lot cheaper too...

GPUs would need to be closely integrated with the CPU for it to work well. AIBs would cease to exist as we know them. Do you want that tradeoff?


42 minutes ago, porina said:

We've had different VRAM amounts on different GPUs for a long time, on all sides. When was 8GB ever the standard? 980 Ti: 6GB. 1080 Ti: 11GB. It's all over the place.

Yeah, it is... but I meant that midrange cards are typically 6-8GB; they even sold the same models with 6 and 8GB, like the 2060... and of course the 8GB ones were extremely expensive (at least here).

 

42 minutes ago, porina said:

GPUs would need to be closely integrated with the CPU for it to work well. AIBs would cease to exist as we know them. Do you want that tradeoff?

Well, I don't know; I just know it's not ideal and is kinda holding games back. Not sure, I always thought you'd just need really fast RAM and motherboards designed for it?

As for AIBs, gamers would still want better coolers etc.; not sure what they have to do with VRAM (for the most part)?

 

Edit: I had to look this up; I don't think you need to fuse GPU and CPU for this. Older consoles didn't do that and still had the unified memory advantage...

 

I think the reason they do it that way now is economical; it's just cheaper, and such a "mobile"/low-power chip isn't suited for PC... You really would just need a motherboard redesign and probably better interconnects -- all extremely expensive to actually design, and I doubt manufacturers are willing to do it, especially RAM manufacturers; nobody would buy their cheap and slow "DDR" RAM anymore (except data centers, maybe). The whole industry isn't ready for this... Intel could have done it if they had been a bit more aggressive... It's a risk, with a potentially huge payoff, imo.


6 hours ago, Thaldor said:

Anyone else annoyed that they have a mistake in that release?

As in, I have never heard of an "AMD Ryzen 5 5800X".

It released in 2020...

elephants


1 minute ago, FakeKGB said:

It released in 2020...

Is this satire? I suck at picking up when a comment is satire. 

If it isn't: the typo is "Ryzen 5 5800X"; it should be Ryzen 7 5800X (the 5800X is a Ryzen 7, not a Ryzen 5).


2 hours ago, Mark Kaine said:

It's just that a lot of games target sub ~8GB, because generally that's what most people will have (basically it's not that games couldn't use more, rather it's an arbitrary limit set by NVIDIA, as I've said for like a decade now).

 

True, but not necessarily at high/ultra and/or at 4K. It should be fine at 1080p, but we are now getting games that list 6-8GB as the minimum for 720p30!


9 hours ago, Kisai said:

That's the go-to cry of people who play games on a potato. "It's not optimized"

 

I've literally seen this complaint about EVERY Square Enix PC game: FF14, NieR: Automata, FF7 Remake, etc.

 

Please, people who are desperately trying to hypermile their potato PCs: buy a PS5 and quit trying to turn the PC experience into a PS2 experience.

 

With the original FF14, Square Enix overshot the PC requirements at the time, and then never actually put the game on the PS3. Version 2.0? 720p60 or 1080p30 was viable on a GTX 760 or a 1050 Ti. The bottom-tier GPU could still play it without making it look like a PS2 experience. But no, with the DX10/DX11 update, people's GPUs started catching fire.

 

Sadly, NieR: Automata was originally a PS4 title, and there were some oversights on the PC version that maybe dragged the performance down a little, but nope, people on Steam were insisting that their GTX 1060 should be giving them a 4K experience and that their GPU should not be catching fire.

 

(by catching fire, I mean that the GPU driver would crash or their system would BSOD, meaning they likely had an OC GPU card or OC CPU for whatever reason.)

 

The point I'm making here is that claiming a game is not optimized is a lazy excuse. If you want to play the latest, biggest, baddest game on the minimum requirements, you are not going to enjoy that experience. Either give in and buy something better than the recommended (because the recommended specs are always a lie), or buy a PS5 just to play it.

 

This has been the same problem ever since the late MS-DOS/early Win95 era, when a console game would come out with a PC port and the PC port was substantially inferior to the SNES or PS1 game. That was the case with FF7 and FF8, due to most computers having inferior MIDI music hardware, and it would not be rectified until the "remaster" versions 20 years later.

 

A game should be able to dial down "enough" that you could play it, poorly, on whatever hardware is common, but expecting the PS5-equivalent experience at anything below the ultra settings is wishful thinking.

I would say it's possible to have cutting-edge graphics with excellent performance. Battlefield V had amazing graphics but was also super well optimized, and could be played with very good performance without needing a $2k PC.


3 hours ago, Mark Kaine said:

TechPowerUp is a very poor source

I was just googling for a site that measured VRAM usage. Since you said that, I looked through a few games they reviewed, and their results match pretty well with what I get.

 

3 hours ago, Mark Kaine said:

MHW at medium/high-ish settings, 1440p: 5GB, and that's because I set it like that; anything above ~7GB and there'll be issues, i.e. memory swapping...

OK, and without the HD texture pack? It's still nowhere near 12GB, and going from 1440p to 2160p won't suddenly add 5GB on top of that.
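
A quick back-of-the-envelope calculation illustrates why: render targets scale with pixel count (2160p has 2.25x the pixels of 1440p) while texture memory mostly doesn't. The bytes-per-pixel figure below is an assumed example, not a measurement from MHW:

```python
# Rough illustration: only resolution-dependent buffers grow with resolution.
# The 64 bytes/pixel for G-buffer + render targets is an assumed example value.
def render_target_mem_gib(width, height, bytes_per_pixel=64):
    return width * height * bytes_per_pixel / 2**30

for name, (w, h) in {"1440p": (2560, 1440), "2160p": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mem_gib(w, h):.2f} GiB of render targets")

# Pixel ratio is (3840*2160) / (2560*1440) = 2.25x, so the resolution-dependent
# part of the budget grows by a fraction of a GiB here, nowhere near 5 GiB.
print(f"pixel ratio: {(3840 * 2160) / (2560 * 1440):.2f}x")
```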

 

Like I said, the only game I have played that actually comes close to 12GB at 4K is Far Cry 6. And looking through Guru3D and TechPowerUp technical game reviews shows that Far Cry 6 has the highest actual VRAM usage of all the games they reviewed.


5 hours ago, AluminiumTech said:

 

2) It is not the responsibility of gamers to own highend gear. It is the responsibility of game developers to target reasonable hardware so that there exists a sufficient customer base to actually purchase their products.

If a game says it needs some overkill hardware, and you do not meet the "recommended specs", and you cry "it's not optimized", then that is YOUR problem. No amount of complaining on Steam or LTT is going to change that.

 

Like I said, that is the go-to cry for people who want to play a newly released game on their 3-year-old potato laptop or 7-year-old desktop. I've seen this happen every single time for Square Enix games, and every single time they've been wrong, and Square Enix releases a patch, oh, 5 years later that moves the performance target even higher. Please, go look at the Steam discussion pages for NieR: Automata and FF14's own support forums. People will complain that the game "doesn't work", or gets like 15fps on a bottom-rung GPU.

 

 

5 hours ago, AluminiumTech said:

 

It is their job to make sure that the performance level they're targeting isn't niche.

It is their job to see where the puck is going when they release a game, not to go "oh, we're going to release this game in 5 years; by then GPU performance will be *magic 8-ball* '8Kp240 LOL'".

 

The worst experience with a Square Enix game I've had was FFXV, because all the Nvidia tunables broke the game due to memory leaks in the nvidia gameworks libraries.

 

5 hours ago, AluminiumTech said:

 

You seem to be defending releasing unoptimized games.

The PS5 is less powerful than a 3070 or 6700XT. So why should those cards not be able to play the game on ultra?

I'm defending the position that developers should not nerf a game just so it works on a potato with lower specs than the development environment for the game's console version. They clearly had a PC environment at some point to develop the game, and that should be the "minimum spec" for 4K.

 

1 hour ago, HumdrumPenguin said:

Why would people expect to run new AAA games on ultra settings at 4k with mid tier cards? 

They shouldn't, and yet they always do, and then say the game isn't optimized. It always happens with Square Enix games because the company has a reputation for releasing games that require bleeding-edge hardware. Sometimes, however, you can tell the port was just a straight-up recompile of the PS4 version (which was the case with NieR: Automata), which means a top-end PC will run it at the same visual quality as the console version, but anything lower than a top-end card at the time of release will give you an inferior experience compared to the console.

 

Look at the patch notes for NieR: Automata from 2021. The game was originally released in 2017:

Quote

• UI textures (4K)
Approximately 270 UI textures for icons, backdrops and UI elements etc. now support 4K resolutions.
• Cut scenes
The bit rate has been improved and all pre-rendered cut scenes adjusted, so they will now play in 60FPS and display in the correct aspect ratio without stretching the picture.

now look at the requirements:

[image: NieR: Automata PC system requirements]

The recommended was a 4GB GeForce x80-class part. Good luck playing it on that unless you play at 1080p60 only. But look at the minimum spec: a 770 will get you a 720p30 experience, at most. What was the most common GPU in 2017? The GTX 1060, which was well below the recommended spec here.

 

So go look at the current hardware survey:

[image: Steam Hardware Survey GPU results]

Oh look, the GTX 1060 is still number 2, behind the 1650, a GPU that has only 4GB of video memory and is 22% slower. The most common "higher spec" GPU there is the RTX 3070, which is what Forspoken asks for as its recommended card.

 

Forspoken is by the same development team as Final Fantasy XV, so one can pretty much look at the experience of playing FFXV on Windows and go "oh, that makes sense then".

[image: Final Fantasy XV Windows Edition system requirements]

 

GTX 1060 6GB is the recommended for a 1080p experience.

 

Forspoken?

[image: Forspoken PC system requirements]

So what was a 1080p30 experience for FF15 is now a 720p30 experience for Forspoken.


2 hours ago, Mark Kaine said:

Well, I don't know; I just know it's not ideal and is kinda holding games back. Not sure, I always thought you'd just need really fast RAM and motherboards designed for it?

To get GPU-like bandwidths out of DDR? Without looking it up, the latest Epyc is 12-channel DDR5. If we assume 5200 speeds, that's around 500GB/s of bandwidth. Still well short of current high-end GPUs; the 3080 10GB is at 760GB/s.
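
As a rough sanity check on those figures, here is a sketch of the peak-bandwidth arithmetic (nominal per-channel widths and per-pin rates; real sustained bandwidth is lower):

```python
# Back-of-the-envelope peak bandwidths in GB/s.
# A DDR5 channel is 64 bits wide; GDDR is quoted per pin (Gbps) times bus width.
def ddr_bw_gbs(channels, mt_per_s, bus_bits=64):
    return channels * mt_per_s * (bus_bits / 8) / 1000

def gddr_bw_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

print(f"12-channel DDR5-5200 : {ddr_bw_gbs(12, 5200):.0f} GB/s")  # ~499
print(f"2-channel  DDR5-6000 : {ddr_bw_gbs(2, 6000):.0f} GB/s")   # ~96, typical desktop
print(f"RTX 3080 10GB GDDR6X : {gddr_bw_gbs(19, 320):.0f} GB/s")  # ~760
```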

 

2 hours ago, Mark Kaine said:

As for AIBs, gamers would still want better coolers etc.; not sure what they have to do with VRAM (for the most part)?

Because I can't really see unified RAM happening in a way where a dGPU holds all the system RAM. The GPU and CPU would have to be closer together on the motherboard.

 

2 hours ago, Mark Kaine said:

Edit: I had to look this up; I don't think you need to fuse GPU and CPU for this. Older consoles didn't do that and still had the unified memory advantage...

They don't have to be on the same package, but basically I can't see the VRAM being far from the GPU. The CPU, having much lower bandwidth demands, would kinda become attached to the GPU, rather than the other way around as it is now. PCIe 5.0 x16 would be about 64GB/s, which is still DDR4-era bandwidth (roughly dual-channel at a 4100 transfer rate). Maybe PCIe 6.0, if it happens soon enough, starts to get comparable to the DDR5 era. Note I'm only looking at peak unidirectional bandwidths; I think PCIe can read and write at the same time, but RAM is one or the other.
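
The PCIe comparison works out the same way; a quick sketch of the peak unidirectional numbers (PCIe 5.0's 128b/130b encoding included, PCIe 6.0 treated as a rough doubling):

```python
# Peak unidirectional bandwidth sketches in GB/s.
def pcie_bw_gbs(gt_per_s, lanes, encoding=128 / 130):
    # One transfer carries one bit per lane; 128b/130b encoding since PCIe 3.0.
    return gt_per_s * lanes * encoding / 8

def dual_channel_ddr_gbs(mt_per_s):
    return 2 * mt_per_s * 8 / 1000

print(f"PCIe 5.0 x16           : {pcie_bw_gbs(32, 16):.0f} GB/s")  # ~63
# PCIe 6.0 doubles the rate (PAM4/FLIT framing, so this is only approximate).
print(f"PCIe 6.0 x16 (est.)    : {pcie_bw_gbs(64, 16):.0f} GB/s")  # ~126
print(f"dual-channel DDR4-4100 : {dual_channel_ddr_gbs(4100):.1f} GB/s")  # ~65.6
```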

 

Maybe a midway solution is to let the CPU retain its own memory pool, as current PCs do, but enable more direct use of VRAM in cases where it makes sense. For all I know this might already be possible, just not much used until we get more VRAM.


2 hours ago, HumdrumPenguin said:

Why would people expect to run new AAA games on ultra settings at 4k with mid tier cards? 

LOL, the investors for one. 

 

This company messed up by posting all of these ridiculous specs.  Post your minimum and recommended, maybe name 1 gpu for reference and say "or equivalent".  

 

 


1 hour ago, Heliian said:

LOL, the investors for one. 

 

This company messed up by posting all of these ridiculous specs.  Post your minimum and recommended, maybe name 1 gpu for reference and say "or equivalent".  

 

 

It doesn't make sense. What makes sense is making games that shine with high end cards. It incentivizes people to buy that high end gear with the promise that they'll be able to enjoy better visuals and higher FPS. If that's not the case, high end gaming cards would be utterly pointless.


5 minutes ago, HumdrumPenguin said:

It doesn't make sense. What makes sense is making games that shine with high end cards.

That would please the 4 people with RTX 4080s but what about everybody else?

5 minutes ago, HumdrumPenguin said:

It incentivizes people to buy that high end gear with the promise that they'll be able to enjoy better visuals and higher FPS.

A) It is not a game company's job to sell GPUs, except if they're paid to by AMD, Nvidia, or Intel. Their job is to make as much money as possible by selling as many units of their video games as they can, as well as associated DLC etc.

 

B) If you're making a game, you want it to run on the lowest common denominator so that as many people as possible can buy and play it. You don't want the experience to be bad on what most people use.

 

C) It isn't financially sustainable to only target x80 or x70 class graphics card customers for games on PC. If you're selling a game on PC it needs to run on hardware people have; not what's on store shelves, unless your game is Cyberpunk 2077 or MS Flight Simulator 2020.

 

The most common GPU on Steam right now is the GTX 1650 which doesn't even meet the minimum requirements of this game.

5 minutes ago, HumdrumPenguin said:

If that's not the case, high end gaming cards would be utterly pointless.

They're not pointless; different resolutions and frame rates do exist, and there may also be additional options they can use.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


31 minutes ago, HumdrumPenguin said:

high end gaming cards would be utterly pointless.

It's not pointless; high end cards can give you a great experience in games and often showcase the best the game has to offer.


35 minutes ago, AluminiumTech said:

 

 

C) It isn't financially sustainable to only target x80 or x70 class graphics card customers for games on PC. If you're selling a game on PC it needs to run on hardware people have; not what's on store shelves, unless your game is Cyberpunk 2077 or MS Flight Simulator 2020.

 

This game is a Cyberpunk 2077. Perhaps you're forgetting this is a Square Enix game and not an Electronic Arts one.

 

Electronic Arts' most recent game is "Wild Hearts":

[image: Wild Hearts PC system requirements]

The minimum there is also a GTX 1060 (6GB), and the recommended is an RTX 2070 (8GB).

Note 12GB and 16GB system RAM. Less than Forspoken.

 

Square Enix console and PC games always demand bleeding-edge hardware. Their mobile games, however? It's insane when they produce a gacha/battle-pass game and it doesn't work on any device but the highest-end Samsungs and iPhones. Those are two different types of games and target markets.

 

With freemium pay-to-win mobile trash, you cannot make the requirements anything over baseline hardware, because people aren't going to dump money into a game that won't work on their potato. That's been the case for at least 15 years. P2W "freemium" experiences must target the lowest common denominator.

 

A premium game experience is what is being asked for with Forspoken. If it's on the PS5, then expecting it to run on a PC weaker than a PS5 is wishful thinking.

 

 


13 hours ago, Kisai said:

Sadly, NieR: Automata was originally a PS4 title, and there were some oversights on the PC version that maybe dragged the performance down a little, but nope, people on Steam were insisting that their GTX 1060 should be giving them a 4K experience and that their GPU should not be catching fire.

It was more than oversights dragging the performance down a little; that PC port was a genuine dumpster fire, thankfully made very playable by the fan-made FAR mod. Without the FAR mod the game wouldn't render at native resolution and was very blurry, and unless you were using GTX 980 Ti / 1070 class hardware or better, you had to use the FAR mod to lower a hidden global illumination setting (not exposed in any of the game's official graphics settings) to run 1080p 60fps. I don't think it was unreasonable to expect 1080p60 out of the box from a 1060 in 2017. Still one of my ten favorite games ever made, despite having to jump through hoops to get it running well.


14 hours ago, Jon-Slow said:

These are the funniest system requirements in my recent memory. The game is going to have horrible performance, mark my words. I'd say no DLSS means I'm not buying it, but who are we kidding; I would never buy something that looks this terrible, from gameplay to everything else, this close to release anyway. I can already imagine the field day Digital Foundry will have with this on release.

Per Digital Foundry, the demo on PS5 doesn't even come close to maintaining 60fps despite being upscaled from 900p. The PS5 has roughly an RX 6600 XT class GPU; that's lunacy. It's not like this game looks like Crysis did in 2007; at least then you could understand having to drop settings and resolution.
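
For a sense of how big a crutch that upscale is, the pixel math alone (a quick sketch, taking 900p to mean 1600x900 and assuming a 4K output target):

```python
# Pixel counts: how far a 900p internal render is stretched to a 4K output.
resolutions = {"900p": (1600, 900), "1080p": (1920, 1080), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name:>5}: {count / 1e6:.2f} MP")

ratio = pixels["4K"] / pixels["900p"]
print(f"4K has {ratio:.1f}x the pixels of 900p, so the upscaler is filling in "
      f"about {1 - 1 / ratio:.0%} of the final image")
```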


I felt like ranting about the backlash, because I grew up when games that absolutely pushed top-tier hardware were praised, but then I looked at some footage of the game... oof. It has no right demanding that hardware for how poor it looks. It's got that unmodded Skyrim water detail going on.

5800X3D / ASUS X570 Dark Hero / 32GB 3600mhz / EVGA RTX 3090ti FTW3 Ultra / Dell S3422DWG / Logitech G815 / Logitech G502 / Sennheiser HD 599

2021 Razer Blade 14 3070 / S23 Ultra


35 minutes ago, SteveGrabowski0 said:

It was more than oversights dragging the performance down a little; that PC port was a genuine dumpster fire, thankfully made very playable by the fan-made FAR mod. Without the FAR mod the game wouldn't render at native resolution and was very blurry, and unless you were using GTX 980 Ti / 1070 class hardware or better, you had to use the FAR mod to lower a hidden global illumination setting (not exposed in any of the game's official graphics settings) to run 1080p 60fps. I don't think it was unreasonable to expect 1080p60 out of the box from a 1060 in 2017. Still one of my ten favorite games ever made, despite having to jump through hoops to get it running well.

That global illumination flag is something the 2021 patch fixed. I'm not saying it was "perfectly optimized" before, but there was clearly no issue running it if you had something better than the recommended hardware. But people on the discussion forums were somehow convinced the game should run at 4Kp60 on a GTX 1060, when it didn't even do that on a GTX 1080, or somehow get 1080p60 on a GTX 1050 Ti. It was clearly designed as a 1080p game from the get-go, and the cut scenes were the major clue that perhaps it didn't actually run at 1080p on the PS4.

 

Modders will have you believe that they are magically making high-requirement games run better, when really they're just turning off, or dialing down, features, which breaks the game somewhere else. With the global illumination, for example, turning it off makes various areas super dark.

 

I played and finished NieR: Automata without any mods, at 4K on the GTX 1080, and when it dipped below 60fps I didn't care, but it was not a 1080p30 experience for most of the game. I would certainly not stream the game on that GPU. I have not replayed it with the RTX 3070 or 3090, but just to explain again: "it's not optimized" is generally a complaint made by people trying to run a game at or below the "minimum", because, again, the "minimum" is always a lie.

 

The minimum is what you would need for a 720p30 experience at settings below where the PS5 would be, and if you are willing to play the game like that, I question the sanity of buying a $100 game to play on a 7-year-old GPU.

 


10 hours ago, ewitte said:

I can't say until it's out and reviewed. Definitely not at that price; I would wait for the end-of-year sales.

After so many years of playing games, I have developed a sense for how much a game is going to suck just from the marketing material, and I've also played the demo. This is going to be one of the worst-reviewed games of the year.

