Gaming development hardware vs end user performance?

Hello all.

 

I've got a bit of a question I've been trying to figure out for years now, figured this might be a decent place to find someone who has an answer.

 

I'm not a hardcore gamer, but I do play a fair bit, and I watch a lot of gaming content on platforms like YouTube and Twitch. On many occasions over the years, I've seen games with crazy graphical or scaling options, or otherwise high-demand setups, that bring even the most powerful gaming rigs practically to their knees. I've also seen games that aren't quite so hardcore just start to really chug and perform poorly on hardware that should be more than powerful enough for the task.

 

Now, I'm kind of an old guy. I remember the days before digital distribution, when it was actually important for companies to make sure a game worked before it shipped, because there wasn't really an option to "patch" it later, so a lot of QA work and testing happened before launch. But...

 

I've always wondered: "What kind of systems do the makers of these games use to create and test them?"

 

From a lot of what I've heard, honestly, the specs of the hardware used by even AAA devs are often kind of rubbish. Supposedly, the systems they actually build the game on could never run what they create at a "max bling" kind of setting. I'm not sure that's correct, but it's what I've heard.

 

So take a game that isn't really meant to be brutally demanding on a system, like, say, Minecraft. Nobody's going to argue it's Crysis 3, but I've seen "far better than average" hardware chug, with players running around in an unloaded section of the world, just because the machine can't keep up...

 

So I have to wonder: what kind of systems must these developers be running to make sure their game actually works, or do they even particularly care about that anymore?

 

Can anyone shed some light?

Ideally you should be developing on the bare minimum system you're willing to support. It's much easier to scale up than it is to scale down.

 

The problem with the specs that games list is that they often don't say what settings those requirements are for or what performance to expect. For all I know, the minimum requirements assume absolutely everything set to the lowest settings. It'll still run the game; it'll just look ugly doing it.
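To illustrate why a spec sheet that omits the settings tier says so little, here's a toy sketch (the preset names, numbers, and cost model are all hypothetical, not taken from any real game or engine) of how much the same game's GPU workload can swing between its lowest and highest presets:

```python
# Hypothetical illustration: the same game at "minimum" vs "ultra" settings
# can demand wildly different amounts of GPU work. The presets and the cost
# model below are invented for illustration, not from a real renderer.

PRESETS = {
    # name: (render width, render height, shadow map size, draw distance)
    "minimum": (1280, 720, 512, 8),
    "high":    (1920, 1080, 2048, 16),
    "ultra":   (3840, 2160, 4096, 32),
}

def relative_gpu_load(preset: str) -> float:
    """Very rough relative cost: pixels shaded, shadow texels, geometry.

    Normalised so the "minimum" preset is 1.0. A toy model only.
    """
    w, h, shadow, dist = PRESETS[preset]
    w0, h0, shadow0, dist0 = PRESETS["minimum"]
    pixels = (w * h) / (w0 * h0)
    shadows = (shadow ** 2) / (shadow0 ** 2)
    geometry = (dist ** 2) / (dist0 ** 2)  # visible area grows roughly with distance squared
    return (pixels + shadows + geometry) / 3

if __name__ == "__main__":
    for name in PRESETS:
        print(f"{name:>8}: ~{relative_gpu_load(name):.1f}x the minimum-preset load")
```

Even with this crude model, "ultra" comes out at well over an order of magnitude more work than "minimum", which is why a single "minimum requirements" line without a settings tier attached is nearly meaningless.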
