
Most demanding game relative to the GPUs of its time? Also, the lowest settings+fps you've put up with? How low do you go before upgrading (if you don't upgrade often)?

Hi guys :)  Was just curious about a few things …

 

What do you think was the most graphically demanding game ever released, relative to the GPU technology available at the time?  I've heard people talk about the original Crysis being quite demanding for its time.

However, as I read up more about it, it seems that high-end cards like the GeForce 8800 GTX could run it at 30fps, medium settings, at 1600x1200 or 1920x1080 or something like that.  But… what if you, like me, would be willing to turn settings way down to be able to play something?  For example, see the below simulation of a brief CS:GO match vs bots on Dust, encoded at 240p, 6fps, H.265 q42.

 

csgo 2017-08-06 18;24 - b - 240p, 6fps, h265, q42.mp4  (It's 744 KB and 47 seconds.  Is there a way to make it a playable-in-browser video without putting it on YouTube, or is that where I'd need to put it?)
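(Side note on that question, in case it helps: most browsers won't play H.265, so re-encoding to H.264 with yuv420p and faststart is usually enough to make a clip playable directly in a browser or forum embed. A minimal sketch driving ffmpeg from Python, assuming ffmpeg is installed and on PATH; the filenames are just placeholders:)

import subprocess

# Re-encode the H.265 clip to H.264 so browsers can play it natively.
# Assumes ffmpeg is on PATH; the filenames are placeholders.
subprocess.run([
    "ffmpeg", "-i", "csgo_clip_h265.mp4",
    "-c:v", "libx264",          # H.264 has near-universal browser support
    "-pix_fmt", "yuv420p",      # many browser decoders require 4:2:0
    "-crf", "28",               # quality/size trade-off (lower = better)
    "-movflags", "+faststart",  # metadata up front so playback starts early
    "csgo_clip_h264.mp4",
], check=True)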

 

Has there ever been a game that struggled that badly at very LOW settings, even on extremely high-end hardware, like over US$2,000 worth of GPUs?  If so, what was it, and what was it like on the hardware of the day?

 


 

 

And, second part of the question... What are the lowest settings you've ever had to deal with in a game you were playing?
 

For me, it would be somewhat similar to the example above, although the "quality" wasn't quite as low (mainly because there was no setting to go that low), and I think I could get up to 8 or 10 fps looking at a blank wall.  The game was Team Fortress Classic (same engine as the original Half-Life) back in 1999 or 2000.  The GPU at the time, I believe, was the original ATI All-In-Wonder (based on the 3D Rage II, I think), with a Pentium 166 MMX.  I don't remember what the RAM or HDD (no SSD, obviously) was at the time, but it was probably on Windows 98.

 

 

 

Third part.  How long does it typically take before various price tiers of single GPUs (like $700, $500, $350, $200) reach the point where they can't peak above 12-15fps at the lowest resolution and settings in a then-10-year-old esports / lighter-duty title?  Or does deprecated or missing driver/API support make games on older cards not run at all before performance gets that low?  For one example, I'm guessing a game like CS:GO would either struggle to get more than a few FPS at lowest settings on something like an Nvidia Riva 128 or ATI Wonder/Mach/Rage, or not run at all because those cards lack support for certain features.
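(Just to make the arithmetic of that question concrete, here's a toy model of the decay - the fps figures and the ~1.4x-per-year performance growth per price tier are pure assumptions for illustration, not benchmarks:)

# Toy model: assume (purely for illustration) that GPU performance at a
# given price tier grew ~1.4x per year, so a card N years older delivers
# roughly fps_today / 1.4**N in the same title. The fps_today numbers
# are made up, not measurements.
def years_back_to_floor(fps_today, floor=15.0, growth=1.4):
    years, fps = 0, fps_today
    while fps > floor:
        fps /= growth
        years += 1
    return years

for price, fps_today in [(700, 400.0), (500, 300.0), (350, 200.0), (200, 120.0)]:
    print(f"${price} tier: a card ~{years_back_to_floor(fps_today)} years older hits ~15fps")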

 

 

 

Lastly, for those of you who, like me, don't stay on the cutting edge but hold onto your cards for a while: how often do you upgrade, and what are your criteria for how low performance gets before you do?  For me, while I'd grudgingly accept the scenario in the preceding paragraph if financially necessary, I'd probably prefer to upgrade when few-year-old games are almost always dipping below ~15-20fps at low settings and 480p or something like that.

 

 


There's a difference between genuinely demanding games and badly optimized ports that kill GPUs, such as ARK: Survival Evolved.  No matter what GPU you have, once a big enough base is built you will drop to less than 30 fps.

 

That's the biggest difference to me.  What exactly are you defining as demanding?  To me, upping settings to ultra or max is just dumb; it adds little things that cause massive amounts of FPS loss.  I'd rather enjoy great-looking, smooth gameplay than a less-than-smooth experience.


Part 1) I don't know; I imagine games were never developed like that because it would be stupid. Why make something that no system can run even on the lowest settings? Exactly 0% of the market would be able to buy your game. If I had to guess, Crysis is probably the most demanding outside of poorly optimised early access games like ARK.


Part 2) I had to go down to medium for Ghost Recon Wildlands at 1440p on my 780ti before I bought my 1080.

 

Part 3) Impossible to say... 10 years is a long time, and esports games are meant to last basically forever, so new ones don't come out too often. It's possible that games stop increasing in graphical fidelity and a GTX 1080 Ti can max games indefinitely. It's also possible that demands increase exponentially and a 1080 Ti is useless even for esports titles at the lowest resolution and settings in 7 years. It's possible that games use DX 13 and the 1080 Ti doesn't support it. Asking about a decade from now in PC terms is a bit silly.

 

HOWEVER, the 8800 GTX is a decade old and it runs CS:GO and League of Legends at 1080p high details at 60fps just fine. IDK how CS:GO has changed graphically over the years, but League got a visual update a couple of years back and it consistently has assets revamped... Take that however you want.


@Shimejii Ahh, yeah, true.  Looks like I forgot to add the requirement to the first part that the game be already well optimized. :P Oh well :D

 

My bro plays ARK, and recently upgraded from a dead GTX 780 to a GTX 1080 Ti, using Intel HD 4600 graphics briefly in the meantime.  I think he went from ~45fps at 1080p medium, to 25fps at 800x600 low, to ~50-60fps at 4K high, IIRC.

 

What I'm defining as "demanding" is low fps even at low resolution & settings on a high-end GPU, even with a well optimized game.  (Maybe it's just one that has insanely advanced graphics for its time.)

 

 

@SlaughterSmurf I could see the logic of your answer to the 1st part.  For the 2nd part, I don't even have a monitor that can run 1440p yet :D and until I got my laptop with its 970M, or the 3GB 1060 in my desktop, I was running games on the 4790K's iGPU.  Interestingly, I was able to play the first few scenes of The Witcher 3 at 1080p ultra on the iGPU.  Sure, it was only 3-5 fps, but the gameplay itself was also slowed way down.  (Riding a distance that normally took 30 seconds or so was taking more like 4-5 minutes, and not because I was having trouble controlling the horse.)
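(That slow-motion effect is a known game-loop behavior, by the way: many engines clamp the per-frame delta time so physics stays stable, so below the clamp threshold the simulation falls behind real time instead of just getting choppier. A tiny sketch of the idea - illustrative only, not The Witcher 3's actual code:)

# Why very low fps can slow gameplay down instead of just looking choppy:
# many engines clamp delta time to keep physics stable, so below the
# threshold the simulation falls behind real time. Illustrative only.
MAX_DT = 0.1  # engine refuses to simulate steps longer than 100 ms

def simulated_seconds(frame_time, frames):
    dt = min(frame_time, MAX_DT)  # the clamp
    return dt * frames

frames = 240                            # 60 s of real time at 4 fps (250 ms/frame)
real = frames * 0.25                    # 60 s of wall-clock time
game = simulated_seconds(0.25, frames)  # only 24 s of in-game time elapse
print(f"{real:.0f}s real -> {game:.0f}s in-game: everything runs in slow motion")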

I hadn't thought of testing League.  I have it but haven't played it.  (I'm not really planning to test much more right now.)

Yes, I realize 10 years is a long time. :) But sometimes I think I'd have pretty long upgrade cycles.  (I'm not one to upgrade as soon as something better comes out - for example, I didn't go "OMG, the 7700K is out, gotta replace the 6700K in my laptop!" :P Side note: I was hoping to put an 8700K in eventually, but based on recent rumors I likely won't be able to, so the 6700K will probably be the CPU my laptop eventually dies with, or the one it has when it's replaced in the early/mid 2020s or so.)

 


The only ones were 3DMark benchmarks.

MAD-BOX: Ryzen 1600X - ASRock X370 Killer SLI - Sapphire R9 Fury NITRO+ - fried it... RIP

Xeon E5640 @ 4.35GHz, Cooler Master Seidon 240V, ASUS P6X58D-E, 8GB DDR3 @ 1636MHz CL9, Sapphire Fury Nitro OC+, 2x stone-age storage @ 7200RPM, Crucial 960GB SSD, NZXT S340, SilverStone Strider Gold Evolution, SteelSeries Rival, mechanical metal keyboard, Boogie Bug Aimb mouse pad.


Depends on the game; generally 50-60fps is okay for me. Turning all the settings to max isn't necessarily the right thing to do - you can always tweak the different settings and find out which combination is best for you, or lower the resolution.


I remember playing Prototype on a 7600 GT; fps was usually below 30 and often dipped below 20 as well. Same story with Saints Row 2 and 3.

Laptop: Acer V3-772G  CPU: i5 4200M GPU: GT 750M SSD: Crucial MX100 256GB
Desktop: CPU: R7 1700x GPU: RTX 2080 SSD: Samsung 860 Evo 1TB


RUNNING Crysis wasn't the problem. The problem was maxing it out. My old 8400 GS could RUN Crysis. Barely, but it could, and it was considered a low-end card at the time. If I recall correctly, you couldn't even run the game on ultra at 60fps on a 9800 GT. People had to tweak the .ini to get a stable 40s. I don't think that was even at 1080p.

 

I also recall playing Prototype on the 8400 GS. Below low settings. At 800x600. At 10 fps.

Beat the game, though. Even if everything was just a big mess of blobs.


@Evolution90 Speaking of 3DMark benchmarks, you reminded me of something.  How would you compare the performance of cards that are several generations / years apart?

 

For example, say I want to compare a GTX 1080 to a GeForce PCX 5950 (or FX 5950 Ultra).  I think one of the older 3DMark benchmarks had the two cards compared, but then there could be the issue of the 1080 saturating the older benchmark.  Is it maybe better to do a few steps, like FX 5950 Ultra vs GTX 280 in something like 3DMark Vantage, then GTX 280 vs GTX 1080 in Fire Strike?
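(That stepping-stone idea is basically chaining relative scores: if card A is some multiple of card B in one benchmark, and B is some multiple of C in an older benchmark they both run, then A/C is roughly the product of the two ratios - with the caveat that error compounds at each hop. A sketch with made-up scores, not real 3DMark results:)

# Chaining overlapping benchmarks to compare cards generations apart:
# A/C ~= (A/B) * (B/C). All scores below are made-up placeholders to
# show the arithmetic, NOT real 3DMark results.
firestrike = {"GTX 1080": 20000, "GTX 280": 1250}
vantage    = {"GTX 280": 9600, "FX 5950 Ultra": 300}

hop1 = firestrike["GTX 1080"] / firestrike["GTX 280"]  # newer benchmark
hop2 = vantage["GTX 280"] / vantage["FX 5950 Ultra"]   # older benchmark

print(f"GTX 1080 ~{hop1 * hop2:.0f}x an FX 5950 Ultra (order-of-magnitude only;")
print("each hop compounds CPU limits, API differences, and scaling error)")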

Or would a site like PassMark be better for something like that?  Or how about if I wanted to go crazy and compare the first video card ever made (was it the ATI CGA Wonder in the mid 1980s, which PassMark does NOT have listed? I'm sure there's something older, though) vs something like a Titan Xp, or a future high-end video card. :P

I'm thinking of sometime making another topic or two: one about comparing video cards (and CPUs, etc.) released a long time apart, and another about upgrade cycles on the same platform.  The 2nd one would reference something like going from a PCX 5950 to a 1080, and lament that I can't get the same price/performance upgrade on a CPU over time because sockets change so often. :/ (I don't like frequently replacing motherboards - not because of the purchase cost, but because of the labor involved in swapping them out.  If only it were as easy as swapping headsets or similar. :/)

 

Ahh, I see @SageOfSpice.  I guess I had gotten the impression that the cards struggled to run it at all, based on "but can it run Crysis?" :P Maybe that's because I'm willing to put up with much lower settings/performance (compared to everyone else's tolerance) before I decide a game can't run? xD (And maybe that stems from my days of playing newer games on then-several-year-old GPUs that were low end even when new…)

And by "prototype" I'm guessing you and @JuztBe mean like a pre-release beta of Crysis?  Or was it a game called "Prototype"?


Just now, PianoPlayer88Key said:

 

And by "prototype" I'm guessing you and @JuztBe mean like a pre-release beta of Crysis?  Or was it a game called "Prototype"?

I was talking about the game called Prototype. Crysis 1 and Warhead ran better than Prototype, with everything on lowest of course.

Laptop: Acer V3-772G  CPU: i5 4200M GPU: GT 750M SSD: Crucial MX100 256GB
Desktop: CPU: R7 1700x GPU: RTX 2080 SSD: Samsung 860 Evo 1TB


I can only answer part 2: Minecraft with OptiFine, all lowest settings at 720p, with 5 fps max. That was 3 years ago xD (no actual mods in use)

Now I get over 3000 fps in it xD

Also, the rain was terrible back then - 2 spf (yes, seconds per frame)

~i5-7600K @ 5GHz ~Be Quiet! Dark Rock 3 ~MSI GTX 1070 Gaming X 8G ~Gigabyte GA-Z270-Gaming K3 ~Corsair Vengeance Red LED ~NZXT S340 Elite


I used to play Cities: Skylines on an old 2nd gen dual-core i5 with integrated graphics at 15-20 fps, 800x600. This was because, instead of looking at the requirements, I just thought 'hey, that looks like a cool game' and ordered it. This was on a machine which got 30fps low (with OptiFine) in Minecraft and 20fps in CS:GO. The strange thing is I didn't mind it at the time - at least not until going back and playing CS:GO at a LAN party and realizing how shit it had looked.

 

(On a similar note I have a friend who still plays CS:GO at 20fps.)


10 hours ago, PianoPlayer88Key said:

@Shimejii Ahh, yeah, true.  Looks like I forgot to add the requirement to the first part that the game be already well optimized. :P Oh well :D

 

My bro plays ARK, and recently upgraded from a dead GTX 780 to a GTX 1080 Ti, using Intel HD 4600 graphics briefly in the meantime.  I think he went from ~45fps at 1080p medium, to 25fps at 800x600 low, to ~50-60fps at 4K high, IIRC.

 

What I'm defining as "demanding" is low fps even at low resolution & settings on a high-end GPU, even with a well optimized game.  (Maybe it's just one that has insanely advanced graphics for its time.)

 

 

@SlaughterSmurf I could see the logic of your answer to the 1st part.  For the 2nd part, I don't even have a monitor that can run 1440p yet :D and until I got my laptop with its 970M, or the 3GB 1060 in my desktop, I was running games on the 4790K's iGPU.  Interestingly, I was able to play the first few scenes of The Witcher 3 at 1080p ultra on the iGPU.  Sure, it was only 3-5 fps, but the gameplay itself was also slowed way down.  (Riding a distance that normally took 30 seconds or so was taking more like 4-5 minutes, and not because I was having trouble controlling the horse.)

I hadn't thought of testing League.  I have it but haven't played it.  (I'm not really planning to test much more right now.)

Yes, I realize 10 years is a long time. :) But sometimes I think I'd have pretty long upgrade cycles.  (I'm not one to upgrade as soon as something better comes out - for example, I didn't go "OMG, the 7700K is out, gotta replace the 6700K in my laptop!" :P Side note: I was hoping to put an 8700K in eventually, but based on recent rumors I likely won't be able to, so the 6700K will probably be the CPU my laptop eventually dies with, or the one it has when it's replaced in the early/mid 2020s or so.)

 

If you're buying top-end hardware with the intention of it lasting a full decade, stop. Open another savings account so you don't spend the money, spend half now, and half in 5 years. You will be infinitely better off. My general impression is that GPUs are best replaced every 2-3 years (every 2nd generation) and CPUs can last about 4-5 years. That's my plan for an upgrade cycle: I buy a PC, I upgrade the graphics halfway through its life, I buy a new PC, I upgrade the GPU, etc.


10 hours ago, PianoPlayer88Key said:

@Shimejii Ahh, yeah, true.  Looks like I forgot to add the requirement to the first part that the game be already well optimized. :P Oh well :D

 

My bro plays ARK, and recently upgraded from a dead GTX 780 to a GTX 1080 Ti, using Intel HD 4600 graphics briefly in the meantime.  I think he went from ~45fps at 1080p medium, to 25fps at 800x600 low, to ~50-60fps at 4K high, IIRC.

 

What I'm defining as "demanding" is low fps even at low resolution & settings on a high-end GPU, even with a well optimized game.  (Maybe it's just one that has insanely advanced graphics for its time.)

 

 

@SlaughterSmurf I could see the logic of your answer to the 1st part.  For the 2nd part, I don't even have a monitor that can run 1440p yet :D and until I got my laptop with its 970M, or the 3GB 1060 in my desktop, I was running games on the 4790K's iGPU.  Interestingly, I was able to play the first few scenes of The Witcher 3 at 1080p ultra on the iGPU.  Sure, it was only 3-5 fps, but the gameplay itself was also slowed way down.  (Riding a distance that normally took 30 seconds or so was taking more like 4-5 minutes, and not because I was having trouble controlling the horse.)

I hadn't thought of testing League.  I have it but haven't played it.  (I'm not really planning to test much more right now.)

Yes, I realize 10 years is a long time. :) But sometimes I think I'd have pretty long upgrade cycles.  (I'm not one to upgrade as soon as something better comes out - for example, I didn't go "OMG, the 7700K is out, gotta replace the 6700K in my laptop!" :P Side note: I was hoping to put an 8700K in eventually, but based on recent rumors I likely won't be able to, so the 6700K will probably be the CPU my laptop eventually dies with, or the one it has when it's replaced in the early/mid 2020s or so.)

 

Oh wait, never mind - I didn't even think about before I had my current PC, because I was a console gamer for 99% of the time before that. For about 3 months before I got this PC, I played Skyrim at low settings, 720p, at like 20fps on my ATI 2600 XT with an Intel Core 2 Duo E-somethingorother.


10 hours ago, Glennieboyyy007 said:

I can only answer part 2: Minecraft with OptiFine, all lowest settings at 720p, with 5 fps max. That was 3 years ago xD (no actual mods in use)

Now I get over 3000 fps in it xD

Also, the rain was terrible back then - 2 spf (yes, seconds per frame)

Yeah, rain in Minecraft seems to eat all the fps. (It would be nice to have a setting to turn it off; I know you can do it with a command block, but still.)

 

For me, I was using a socket 1366 motherboard with an i7 920 CPU, a GTX 760, and 6GB of RAM. Despite it being old, I probably could have just gotten a better GPU and used it for a year or two. I was going to use it till it died, but who knows how long that would take. Yeah, the CPU would bottleneck the GPU, but I still think it would have been better. At some point there will be something you need to play new games, like DX12 and so on (a.k.a. Windows 10...), and at that point you have to upgrade. But right now there are only a few games that are DX12.

 

As for settings, there are some that are huge resource hogs for little gain, like supersampling AA or whatever they call it. That setting also requires more than a certain amount of GPU RAM to use.

I have dyslexia, plz be kind to me. Don't like my post? Don't read it or respond, thx.

Also, I edit posts a lot because... you know why.

Thrasher_565 hub links build logs

Corsair Lian Li Bykski Barrow Thermaltake NZXT Aquacomputer 5V ARGB pin-out guide + ARGB info

5V device to 12V MB header

Odds and Sods Argb Rgb Links

 

