Why is "Future Proof" such a hated term?

Moonzy
Solved by Moonzy

Read this before commenting,

 

I'm not here to argue about whether future proofing is good or bad; I'm trying to understand why people hate the term.

 

so far, most people who commented against it assumed that future proofing means buying the highest-end equipment, which, although it is a form of future proofing, is definitely a dumb one in a financial sense.

 

future proofing can also mean "buying something with moderate headroom above current needs to meet future needs within the PC's lifespan": leaving some leeway to accommodate the unknown and unexpected that might occur over the PC's lifespan (3-5 years). That's the more sensible way to go about it.

 

though i do agree to some extent that a major breakthrough in tech may make your investment look obsolete (like ray tracing), but only if you absolutely must have those features. I doubt companies would design software that won't run on the majority of the market.

 

also, if you do have a very tight budget, prioritizing parts that make a bigger impact today is a better option than investing in things you might need later, like choosing a higher-tier GPU over a higher-tier CPU.

It's annoying; the term is part meme, part misnomer.

 

Future proofing is not a real thing, but it can be fun to use, or to justify a purchase that should last a few years in tech terms.

Phone 1 (Daily Driver): Samsung Galaxy Z Fold2 5G

Phone 2 (Work): Samsung Galaxy S21 Ultra 5G 256gb

Laptop 1 (Production): 16" MBP2019, i7, 5500M, 32GB DDR4, 2TB SSD

Laptop 2 (Gaming): Toshiba Qosmio X875, i7 3630QM, GTX 670M, 16GB DDR3


Look at it this way: if you got a 970, then upgraded to a 1070 the next generation, you might as well have gotten a 980 Ti for $650 and skipped the upgrade-and-resale fiasco. It depends on how fast stuff is advancing. The 980 Ti is roughly equivalent to today's 1660, so you could have had 6 years of decent 1440p gaming performance for $650. On the other hand, a 2080 Ti is only a little bit better than the $500 3070 (equal power, but the 2080 Ti has more VRAM), so that would have been a horrible future-proofing investment. A 980 Ti would have been an amazing one: it's as good as a $400 Pascal 1070 and a $250 modern (pre-30xx) 1660. An i7 2600K was an amazing future-proofing investment too. Sometimes future proofing works, sometimes it's a flop. It's a real thing, but when the tech companies have a quantum breakthrough it causes regret, and when they stagnate and your CPU lasts 10 years, your investment goes a long way.
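The cost-per-year reasoning above can be sketched with quick arithmetic. Prices and lifespans are the rough figures from the post, and the $150 resale value for the 970 is an illustrative guess, not a market quote:

```python
# Rough cost-per-year comparison of the two strategies described above.
# All dollar figures and lifespans are the post's approximations; the
# 970 resale value is an assumption for illustration only.

def cost_per_year(net_cost, years):
    """Net cost of ownership divided by years of use."""
    return net_cost / years

# Strategy 1: buy a 980 Ti once for $650 and keep it ~6 years.
buy_once = cost_per_year(650, 6)

# Strategy 2: buy a 970 ($330), sell it for an assumed $150,
# then buy a 1070 ($400); total net cost over the same 6 years.
upgrade_path = cost_per_year((330 - 150) + 400, 6)

print(f"buy once:     ~${buy_once:.0f}/year")
print(f"upgrade path: ~${upgrade_path:.0f}/year")
```

The point isn't which number wins; with these made-up resale figures the two come out close, which is exactly why the answer depends on how fast a given generation advances.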


Also, the 970 would have been an amazing investment for 1080p future proofing. It was pretty much equivalent to the next generation's 60-series and today's modern 50-series, so that investment would have gotten you amazing 1080p performance in 2014 and decent (medium-high settings) 1080p gaming 6 years later for just a little over 300 dollars.


given that consoles are getting 8c16t, it's pretty reasonable to assume that game developers will start to make use of those resources. sony and microsoft sure think so, otherwise they wouldn't have put 8 cores in them lol. so getting a 3700x right now would be a reasonable "futureproofing" of your system. for some people, just putting everything together once and letting it run for 5 years straight is worth the extra initial investment. for those of us that like throwing our money at companies and ripping our pcs apart every few weeks for new goodies, upgrading as new hardware comes out is also viable, since you can generally resell low-end to midrange components without too much of a loss. even with the 30xx cards bringing heavy performance gains, 2070 supers are selling for >400 bucks used. 2080 tis are around 600 rn, which is a far larger loss, but that's the price paid for getting cutting-edge raytracing tech ahead of its time.

 

basically, sure, you can't predict the future with 110% accuracy, but making reasonable assumptions based on current releases and common sense shouldn't be taboo. if someone's modern system only requires 550w, it's not unreasonable to get a 650-750w psu in case of high-tdp gpu releases like ampere. if a 3600 is enough to play current games at acceptable settings for someone, a 3700x wouldn't be a bad choice either if they don't wanna deal with replacing it sooner for similar performance. a sata ssd is more than enough for current gaming purposes, but a decent nvme is likely to be a useful investment eventually. if it turns out that a reasonable assumption is incorrect in a few years, you're only out 100 bucks.
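The PSU headroom rule of thumb above can be sketched as a tiny sizing helper. The 150W headroom figure and the list of common PSU wattages are illustrative assumptions, not a standard:

```python
# Sketch of the headroom rule of thumb described above: size the PSU
# for the estimated system draw plus margin for a future high-TDP GPU.
# Headroom amount and the wattage list below are assumptions.

COMMON_SIZES_W = [450, 550, 650, 750, 850, 1000]

def recommended_psu(system_draw_w, headroom_w=150):
    """Smallest common PSU size covering draw plus upgrade headroom."""
    target = system_draw_w + headroom_w
    for size in COMMON_SIZES_W:
        if size >= target:
            return size
    return COMMON_SIZES_W[-1]  # cap at the largest listed unit

print(recommended_psu(550))  # 550W draw + 150W headroom -> 750W unit
```

For the 550W system in the post, this lands on the same 650-750W range the poster suggests.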


my "oops i bought intel right before zen 3 releases" build

CPU: Ryzen 5 3600 (placeholder)

GPU: Gigabyte 980ti Xtreme (also placeholder), deshroud w/ generic 1200rpm 120mm fans x2, stock bios 130% power, no voltage offset: +70 core +400 mem 

Memory: 2x16gb GSkill Trident Z RGB 3600C16, 14-15-30-288@1.45v

Motherboard: Asus ROG Strix X570-E Gaming

Cooler: Noctua NH-D15S w/ white chromax bling
OS Drive: Samsung PM981 1tb (OEM 970 Evo)

Storage Drive: XPG SX8200 Pro 2tb

Backup Storage: Seagate Barracuda Compute 4TB

PSU: Seasonic Prime Ultra Titanium 750W w/ black/white Cablemod extensions
Case: Fractal Design Meshify C Dark (to be replaced with a good case shortly)

basically everything was bought used off of reddit or here; the only new component was the case. absolutely nutty deals for some of these parts, i'll have to tally it all up once it's "done" :D


3 hours ago, VeganJoy said:

given that consoles are getting 8c16t its pretty reasonable to assume that game developers will start to make use of these resources. sony and microsoft sure think so, otherwise they wouldn't have put 8 cores in them lol.

Consoles have had 8 cores in them for about 7 years now.


1 hour ago, LAwLz said:

Consoles have had 8 cores in them for about 7 years now.

the ps4/xb1 only give games access to 6 of their 8 cores, and they're based on amd's low-power jaguar cores, which weren't great to develop for. additionally, they clock very low (under 2ghz for the original ps4 i think), so it's easy to outperform them with fewer cores. hopefully the new consoles will demonstrate some serious power and set a good baseline for devs.



simple: your future isn't the same as others' future.

futureproofing is an expectation, and it varies between each individual



I disagree with most people here. If I build a top-of-the-line PC and I'm not able to play new games in 10 years, then something is very wrong.

Future proofing is 100% possible; it's just not really worth it.

In real life, I act like I know less about tech than I actually do.

 

Main build: CPU: AMD Phenom II X4 955 Black Edition @ 4.0GHz (Cooler Master 212) | GPU: MSI RX 580 8GB | RAM: 12GB DDR3 | PSU: Chieftec CTB-500S | Mobo: Asus M4A87TD/USB3 | Storage: Seagate 500GB HDD

Yes, I'm aware my CPU is a huge bottleneck. No, I don't really care.

 

Laptop: Asus TUF FX505DT 60Hz 8GB model

 

"Fortifications, cannons and foreign aid won't help unless every man knows that he himself is a guardian of his country"

-Carl Gustaf Emil Mannerheim


1 minute ago, Doqtori said:

if I build a top of the line PC, and I'm not able to play new games in 10 years, then there is something very wrong. 

10 years is stretching it a bit tbh

10 years ago, top end would have been an i7 920 / 2600K? and a GTX 480?

not even viable in 2016 if you ask me, but 5-6 years is good enough imo

-sigh- feeling like I'm being too negative lately


1 hour ago, Moonzy said:

10 years is stretching it a bit tbh

10 years ago, top end would have been an i7 920 / 2600K? and a GTX 480?

not even viable in 2016 if you ask me, but 5-6 years is good enough imo

GTX 480 you say? I benched mine not too long ago.

 

Here's a screenshot of the fastest GTX 480 Ice Storm run in the world. Whatcha think? Not too shabby, huh?

 

[benchmark screenshots attached]


As far as future-proofing goes, I'll just call it what it is: I'm a bit of a whore for good graphics. I want the latest titles running obscenely smoothly on my 144Hz display. I'm not a future-proofer, I just love visuals... high-refresh-rate monitors really spoiled me too...

