PC for the future
The point is: when you're using a stronger GPU, it gets more demanding for the CPU to keep up, and with GPUs growing massively in performance every year, a 3930K would bottleneck much earlier than you expect. This doesn't apply to currently CPU-bound games such as WoW/PS2 etc., because there the CPU is already the bottleneck; all you'll get from a GTX 1080 is the exact same frame rate at much lower GPU load.
How long did it take a 2600K to become a bottleneck in a game? Take a game that's very multithreading-friendly and can push it to 100%: only 2 years, when Crysis 3 was released -> youtube.com/watch?v=_hcuYiqib9I (watch after he applies the patch: 100% CPU load, 70%/70%/70% GPU load).
With GPUs it's the same thing imo; futureproofing doesn't really exist. A card usually never lives longer than 2-3 years unless you really want to play at the lowest settings.
Stuff that you can futureproof: a high-quality PSU over a shoddy firework of a PSU; a heatsink over an AIO, because AIOs tend to give up much earlier than the warranty expires; or maybe a case, but in which ways? Build quality? It's not like you're throwing your PC out of your window every day. Expandability? Double ball bearing fans over sleeve bearing? Logitech/Corsair peripherals over Razer?
In short, you can't futureproof performance, only the lifespan really.
I know what you mean, but it's something that chip manufacturers can't do. To give you a chip with 25 times the IPC you'd need to break Moore's law; guessing blindly, it would require something like a 0.003 nm process node, who knows.
The best strategy for good performance over the long term with computers is to buy at economically preferable points fairly regularly as the parts progress. Different parts gain performance at different rates so the ideal price point and upgrade time period differs with each. GPUs are one of the faster growing areas and CPUs are now quite slow comparably.
Look at it this way - a hyperthreaded CPU like the 4770 versus the 4670 is at most 20% faster. It's only 20% faster in the really quite rare circumstances where a program uses 8 threads and does so with a widely mixed range of instructions across floating-point and integer calculations. This isn't something you find in movie encoding, in games, or in general desktop use. It can describe compression, but most of the benefit is in enterprise-like applications and databases. That feature on its own is charged at $100. We have had it since the original i7 920 series some 5 years ago, and to this day barely any desktop software has managed to use it. Is it likely, then, to deliver benefit in the future? Maybe, but it's far from certain that it will. The current evidence suggests it's probably not going to be critical.
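A rough way to see why those extra threads rarely pay off is Amdahl's law: the speedup from more threads is capped by the fraction of the work that can actually run in parallel. This is an illustrative sketch, not from the posts above, and the function name is my own:

```python
# Amdahl's law: upper bound on speedup from running on n threads
# when only a fraction p of the work is parallelisable.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A typical desktop workload with only half its work parallelisable
# barely benefits from 8 threads over 4:
print(round(amdahl_speedup(0.5, 8), 2))   # ~1.78x
print(round(amdahl_speedup(0.5, 4), 2))   # ~1.6x
# A near-perfectly parallel workload (the enterprise/database case)
# is where the extra threads actually show up:
print(round(amdahl_speedup(0.95, 8), 2))  # ~5.93x
```

So unless the software is overwhelmingly parallel, the jump from 4 to 8 hardware threads is nearly invisible, which matches what desktop benchmarks have shown since the i7 920.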
My main advice is: don't try to buy a machine that will last 5 years. Instead, buy a reasonably priced performance machine today, save the difference, and buy upgrades and replacements in a year or two's time. You'll maintain better performance over the life of the PC and it will cost you less. Unlike laptop and tablet users, you won't be required to upgrade everything, so the HDD can stay as-is; most likely you'll only be compelled to replace the CPU in 5 years, whereas the GPU will probably want upgrading every 2 years.
I'll just do it this way then, thanks guys.