
Has PC tech become stagnant? I think so.

Uttamattamakin
Solved by Mira Yurizaki:

I would argue a lot of the apparent stagnation is simply because what we as consumers do with computers doesn't require a lot of power to reach acceptable levels of performance and quality. Especially now that a lot of things have moved to the web and we have phones to access the content, developers have to design with even lower performance requirements in mind. Since apps and software are meant to perform well on lower-power devices, it makes sense that putting them on higher-end hardware won't improve the apparent performance.

 

I'm sure a lot of people here haven't experienced the joys of actually worrying about whether their PC could play an MP3 or not. Or that a 1-minute uncompressed WAV file (that wasn't even CD quality!) would eat all your RAM if you tried to work with it.

 

Also, I would argue the vast majority of the performance boost in the 90s came simply from adding more speed. For example, the OG Pentium launched with a 60 MHz model; two years later there was a 120 MHz model, and within seven years of launch we went from 60 MHz to 1 GHz, roughly a 16-fold increase. Sure, there were architecture improvements that helped, but I'm not sure those played a more significant role in boosting general performance than the clock speed bump. The speed bump from an i7-2600K to an i7-8700K, assuming max turbo boost? About a 1.26x increase.
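For anyone who wants to check the arithmetic, here's a quick sketch in Python. The turbo clocks (3.8 GHz for the 2600K, 4.7 GHz for the 8700K) are the usual spec-sheet figures; plug in different clocks and the exact ratio shifts a little.

```python
# Back-of-the-envelope clock-speed ratios; figures are spec-sheet clocks, not measurements.

def speedup(old_mhz: float, new_mhz: float) -> float:
    """Raw clock-frequency ratio between two parts."""
    return new_mhz / old_mhz

# 1993 Pentium 60 MHz -> ~2000-era 1 GHz parts
print(f"60 MHz -> 1 GHz: {speedup(60, 1000):.1f}x")               # ~16.7x

# i7-2600K (3.8 GHz max turbo) -> i7-8700K (4.7 GHz max turbo)
print(f"2600K -> 8700K (max turbo): {speedup(3800, 4700):.2f}x")  # ~1.24x
```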

38 minutes ago, ewitte said:

Cut down defective enterprise cards, cheaper older designs, etc.

By "cut down", all I've seen is dropped FP64 performance, and FP64 performance isn't really necessary for consumer applications.

 

The GeForce 10 series is the first time NVIDIA has not used an older GPU design for any of the SKUs. The GeForce 900 series was close, but the bottom-tier SKU used a first-generation Maxwell GPU.

 

EDIT: I should add, it's the first time in a while. After the GeForce4 MX fiasco, the GeForce FX, 6, 8, 9 (arguably), and possibly the 500 series didn't use an older design at all.

34 minutes ago, ewitte said:

Well, how many normal consumers would recognize the difference between an NVMe and a SATA SSD, versus ANY SSD and an HDD? Even budget enthusiasts mix and match all three for different storage roles.

I'm pretty sure most consumers would almost immediately notice the jump in responsiveness going from an HDD to a SATA SSD. SATA SSD to NVMe? I'd argue most enthusiasts agree there's no appreciable improvement in responsiveness, and unless you actually have a use case that needs that bandwidth, it's not worth it.

 

So, as an example of where bandwidth doesn't really mean much, here's some data showing storage activity while booting Windows 7 (captured in a VM, otherwise the data wouldn't really be obtainable):

[Charts: disk read and write activity during a Windows 7 boot]

 

The SSD in this case is a SATA SSD. Notice that it doesn't even top 100 MB/s.

 

Before you go "this is probably just a Windows thing", similar behavior happens on Linux Mint:

 

[Charts: disk read and write activity during a Linux Mint boot]

 

Yes, you could go "Aha! It reads at 200+ MB/s", but that only lasts a second. For the rest of the boot it sits under 50 MB/s.
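If you want to gather similar numbers on your own machine, here's a minimal sketch using the psutil library (this isn't the tool used for the charts above, just one way to sample the OS disk counters once a second):

```python
# Minimal disk-throughput sampler (requires: pip install psutil).
# Prints MB/s read and written over each one-second interval; Ctrl+C to stop.
import time
import psutil

prev = psutil.disk_io_counters()
while True:
    time.sleep(1)
    cur = psutil.disk_io_counters()
    read_mb = (cur.read_bytes - prev.read_bytes) / 1e6
    write_mb = (cur.write_bytes - prev.write_bytes) / 1e6
    print(f"read: {read_mb:7.1f} MB/s   write: {write_mb:7.1f} MB/s")
    prev = cur
```

Run it during everyday use and you'll likely see the same pattern: short bursts, then long stretches far below what even a SATA SSD can deliver.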


4 hours ago, ewitte said:

Well, how many normal consumers would recognize the difference between an NVMe and a SATA SSD, versus ANY SSD and an HDD? Even budget enthusiasts mix and match all three for different storage roles.

As shown above by @M.yuriaki, the HDD-to-SSD difference would be felt. The SATA SSD-to-NVMe difference, in most use cases, would not be felt as much. There are times, such as transferring big files, when the bandwidth difference works out to a large time cost or saving, but the average consumer rarely does that.
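To put some rough numbers on the big-file case, here's a quick sketch; the throughput figures are ballpark sequential speeds for each class of drive, not measurements from the posts above.

```python
# Rough transfer-time comparison for a large sequential copy.
# Throughputs are typical sequential figures, not benchmarks of any specific drive.
SPEEDS_MB_S = {
    "HDD (~150 MB/s)": 150,
    "SATA SSD (~550 MB/s)": 550,
    "NVMe SSD (~3000 MB/s)": 3000,
}

FILE_GB = 50  # e.g. a large game install or raw video project

for name, mb_per_s in SPEEDS_MB_S.items():
    seconds = FILE_GB * 1000 / mb_per_s
    print(f"{name:22s} -> {seconds:6.0f} s to move {FILE_GB} GB")
```

That's the scenario where NVMe pays off; everyday app launching and web browsing mostly doesn't look like this.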

 


4 weeks later...
On 12/4/2018 at 7:53 PM, corrado33 said:

I don't think so, really. Quantum will mean a complete switch in programming styles, hardware, support for external hardware, etc.

 

The programming isn't there to support it. Programs are written to understand 1s and 0s; quantum is more than that. It's a massive change, and something I doubt will be widely supported within the next decade, or even longer. Quantum will represent a shift in computing where all backwards compatibility is lost, and that'll be a HUGE thing.

 

I mean... what are they going to do, release a quantum processor with no software support? No one will buy it. 

 

I think Intel will do 1 or 2 more transistor shrinks before quantum even comes into the picture: 10 nm, then probably a small jump like 8 nm or something. Then they'll likely move on to other methods of increasing performance: higher clock speeds, bigger caches, hell, maybe we'll even see dual-CPU boards become commonplace, with software to support them. I think that'd be cool. I just really don't see quantum around the corner anytime soon.

Quantum computers will be great at solving some specific (but really important) optimisation, simulation, and crypto problems. They'd be much worse than a classical CPU for gaming or text processing; they'd probably be less performant than an abacus at running an OS.

 

Think of it this way: planes are great for travelling the world, but they'd be the worst way to commute to work every morning. Planes were a revolution in transport, but they didn't replace the car; they solved a few tasks more effectively.

 

Back to quantum computers: I doubt any member of the general public will ever use one directly, let alone own one. They will probably sit in server farms in their cryostats, with a cloud-based interface to access them (e.g. Google would love those for search; they would dramatically speed up "reverse dictionary search", look it up). IBM already does that with their IBM Q Experience, which I suggest you look up, it's great. Maybe you could buy a quantum coprocessor in 3019, but a mainstream, fully quantum CPU isn't happening in the foreseeable future, because there's no point. Just like people won't stop driving cars altogether and travel solely by rocket because rockets are "better".

 

There are already a handful of programming languages built around qubits, IBM's QASM being a great example, and even compilers to optimise quantum circuits (i.e. the series of operations done on qubits).
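As a toy illustration of what "operations done on qubits" actually means, here's a hand-rolled state-vector sketch in plain Python/NumPy (not IBM's QASM or any real quantum SDK): it applies a Hadamard and then a CNOT to two qubits to produce an entangled Bell state.

```python
# Tiny two-qubit state-vector simulation: H on qubit 0, then CNOT(0 -> 1).
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)                          # identity (leaves the other qubit alone)
CNOT = np.array([[1, 0, 0, 0],         # flips qubit 1 when qubit 0 is |1>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>

state = np.kron(H, I) @ state   # Hadamard on qubit 0: (|00> + |10>)/sqrt(2)
state = CNOT @ state            # CNOT: entangles the pair into (|00> + |11>)/sqrt(2)

print(np.round(state, 3))       # amplitudes ~ [0.707, 0, 0, 0.707]
```

A real quantum program is essentially a longer sequence of these gate applications, which is what languages like QASM describe.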

 

Whereas the classical CPU was kind of figured out as it was developed, there's already a tremendous amount of knowledge and research about quantum algorithms and quantum computing; quantum physics has been studied since the thirties. It's just an incredible technical challenge to build actual qubits.

 

More precisely, the quantum processing of information is very well understood nowadays and quantum information is a blooming field. When someone finally builds a working fault-tolerant quantum CPU, there won't be a lack of software at all.

 

If we succeed, it will be great for Science, and not just physics but also chemistry, pharmacology, maths, ... you name it. Closer to the everyday person on the internet, it might cost Google marginally less to handle your searches, and secure the web even more.

 

It just won't run Crysis.


19 hours ago, QuantumCakeIsALie said:

Whereas the classical CPU was kind of figured out as it was developed, there's already a tremendous amount of knowledge and research about quantum algorithms and quantum computing; quantum physics has been studied since the thirties. It's just an incredible technical challenge to build actual qubits.

 

More precisely, the quantum processing of information is very well understood nowadays and quantum information is a blooming field. When someone finally builds a working fault-tolerant quantum CPU, there won't be a lack of software at all.

 

If we succeed, it will be great for Science, and not just physics but also chemistry, pharmacology, maths, ... you name it. Closer to the everyday person on the internet, it might cost Google marginally less to handle your searches, and secure the web even more.

 

It just won't run Crysis.

I'd bet you know this, but for general consumption:

 

Fun fact: the programming for a classical computer, and the theory of how one would operate, were done WAY before we were able to build anything practical. The first computer program (an algorithm for calculating Bernoulli numbers) was written for Charles Babbage's Analytical Engine by Ada Lovelace.

 

https://www.computerhistory.org/babbage/adalovelace/

 

We may be like people in 1833 trying to figure out how an x86 processor would work before electronics were even invented. It may take inventing something completely new to do quantum computing. Maybe practical photonics... but people have been working on that.

 

I mention it because not many people seem to know the fun fact that the first person ever to program a computer was a woman.

