
Has PC tech become stagnant? I think so.

Uttamattamakin
Solved by Mira Yurizaki

Watching this video brought to mind a question I have been thinking about for a while now. 

 

Are PCs basically stagnant for the last decade or so? I mean... when I started with personal computers, from the mid 1980s to the mid 1990s your PC would have become obsolete at least five times over. The whole architecture of the PC would change. Even if you had an IBM PC in the mid 1980s as a home user, by 1989 it would be physically unable to run, say, Windows 3.0. And a late-80s 386 SX or DX would be very unlikely to run Windows 95 well. That's not even considering how HUGE the changes were in graphics. Compare how much graphics used to advance from, say, CGA to VGA to SVGA. It's not that your GPU from a few years earlier could do the same thing as a brand new one at lower res... it would be physically unable to do what a new GPU does.

Now... I think I could dust off my old Surface Pro 3 and play GTA V on it well enough to enjoy.


It's because stuff can't get smaller. They're trying, but it ain't working. It's not that the industry doesn't want to get better; there are physical limits. Quantum is next though, get excited.

I WILL find your ITX build thread, and I WILL recommend the Silverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need · ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained · group reg is bad


5 minutes ago, fasauceome said:

Quantum is expensive though, get excited.

FIFY

My Build, v2.1 --- CPU: i7-8700K @ 5.2GHz/1.288v || MoBo: Asus ROG STRIX Z390-E Gaming || RAM: 4x4GB G.SKILL Ripjaws 4 2666 14-14-14-33 || Cooler: Custom Loop || GPU: EVGA GTX 1080 Ti SC Black, on water || PSU: EVGA G2 850W || Case: Corsair 450D || SSD: 850 Evo 250GB, Intel 660p 2TB || Storage: WD Blue 2TB || G502 & Glorious PCGR Fully Custom 80% Keyboard || MX34VQ, PG278Q, PB278Q

Audio --- Headphones: Massdrop x Sennheiser HD 6XX || Amp: Schiit Audio Magni 3 || DAC: Schiit Audio Modi 3 || Mic: Blue Yeti

 

[Under Construction]

 

My Truck --- 2002 F-350 7.3 Powerstroke || 6-speed

My Car --- 2006 Mustang GT || 5-speed || BBK LTs, O/R X, MBRP Cat-back || BBK Lowering Springs, LCAs || 2007 GT500 wheels w/ 245s/285s

 

The Experiment --- CPU: i5-3570K @ 4.0 GHz || MoBo: Asus P8Z77-V LK || RAM: 16GB Corsair 1600 4x4 || Cooler: CM Hyper 212 Evo || GPUs: Asus GTX 750 Ti, || PSU: Corsair TX750M Gold || Case: Thermaltake Core G21 TG || SSD: 840 Pro 128GB || HDD: Seagate Barracuda 2TB

 

R.I.P. Asus X99-A motherboard, April 2016 - October 2018, may you rest in peace. 5820K, if I ever buy you a new board, it'll be a good one.


7 minutes ago, Cereal5 said:

FIFY

New is always expensive. Not that I expect everyone to switch to quantum for household use in 3 years.



38 minutes ago, Uttamattamakin said:

It's not that your GPU from a few years earlier could do the same thing as a brand new one at lower res... it would be physically unable to do what a new GPU does.

That was mostly because of software. It's not that hardware isn't getting better, it's that software has been getting better. Thanks to amazing improvements in the APIs (like DirectX and Vulkan) we're at a place where you don't NEED the latest video card to run a new game.

-KuJoe


10 hours ago, KuJoe said:

That was mostly because of software. It's not that hardware isn't getting better, it's that software has been getting better. Thanks to amazing improvements in the APIs (like DirectX and Vulkan) we're at a place where you don't NEED the latest video card to run a new game.

Well... the fact you can seriously say that is proof of the stagnation. 

Compare a typical PC you could buy as a consumer in 1988: it would likely have had an 8086 at 8 MHz, or a 286 at 10 MHz if you were a real baller. The graphics would be CGA, 320x200 with four colors; if you bought a PCjr or a Tandy 1000 series, 320x200 with 16 colors. Fast forward just a couple of years and you'd have a 386 at 25 MHz, 2 MB of RAM, and a 40 MB HDD, able to run Windows 3.1, get on the early World Wide Web, and just barely use a CD-ROM. The graphics would be 800x600 and 256 colors. Two more years and it would be 1024x768 and 16 million colors. Etc.
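To put rough numbers on that graphics progression, here's a back-of-the-envelope framebuffer calculation. (The resolutions and color counts are my assumptions based on the standard mode specs, not figures from the post.)

```python
# Rough framebuffer sizes for the video modes mentioned above.
# bits per pixel = log2(number of simultaneous colors)
from math import log2

modes = {
    "CGA (320x200, 4 colors)":         (320, 200, 4),
    "Tandy/PCjr (320x200, 16 colors)": (320, 200, 16),
    "SVGA (800x600, 256 colors)":      (800, 600, 256),
    "SVGA (1024x768, 16.7M colors)":   (1024, 768, 2 ** 24),
}

for name, (w, h, colors) in modes.items():
    kib = w * h * log2(colors) / 8 / 1024
    print(f"{name}: {kib:.0f} KiB of framebuffer")
```

The jump from ~16 KiB of video memory to over 2 MiB in a few years is the kind of leap being described.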

What I am saying is: it used to be that upgrading to a new computer meant being able to do things that were previously impossible.

 

Compare the situation now. My mom's MacBook from 2008 can do everything my 2017 HP Spectre x360 can do. It just does it SLOWLY (but still fast enough to use). Compared to how big a leap an upgrade used to be, it's not really worth buying a new PC until the old one breaks.


11 hours ago, fasauceome said:

New is always expensive. Not that I expect everyone to switch to quantum for household use in 3 years.

We won't see a reasonable quantum computer until we have one working at ambient room temperature. As long as it requires a super-cooled design it will be relegated to gee-whiz-level tech. Kinda like how superconducting wires were going to change everything at one point in time.

Computing with quantum states is really hard because the thermal vibrations of most materials destroy those states.


I think this is to be expected as the PC hardware market matures. We likely won't have the same jumps in performance as we did in the 80s and 90s. I think the growth will come in how we use technology. Previously analog devices will become digitized and interconnected.


That's mostly because back then it was all single-core CPUs. There were dual-CPU solutions, but they weren't very well supported. Now that everything is multicore-capable, the "advances" have shifted to other technologies, to what I would call the "consumables": RAM, storage, GPUs.

 

Not so long ago (2 years, maybe less) you could still game on a Core 2 Quad, as long as you had 8GB of RAM and a nice recent GPU (GTX 700 or 900 series, or an AMD R7 or R9). But if you had one of those Core 2 Quads in its original form, it would've been painful to say the least!

 

Imagine having a Q6600 with 4GB of slow DDR3, a 320GB HDD and a 512MB GPU, all on Windows XP... YET, the same Q6600 with an SSD, 8GB of RAM, and an R7 270 2GB on Windows 7 wasn't that bad (that's what I gave my brother a few years ago, and he loved that budget gaming PC!).

 

Now it's all about 2nd-gen Core chips, and I'm guessing that in a few years those won't look so good anymore and we'll be looking at 3rd- and 4th-gen chips as the used budget solutions. Give it enough time. ;)

If you need help with your forum account, please use the Forum Support form !


Most industries have become stagnated.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.


46 minutes ago, Uttamattamakin said:

We won't see a reasonable quantum computer until we have one working at ambient room temperature. As long as it requires a super-cooled design it will be relegated to gee-whiz-level tech. Kinda like how superconducting wires were going to change everything at one point in time.

Computing with quantum states is really hard because the thermal vibrations of most materials destroy those states.

Didn't they get that working recently?

 

"Put as much effort into your question as you'd expect someone to give in an answer"- @Princess Luna

Make sure to Quote posts or tag the person with @[username] so they know you responded to them!

 RGB Build Post 2019 --- Rainbow 🦆 2020 --- Velka 5 V2.0 Build 2021

Purple Build Post ---  Blue Build Post --- Blue Build Post 2018 --- Project ITNOS

CPU i7-4790k    Motherboard Gigabyte Z97N-WIFI    RAM G.Skill Sniper DDR3 1866mhz    GPU EVGA GTX1080Ti FTW3    Case Corsair 380T   

Storage Samsung EVO 250GB, Samsung EVO 1TB, WD Black 3TB, WD Black 5TB    PSU Corsair CX750M    Cooling Cryorig H7 with NF-A12x25


59 minutes ago, TVwazhere said:

Didn't they get that working recently?

 

It seems like it, but not really. You see, a "quantum computer" is not just a computer that is small, or one where quantum effects are important. After all, quantum mechanics is important to the working of a classical transistor.

 

Quantum computing works by manipulating "qubits": e.g. the spin angular momentum of an electron, the polarization of photons, maybe the quantum states involved in the Hall effect. These involve processes where Heisenberg uncertainty kicks in, and they can be harnessed for computation.

 

To see this, consider how large the photons your eye sees with are. Classical transistors are already smaller than a wavelength of visible light. They are classical because of the nature of their operation: they manipulate bits, not qubits.


(Amazing to think that the transistors in your run of the mill CPU are smaller than the wavelength of the light you see with.)

 

In short, that device manipulates bits using quantum effects, just as the transistors in my Surface Pro do.

Qubits are something else. I'd bet my money on the quantum Hall effect being important to practical quantum computing. http://science.sciencemag.org/content/315/5817/1379
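The bit-vs-qubit distinction can be sketched in a few lines: a qubit is a pair of complex amplitudes, not a single 0/1 value. This is a toy state-vector illustration only, nothing to do with any specific hardware:

```python
# Toy qubit: state |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# A classical bit is always exactly 0 or 1; a qubit can be a superposition.
import math
import random

def hadamard(a, b):
    """Hadamard gate: maps |0> into an equal superposition of |0> and |1>."""
    s = 1 / math.sqrt(2)
    return s * (a + b), s * (a - b)

def measure(a, b):
    """Collapse the state: returns 0 with probability |a|^2, else 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Start in |0>, apply Hadamard: measuring now gives 0 or 1 with ~50/50 odds.
a, b = hadamard(1.0, 0.0)
ones = sum(measure(a, b) for _ in range(10_000))
print(f"|a|^2 = {abs(a)**2:.2f}; measured |1> {ones} times out of 10000")
```

A real quantum computer manipulates many such amplitudes at once before measuring, which is exactly what a bit-flipping transistor cannot do.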

Edited by Uttamattamakin
Citing a source regarding the quantum hall effect.

I would argue a lot of the apparent stagnation is simply because what we as consumers do with computers doesn't require a lot of power to reach acceptable levels of performance and quality. Especially since a lot of things moved to the web and we have phones to access the content, developers have to design things with even lower performance requirements in mind. Since we have apps and software that are meant to perform well on lower-power devices, it makes sense that putting them on higher-end hardware won't improve the apparent performance.

 

I'm sure a lot of people here haven't experienced the joys of actually worrying about whether their PC could play an MP3. Or that a 1-minute uncompressed WAV file (that wasn't even CD quality!) would eat all your RAM if you tried to work with it.
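The WAV point is easy to quantify: uncompressed PCM size is just sample rate x bytes per sample x channels x seconds. (The specific sub-CD rate below is my illustrative assumption, not a figure from the post.)

```python
# Size of one minute of uncompressed PCM audio.
def wav_bytes(sample_rate_hz, bits_per_sample, channels, seconds):
    return sample_rate_hz * (bits_per_sample // 8) * channels * seconds

cd_quality = wav_bytes(44_100, 16, 2, 60)  # CD: 44.1 kHz, 16-bit stereo
below_cd   = wav_bytes(22_050, 8, 1, 60)   # e.g. 22 kHz, 8-bit mono

print(f"1 min at CD quality:    {cd_quality / 2**20:.1f} MiB")
print(f"1 min below CD quality: {below_cd / 2**20:.1f} MiB")
# Even ~1.3 MiB was a big slice of RAM on a 2-4 MB machine of the early 90s.
```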

 

Also, I would argue the vast majority of the performance boost of the 90s came simply from adding more speed. For example, the OG Pentium launched with a 60 MHz model; two years later there was a 120 MHz model. Within seven years we went from 60 MHz to 1 GHz, a roughly 16-fold increase. And while sure, there were architecture improvements that helped, I'm not sure those played a more significant role in boosting general performance than the clock speed bump did. The speed bump from an i7-2600K to an i7-8700K, assuming max turbo boost? A 1.26x increase.
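Those ratios are simple to sanity-check. (The turbo clocks below are the commonly quoted spec-sheet values, so treat them as assumptions; the exact second ratio depends on which clocks you pick, and lands near the post's ~1.26x.)

```python
# Clock-speed jumps: the 1990s vs the Sandy Bridge -> Coffee Lake era.
pentium_1993_mhz = 60    # original Pentium
p3_2000_mhz      = 1000  # 1 GHz, seven years later
i7_2600k_turbo   = 3.8   # GHz, max turbo per spec sheet
i7_8700k_turbo   = 4.7   # GHz, max turbo per spec sheet

print(f"1993 -> 2000:   {p3_2000_mhz / pentium_1993_mhz:.1f}x clock increase")
print(f"2600K -> 8700K: {i7_8700k_turbo / i7_2600k_turbo:.2f}x clock increase")
```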


8 minutes ago, M.Yurizaki said:

I would argue a lot of the apparent stagnation is simply because what we as consumers do with computers doesn't require a lot of power to reach acceptable levels of performance and quality. Especially since a lot of things moved to the web and we have phones to access the content,.......

 

I'm sure a lot of people here haven't experienced the joys of actually worrying about whether their PC could play an MP3. Or that a 1-minute uncompressed WAV file (that wasn't even CD quality!) would eat all your RAM if you tried to work with it.

 

Also, I would argue the vast majority of the performance boost of the 90s came simply from adding more speed. For example, the OG Pentium launched with a 60 MHz model; two years later there was a 120 MHz model. Within seven years we went from 60 MHz to 1 GHz, a roughly 16-fold increase. And while sure, there were architecture improvements that helped, I'm not sure those played a more significant role in boosting general performance than the clock speed bump did. The speed bump from an i7-2600K to an i7-8700K, assuming max turbo boost? A 1.26x increase.

Agreed, agreed, agreed.

I mean, once the smartphone came around and there was money to be made there, we stopped pushing for more. I think about the fact that Windows Vista had a more graphically intensive UI (relative to the hardware available) than Windows 10 does. You are right.

Unless VR becomes a thing that is actually required to use some killer app or play the must-have games, who needs to upgrade? Unless having many, many cores becomes required, a quad-core i7 from 2018 will still be usable in 2038.


6 minutes ago, M.Yurizaki said:

I would argue a lot of the apparent stagnation is simply because what we as consumers do with computers doesn't require a lot of power to reach acceptable levels of performance and quality. Especially since a lot of things moved to the web and we have phones to access the content, developers have to design things with even lower performance requirements in mind. Since we have apps and software that are meant to perform well on lower-power devices, it makes sense that putting them on higher-end hardware won't improve the apparent performance.

 

I'm sure a lot of people here haven't experienced the joys of actually worrying about whether their PC could play an MP3. Or that a 1-minute uncompressed WAV file (that wasn't even CD quality!) would eat all your RAM if you tried to work with it.

 

Also, I would argue the vast majority of the performance boost of the 90s came simply from adding more speed. For example, the OG Pentium launched with a 60 MHz model; two years later there was a 120 MHz model. Within seven years we went from 60 MHz to 1 GHz, a roughly 16-fold increase. And while sure, there were architecture improvements that helped, I'm not sure those played a more significant role in boosting general performance than the clock speed bump did. The speed bump from an i7-2600K to an i7-8700K, assuming max turbo boost? A 1.26x increase.

This and a few other things.

 

You have two GPU makers, and really only one at the moment when it comes to the high end, that being Nvidia.

Intel was pretty much running the show CPU-wise for a long time before Ryzen, and now that's been shaking things up.

RAM-wise, I think there have been a number of shortages, mostly of the mobile stuff, that impacted a lot.

 

Although I think computing will undergo a few changes in the next decades, centered around automation, decentralization, and peer-to-peer networking, that's another topic.

 

 


18 hours ago, fasauceome said:

New is always expensive. Not that I expect everyone to switch to quantum for household use in 3 years.

I don't think so, really. Quantum will demand a complete switch in programming styles, hardware, support for external hardware, etc.

 

The programming isn't there to support it. Programs are written to understand 1s and 0s; quantum is more than that. It's a massive change, and something I doubt will happen in the next decade or even longer. Quantum will represent a shift in computing where all backwards compatibility is lost, and that'll be a HUGE thing.

 

I mean... what are they going to do, release a quantum processor with no software support? No one will buy it. 

 

I think Intel will do 1 or 2 more transistor shrinks before quantum even comes into the picture: 10nm, then probably a small jump like 8nm or something. Then they'll likely move on to other methods of increasing performance: higher clock speeds, bigger caches; hell, maybe we'll see a time when dual-CPU boards are commonplace, with software to support them. I think that'd be cool. I just really don't see quantum around the corner anytime soon.


20 hours ago, Uttamattamakin said:

Are PCs basically stagnant for the last decade or so? I mean... when I started with personal computers, from the mid 1980s to the mid 1990s your PC would have become obsolete at least five times over. The whole architecture of the PC would change. Even if you had an IBM PC in the mid 1980s as a home user, by 1989 it would be physically unable to run, say, Windows 3.0. And a late-80s 386 SX or DX would be very unlikely to run Windows 95 well. That's not even considering how HUGE the changes were in graphics. Compare how much graphics used to advance from, say, CGA to VGA to SVGA. It's not that your GPU from a few years earlier could do the same thing as a brand new one at lower res... it would be physically unable to do what a new GPU does.

Now... I think I could dust off my old Surface Pro 3 and play GTA V on it well enough to enjoy.

 

I would attribute this mainly to the lack of competition. Both Intel and Nvidia have had relatively little competition over the past ten or so years, especially in the high-end consumer range, so they have had no need to innovate there. Just look at Intel's reaction to the Ryzen release: more has happened in improvements over the past two years than in the previous eight. Over the past two years the number of cores available at many price points has doubled and, if the Ryzen 3000 leaks are accurate, it will double again. As for software and your old Surface Pro 3 running GTA V: improvements in software are tightly bound to improvements in hardware. No profitable game studio will create a game that the majority of potential customers cannot run enjoyably. When hardware innovation is abundant, devs take advantage of what they can; when it is not, they do what they can with what exists. Every era of computing will have its own Crysis.

 

If you want to look for huge changes, look at the parallelisation in DX12 compared to DX11; I suspect many more applications will become optimized for many cores. Or maybe nothing as big will happen again. Maybe computing has reached an age of increments, not revolutions, like many other now-mature industries.

 

Or look at mobile computing: ARM-based CPUs are a whole other thing.


20 hours ago, Uttamattamakin said:

Are PCs basically stagnant for the last decade or so?

Certainly not for the past decade; there have been massive improvements compared to PCs from 2008.

 

RTX as a whole is a big example. Also compare the architectural improvements in both GPUs and CPUs. PCs have been steadily improving over the years, but we're well past the point where each generation was exponentially better than its predecessor.

Quote or tag me( @Crunchy Dragon) if you want me to see your reply

If a post solved your problem/answered your question, please consider marking it as "solved"

Community Standards // Join Floatplane!


20 hours ago, Uttamattamakin said:

Are PCs basically stagnant for the last decade or so?

No.

Why, you ask?

Because of the advancements in technology over the past 10 years (10 years being a decade and all). Look at the advancements in RAM, CPUs, graphics!


15 hours ago, Crunchy Dragon said:

Certainly not for the past decade; there have been massive improvements compared to PCs from 2008.

 

RTX as a whole is a big example. Also compare the architectural improvements in both GPUs and CPUs. PCs have been steadily improving over the years, but we're well past the point where each generation was exponentially better than its predecessor.

I could have stated the question better. Of course your Intel Core processor from this year is a bit better than one from 2008.

However, that processor from 2008 can do everything one from 2018 can. It would just do it slowly.

Conversely, a processor that was top of the line in 1998 likely won't boot a copy of Windows 7 from 2008 (leaving aside the 32-bit builds of Windows 7, which did exist).

Compared to that kind of change, RTX is... nothing.


2 minutes ago, Uttamattamakin said:

I could have stated the question better. Of course your Intel Core processor from this year is a bit better than one from 2008.

However, that processor from 2008 can do everything one from 2018 can. It would just do it slowly.

That doesn't mean the industry is stagnant, though, as most users need the speed of newer components. I don't believe it would just be slow; I would use the term "unusable", assuming it didn't just straight-up refuse to do a task.

 

4 minutes ago, Uttamattamakin said:

Compared to that kind of change, RTX is... nothing.

RTX is kind of a breakthrough. For a single card to ray trace in real time and still maintain a playable framerate? That's a big deal.



6 minutes ago, Crunchy Dragon said:

That doesn't mean the industry is stagnant, though, as most users need the speed of newer components. I don't believe it would just be slow; I would use the term "unusable", assuming it didn't just straight-up refuse to do a task.

 

IDK about that. In my work as an educator I use a lot of older computers, PC and Mac, that the colleges either can't or won't replace, sometimes running rather demanding software. The older computers are, from what I can see, just a bit... slower, bigger, noisier.*

I am sure there are people who will really notice and suffer the differences.

Like when it comes to RTX and tensor cores: it would be nice to be able to write MATLAB and/or CUDA code to take advantage of those and run simulations based on actual general relativity and quantum field theory rather than approximations. Who does that, though?

* I am honestly having a hard time thinking of a software package that would not run on an older computer, other than something really specific like CUDA code requiring one of the more recent compute capability levels.


22 hours ago, M.Yurizaki said:

developers have to design things with even lower performance requirements in mind.

Yeah, except for Android developers; they just keep adding useless RAM-devouring crap to the OS and its programs so you need to buy a new phone every year lmao.... no thanks, I'll keep my 2008 phone that can do most of what new ones can, like calling, sending texts, or taking kinda crappy pictures.

 

 

About PCs, agreed: it was "add more speed" back in the day and now it's just ADD MORE CORES!! Look at the 9-series from Intel, or the Ryzen 3000 series that will market 16c/32t just for GAMING.

ASUS X470-PRO • R7 1700 4GHz • Corsair H110i GT P/P • 2x MSI RX 480 8G • Corsair DP 2x8 @3466 • EVGA 750 G2 • Corsair 730T • Crucial MX500 250GB • WD 4TB


1 minute ago, aezakmi said:

About PCs, agreed: it was "add more speed" back in the day and now it's just ADD MORE CORES!! Look at the 9-series from Intel, or the Ryzen 3000 series that will market 16c/32t just for GAMING.

And the problem I see with this is that gaming isn't going to magically get better with more cores. Playing an FPS like Doom, where you're in small rooms fighting maybe at most 5 enemies at a time, isn't that hard to run. And something like Ashes of the Singularity, where all of those cores actually do get some use, is a rare use case.

 

I mean, maybe we could have a GTA that actually simulates what conditions in LA are like during rush hour... but that doesn't make the game fun. If I wanted to experience LA rush-hour traffic, I'd just go there myself.
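The diminishing-returns intuition here is basically Amdahl's law: if only a fraction p of the frame's work parallelizes, the best possible speedup on n cores is 1 / ((1 - p) + p/n). A quick sketch (the 70% figure is an arbitrary assumption for illustration, not a measurement of any real game):

```python
# Amdahl's law: why piling on cores stops helping once a game's
# serial portion (main thread, draw submission, etc.) dominates.
def amdahl_speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.70  # assume 70% of a frame's work parallelizes
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(p, cores):.2f}x")
# Speedup is capped at 1/(1-p) ~ 3.3x no matter how many cores you add.
```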

