
Do you ever think that hardware will get to the point where..

blazetheway

Ok so this is a really weird late night thought I had a long time ago, and it's kind of long, but bear with me for a second. When Intel's XTU program came out and you could do reliable software overclocking from the desktop, I started to have this thought.. wow.. when I was younger stuff like this wasn't even possible. To some of you it might be normal, but to me it's wild to see how far we've come in my lifetime and how fast we got here. Never in a million years would we have dreamed that we could trust any type of overclocking from the desktop to work reliably. Everything needed to be done via the BIOS, and if you bricked your BIOS there was no fix; the board was dead. Every single motherboard manufacturer made updating the BIOS extremely painful back in the day. It seemed as though every single time a new game came out you'd need to upgrade a graphics card, a CPU, a sound card, everything. My 2080Ti from 2018 is still absolutely slaying games at 1080p; ten years ago I'd have gone through four graphics cards in that same span. If something like ray tracing had come out 10 or 20 years ago, you would have probably needed dedicated hardware to even use it, and it would have been stupidly expensive and impractical.

 

Nowadays things are starting to become very integrated. Sound cards are barely used anymore as onboard audio has gotten better. Lighting is all controlled from headers on the motherboard, and even some power supplies are coming with lighting headers. No more cold cathode tubes; now we have expensive LEDs that don't get hot and come with remotes. Companies like Galax are shipping remotes for lighting with their video cards. Process nodes in GPUs and CPUs have shrunk dramatically. People still use hard drives, but much less, due to heat and reliability, and soon SSDs will hold enough data and be so reliable that HDDs are going to disappear, at least in the consumer space (not enterprise). SSDs the size of a stick of gum now hold multiple TBs. You can even use AI to overclock from the BIOS. Certain CPUs are coming out more locked down from the factory: we've seen this with AMD's Ryzen chips (X3D), and we've also seen Intel lock down undervolting recently. XTU now supports automatic overclocking from within Windows on 14th gen.

 

Where I see this going (for better or for worse).. and the reason why I made this thread is simple: I think that eventually you're not going to be able to overclock anymore. It's easy to find guides and whatnot.. but most people I've met don't do it. Even the ones who build brand new computers think that their PCs will last longer if they just don't touch anything and leave a ton of headroom on the table for performance. More sensitive and more powerful equipment will become locked down because the smaller circuitry makes it more complicated and more fragile. You can see that with quantum computing: quantum hardware is so sensitive that even the Earth's magnetic field interferes with it, and they have trouble keeping it stable. One of the reasons you don't see factory overclocks is that silicon quality varies from chip to chip and batch to batch. If Intel, for instance, comes out with a new CPU and 95% of them can hit 7GHz but the other 5% can only hit 6.5GHz, then they are going to default the release to 6.5GHz or less (unless they save the good bins to print money). I feel like the tech at the factories will become so much more advanced that these inconsistencies will be solved. Another reason is (obviously) electricity/power cost and consumption, and we will see much more integration than we do now. But with process nodes shrinking we're seeing much greater efficiency. Power supplies are even getting super small. One limitation right now for consumer electronics is battery tech.. but we're already seeing stuff like graphene batteries being experimented with.
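
To show what I mean about the binning thing (totally made-up numbers here, this is not how Intel actually decides anything, just a toy sketch):

# Toy binning sketch with made-up yields: the factory clock ends up being
# whatever speed essentially every chip can validate at.
bins = {
    7.0: 0.95,  # fraction of chips stable at 7.0 GHz
    6.5: 1.00,  # fraction of chips stable at 6.5 GHz or better
}
target_yield = 0.999  # they want basically every chip to pass
factory_clock = max(clk for clk, frac in bins.items() if frac >= target_yield)
print(f"Factory default clock: {factory_clock} GHz")  # -> 6.5 GHz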

 

Anyways, that's just my long rant. What do you all think?


Honestly overclocking hasn't been worth it for a while now. Generally speaking the different turbo technologies are so good that they often even beat a conventional overclock, since they can boost individual cores to high frequencies. Why waste hours stress testing and validating an overclock when it can still be slightly unstable and the performance increase is negligible? I would rather have a PC that is 100% stable than one that's a few percent faster, especially because realistically I won't notice the performance difference, but I will notice a game crash.


19 minutes ago, blazetheway said:

and leave a ton of headroom on the table for performance.

Not really. The reason most people don't do it nowadays is there just isn't any headroom on modern chips. A 13700K, for instance, will top out at 5.6GHz if you're lucky and have a ton of cooling, up from the standard 5.3GHz. That's ~6% more performance in entirely CPU bound workloads; in the grand scheme of things it's completely unnoticeable, and it costs 100W higher power consumption. As manufacturing continues to get more and more consistent, that improvement will get smaller and smaller as Intel/AMD push the chips to run ever closer to maxed out from the factory. AMD CPUs are even worse, where 5% more performance is the best case scenario, and it's usually closer to 2-3%.
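
To put quick numbers on that ~6% figure, here's the napkin math (a rough sketch only, assuming performance scales 1:1 with clock speed in a fully CPU bound workload, which it usually doesn't quite do):

# Rough clock-speed uplift; assumes performance scales 1:1 with frequency.
stock_ghz = 5.3       # standard boost clock
overclock_ghz = 5.6   # lucky chip with a ton of cooling
uplift = (overclock_ghz - stock_ghz) / stock_ghz
print(f"Uplift: {uplift:.1%}")  # -> 5.7%, call it ~6%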

 

The area where there still is a fair bit of headroom is RAM overclocking, and that has seen a bit of a resurgence over the past couple of generations since you can get a fair bit of a performance uplift from it. The problem is that it's still not a ton, and the amount of effort it takes to overclock, and especially stress test, RAM is an order of magnitude higher than CPU overclocking.

 

30 minutes ago, blazetheway said:

Another reason is (obviously) electricity/power cost and consumption, and we will see much more integration than we do now. But with process nodes shrinking we're seeing much greater efficiency. Power supplies are even getting super small. One limitation right now for consumer electronics is battery tech.. but we're already seeing stuff like graphene batteries being experimented with.

You say node shrinking is bringing much more efficiency, but the problem is it isn't shrinking fast enough. I know AMD has been quoted saying (pretty sure Nvidia has also said this) that the data center's need for compute horsepower is far exceeding the performance they can squeeze out of a generational improvement with a node shrink, so in order to try and match that demand they're having to boost power figures to eke out those last few percent of performance.

 

In the past, that last few percent was the overclocking headroom. Back then (say 6th gen Intel, for instance) it was incredibly rare for a chip not to do at least 10% higher than the rated clock, and the only reason they were rated that much lower, instead of Intel simply selling the rare 6700Ks that couldn't hit 4.4GHz as 6600Ks, was that pushing them 10% higher would dramatically increase the power consumption. For the people who wanted that extra performance and didn't care about the power draw, you could overclock it yourself. Nowadays, when they're already pushing chips past where they're efficient to max out their performance, there just isn't overclocking headroom anymore.

 

 

That is all to say I don't think overclocking will fully die. There will be those people in the tuning community who will want to eke out every last percent from their systems. Then there's the XOC folks like Elmor, Splave, Hicookie, etc. who bring different CPUs to world record speeds and are effectively a bit of marketing for whoever releases them (you'll be more inclined to think "Intel better" if you see a W9-3495X holding the Cinebench world record, thanks to the halo effect), so not disabling it for those folks is effectively free marketing. Still, it's not in a place where it actually makes sense for the vast majority of people. I personally haven't run overclocked CPUs in my daily systems in a while now, only really small undervolts that I don't bother stress testing. Memory I will still do some overclocking with, mostly because if you aren't going for insanity settings it's not that difficult (it's still way more than the standard user would want to do), but even that is a mega-enthusiast type of activity.

 

If you want to tune hardware for the sake of tuning hardware, getting an i7 920 and a half-decent X58 board is way more fun than you'll ever have with a 7950X and a Crosshair X670E Gene, and at 1/10th the price. Doing it with modern hardware, while it can be fun in some circumstances, usually isn't.


I don't think it'll go away any time soon. Intel and AMD both want to appeal to the hobbyist market. Most people who overclock do so knowing they won't get all that much performance out of it. 


20 hours ago, josefah said:

I don't think it'll go away any time soon. Intel and AMD both want to appeal to the hobbyist market. Most people who overclock do so knowing they won't get all that much performance out of it. 

Either that or they want to make a CPU cook a hotdog. Btw, anyone know how to cook a hotdog with the heat from a CPU?


36 minutes ago, Milesqqw2 said:

Either that or they want to make a CPU cook a hotdog. Btw, anyone know how to cook a hotdog with the heat from a CPU?

You're giving me reasons to splurge more on my PC. Now I just need to convince my wife we need a computer with the biggest and fastest CPU for cooking reasons. If you don't eat you'll die of starvation, so obviously we need to overclock our PC.


37 minutes ago, Issac Zachary said:

You're giving me reasons to splurge more on my PC. Now I just need to convince my wife we need a computer with the biggest and fastest CPU for cooking reasons. If you don't eat you'll die of starvation, so obviously we need to overclock our PC.

Yeah, I’m out here helping people eat.

