Hi!
Right now I have a roughly 6-year-old laptop that I use as a server for hosting a few game servers (1-4 at most) and a web server, but it doesn't really have anything going for it other than its low power consumption.
A bit more than a year ago I built a new desktop, which is currently unused because I also have a laptop with similar specs. Recently I started thinking about turning that desktop build into a server, since for the stuff I need it is still far more than enough.
The laptop:
Lenovo G505s
AMD A10-5750M running at base clocks
AMD Radeon™ 8570M 2 GB
2x4GB DDR3 - Running at 1600MHz
The desktop:
AMD Ryzen 5 2600X - no OC, running at the base 3.6GHz
MSi GTX 1660Ti Ventus 6G
G.Skill Aegis 2x8GB DDR4 - Running at 3000MHz
My question is: how can I keep this setup from consuming so much power? It's a significant leap in performance over my old laptop, but according to various benchmark sites the 2600X alone draws around two times (or even more) the maximum output of my laptop's ~90W power brick. Is that data even true? Several sites claim it won't really go above 100W unless it's overclocked and under heavy load, while others claim it draws 100W+ even at stock clocks under moderate load.

Also, since the 2600X doesn't have an iGPU, I can't just pull out the graphics card, so I thought about getting something like a GT 710, or setting the machine up to boot everything on its own after a restart and accessing it through RDP. Is this route even worth going for, or should I not bother with it at all? And if someone could give me some tips on making this setup a bit more energy efficient, I'd be glad (I don't mind losing a bit of performance).
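To put the power difference into money, here's the rough back-of-the-envelope sketch I've been using; the wattages (~40W average for the laptop, ~120W average for the desktop) and the price per kWh are just placeholder assumptions I made up, not measurements:

```python
# Rough yearly electricity cost comparison for a machine running 24/7.
# All wattages and the kWh price below are placeholder assumptions,
# not measurements -- swap in your own numbers.

HOURS_PER_YEAR = 24 * 365

def yearly_cost(avg_watts, price_per_kwh):
    """Estimated electricity cost per year for a given average draw in watts."""
    kwh_per_year = avg_watts / 1000 * HOURS_PER_YEAR
    return kwh_per_year * price_per_kwh

price = 0.25  # assumed price per kWh, adjust to your local tariff

laptop = yearly_cost(40, price)    # assumed ~40W average for the laptop
desktop = yearly_cost(120, price)  # assumed ~120W average for the desktop

print(f"Laptop:  ~{laptop:.0f} per year")
print(f"Desktop: ~{desktop:.0f} per year")
print(f"Difference: ~{desktop - laptop:.0f} per year")
```

Even with made-up numbers like these, the gap looks big enough that I'd really like to keep the desktop's average draw down.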