
Big Bad PSU math things yes.

Hamster Homie
Go to solution: solved by akio123008.

Hey, so I wanted to calculate how much energy I would waste each year because of my PSU (not the other components, just the PSU). I would like to share my calculations with you, and you can tell me where I'm wrong. CPU and GPU wattages were taken from HWMonitor; the rest is googling and estimating.

 

75W for my GTX 1650

70W for my Ryzen 5 3600

70W for motherboard and RAM (Gigabyte A520M S2H mobo + 2x8GB 3200 RGB RAM)

25W for my cooling and rest RGB stuff (NZXT Kraken M22 + 2 fans in the front)

20W for my storage (2.5" 500MB/s SSD + M.2 NVMe 1300MB/s SSD)

 

This all adds up to about 260 watts. That is at the biggest load my PC can take, and I think it has probably never seen such a load and never will. So when I am gaming, I think I would be using:

 

95% GPU

75% CPU

90% mobo and RAM

75% cooling + RGB

10% storage

 

This gives me a 207W power consumption while gaming.

My PSU (Corsair CX550M) is 89% efficient at a 207W load. So it would have to pull 232W from the wall to deliver 207W of DC?

This should give me a 25W energy loss.

I think I game around 3 hours per day.

 

25W for one hour = 25Wh = 0.025kWh

0.025kWh * 3 hours = 0.075kWh

 

This would give me 0.075kWh of energy being wasted every day.

In France, energy costs 0.1853€ per kWh.

 

1 / 0.075 = 13.3

 

It would take me 13.3 days to get to 1 kWh and pay 0.1853€ for the wasted energy.

 

0.075kWh * 365 = 27.375kWh

27.375kWh * 0.1853€ = 5.07€

 

So I would pay 5.07€ every year.

 

 

 

Am I correct on all of the above?

 

 

 

Now let's say I think I waste so much money per year doing this (like, what, 5€, omfg), so I might want to switch to the TX550M, which is 91% efficient at 207W. (BTW, I can't really take other PSUs because my case is too small.)

 

That would pull 227W from the wall,

making it a 20W loss.

 

20W for one hour = 20Wh = 0.020kWh

0.020kWh * 3 hours = 0.06kWh

 

0.06kWh * 365 = 21.9kWh

21.9kWh * 0.1853€ = 4.05€

 

With all of that, I would pay 4.05€ a year with the TX550M.

 

This is a 1.02€/year improvement compared to the CX550M. The TX550M costs 80€, so by saving 1€ every year I will turn a profit in 81 years if I buy it!!!!!!! gUyS i ThInK i'M gOnNa SwItCh To tHe TX550M i dId tHe mAtHs

 

If I'm wrong, please excuse me, I'm 15. Tell me where I'm wrong. Thx
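For anyone who wants to poke at the numbers, here is a small Python sketch of the same arithmetic. Every input (wattages, load fractions, efficiencies, hours per day, price per kWh) is just the estimate from the post above, so the output is only as good as those guesses; it also keeps the unrounded 207.5W intermediate value, so it lands a few cents away from the hand calculation.

```python
# Rough sketch of the arithmetic above (all input numbers are the OP's own estimates).

components = {          # estimated peak draw (W) and assumed gaming load fraction
    "GTX 1650":      (75, 0.95),
    "Ryzen 5 3600":  (70, 0.75),
    "mobo + RAM":    (70, 0.90),
    "cooling + RGB": (25, 0.75),
    "storage":       (20, 0.10),
}

dc_load = sum(watts * load for watts, load in components.values())   # ~207 W

hours_per_day = 3
price_per_kwh = 0.1853   # €/kWh in France (the figure used above)

def yearly_loss_cost(dc_load_w, efficiency):
    """Cost per year of the energy the PSU itself turns into heat."""
    wall_w = dc_load_w / efficiency      # what the PSU pulls from the outlet
    loss_w = wall_w - dc_load_w          # what never reaches the components
    kwh_per_year = loss_w * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

cx = yearly_loss_cost(dc_load, 0.89)   # CX550M, ~89% efficient at this load
tx = yearly_loss_cost(dc_load, 0.91)   # TX550M, ~91% efficient at this load

print(f"DC load while gaming: {dc_load:.0f} W")
print(f"CX550M wasted-energy cost: {cx:.2f} €/year")
print(f"TX550M wasted-energy cost: {tx:.2f} €/year")
print(f"Payback time of an 80 € TX550M: {80 / (cx - tx):.0f} years")
```

Running it gives roughly 5.2€/year of loss for the CX550M, 4.2€/year for the TX550M, and a payback time on the order of 75-80 years, which matches the conclusion above.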


25 minutes ago, Hamster Homie said:

Hey, so I wanted to calculate how much energy I whould waste each year because of my PSU (not other components just PSU). [...]

This is a 1.02€/year improvement compared to the CX550M. The TX550M costs 80€ and I will make profit in 81 years by saving 1€ every year if I buy it !!!!!!! [...]

If I'm wrong please excuse me I'm 15yo. Tell me where I'm wrong. Thx

A lot of that math is based on estimates that can vary somewhat in the real world. The only true measurement is something like a Kill A Watt between the machine and the wall outlet.



1 hour ago, Hamster Homie said:

If I'm wrong please excuse me I'm 15yo. Tell me where I'm wrong. Thx

The maths isn't wrong but

 

what exactly is your point?


1 hour ago, Bombastinator said:

A lot of that math is based on estimates that can vary somewhat real world.  The only true measurement is something like a kill-a-watt between the machine and the wall power.

By estimating, do you think I undershot my estimates? I think I even overshot on some things.

56 minutes ago, akio123008 said:

The maths isn't wrong but

 

what exactly is your point?

I wanted to know if I really only paid 5€ a year for energy loss. I did the math to see if power supply efficiency really matters that much. At my level it really doesn't, I feel. If you have bigger machines, like servers or Threadripper + 3090 equipped gaming rigs, it can vary, but for the most part, for regular gamers, you only need 80+ Bronze and it should do the job without wasting money.


35 minutes ago, Hamster Homie said:

for regular gamers, you only need 80+Bronze and it should do the work not wasting money.

True, but there are other reasons people buy more expensive power supplies.

 

 

 


2 hours ago, Hamster Homie said:

By estimating, do you think I undershot my estimations ? I think I even overshot on some things.

I wanted to know if I really only paied 5€ a year for energy loss. I did the math to see if power supply efficiency really amtters that much. Like here at my level it really does not I feel. If you have bigger machines like servers or Threadripper + 3090 equiped gaming rigs then it can vary but for the most part, for regular gamers, you only need 80+Bronze and it should do the work not wasting money.

No idea. When you multiply, though, the inaccuracies multiply too. Things can become insufficiently accurate pretty fast. This is a reason why estimates are often given as a high and a low figure, which can be very far apart. Columbus was famous for making use of this: he took high estimates of the size of Asia and low estimates of the size of the Earth to produce an enticing number, which turned out to have nothing to do with anything but did get his expedition funded. And that was around 1500 AD.



To muddy things even further, the "wasted" energy, as you call it, is wasted in the form of heat. So in the winter it perhaps contributes somewhat to heating your living space, and you'd have to account for that, which would require knowing the cost and efficiency of your heating. Much harder to calculate.

All in all, I think you've got the idea: it's absurd to upgrade a power supply in the name of efficiency, and mostly absurd to worry about it in the name of economics.


14 minutes ago, cachethrash said:

To muddy things even further, the "wasted" energy as you call it is wasted in the form of heat. So, perhaps, in the winter it contributes somewhat to heating your living space so you'd have to account for that, which would require knowing your heating cost efficiency. Much harder to calculate.

All in all I think you've got the idea, it's absurd to upgrade a power supply in the name of efficiency, and mostly absurd to worry about it in the name of economics.

The issue is electricity doesn't cost that much even where it's really expensive. The video card generally draws a lot more power than the CPU in this day and age. If you're really interested in electric-bill stuff, I suggest getting a power meter like a Kill A Watt and using it on everything. Computers are 100% efficient electric heaters: whatever wattage you dump into the machine is going to come out as heat. (Yes, this does mean that electric heaters should all be computers, because then at least you're getting something out of the whole thing. The thing is, computer chips are really expensive and nichrome wire is really cheap.)



On 7/5/2021 at 4:23 AM, Bombastinator said:

The issue is electricity doesn’t cost that much even where it’s really expensive. The Video card generally draws a lot more power than the cpu in this day and age.  If you’re really interested in electric bill stuff I suggest getting a power meter like a kill-a-watt and use it on everything.  Computers are 100% efficient electric heaters. Whatever wattage you dump into the machine is going to come out as heat (yes this does mean that electric heaters should all be computers because then at least you’re getting something out of the whole thing.  The thing is computer chips are really expensive and nichrome wire is really cheap)

Wait, crunching 0s and 1s doesn't consume energy? If you put 200W into a computer, it's going to crunch numbers AND release everything as heat? As if it only needs electricity as a way to move 0s and 1s around? Or am I completely misunderstanding this?


1 hour ago, Hamster Homie said:

Wait, crunching 0s and 1s doesnt consume energy ? If you put 200W in a computer it's gonna crunch number AND rejet everything as heat ? As if it only needs electricity as a way to move 0s and 1s ? Or am I completely misunderstanding this ?

This is where you get into some incredibly complex physics (information theory, that is), but that's exactly right.

 

I'm not going to pretend I fully understand this physics either, but what it boils down to is that information processing doesn't require energy from a purely theoretical perspective. It doesn't violate the laws of physics, since processing information technically doesn't do any work. So in theory, it would be possible to build a zero-energy-consumption computer.

 

The other posts mentioning that a computer is essentially a room heater are also correct; 200W into a computer is 200W of heat in the room.

 

Even the energy that goes into fans will turn into kinetic energy of the moving air and acoustic energy (the sound of the fan), which will eventually be converted into heat elsewhere in your room. Same for the energy that goes into LEDs and displays; that light will reach other objects (you and your eyes included) and be converted to heat there. So yes, it really is 100% a room heater.

 

 

 


21 minutes ago, akio123008 said:

This is where you get into some incredibly complex physics (information theory that is), but that's exactly right. [...] So yes, it really is 100% a room heater.

 

 

 

Holy shit. I mean, there are also quantum computers that are already "zero energy consumption computers". They just need to be as cold as physically possible. Why is that? idk


2 hours ago, Hamster Homie said:

Wait, crunching 0s and 1s doesnt consume energy ? If you put 200W in a computer it's gonna crunch number AND rejet everything as heat ? As if it only needs electricity as a way to move 0s and 1s ? Or am I completely misunderstanding this ?

I didn't say it didn't consume energy, I just said it wasn't terribly expensive money-wise. Think of it this way: a space heater uses 1500W, which is 3 or 4 times what most computers draw most of the time. They CAN use more, but usually not very often or for very long. If you want to go hardcore you can get it down to a fifth of that. With a laptop, which is designed from the ground up for low electricity use, it's even less; it's not unusual to see laptops with power bricks under 100W, and it won't even use 100W all the time. 100W is one fricking light bulb. Most of that is the video card though, so if you've got a mining rig with a mess of video cards, you're going to use a mess more electricity.



12 hours ago, Bombastinator said:

I didn’t say it didn’t consume energy I just said it wasn’t terribly expensive money wise

I think what he meant was that the processing itself doesn't "consume" energy, i.e. all the energy comes out as heat. As in, 200W into the PC means 200W of heat. He concluded that from your post, and that conclusion is correct.

 

1 hour ago, Orian Pax said:

any tldr?

- power supply efficiency doesn't matter much for your power bill

- doing the maths is possible, but getting an accurate number is hard, if not impossible (e.g. the OP did the maths right, but as others pointed out, he's probably off by some amount due to the estimates he started with)

- processing information doesn't do any work, therefore all the energy a computer uses comes out as heat

- therefore a computer is an expensive room heater that also happens to run software

- it could be argued that a computer consumes no power, because it heats your room, so your heating system has to work less hard: if your computer uses 200W, your heater can use 200W less, which means your net energy consumption is the same. Of course, this assumes your heater adapts to the room temperature using some sort of thermostat; if it's a dumb device running at full power all the time, this doesn't apply. (There's a small numerical sketch of this below.)
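To put a toy number on that last point (the 1000W heat demand and the 200W PC draw below are made-up figures, and it assumes purely electric, thermostat-controlled resistive heating):

```python
# Toy illustration: with thermostat-controlled electric heating, the PC's heat
# displaces heater output one-for-one, so total electricity drawn is unchanged.
room_heat_demand_w = 1000   # assumed heat needed to keep the room at temperature
pc_draw_w = 200             # everything the PC draws ends up as heat in the room

smart_heater_w = max(room_heat_demand_w - pc_draw_w, 0)   # thermostat backs off
print("thermostat heater:", smart_heater_w + pc_draw_w, "W total")   # 1000 W

dumb_heater_w = room_heat_demand_w                         # runs flat out regardless
print("dumb heater:      ", dumb_heater_w + pc_draw_w, "W total")    # 1200 W
```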


So according to your math you game 24/7 every day?



28 minutes ago, DoctorNick said:

So according to your math you game 24/7 every day?

 

On 7/4/2021 at 9:28 PM, Hamster Homie said:

I think I game around 3 hours per day.

 

25W = 25Wh = 0.025kWh

0.025 * 3 = 0.075

 

This whould give me 0.075kWh of energy being wasted everyday.

 

:)


4 hours ago, akio123008 said:

I think what he meant was that the processing itself doesn't "consume" energy aka all energy comes out as heat. [...]

- power supply efficiency doesn't matter much for your power bill [...]

That was one of the best TL;DRs I have ever seen on this forum.


In general, computers don't cost a lot to run. Efficiency, however, is important because there are so many computers in use, both in homes and in businesses. Because of that, making each of those computers 1% more efficient is a huge deal. Furthermore, reducing the power consumption of those computers by 1% reduces the overall power consumption by more than 1% wherever those systems need to be air conditioned. Since a lot of those computers are located in places like California and Texas, which are both populous and hot, that's a worthy consideration.

 

At least in the United States, the power grid is outdated and inadequate in many places. Furthermore, a lot of that electrical power is generated by burning fossil fuels (mostly natural gas these days), which is a non-renewable resource. Burning it also releases CO2, which is a greenhouse gas.

 

Improved efficiency in the power supply also means less heat output in the power supply, which (in theory) should make it more reliable. In practice, efficiency seems to be a relatively small factor in the reliability of switching power supplies. 
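As a rough sketch of that air-conditioning multiplier (the 300W PC and the coefficient of performance of 3 below are assumptions for illustration, not figures from this thread):

```python
# Rough sketch of the air-conditioning multiplier: every watt a PC dumps into
# an air-conditioned room also has to be pumped back out by the AC.
pc_draw_w = 300.0   # hypothetical office PC
cop = 3.0           # assumed AC coefficient of performance: ~1 W in moves ~3 W of heat out

ac_overhead_w = pc_draw_w / cop
total_w = pc_draw_w + ac_overhead_w
print(f"PC: {pc_draw_w:.0f} W, AC overhead: {ac_overhead_w:.0f} W, total: {total_w:.0f} W")

# Cutting the PC's draw by 1% (3 W) also removes ~1 W of AC work, so the
# building-wide saving is about a third larger than the PC-side saving alone.
```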


9 hours ago, H713 said:

In general, computers don't cost a lot to run. Efficiency, however, is important because there are so many computers in use, both in homes and in businesses. [...]

For small machines like home computers not drawing much more than 300W, I don't think it's a huge deal because there really isn't a lot of waste. As long as you go with 80+ there shouldn't be much waste, and even if there is, for half of the year it can be used to warm up your house, saving the 300W a room heater would otherwise consume.

Where it is really important is in offices or server rooms that need to be cooled, because more than 10 PCs each bringing 300W into the room is a lot of heat that then has to be removed by other energy-consuming coolers. Servers and offices should really have 90%+ efficient PSUs and 3nm CPUs.


On 7/7/2021 at 5:06 AM, akio123008 said:

I think what he meant was that the processing itself doesn't "consume" energy aka all energy comes out as heat. [...]

- it could be argued that a computer consumes no power because it heats your room, so your heating system has to work less hard [...]

That argument would require that one has electric heat and that the room is being heated at the time.



2 hours ago, Bombastinator said:

That argument would require that one has electric heat and the room is being heated at the time 

Exactly. Maybe not heated all the time, but say heated whenever the computer is on.


1 hour ago, akio123008 said:

Exactly. Maybe not heated all the time, but say heated whenever the computer is on.

Which means that if, say, an air conditioner was on (and those are almost always electric), more electricity would be used. There are some propane-powered ones, but they are rare and usually relegated to reefer trucks. Swamp coolers (which include most "desktop air conditioners") use water evaporation rather than a refrigeration cycle, but only work well in dry or desert climates. One can make one with a wet towel and a fan.



18 minutes ago, Bombastinator said:

Which means that if, say, an air conditioner was on

Oh yeah, but that's where you get into a totally different situation. Of course, you can only see the heat from a computer as useful when the room actually requires heating (e.g. when it's cold outside).


1 hour ago, akio123008 said:

Oh yeah but that's where you get in to a totally different situation. Of course you could only see the heat from a computer as useful when the room requires heating (eg when it's cold outside). 

Fair enough.


