
# I really don't understand wattage

Solved by mariushm:

Power (measured in watts) is the product of voltage and current (how much electric charge flows through the wires per second).
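As a quick sketch of that formula, here's the multiplication with made-up example numbers (a 12v rail and a 10-amp draw are just illustrative, not from any specific PC):

```python
# Power (W) = voltage (V) x current (A)
voltage = 12.0   # volts, e.g. a PSU's 12v rail
current = 10.0   # amps drawn by some component (hypothetical)
power = voltage * current
print(power)     # 120.0 watts
```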

The power supply in your computer takes the 220v AC from the wall (+/- 10%; European countries are supposed to have standardized on 230v +/- 10% or thereabouts) and converts it internally to around 400v DC. From there, the power supply produces the voltages the computer needs (12v DC, 5v DC, 3.3v DC), each in the amount the parts need at that moment in time. The power supply only takes as much power from the mains socket as the components require at any given moment.

If a power supply has 500w printed on it, that doesn't mean your computer consumes 500 watts all the time. It simply means that when components inside your PC demand power, the power supply can deliver it, up to a maximum of 500 watts.

So for example, if your CPU wants 100w when you're gaming, your PSU can produce those 100 watts and deliver them to the CPU. If your video card wants 200w, it can handle that too, because 100w + 200w is less than 500w. If you add a second video card that wants 300w, your power supply can no longer keep up and will most likely shut off, because 100w + 200w + 300w is more than 500w, and the power supply can only produce up to 500w.
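That budget check can be sketched in a few lines, using the same hypothetical numbers from the example above:

```python
# Hypothetical component draws from the example above, in watts
psu_rating = 500
loads = {"cpu": 100, "gpu_1": 200, "gpu_2": 300}

total = sum(loads.values())
print(total)                # 600
print(total <= psu_rating)  # False -> the PSU can't supply this and will shut off
```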

Your CPU and video card constantly adjust their frequencies and turn off or slow down parts of themselves to save power when they have no work for those parts, which is why your computer won't use the same amount of power all the time. When you're watching a YouTube video, only a tiny part of your video card is busy decoding it, so the card uses little power. When you're gaming, the video card goes all out.

The next bit may not matter to you, but others might appreciate it.

When the power supply takes the 220v AC (or 110v AC in the US and other such countries), converts it to around 400v DC, and then converts that again down to the smaller voltages the parts want, these conversions aren't perfect. There are some losses that come out as heat, and that's why the power supply has fans.

The point is that these conversion losses can be up to 10-15%. For example, if the components inside your computer consume 300 watts while you're gaming, your power supply may need to draw around 350 watts from the mains socket in order to "create" those 300 watts for the parts in your computer.

If your power supply says 500w on the label, then when the parts actually draw that much, the power supply may need to pull around 600 watts from the mains socket in order to produce those three voltages and deliver the full 500w to the parts in your computer.
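The wall-draw estimate above can be written as a one-line division. The 85% efficiency figure here is an assumption for illustration (real units vary; certification tiers like 80 Plus Bronze/Gold guarantee different minimums):

```python
# Wall draw = DC load / efficiency
efficiency = 0.85        # assumed; a typical-ish figure, not a measured one
dc_load = 300            # watts the components actually consume
wall_draw = dc_load / efficiency
print(round(wall_draw))  # 353 watts pulled from the socket, close to the ~350 above
```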

In most 230v (+/- 10%) countries, the mains sockets are on 16 or 20A fuses and the wiring is thick enough to handle that. That's 230v x 16A = ~3680 watts, so 500-600w is no big deal. It does start to matter if you have 4-5 computers and plug them all into one extension cord.
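The same arithmetic shows roughly how many such PCs one circuit could feed (a rough upper bound; real breakers trip on sustained overload and you shouldn't run a circuit at 100%):

```python
mains_voltage = 230    # volts
fuse_current = 16      # amps (a common fuse/breaker rating)
circuit_capacity = mains_voltage * fuse_current
print(circuit_capacity)                  # 3680 watts

pc_wall_draw = 600     # worst-case wall draw from the example above
print(circuit_capacity // pc_wall_draw)  # 6 such PCs before exceeding the fuse
```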

If I had a 500w PC and I plugged it into a 220w home outlet, would it cause a power outage?


You probably have 220v (volts) in your house, not a 220w outlet.

220v (or 230/240v) is a common voltage in Europe, and probably some other places too.

If you get a PSU meant for your region and your house's electrical wiring isn't 100 years old, you'll be fine.



I think you mean 220v, not watts.

If you plug your PC into a wall outlet, it'll work, as long as the voltage selector is set correctly (only a concern on older PSUs).


thanks guys


Also, watts = volts x amps, so you can work out how much current you're drawing, in case you're interested.
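Rearranged, that's amps = watts / volts. For instance, with a made-up wall draw of 400 watts on a 230v supply:

```python
# current (A) = power (W) / voltage (V)
pc_draw = 400          # watts at the wall (hypothetical example figure)
mains = 230            # volts
amps = pc_draw / mains
print(round(amps, 2))  # 1.74 A drawn from the socket
```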

