Is it safe to use a 6A (max current) surge protector power strip with a 600 W PSU whose power cord is rated 10A?

Dandapani

Recently I noticed a noise coming from the power strip (surge protector) about 15 minutes into gaming.

I'm using a 600 W PSU and its power cord is rated 10A 250V, but the power strip's maximum current rating is 6A.

 

Should I get a power strip with a max current rating of 10A or 13A? Is this a common issue with higher-wattage PSUs?

 

(Image attached: power strip.jpg - surge protector power strip)


Well, if by "safe" you mean heating up and melting the plugs and wires, then sure, it's safe.


The power cable is rated 10A/250V because that rating only describes how much current can safely pass through the cable; it depends on the thickness of the wires and their insulation. The power supply uses a mass-produced cable, which is standard and may be bundled even with a 250 W power supply, because it's cheaper to make and stock 100k pieces of one cable than to keep 3-4 separate cables. Some countries may also require a minimum rating even when it's not technically needed.

 

The power supply only consumes as much power as your computer components require: if your components consume only 100 watts, your power supply will only draw around 110-120 watts from the wall (some losses due to conversion efficiency).

If you're powering from 110 V AC, that's roughly 1 A of current on average (120 W / 110 V ≈ 1.1 A; internally the PSU rectifies that to about 150 V DC, where the same 120 W is only around 0.8 A).

If you're in Europe or a country using 230 V ±10%, then you're looking at roughly half that current.

 

So even when you're gaming and the computer is consuming 300 watts, the AC current is only around 3 A at 110 V.
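To put numbers on that, here's a minimal Python sketch of the same arithmetic. The 90% efficiency figure is an illustrative assumption (real PSUs vary with load), power factor is ignored, and `mains_current` is just a name for this example:

```python
def mains_current(load_watts, mains_volts, efficiency=0.90):
    """Approximate average AC current drawn from the wall while the
    PSU delivers load_watts to the components (assumed 90% efficient,
    power factor ignored)."""
    input_watts = load_watts / efficiency  # wall draw includes conversion losses
    return input_watts / mains_volts

for load in (100, 300, 600):
    print(f"{load:>3} W load -> "
          f"{mains_current(load, 110):.1f} A @ 110 V, "
          f"{mains_current(load, 230):.1f} A @ 230 V")
```

Even a 600 W PSU running flat out averages under 3 A on 230 V, comfortably below a 6 A strip; on 110 V the same load averages about 6 A, right at the strip's limit.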

 

Now, there's some fine print here: by design, switch-mode power supplies can take energy from the mains input in bursts. For example, your 110 V supply may consume 3 A on average while delivering 300 watts to your components, but it may pull 5-10 A for something like 50-100 ms, then maybe 2-4 A for half a second, and take nothing at all from the mains for the remaining ~400 ms. The average is around 3 A, but for very brief moments every second the demand can be much higher.
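As a toy illustration of that burst pattern, using the made-up numbers above (not measurements of any real PSU):

```python
# One second of input current in 1 ms slots: a 100 ms burst at 8 A,
# half a second at 4 A, then ~400 ms of nothing (illustrative values).
burst = [8.0] * 100 + [4.0] * 500 + [0.0] * 400

print(f"peak draw:    {max(burst):.1f} A")                # what the strip briefly sees
print(f"average draw: {sum(burst) / len(burst):.1f} A")   # what a watt meter reports
```

The average works out to about 2.8 A, but for a tenth of every second the strip carries 8 A, well past a 6 A rating.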

 

Depending on the power conditioner/surge protector, the protector may trip and cut power to the PC because, during those brief peaks, it sees the PC drawing more current than the product is designed for and shuts off to protect itself from overheating.

 


Even if your load is less than the rating of the power strip, you should still use one rated for the maximum rating of the circuit it's plugged into. I won't use a power strip rated below 15 A, the rating of any outlet I would plug it into.
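A quick sketch of why the circuit rating is the number that matters (the 15 A breaker and 10 A load here are hypothetical examples):

```python
BREAKER_A = 15   # assumed North American 15 A branch circuit
STRIP_A = 6      # the strip from the original post
LOAD_A = 10      # hypothetical sustained load or fault

print(LOAD_A > BREAKER_A)  # False: the breaker never trips at 10 A
print(LOAD_A > STRIP_A)    # True: the strip carries 167% of its rating indefinitely
```

The breaker only protects the wiring behind the wall, so anything rated below the breaker can be overloaded without anything upstream ever noticing.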


 


1 hour ago, mariushm said:

The power cable is rated 10A/250V because that rating only describes how much current can safely pass through the cable. [...]

 

Thank you for this information.


5 minutes ago, Lady Fitzgerald said:

Even if your load is less than the rating of the power strip, you should still use one rated for the maximum rating of the circuit it's plugged into. [...]

Yes, I have stopped gaming for now and am just doing some programming work. I will get a new power strip soon.

Hopefully neither my PSU nor its power cord will be damaged before then.

