How's this APU build look?

Seminole
Just now, Seminole said:

Let me tell you about thermodynamics. Thermodynamics is about the transfer or state of energy. In this case our energy is polarized electrons, or electricity. Every electron has the potential of turning into heat which is normally looked at as a "decay".

This is why power plants have to be close to cities, because for every amount of distance the electricity must travel, a certain percentage turns into heat (and sometimes other forms of energy as well). The percentage discussed is what determines efficiency.

Now for the math. Let's say your computer consumes 300W at idle and 400W under load, and you have a 500W power supply rated 80 Plus. So we know at 20%, 50%, and 100% of usage you will see at least 80% efficiency. Since our computer ranges 300-400 watts, we know we will not dip below the 20% range, so we will always have at least 80% efficiency.

So the PSU is giving our components 300-400 watts of electricity at any given time, and losing up to 20% of that electricity, but that's okay because 20% of 400w is 80, and 400w + 80w is not greater than 500w, so we would theoretically have no power issues.

But let's say we replace the PSU with a 500W unrated power supply. Now we are missing a whole part of our math problem: we don't know the efficiency. But let's just assume it is 75% efficient. 25% of 400W is 100W. 400W + 100W is 500W, meaning our PSU would theoretically output its maximum amount of electricity, and if it dips below 75% efficiency or if our computer needs 1 more watt of electricity, everything shuts down and components are possibly damaged.

 

tl;dr inefficiency is electrons being unstable and turning into heat energy. This in turn cuts a percentage of the power to your computer; if your computer doesn't have enough power, it crashes and can damage components.

 

Therefore, instability.

Assuming your definition of thermodynamics is true and correct, let's try to correct your own mistakes.

 

Your problem starts here:

Quote

So the PSU is giving our components 300-400 watts of electricity at any given time, and losing up to 20% of that electricity, but that's okay because 20% of 400w is 80, and 400w + 80w is not greater than 500w, so we would theoretically have no power issues.

No, the PSU doesn't lose part of the power it is supposed to deliver. This is not how power supplies work at all. The PSU delivers whatever power is demanded, but it pulls more power from the plug to deliver said power, i.e., it compensates for its efficiency loss.

So yes, it loses power as heat, but it does deliver what it's rated to deliver. It just has to pull more to deliver the same power compared to, say, an 80 Plus power supply.

 

Some examples:

If you have a 300W load on a 500W power supply, and its efficiency at 300W is 70%, then the PSU will pull ~428.6 Watts from the wall, and supply 300 Watts of power to the computer.

If you have a 20W load on a 500W power supply, and its efficiency at 20W is 30%, then the PSU will pull ~66.67 Watts from the wall, and supply 20 Watts of power to the computer.

If you have a 500W load on a 500W power supply, and its efficiency at 500W is 90%, then the PSU will pull ~555.6 Watts from the wall, and supply 500 Watts of power to the computer.

If you have a 150W load on a 500W power supply, and its efficiency at 150W is 50%, then the PSU will pull 300 Watts from the wall, and supply 150 Watts of power to the computer.
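The pattern behind these numbers is just wall draw = load ÷ efficiency. A quick Python sketch that reproduces them (the function name is mine, purely for illustration):

```python
def wall_draw(load_watts, efficiency):
    """Power pulled from the wall to deliver load_watts at the given efficiency."""
    return load_watts / efficiency

# The four examples above:
print(round(wall_draw(300, 0.70), 1))  # 428.6
print(round(wall_draw(20, 0.30), 2))   # 66.67
print(round(wall_draw(500, 0.90), 1))  # 555.6
print(round(wall_draw(150, 0.50), 1))  # 300.0
```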

 

The rest of your argument revolves around your initially wrong concept. Hopefully now you understand.

Quote

The problem is that this is an nVidia product and scoring any nVidia product a "zero" is also highly predictive of the number of nVidia products the reviewer will receive for review in the future.

On 2015-01-28 at 5:24 PM, Victorious Secret said:

Only yours, you don't shitpost on the same level that we can, mainly because this thread is finally dead and should be locked.

On 2016-06-07 at 11:25 PM, patrickjp93 said:

I wasn't wrong. It's extremely rare that I am. I provided sources as well. Different devs can disagree. Further, we now have confirmed discrepancy from Twitter about the use of the pre-release 1080 driver in AMD's demo despite the release 1080 driver having been out a week prior.

On 2016-09-10 at 4:32 PM, Hikaru12 said:

You apparently haven't seen his responses to questions on YouTube. He is very condescending and aggressive in his comments with which there is little justification. He acts totally different in his videos. I don't necessarily care for this content style and there is nothing really unique about him or his channel. His endless dick jokes and toilet humor are annoying as well.

 

 


36 minutes ago, Shahnewaz said:


Your math and my math are very similar. I never said that the power lost was never delivered; I didn't calculate for it for the sake of simplification. You're simply calculating the compensation of the power; I calculated the original displacement of that power.

The thing I'm trying to point out is that if enough electricity is lost and has to be compensated for, it will overload the PSU. Efficiency is the way around that. If your PSU is not efficient enough, then it will not be able to supply the needed electricity and will overload.

Most power supplies have a bit of circuitry to measure how much electricity is entering and leaving. If it exceeds the rated throughput then it shuts off.

 

Simply put, if it isn't efficient enough, complications will ensue.


I'm using the 7850K; do not do it! If you're going budget, get an FX-4350 and a 750 Ti — that's your best bet. It's doable under $400 USD, and with kinguin.net you can get Windows 10 Pro for $30 USD.


1 minute ago, Seminole said:


How will lost electricity "overload" the PSU? IT JUST GETS DISSIPATED AS HEAT! You're contradicting your own definition of thermodynamics here. If the AC power isn't being efficiently converted to DC, some of it will escape as heat, i.e., energy lost. It just dissipates off the unit.

 

Yes, "if enough electricity is lost and has to be compensated for, it will overload the PSU." Put it this way: if you try to pull more power than your PSU is designed to deliver, it will be overloaded. But that is what PSU ratings are for! You don't put a 400W PSU in a system that needs 1500 Watts! Regardless of efficiency, a PSU's power rating is simply the total power it can deliver. A 400W PSU delivers a total of 400 Watts, a 650W delivers 650 Watts, and so on.

 

And yes, some power supplies do have overload protection. But you'll have to go out of your way to trip it, i.e., try to draw 650 Watts from a 400W power supply.

 

As I explained earlier, your PSU will deliver its rated power just fine. Lower efficiency simply means more power input for the same power output. It might run a tad hotter than other PSUs because of its lower efficiency, but any built-in PSU fan is more than enough to take care of that.

 

I'm trying my best to make you understand.


1 hour ago, Shahnewaz said:


I'm not contradicting myself at all.

Watts = x

Efficiency as a fraction = y

Power needed/delivered = z

 

(z/y) + z = x

 

x cannot exceed the maximum throughput or your PSU is very likely to overload.

 

EDIT: And just to clarify, you cannot input more electricity than the maximum throughput. The capacitors simply can't handle that.


1 hour ago, Seminole said:


Your equation is wrong. It's simply z = x × y.

 

A power supply can actually pull more AC power than its rating in order to deliver the DC power it is rated for. If your PSU is rated at 500W with 90% efficiency at full load, then it will deliver 500 Watts DC, but it will consume 555.6 Watts of AC power from the wall. Don't think of the rating as the maximum power the PSU can physically draw from the wall; the PSU is designed to pull as much power as it needs to deliver its rated power.
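To put numbers on that relation: z = x × y rearranges to x = z ÷ y. A quick sanity check in Python (variable names taken from the posts above, purely illustrative):

```python
# z = delivered DC power, y = efficiency, x = AC power pulled from the wall.
# z = x * y  rearranges to  x = z / y.
z, y = 500, 0.90
x = z / y
print(round(x, 1))  # 555.6
```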

 

A 400W rating is not the maximum power the PSU can pull from the wall, but rather the maximum power it can output to your system.

 

And to be clear, none of this has anything to do with the stability or the performance of the power supply. It's only when you exceed the maximum capacity of the PSU that you'll start to face problems, if any.


37 minutes ago, Shahnewaz said:


The equation is rough; I'm not good at math, so forget that.

But the PSU loses power in the contacts and wires within the PSU. The capacitors hold that electricity. The capacitors are only made to put through the rated amount of electricity, and can only hold that amount. You pull x watts from the wall, it goes into your PSU, and the electricity lost turns into heat before it's put into your PC. That's why PSUs have fans.

The unit is only made to contain a certain number of watts and output that amount. That's the number on the box. If you pull in more than that, it overloads in some way or another. That's how they work.

 

Linus has a whole video about power supplies. Watch it. I really want you to understand how it works. Thermodynamics is a very interesting field, I think you'll like it.


1 hour ago, Seminole said:


No, the PSU doesn't lose power through contacts and wires. That would short out the PSU.

Yes, the unit and its capacitors are rated to provide a certain number of watts, but won't you agree that a 500W rated PSU delivers 500 Watts of power? Isn't that what I've been trying to tell you so far?

 

 

These people are regurgitating the EXACT same thing I've been trying to explain to you about efficiency. It has NOTHING to do with stability or performance. The last video here is the one I suspect is the source of all your problems.

 

Linus said it's "inappropriate" to buy a PSU for a system whose power consumption falls waaaaay below (less than 20% of) the PSU's load capacity. He didn't say it's dangerous, or that it causes instability. You'll JUST get lower efficiency if you pull 100 watts at most from a 1500W power supply. Meaning, a bit more power will be wasted, and that's it! Nothing else. You'll also be wasting so much more money on a 1500W PSU, and you wouldn't utilize even half of its potential. Hence the term "inappropriate".


15 hours ago, Shahnewaz said:


Okay, let's just agree to disagree.

