
What's the difference between a charger and a power supply?

Canada EH

I was at a thrift store yesterday, looking for a laptop charger to charge up my 18 V battery used for a light.

There were a lot of power supplies for printers and scanners, but no chargers except for the small cell-phone type that puts out 0.5 A, which is useless for me.

 

When I was trying to understand this, I took my meters and measured the voltage and current of a lithium-ion battery charger.

The voltage would slowly increase, 0.01 V at a time, while the current stayed maxed out.

Then, as the voltage got closer to the charger's maximum, the current would taper off until equilibrium was reached.
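
That behaviour is the classic CC/CV (constant current / constant voltage) charge profile. As a rough illustration only, here is a toy model of it in Python; every number in it (capacity, internal resistance, setpoints) is made up for the sketch, not taken from my measurements:

# Toy simulation of a single-cell CC/CV lithium-ion charge.
# Every number here (capacity, internal resistance, setpoints) is an
# assumption for illustration, not measured from a real charger.

CC_LIMIT_A = 2.0        # constant-current limit of the charger
CV_SETPOINT_V = 4.2     # constant-voltage setpoint
TERMINATION_A = 0.1     # stop when the taper current drops this low
CAPACITY_AH = 2.5       # assumed cell capacity
R_INTERNAL = 0.05       # assumed internal resistance, ohms

ocv = 3.0               # open-circuit voltage of a mostly empty cell
charge_ah = 0.0
dt = 0.01               # time step, hours
step = 0

while True:
    # Current the CV loop would allow through the internal resistance.
    cv_current = (CV_SETPOINT_V - ocv) / R_INTERNAL
    # The charger delivers whichever is lower: the CC limit or the CV-limited current.
    current = min(CC_LIMIT_A, cv_current)
    if current <= TERMINATION_A:
        break
    terminal_v = ocv + current * R_INTERNAL   # voltage seen at the cell terminals
    if step % 20 == 0:
        print(f"t={step * dt:.2f} h  V={terminal_v:.2f} V  I={current:.2f} A")
    charge_ah += current * dt
    # Crude OCV model: rises linearly with state of charge (real cells are nonlinear).
    ocv = 3.0 + 1.2 * min(charge_ah / CAPACITY_AH, 1.0)
    step += 1

print(f"done: {charge_ah:.2f} Ah delivered in {step * dt:.2f} h")

Running it shows exactly the pattern above: the current sits pinned at the limit while the voltage creeps up, then the current tapers once the voltage nears the setpoint.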

 

I am guessing that a power supply will just give whatever the battery wants: full voltage and full current.

 

That undoubtedly won't be good for the battery.

 

I got to thinking that if I used a power supply and then put a charger in series, the CC/CV stages would be achieved.

Of course, there would have to be ground isolation between all the units, which is easily accomplished and easily confirmed.

In the video I saw, the guy does not explain what power supply he uses. He used a PSU and two laptop chargers, everything in series.

There are power supplies that have CC/CV modes and are used in charging applications all the time, and they are very reliable.

They range in price from genuine units to clones to outright hacks.
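
If my 18 V pack really is 5 li-ion cells in series (typical for tool packs, but worth checking the label to confirm), the settings on a CC/CV supply come straight from the pack specs. A quick sketch, assuming a 2 Ah capacity and a gentle 0.5C charge rate:

# Sketch of picking CC/CV settings on a bench supply for a tool pack.
# Assumes an "18 V" pack is 5 li-ion cells in series with a 2 Ah
# capacity; substitute the real cell count, chemistry and capacity.

CELLS_IN_SERIES = 5
CELL_CHARGE_V = 4.2        # full-charge voltage per li-ion cell
PACK_CAPACITY_AH = 2.0     # assumed pack capacity
CHARGE_C_RATE = 0.5        # conservative charge rate

cv_setpoint_v = CELLS_IN_SERIES * CELL_CHARGE_V
cc_limit_a = PACK_CAPACITY_AH * CHARGE_C_RATE

print(f"set CV to {cv_setpoint_v:.1f} V, CC limit to {cc_limit_a:.1f} A")
# -> set CV to 21.0 V, CC limit to 1.0 A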

 

 

 


It depends on the charger and the battery. For consumer electronics like laptops, cell phones, etc., the batteries are connected only to a smart charging system which handles charging and discharging of the battery. That circuit controls how much power goes into the battery for charging. If you're charging loose batteries, then the charger needs to have the smarts to control how much goes into the battery. There are dumb chargers that provide a constant voltage and/or current. But otherwise, the battery can be considered a load when charging it, which means it's not any different from a resistor (well, a constantly changing one) for analysis purposes.
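
To put the "battery as a load" idea into numbers: with a dumb constant-voltage source, the charge current is roughly the voltage difference divided by the series resistance, which is why a depleted pack can pull far more than intended. A rough sketch with made-up values:

# Ohm's-law estimate of what a bare constant-voltage source would push into
# a battery: (source voltage - battery open-circuit voltage) / series resistance.
# All values here are illustrative assumptions, not measurements.

def charge_current_a(v_source, v_battery_ocv, r_series_ohm):
    """Current into a battery treated as a load behind some series resistance."""
    return (v_source - v_battery_ocv) / r_series_ohm

# A 21 V source across a depleted 5S pack (~15 V) with ~0.25 ohm total resistance:
print(f"{charge_current_a(21.0, 15.0, 0.25):.0f} A")   # -> 24 A, far above a sane charge rate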

 

Otherwise, for all intents and purposes, there's no difference.


A battery management system is ideal and makes complete sense, but a BMS is not strictly necessary.

You have to know the C-rate you want to charge at and not go over the specifications of the battery itself. I like to keep my C-rates around 1C for charging, sometimes lower depending on the chemistry.
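
In numbers, the charge current for a given C-rate is just the capacity times the rate. For example, with an assumed 2.5 Ah cell:

# Charge current for a given C-rate: I (A) = C-rate x capacity (Ah).
def current_for_c_rate(capacity_ah, c_rate):
    return capacity_ah * c_rate

print(current_for_c_rate(2.5, 1.0))   # 2.5 Ah cell at 1C  -> 2.5 A
print(current_for_c_rate(2.5, 0.5))   # same cell at 0.5C  -> 1.25 A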

 

I need to find a decent electronics forum.

 

 

 

3 hours ago, M.Yurizaki said: "smarts to control"

 


Jesus Christ.

 

A power supply... ANY power supply... should output the voltage it is rated at. It will put out that voltage up to the current the supply is rated for. The thing to look at is how much current the power supply can put out and then how much you need.
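
In other words, match the rated voltage and make sure the rated current covers what the load actually draws. A quick sketch with placeholder numbers:

# Sanity-check a supply's label ratings against what the load needs.
# These numbers are placeholders; read them off the actual labels.

SUPPLY_V = 19.0       # rated output voltage
SUPPLY_MAX_A = 3.42   # rated output current
LOAD_V = 19.0         # voltage the device expects
LOAD_A = 2.5          # current the device actually draws

if abs(SUPPLY_V - LOAD_V) <= 0.5 and SUPPLY_MAX_A >= LOAD_A:
    print("supply can handle the load")
else:
    print("wrong voltage or not enough current headroom")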

 

You can't just shove a DMM into a power adapter, set it to measure current, and expect it to tell you what the adapter can output. That's dumb.

 

