Understanding the conversion of a charger's output to time to charge
23 minutes ago, dealer of aces said: I'm trying to understand the theoretical components of time to charge for a device. Let's assume a battery is 2000 mAh. Power = voltage × current. So take a charger that can charge at either 24 V / 1 A or 12 V / 2 A; both give a 24 W output. Everything I've been able to find talks about discharge time as battery capacity divided by discharge rate. I want to know the reverse: how do we get from a 24 W charger to filling the capacity of the 2000 mAh battery? As far as charging rate goes, is there a difference between 24 V / 1 A and 12 V / 2 A? How can I take the wattage of a charger and convert that into mAh?
Regards
I had some electrical engineering courses in college, so speak as technically as you'd like. Unfortunately I don't remember some of the specific content due to a head injury, but I understand most things conceptually.
You're missing something important here: the voltage of the battery.
To do any math here, you need two of the three quantities (voltage, wattage, or amperage).
If the battery voltage and the charger's output voltage are the same, it's just the total amp-hours of the battery divided by the amperage output of the charger. Example: a 12 V 2000 mAh battery and a 12 V / 2 A charger. The equation I remember is: 2000 mAh / 2000 mA (converting to the same unit) = 1 h. You do have efficiency loss from heat and resistance, which usually works out to around a 1:1.4 ratio, so at the end of the calculation multiply the result by 1.4 and that gives you an estimate.
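The same-voltage calculation above can be sketched in a few lines. This is just the capacity-over-current estimate with the ~1.4× loss factor described above; the function name and the constant-current assumption are mine, and real chargers taper current near full charge, so treat this as a rough estimate, not an exact figure:

```python
# Rough charge-time estimate when battery and charger voltages match.
# Assumes constant-current charging and a ~1.4x efficiency penalty
# for heat/resistance losses, as discussed above.

def charge_time_hours(capacity_mah: float, charger_ma: float,
                      efficiency_factor: float = 1.4) -> float:
    """Capacity / charge current, padded by an assumed loss factor."""
    return (capacity_mah / charger_ma) * efficiency_factor

# 2000 mAh battery on a 2 A (2000 mA) charger:
print(charge_time_hours(2000, 2000))  # 1.4 hours
```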
If the voltages are different, I THINK you just convert both sides to energy: battery capacity in watt-hours (Ah × battery voltage) divided by the charger's watts, then multiply by 1.4 again.