
USB Meter help

tinpanalley

Help me understand this, I know this is very basic stuff but I'm reading different sites and getting myself confused.

 

I've got a USB power meter plugged into my charging dock, which can supply up to 12V at 2A.

I then charge a rpi2 remote with the following battery specs:

3.7V - 2.96Wh - 800mAh

Charging voltage: 4.4V - 5.2V

Charging current 300mA

 

and the remote's input specs:

5V -- 300ma max -- operating voltage 3.3V

 

In reference to what I'm charging, what exactly is my USB meter telling me when it displays V, A, W, and mAh?

The instructions are so confusing that I don't understand what I'm looking at anymore, but I'm trying to learn.


I like to think of electrical circuits like plumbing. Voltage is essentially pressure, current is essentially the volume you're moving, and resistance is how restrictive your system is. Watts measure power, the rate at which electrical work is done; it's voltage times current. mAh (milliamp-hours) is a measure of stored charge: say you have 1000mAh stored, you can pull 1000mA for one hour and then the supply will be drained.
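The two relationships above can be sketched in a few lines of Python (the values are illustrative, not readings from your meter):

```python
# Power and battery runtime, per the plumbing analogy above.

def power_w(volts, amps):
    """Power in watts: voltage times current."""
    return volts * amps

def runtime_h(capacity_mah, draw_ma):
    """Hours a battery lasts at a constant draw (idealized)."""
    return capacity_mah / draw_ma

print(power_w(5.0, 0.3))      # 1.5 W while charging at 5 V / 300 mA
print(runtime_h(1000, 1000))  # 1.0 h: 1000 mAh drained at 1000 mA
```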

ASU


Even though it's not correct, you could look at things like this:

 

A device must receive at least a certain amount of voltage in order to work, and it could be damaged if the voltage is too high. As an analogy, think of it as a factory that needs at least one truck coming in to operate, but if too many trucks try to enter the unloading docks at once they'll damage the docks and the factory stops working.

 

The current represents how much energy is available for devices to consume - a device will only take as much as it needs. Think of it as the factory unloading from the trucks only as much raw material as it needs during a period of time. The power supply can provide up to some maximum amount of current; it's like loading each truck with a certain amount of raw materials. A power supply that can output a lot of current = trucks full of raw materials... a power supply that can make less current available = less raw material per truck.

 

Power is measured in watts, and it's voltage multiplied by current... or, continuing with the analogy, the number of trucks at the unloading docks multiplied by the number of pallets unloaded from each.

 

USB is designed to function at 5v, but devices meant to work with USB must be able to work with an input voltage between 4.5v and 5.5v, because any length of cable has some resistance (which varies with the cable's thickness, its length, and the amount of current flowing through the wire).

The actual formula is a classic, very well known one (Ohm's law): voltage = current times resistance.

 

So for example, if your power supply outputs 5v, you have 1 meter of cable with 0.1 ohm per meter of resistance between the power supply and the device you want to power, and your device consumes 1A of current, then you will lose: Voltage = 1A x 0.1 ohm x 2 meters of cable (1m in each direction) = 0.2v, so your device actually "sees" 4.8v instead of 5v.

If your device only needs to consume 0.1A (100mA), like a keyboard or mouse, then the voltage lost due to cable resistance will be much lower, at  V  = 0.1A x 0.1 ohm x 2 meters = 0.02 volts
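Those two drop calculations can be checked with a tiny sketch (the 0.1 ohm/meter figure is the assumed value from the example, not a measured one):

```python
# Voltage lost in a USB cable, per V = I * R.

def cable_drop_v(current_a, ohms_per_m, length_m):
    # Round trip: current flows out and back, so double the length.
    return current_a * ohms_per_m * length_m * 2

print(cable_drop_v(1.0, 0.1, 1.0))  # 0.2 V lost at 1 A over 1 m
print(cable_drop_v(0.1, 0.1, 1.0))  # 0.02 V at 100 mA (keyboard/mouse)
```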

 

Your "charging dock" is special... it's a special kind of USB device. By default, it will supply energy at the 5v level and provide up to some amount of current, let's say 2A. A device at the other end (like a phone, for example) can "talk" to this charging dock and send a message through the USB data wires which basically says: "Hey charging dock, I'm a bit smarter than a regular USB device, and I can tolerate voltages higher than 5v."

If the charging dock understands, then the charging dock may raise the voltage to 9v or even 12v.

So your charging dock only outputs 12v if the device at the other end explicitly asks the charging dock to change the voltage it sends to it.

 

The reason a phone may ask the charging dock to use a higher voltage is because of that wire resistance. See formula above and how current plays a big part in how much voltage is lost in the cables ... again, you have Voltage = Current x Resistance  ... resistance is fixed for a particular length of cable with a fixed thickness.

 

So the phone may use the default 5v and 2A of current and take 10 watts of power (5v x 2A = 10w) from the power supply, but because of wire resistance there may be, let's say, a 0.5v voltage loss in the cables, so the phone actually receives only 4.5v x 2A = 9w - 1w is lost as heat in the cables between the power supply and the phone.

 

If the phone talks to the charger and tells it to change the voltage to 12v, then the power supply will still provide 10 watts but at 12v, so the current will be much lower: I = W/V = 10w / 12v = 0.83A. The voltage lost in the cables will then be less than half as much - say 0.2v instead of 0.5v in our example - so the phone will receive 11.8v x 0.83A = 9.83 watts, and only about 0.17 watts are lost.
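Here's the 5v-vs-12v comparison worked through in Python. The 0.25 ohm round-trip cable resistance is an assumed figure chosen so the 5v case loses the ~0.5v from the example:

```python
# Power actually delivered when drawing the same 10 W at 5 V vs 12 V,
# through a cable with an assumed 0.25 ohm round-trip resistance.

def delivered_w(supply_v, power_w, cable_ohm):
    current = power_w / supply_v          # I = W / V
    loss_v = current * cable_ohm          # voltage dropped in the cable
    return (supply_v - loss_v) * current  # what reaches the device

print(delivered_w(5.0, 10.0, 0.25))   # 9.0 W reaches the phone at 5 V
print(delivered_w(12.0, 10.0, 0.25))  # ~9.83 W at 12 V: far less wasted
```

Same 10 watts out of the supply, but the higher voltage cuts the current, and losses scale with the current, so more of the power survives the trip.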

 

So you see, if the phone can talk to the charger and negotiate a higher voltage, it will be better - the phone will charge faster because it receives more energy in the same amount of time, simply because less energy is lost between the power supply and the phone.

 

But your Raspberry Pi is probably not smart enough to negotiate the voltage like a phone, so the voltage your charging dock sends to it will always be 5v.

 

Now about your remote:

 

3.7V - 2.96Wh - 800mAh

Charging voltage: 4.4V - 5.2V

Charging current 300mA

 

The first line tells you the default voltage of the battery, 3.7v .... this is the voltage the battery will be at most of the time.

Think of it like regular non-rechargeable AA batteries that have 1.5v written on them. When they're fully charged, an AA battery may have a voltage up to 1.65v and as energy is taken out of it, the voltage will drop down to 1.35v or even less. Below some threshold, the amount of energy that comes out from the battery is too low to matter.

Just the same, these lithium batteries have 3.7v written on them, but when they're fully charged the voltage can be up to 4.2v, and when they're almost empty it can be as low as 3.2v. Most of the energy comes out of the battery at around 3.7v, so that's the nominal voltage written on it.

The 800mAh is the battery's capacity - how much charge it stores. It also hints at how much current the battery can safely handle: discharging at 800mA would drain it in about an hour, and if the device takes much more than that, the battery will discharge much faster and waste energy as heat.

It also tells you that you shouldn't charge the battery by pumping much more than this amount of current into it - pump too much and the insides can be damaged.

 

The second line says that the remote uses a chip which controls the charging process for the battery, and this chip needs at least 4.4v to function and pump energy into the battery. If you try to give more than 5.2v, this charging chip may be damaged.

 

The third line tells you that the charging chip will use at most 300mA of current to charge the battery. Even though the battery could accept up to 800mA and charge faster, the chip will only push up to 300mA into it.
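From those two numbers you can get a rough lower bound on how long a full charge takes (real charging takes longer, because the current tapers off near full, as described below):

```python
# Rough lower bound on charge time: capacity divided by charge current.
# Ignores tapering and inefficiency, so real charging takes longer.

def min_charge_time_h(capacity_mah, charge_ma):
    return capacity_mah / charge_ma

print(round(min_charge_time_h(800, 300), 2))  # 2.67 h for this remote
```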

 

The battery will charge more slowly, but it will be happier and last longer because it doesn't heat up as much (the insides are also damaged very slightly by heat over time, across many charge cycles).

 

It's also important to understand that the amount of current a battery "sucks in" isn't constant. Even if the charging chip is capable of pumping up to 300mA into the battery, and the battery is capable of accepting up to 800mA, the battery may simply not take that much - it depends on how much charge is already in it.

If the battery is almost empty, the battery will be happy to "absorb" as much energy as you give it, but as it fills up, the energy is absorbed at a slower rate into the battery.

 

So for example, let's say the battery is at 90% and has reached a voltage of 3.9v... even if the charging chip can "make available" 300mA (0.3A) to the battery, and the power supply can give up to 2A to the charging chip, the battery may only "absorb" 0.1A to 0.2A of current until it reaches 100% and is fully charged.
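A toy model of that tapering behavior (the 80% cutoff and the linear shape are made up for illustration - real chargers switch from constant current to constant voltage and taper differently):

```python
# Toy illustration of charge tapering: below an assumed ~80% full,
# the battery accepts the charger's full 300 mA limit; above that,
# the accepted current falls off linearly to zero at 100%.

def accepted_ma(soc_percent, limit_ma=300, taper_start=80):
    if soc_percent < taper_start:
        return limit_ma
    # Linear taper from the full limit down to 0 at 100%
    return limit_ma * (100 - soc_percent) / (100 - taper_start)

print(accepted_ma(50))   # 300 mA: battery well below full, takes the limit
print(accepted_ma(90))   # 150.0 mA: half the limit at 90%
print(accepted_ma(100))  # 0.0 mA: fully charged, no more current accepted
```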

 

 

If you don't understand something I wrote above, ask and I'll try to explain. I have auto-notification on, so I'll see it.

 

And lastly:

 

remote's input specs:

5V -- 300ma max -- operating voltage 3.3V

 

This basically says that the device (your remote) wants 5v at its input (the standard voltage for USB) and may draw up to 300mA from the connector (which will happen while it's charging the battery; it will draw much less when the battery is close to full, or full).

And the operating voltage... basically it tells you that the device has some kind of voltage regulator chip inside which will take either 5v (give or take a few tenths of a volt) from the USB connector, or roughly 3.5v to 4.2v from the internal battery, and produce 3.3v for all the chips and devices inside.

 

 


On 4/26/2018 at 4:01 PM, mariushm said:

Even though it's not correct, you could look at things like this:...

 

Ok, I'm gonna need some time to read all this, but thank you. Just so you don't think you gave me all this info and I didn't acknowledge it.

