For some time I’ve been wondering how the wire size for power supplies limits the power. So I’ve done some quick calculations to determine if it’s a problem.
The first type of interest is the “Inverter” used to convert 12VDC to 240VAC (mains power) to allow electrical devices to be operated in a car. I’ve seen some reports from dissatisfied users about Inverters not supplying as much power as expected, and I’ve had problems with my 150W Inverter not always supplying my Thinkpad (which definitely doesn’t draw 150W). The second type is phone chargers, as charging a phone in a reasonable amount of time is always a problem.
My Thinkpad Power Supply claims “Efficiency Level IV”, which according to the US EPA document describing the efficiency marking protocol for external power supplies means that it is at least 85% efficient when supplying 50W+. The peak output of the PSU is 4.5A at 20V, which is 90W peak output, and 90/0.85 == 106W power drawn. One would hope that wouldn’t be a problem for a 150W Inverter.
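To make the arithmetic explicit, here’s a quick sketch of that calculation in Python (the 85% figure is the Level IV minimum, not a measured value for this particular PSU):

```python
# Power the Thinkpad PSU draws from the Inverter at its 90W peak output,
# assuming the Level IV minimum efficiency of 85% (not a measured figure).
psu_output_watts = 4.5 * 20        # 4.5A at 20V == 90W peak output
efficiency = 0.85                  # Level IV minimum at 50W+ output
psu_input_watts = psu_output_watts / efficiency
print(f"PSU input power: {psu_input_watts:.0f}W")   # ~106W
```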
But the fine print on the Inverter says that it can provide 110W continuously and 150W for 10 minutes. So according to my calculations I’m within 4W of overloading the Inverter if my Thinkpad uses full power. It also says that it is designed for 13.8V input. I have no idea how the performance of the Inverter changes as the supply Voltage varies between the 12.6V that a 6 cell lead-acid battery is designed to provide and the 13.8V charge from the car alternator. But I have had occasions when my Inverter stopped working correctly, presumably due to being unable to supply as much current as my Thinkpad draws.
As an aside I measured the Voltage in my car (with the engine off) at 12.85V from the cigarette lighter socket and 13.02V directly from the battery. I wonder if there is some sort of overload protection on the cigarette lighter socket which has a side effect of reducing the Voltage. Resistance in wires reduces the Voltage, but voltmeters are designed to have a high input resistance and draw almost no current, so that shouldn’t be an issue when measuring. If anyone has an explanation for the 0.17V drop then please write a comment!
If the Inverter is also 85% efficient (and it might be less as it has no indication of efficiency on the box) then when supplying 110W it would draw 110/0.85 == 129.4W (I’ll round it up to 130W).
The current that goes through a circuit is equal to the Voltage divided by the resistance (see the Wikipedia page on Ohm’s law for more information), and the power drawn is the Voltage multiplied by the current. So drawing 130W from a 12.85V supply means 130W/12.85V == 10.12A. Ohm’s law also means that the resistance equals the Voltage divided by the current: 12.85V/10.12A == 1.27 Ohms. Note that this is the resistance of the entire circuit: all the wires going to the battery, the circuitry inside the Inverter, and the internal resistance of the battery.
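A quick sketch of that calculation, using the rounded 130W figure from above:

```python
# Total circuit resistance implied by the Inverter drawing ~130W from a
# battery measured at 12.85V, using Ohm's law (V = I * R).
battery_volts = 12.85
inverter_input_watts = 130                              # 110W output / 0.85 efficiency, rounded up
current_amps = inverter_input_watts / battery_volts     # ~10.12A
total_resistance_ohms = battery_volts / current_amps    # ~1.27 Ohms
print(f"{current_amps:.2f}A through a total of {total_resistance_ohms:.2f} Ohms")
```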
The Inverter’s cable is 1m long (2 meters of wire) and each wire is about 3.5mm in diameter including the insulation, which means that the copper wire is probably equivalent to a single core conductor about 1mm in diameter. According to one of the online guides to resistance, wire that is 1.02mm in diameter has a resistance of 0.02 Ohms per meter, which gives a resistance of 0.04 Ohms for the cable. 0.04 Ohms is about 3% of the total resistance of the circuit, which doesn’t seem like it will be a real problem.
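The same calculation in Python; note that the ~1mm equivalent conductor diameter is only an estimate, and 0.02 Ohm/m is the published figure for 1.02mm copper wire:

```python
# Resistance of the Inverter's cable and its share of the total circuit resistance.
wire_length_m = 1.0 * 2                   # 1m cable, both conductors carry the current
ohms_per_meter = 0.02                     # ~1mm diameter copper wire
cable_resistance = wire_length_m * ohms_per_meter       # 0.04 Ohm
total_circuit_resistance = 1.27                          # from the calculation above
print(f"cable is {cable_resistance / total_circuit_resistance:.1%} of the circuit")  # ~3%
```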
In practice I’ve noticed that the connector gets extremely hot when it’s in use while the cable doesn’t get warm enough to notice. I suspect that the quality of the connector limits the power that is available but I don’t have an easy way of measuring this.
Inverters that are rated at 300W are designed to attach directly to the battery. An Inverter that is rated at 300W would draw 300W/0.85 == 352W from the battery. That needs 352W/13.02V == 27.04A and therefore a total circuit resistance of 13.02V/27.04A == 0.48 Ohms. I wonder whether dirt on the battery terminals would account for a significant portion of that.
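The same sort of calculation for a 300W Inverter, again assuming 85% efficiency (the box gives no efficiency figure):

```python
# A 300W Inverter at an assumed 85% efficiency, connected directly to a
# battery measured at 13.02V.
input_watts = 300 / 0.85                  # ~353W drawn from the battery
current_amps = input_watts / 13.02        # ~27A
total_resistance = 13.02 / current_amps   # ~0.48 Ohm for the entire circuit
print(f"{current_amps:.1f}A with {total_resistance:.2f} Ohm total resistance")
```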
I’ve also been wondering about why mobile phones take so long to charge, and now I’ve finally done the calculations.
The latest standard for mobile phones is to use USB for charging. The Wikipedia page about USB says that the standard is for USB 2.0 to supply up to 500mA at 5V +-5%. That means 0.5A*5V == 2.5W +- 5%. If we assume that the internal power supply in a phone is also 85% efficient then that means 2.5*0.85 == 2.125W going to the battery.
My Samsung Galaxy S3 has a battery which is rated at 7.98Wh. According to the Wikipedia page about Lithium Ion batteries the charge/discharge efficiency is 80% to 90% – I’ll assume that it’s 85% for further calculations. If the battery in the phone is 85% efficient and the phone is doing nothing but charging then the charge time for a regular USB port would be 7.98Wh/0.85/2.125W == 4.42 hours (4 hours 25 minutes). That probably means something closer to 5 hours to totally charge the phone while it’s running. There are dedicated “charging ports” for USB which can supply up to 1.5A. The 3rd party charger which came with my phone was rated at 1A and would hopefully be capable of completely charging the phone in less than 3 hours (but in practice it isn’t). It’s interesting to note that MacBooks expose the amount of current drawn from a USB port in a GUI, so it should be possible to measure a phone’s charge rate by connecting it to a MacBook (which is cheaper than cutting up a phone cable).
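A rough sketch of the charge time calculation, assuming 85% efficiency for both the phone’s internal supply and the battery’s charge cycle (both assumptions rather than measurements):

```python
# Rough charge time for a Galaxy S3 battery from a standard USB 2.0 port.
usb_watts = 0.5 * 5                       # 500mA at 5V == 2.5W
watts_to_battery = usb_watts * 0.85       # ~2.125W after the phone's own supply losses
battery_wh = 7.98                         # Galaxy S3 battery capacity
hours = battery_wh / 0.85 / watts_to_battery   # 85% battery charge efficiency assumed
print(f"{hours:.2f} hours")               # ~4.42 hours, longer if the phone is in use
```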
My old Samsung Galaxy S has a battery which is rated at 5.55Wh; by the same calculations it would take slightly more than 3 hours to charge on a standard USB port or about 1.5 hours on my newest USB charger. In practice it has never got anywhere close to that, so I presume that the phone is designed to draw less than 500mA.
The charger that came with my Galaxy S has a cable that is about 1.75m long; the cable is flat and measures just over 1mm thick and about 2mm wide. Presumably each wire is equivalent to a single core that’s about 0.4mm in diameter, thus giving it a resistance of about 0.134 Ohm per meter, or 1.75*2*0.134 == 0.469 Ohm for the cable. The charger is rated at 0.7A. To supply 0.7A at 5V the total circuit resistance would be 5V/0.7A == 7.143 Ohm – so about 6.6% of the total resistance of the circuit would be in the wire from the charger to the phone.
The charger that came with my Galaxy S3 has a round cable that’s just over 3mm thick and about 90cm long. If each wire in the cable is equivalent to a solid wire that is 0.912mm in diameter then it would be 0.0264 Ohm per meter of wire or 0.9*2*0.0264 == 0.0475 Ohm. The total circuit resistance would be 5V/1A == 5 Ohm. So 0.0475 Ohm is less than 1% of the circuit resistance.
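Here’s a small sketch comparing the two cables, using the estimated conductor diameters and the resulting Ohm per meter figures from above:

```python
# Cable resistance as a fraction of total circuit resistance for the two
# chargers.  The equivalent conductor diameters (and so the Ohm/m figures)
# are estimates from measuring the cables.
def cable_fraction(length_m, ohms_per_meter, volts, amps):
    cable_ohms = length_m * 2 * ohms_per_meter   # two conductors in the cable
    circuit_ohms = volts / amps                  # total circuit resistance from Ohm's law
    return cable_ohms / circuit_ohms

print(f"Galaxy S charger (0.7A): {cable_fraction(1.75, 0.134, 5, 0.7):.1%}")   # ~6.6%
print(f"Galaxy S3 charger (1A):  {cable_fraction(0.9, 0.0264, 5, 1.0):.1%}")   # <1%
```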
The Voltage across a part of a circuit is proportional to the resistance (see the Wikipedia page on Series and Parallel Circuits for a good explanation).
Basically this means that if 1% of the resistance of a circuit is in the wire then 1% of the Voltage drop will also be in the wire. So if we have a 5V supply with my Galaxy S3 cable then each of the two wires in the cable will have a difference of about 0.025V between its ends and the phone will receive a supply of about 4.95V – a difference that isn’t worth worrying about. But the cable from my Galaxy S has a resistance equivalent to 6.6% of the circuit resistance, which means that the theoretical charge time will be about 6% longer than it could be – or 6% more current will be drawn from the mains than should be needed.
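A minimal sketch of that voltage divider effect, assuming 1% and 6.6% of the circuit resistance is in the respective cables:

```python
# Series voltage divider: the fraction of the circuit resistance that is in
# the cable is also the fraction of the supply Voltage lost in the cable.
def volts_at_phone(supply_volts, cable_fraction):
    return supply_volts * (1 - cable_fraction)

print(f"Galaxy S3 cable: {volts_at_phone(5.0, 0.01):.2f}V")    # ~4.95V at the phone
print(f"Galaxy S cable:  {volts_at_phone(5.0, 0.066):.2f}V")   # ~4.67V at the phone
```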
The charger that came with my Samsung Galaxy S isn’t much good. Wasting 6.6% of the power in the wire is unreasonable.
Phones keep getting more power hungry and batteries keep getting larger. There are third party phone batteries and external batteries that are charged by USB which have more than twice the capacity of the stock phone batteries – this means more than twice the charge time. This problem will keep getting worse.
The problem of a phone in active use drawing more power than the charger can provide (and running out of battery while on the charger) seems likely to stay with us. So while an Android phone has the potential to be a great little embedded server it seems that hacking the power supply is going to be a required first step for realising that potential.
The decision to make 5V the USB power standard was reasonable at the time as it was the voltage used for most things on the motherboard. The decision to use USB as the phone charging standard was also reasonable, as it allows phones to be charged almost anywhere. The combination of those two decisions isn’t good for the user. If a higher Voltage such as 12V were used then about 5 times the power could be supplied through the same wires at the same level of efficiency.
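One way to sketch that claim: if the same fractional Voltage drop in the cable is acceptable, the current can rise in proportion to the supply Voltage, so the deliverable power rises with roughly the square of the Voltage:

```python
# For the same cable and the same fractional loss in it, deliverable power
# scales with the square of the supply Voltage.
def relative_power(new_volts, old_volts=5.0):
    return (new_volts / old_volts) ** 2

print(f"12V vs 5V: about {relative_power(12.0):.1f}x the power")   # ~5.8x
```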
It would be really good if cars came with built-in Inverters and supplied 240VAC or 110VAC depending on the region they were manufactured for. It’s becoming a fairly common feature to have a “cigarette lighter” port in the car boot as well as at least two ports inside the car. When a car has three sockets and only one device to actually light cigarettes (which I suspect is only provided to fill an empty socket) it’s very obvious that people want to connect random devices. Also having USB charging ports inside the car would be a really good idea (one for each seat would be good for Ingress).