Power Supplies and Wires

For some time I’ve been wondering how the wire size for power supplies limits the power. So I’ve done some quick calculations to determine if it’s a problem.

The first type that is of interest is the “Inverters” that are used to convert 12VDC to 240VAC (mains power) to allow electric devices to be operated in a car. I’ve seen some reports from dissatisfied users about Inverters not supplying as much power as expected, and I’ve had problems with my 150W Inverter not always supplying my Thinkpad (which definitely doesn’t draw 150W). The second type is phone chargers, as charging a phone in a reasonable amount of time is always a problem.

Inverter Rating Fine Print vs Laptop PSU

My Thinkpad Power Supply claims “Efficiency Level IV” which according to the US EPA document describing the efficiency marking protocol for external power supplies [1] means that it is at least 85% efficient when supplying 50W+. The peak output of the PSU is 4.5A at 20V which is 90W, so 90/0.85 == 106W is drawn at the input. One would hope that wouldn’t be a problem for a 150W Inverter.

But the fine print on the Inverter says that it can provide 110W continuously and 150W for 10 minutes. So according to my calculations I’m within 4W of overloading the Inverter if my Thinkpad uses full power. It also says that it is designed for 13.8V input. I have no idea how the performance of the Inverter changes as the supply Voltage varies between the 12.6V that a 6 cell lead-acid battery is designed to provide and the 13.8V charge Voltage from the car alternator. But I have had occasions when my Inverter stopped working correctly, presumably due to being unable to supply as much current as my Thinkpad draws.
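
Putting those numbers together as a quick sketch (the 85% figure is the Level IV minimum, so the real draw may be a little lower):

```python
# Thinkpad PSU: 20V at 4.5A peak output, assumed 85% efficient (Level IV minimum).
psu_output_w = 20 * 4.5           # 90W peak output
psu_draw_w = psu_output_w / 0.85  # power drawn from the Inverter

# The Inverter's fine print allows 110W continuous.
headroom_w = 110 - psu_draw_w
print(round(psu_draw_w), round(headroom_w, 1))  # 106 4.1
```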

As an aside I measured the Voltage in my car (with the engine off) at 12.85V from the cigarette lighter socket and 13.02V directly from the battery. I wonder if there is some sort of overload protection on the cigarette lighter which has a side effect of reducing the Voltage. Resistance in wires reduces the Voltage, but Voltage meters are designed to have a high resistance to prevent that from being an issue. If anyone has an explanation for the 0.17V drop then please write a comment!

Can a Car Provide 130W from the Cigarette Lighter socket?

If the Inverter is also 85% efficient (and it might be less as it has no indication of efficiency on the box) then when supplying 110W it would draw 110/0.85 == 129.4W (I’ll round it up to 130W).

The power in Watts is equal to the Voltage multiplied by the current in Amps (W=V*I). Therefore I=W/V so if the car battery was at 12.85V then 130W/12.85V == 10.12A will flow.

The current that goes through a circuit is equal to the Voltage divided by the resistance (see the Wikipedia page on Ohm’s law for more information). This also means that the resistance equals the Voltage divided by the current. 12.85V/10.12A == 1.27 Ohms. Note that this is the resistance of the entire circuit, all the wires going to the battery, the circuitry inside the Inverter, and the internal resistance of the battery.
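
As a quick check of the arithmetic above, using the rounded 130W draw:

```python
# I = W/V and R = V/I for the whole circuit (wires, Inverter, battery).
draw_w = 130                          # ~110W / 0.85, rounded up as in the text
battery_v = 12.85
current_a = draw_w / battery_v        # ~10.12A
circuit_ohms = battery_v / current_a  # ~1.27 Ohms total
print(round(current_a, 2), round(circuit_ohms, 2))  # 10.12 1.27
```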

The Inverter’s cable is 1m long (2 meters of wire) and each wire is about 3.5mm in diameter including the insulation, which means that the copper wire is probably equivalent to a single core conductor that is about 1mm in diameter. According to one of the online guides to resistance [2], wire that is 1.02mm in diameter will have a resistance of 0.02 Ohms per meter, which gives a resistance of 0.04 Ohms. 0.04 Ohms is 3% of the total resistance of the circuit, which doesn’t seem like it will be a real problem.
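
The cable’s share of the circuit resistance works out as follows (the 0.02 Ohm/m figure is the estimate from the guide, not a measurement):

```python
# 1m cable = 2m of wire at roughly 0.02 Ohms per meter (estimated gauge).
wire_ohms = 2 * 0.02    # 0.04 Ohms
circuit_ohms = 1.27     # total circuit resistance calculated earlier
fraction = wire_ohms / circuit_ohms
print(round(fraction * 100, 1))  # 3.1 (percent of the total resistance)
```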

In practice I’ve noticed that the connector gets extremely hot when it’s in use while the cable doesn’t get warm enough to notice. I suspect that the quality of the connector limits the power that is available but I don’t have an easy way of measuring this.

Inverters that are rated at 300W are designed to attach directly to the battery. An Inverter that is rated at 300W would draw 300W/0.85 == 352W from the battery. That needs 352W/13.02V == 27.04A and therefore a circuit resistance of 13.02V/27.04A == 0.48 Ohms total resistance. I wonder whether dirt on the battery terminals would give a significant portion of that.
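
The same calculation for the 300W case (again assuming 85% efficiency, which is a guess for an unlabelled Inverter):

```python
# A 300W Inverter attached directly to the battery.
draw_w = 352                          # 300W / 0.85, rounded as in the text
battery_v = 13.02
current_a = draw_w / battery_v        # ~27.04A
circuit_ohms = battery_v / current_a  # ~0.48 Ohms total
print(round(current_a, 2), round(circuit_ohms, 2))  # 27.04 0.48
```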

Phone Charging

I’ve also been wondering about why mobile phones take so long to charge, and now I’ve finally done the calculations.

The latest standard for mobile phones is to use USB for charging. The Wikipedia page about USB says that the standard is for USB 2.0 to supply up to 500mA at 5V ±5%. That means 0.5A*5V == 2.5W ±5%. If we assume that the internal power supply in a phone is also 85% efficient then that means 2.5*0.85 == 2.125W going to the battery.

My Samsung Galaxy S3 has a battery which is rated at 7.98Wh. According to the Wikipedia page about Lithium Ion batteries the charge/discharge efficiency is 80% to 90% – I’ll assume that it’s 85% for further calculations. If the battery in the phone is 85% efficient and the phone is doing nothing but charging then the charge time for a regular USB port would be 7.98Wh/0.85/2.125W == 4.42 hours (4 hours 25 minutes) of charge time. That probably means something closer to 5 hours to totally charge the phone while it’s running. There are dedicated “charging ports” for USB which can supply up to 1.5A. The 3rd party charger which came with my phone was rated at 1A and would hopefully be capable of completely charging the phone in less than 3 hours (but in practice isn’t). It’s interesting to note that MacBooks expose the amount of current drawn from a USB port with a GUI, so it should be possible to measure a phone charge rate by connecting it to a MacBook (which is cheaper than cutting up a phone cable).
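
The charge time calculation, with both 85% efficiency figures being assumptions as noted above:

```python
# Charge time for a Galaxy S3 on a standard USB 2.0 port.
usb_w = 5 * 0.5                # 2.5W from a 5V 500mA port
into_battery_w = usb_w * 0.85  # ~2.125W after the phone's internal supply
battery_wh = 7.98              # Galaxy S3 battery rating
charge_eff = 0.85              # assumed Lithium Ion charge efficiency
hours = battery_wh / charge_eff / into_battery_w
print(round(hours, 2))         # 4.42 hours
```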

My old Samsung Galaxy S has a battery which is rated at 5.55Wh; by the same calculations it would take slightly more than 3 hours to charge on a standard USB port or 1.5 hours on my newest USB charger. In practice it has never got anywhere close to that; I presume that the phone is designed to draw less than 500mA.

Phone Cable Resistance

The charger that came with my Galaxy S has a cable that is about 1.75m long; the cable is flat and measures just over 1mm thick and about 2mm wide. Presumably the wire is equivalent to a single core that’s about 0.4mm in diameter, thus giving it a resistance of about 0.134 Ohm per meter, or 1.75*2*0.134 == 0.469 Ohm for the cable. The charger is rated at 0.7A. To supply 0.7A at 5V the total circuit resistance would be 5V/0.7A == 7.143 Ohm – so about 6.6% of the total resistance of the circuit would be in the wire from the charger to the phone.

The charger that came with my Galaxy S3 has a round cable that’s just over 3mm thick and about 90cm long. If each wire in the cable is equivalent to a solid wire that is 0.912mm in diameter then it would be 0.0264 Ohm per meter of wire or 0.9*2*0.0264 == 0.0475 Ohm. The total circuit resistance would be 5V/1A == 5 Ohm. So 0.0475 Ohm is less than 1% of the circuit resistance.
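
Both cable estimates can be compared in one place (the Ohm-per-meter figures are the estimates above, not measurements):

```python
# Cable resistance as a share of total circuit resistance at rated current.
s_cable_ohms = 1.75 * 2 * 0.134   # Galaxy S cable, ~0.469 Ohm
s_circuit_ohms = 5 / 0.7          # at the charger's 0.7A rating
s3_cable_ohms = 0.9 * 2 * 0.0264  # Galaxy S3 cable, ~0.0475 Ohm
s3_circuit_ohms = 5 / 1.0         # at the charger's 1A rating
print(round(s_cable_ohms / s_circuit_ohms * 100, 1))  # 6.6 (percent)
# The S3 cable is just under 1% of its circuit resistance.
```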

Voltage Drop

The Voltage across a part of a circuit is proportional to the resistance (see the Wikipedia page on Series and Parallel Circuits for a good explanation).

Basically this means that if 1% of the resistance of a circuit is in the wire then 1% of the Voltage drop will also be in the wire, so if we have a 5V supply with my Galaxy S3 cable then each of the two wires in the cable will have a difference of about 0.025V between the ends and the phone will receive a supply of 4.95V, the difference isn’t something that is worth worrying about. But the cable from my Galaxy S has a resistance equivalent to 6.6% of the circuit resistance which means that the theoretical charge time will be 6% longer than it might be – or 6% more current will be drawn from the mains than should be needed.
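
A sketch of the Voltage drop for the Galaxy S3 cable at 1A:

```python
# A cable carrying ~1% of the circuit resistance drops ~1% of the Voltage.
supply_v = 5.0
cable_fraction = 0.0475 / 5.0  # S3 cable Ohms / total circuit Ohms
v_at_phone = supply_v * (1 - cable_fraction)       # ~4.95V reaches the phone
per_wire_drop = supply_v * cable_fraction / 2      # ~0.024V across each wire
```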

Conclusion

The charger that came with my Samsung Galaxy S isn’t much good. Wasting 6.6% of the power in the wire is unreasonable.

Phones keep getting more power hungry and batteries keep getting larger. There are third party phone batteries and external batteries that are charged by USB which have more than twice the capacity of the stock phone batteries – this means more than twice the charge time. This problem will keep getting worse.

The problem of a phone in active use drawing more power than the charger can provide (and running out of battery while on the charger) seems likely to stay with us. So while an Android phone has the potential to be a great little embedded server it seems that hacking the power supply is going to be a required first step for realising that potential.

The decision to make 5V the USB power standard was reasonable at the time as it was the voltage used for most things on the motherboard. The decision to use USB as the phone charging standard was also reasonable, as it allows phones to be charged anywhere. The combination of those two decisions isn’t good for the user. If a higher Voltage such as 12V was used then about 5* the power could be supplied through the same wires at the same level of efficiency – for a fixed fractional Voltage drop in the wires the current can scale in proportion to the Voltage, so the deliverable power scales with its square, and (12/5)^2 is about 5.8.

It would be really good if cars came with built in Inverters and supplied 240VAC or 110VAC depending on the region they were manufactured for. It’s becoming a fairly common feature to have a “cigarette lighter” port in the car boot as well as at least two ports inside the car. When a car has three sockets and only one device to actually light cigarettes (which I suspect is only provided to fill an empty socket) it’s very obvious that people want to connect random devices. Also having USB charging ports inside the car would be a really good idea (one for each seat would be good for Ingress).

9 comments to Power Supplies and Wires

  • neonsignal

    Even a cheap multimeter should have a 1M to 10M internal resistance on the voltage ranges, so the current draw there is insignificant and will not affect the measured voltage.

    There is overload protection on a cigarette lighter – the fuse in the fuse box. Also, there may be a fusible link near the battery itself. These would have a fairly low resistance.

    You may get large resistance at the point of contact of the multimeter probes with the cigarette lighter; the contacts will sometimes oxidize, especially if it has actually been used as a lighter. If this is the case, then merely pressing the probe harder will change the measured voltage.

    Alternatively, it may simply be that when you are measuring the lighter from inside the car you have just opened the door, and are drawing power (eg for the courtesy lights) which is lowering the measured voltage at that point in the circuit.

    Regarding the USB; the old 2.0 standard only allowed 500mA if this was negotiated by the device (which a phone should do) – and that is the total per hub, not per port. In theory the USB master is supposed to limit this. But of my few machines I’ve only noticed one laptop that does so (which can be annoying, because the current draw from an external pocket drive exceeds the limit while it is spinning up, and the USB hardware would shut down the port until the next reboot). Most machines appear to supply as much current as the device will pull, usually to the point of letting the smoke out of the USB drive chip, and I’ve even seen tracks burnt off a board by a shorted output. I gather changes to the standard now allow for higher currents to be negotiated.

  • Many phones have custom charger identification protocols. If you use the right charger, they will pull 2 or 2.5A. When you plug into a regular laptop USB port or a charger from a different vendor, they will pull 1A or whatever the USB spec allows.

    This may put a crimp in the use-macbook-to-measure-current idea.

  • Your phone identifies the charger using the data pins on the USB cable. If it sees a short across D+ and D- it will draw considerably more current than it does with a simple 5V power supply. The easiest way to accomplish this without cutting up cables and things is to buy an iPad compatible (2.1A) car charger. This by itself will not charge the phone at 2.1A because Apple devices use a voltage divider to supply different voltages to both data pins to identify maximum current capability. You can however buy USB cables marked “Charge Only”. This means that they have the data pins internally shorted to allow quick charging.

    My MacBook Pro (5th Generation, 3rd Revision) only supports 1A charging (double the USB spec, half what the SGS3 can draw), and only with Apple devices.

  • Andreas Gleaser

    In my opinion it would not make much sense anyway to convert the DC in a car to AC first and then back to DC again just to get a Voltage high enough to operate your Thinkpad. The broken Thinkpad I have here has a 16V, 4.5A power supply.
    It would be more sensible to use a DC-DC converter for the purpose. So probably this is what you would need:
    http://www.reichelt.de/PC-Netzteile-Mini-Micro-ATX/M2-ATX/3/index.html?;ACTION=3;LA=446;ARTICLE=70010;GROUPID=4422;artnr=M2-ATX;SID=10URTSv38AAAIAAFPGewkc6e77ac5a5ce09a99b9f6a5c6da0b860
    It is designed for cars, it seems to keep to ATX specifications and provides a maximum of 160W. If you need something around 16V then you can take either ‘Yellow’ plus ‘Orange’, i.e. 12V + 3.3V = 15.3V, or ‘Yellow’ plus ‘Red’, i.e. 12V + 5V = 17V; either would probably work.

  • Andreas Glaeser

    2x’Red’ and 2x’Orange’ would be another possibility. This is 5 + 5 + 3.3 + 3.3V = 16.6V. This is most probably the closest you can get to 16V, and it is better balanced, because four different wires are used.
    Use pins 23, 12, 13 and 6 of the ATX-connector for example.

  • gary greer

    Cigarette lighters are notorious for having poor power supply, they’re designed to only supply current for a short period of time. Pulling ten amps out of it for a long period of time will probably begin to heat up the wiring loom. The lighter connections are also not designed to provide a low resistance contact. Best way to run these types of items is with a direct connection to the battery, with fuses in both the positive and negative leads, and a more robust socket. An auto electrician will be able to help.

  • Anthony

    http://hackaday.com/2010/08/03/reverse-engineering-apples-recharging-scheme/
    … talks about how someone reverse engineered the setup for how some iDevices detect how much current they’re allowed to pull, as per Chris’s comment.

    Presumably since the article was written, they’ve used another value to indicate 2A (for newer iPads) over 1-1.5A.

    No doubt, once Apple thinks enough people have switched to their newer connector, they’ll change it again with some firmware and epoxy blob black magic :)

  • etbe

    http://etbe.coker.com.au/2013/05/17/voltage-inside-a-car/

    neonsignal: You are correct that I made some mistakes in the testing, I’ve done some new tests and documented the results at the above URL. I was surprised by the difference that opening the door made.

    Marius: Good point about chargers and matching devices, I’ve seen problems in this regard before. I now try and match devices and chargers even though they are supposed to all inter-operate.

    dhardy: That just sounds like a bad idea to me. I think we should just have standard connectors for higher voltages. Some such connectors exist, I have some NEC desktop PCs that have a 12V socket for powering speakers as well as a Dell monitor with the same socket. So I now have speakers which came included with an NEC PC running from a Dell monitor. It would be nice if more companies supported that apparent standard.

    Chris: Thanks for a great illustration of what’s wrong with USB charging!

    Andreas: For many years IBM sold Thinkpad car power supplies. I presume that I can buy a special car power supply for my laptop if I want to. But I don’t feel that inclined to spend $200 on a PSU when I already have an inverter that works. Note that I use the inverter (which was a lot cheaper than any sort of Thinkpad PSU) for lots of other things so it’s much better value for money. But thanks for the suggestion of other ways to solve this.

    Gary: I’m sure that older cars were designed for small amounts of power to be supplied. But in modern cars they are clearly aiming for uses other than lighting cigarettes. My car has 3 sockets, one that could be used for lighting cigarettes and two that are specifically designed for other things. As most of the things that one might connect to a car power source will take power for hours at a time (fridges, phone chargers, etc) I’m sure that they design cars to work with them. That said, most devices draw a lot less power than a laptop; for example my car fridge is rated at 48W.

    Probably the best thing to do to alleviate this problem would be to get a Thinkpad PSU that takes 12V input to remove some loss and to only use it when the car engine is running (and the voltage is higher).

    Anthony: Thanks for that link, the video showing how to reverse engineer things is interesting.