Submitted by OneGuyJeff t3_1274d8q in explainlikeimfive
afcagroo t1_jecjm23 wrote
The voltage put out by a battery is determined largely by the chemistry of the materials used. While you can increase the available current by making the cells bigger (or ganging up a bunch of smaller ones in parallel), the voltage is essentially fixed.
The standard batteries used for many years had a natural voltage of about 1.5V. But for the transistors of the time, that voltage was not optimal.
There's a hack for that. If you "stack" two batteries in series, then their voltages add up. Two 1.5V batteries will output 3V. If you stack four of them you get 6V. Which was pretty useful when the "standard" for many transistors and integrated circuits was 5V.
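To make that series/parallel arithmetic concrete, here's a minimal sketch in Python. The 1.5V cell voltage matches the comment above, but the capacity figure is just a ballpark assumption for a generic AA-style cell, used only for illustration:

```python
# Illustrative sketch of how cells combine in series vs. parallel.
# CELL_CAPACITY is an assumed ballpark figure, not from the comment above.

CELL_VOLTAGE = 1.5    # volts, nominal for a standard cell
CELL_CAPACITY = 2.5   # amp-hours, rough ballpark for an AA-sized cell

def series(n_cells):
    """Series stack: voltages add, capacity stays that of one cell."""
    return n_cells * CELL_VOLTAGE, CELL_CAPACITY

def parallel(n_cells):
    """Parallel bank: voltage stays the same, capacities add."""
    return CELL_VOLTAGE, n_cells * CELL_CAPACITY

print(series(2))    # (3.0, 2.5)  -> two cells in series give 3V
print(series(4))    # (6.0, 2.5)  -> four in series give 6V
print(parallel(4))  # (1.5, 10.0) -> four in parallel still give 1.5V, just more charge
```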
A lot of things have changed since those days, of course. Different battery chemistries are in widespread use, and integrated circuit technology has changed to the point where 5V is not just undesirable, it's often not even tolerable.
As transistors were made smaller and smaller ("shrunk"), it became necessary to reduce their supply voltages. This has gone through many phases, from 5V -> 3.6V -> 3V and so on. A lot of integrated circuits now are perfectly happy to run on 1V or so. If a gizmo uses only such ICs, it can run off a single battery.
But there are at least a couple of reasons to use higher voltages. One is that there are still some older-technology devices around that run on a 3V standard, or communicate over signaling buses that are specified at more than 1V. There are also some components, such as certain displays, that need higher voltages to work.
It is possible to boost voltages, but a better strategy is often to use a power source that provides the highest voltage needed. For the lower voltage devices, it's relatively easy/cheap to drop the voltage down.
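As a rough back-of-the-envelope illustration of why stepping down is cheap, here's a sketch assuming a simple linear regulator and made-up load numbers (none of these values come from the comment above):

```python
# Back-of-the-envelope look at dropping a higher supply down to a lower one
# with a basic linear regulator. All values are illustrative assumptions.

V_IN = 3.0      # volts: two 1.5V cells in series
V_OUT = 1.2     # volts: what a modern low-voltage IC might want
I_LOAD = 0.020  # amps: a modest 20 mA load

# A linear regulator burns off the voltage difference as heat:
p_delivered = V_OUT * I_LOAD
p_wasted = (V_IN - V_OUT) * I_LOAD

print(f"delivered: {p_delivered*1000:.1f} mW, wasted as heat: {p_wasted*1000:.1f} mW")
# At small currents like this the waste is tiny, which is part of why
# "start high and step down" is often simpler than boosting a single cell up.
```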
So you use two batteries in series, which gives a nominal 3V (in practice it sags below that fairly quickly as the cells discharge). For the parts that don't need that much, you reduce the voltage.