
Energy use from vacuum tube to integrated circuit and beyond

I thought it would be fitting this time around, as climate change negotiations proceed in Copenhagen, to discuss an interesting study done by researchers at Lawrence Berkeley National Laboratory, Stanford University, Microsoft, and Intel.

by Marcin Wichary on Flickr

Many people have heard some version of “Moore’s Law” – that the number of transistors on a chip approximately doubles every two years. This study reveals a related trend in energy consumption per computation.
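To get a sense of how fast that doubling compounds, here is a one-line bit of illustrative arithmetic (round numbers only, not actual transistor counts):

```python
# If the transistor count doubles every 2 years, then over a decade:
years = 10
doublings = years / 2
print(2 ** doublings)  # 32.0 -- roughly a 32-fold increase in ten years
```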

There are some details to consider, of course. For starters, it helps to know that everything a computer does is broken down into individual arithmetic operations. When talking about how buff a supercomputer is, for example, we talk about the number of floating point operations it can do per second (today they run upwards of one million billion operations per second – what we call Peta-flops). This study doesn’t take modern supercomputers into account, but the researchers note that today’s supercomputers are made up of clusters of individual processors, some of which they did measure.

Now, your computer isn’t always working to its full potential. When it does, it typically doesn’t respond well to user interaction (like when you start all of your apps in the morning). So to perform the measurements, the researchers had to keep the processors busy for the period during which they measured energy consumption. They also disabled the displays on the laptops that they measured to put them on equal footing with the other computers.

In addition to these modern measurements, the researchers incorporated historical power consumption data from some of the original vacuum tube supercomputers and other earlier computer models that aren’t available today. The methods used in these prior studies varied, but the graph resulting from this aggregate study shows a remarkably clear trend:

J. Koomey et al., "Assessing Trends in the Electrical Efficiency of Computation over Time", submitted to IEEE Annals of the History of Computing, August 5, 2009

The vertical axis here is logarithmic and labeled in scientific notation. For those unfamiliar, the number after the + sign corresponds to the power of ten, so 1.E+01 means 10, 1.E+03 means 1000, and so on. Each horizontal line represents a ten-fold increase over the line below.
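If you want to convince yourself, Python happens to use the same notation for number literals, so you can check the labels directly:

```python
# The digit after the "+" sign is the power of ten.
print(1e01)  # 10.0
print(1e03)  # 1000.0

# Each horizontal gridline is ten times the one below it:
print(1e03 / 1e02)  # 10.0
```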

Note that the early vacuum tube computers operated at less than a Mega-flop per kilowatt-hour. These machines could perform ten thousand calculations using the same amount of energy as a 100 Watt lightbulb left on for an hour. You can see a jump in efficiency as transistors and integrated circuits replaced vacuum tubes around 1960. And as we have packed more and more computational power into the same amount of space on a microchip, the distance that electrons have had to travel without doing useful computation has steadily been reduced, roughly doubling the number of calculations that can be done with the same amount of energy every 18 months (very similar to the trend in speed that we’re familiar with). So, although your laptop can’t do a Peta-flop’s worth of operations in a second, it can do them using only about 1 kWh of electricity – a billion times more efficiently than those vacuum tube supercomputers (although we should note that manufacturing cost is not factored in). This is the power of exponential growth.
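That billion-fold figure is easy to check with the round numbers above (these are the illustrative values from the paragraph, not the study's measured data):

```python
# Back-of-the-envelope ratio using the rough round numbers quoted above.
vacuum_tube_ops_per_kwh = 1e6   # early machines: about a Mega-flop per kWh
laptop_ops_per_kwh = 1e15       # a Peta-flop's worth of operations on ~1 kWh

# Efficiency gain from vacuum tubes to a modern laptop
print(laptop_ops_per_kwh / vacuum_tube_ops_per_kwh)  # 1000000000.0 -- a billion-fold
```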

And according to a recent interview with Lorie Wigle of Intel, even as this trend continues, manufacturers will be looking at additional ways to improve the efficiency of information technology, from power supply design to power management via the operating system. Hopefully this will be combined with higher rates of e-waste recycling, leading to more sustainable computing practices.
