Computer energy efficiency doubles every 18 months

Energy efficiency has become the new yardstick for measuring computing progress, according to a new scientific paper.

The article, written by Kate Greene for the September issue of the Massachusetts Institute of Technology’s (MIT) Technology Review, argues that the energy computing devices consume to process data is becoming increasingly important in the 21st century.

Until now, Moore’s law has been the standard yardstick for quantifying the progress of computer microchips. According to the law – named after Gordon Moore, co-founder of the multinational chip manufacturer Intel – computer processing power doubles roughly every 18 months. Indeed, nobody is disputing that Moore’s law still holds true.

Koomey’s law

However, Greene reports that the energy efficiency of computers also doubles every 18 months. Furthermore, Jonathan Koomey, a consulting professor of civil and environmental engineering at Stanford University and lead author of the study, argues that this trend is more relevant to the modern age than Moore’s law.

This conclusion, dubbed “Koomey’s law” after the paper’s lead author, is backed by 60 years of data, just as Moore’s law is. According to Koomey, the power-consumption trend could come to matter even more than Moore’s law as battery-powered devices – such as mobile phones, tablets and sensors – flourish:

“The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half,” he said.

This continuing improvement, says Koomey, should enable ever more mobile computing and sensing applications in the future. To put the theory into context, if a MacBook Air were as inefficient as a computer from 1991, its battery would last just two and a half seconds.
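As a rough back-of-the-envelope check of that figure, the short Python sketch below compounds the 18-month doubling over the roughly 20 years separating 1991 from a 2011 MacBook Air; the seven-hour modern battery life is an assumption used only for illustration, not a figure from the paper.

```python
# Back-of-the-envelope check of the MacBook Air comparison.
# Assumptions: ~7 hours of battery life on a modern (2011) MacBook Air,
# and efficiency doubling every 1.5 years per Koomey's law.

DOUBLING_PERIOD_YEARS = 1.5       # Koomey's law: efficiency doubles every 18 months
YEARS_ELAPSED = 2011 - 1991       # span between the 1991 machine and the MacBook Air
BATTERY_LIFE_TODAY_S = 7 * 3600   # assumed modern battery life, in seconds

# Cumulative efficiency gain over the period: 2^(years / doubling period)
efficiency_gain = 2 ** (YEARS_ELAPSED / DOUBLING_PERIOD_YEARS)

# At 1991-era efficiency, the same battery would last this long
battery_life_1991_s = BATTERY_LIFE_TODAY_S / efficiency_gain

print(f"Efficiency gain since 1991: ~{efficiency_gain:,.0f}x")
print(f"Implied battery life at 1991 efficiency: ~{battery_life_1991_s:.1f} seconds")
```

Under those assumptions the gain works out to roughly a 10,000-fold improvement, which shrinks a seven-hour battery life to about 2.5 seconds – consistent with Koomey’s illustration.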

Jumps in efficiency

The research was carried out in collaboration with Intel and the software firm Microsoft, and co-authored by Microsoft’s Stephen Berard, Carnegie Mellon University’s Marla Sanchez and Intel’s Henry Wong. The data went back to the ENIAC, the first general-purpose computer, built in 1946. The ENIAC was used to calculate US Army firing tables and could perform several hundred calculations per second. This huge machine occupied 1,800 square feet and consumed a staggering 150 kilowatts of power – hardly a model of electrical efficiency by today’s standards. Koomey says that many of the changes that have increased computing power – such as shrinking components and cutting the communication time between them – have simultaneously improved energy efficiency.

Erik Brynjolfsson, a professor at MIT’s Sloan School of Management, says that while Moore’s law remains important, Koomey’s law could be more relevant to everyday consumers:

“Everyone’s familiar with Moore’s law and the remarkable improvements in the power of computers, and that’s obviously important,” says Brynjolfsson. But people, he adds, are paying more and more attention to the battery life of their electronics as well as to how fast they can run. “I think that’s more and more the dimension that matters to consumers, and in a sense, ‘Koomey’s law’, this trend of power consumption, is beginning to eclipse Moore’s law for what matters to consumers in a lot of applications.”
