By John Gruber
Kate Greene, for MIT Technology Review:
Researchers have, for the first time, shown that the energy efficiency of computers doubles roughly every 18 months.
The conclusion, backed up by six decades of data, mirrors Moore’s law, the observation from Intel co-founder Gordon Moore that computer processing power doubles about every 18 months. But the power-consumption trend might have even greater relevance than Moore’s law as battery-powered devices — phones, tablets, and sensors — proliferate.
“The idea is that at a fixed computing load, the amount of battery you need will fall by a factor of two every year and a half,” says Jonathan Koomey, consulting professor of civil and environmental engineering at Stanford University and lead author of the study.
Fascinating, really. And I’d say this is what Apple’s been chasing for at least a decade, while its competitors remained focused on Moore’s Law. (Thanks to DF reader Aditya Sood.)
Update: Via Kottke, here’s Alexis Madrigal thinking about the implications of this:
Imagine you’ve got a shiny computer that is identical to a MacBook Air, except that it has the energy efficiency of a machine from 20 years ago. That computer would use so much power that you’d get a mere 2.5 seconds of battery life out of the Air’s 50 watt-hour battery instead of the seven hours that the Air actually gets. That is to say, you’d need 10,000 Air batteries to run our hypothetical machine for seven hours. There’s no way you’d fit a beast like that into a slim mailing envelope.
Now think forward 20 years.
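Madrigal’s numbers check out. A quick sketch of the arithmetic, assuming the figures from the quote (seven hours of runtime, efficiency doubling every 1.5 years, looking back 20 years):

```python
# Back-of-envelope check of the quoted claim: a 20-year-old level of
# energy efficiency would cut a 7-hour runtime down to a few seconds.
hours_of_runtime = 7
doubling_period_years = 1.5   # Koomey's observed doubling period
years_back = 20

# How many times efficiency has doubled over 20 years:
factor = 2 ** (years_back / doubling_period_years)

seconds_of_runtime = hours_of_runtime * 3600 / factor

print(f"efficiency factor: ~{factor:,.0f}x")      # on the order of 10,000
print(f"runtime: ~{seconds_of_runtime:.1f} s")    # a couple of seconds
```

The factor of roughly 10,000 is where the “10,000 Air batteries” figure comes from, and dividing seven hours by it yields the quoted couple of seconds of battery life.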
★ Thursday, 15 September 2011