Forget Moore's Law - we need Koomey's Law, says ARM Director

Moore's Law is not only nearing the end of its life; it is no longer what we need. Instead, we need Koomey's Law, says a Director of Technology at ARM.

The phrase Moore's Law is something of a misnomer, as, for that matter, is Koomey's Law; neither is a law of nature. They are merely observations. It might be better to treat them as examples of learning rates: the more we do something, the better we get at it. Wright's Law, named after Theodore Wright, an aeronautical engineer who observed in the 1930s that every doubling in cumulative aircraft production led to a fixed percentage fall in cost, is an example of a learning rate. Renewables and lithium-ion batteries also follow learning rates.
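A learning rate of this kind is easy to sketch in a few lines. The 15% cost decline per doubling below is an illustrative assumption, not a figure from the article or from Wright's original study:

```python
# Wright's Law (illustrative): each doubling of cumulative production
# cuts unit cost by a fixed fraction, the "learning rate".
import math

def wright_cost(initial_cost, cumulative_units, learning_rate=0.15):
    """Unit cost after producing `cumulative_units`, assuming a 15%
    cost decline per doubling (an assumption for illustration only)."""
    doublings = math.log2(cumulative_units)
    return initial_cost * (1 - learning_rate) ** doublings

# Starting at $100 per unit, cost after 1, 2, 4 and 8 units produced:
print([round(wright_cost(100, n), 2) for n in (1, 2, 4, 8)])
# -> [100.0, 85.0, 72.25, 61.41]
```

The key point is that cost falls with cumulative output, not with time; any appearance of steady improvement per year is a by-product of production growing roughly exponentially.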

Moore's Law describes the number of transistors on an integrated circuit, but it effectively refers to processing power. It used to be summarised as computers doubling in speed every 18 months, although the rate of doubling has slowed. So it may be more accurate to say that computing power increases with cumulative output, creating the illusion that the improvement follows a function of time.

Moore's Law gets a pick-me-up

Koomey's Law refers to the number of computations per joule of energy dissipated, or the energy efficiency per unit of processing power. The Law used to state that this efficiency doubles every 1.5 years.
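Compounded over time, that doubling period implies enormous efficiency gains. A minimal sketch, assuming the 1.5-year doubling period quoted above:

```python
# Koomey's Law (illustrative): computations per joule double every
# ~1.5 years, so efficiency grows as 2^(years / doubling_period).
def koomey_multiplier(years, doubling_period=1.5):
    """Efficiency gain factor after `years`, assuming the article's
    1.5-year doubling period."""
    return 2 ** (years / doubling_period)

print(koomey_multiplier(1.5))   # one doubling period -> 2.0
print(koomey_multiplier(3))     # two doubling periods -> 4.0
# Over a decade the implied gain is 2^(10/1.5), roughly 100x.
```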

The need

There is a physical limit to how fast conventional computers can go because the gap between transistors cannot shrink below the size of an atom. So instead, computer processing power must advance using different technology, such as photonics, in which light rather than electrical signals links transistors, or quantum computing.


But we live in an age when the call for sustainability is hard to resist. And computers use a lot of energy.

According to one estimate, the Internet uses around one-tenth of the world's energy.

ARM says we need a switch

In a blog post, Rob Aitken, Fellow & Director of Technology, Arm, said that the "technology roadmap can no longer prioritise processing power."

He said: "3.7 billion people worldwide still do not have full access to digital technology. Closing the digital divide is a moral imperative, but it also poses a new conundrum for the tech industry."

He then posed the question: "How do we mitigate the environmental impact of 3.7 billion new digital consumers, connecting everyone, everywhere without catastrophically accelerating climate change?"

He said: "Koomey's Law is arguably more relevant to the way consumers experience computing today – and the way we should be constructing tech roadmaps. Our digital life tends to span multiple devices, where battery life and performance per watt are more important than gross performance alone."

His conclusion: Performance per watt needs to be the new paradigm.

ARM and Nvidia

Of course, an ARM Director would say that, wouldn't he? The essence of ARM is creating chip designs that use energy efficiently; that is why ARM-licensed chips sit in hundreds of millions of smartphones and even more Internet of Things devices.

But Nvidia is purchasing ARM.

Nvidia is famous for its chips that soup up computers, especially PCs used to play computer games. However, Nvidia technology also helps form the backbone of neural networks, which make possible deep learning, the cutting-edge application of AI.

Between 2012 and 2018, "the computational resources needed to produce a best-in-class AI model" increased 300,000 times, stated a report on Forbes.

According to an article in Technology Review, training a simple AI algorithm can emit as much carbon as the lifetime emissions of five cars.

Then, of course, there is the carbon footprint of blockchain.

Maybe one solution is to put data centres, neural networks and Bitcoin miners into orbit, with energy coming from solar power.

Mr Aitken said: "As we begin to close the digital divide, sharing the benefits of connectivity with billions of new users of technology, this relentless focus on efficiency will become ever more vital. If we're to avoid catastrophic climate change, keeping power and energy numbers stable is not enough; we must work to ensure that they actively decrease, reducing energy consumption and lowering emissions wherever computing happens.

"Performance per watt must become the new paradigm, guiding product roadmaps that extract an increasing amount of performance from an ever-decreasing power envelope."

It is hard to disagree.
