Does Improving Efficiency Really Save Energy?
Friday, January 14, 2011
by Karl Stephan, Engineering Ethics Blog
You might almost say that what health is to doctors or justice is to lawyers, efficiency is to engineers. Making machines more efficient sums up a good bit of everything that has gone on in technology and engineering over the last couple of hundred years. And if you broaden the definition of efficiency to include useful (or desirable) work performed per unit cost (and not just per unit of raw energy input), then everything from airplanes to zippers has gotten more efficient over the years. Increased efficiency in energy-consuming products has been viewed as the no-brainer answer to the problem of rising energy demands around the globe. Instead of building more coal-fired power plants, conservationists say, just replace X million incandescent bulbs with compact fluorescents, and you've saved tons of carbon at virtually no infrastructure cost. This is all very well, but a recent article in The New Yorker calls into question the widely accepted idea that increasing energy efficiency truly leads to less energy consumed.
A nineteenth-century economist named William Stanley Jevons was among the first to point out that improved energy efficiency in manufacturing iron, for example (Jevons' father was an iron merchant), doesn't necessarily mean that you will end up using less coal to make iron in the long run. What can happen, especially when energy is a large portion of the finished product's cost, is that when the price goes down because of lower energy use, people start using more iron; so much more, in fact, that even with more energy-efficient production, the total amount of iron sold is so much larger that the industry as a whole ends up consuming more energy than before, not less.
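The arithmetic behind Jevons' observation is simple enough to sketch. The numbers below are purely illustrative assumptions, not figures from Jevons or from the New Yorker article: they just show how a 30 percent efficiency gain can be swamped by the extra demand that cheaper iron provokes.

```python
# Illustrative sketch of the Jevons (rebound) effect with made-up numbers.
# Assumptions: iron initially takes 10 units of coal per ton and 100 tons
# are sold; a 30% efficiency gain cuts coal use to 7 units per ton, and the
# resulting lower price triggers a 60% jump in demand (160 tons sold).

def total_coal(coal_per_ton: float, tons_sold: float) -> float:
    """Total coal consumed by the whole industry."""
    return coal_per_ton * tons_sold

before = total_coal(10.0, 100)   # 1000 units of coal
after = total_coal(7.0, 160)     # 1120 units of coal, despite higher efficiency

print(f"before: {before}, after: {after}, rebound: {after > before}")
```

Whether total consumption actually rises depends on how strongly demand responds to the price drop; the hypothetical 60 percent jump above is chosen to make the rebound visible.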
Jevons’ idea obviously applies to a lot of things besides iron. Take computers, for example. The first electronic computer occupied a room the size of a small house and consumed about 150 kilowatts of power. Its computing ability was much less than what a tiny 8-pin embedded microprocessor can do today. On a strict efficiency basis, measured by almost any yardstick—energy consumption, cost, space, weight—today’s microprocessor is thousands or millions of times more efficient. But guess what? In 1946 there was exactly one electronic computer of the type I’m describing (ENIAC, installed at the U. S. Army’s Aberdeen Proving Ground), and today there are many millions of computers of all sizes, plus giant server farms that strain the generating capacity of the entire Northwest U. S. power grid. The total amount of electricity devoted to electronic computing has gone from 150 kW in 1946 to many gigawatts today, if you count all the mobile phones on batteries, the computerized cash registers, and so on.
So what’s an engineer to do? Give up on making things more efficient because people will only use more of them? This is a great example of a case where doing the right thing in a micro-environment (a single company or even industry) may lead to complicated consequences in a macro-environment such as the economy of a country or even the globe. In fact, it goes to the heart of what engineering is all about, and makes one face the question of how to justify energy consumption on a fundamental level.
While this blog is not about global warming, there are those who believe that radical reductions in the world’s carbon footprint are imperative if we are to avoid a gigantic creeping disaster that will flood most of the world’s coastal cities, which means, more or less, many of the world’s cultural and political capitals. Oh, and by the way, millions will die prematurely. Although I do not happen to agree with this premise, let’s grant it for the sake of argument. Given an immediate need to reduce energy consumption by a large fraction, what should we do? Make everything that uses energy more efficient? Jevons’ idea says this simply won’t work. In the broad definition of efficiency we’ve been using, improving efficiency often leads to more energy use, not less.
The unpleasant alternative to what looked like a win-win solution (improved energy efficiency and less energy usage) is some form of rationing: either energy taxes, or simple flat-out restrictions on energy use. Many countries practice this already: it’s called power outages. Power is on only at night, or three hours a day, or not for weeks at a time. It’s arbitrary, unfair, and hits the poorest hardest, but it works. The tax alternative has the advantage that it provides some economic incentive for improving efficiency, but if technology improves to the point that the efficiency gains offset the tax, you’re right back where you started. The only sure-fire way to keep people from using as much energy as they want is to put them under the government’s thumb somehow. Cuba, I understand, has raised this process to an art form, if you consider old cars towed by mules artistic.
Don’t get the idea I think efficiency is bad. If I did, I couldn’t very well call myself an engineer. However, Jevons reminds us that, like many other things in life, energy efficiency can be helpful in limited circumstances. But expecting it to solve all the world’s energy problems is not only unrealistic, but probably counterproductive as well.
Sources: David Owen’s article “The Efficiency Dilemma” appeared in the Dec. 20 & 27, 2010 issue of The New Yorker, pp. 78-85.