Wednesday, January 13, 2010

Molecular Manufacturing

In fairness to all, nanotechnology is in its early days.  We are seeing a lot of exciting proof-of-principle work.

 

We can make some predictions.

 

We need an engineering template idea similar to the integrated circuit to organize and use this knowledge.  Otherwise we will be studying thousands of incompatible dead ends.  Right now we do not know whether this is possible.  Such a template system is likely to be three-dimensional.

 

Our capabilities will swiftly surpass anything imagined by Mother Nature.  This is important because Mother Nature has built within a lot of constraints that it could not easily work around.  We actually have the option of introducing exotic combinations that vastly outperform natural ones.

 

In short, expect to be surprised and surprised and surprised.

 

This NRC report attempts to establish a focus and a support system within the community to ease the process.  I think that we still need much more data and more conferences to settle on directions.  Again, these are early days, and this process will likely take decades to develop.  Right now we want to avoid closing off any avenues.

 

Molecular Manufacturing:

 

http://metamodern.com/2010/01/07/molecular-manufacturing-the-nrc-study-and-its-recommendations/

 

The NRC study and its recommendations

 

by ERIC DREXLER on JANUARY 7, 2010

Part 6 of a series prompted by the recent 50th anniversary of Feynman’s historic talk, “There’s Plenty of Room at the Bottom”. This is arguably the most important post of the series, or of this blog to date.

Topics:
  • The most credible study of molecular manufacturing to date
  • The study’s recommendations for Federal research support
  • The current state of progress toward implementation
  • The critical problem: not science, but institutions and focus


Committee to Review the National Nanotechnology Initiative, National Research Council


A formal, Federal-level study has examined the physical principles of high-throughput atomically precise manufacturing (aka molecular manufacturing), assessing its feasibility and closing with a call for experimental research.

Surprisingly, this recommendation smacks of heresy in some circles, and the very idea of examining the subject met strong opposition.

The process in outline: Congress voted to direct the U.S. National Research Council, the working arm of the U.S. National Academies, to conduct, as part of the lengthy Triennial Review of the National Nanotechnology Initiative, what in the House version had been described as a “Study on molecular manufacturing…to determine the technical feasibility of the manufacture of materials and devices at the molecular scale”, and in response, the NRC convened a study committee that organized a workshop, examined the literature, deliberated, and reported their conclusions, recommending appropriate research directions for moving the field forward, including experimental research directed toward development of molecular manufacturing.

NRC studies are not haphazard processes, and the National Academies website describes its procedures in substantial detail. Because the NRC often advises the Federal government on politically charged questions, “Checks and balances are applied at every step in the study process to protect the integrity of the reports and to maintain public confidence in them.” These include independent scientific review of reports that are themselves the product of independent experts assembled with attention to potential conflicts of interest.
It’s worth taking a moment to compare the NRC to the three previous leading sources of information on molecular manufacturing: committed advocates, committed critics, and self-propagating mythologies. None of these is remotely comparable. Unless one has studied the topic closely and in technical detail, it seems reasonable to adopt the committee’s conclusions as a rough-draft version of reality, and to proceed from there.
Here are some excerpts that I think deserve special emphasis, followed by the concluding paragraph of the report:
Technical Feasibility of Site-Specific Chemistry for Large-Scale Manufacturing

The proposed manufacturing systems can be viewed as highly miniaturized, highly articulated versions of today’s scanning probe systems, or perhaps as engineered ribosome-like systems…

…The technical arguments make use of accepted scientific knowledge but constitute a “theoretical analysis demonstrating the possibility of a class of as-yet unrealizable devices.”22

Construction of extended structures with three-dimensional covalent bonding may be easy to conceive and might be readily accomplished, but only by using tools that do not yet exist.25 In other words, the tool structures and other components cannot yet be built, but they can be computationally modeled.


[ ... concluding paragraph:]


Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths [to advanced systems] cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.


22. K.E. Drexler. 1992. Nanosystems, Molecular Machinery, Manufacturing and Computation. New York: Wiley & Sons.


25. M. Rieth and W. Schommers, eds. 2005. Handbook of Computational and Theoretical Nanotechnology. American Scientific Publishers.


My summary in a nutshell:

The committee examined the concept of advanced molecular manufacturing, and found that the analysis of its physical principles is based on accepted scientific knowledge, and that it addresses the major technical questions. However, in the committee’s view, theoretical calculations are insufficient: Only experimental research can reliably answer the critical questions and move the technology toward implementation. Research in this direction deserves support.
I should note that the tone of the report is skeptical, emphasizing what the committee [correctly] sees as the unusual approach and the [resulting, methodologically inherent] incompleteness of the results. A quick skim could easily suggest a negative assessment. A closer reading, however, shows that points raised are in the end presented, not as errors, nor even as specific, concrete weaknesses in the analysis, but instead as work not yet done, motivating the development of a research program directed toward validating and achieving the proposed technological objectives.

The call for research

The report closes with a call for research on pathways toward molecular manufacturing, quoted above, and an earlier section outlines some appropriate objectives:
To bring this field forward, meaningful connections are needed between the relevant scientific communities. Examples include:
Delineating desirable research directions not already being pursued by the biochemistry community;
Defining and focusing on some basic experimental steps that are critical to advancing long-term goals; and
Outlining some “proof-of-principle” studies that, if successful, would provide knowledge or engineering demonstrations of key principles or components with immediate value.

The response and progress

The technology roadmap

 

Research directions toward molecular manufacturing have been charted in the subsequent Technology Roadmap for Productive Nanosystems, the result of a project led by the Battelle Memorial Institute, the manager of research at U.S. National Laboratories that include Pacific Northwest, Oak Ridge, and Brookhaven. These labs hosted several Roadmap workshops and provided many of the participating scientists and engineers; I served as the lead technical consultant for the project.

The Roadmap is responsive to the NRC request above, and recommends research that includes work along the lines I describe below.

Molecular engineering methodologies

 

The crucial research objective is the development of systematic experimental and design methodologies that enable the fabrication of large, multicomponent, atomically precise nanostructures by means of self-assembly. This research direction fits the NRC committee’s criteria: it is, by nature, strongly experimental, and in mimicking macromolecular structures and processes in biology, it holds promise for near-term biomedical applications.

Structural DNA nanotechnology

 

In the year the NRC report reached print, a Nature paper reported a breakthrough-level development, “DNA origami”. This technology opened the door to systematic, atomically precise engineering on a scale of hundreds of nanometers and millions of atoms.

Since then, we’ve seen rapid progress in structural DNA nanotechnology. I discussed recent landmark achievements here and here.

Polypeptide foldamer nanotechnology

 

There’s also been rapid progress in design methodologies for complex, atomically precise nanoscale structures made from polypeptide foldamers (aka proteins). In recent years, protein engineering has achieved a functional milestone: systematically engineering devices that perform controlled molecular transformations (see “Computational tools for designing and engineering biocatalysts”).

Framework-directed assembly of composite systems

 

Looking forward, promising next steps involve integrating structural DNA frameworks with polypeptide foldamers, other foldamers, and other organic and inorganic materials. These classes of components have complementary properties (as discussed in my comments on “Modular Molecular Composite Nanosystems”).


Why these developments are important

 

As is now well recognized, “existing biological systems for protein fabrication could be harnessed to produce nanoscale molecular machines with designed functions” (“Computational protein design promises to revolutionize protein engineering”). Further, as biological systems demonstrate, programmable molecular machine systems can be harnessed to build programmable molecular machine systems.

As I’ve discussed, this capability could be exploited to pursue a spiral of improvement in materials, components, and molecular machine systems.

The path ahead

 

This spiral of development, in which molecular tools are used to construct more capable next-generation molecular tools, could be exploited to develop products with expanding applications, falling cost, and increasing value.

As I discussed in “Making vs. Modeling: A paradox of progress in nanotechnology”, each generation of tools can be expected to enable fabrication processes and products that are more robust, more susceptible to computational simulation, and better suited to established systems engineering design methodologies. This indicates the potential for an accelerating pace of development toward a technology platform that can support the implementation of high-throughput atomically precise fabrication.

This path is being followed today, yet the level of support and organization, of mission and urgency, does not come close to matching its potential for solving long-term yet urgent problems.

 

Appropriate and inappropriate responses to the NRC report on molecular manufacturing

The evaluation of the feasibility of molecular manufacturing and recommendations for research form the concluding section of the body of the NRC’s Triennial Review of the National Nanotechnology Initiative. In the three years since the publication of the NRC report, I have seen no document from a Federal-level source that acknowledges these conclusions, and, of course, none that offers a substantive response.

This is of concern, because the NRC report calls for a sharp break with past thinking. To put it bluntly, much of the opinion in general circulation about molecular manufacturing (both pro and con) is rubbish because it is based on mythology, and not on the scientific literature. The NRC report can be criticized on several points, but it isn’t rubbish.

Fulfilling the initial promise of nanotechnology

 

Atomically precise fabrication technologies exist today, and as I have noted, advanced atomically precise fabrication is the promise that initially defined the field of nanotechnology. I believe the record shows that advanced atomically precise fabrication is also the promise that got it funded.

Building on recent advances, strategically targeted research in atomically precise fabrication could draw on and contribute to fields across the spectrum of modern nanotechnologies, from materials to devices, and could bring them together to elevate the technology platform for further advances. Ultimately, as the NRC report suggests, those advances could potentially deliver what was promised at the inception of the field.

Make no mistake: the path to high-throughput atomically precise manufacturing will not be short, and it will not be direct. It will be a multi-stage development process, and as I have discussed, the early steps differ greatly from the ultimate results in both their form and their potential applications.

Growing urgency

 

Today, the potential promise of high-throughput atomically precise manufacturing must be regarded as credible. As a consequence of its inherent productive capacity, it offers a credible potential solution to problems of energy production and climate change. The National Research Council of the U.S. National Academies of Science, Engineering, and Medicine has called for the support of research explicitly directed toward the development of this technology. This has become urgent.

The strength and limitations of current research support

 

It is both laudable and problematic that the research I’ve reported above is chiefly funded by programs in biology and medicine. This support has enabled great progress, and I know from long discussion that researchers in these areas have ambitious visions for the future. There are, however, limits to what can be achieved while developing molecular engineering within the framework of biotechnology, much as there would have been if aeronautical engineering research had been developed as a field of ornithology.

The critical need today is not for new scientific results, but for an integrative approach to molecular systems engineering, directed toward strategic technology objectives. The science is ready. The institutions are not.

 

A word to readers:

 

The implications of the NRC report call for reconsidering views that have shaped policy in the research disciplines critical to progress toward molecular manufacturing, yet like many other NRC reports, it is virtually unknown. Directing other readers to what I have written here could help to remedy this problem.

(And a further note to readers who are bursting with frustration: Please don’t. It is counterproductive, and generates far more heat than light.)

Note: I say in the first paragraph that Congress voted for “…what in the House version had been described as a ‘Study on molecular manufacturing…to determine the technical feasibility of the manufacture of materials and devices at the molecular scale’” to reflect an oddity of the legislative history behind the study: After the House transmitted the bill to the Senate, a nanotechnology business association successfully lobbied to replace “molecular manufacturing”, thereby calling for a (puzzling) “Study on molecular self-assembly”. An uproar followed. In the end, the NRC did a study of molecular self-assembly, as directed in the final bill, but also responded to the request by the House for a study of molecular manufacturing. In the end, molecular manufacturing dominated the agenda of the workshop. [I corrected the main text and this description after reviewing the GPO documents, several hours after the initial posting.]

In a later section, I note that “I have seen no document from a Federal-level source that acknowledges these conclusions”. There is, in fact, a document that quotes from the conclusions, but the quoted material is edited in a way that wrongly indicates that the recommendations regarding molecular manufacturing are, instead, recommendations regarding molecular self-assembly (see “The National Nanotechnology Initiative: Second Assessment and Recommendations of the National Nanotechnology Advisory Panel”, p.43).

[Dec 8: Updated to add the paragraph beginning “I should note that the tone of the report is skeptical...” I would expect this tone to strongly influence the impression left on casual readers, blunting the impact of what, in substance, amounts to a sharp rebuke to the conventional wisdom.]

An open comment thread for this post can be found here.

Carbon Dioxide Optimization

This is a pretty good review of carbon dioxide and its role in the climate of Earth.  Now that the political version of climate science has been exposed as largely bogus, a lot of the contradictory voices are swiftly getting published and gaining an audience.

A lot of this we already know but the clear take-home message is that the geological record supports CO2 levels at 1000 ppm as likely the best overall level for supporting our ecosystem.  It may turn out that the ongoing recovery from the ice age is actually promoting a return to that effective level.  We have many centuries to go yet.

What has been buried is the curious hypothesis that rising CO2 is driving global warming.  It simply is not.  There might be a contribution, but we cannot even show that.  Right now the folks here have satisfied themselves that any such contribution is clearly negligible.

The climate certainly varies and often surprises.  We have been riding through a peak in cosmic ray flux, which argued this early fall for a miserable winter.  So far the winter has been living up to that prediction.


Steven D. Levitt and Stephen J. Dubner: The green gadflies

Posted: January 07, 2010, 10:30 AM by NP Editor


Not so many years ago, schoolchildren were taught that carbon dioxide is the naturally occurring lifeblood of plants, just as oxygen is ours. Today, children are more likely to think of carbon dioxide as a poison. That’s because the amount of carbon dioxide in the atmosphere has increased substantially over the past 100 years, from about 280 parts per million to 380.
But what people don’t know, say the scientists at Intellectual Ventures labs in Bellevue, Wash., is that the carbon dioxide level some 80 million years ago — back when our mammalian ancestors were evolving — was at least 1,000 parts per million. In fact, that is the concentration of carbon dioxide you regularly breathe if you work in a new energy-efficient office building, for that is the level established by the engineering group that sets standards for heating and ventilation systems.
So not only is carbon dioxide plainly not poisonous, but changes in carbon dioxide levels don’t necessarily mirror human activity. Nor does atmospheric carbon dioxide necessarily warm the earth: Ice-cap evidence shows that over the past several hundred thousand years, carbon dioxide levels have risen after a rise in temperature, not the other way around.
Meet Ken Caldeira, a soft-spoken man with a boyish face and a halo of curly hair. He runs an ecology lab at Stanford University for the Carnegie Institution. Caldeira is among the most respected climate scientists in the world, his research cited approvingly by the most fervent environmentalists. He and a co-author coined the phrase “ocean acidification,” the process by which the seas absorb so much carbon dioxide that corals and other shallow-water organisms are threatened. He also contributes research to the Intergovernmental Panel on Climate Change, which shared the 2007 Nobel Peace Prize with Al Gore for sounding the alarm on global warming.
If you met Caldeira at a party, you would likely place him in the fervent-environmentalist camp himself. He was a philosophy major in college, for goodness’ sake, and his very name — a variant of caldera, the crater-like rim of a volcano — aligns him with the natural world. In his youth (he is 53 now), he was a hard-charging environmental activist and all-around peacenik.
Caldeira is thoroughly convinced that human activity is responsible for some global warming and is pessimistic about how future climate will affect humankind. He believes that “we are being incredibly foolish emitting carbon dioxide” as we currently do.
Yet his research tells him that carbon dioxide is not the right villain in this fight. For starters, as greenhouse gases go, it’s not particularly efficient. “A doubling of carbon dioxide traps less than 2% of the outgoing radiation emitted by the earth,” he says. Furthermore, atmospheric carbon dioxide is governed by the law of diminishing returns: Each gigaton added to the air has less radiative impact than the previous one.
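The diminishing-returns behavior Caldeira describes follows from the standard logarithmic approximation for CO2 radiative forcing, ΔF ≈ 5.35 ln(C/C0) W/m². That formula is a widely used textbook fit, not something taken from this excerpt; a quick sketch makes the effect concrete:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate radiative forcing (W/m^2) of CO2 relative to a
    280 ppm baseline, using the standard logarithmic fit
    dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Equal increments of CO2 yield shrinking increments of forcing:
first_100 = co2_forcing(380) - co2_forcing(280)  # 280 -> 380 ppm
next_100 = co2_forcing(480) - co2_forcing(380)   # 380 -> 480 ppm
print(round(first_100, 2), round(next_100, 2))   # 1.63 1.25
```

The second 100 ppm adds noticeably less forcing than the first, which is the "law of diminishing returns" the passage refers to.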
Caldeira mentions a study he undertook that considered the impact of higher carbon dioxide levels on plant life. While plants get their water from the soil, they get their food — carbon dioxide, that is — from the air. An increase in carbon dioxide means that plants require less water to grow.
Caldeira’s study showed that doubling the amount of carbon dioxide while holding steady all other inputs — water, nutrients and so forth — yields a 70% increase in plant growth, an obvious boon to agricultural productivity.
“That’s why most commercial hydroponic greenhouses have supplemental carbon dioxide,” a colleague says. “And they typically run at 1,400 parts per million.”
“Twenty thousand years ago,” Caldeira says, “carbon dioxide levels were lower, sea level was lower — and trees were in a near state of asphyxiation for lack of carbon dioxide. There’s nothing special about today’s carbon dioxide level, or today’s sea level, or today’s temperature. What damages us are rapid rates of change. Overall, more carbon dioxide is probably a good thing for the biosphere — it’s just that it’s increasing too fast.”
The gentlemen of Intellectual Ventures abound with further examples of global warming memes that are all wrong.
Rising sea levels, for instance, “aren’t being driven primarily by glaciers melting,” Lowell Wood says, no matter how useful that image may be for environmental activists. The truth is far less sexy. “It is driven mostly by water-warming — literally, the thermal expansion of ocean water as it warms up.”
Sea levels are rising, Wood says — and have been for roughly 12,000 years, since the end of the last ice age. The oceans are about 425 feet higher today, but the bulk of that rise occurred in the first thousand years. In the past century, the seas have risen less than eight inches.
As to the future: Rather than the catastrophic 30-foot rise some people have predicted over the next century — goodbye, Florida! — Wood notes that the most authoritative literature on the subject suggests a rise of about one and a half feet by 2100. That’s much less than the twice-daily tidal variation in most coastal locations. “So it’s a little bit difficult,” he says, “to understand what the purported crisis is about.”
Caldeira, with something of a pained look on his face, mentions a most surprising environmental scourge: trees. Yes, trees. As much as Caldeira personally lives the green life — his Stanford office is cooled by a misting water chamber rather than air conditioning — his research has found that planting trees in certain locations actually exacerbates warming because comparatively dark leaves absorb more incoming sunlight than, say, grassy plains, sandy deserts or snow-covered expanses.
Then there’s this little-discussed fact about global warming: While the drumbeat of doom has grown louder over the past several years, the average global temperature during that time has in fact decreased.
In the darkened conference room, Intellectual Ventures co-founder Nathan Myhrvold cues up an overhead slide that summarizes IV’s views of the current slate of proposed global warming solutions. The slide says:
  • Too little
  • Too late
  • Too optimistic
Too little means that typical conservation efforts simply won’t make much of a difference. “If you believe there’s a problem worth solving,” Myhrvold says, “then these solutions won’t be enough to solve it. Wind power and most other alternative energy things are cute, but they don’t scale to a sufficient degree. At this point, wind farms are a government subsidy scheme, fundamentally.”
What about the beloved Prius and other low-emission vehicles? “They’re great,” he says, “except that transportation is just not that big of a sector.”
Also, coal is so cheap that trying to generate electricity without it would be economic suicide, especially for developing countries. Myhrvold argues that cap-and-trade agreements, whereby coal emissions are limited by quota and cost, can’t help much, in part because it is already …
Too late. The half-life of atmospheric carbon dioxide is roughly one hundred years, and some of it remains in the atmosphere for thousands of years. So even if humankind immediately stopped burning all fossil fuel, the existing carbon dioxide would remain in the atmosphere for several generations. Pretend the United States (and perhaps Europe) miraculously converted overnight and became zero-carbon societies. Then pretend they persuaded China (and perhaps India) to demolish every coal-burning power plant and diesel truck. As far as atmospheric carbon dioxide is concerned, it might not matter all that much. And by the way, that zero-carbon society you were dreamily thinking about is way …
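The "too late" arithmetic above can be illustrated with a toy single-half-life decay model. This is my simplification of the 100-year figure quoted in the passage; real carbon-cycle removal has multiple timescales, including the thousands-of-years tail the text mentions:

```python
def airborne_fraction(years, half_life=100.0):
    """Fraction of a CO2 pulse still in the atmosphere after `years`,
    under a crude single-half-life model (an illustrative
    simplification, not a carbon-cycle calculation)."""
    return 0.5 ** (years / half_life)

# Even a full century after emissions stopped, half the pulse remains:
print(airborne_fraction(100))  # 0.5
print(airborne_fraction(300))  # 0.125
```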
Too optimistic. “A lot of the things that people say would be a good thing probably aren’t,” Myhrvold says. As an example, he points to solar power. “The problem with solar cells is that they’re black, because they are designed to absorb light from the sun. But only about 12% gets turned into electricity, and the rest is reradiated as heat — which contributes to global warming.”
Although a widespread conversion to solar power might seem appealing, the reality is tricky. The energy consumed by building the thousands of new solar plants necessary to replace coal-burning and other power plants would create a huge long-term “warming debt,” as Myhrvold calls it. “Eventually, we’d have a great carbon-free energy infrastructure but only after making emissions and global warming worse every year until we’re done building out the solar plants, which could take 30 to 50 years.”
From SuperFreakonomics by Steven D. Levitt and Stephen J. Dubner. Copyright © 2009 by Steven D. Levitt and Stephen J. Dubner. Published with arrangement by HarperCollins Canada

Alzheimer Advance

This is a promising experimental protocol that may lead to exact knowledge of the underlying issues and perhaps from there to actual treatment.  I have never thought the disease was intractable in the way our old enemy cancer is.  It has always begged recognition of a specific failure, similar to insulin failure in diabetes.  This protocol allows us to ask questions and easily test interventions.

 

A pathway exists that is not yet understood.  I hope progress is now swift.  This disease is not just personally tragic; it imposes a huge healthcare burden that needs to be ameliorated.  We want the elderly to retain personal independence to the end. This also suggests that progress may be possible on comparable ailments for the same reasons.

 

 

University of Central Florida Alzheimer's Discovery Could Lead to Long-sought Preventive Treatment

 

The research, published in the science and medicine journal PLoS ONE, also demonstrates how the unique application of an existing cell research technique could accelerate the discovery of treatments to exploit the new findings.

Most Alzheimer's studies have focused on brain cells already damaged by amyloid-beta or the effects of high concentration of amyloid-beta. The University of Central Florida team, led by James Hickman, head of the UCF NanoScience Technology Center's Hybrid Systems Laboratory, instead explored impacts of very low amyloid-beta concentrations on healthy cells in an effort to mimic the earlier stages of Alzheimer's. The results were shocking.

The UCF team found that over time, though there are no outward signs of damage, exposure to moderate amyloid-beta concentrations somehow prevents electrical signals from traveling normally through the cells. Because the effect is seen in otherwise healthy cells, Hickman believes the team may have uncovered a critical process in the progression of Alzheimer's that could occur before a person shows any known signs of brain impairment.



"What we're claiming is that before you have any behavioral clues, these electrical transmission problems may be occurring," he says.


If this proves true, then the team has opened a promising potential path to an Alzheimer's treatment that could block the onset of the mild cognitive impairment associated with early Alzheimer's. In contrast, all currently available treatments manage symptoms of Alzheimer's after they first appear -- when it is likely too late for prevention.



Kucku Varghese, a former graduate student in the Hickman lab now at the University of Florida, first demonstrated amyloid-beta's effects at low concentrations on healthy cells using a common cell research method that is laborious and unsuitable for long-term experiments. But the Hickman team quickly moved to more advanced experiments using microelectrode arrays (MEA) to study the new finding. MEA studies use cultures of neurons on plates embedded with tiny electrodes that can send and measure electrical signals through nearby cells without damaging them, allowing extended experimentation.

Hickman hopes to use MEAs and other tools to pinpoint the physiological and chemical changes within the brain cells that cause the loss of signal generation in healthy cells. Mechanisms responsible for the changes could offer potential targets for drugs, which pharmaceutical companies could search for using the MEA techniques demonstrated, and the mechanisms might provide a measurable target for early diagnosis of Alzheimer's. 

"We're trying to find a marker that will lead to detection and treatment while slowing down Alzheimer's progression and can really make a difference by delaying or even preventing onset of the disease," says Hickman.



Tuesday, January 12, 2010

US Car Fleet Shrinks

It is one thing to acknowledge that the gross market for cars is shrinking.  The downturn has also given consumers an opportunity to reduce their actual dependence: folks are making the effort to use cars less.

 

A big part of it is that the cash drain itself is huge in proportion to most incomes.  A car is the second-largest direct expense after housing, and in many cases it is directly comparable.  In a time of recession, it is also the quickest expense to cut.

 

Of course, the real shoe has not dropped yet in the auto industry.  The true electric car is about to arrive.  This is a disaster for the manufacturing and service tail of the business in countries that are already saturated.

 

An electric car is far simpler to manufacture and simpler to maintain.  That means the service tail of the industry, which fed the auto business, will largely disappear.

 

The changeover also promises to be abrupt if EEStor delivers this year on its promised battery.  The instant it can be delivered, the market for gasoline-based cars will dry up quickly.

 

The only delay will be in tooling up new designs and battery factories.  Customers will hold off until delivery.

 

No one wants a vehicle that is obsolete the day it is bought and tied to a collapsing service system.

 

The electric car will not happen without a successful battery technology such as that promised by EEStor and others.  The instant one is available, however, the changeover will be swift.  The industry is already prepped to fill the supply chain, even if it hates what that does to its own business.

 

A positive result, I suppose, is that car markets in China and India will saturate much quicker than anyone ever imagined.  I also suspect that GM is getting awfully tired of hearing about the creative destruction of the free market.  The cash drain on the American household to the auto industry will be slashed hugely, so it is no wonder the automakers have not embraced the electric car with any enthusiasm at all.

 

U.S. car fleet shrank by four million in 2009

 

6 JAN 2010 BY LESTER BROWN


America’s century-old love affair with the automobile may be coming to an end. The U.S. fleet has apparently peaked and started to decline. In 2009, the 14 million cars scrapped exceeded the 10 million new cars sold, shrinking the U.S. fleet by 4 million, or nearly 2 percent in one year. While this is widely associated with the recession, it is in fact caused by several converging forces. 
Future U.S. fleet size will be determined by the relationship between two trends: new car sales and cars scrapped. Cars scrapped exceeded new car sales in 2009 for the first time since World War II, shrinking the U.S. vehicle fleet from the all-time high of 250 million to 246 million. It now appears that this new trend of scrappage exceeding sales could continue through at least 2020.
Among the trends that are keeping sales well below the annual figure of 15–17 million that prevailed from 1994 through 2007 are market saturation, ongoing urbanization, economic uncertainty, oil insecurity, rising gasoline prices, frustration with traffic congestion, mounting concerns about climate change, and a declining interest in cars among young people.
Market saturation may be the dominant contributor to the peaking of the U.S. fleet. The United States now has 246 million registered motor vehicles and 209 million licensed drivers—nearly 5 vehicles for every 4 drivers. When is enough enough?
Japan may offer some clues to the U.S. future. Both more densely populated and highly urbanized than the United States, Japan apparently reached car saturation in 1990. Since then its annual car sales have shrunk by 21 percent. The United States appears set to follow suit.
The car promised mobility, and in a largely rural United States it delivered. But with four out of five Americans now living in cities, the growth in urban car numbers at some point provides just the opposite: immobility. The Texas Transportation Institute reports that U.S. congestion costs, including fuel wasted and time lost, climbed from $17 billion in 1982 to $87 billion in 2007.
Mayors across the country are waging a strong fight to save their cities from cars, trying to reduce traffic congestion and air pollution. Many are using a “carrot-and-stick” approach to reduce costly traffic congestion by simultaneously improving public transportation while imposing restrictions on the use of cars.
Almost every U.S. city is either introducing new light rail lines, new subway lines, or express bus lines, or they are expanding and improving existing public transit systems in order to reduce dependence on cars. Among the cities following this path are Phoenix, Seattle, Houston, Nashville, and Washington, D.C. As urban transit systems expand and improve, commuters are turning to public transit as driving costs rise. Between 2005 and 2008, transit ridership climbed 9 percent in the United States. Many cities are also actively creating pedestrian and bicycle-friendly streets, making it easier to walk or bike to work.
Forward-looking cities are also reconsidering parking requirements for new buildings. Washington, D.C., for example, has rewritten its 50-year-old codes, reducing the number of parking spaces required with the construction of both commercial and residential buildings. Earlier codes that once required four parking spaces for every 1,000 square feet of retail space now require only one.
As parking fees rise, many cities are moving beyond coin-fed parking meters and replacing them with meters that use credit cards. The nation’s capital is making this shift in early 2010 as it raises street parking fees from 75¢ to $2 per hour.
Economic uncertainty makes some consumers reluctant to undertake the long-term debt associated with buying new cars. In tight economic circumstances, families are living with two cars instead of three, or one car instead of two. Some are dispensing with the car altogether. In Washington, D.C., with a well-developed transit system, only 63 percent of households own a car.
A more specific uncertainty is the future price of gasoline. Now that motorists know that gas prices can climb to $4 a gallon, they worry that it could go even higher in the future. Drivers are fully aware that much of the world’s oil comes from the politically volatile Middle East.
Perhaps the most fundamental social trend affecting the future of the automobile is the declining interest in cars among young people. For those who grew up a half-century ago in a country that was still heavily rural, getting a driver’s license and a car or a pickup was a rite of passage. Getting other teenagers into a car and driving around was a popular pastime.
In contrast, many of today’s young people living in a more urban society learn to live without cars. They socialize on the Internet and on smart phones, not in cars. Many do not even bother to get a driver’s license. This helps explain why, despite the largest U.S. teenage population ever, the number of teenagers with licenses, which peaked at 12 million in 1978, is now under 10 million. If this trend continues, the number of potential young car-buyers will continue to decline.
Beyond their declining interest in cars, young people are facing a financial squeeze. Real incomes among a large segment of society are no longer increasing. College graduates already saddled with college loan debt may find it difficult to get the credit to buy a car. Young job market entrants are often more interested in getting health insurance than in buying a car.
No one knows how many cars will be sold in the years ahead, but given the many forces at work, U.S. vehicle sales may never again reach the 17 million that were sold each year between 1999 and 2007. Sales seem more likely to remain between 10 million and 14 million per year.
Scrappage rates are easier to project. If we assume an auto life expectancy of 15 years, scrappage rates will lag new sales by 15 years. This means that the cars sold in the earliest of the elevated sales years of 15–17 million vehicles from 1994 through 2007 are just now reaching retirement age. Even though newer cars are more durable than earlier models, and may thus stay on the road somewhat longer on average, scrappage rates seem likely to exceed new car sales through at least 2020. Given a decline of 1–2 percent a year in the fleet from 2009 through 2020, the U.S. fleet could easily shrink by 10 percent (25 million), dropping from the 2008 fleet peak of 250 million to 225 million by 2020.
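Brown’s projection is simple net arithmetic: fleet size falls each year by the gap between scrappage and sales. A rough sketch using assumed mid-range values within the ranges he gives (annual sales of 13 million and scrappage of 15 million are my assumptions, not figures from the article) lands close to his 225 million estimate:

```python
# Illustrative U.S. fleet projection based on the article's figures.
# SALES and SCRAPPAGE are assumed mid-range values, not from the article.
PEAK_FLEET = 250  # million vehicles, the 2008 all-time high
SALES = 13        # million new cars sold per year (assumed)
SCRAPPAGE = 15    # million cars scrapped per year (assumed)

fleet = PEAK_FLEET
for year in range(2009, 2021):   # the 12 years from 2009 through 2020
    fleet += SALES - SCRAPPAGE   # net loss of 2 million vehicles per year

print(fleet)  # 226 million, close to the article's ~225 million by 2020
```

A net loss of 2 million vehicles a year on a 250 million fleet is a decline of roughly 0.8 percent annually, consistent with the 1–2 percent range cited above.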
At the national level, shrinkage of the fleet combined with rising fuel efficiency will reinforce the trend of declining oil use that has been under way since 2007. This means reduced outlays for oil imports and thus more capital retained to invest in job creation within the United States. As people walk and bike more, it will mean less air pollution and fewer respiratory illnesses, more exercise and less obesity. This in turn will also reduce health care costs.
The coming shrinkage of the U.S. car fleet also means that there will be little need to build new roads and highways. Fewer cars on the road reduces highway and street maintenance costs and lessens demand for parking lots and parking garages. It also sets the stage for greater investment in public transit and high-speed intercity rail.
The United States is entering a new era, evolving from a car-dominated transport system to one that is much more diversified. As noted, this transition is driven by market saturation, economic trends, environmental concerns, and by a cultural shift away from cars that is most pronounced among young people. As this evolution proceeds, it will affect virtually every facet of life.

For more information and data resources regarding this article, please visit the Earth Policy Institute website.