Thursday, December 18, 2008

1kg CIGS = 5kg Uranium


Martin Roscheisen made this very telling comparison on his blog recently. Without question, one side effect of nano techniques is the tiny quantity of raw materials actually required. This is something not fully appreciated by those on the materials side of things.

Of course, that one kilo has to be spread on many square miles of substrate to generate the actual energy.
Regardless, this technology is going to change all aspects of the energy business, and the sheer capacity to have one tool produce the energy equivalent of one nuclear plant each and every year has not sunk in yet. That is going to need a year, by which time I hope to have an early version of the Eden machine operating.

Before we are all finished, every square mile of usable land on Earth will have a large surplus of power to dispose of in the daily course of business, either at the farm gate or internally. It is just that simple.

It will still take decades to perfect efficient networks and adapt our civilization to this new bounty of energy. But make no mistake, it is a bounty. The average household will be a primary producer of power at a nominal cost. There will be no distribution system to pay for.

This power source has a starting selling price of $1.00 per watt, which competes now with every other option. A couple of years out, the selling price can be much lower and capacity can be at four nuclear power plants, doubling every year thereafter until the Earth is saturated.

1kg CIGS = 5kg Uranium

December 16, 2008

By Martin Roscheisen, CEO - Nanosolar

The notion of a kilogram of enriched Uranium conjures up an image of a powerful amount of energy.
Enough to power an entire city for years when used in a nuclear power plant, or enough to flatten an entire county when used in a bomb — that’s presumably what many people would say if one asked them about their thoughts.

In our new solar cell technology, we use an active material called CIGS, a Copper based semiconductor. How does this stack up against enriched Uranium?

Here’s a noteworthy fact, pointed out to me by one of our engineers: It turns out that 1kg of CIGS, embedded in a solar cell, produces 5 times as much electricity as 1kg of enriched Uranium, embedded in a nuclear power plant.

Or said differently, 1kg of CIGS is equivalent to 5kg of enriched Uranium in terms of the energy the materials deliver in solar and nuclear respectively.

The Uranium is burned and then stored in a nuclear waste facility; the CIGS material produces power for at least the warranty period of the solar cell product after which it can then be recycled and reused an indefinite number of times.
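
Where might a figure like this come from? Here is a back-of-the-envelope sketch in Python. Every constant in it is my own illustrative assumption (absorber thickness, module efficiency, average insolation, service life, reactor burnup), not a Nanosolar figure, and the resulting ratio swings by a factor of several depending on those guesses.

```python
# Rough energy-per-kilogram comparison: CIGS absorber vs. enriched uranium.
# All constants below are illustrative assumptions, not Nanosolar data.

CIGS_DENSITY = 5700.0     # kg/m^3, assumed bulk density of CIGS
THICKNESS = 1.5e-6        # m, assumed absorber layer (~1-2 microns)
EFFICIENCY = 0.12         # assumed module efficiency
AVG_SUNLIGHT = 200.0      # W/m^2, assumed round-the-clock average irradiance
LIFETIME_YEARS = 25       # assumed warranty/service life before recycling

area_m2 = 1.0 / (CIGS_DENSITY * THICKNESS)        # cell area covered by 1 kg
hours = LIFETIME_YEARS * 8760
cigs_mwh = area_m2 * AVG_SUNLIGHT * EFFICIENCY * hours / 1e6

# Enriched uranium: assume ~45 MW-days (thermal) of burnup per kg,
# converted to electricity at ~33% plant efficiency.
uranium_mwh = 45 * 24 * 0.33

print(f"1 kg CIGS ~ {cigs_mwh:,.0f} MWh over {LIFETIME_YEARS} years")
print(f"1 kg enriched U ~ {uranium_mwh:,.0f} MWh")
print(f"ratio ~ {cigs_mwh / uranium_mwh:.1f}x")
```

With these particular guesses the ratio lands well under the quoted five-to-one; a thinner absorber layer, a longer service life, or counting the recycling passes mentioned above pushes it up quickly, which is presumably the point the Nanosolar engineer was making.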

Cattail Culture by Daniel Little

Good item on the use of cattails. My thought on harvesting is that it will need to be mechanized; that is very likely the reason it was not a primary food crop of our ancestors. Something like a potato lifter, perhaps with a pressure line to wash out the mud, should do it.

With the right machinery it will be easy to harvest and ultimately macerate the rhizomes.

The cattail paddy would be harvested in late October, after it has been drained and allowed to firm up, and after the stalks and leaves have also been harvested and processed. This harvesting cycle will also allow possible separation of the cattail seed head fiber and the ripe seeds themselves, both of which have recognized value.

This also happens after the mosquitoes are suppressed in northern locales.

I wonder how this plant behaves in the saline waters of the tar sand tailings ponds. It would be nice if it could extract the various salts. I doubt that it does much good, but its propensity to work well in sewage ponds is a positive indication.

These folks are wrestling with the economics of operating a coop that could be self sufficient and the prospect of efficiently producing one’s own fuel is critical. They appear to have made real progress.

It is obvious that a starch-rich crop like cattails, easily harvested in the off season and not competing with food crops, is a valuable addition to the community business plan.

James Gustave Speth has written a really important book on sustainability within a modern society. The book is called The Bridge at the Edge of the World: Capitalism, the Environment, and Crossing from Crisis to Sustainability, and it’s an important contribution. One of the most fundamental conclusions that Speth arrives at is the idea that sustainability will require a truly profound transformation of how we think about a “good life,” and a rethinking of the kinds of material circumstances we might aspire to in order to create a world system that is genuinely sustainable.

One way we might try to pursue this line of thought is to consider whether gardens and local biofuel production might provide a basis for more sustainable human activity. Could we use more of our own time and labor to create some of the material necessities of our lives, and do so in a way that imposes a smaller footprint on the world’s energy and resource system?

David Blume was a guest on NPR’s Science Friday on August 15. Blume is the author of
Alcohol Can Be a Gas!: Fueling an Ethanol Revolution for the 21st Century. Blume is an advocate for the idea that alcohol can be a large and ecologically positive component of our modern energy economy (website). And he believes that it is possible to imagine a more decentralized energy economy for the United States in which local producers and distillers satisfy a large percentage of the energy needs of a region.

Blume made an observation that I found intriguing: that the common wetland plant, the cattail, can be a fuel source for producing ethanol. (Here’s a news story on Blume’s comments about cattails on an earlier occasion.) Corn produces about 250-300 gallons of ethanol per acre, and it is estimated that cattails would produce something less than this. (Blume himself estimates that the yield of cattail ethanol production would be “many, many times” that of corn, and says that 7,000 gallons per acre is feasible. This seems unsupportable, given the potential yield of other biofuel crops.) But cattails also have ecological advantages: they soak up excess nutrients (e.g. agricultural fertilizer runoff or sewage waste plant effluent), and they require little cultivation. Here are a few news stories (story, story) with some interesting background.

So here’s the question: what would be involved in creating a community that is energy self-sufficient based on ethanol production? Could households grow their own fuels? What would the economics of a cooperative community-based distillery look like? How much land, labor, and money would be required for the household?

It should be noted that there is serious disagreement about the most basic features of the commercial ethanol economy: does ethanol production lead to a net gain in energy, or do the inputs into the cultivation and distilling processes exceed the energy content of the resulting volume of alcohol? Here’s a discussion at FuturePundit and a summary of the findings of a national expert, David Pimentel from Cornell University. Here are the central conclusions of a recent study by Pimentel and Tad Patzek at UC-Berkeley:

Turning plants such as corn, soybeans and sunflowers into fuel uses much more energy than the resulting ethanol or biodiesel generates, according to a new Cornell University and University of California-Berkeley study. “There is just no energy benefit to using plant biomass for liquid fuel,” says David Pimentel, professor of ecology and agriculture at Cornell. “These strategies are not sustainable.”
(Other studies reach a very different conclusion. See a summary of studies on the energy balance of current ethanol production on this Oregon website.)

But still, let’s think it through a bit. The scenario I’m imagining is labor-intensive and local, so the costs of energy associated with mechanization and transportation are reduced or eliminated. Could we imagine a local energy economy based on crops and distillation that could be fitted into an otherwise acceptable lifestyle? (The analysis will begin to sound like Piero Sraffa’s exercise, Production of Commodities By Means of Commodities: Prelude to a Critique of Economic Theory.)

A family’s energy budget might look something like this, estimated in gallons of ethanol:

transportation 800 gallons (10,000 miles)
cooking 300 gallons (365 days)
heating 1000 gallons (180 heating days)
illumination 100 gallons (365 days)
refrigeration 200 gallons

This adds up to 2,400 gallons of ethanol required for a year’s energy use. But we aren’t finished yet, because cultivation and distillation also have an energy cost, and this cost is a function of the volume of alcohol required. Let’s take a more optimistic estimate than that provided by Pimentel above, and assume that the energy cost of distillation is 30%. (We’re working with a coop, after all!) To produce a gallon of ethanol we have to expend 0.3 gallons in the distillation process. And let’s assume that cultivation is done by hand without mechanization, but that the crop needs to be transported to the distillation facility at a 10% cost. (That is, I assume that the net transportation cost of transporting the thousands of pounds of feed crop to the processor is 10% of the net alcohol product of the crop.) These estimates imply that the household requires 4,000 gallons of alcohol.
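
The loss arithmetic is worth making explicit. A minimal sketch, using the 30% distillation and 10% transport losses assumed in the paragraph above:

```python
# Household fuel budget, in gallons of ethanol per year.
budget = {"transportation": 800, "cooking": 300, "heating": 1000,
          "illumination": 100, "refrigeration": 200}
net = sum(budget.values())                     # 2,400 gallons

def gross_required(net_gal, distill=0.30, transport=0.10):
    """Gallons that must be produced so that net_gal survive the losses."""
    return net_gal / (1.0 - distill - transport)

print(gross_required(net))                     # 4000.0 gallons
print(gross_required(net, distill=0.60))       # 8000.0 -- the pessimistic case
                                               # revisited at the end of the post
```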

Now assume that the alcohol yield of an acre of cattails is 250 gallons; this implies a fuel farm size of 16 acres. (It would be nice to extend the exercise to include a food garden as well; this is left for the reader! Here’s an interesting United Nations article from the 1980s on the economics of family gardening that can help get the analysis started.)

Now how many hours of labor time need to be devoted to cultivating and harvesting this crop? Evidently cattails don’t require much by way of fertilizers, irrigation, and pest control. But I’m sure there is some level of maintenance needed, and 16 acres is a large area. In fact, it represents a rectangular plot that is 200 feet by 3,500 feet — more than half a mile long. So let’s assume that basic maintenance of the cattail crop requires 2 hours a day of adult labor. The large investment of labor, however, occurs at the harvest. About 14,000 pounds of cattails will be harvested per acre, or 224,000 pounds for the farm over the course of the harvest. If we assume that an adult can harvest 200 pounds per hour, this represents 1,120 hours of harvest work. Let’s assume that harvesting can be spread out over a couple of four-week periods or 56 days; this implies 20 hours of adult labor per day during the harvest season. So it would take 10 hours a day, 7 days a week during the eight weeks of harvest season for two adults to harvest this volume of cattails. Two months of very hard work devoted to harvesting will eventually produce enough ethanol to support the household’s chief energy needs.
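
A companion sketch of the land and labor figures, using the same assumptions as the paragraph above:

```python
# Land and harvest labor for one household's cattail fuel farm.
gross_gallons = 4000                       # from the loss-adjusted budget
yield_per_acre = 250                       # gallons of ethanol per acre, assumed
acres = gross_gallons / yield_per_acre     # 16 acres

harvest_lbs = 14000 * acres                # 224,000 lb of cattails
hours = harvest_lbs / 200                  # 1,120 hours at 200 lb per hour
harvest_days = 56                          # two four-week harvest windows
per_day = hours / harvest_days             # 20 hours/day, or 10 each for two adults

print(acres, harvest_lbs, hours, per_day)
```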

Now what about the economics of the cooperative’s distillery? If we assume a cooperative involving 100 households of the scale just discussed, the distillery needs to process 22,400,000 pounds of material in order to produce 400,000 gallons of ethanol. The households will be farming an area of 1,600 acres of cattails — about two and a half square miles. And the system will be supporting the energy needs of about 500 people. If we keep our assumption of a 30% ratio of input-to-output, this process will consume 120,000 gallons of ethanol. The coop members will need to fund the purchase and maintenance of the still and the labor costs associated with operation of the still. Perhaps it’s a labor coop too? In this case, each household will need to devote several hours a week to work in the distillery. And we might imagine that the coop would require a “tax” of some small percentage of the alcohol produced to cover maintenance and operating expenses. Here’s a research article from AGRIS that examines the costs of a small distillery of roughly this size. The conclusion is somewhat discouraging: “The analysis indicates that the distillery would not be profitable at current prices for corn and ethanol.” In other words, the cost of inputs and operation of the distillery exceed the value of the alcohol produced, according to this analysis. But this conclusion isn’t quite relevant to our scenario, because the raw materials are not purchased through the market and the product is not sold on the market. Nonetheless, the finding implies that there’s a shortfall somewhere; and it may well be that it is the unpaid labor of the fuel farmers that is where the shortfall occurs.
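
Scaling the same household numbers up to the 100-household cooperative is then straightforward:

```python
# Cooperative distillery throughput for 100 households of the scale above.
households = 100
gallons = 4000 * households        # 400,000 gallons of ethanol per year
feedstock = 224000 * households    # 22,400,000 lb of cattails to process
acres = 16 * households            # 1,600 acres of paddies
sq_miles = acres / 640             # 2.5 square miles
still_fuel = 0.30 * gallons        # 120,000 gallons consumed by the still itself

print(gallons, feedstock, acres, sq_miles, still_fuel)
```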

So here’s the upshot of this back-of-the-envelope calculation: it would be a major commitment of land and labor for a household or a village community to achieve energy self-sufficiency through cooperative-based ethanol distillation. And I’ve made an assumption I can’t justify: that the energy input to the distillation process is 30% of the energy content of the resulting quantity of ethanol. If that ratio is 60% instead of 30%, then the land and labor requirements for each household are greatly increased; and if the ratio approaches or exceeds 100%, then the whole idea falls apart. But even on these assumptions, the lifestyle associated with this model sounds a lot closer to that of a peasant village in medieval France or traditional China than to that of a modern US citizen. It involves hard physical labor during several months of the year and a moderate level of labor effort during the remainder of the year. And if we imagine that the scenario is extended by incorporating a substantial amount of food gardening for family consumption, then the balance of necessary labor to free labor tips even further in the direction of the peasant economy.

Wednesday, December 17, 2008

Benny Peiser on Poznan

To say that the players at Poznan are having sober second thoughts is an understatement. The science was supposed to be settled, yet now it is not. In the meantime CO2 has risen sharply while global temperatures flatlined and then recently dropped. I am still waiting for someone to come out with an explanation. We obviously will have to wait a while longer while a few more scientists distance themselves from previous positions.

This article by Benny Peiser summarizes the reasons for the developing political collapse. The costs are being felt at the same time that the causation is evaporating. The true believers are still walking the walk, to their shame as scientists. Eventually even the politicians will start jumping ship unless next spring delivers a convincing warm spell in the Arctic and global warming resumes. I look forward to being surprised.

Right now we are catching a very convincing cold winter with no surplus heat to spare anywhere. I would actually go so far as to predict that the global temperature drop of 0.7 degrees experienced last year will be extended by around a further 0.3 degrees this year. By the end of next year solar cycle 24 should be back in play and the temperature should stabilize thereafter. Perhaps it will even get warm again.

None of this is good news for the believers, who will need to keep up morale for at least another year. We are also not that far from catching weather like the late fifties, when you could count on being hammered every winter just like now. In the meantime I will have to put on the winter boots tomorrow in Vancouver and expect repeats this year. Usually it happens once, and only slightly, if at all.

DECEMBER 15, 2008, 4:57 P.M. ET

Cooling on Global Warming

Germany and the rest of Europe are getting more rational on climate change.

By BENNY PEISER, from today's Wall Street Journal Europe

Participants at last week's United Nations climate conference in Poznan, Poland, were taken aback by a world seemingly turned upside-down. The traditional villains and heroes of the international climate narrative, the wicked U.S. and the noble European Union, had unexpectedly swapped roles. For once, it was the EU that was criticized for backpedalling on its CO2 targets while Europe's climate nemesis, the U.S., found itself commended for electing an environmental champion as president.

The wrangle over the EU's controversial climate package at a separate summit in Brussels wrong-footed the world's green bureaucracy. The EU climate deal was diluted beyond recognition. Instead of standing by plans to cut CO2 emissions by 20% below 1990 levels by 2020, the actual reductions might be as trivial as 4% if all exemptions are factored in.

The Brussels summit symbolizes a turning point. The watered-down climate deal epitomizes the onset of a cooling period in Europe's hitherto overheated climate debate. It may lead eventually to the complete abandonment of the unilateral climate agenda that has shaped Europe's green philosophy for nearly 20 years.

The reasons for the changing political atmosphere in Europe are manifold. First, the global economic crisis has demoted green policies nearer to the bottom of the political agenda. Saving the economy and creating jobs take priority now.

Second, disillusionment with the failed Kyoto Protocol has turned utopian thinking into sobriety. After all, most of the Kyoto signatories failed to reduce their CO2 emissions during the last 10 years. There are also growing doubts about the long-term viability of the EU's Emissions Trading Scheme. The price of carbon credits has collapsed as a result of the financial crisis. The drop in demand and the recession are likely to depress carbon prices for years to come. As a result, the effectiveness of the extremely volatile scheme is increasingly questioned.

Third, a number of countries have experienced a political backlash over their renewable energy schemes. Tens of billions of euros of taxpayers' money have been pumped into projects that depend on endless government handouts. Each of the 35,000 solar jobs in Germany, for instance, is subsidized to the tune of €130,000. According to estimates by the Rhine-Westphalia Institute for Economic Research, green subsidies will cost German electricity consumers nearly €27 billion in the next two years.

Perhaps even more important is the growing realization that the warming trend of the late 20th century has, for the last 10 years or so, essentially come to a temporary halt. The data collected by international meteorological offices confirm this. This most peculiar fact is rarely mentioned in policy debates, but it certainly provides decision makers with a vital respite to reconsider their climate policy options.

Above all, Europe's politicians have recognized that green taxes have turned into liabilities that may undermine economic stability and their chances of re-election. As German radio Deutsche Welle put it last week: "With the recession tightening its grip on the German economy, [Chancellor Angela] Merkel is betting that job reassurance is more important to the average worker than being a pioneer in tackling climate change."

Nowhere has the fundamental change of the political landscape been more pronounced and less expected than in Germany. For more than 20 years, Europe's economic powerhouse has been the major bastion of green politics.

In the 1990s, Angela Merkel steered and implemented Europe's Kyoto policy as Germany's first environment minister. Now serving as chancellor, she was hailed as Europe's climate savior after playing host to last year's G-8 summit in Heiligendamm. Only 18 months later, however, she no longer wears a halo. As a result of a concerted campaign by Germany's heavy industry, as well as growing opposition from within her Christian Democratic party, Mrs. Merkel has been forced to abandon her green principles and image.

The deepening economic crisis seems to transform the mood of the German public. Next year's general election looms large, and voters right now are worried about the economy and jobs, and not green issues.
In early December, more than 10,000 angry metal workers and trade unionists -- most of them from Germany -- protested outside the European Parliament in Brussels against the EU's climate policy, which they fear will increase unemployment.

For many international observers, the ease with which Mrs. Merkel overturned her celebrated climate policy has come as a shock. But she was almost the last member of her Christian Democratic party willing to accept that a change in strategy was necessary given the immense costs of the EU's original climate plans. In fact, her party demanded that Mrs. Merkel veto the climate package if German industry did not receive an exemption from the Emissions Trading Scheme's auctioning of carbon credits. The exemption was duly granted.

Perhaps the most critical factor for Mrs. Merkel's almost unchallenged about-face is the vanishing strength of the Social Democratic Party, whose members were once among the most forceful climate alarmists. Mrs. Merkel's junior coalition partner has lost much of its support in recent years. And amid growing fears of a deepening recession, there are also signs of a split within the party on climate and energy issues.

At the forefront of the left-wing opposition to the EU's climate policy has been EU Industry Commissioner Günter Verheugen. The German Social Democrat has been arguing throughout the year that the climate targets should only be accepted if "truly cost-effective solutions" could be found. Other prominent dissenters in his party include Hubertus Schmoldt, the head of the mining, chemical and energy industrial union, who has recently called for a two-year postponement of the climate package.

In part as a result of German -- as well as Italian and Polish -- objections, Europe's climate package did not survive in its original form. The inclusion of a revision clause, pushed by Italy, is particularly significant as it makes the EU's climate targets conditional on the outcome of international climate talks.
If the U.N.'s Copenhagen conference in 2009 fails to seal a post-Kyoto deal, it is as good as certain that some of the EU's targets will be further cut. By linking its decisions to those of the rest of the world, Europe has begun to act as a more rational player on the stage of international climate diplomacy.

Instead of yielding to the siren calls of climate alarmists, European governments would be well advised to focus their attention on developing pragmatic policies capable of safeguarding their industries, labor forces and environment at the same time.

Compressed Air Energy Storage

Compressed air energy storage, or CAES, has been around for a while, and I think most of us have been dismissive of it. It made little obvious sense when most power production was fuel driven. That is now changing. What is more, most of the population is living in areas that have geological storage potential. In that case, it makes perfect sense to transfer nighttime base load into compressed air and to shove it underground.

The fact that this same compressed air needs to be heated up before it enters a power turbine is very interesting and possibly a major opportunity.

Many fuels are ideal sources of low-grade heat that is currently wasted. That may even include the spent water in a thermal plant, which is still at the edge of boiling. Fuel sources such as municipal waste are normally unsuitable as useful energy sources unless augmented with a higher-grade fuel.

An excellent example of this that I once appraised was the use of open hearth incinerators. The burn temperature was just too low to completely consume the contained carbon, with a resultant waste removal problem. By simply blending in a fifteen percent chopped-tire component (a high-grade hydrocarbon), it was possible to raise the temperature sufficiently to consume the carbon and even to oxidize the steel. This worked particularly well with wood waste.

Any such process still produces a lot of heat in the form of hot gases that, if not already to temperature, can be readily upgraded with a little natural gas.

The point is that with a lot of not too clever engineering, compressed air would fit very nicely into existing thermal plants, existing incineration operations and existing industrial power systems that are already using cogeneration. This is not a dumb idea.

This also suggests that reservoir storage should be part of every power plant’s design, not just windmills in the Dakotas. The creation of, and existence of such reservoirs will obviously stimulate the building of nearby windmill farms.



Air forced underground could provide energy

WHAT'S IN STORE FOR THE GRID?

It's generally accepted that economical energy storage is the key that will unlock the potential of renewable energy and help the electricity system operate more efficiently.

"Energy storage to me is the big breakthrough," says Ken Kozlik, chief operating officer of the Independent Electricity System Operator, which manages the supply and demand of power in Ontario.

Batteries have advanced significantly and show tremendous promise for smaller grid applications. Examples of such battery chemistries include lithium-ion, zinc-bromide, vanadium-flow, sodium-sulphur, sodium-nickel-chloride and even improved lead-acid. Problem is, they can only store energy for a few hours and are expensive when deployed on a large scale.

The same holds true for most other emerging energy-storage technologies, such as flywheels and ultracapacitors, though an unexpected breakthrough could change the game.

On the other end of the spectrum is old-fashioned pumped storage. This involves pumping water from a lower body of water up to a massive natural reservoir, then releasing the water so it can turn turbines on the way down.

Pumped storage can be economical but only at a massive scale – 1,000 megawatts or larger. Also, it's restricted by geography and geology. There are few natural reservoirs close to populated cities or transmission corridors that could accommodate such a project and creating a man-made reservoir would be prohibitively expensive and environmentally risky.

In the middle is compressed-air energy storage. It's economical when deployed on a large scale and the underground reservoirs – such as depleted gas fields or salt caverns – are widely available in southwestern Ontario, which also happens to be rich with wind resources.

As energy-storage technologies such as CAES and batteries advance, more energy experts are citing the potential of an electricity system based on 100-per-cent renewable energy.

"My prediction is that renewable power plus storage will outperform any second-generation nuclear plant or coal with carbon capture and will be much easier and faster to install," says Roger Peters, a senior technical adviser at The Pembina Institute, an energy and environmental think-tank.

- Tyler Hamilton

Storing compressed air below ground could provide the grid with wind power when it's needed, not just when it blows

Dec 15, 2008 04:30 AM

Tyler Hamilton

ENERGY REPORTER

Pumping compressed air underground so it can be extracted later to generate electricity could prove one of the most effective ways in the short term for Ontario to add vast amounts of renewable energy to the power system, industry experts say.

So-called compressed-air energy storage, or CAES, has been around for more than 20 years and while only two facilities have ever been built – a 110-megawatt plant in Alabama and a 290-megawatt plant in Germany – officials from New York, California, Texas and a number of other U.S. states are beginning to seriously explore the potential. Iowa has already taken the leap.

The basic concept is that cheap, surplus electricity available overnight is used to compress air and inject it into underground reservoirs, like a salt cavern or depleted gas field. When power is needed during the day and can fetch a higher price on the market, the air is released, exposed to heat and put through an expansion turbine that generates electricity.

"It's beginning to capture people's imagination," says Mark Tinkler, an energy consultant with Emerging Energy Options and former manager of distributed energy technologies at Ontario Power Generation.
Five years ago, Tinkler did a study for OPG on the economics of using CAES and at the time he concluded it didn't make sense. Looking back, he says, enough has changed in the world to revisit the idea.

"My personal feeling is that the time is right to do another assessment."

The reason? In a word, wind.

The wind blows intermittently, so unlike a coal-fired power plant that can dispatch electricity when we need it, a wind farm often generates electricity when we don't need it (or it fails to when we do). Energy storage can level the playing field between renewables and fossil fuels, allowing us to capture wind energy whenever it blows and dispatch the power as demand dictates – much like a coal plant operates today.

It turns out the wind blows best at night, when there's little or no demand for it. Wind-farm operators will often shed the energy or sell it for practically nothing to other utilities.


"It comes down to what the value of electricity is at night," says Tinkler. "Five years ago we didn't have any wind. Now, it's a completely different equation."

Geologically, Ontario is well equipped to embrace CAES – particularly southwestern Ontario. It's often forgotten the region was once the hub of oil and gas exploration in North America and was home to the world's first commercial oil well.

More than 50,000 wells have been drilled in Ontario over the past 150 years and slightly more than 2,000 still produce today. Union Gas and Enbridge Gas Distribution already use depleted gas fields in southwestern Ontario to store natural gas for the heating season. In fact, the Sarnia-Lambton region accounts for 60 per cent of Canada's natural gas storage capacity.

Andrew Hewitt, manager of the petroleum resources centre in Ontario's Ministry of Natural Resources, says the region is also rich in wind resources. He's currently studying the CAES option, having decided several months ago the opportunity was ripe for consideration, particularly as the province moves to shut down its coal plants.

"The compressed-air component doesn't have to be in the same area as a wind farm, it just has to be hooked into the same region of the province," says Hewitt, who hopes to brief the minister on his findings once his research is complete.

"The oil and gas industry has been doing this kind of storage for years. You're using the same technology and just substituting it (natural gas) with air."

The problem is, engineers from power utilities know little about geology and underground technologies. Likewise, engineers from the oil and gas sector are not as knowledgeable about the above-ground machinery that generates electricity.

"You've got to bring teams of these people together to make compressed-air storage happen," says Robert Schainker, a senior technical executive and CAES expert at the Palo Alto, Calif.-based Electric Power Research Institute.

Schainker says it's worth the effort if the geological conditions are right and the goal is bulk energy storage, such as a CAES facility that can store 200 megawatts for 10 hours or more – the equivalent of powering two million 100-watt light bulbs or 400,000 dishwashers for half a day.

True, a number of advanced battery-storage technologies are becoming economical for much smaller applications – for example, one megawatt for one to three hours of storage.

These technologies include zinc-bromide, sodium-sulphur, lithium-ion and vanadium flow battery chemistries. But at much larger scales batteries are simply too expensive.

CAES, on the other hand, isn't economical on a small scale since the bulk of capital costs relates to the compressors and other turbo-machinery. The underground storage costs are the same whether you've got a small or large reservoir.

Adding an additional hour of storage to a CAES project will only cost $1 (U.S.) or $2 per kilowatt-hour, compared with $350 to $500 per kilowatt-hour of additional battery storage, says Schainker.
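
Taking those figures at face value, the gap is stark. A minimal sketch, sized to the 200-megawatt, 10-hour bulk-storage example mentioned earlier and using the midpoints of the two quoted cost ranges:

```python
# Incremental cost of added storage hours: CAES reservoir vs. batteries.
power_kw = 200 * 1000                   # 200 MW bulk-storage example
hours_added = 10
energy_kwh = power_kw * hours_added     # 2,000,000 kWh of added storage

caes_cost = energy_kwh * 1.5            # midpoint of $1-$2 (U.S.) per kWh
battery_cost = energy_kwh * 425         # midpoint of $350-$500 per kWh

print(f"CAES reservoir: ${caes_cost:,.0f}")     # $3,000,000
print(f"Batteries:      ${battery_cost:,.0f}")  # $850,000,000
```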

Still, there are a few wild cards that could influence the future cost of compressed-air storage. The current generation of CAES facilities still require fuel, typically natural gas, to heat the air before it enters the expansion turbine. Generally, a CAES plant consumes a third less natural gas for every kilowatt-hour it generates, compared with a simple-cycle natural gas or "peaker" plant.

Tinkler says when Ontario Power Generation studied the economics of CAES, the cost of natural gas was $3 per thousand cubic feet. At the time, "we were looking at a $5 break-even point," he says.

"As the price of natural gas goes up, compressed-air storage looks better and better."

Today, natural gas is above $5 per thousand cubic feet. The National Energy Board is projecting it could go as high as $9 over the winter and the U.S. Energy Information Administration is projecting it will hit $6.25 in 2009. As recently as this summer it was higher than $13.

Another factor that would make CAES even more attractive is carbon pricing. Both Canada and the United States plan to introduce a continental cap-and-trade system for carbon emissions. CAES, by increasing our use of wind energy and reducing our consumption of natural gas, would become more economical over time by lowering carbon dioxide emissions in the province.

"You should redo your studies," says Schainker, referring to OPG's initial study in 2003. "CO2 costs will be a big one."

The fact that a CAES facility, like wind farms, can also be built in two or three years also makes it attractive when compared with building a nuclear facility, which, because of more rigorous regulatory requirements, can take 10 years to plan and build.

And the technology continues to mature, Schainker adds, pointing to next-generation designs that can take the waste heat that results from compressing the air and use it in place of natural gas to reheat the air during the electricity generation process. No facility has ever been built around this design, but it's only a matter of time.

"There would be no fuel used whatsoever, no CO2 emissions," he says.

"On paper, it looks very attractive. We're working on it."

Andrew Hewitt at the natural resources ministry says making it happen in Ontario would necessarily require the participation of OPG. He says wind developers in the region could get together and build a facility to share, or a single operator of a large wind farm may decide to pursue such a project alone.

"It doesn't have to be the big utilities," he says. "Commercializing it would depend simply on who wants to get into that business."

XCOR Testfires 5K18 Rocket Engine

This bit is just out from XCOR. Rather good news, and anything else would have been viewed as a setback. They certainly have compiled an enviable track record in this sort of thing.

I have not seen much of the specs here, but something is going right, and obviously modern design engineering is able to do all this on the computer before anyone does something incredibly expensive.

I have felt for some time that the massive costs in this industry have preselected designs that minimized gross weight to the detriment of good final design. The experience of the aircraft business reminds us, amazingly, that bigger is both better and actually easier.

I have often wondered if a large space frame could ease our road into space. Conventional jet engines could easily lift such a craft off the ground and ease it up to typical subsonic cruising altitudes. Ramjets could then lift the bird and increase the speed to the optimum for high atmospheric flight, after which the herein described rocket engines could kick in and lift the craft into orbit. Three engine classes plus the kerosene demand a large craft. Yet a large craft would also have a large load capacity to work with. Obviously piggybacking on a 747 is a form of this strategy, but it is likely still too slow.

The key to this design strategy is to use air to approach the critical orbit injection zone where kicking in high-thrust engines will not tear the craft apart from buffeting. One may even be able to accept slow and high.

The point that I am making is that such a strategy can never be tackled with a small craft and too-small engines. Bigger makes a lot of what I described a real possibility, but the economic demand to build small is likely keeping us on the ground.


December 17, 2008, Mojave, CA: XCOR Aerospace, Inc., announced today that it has successfully completed its first test fire of the rocket engine that will be used to power its Lynx suborbital launch vehicle to the edge of space.

The new engine, designated the 5K18, produces between 2500-2900 lbf thrust by burning a mixture of liquid oxygen and kerosene. The engine was fired Monday, December 15th, 2008 at XCOR’s rocket test facility located at the Mojave Air and Space Port. The first test of the engine was performed using pressure-fed propellants whereas the final version of the engine will be fed using XCOR’s proprietary cryogenic piston pump for liquid oxygen and a similar piston pump for kerosene.

“Today’s successful hot fire marks an important step forward in building the Lynx,” said XCOR CEO Jeff Greason. “The 5K18 builds on our previous experience in designing and building reliable, durable and fully reusable rocket engines from 15 lbf thrust up to 7500 lbf, that will make it possible to provide affordable access to space.”

During its nine years of existence, XCOR has conducted over 3,600 hot fires of rocket engines. In that time, XCOR has built, test-fired, and flown many different engines; the 5K18 is the eleventh engine design XCOR has built and fired. All have had perfect safety records: XCOR has not had a single lost-time injury due to engine operations. It has also never seen one of its engines wear out, which is in marked contrast with the experience of most of the aerospace industry.
XCOR’s experience also includes building rocket-powered vehicles. The company has already developed and safely flown two generations of rocket-powered aircraft. Overall, the firm has flown these vehicles 66 times, and XCOR alone accounts for more than half of all manned rocket-powered flights in the 21st century. The Lynx will mark the company’s third rocket-powered vehicle, and the first designed for space access.
“Firing a new rocket engine is always an important milestone,” said COO Andrew Nelson. “It gives everyone on the team a tremendous sense of accomplishment and demonstrates to customers and investors that XCOR knows how to take new ideas and make them a reality.”

“The propulsion system is not only the hardest part of the launcher to design and build, it also determines every other aspect of the vehicle,” said XCOR CEO Jeff Greason. “The engine’s power and the amount and types of propellants it consumes determine the design and capabilities of the vehicle. There are examples in the aerospace industry where unforeseen problems forced a change of engines which then resulted in extensive redesigns of entire vehicles. By getting our rocket engines right from the beginning, XCOR reduces this type of risk.”

“XCOR’s revolutionary rocket engines are the heart of our vehicle design,” Nelson states. “They are a disruptive technology in the space launch industry because they make it possible to deliver payloads with much higher reliability, significantly shorter lead times and dramatically lower operating costs. Our safety-enhanced engines are also easier on the environment. They will make the Lynx a game-changer in the space launch industry.”

The Lynx will use four of the 5K18 engines to carry people or payloads to the edge of space. Earlier this month, XCOR announced that RocketShip Tours, of Phoenix, AZ, has begun sales of tickets for suborbital flights on the Lynx. Tickets will sell for $95,000. RocketShip Tours can be contacted via its website, www.RocketShipTours.com, or phoned toll-free at 888-778-6877.

XCOR Aerospace is a California corporation located in Mojave, California. The company is in the business of developing and producing safe, reliable and reusable rocket-powered vehicles and propulsion systems that enable affordable access to space. Visit our website at
http://www.xcor.com/

Tuesday, December 16, 2008

Steven Chu and Global Warming

Steven Chu sounds like a true believer in the case for man-made global warming. That is unfortunate. But then, it appeared to be well supported by facts on the ground up to the fall of 2007. Right now the facts on the ground are in revolt.

As I have argued in the past, there is nothing wrong in doing the right thing for the wrong reasons. The right thing would be to establish a global cap and trade system that was totally offset by biochar sequestration. Unfortunately, the globe has had poor luck in implementing global financial solutions that actually work, so I cringe at the thought.

The thought of a free African landowner actually being paid to add carbon to his soil, and thus improve it, is heartwarming. The real problem lies in getting the money from the buyer in Miami to this gentleman. It really calls for a global banking system, perhaps like the one being put together by Muhammad Yunus.

In any case, science will have a solid working voice in government in Steven Chu, and, I hope, a much greater importance than even in the past.

As I have pointed out many times, we have yet to land a single dollar bill on the moon, but the effort jump started the modern computer based world and a lot more besides. We need to be continually at war with the future. A massive global investment in upgrading all soils with biochar will achieve three things.

1. The CO2 problem will disappear.
2. We can feed a massive population increase.
3. It will complete the transition to a global middle-class civilization.

As said, we can dream about doing the right thing for whatever reason. My real fear is that we simply get another nasty consumption tax diverted into the very important boondoggles those politicians so love.

It is unimaginable that a congress that subsidizes and protects US agro-industry, to the clear detriment of every small third-world farmer, will then set up a system that subsidizes those same farmers. It will be a challenge to get them to convert those same subsidies into biochar credits.

Yet if they did so, I have no doubt that the jump in productivity will carry the cost.

INTERVIEW: Obama’s energy czar discusses global warming

In recent years, Steven Chu, picked by US president-elect Barack Obama to be his energy secretary and co-winner of the Nobel Prize in Physics in 1997 for his work on cooling and trapping atoms using lasers, developed a keen interest in climate change. Director of Lawrence Berkeley National Laboratory, Chu was invited to speak on climate change and the education of the next generation of scientists as part of celebrations surrounding Academia Sinica’s 80th anniversary, attended by directors of national science councils from around the world. The scientist sat down with ‘Taipei Times’ staff reporter Shelley Huang last Sunday and shared his views on the inevitability of global warming and what this entails for humanity.

BY Shelley Huang

STAFF REPORTER

Monday, Dec 15, 2008, Page 2

Taipei Times: In your speech, you used the ‘Titanic’ crashing into an iceberg as a metaphor for the problem of climate change. Can you give an estimate as to when the crash would happen?

Steven Chu: It’s a gradual crash. We have already seen a substantial change in climate, sea level rising, the melting of glaciers all over the world … The heat is bleaching coral at a faster rate, the number of forest fires has increased, so you can go down the list of things that are related to increases in heat and melting of polar caps … The Tibetan plateau and the Himalayas actually feed water to many of the major river basins around the world, like the Ganges River, the Yellow River … [Polar caps are] melting at a rate more than 1m in thickness a year now, but because it stretches over millions and millions of square miles [kilometers], it means a lot of water. I’ve heard stories where in India the Ganges water level has risen, it always goes up and down but the average level has risen to the point where it displaces people who live around the water, and they’ve become refugees.

This is predicted to accelerate. Pine forests in the US and Canada are dying. When the forests die we’re very exposed to floods because the mountainsides no longer have trees, and if it rains then there’s a lot of erosion.

In California and many places around the world, the moisture’s kept in the mountains by trees and snow and if you don’t have snow or trees, what happens during the wet season is you have floods, and instead of a continuous supply of water you would get floods and droughts. We’ve begun to see these effects in the last decade, and the predictions are it’s going to get much, much worse.

TT: So what are our options?

Chu: We want it to be bad, but not awful. In order to keep it at just “bad,” we have to immediately start decreasing the amount of energy we use. That doesn’t necessarily mean that everybody doesn’t heat their homes or turn on air conditioning.

For example, the lighting in this building doesn’t really have to be as bright as it is.

TT: How can we use energy more efficiently?

Chu: It turns out that most people don’t understand how to build buildings. The reason I say that is because there is a major US company called United Technologies, they make air conditioning, building control systems, elevators, helicopters, jet engines … They’re a very high-tech company.

In one of their buildings — a high-rise building maybe 50 stories high — the architect changed the window and did things in such a way that it became impossible to cool the upper 15 stories of the building below 85 degrees [Fahrenheit, 29.4ºC]. So they had to do a lot of re-engineering, but the design architects and the structural engineers weren’t really talking to one another and didn’t fully understand the airflow patterns. Usually people keep the airflow pattern very simple, there’s an inlet and an outlet and you just force the airflow to happen, but forcing it could also be fighting against natural convection and the natural design of the building, making it much more energy-intensive.

TT: Are energy-efficient buildings more expensive to build than regular ones?

Chu: Energy-efficient buildings will pay for themselves. For example, if you have a building with a flat roof, and you make the roof white, such as using white pebbles instead of dark ones, depending on the shape of the building, you can be reducing 10 [percent] to 20 percent of the air conditioning load.

There’s a recently published paper from people in our laboratory that says, if you take only the city buildings that have flat-topped roofs and make them light-colored, and make the roads light-colored by using cement, the amount of carbon dioxide decreased is equivalent to taking all the cars in the world [carbon emission] and turning them off for 10 years.

Rooftops don’t cost much money, and it saves on air conditioning, as well as reflects the light back from where it came from. These are things which we should be doing today. It’s actually pure ignorance.

The architects fought against this for a while, because they felt that nobody should tell them what color their roofs should be, even though you can’t see the roof, by the way. Having a white roof will not dramatically alter your lifestyle. If you have white roofs and lighter colored pavement, you will notice the cities becoming cooler. Cities are much hotter than in the countryside during the summer, because they’re absorbing all this energy and also generating energy from air conditioning. So we should be doing this a few years from now.

Vaclav Smil on Fossil Fuel Legacy


This article by Vaclav Smil does a superb job of quantifying the logistical reality of making a transition to alternative power sources. He emphasizes the sheer weight of legacy infrastructure that will not be replaced on a whim.

He is right to observe that our civilization is fueled primarily by coal, with nuclear as a secondary source. It is also true that we have plenty of coal to last us for a long time yet.

Our static energy sources are not going to run out anytime soon. That needs to be remembered. Our vulnerability has always been in transportation fuels, and in the fact that both coal and transportation fuels produce a huge amount of CO2, which is undesirable.

That is why the push to transition to electric cars is gaining momentum and is capable of doing it all within a generation.

That is also why a distributed solar and wind energy system is so important in terms of individual energy use. The advent of cheap nanosolar systems allows every household to entertain energy independence, including electric cars. We are not there yet, but it is achievable.

As that is being achieved, the power grid will transition to the industrial power support for which it is needed.

All this will be augmented by increasing use of industrial grade batteries that collect power surpluses everywhere for industrial use. In fact, these batteries will in time squeeze most of the inefficiencies out of power production and transmission. That will approach doubling the overall capacity.

Moore’s Curse and the Great Energy Delusion

By Vaclav Smil From the Magazine: Wednesday, November 19, 2008

Filed under:
Big Ideas

Our transition away from fossil fuels will take decades—if it happens at all.

During the early 1970s we were told by the promoters of nuclear energy that by the year 2000 America’s coal-based electricity generation plants would be relics of the past and that all electricity would come from nuclear fission. What’s more, we were told that the first generation fission reactors would by then be on their way out, replaced by super-efficient breeder reactors that would produce more fuel than they were initially charged with.

During the early 1980s some aficionados of small-scale, distributed, “soft” (today’s “green”) energies saw America of the first decade of the 21st century drawing 30 percent to 50 percent of its energy use from renewables (solar, wind, biofuels). For the past three decades we have been told how natural gas will become the most important source of modern energy: widely cited forecasts of the early 1980s had the world deriving half of its energy from natural gas by 2000. And a decade ago the promoters of fuel cell cars were telling us that such vehicles would by now be on the road in large numbers, well on their way to displacing ancient and inefficient internal combustion engines.

These are the realities of 2008: coal-fired power plants produce half of all U.S. electricity, nuclear stations 20 percent, and there is not a single commercial breeder reactor operating anywhere in the world; in 2007 the United States derived about 1.7 percent of its energy from new renewable conversions (corn-based ethanol, wind, photovoltaic solar, geothermal); natural gas supplies about 24 percent of the world’s commercial energy—less than half the share predicted in the early 1980s and still less than coal with nearly 29 percent; and there are no fuel-cell cars.

This list of contrasts could be greatly extended, but the point is made: all of these forecasts and anticipations failed miserably because their authors and promoters ignored one of the most important realities ruling the behavior of complex energy systems—the inherently slow pace of energy transitions.

“Energy transitions” encompass the time that elapses between an introduction of a new primary energy source (oil, nuclear electricity, wind captured by large turbines) and its rise to claiming a substantial share (20 percent to 30 percent) of the overall market, or even to becoming the single largest contributor or an absolute leader (with more than 50 percent) in national or global energy supply. The term also refers to gradual diffusion of new prime movers, devices that replaced animal and human muscles by converting primary energies into mechanical power that is used to rotate massive turbogenerators producing electricity or to propel fleets of vehicles, ships, and airplanes. There is one thing all energy transitions have in common: they are prolonged affairs that take decades to accomplish, and the greater the scale of prevailing uses and conversions the longer the substitutions will take. The second part of this statement seems to be a truism but it is ignored as often as the first part: otherwise we would not have all those unrealized predicted milestones for new energy sources.

Preindustrial societies had rather simple and fairly stationary patterns of primary energy use. They relied overwhelmingly on biomass fuels (wood, charcoal, straw) for heat and they supplemented their dominant prime movers (muscles) with wind to sail ships and in some regions with windmills and small waterwheels. This traditional arrangement prevailed in Europe and the Americas until the beginning of the 19th century, and it dominated most of Asia and Africa until the middle of the 20th century. The year 1882 was likely the tipping point of the transition to fossil fuels, the time when the United States first burned more coal than wood. The best available historical reconstructions indicate that it was only sometime during the late 1890s that the energy content of global fossil fuel consumption, nearly all of it coal, came to equal the energy content of wood, charcoal, and crop residues.

The Western world then rapidly increased its reliance on fossil fuels and hydroelectricity, but in large parts of Africa and Asia the grand energy transition from traditional biomass fuels to fossil fuels has yet to be completed. Looking only at modern primary energies on a global scale, coal receded from about 95 percent of the total energy supply in 1900 to about 60 percent by 1950 and less than 24 percent by 2000. But coal’s importance continued to rise in absolute terms, and in 2001 it even began to regain some of its relative importance. As a result, coal is now relatively more important in 2008 (nearly 29 percent of primary energy) than it was at the time of the first energy “crisis” in 1973 (about 27 percent). And in absolute terms it now supplies twice as much energy as it did in 1973: the world has been returning to coal rather than leaving it behind.


Although oil became the largest contributor to the world’s commercial energy supply in 1965 and its share reached 48 percent by 1973, its relative importance then began to decline and in 2008 it will claim less than 37 percent of the total. Moreover, worldwide coal extraction during the 20th century contained more energy than any other fuel, edging out oil by about 5 percent. The common perception that the 19th century was dominated by coal and the 20th century by oil is wrong: in global terms, the 19th century was still a part of the millennia-long wooden era and 20th century was, albeit by a small margin, the coal century. And while many African and Asian countries use no coal, the fuel remains indispensable: it generates 40 percent of the world’s electricity, nearly 80 percent of all energy in South Africa (that continent’s most industrialized nation), 70 percent of China’s, and about 50 percent of India’s.

The pace of the global transition from coal to oil can be judged from the following spans: it took oil about 50 years since the beginning of its commercial production during the 1860s to capture 10 percent of the global primary energy market, and then almost exactly 30 years to go from 10 percent to about 25 percent of the total. Analogous spans for natural gas are almost identical: approximately 50 years and 40 years. Regarding electricity, hydrogeneration began in 1882, the same year as Edison’s coal-fired generation, and just before World War I water power produced about 50 percent of the world’s electricity; subsequent expansion of absolute production could not prevent a large decline in water’s relative contribution to about 17 percent in 2008. Nuclear fission reached 10 percent of global electricity generation 27 years after the commissioning of the first nuclear power plant in 1956, and its share is now roughly the same as that of hydropower.

These spans should be kept in mind when appraising potential rates of market penetration by nonconventional fossil fuels or by renewable energies. No less important is the fact that none of these alternatives has yet reached even 5 percent of its respective global market. Nonconventional oil, mainly from Alberta oil sands and from Venezuelan tar deposits, now supplies only about 3 percent of the world’s crude oil and only about 1 percent of all primary energy. Renewable conversions—mainly liquid biofuels from Brazil, the United States, and Europe, and wind-powered electricity generation in Europe and North America, with much smaller contributions from geothermal and photovoltaic solar electricity generation—now provide about 0.5 percent of the world’s primary commercial energy, and in 2007 wind generated merely 1 percent of all electricity.

The absolute quantities needed to capture a significant share of the market, say 25 percent, are huge because the scale of the coming global energy transition is of an unprecedented magnitude. By the late 1890s, when combustion of coal (and some oil) surpassed the burning of wood, charcoal, and straw, these resources supplied annually an equivalent of about half a billion tons of oil. Today, replacing only half of worldwide annual fossil fuel use with renewable energies would require the equivalent of about 4.5 billion tons of oil. That’s a task equal to creating de novo an energy industry with an output surpassing that of the entire world oil industry—an industry that has taken more than a century to build.
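To make the scale concrete, here is a rough check in Python. The nine-billion-ton oil-equivalent figure for current annual fossil fuel use is an assumed round number of mine; the half-billion-ton and four-billion-ton figures come from the article itself.

```python
# Rough scale check of the transition argument, in oil-equivalent tons.
# FOSSIL_USE_GTOE is my assumption for ~2008; the other two figures
# are stated in the article.

FOSSIL_USE_GTOE = 9.0          # assumed world fossil fuel use, Gtoe/yr
BIOMASS_1890S_GTOE = 0.5       # wood/charcoal/straw when coal overtook them
OIL_INDUSTRY_GT = 4.0          # annual output of the world oil industry

target = FOSSIL_USE_GTOE / 2   # replacing just half of fossil fuel use
print(f"Renewable output needed: {target:.1f} Gtoe/yr")
print(f"vs. the 1890s transition: {target / BIOMASS_1890S_GTOE:.0f}x larger")
print(f"vs. the entire oil industry: {target / OIL_INDUSTRY_GT:.2f}x its output")
```

Under these assumptions the task is roughly nine times the size of the coal-for-wood substitution of the 1890s, and slightly larger than the whole present-day oil industry.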

The scale of transition needed for electricity generation is perhaps best illustrated by deconstructing Al Gore’s July 2008 proposal to “re-power” America: “Today I challenge our nation to commit to producing 100 percent of our electricity from renewable energy and truly clean carbon-free sources within 10 years. This goal is achievable, affordable, and transformative.”

Let’s see. In 2007 the country had about 870 gigawatts (GW) of electricity-generating capacity in fossil-fueled and nuclear stations, the two nonrenewable forms of generation that Gore wants to replace in their entirety. On average, these thermal power stations are at work about 50 percent of the time and hence they generated about 3.8 PWh (that is, 3.8 × 10¹⁵ watt-hours) of electricity in 2007. In contrast, wind turbines work on average only about 23 percent of the time, which means that even with all the requisite new high-voltage interconnections, slightly more than two units of wind-generating capacity would be needed to replace a unit in coal, gas, oil, and nuclear plants. And even if such an enormous capacity addition—in excess of 1,000 GW—could be accomplished in a single decade (since the year 2000, actual additions in all plants have averaged less than 30 GW/year!), the financial cost would be enormous: it would mean writing off the entire fossil-fuel and nuclear generation industry, an enterprise whose power plants alone have a replacement value of at least $1.5 trillion (assuming at least $1,700/installed kW), and spending at least $2.5 trillion to build the new capacity.
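Smil’s arithmetic is easy to verify. A minimal sketch using only the capacity factors, costs, and the 870 GW figure stated above:

```python
# Back-of-envelope check of the repowering arithmetic. All inputs come
# from the article; this is an approximation, not a grid model.

HOURS_PER_YEAR = 8760
thermal_gw = 870            # fossil + nuclear capacity, 2007
thermal_cf = 0.50           # average capacity factor of thermal plants
wind_cf = 0.23              # average capacity factor of wind turbines

generation_pwh = thermal_gw * 1e9 * thermal_cf * HOURS_PER_YEAR / 1e15
print(f"Thermal generation: {generation_pwh:.1f} PWh/yr")   # ~3.8 PWh

wind_gw = thermal_gw * thermal_cf / wind_cf
print(f"Wind capacity to replace it: {wind_gw:,.0f} GW "
      f"({thermal_cf / wind_cf:.1f} units of wind per thermal unit)")

write_off = thermal_gw * 1e6 * 1700 / 1e12      # $1,700 per installed kW
print(f"Replacement value written off: ${write_off:.1f} trillion")
```

The script reproduces the essay’s figures: about 3.8 PWh of thermal generation, roughly 1,900 GW of wind to replace it, and a write-off of about $1.5 trillion.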

But because those new plants would have to be in areas that are not currently linked with high-voltage (HV) transmission lines to major consumption centers (wind from the Great Plains to the East and West coasts, photovoltaic solar from the Southwest to the rest of the country), that proposal would also require a rewiring of the country. Limited transmission capacity to move electricity eastward and westward from what is to be the new power center in the Southwest, Texas, and the Midwest is already delaying new wind projects even as wind generates less than 1 percent of all electricity. The United States has about 165,000 miles of HV lines, and at least 40,000 additional miles of new high-capacity lines would be needed to rewire the nation, at a cost of close to $100 billion. And the costs are bound to escalate, because the regulatory approval process required before beginning a new line construction can take many years. To think that the United States can install in 10 years wind and solar generating capacity equivalent to that of thermal power plants that took nearly 60 years to construct is delusional.

And energy transitions from established prime movers to new converters also take place across time spans measured in decades, not in a decade. Steam engines, whose large-scale commercial diffusion began with James Watt’s improved design introduced during the 1770s, remained important into the middle of the 20th century. There is no more convincing example of their endurance than the case of Liberty ships, the “ships that won the war” as they carried American materiel and troops to Europe and Asia between 1942 and 1945. Rudolf Diesel began to develop his highly efficient internal combustion engine in 1892 and his prototype engine was ready by 1897. The first small ship engines were installed on river-going vessels in 1903, and the first oceangoing ship with Diesel engines was launched in 1911. By 1939 a quarter of the world’s merchant fleet was propelled by these engines and virtually every new freighter had them. But nearly 3,000 Liberty ships were still powered by oil-fired steam engines. And steam locomotives disappeared from American railroads only by the late 1950s, while in China and India they were indispensable even during the 1980s.

A decade ago the promoters of fuel-cell cars were telling us that such vehicles would by now be on the road in large numbers.

Automobilization offers similar examples of gradual diffusion, and the adoption of automotive diesel engines is another excellent proof of slow transition. The gasoline-fueled internal combustion engine—the most important transportation prime mover of the modern world—was first deployed by Benz, Maybach, and Daimler during the mid-1880s, and it reached a remarkable maturity in a single generation after its introduction (Ford’s Model T in 1908).

But massive automobilization swept the United States only during the 1920s and Europe and Japan only during the 1960s, a process amounting to spans of at least 30 to 40 years in the U.S. case and 70 to 80 years in the European case between the initial introduction and decisive market conquest (with more than half of all families having a car). The first diesel-powered car (Mercedes-Benz 260D) was made in 1936, but it was only during the 1990s that diesels began to claim more than 15 percent of the new car market in major EU countries, and only during this decade that they began to account for more than a third of all newly sold cars. Once again, roughly half a century had to elapse between the initial introduction and significant market penetration.

And despite the fact that diesels have always been inherently more efficient than gasoline-fueled engines (the difference is up to 35 percent) and that modern diesel-powered cars have very low particulate and sulphur emissions, their share of the U.S. car market remains negligible: in 2007 only 3 percent of newly sold cars were diesels.

And it has taken more than half a century for both gasoline- and diesel-fueled internal combustion engines to displace agricultural draft animals in industrialized countries: the U.S. Department of Agriculture stopped counting draft animals only in 1963, and the process is yet to be completed in many low-income nations.

Finally, when asked to name the world’s most important continuously working prime mover, most people would not name the steam turbine. The machine was invented by Charles Parsons in 1884 and it remains fundamentally unchanged 125 years later. Gradual advances in metallurgy made it simply larger and more efficient and these machines now generate more than 70 percent of the world’s electricity in fossil-fueled and nuclear stations (the rest comes from gas and water turbines as well as diesels).

There is no common underlying process to explain the gradual nature of energy transitions. In the case of primary energy supply, the time span needed for significant market penetration is mostly the function of financing, developing, and perfecting necessarily massive and expensive infrastructures. For example, the world oil industry annually handles more than 30 billion barrels, or four billion tons, of liquids and gases; it extracts the fuel in more than 100 countries and its facilities range from self-propelled geophysical exploration rigs to sprawling refineries, and include about 3,000 large tankers and more than 300,000 miles of pipelines. Even if an immediate alternative were available, writing off this colossal infrastructure that took more than a century to build would amount to discarding an investment worth well over $5 trillion—but it is quite obvious that its energy output could not be replicated by any alternative in a decade or two.

In the case of prime movers, the inertial nature of energy transitions is often due to the reliance on a machine that may be less efficient, such as a steam engine or gasoline-fueled engine, but whose marketing and servicing are well established and whose performance quirks and weaknesses are well known, as opposed to a superior converter that may bring unexpected problems and setbacks.
Predictability may, for a long time, outweigh a potentially superior performance, and associated complications (for example, high particulate emissions of early diesels) and new supply-chain requirements (be it sufficient refinery capacity to produce low-sulfur diesel fuel or the availability of filling stations dispensing alternative liquids) may slow down the diffusion of new converters.

All of these are matters of fundamental importance given the energy challenges facing the United States and the world. New promises of rapid shifts in energy sources and new anticipations of early massive gains from the deployment of new conversion techniques create expectations that will not be met and distract us from pursuing real solutions. Unfortunately, there is no shortage of these unrealistic calls, such as the popular claim that America should seek to generate 30 percent of its electricity supply from wind power by 2030.

And now Al Gore is telling us that the United States can completely repower its electricity generation in a single decade! Gore has succumbed to what I call “Moore’s curse.” Moore’s Law describes a long-standing trend in computer processing power, observed by Intel cofounder Gordon Moore, whereby a computer’s power doubles every year and a half. This led Gore to claim that since “the price paid for the same performance came down by 50 percent every 18 months, year after year,” something similar can happen with energy systems.

But the doubling of microprocessor performance every 18 months is an atypically rapid case of technical innovation. It does not represent—as the above examples of prime mover diffusion make clear—the norm of technical advances as far as new energy sources and new prime movers are concerned, and it completely ignores the massive infrastructural needs of new modes of electricity generation.

The historical verdict is unassailable: because of the requisite technical and infrastructural imperatives and because of numerous (and often entirely unforeseen) socio-economic adjustments, energy transitions in large economies and on a global scale are inherently protracted affairs. That is why, barring some extraordinary commitments and actions, none of the promises for greatly accelerated energy transitions will be realized, and during the next decade none of the new energy sources and prime movers will make a major difference by capturing 20 percent to 25 percent of its respective market. A world without fossil fuel combustion is highly desirable and, to be optimistic, our collective determination, commitment, and persistence could accelerate its arrival—but getting there will demand not only high cost but also considerable patience: coming energy transitions will unfold across decades, not years.

Vaclav Smil is the author of Energy at the Crossroads and Energy in Nature and Society (MIT Press). He is Distinguished Professor at the University of Manitoba.

Monday, December 15, 2008

Ron Paul and the Great Contraction

Ron Paul has made himself the spokesman of the gold crowd, which has maintained its minority position on gold since the War of Independence. Through his dynamic candidacy he has brought another generation into that world. It makes enticing reading.

Unfortunately, it is all dangerous rubbish and capable of driving catastrophic financial policy. We today are in the midst of a global credit contraction. That means a global deflation of pricing structures. Or haven’t you noticed?
The first wave is always in commodity prices. Commodities are now busted, and the only one with a sustainable upside is oil, because we have lost supply elasticity and are now waiting for the production shoe to drop, with disastrous repercussions.

The fact is that the global financial system lent trillions of dollars and now needs to get a lot of it repaid in order to cover accelerating losses, while at the same time shoring up its balance sheets to maintain the good loans that it has.

The money that is being printed today on fabulous terms is there to replace all this credit that has disappeared.

Let me make this as clear as humanly possible, so that you can understand just how ugly this all is.

If the auto industry defaults on and never pays back fifty billion dollars (not actually very likely), the American financial industry will eat the loss as a capital loss. It will then be unable to lend 500 billion to a trillion dollars to the rest of us, nicely wiping out any benefit from the so-called mortgage bailout. You wonder why the industry is hoarding cash and taking its time to reenter the lending market? You would too.
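The mechanism is fractional-reserve leverage. A minimal sketch, assuming an illustrative 10:1 ratio of loans to bank capital (real capital requirements vary, which is why the range above runs from half a trillion to a trillion):

```python
# A minimal sketch of why a capital loss shrinks lending capacity by far
# more than the loss itself. The 10:1 leverage ratio is an illustrative
# assumption, not a statement of actual 2008 bank capital rules.

capital_loss_bn = 50   # hypothetical unrecovered auto-industry loans, $B
for leverage in (10, 20):
    lending_lost_bn = capital_loss_bn * leverage
    print(f"At {leverage}:1 leverage, a ${capital_loss_bn}B capital loss "
          f"removes roughly ${lending_lost_bn}B of lending capacity.")
```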

And yes, we still have not solved the mortgage problem in the one way it might be solved, as I posted a couple of months ago. Liquidation pressures continue to mount, and no bank can solve it alone, so the liquidation blowout will continue to destroy bank capital.

The Great Depression wiped out the banking system for exactly the same reason. The Great Contractor is loose and has not been visibly halted yet. We are hoping that the prompt injection of massive liquidity will stem the tide and I believe it should. In this case it must start soon with a major uptick in the volume of house sales to reassure frightened bankers.

Right now, the Fed is struggling to prevent a sharp reduction in the real money supply.

The gold crowd’s prescriptions would take us back to a dollar a day, little credit and a financial depression every decade that would keep the population impoverished. I think I will pass.

And yes, the auto industry needs to go through the rigors of Chapter 11 in order to break its labour contracts so that its costs can match those of its onshore competitors. Otherwise we will revisit this particular disaster, and the industry will be in far worse shape and be able to save far fewer jobs. Remember British Leyland.


Ron Paul: Bailouts Will 'Destroy the Dollar'

Thursday, December 11, 2008 12:26 PM

By: Jim Meyers

U.S. Representative and former presidential candidate Ron Paul tells Newsmax that bailouts of U.S. corporations are “bad morally” — and says current federal economic policies “will literally destroy the dollar.”

He also insists that the use of “counterfeit” paper money instead of a gold-backed currency is “insane,” and declares it is “foolhardy” for Barack Obama to propose national health care under the present economic conditions.

The Texas legislator ran for president as the Libertarian candidate in 1988, and sought the Republican presidential nomination beginning in March 2007. He withdrew this past June and did not endorse GOP candidate John McCain.

Asked by Newsmax’s Ashley Martella about the bailouts of Wall Street, the banking industry and apparently the Big Three automakers, Paul — a member of the House Financial Services Committee — said:

“I think we’re going in the wrong direction and I strongly oppose it.

“I find it to be bad economics. I find it bad morally to transfer wealth from one group of people to another no matter what kind of problems they have…

“Lo and behold, the Constitution doesn’t talk much about allowing Congress to go and bail out their friends. So I oppose it for practical as well as philosophic reasons.”

Martella noted that some of the big problems automakers face are union-related, such as commitments to life-long pensions and health care for retired workers.

Paul said the automakers are “sort of trapped because they’ve signed these contracts…

“These commitments, which had been signed onto by the pressure of the unions, which were backed up by law, [have] brought them to their knees.

“If we take the funds from those people who have been more efficient to prop this system up, we’ll never see the correction…

“Excessive labor costs are very very important but the business people, the people who run the car companies, won’t dare say so, or won’t say very much, because they can’t offend the liberals in Congress who are the ones who are going to bail them out.”

Paul said his fellow legislators are “working real hard, we’re working overtime, maybe this weekend we’re going to work real hard to prolong the agony and not allow the market to correct the imbalances.”

Paul has called for abandoning the Federal Reserve System and returning the nation to a gold and silver standard. He told Newsmax why.

“It’s not so much that gold is perfect, it’s that paper is insane. To give politicians and bureaucrats and secret bankers the license to counterfeit money and create money out of thin air is destined to fail, and it has. That’s why we’ve had this financial bubble develop since the linkage to gold was severed in 1971…

“Now they’re trying desperately to print and spend, but the bubble was overwhelming and the bursting of this bubble is something they can’t contain. It would never happen under a gold standard because there would be no legal right for our central bank to spend money and create money out of thin air. The arrogance of it all is unbelievable.

“If we continue doing what we’re doing now, we will literally destroy the dollar.”

Paul, who is a physician, was critical of Obama’s stated aim of developing a national health care plan. He said: “He has no money. Where is he going to get the money?

“He has no intention of bringing our troops home. He’s talked a little about Iraq, but we’re maintaining a world empire to the tune of a trillion dollars a year. He wants more troops in Afghanistan … You have to save some money someplace.

“So if you want to help some people who are sick, we’ll have to change our foreign policy and bring our troops home.

“I believe that all goods and services in a free society should be by voluntary means and never through government coercion. The more the government’s involved, the more money they spend, and the more they pretend they’re helping, it does but one thing — it pushes prices up.

“When Obama says something like that, somebody in the media someday would have to say, ‘Where are you going to get the money?’ If he’s going to steal it from someone, who is he going to steal from? The producers are hurting. The corporations are bankrupt. There’s no funding.

“Instead of coming back to a balanced budget and living within our means, to propose national health care, and not attack our empire, is just foolhardy and will seal our fate.”

An opponent of the Patriot Act, Paul was asked if he would give any credit to the measure for keeping Americans safe since 9/11.

“No, not really,” he said. “All it’s done is regulate people. We’ve regulated the American people. The people are less free, but the fact that we haven’t had an attack is probably just a coincidence.”

Paul was especially popular on college campuses during his most recent presidential campaign. Martella asked: “Did you sort of feel like a rock star when you spoke to college students?”

Paul responded: “No, not really. I’m pleased that they’re interested in the issue of freedom and individual responsibility, so I’m delighted with that, but I guess the rock star status goes to Obama and others.”

Bronze Age Diaspora

Those who have followed my postings for some time know that I am interested in mapping the extent of Bronze Age global trade. Where are we at?

The fully mature Bronze Age ended with the 1159 BCE blast that smashed Northern Europe back into a herding culture and ended the sea trade centered on the city state of Atlantis. This mature phase had lasted for at least a millennium and had been preceded by a millennium-long expansion of the technology.

The core technology is believed to have originated in Mesopotamia, but I am rather skeptical about that. We have an excellent locale in the Mekong highlands where both metals were richly available literally across the river from each other.

Another issue that I think is underappreciated is that the use of copper likely had a very long history that is not visible in the archeological record. The reason for this invisibility is that copper represented a convenient medium of exchange and was far too valuable to bury with the dead or even lose track of. Besides that, raw copper rots away quite thoroughly within a few hundred years in any environment that permits water movement.

Think how sharply our understanding of the European copper age improved with the recovery of Oetzi, with his handy copper axe head and his palette of choice stone tools and weapons. This alone ended most of the controversy over the lifeways of the copper age. Scholars have been afraid to use their imaginations and common sense in describing these worlds when all the real evidence simply rots away.

I cannot prove that the natives of New Guinea have been using hardened wood arrows for thousands of years. But the real question needs to be: why were they not? A friend of mine has such a bow and arrow set acquired there in the highlands.

The bow is too obvious an invention to not have been made just as soon as someone figured out how to make a bowstring, a much more difficult trick.

The production of copper from a fairly rich ore has been known since antiquity. It takes heat, but not extreme heat; the temperatures produced by charcoal are well within the range needed to produce a quality product.

To emphasize this point, the method used by prospectors to evaluate a copper ore in the field was to crush a charge of the ore with some flux in a steel pipe (or pottery retort?) and stick it in the campfire. This would roast off the sulphur and produce a crude copper slag separation. It is hardly efficient, but great for qualifying an ore.

It is pretty obvious that an ancient campfire set with a ring of ore would generate obvious beads of copper in the ash. And just how much of a clue do you need? Again the question needs to be why were they not using copper?

The point that has to be made is that copper was useful and a convenience, but not a replacement for an obsidian weapon. It was currency. And that is why so little is found in the archeological record. Just how many present-day coins would you find if you chose to dig up a present-day graveyard? I have no doubt that outside local barter, copper and then bronze was the principal currency. Homer speaks first of the number of bronze tripods captured. If there ever was an unnecessary luxury usage, that is it. Yet it kept your wealth conveniently traveling with you.

Bronze Age culture was rich and palace centered. There is no sense in Europe of a centralized state as in Mesopotamia. There is a sense of a seaborne commonwealth that traded actively with the Americas, and there is a sense of advanced antique Indian cultures responding to the influence of these contacts.

We can say that this global trading phenomenon, brought about by the necessities of the advent of a bronze-based economy, spread a common advanced concept of religion and palace rulership around the world. That any of this happened in true isolation is nonsense and reflects only the difficulty of finding actual proof against a background of local artifacts.

What did not particularly happen throughout the Bronze Age was actual colonization. The best recent comparable is West Africa, where colonization simply never happened despite centuries of trade. The only modest Bronze Age attempt appears to have been in New England, and it was swiftly overwhelmed and/or absorbed when the trade ended in 1159 BCE.

For a thousand years at least, the sea peoples lived a robust, healthy life that allowed them to rove the Atlantic littoral to its fullest. The evidence fully supports this, even while it has been studiously ignored. Once again, they could, they should, and they did, in far greater strength than I or anyone else originally thought. Once again, lack of specific evidence is not evidence of lack, and here we have a mountain of specific evidence in every likely prospective location and a few unlikely ones.

I would love to have a European dig come up with an occurrence of maize preferably in southwest Spain just to make that point.

Water Mill as Eden Machine

This story is about an atmospheric water harvesting device operating on household current and producing a supply of drinking water. This is a nice design compared to other examples that I have also seen. Because the intent is to supply potable water, there is a lot of design effort in the problem of polishing the water.

The Eden machine described last week will be stripped of such extras and will focus purely on separating the water from the atmosphere. The importance of controls and sensors is clearly indicated.

Obviously the water can be distilled from the atmosphere, but it is also likely to collect its fair share of dust. Since it is meant to go directly into the ground, this is not an issue. If it should be used for drinking water, there are plenty of options for operators to use on their own.

The one advantage this particular design protocol has is that you can count on household humidity of over 60% even when outside humidity is a mere 15%. Yes, we do expel that much water.

Operation is much more problematic in the open field, where variable humidity has to be managed by the control system.

What this simply demonstrates is the reality of the protocol itself. It should never have been in dispute, and that is one prototype field test we no longer need.


Our objective will be to produce several times as much water as demonstrated here, for a comparable selling price. That is the technical challenge, and it is now achievable.

http://msnbcmedia1.msn.com/j/MSNBC/Components/Photo/_new/081201-WaterMill-1.widec.jpg

Turning air into water? Gadget does just that

WaterMill is touted as a pricey but environmentally friendly H2O source

A new home appliance called the WaterMill converts outdoor air into nearly 13 quarts of fresh water every day. Touted as an eco-friendly alternative to bottled water, the appliance uses ultraviolet light to cleanse itself and advanced sensors to efficiently adapt to its surroundings.


By Bryn Nelson

Columnist

msnbc.com

updated 6:02 a.m. PT, Mon., Dec. 8, 2008

Remember those sweltering summer days when the air was so muggy you could practically drink it? A new home appliance is promising to make that possible by converting outdoor air into nearly 13 quarts of fresh water every day.

Originally envisioned as an antidote to the shortage of clean drinking water in the world, the WaterMill has the look of a futuristic air conditioner and the ability to condense, filter and sterilize water for about 3 cents per quart.

At $1,299, the 45-pound device doesn’t come cheap, and it is neither the first nor the biggest machine to enter the fast-growing field of atmospheric water generators. But by targeting individual households with a self-cleaning, environmentally friendly alternative to bottled water, Kelowna, British Columbia-based Element Four is hoping its WaterMill will become the new must-have appliance of 2009.

“The idea is making this thing intelligent,” said Jonathan Ritchey, inventor of the original WaterMill prototype and president of Element Four. “So what happens is the machine knows where it is. If you put it in a rainforest, it will sample that environment every three minutes, and it will adapt.” Ditto for a desert. That adaptation, he said during a November preview at Manhattan’s WIRED Store, is critical for energy efficiency.

Cooling the machine’s condensation chamber to just below the dew point, or the temperature at which the air becomes saturated with water vapor and begins to condense, is central to the process.

“If I have a dumb machine, it might bring the air down to just three degrees above dew point and I wouldn’t get any water,” Ritchey said.

“If I take the air way below the dew point, I’m using what’s called latent heat. It’s sort of like taking an ice cube and trying to freeze it some more. You’re just wasting your energy.”
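Ritchey’s point is easy to express numerically. A minimal sketch using the standard Magnus approximation for dew point (a textbook formula, not Element Four’s actual control algorithm):

```python
import math

def dew_point_c(temp_c: float, rel_humidity: float) -> float:
    """Approximate dew point (deg C) via the Magnus formula.

    Standard meteorological approximation; constants b and c are the
    commonly used values for liquid water.
    """
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity / 100.0) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)

# The machine's stated test condition: 91 F (~32.8 C) at 69% RH.
td = dew_point_c(32.8, 69)
print(f"Dew point: {td:.1f} C")   # ~26 C: chill the coil just below this
```

Cooling the coil to just under that dew point condenses water; cooling it much further wastes energy on latent heat, exactly as Ritchey describes.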

The unit’s activated carbon filter offers another feature not found on most appliances.

“We’ve actually designed a system that knows when the filter is spent and will tell you, the consumer, ‘Time to change the filter, time to change the filter,’” Ritchey said. “And then if you don’t, we’ve got it dummy-proofed. It will shut itself down. Either you change the filter, and it makes pure water, or it doesn’t make water at all.”

Microbes are another big concern in water coolers, hot water tanks, industrial-sized air conditioning units and other places where water vapor can become contaminated.

The WaterMill was designed to overcome that issue with a self-sterilizing condensation chamber that boasts a reflective wall surrounding its condensation coil. During the machine’s daily sterilization cycle, UV light ricochets off the wall and efficiently sterilizes both the front and back sides of the coil.

Most environments around the world have plenty of water vapor that can be converted into liquid water. In fact, if you could wring out all the water in the air around the world and pour it into a lake, its volume would equal about 3,095 cubic miles, or more than that of Lake Superior, according to the U.S. Geological Survey.

Element Four estimates that its machine can convert between 10 percent and 40 percent of vapor into liquid water, depending on the relative humidity.

In 91-degree heat with 69 percent relative humidity, the machine tops out at a little less than 13 quarts per day. And because water vapor is continually replenished through the planet’s water cycle, removing it from the air could continue indefinitely without disrupting local ecosystems.
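A rough sanity check of that figure: at the stated test condition the machine must process a large but plausible volume of air. The 30 percent extraction fraction below is an assumption within Element Four’s stated 10 to 40 percent range.

```python
import math

def sat_vapor_density_kg_m3(temp_c: float) -> float:
    """Saturation water-vapor density via Magnus pressure + ideal gas law."""
    p_sat = 611.2 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Pa
    R_V = 461.5                                  # J/(kg*K), water vapor
    return p_sat / (R_V * (temp_c + 273.15))

# Article's condition: 91 F (~32.8 C), 69% RH, ~13 quarts (~12.3 L) per day.
# The 30% extraction fraction is an assumed value within the 10-40% range.
temp_c, rh, extract = 32.8, 0.69, 0.30
vapor = sat_vapor_density_kg_m3(temp_c) * rh     # kg of water per m3 of air
air_m3_per_day = 12.3 / (vapor * extract)
print(f"Vapor content: {vapor * 1000:.1f} g/m3")
print(f"Air processed: ~{air_m3_per_day:,.0f} m3/day "
      f"(~{air_m3_per_day / 24:.0f} m3/h)")
```

Under these assumptions the unit would need to move on the order of 70 cubic meters of air per hour, roughly what a modest household fan delivers, which makes the claimed daily output credible.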