Tuesday, July 15, 2008

Solar Cycles

There has been a lot of commentary on the current sunspot count, with many commentators jumping the gun and predicting a long-lived low like the rather famous Maunder Minimum. This timely article puts us back on track and tells us to hold on a minute: fears of a protracted low are very premature.

Another aspect of the sunspot record that has always bothered me is that although our records since the early 1700s have been well maintained, and certainly meet today's standards, the early period of telescope usage between 1609 and 1700 may have been a lot more dicey. At least that was the apparent consensus forty years ago. In other words, during the early going, and even late into the mid-nineteenth century with the advent of Wolf's methodology, sunspot counting was prone to subjective decisions.

This may not sound like a lot, but your group of observers needs only to agree for fifty years that the observed image is one foot across, and then switch to better equipment and an easier two-foot image, to change the image's usefulness by a factor of four (doubling the diameter quadruples the area). There were only a handful of observers, who all knew each other, and it is easy to see how improving equipment would have quietly allowed the sunspot count to creep up.

In other words, the low count for the Maunder Minimum may largely reflect the limitations of the equipment and the small number of observers. I still think that there was a minimum, but I simply do not totally trust the data.

Even today, no one is sitting there counting sunspots by eye. Rather, data sampling and formulas are shaking out the current number, as they should. It is just a huge mistake to extrapolate that level of precision back over the centuries. Yet it is tempting to believe that it can be done.

This means that the attempt to link the known event of the Little Ice Age with the shaky Maunder Minimum is unconvincing, and similar to the linking of CO2 concentration and global warming.

What's Wrong with the Sun? (Nothing)


July 11, 2008: Stop the presses! The sun is behaving normally.

So says NASA solar physicist David Hathaway. "There have been some reports lately that Solar Minimum is lasting longer than it should. That's not true. The ongoing lull in sunspot number is well within historic norms for the solar cycle."

This report, that there's nothing to report, is newsworthy because of a growing buzz in lay and academic circles that something is wrong with the sun. "Sun Goes Longer Than Normal Without Producing Sunspots," declared one recent press release. A careful look at the data, however, suggests otherwise.

But first, a status report: "The sun is now near the low point of its 11-year activity cycle," says Hathaway. "We call this 'Solar Minimum.' It is the period of quiet that separates one Solar Max from another."

http://science.nasa.gov/headlines/y2008/images/solarcycleupdate/ssn_predict_l.gif

Above: The solar cycle, 1995-2015. The "noisy" curve traces measured sunspot numbers; the smoothed curves are predictions. Credit: D. Hathaway/NASA/MSFC. [more]

During Solar Max, huge sunspots and intense solar flares are a daily occurrence. Auroras appear in Florida. Radiation storms knock out satellites. Radio blackouts frustrate hams. The last such episode took place in the years around 2000-2001.

During Solar Minimum, the opposite occurs. Solar flares are almost non-existent, while whole weeks go by without a single tiny sunspot to break the monotony of the blank sun. This is what we are experiencing now. Although minima are a normal aspect of the solar cycle, some observers are questioning the length of the ongoing minimum, now slogging through its third year.

"It does seem like it's taking a long time," allows Hathaway, "but I think we're just forgetting how long a solar minimum can last." In the early 20th century there were periods of quiet lasting almost twice as long as the current spell. (See the end notes for an example.) Most researchers weren't even born then.

Hathaway has studied international sunspot counts stretching all the way back to 1749 and he offers these statistics: "The average period of a solar cycle is 131 months with a standard deviation of 14 months. Decaying solar cycle 23 (the one we are experiencing now) has so far lasted 142 months--well within the first standard deviation and thus not at all abnormal. The last available 13-month smoothed sunspot number was 5.70. This is bigger than 12 of the last 23 solar minimum values."
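Hathaway's "well within the first standard deviation" claim is easy to sanity-check. The sketch below, in plain Python, takes the three numbers straight from the quote (mean 131 months, standard deviation 14 months, cycle 23 at 142 months) and computes the z-score:

```python
# Figures quoted by Hathaway: mean solar cycle length, standard
# deviation, and the length of decaying cycle 23 so far (in months).
mean_months = 131
std_months = 14
cycle23_months = 142

# Standard score: how many standard deviations from the mean.
z = (cycle23_months - mean_months) / std_months
print(f"z-score: {z:.2f}")                # about 0.79
print(f"within one sigma: {abs(z) < 1}")  # True, i.e. "not at all abnormal"
```

A cycle length would only begin to look unusual beyond one or two standard deviations, i.e. past roughly 145 to 159 months on these figures.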

In summary, "the current minimum is not abnormally low or long."

The longest minimum on record, the Maunder Minimum of 1645-1715, lasted an incredible 70 years. Sunspots were rarely observed and the solar cycle seemed to have broken down completely. The period of quiet coincided with the Little Ice Age, a series of extraordinarily bitter winters in Earth's northern hemisphere. Many researchers are convinced that low solar activity, acting in concert with increased volcanism and possible changes in ocean current patterns, played a role in that 17th century cooling.

http://science.nasa.gov/headlines/y2008/images/solarcycleupdate/ssn_yearlyNew2.jpg

For reasons no one understands, the sunspot cycle revived itself in the early 18th century and has carried on since with the familiar 11-year period. Because solar physicists do not understand what triggered the Maunder Minimum or exactly how it influenced Earth's climate, they are always on the look-out for signs that it might be happening again.

The quiet of 2008 is not the second coming of the Maunder Minimum, believes Hathaway. "We have already observed a few sunspots from the next solar cycle," he says. (See Solar Cycle 24 Begins.) "This suggests the solar cycle is progressing normally."

What's next? Hathaway anticipates more spotless days, maybe even hundreds, followed by a return to Solar Max conditions in the years around 2012.

Stay tuned to Science@NASA for updates.

Author: Dr. Tony Phillips | Credit: Science@NASA

Monday, July 14, 2008

Solar Windows

This news release on the making of very efficient panes of glass that collect and convert a portion of the incoming light is a nifty bit of work. It can allow the use of installed windows as household energy collectors.



Previously discussed printed nanosolar systems are hugely more important but do not easily address windows. What we have here is a neat strategy for siphoning off a portion of the incoming light and transporting it to the edges, where, in concentrated form, it is collected by efficient standard solar cells.



This also reminds me of the use of window panes containing a minute amount of dissolved gold on office buildings. I believe that they cut hugely into the amount of infrared light that came through. Other metals are obviously now used for the same effect.



The difficulty with this technology is the wonderful word 'organic'. To get working dyes that will last even twenty years, let alone forever, is a very tall order. Mechanical protection is no big trick, and the installed diodes on the edges may not even have to be continuous. We are looking at concentration factors of at least twenty to one and likely much higher (the factor is twice the edge length in inches for an eighth-inch-thick pane). The constraint will be the absorption capacity of the photovoltaic diode. It should also be possible to design things so that a broken pane can be replaced without replacing the diodes mounted in the frame.
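The concentration arithmetic above can be made explicit. For a square pane, the geometric concentration factor is the face area divided by the total edge area, which reduces to side / (4 × thickness); for an eighth-inch pane that is indeed twice the side length in inches. A minimal sketch (the dimensions here are illustrative, not taken from the MIT article):

```python
def concentration_ratio(side_in: float, thickness_in: float) -> float:
    """Geometric concentration for a square pane: light collected over
    the face area, delivered to cells covering the four edges."""
    face_area = side_in ** 2                    # square inches
    edge_area = 4 * side_in * thickness_in      # four edges
    return face_area / edge_area                # simplifies to side/(4t)

# Eighth-inch glass: the factor is twice the side length in inches.
print(concentration_ratio(24, 0.125))  # 48.0 for a two-foot square pane
print(concentration_ratio(10, 0.125))  # 20.0, the "twenty to one" floor
```

This is only the geometric upper bound; dye absorption, re-emission, and transport losses (the tenfold improvement the MIT team reports) determine how much of it is realized.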



Again, once manufacturing and usage become ubiquitous, the technology can be advanced step by step to achieve better efficiencies. Thus, being able to trade out the panes at will is a commercial advantage. Remember that we have used the light bulb socket for one hundred years. New lighting technology, when it finally became possible, had to be designed around this installed base. I could not imagine using the screw-in format otherwise.



This is, even with concerns over the life of the dyes, a neat way to utilize windows without affecting anything else in the building, unlike placing a system of panels on the roof. This technology neatly changes out the current installed base of windows while providing a power source that can be easily integrated into the newly developing paradigm of local solar energy generation that is about to be driven by cheap nanosolar power.



I have this vision of thousands of houses dumping surplus power as well as any unused stored power into the grid every day. There is good reason to expect every household to become a net power exporter even after consuming its share of urban transportation energy.



MIT opens new 'window' on solar energy

Cost effective devices expected on market soon

Elizabeth A. Thomson, News Office


July 10, 2008

Imagine windows that not only provide a clear view and illuminate rooms, but also use sunlight to efficiently help power the building they are part of. MIT engineers report a new approach to harnessing the sun's energy that could allow just that.

The work, to be reported in the July 11 issue of Science, involves the creation of a novel "solar concentrator." "Light is collected over a large area [like a window] and gathered, or concentrated, at the edges," explains Marc A. Baldo, leader of the work and the Esther and Harold E. Edgerton Career Development Associate Professor of Electrical Engineering.

As a result, rather than covering a roof with expensive solar cells (the semiconductor devices that transform sunlight into electricity), the cells only need to be around the edges of a flat glass panel. In addition, the focused light increases the electrical power obtained from each solar cell "by a factor of over 40," Baldo says.

Because the system is simple to manufacture, the team believes that it could be implemented within three years--even added onto existing solar-panel systems to increase their efficiency by 50 percent for minimal additional cost. That, in turn, would substantially reduce the cost of solar electricity.

In addition to Baldo, the researchers involved are Michael Currie, Jon Mapel, and Timothy Heidel, all graduate students in the Department of Electrical Engineering and Computer Science, and Shalom Goffri, a postdoctoral associate in MIT's Research Laboratory of Electronics.

"Professor Baldo's project utilizes innovative design to achieve superior solar conversion without optical tracking," says Dr. Aravinda Kini, program manager in the Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science, a sponsor of the work. "This accomplishment demonstrates the critical importance of innovative basic research in bringing about revolutionary advances in solar energy utilization in a cost-effective manner."

Solar concentrators in use today "track the sun to generate high optical intensities, often by using large mobile mirrors that are expensive to deploy and maintain," Baldo and colleagues write in Science. Further, "solar cells at the focal point of the mirrors must be cooled, and the entire assembly wastes space around the perimeter to avoid shadowing neighboring concentrators."

The MIT solar concentrator involves a mixture of two or more dyes that is essentially painted onto a pane of glass or plastic. The dyes work together to absorb light across a range of wavelengths, which is then re-emitted at a different wavelength and transported across the pane to waiting solar cells at the edges.

In the 1970s, similar solar concentrators were developed by impregnating dyes in plastic. But the idea was abandoned because, among other things, not enough of the collected light could reach the edges of the concentrator. Much of it was lost en route.

The MIT engineers, experts in optical techniques developed for lasers and organic light-emitting diodes, realized that perhaps those same advances could be applied to solar concentrators. The result? A mixture of dyes in specific ratios, applied only to the surface of the glass, that allows some level of control over light absorption and emission. "We made it so the light can travel a much longer distance," Mapel says. "We were able to substantially reduce light transport losses, resulting in a tenfold increase in the amount of power converted by the solar cells."

This work was also supported by the National Science Foundation. Baldo is also affiliated with MIT's Research Laboratory of Electronics, Microsystems Technology Laboratories, and Institute for Soldier Nanotechnologies.

Mapel, Currie and Goffri are starting a company, Covalent Solar, to develop and commercialize the new technology. Earlier this year Covalent Solar won two prizes in the MIT $100K Entrepreneurship Competition. The company placed first in the Energy category ($20,000) and won the Audience Judging Award ($10,000), voted on by all who attended the awards.

Friday, July 11, 2008

Earth's Magnetic Field

This article is a nice summary of the present state of our ideas about the Earth’s interior and its magnetic engine. To be totally fair, it is complete rubbish to think in terms of swirling dynamos. The core is packed and any motion must be glacial at best. Electron flow is quite a different matter. But even that requires a meaningful potential. And we already know that the mere movement of the magnetic field itself rules out the core acting like a permanent magnet.



An actual review of the global map of magnetic strength reveals a non-homogeneous field that still preserves the polar orientation. Sort of.



I have come to the conclusion that there exists a thin layer of liquid carbon beneath the crust: the crustal material is dissociating and drawing contained graphitic carbon down with it until it hits the carbon melting point, whereupon it rises. The carbon must go deepest to reach a possible melting point. Or perhaps it is all graphite. The point to remember is that all other elements are dissociated there and in a liquid state.



This explains how a charge of liquid carbon is able to spear its way through the crust and even reach the surface at around seventy miles per hour. That is also fast enough to allow a little of the carbon to remain in crystal form. The rest will be consumed by the crustal material itself. Recall the known low viscosity of molten graphite.



This layer may only be a hundred feet thick if the average size of a kimberlite pipe is an indicator. It is certainly everywhere and about eighty miles deep. It is the slip plane between the crust and the core itself and the reason why any crustal movement is even possible at all.



The mere existence of kimberlite pipes and diamonds is proof of the existence of this layer of pure liquid carbon.



It is also a great place for storing electrons and a natural generator of a strong magnetic field that can shift and move in reaction to modest electrical or even mechanical stimulation. It likely insulates the core itself from expressing magnetic activity.



Earth's Core, Magnetic Field Changing Fast, Study Says

Kimberly Johnson


for National Geographic News
June 30, 2008

Rapid changes in the churning movement of Earth's liquid outer core are weakening the magnetic field in some regions of the planet's surface, a new study says.



"What is so surprising is that rapid, almost sudden, changes take place in the Earth's magnetic field," said study co-author Nils Olsen, a geophysicist at the Danish National Space Center in Copenhagen.



The findings suggest similarly quick changes are simultaneously occurring in the liquid metal, 1,900 miles (3,000 kilometers) below the surface, he said.



The swirling flow of molten iron and nickel around Earth's solid center triggers an electrical current, which generates the planet's magnetic field.



The study, published recently in Nature Geoscience, modeled Earth's magnetic field using nine years of highly accurate satellite data.



Flip-Flop

Fluctuations in the magnetic field have occurred in several far-flung regions of Earth, the researchers found.



In 2003 scientists found pronounced changes in the magnetic field in the Australasian region. In 2004, however, the changes were focused on Southern Africa.





The changes "may suggest the possibility of an upcoming reversal of the geomagnetic field," said study co-author Mioara Mandea, a scientist at the German Research Centre for Geosciences in Potsdam.



Earth's magnetic field has reversed hundreds of times over the past billion years and the process could take thousands of years to complete.



(Related story: "Magnetic Field Weakening in Stages, Old Ships' Logs Suggest" [May 11, 2006])

Upper Atmosphere Radiation



The decline in the magnetic field also is opening Earth's upper atmosphere to intense charged particle radiation, scientists say.



Satellite data show the geomagnetic field decreasing in the South Atlantic region, Mandea said, adding that an oval-shaped area east of Brazil is significantly weaker than similar latitudes in other parts of the world.



"It is in this region that the shielding effect of the magnetic field is severely reduced, thus allowing high energy particles of the hard radiation belt to penetrate deep into the upper atmosphere to altitudes below a hundred kilometers (62 miles)," Mandea said.



This radiation does not influence temperatures on Earth. The particles, however, do affect technical and radio equipment and can damage electronic equipment on satellites and airplanes, Olsen of the Danish space center said.



Keep Watching



The study documents just how rapidly the flow in Earth's core is changing, said Peter Olson, a geophysics professor at Johns Hopkins University in Baltimore, Maryland, who was not involved with the research.



By using satellite imagery, researchers have a nearly continuous measurement of changes, he said.



"They provide a good rationale to continue this monitoring longer," Olson said.

Thursday, July 10, 2008

Politics of CO2

The last two years in particular have seen the steady rise of political pressure in the developed world to progressively reduce CO2 emissions. It is reaching the point where decisions are pending that will cause the economy to shift a great deal of its resources. A good part of this shift was inevitable in view of the advent of peak-oil market behavior. After all, we have gone from a perennial surplus position to a clearly perennial shortfall situation. The working price range has tripled and is now choking demand and forcing the development of alternatives. True global energy security is gone.

There is plenty of merit in weaning ourselves from the hydrocarbon based energy system as the current price regime makes very clear. The first comments have come out suggesting that this price shock will be worse than that of the seventies. This is regrettably very possible. The economic reality that we are all just beginning to wrestle with is that oil has actually priced itself out of the market. Current levels will force a rapid shift in hardware and behavior and a fair bit of hardship. A price move to $300 will actually shut down economic activity which is an unwanted consequence.

What can make this crisis far worse is a decline in deliveries due to loss of production. Right now the new price regime, which I think is already maxed out, is forcing demand to be curtailed directly freeing up production and in the process rebuilding reserves. This process has only begun. We need $100 oil and the world awash in oil to restore some level of confidence. We can survive that. At twice the price, we are looking at a global economic depression sparing no one.

This makes direct interference in the CO2 end of the business terribly ill-timed and actually inappropriate. Everyone in the world is now working at reducing their oil footprint as fast as possible. It hardly needs a push, and such steps can be very damaging.

We already know that several strategies now exist to comfortably get us out of the oil business and onto a sustainable protocol. Just read my many posts on the various options. Ethanol from cattail farming is a gimme and the advent of printed solar cells will produce a distributed peak energy supply for transport very soon. Both will be very price competitive.

What I find frustrating is that the current scenario was clearly developing and was certainly obvious to astute observers even several years ago. In fact I personally predicted that the price shift would arrive during the last year of the current president’s term of office. And I was hardly an insider. The industry has known that this day was unavoidable, but they had no answers either. The result has been that no preparation was promoted except a little silliness over corn ethanol.

We now have to move our economy on a dime to avoid the worst effects.

So what about CO2? The argument that CO2 is the cause of the very real phenomenon of global warming is likely very misplaced. The science itself has been forcefully challenged and is difficult to actually prove anyway. A warming climate is certainly not proof. We have a soft theory made up to fit facts on the ground that appear to be independent of it. The temperature experience that we have uncovered for the whole of the Holocene alone tells us that we had better be a lot better prepared before we attempt to link CO2 levels to apparent global temperatures.

The global economy is now beginning a transition over to an energy regime that will eschew fossil fuels, just as we transitioned out of using wood for fuel. It will not take very long and will be largely done in the next two generations.

That then leaves us with the question of what to do with the surplus CO2 in the atmosphere. Once we stop adding to the inventory, just letting nature take its course is a very viable option. It will surely take centuries but we can expect a steady increase in biomass to offset the surplus CO2.

I personally see a far better answer for humanity. Without question, the addition of elemental carbon to all our soils promotes a vast increase in general fertility and general soil nutrient stability. The argument is also made that it promotes a sharp increase in biological activity that sometimes releases carbon. That is still easily fixed by the expedient of adding a major surplus of carbon, as exists in the original terra preta soils.

It can be easily accomplished using both primitive methods and now with solid technological means.

The one other thing that we need to do is to completely revegetate the Sahara and the Sahel. This is a tall order that is best done over a couple of centuries. The remaining deserts can also be so converted, provided we are able to tap atmospheric water. The huge benefit of this is to retain in the northern hemisphere a huge amount of heat and moisture now lost to the desert, as well as all the carbon we ever produced.

The hemisphere will become even more suitable for agriculture and we may even make the boreal forest partially productive for agriculture.

The problem we all face is how to guide the political drive to rein in CO2 production into beneficial protocols such as I have described. The technical problem is thought daunting, and my proffered solutions are also thought daunting. That means that most minds simply cannot comprehend the actual scale of what they want. Yet I think that it can all be done by the simple expedient of modestly empowering and educating every individual agriculturist on the globe. We only have to recall the organizational achievement of microfinance by Muhammad Yunus.

Wednesday, July 9, 2008

UFO Enigma

Mankind is on the verge of achieving energy efficient spaceflight. And the only thing holding us up from complete mastery of the physical universe is a major omission in our understanding of mathematics. Most of what has been accomplished has been done in the two hundred years since we invented the steam engine. The rest is prelude.

Biological engineering can give us a very long life and allow us to prosper in space while completely avoiding the multiple hazards of Earth. It is all happening now.

My point is that this may have already happened. I have spoken of the harshness of ice age conditions, but that may be unfair. We had the tropics and we had huge coastal plains that would have been ocean moderated for much of the globe. This provided ample room for huge human populations to develop.

And then mankind leapt into space and was happy with the result. It was a simple step to properly reorient the crust ending the northern Ice Age and liberating the vast temperate land areas while giving up the coastal plains.

It was perfectly logical to allow the many surviving hunter gatherers to pick up the pieces and rebuild a new world. It has taken us 12,900 years and it is obvious that we did not break any records. Our understanding today is built on a finite set of mathematical tools and a ton of good empirical knowledge. For most of that history, we have resisted advancing that knowledge.

The only thing holding us back from moving into space is the need for an effortless one g thruster that allows a human habitat to prosper. The rest we have. Would you hesitate if it meant living for hundreds of years?

An extant space-faring human population of billions living in our solar system would completely explain the much-scorned UFO phenomenon. It certainly removes the problem of motive. Now we know why they bother and why they do not communicate. There is no mystery for them. I would only be saddened if they did not experience compassion for us who live in the gravity well in ignorance of our options and potential lifespan.

Today we know what man can really accomplish, because we have done it. We had that exact same opportunity for fifty thousand years before the Pleistocene Nonconformity. Ask yourself, do you think that we did not do as much then?

Surprisingly, a close reading of the material that has come down to us embedded in our cultural artifacts has tended to turn up conforming tales that fit just such a history. Or perhaps it is all shared wishful thinking.


It is a trivial matter to build an effective space habitation once man himself is modestly re-engineered to prosper in the necessary environments. Everyone has heard of the Dyson sphere, which is perhaps a step too far. I expect that we will simply inflate a balloon anchored to the ends of a central axis and spin it around that axis up to one gee at the equator. Cargo vessels can dock at the axis ends (zero gravity), and material may then be lowered to the equator. A suspended structure can be built from the equator inward, possibly occupying half to a third of the contained volume while maintaining useful gravity throughout. The outer wall can be readily thickened with additional balloon wall material and expansive foams that also bind, forming stressed-skin panels. A structure with a radius of 2000 feet could house a million individuals, all living with ample gravity.
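The required spin rate for a one-gee rim follows from the centripetal relation a = ω²r. A quick sketch for the 2000-foot radius mentioned above (g and the foot-to-metre conversion are standard values; the radius is the figure from the paragraph):

```python
import math

g = 9.81                    # m/s^2, target rim acceleration (one gee)
radius_m = 2000 * 0.3048    # the 2000-foot radius, in metres

omega = math.sqrt(g / radius_m)   # rad/s, from a = omega^2 * r
period_s = 2 * math.pi / omega    # seconds per revolution
rpm = 60 / period_s

print(f"{rpm:.2f} rpm, one revolution every {period_s:.0f} s")
# roughly 1.2 rpm, a period just under a minute
```

At that radius the rotation is leisurely, which matters: spin rates above a few rpm are generally thought to cause disorientation, so a 2000-foot habitat sits comfortably inside that limit.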

Thousands could be built in the asteroid belt and never be noticed. And an efficient outer surface of nano solar cells would be as black as coal.

I can do all this today with today's technology. I am only missing a cheap one gee thruster accessing limitless energy. Perhaps dear reader you will be inspired to invent such a device.




Tuesday, July 8, 2008

July Sea Ice

I want to show you these two trend lines for polar sea ice. They very much reflect what has been experienced. This year we have bounced back to the norm, and we can expect little this year that compares to last year's drama. Last year almost no one else was watching, while today it seems everyone is.


North Pole:

http://nsidc.org/data/seaice_index/n_plot.html

South Pole:

http://nsidc.org/data/seaice_index/s_plot.html

Without question, we have a very clear twenty-year negative slope for the north, consistent with a steady linear surplus of heat annually injected into the Arctic. Last year we saw that the annual melt is now starting to go non-linear. This is hugely masked by the annual coverage of the one-year ice, but rest assured that the long-term ice is now shrinking very fast and will be all gone within the next several years, possibly as soon as 2012. This masking effect kept everyone asleep until submarine survey work in 2000 disclosed that sixty percent of the perennial ice had disappeared.
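The twenty-year negative slope can be quantified with an ordinary least-squares fit to the extent series behind the NSIDC plots linked above. The sketch below uses made-up numbers purely to show the calculation; the real monthly data are what the NSIDC index provides:

```python
# Least-squares slope of a declining extent series. The numbers here
# are synthetic, constructed only to illustrate the fit; substitute
# the real NSIDC monthly extents (million km^2) for actual work.
years = list(range(1988, 2008))
extent = [7.5 - 0.07 * (y - 1988) for y in years]  # synthetic decline

n = len(years)
mean_x = sum(years) / n
mean_y = sum(extent) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, extent))
         / sum((x - mean_x) ** 2 for x in years))

print(f"trend: {slope:.3f} million km^2 per year")  # -0.070 by construction
```

A departure from a straight-line fit of this kind (growing residuals in the final years) is exactly what "going non-linear" would look like in the data.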

Now that we have this chart, it appears likely that the majority of the sixty percent loss actually took place during the decade of the nineties, rather than stretched out over the preceding forty years. The apparent rapid decline we are now witnessing fits this scenario very well and is well beyond my most conservative expectations.

An inspection of the current sea ice cover reveals that the extent of the Arctic Sea currently exhibiting one hundred percent ice cover is likely a mere twenty percent. Even more curious, this concentration is scattered throughout the Arctic as a result of wind concentration. There are no apparent huge zones of embayment, which I suspect was the expectation a couple of decades ago. Everything is floating and drifting.

In fact, it is possible to speculate that the right combination of winds could even open up the North Pole for shipping, and even all sorts of alternative circumpolar routings. The lesson here, though, is that the winter sea ice is ample enough to likely always represent a formidable barrier to actual summer shipping. Oh well, it was a nice idea.

This means that the perennial ice is warming up nicely as it circulates and is certainly disappearing very quickly. All things point to the Arctic being in the equivalent of its spring breakup phase. And this season is not demonstrating any reversal of this activity. I think that essentially all long-term ice will be gone by 2012. We will still have the annual ice cover and its melting to sort out and possibly understand.

Of course, if the sunspot theorists are right this is likely our very last warm Arctic summer for a long time. We are currently between cycles and the onset of sunspots is long overdue. That means that two popularized scientific theories, both based on far too incomplete data bordering on singular coincidence are holding diametrically opposed positions. Do you wonder why wise politicians are trying to keep their heads down?

I suppose that the most amusing scenario would be to see nothing much happen. That means that this modest warming matures and leaves us with a warm period not unlike many before and explained exactly the same way. That would allow both theories to be quietly forgotten while we reorganize industry away from burning fossil fuels anyway.

Monday, July 7, 2008

Exploding Asteroid Hypothesis Strengthens

Ohio Diamonds linked to Arctic Diamond Fields

We observe that there is a straight line between this Ohio site described in the attached article, the Arctic diamond fields, and the impact craters in the Carolinas. We can reasonably assume that the primary impact took place on the diamond fields, obviously penetrating the ice and causing a great amount of debris to be hurled into the Ohio Valley. Additional parts of the asteroid obviously impacted in the Carolinas. Little of the evidence for this extreme event is terribly obvious today, but now that we know it is likely to be there, and also to be very extensive, we can start looking.

It seems very unlikely that any argument for glacial transport can hold up as an alternative.

I would like to see a more extensive search for the 12,900-year horizon and the related charcoal. The extraordinary burn-off surely succeeded in leaving an uncommon charcoal zone throughout the eastern USA, and the event obviously sent a shock wave that likely killed off the majority of the megafauna in both North America and Siberia. Survivors needed to be in the lee of a natural obstruction, and then they had to survive the heat wave as well.

The initial explosions likely took place over Siberia, culminating with the primary impact in the ice-shielded diamond field area. Other parts of the incoming object likely exploded over the Carolinas as they neared the surface. It is worth observing that the mammoth evidence in Siberia strongly supports just such an abrupt extinction, as was commented on decades ago by others. Animals died with food in their mouths.

This impact event was almost ideally placed to promote a shift of the crust thirty degrees south, taking the northern ice cap centered on Hudson Bay out of the polar region. Had I ordered it up to do exactly this job, I could hardly have done better. So all you enthusiasts for divine intervention now have something to chew on. This put the Caribbean into the tropics and turned the Gulf Stream into a powered-up deicing machine.

When I laid out the arguments for the Pleistocene Nonconformity, I was hardly going to argue for a silver bullet. I tried to work around something far uglier. And now we have a silver bullet that also damaged the one continent that was clearly both barely populated and home to the saber toothed tiger. It needed to wipe out the mega fauna to make it as hospitable as it is today.

I do not like amazing coincidences. This has the signature of a planned human Terraforming project. Otherwise, it is simply too good to be true. And to paraphrase Sherlock Holmes, when all other explanations are eliminated, one must consider the unthinkable. Humanity had the time on Earth to do this in spades. They certainly had several convenient tropical homelands to develop an advanced civilization in.

And once the resources and knowledge existed, it was a simple step to execute this crustal shift program. The crust may even have shifted in the past to demonstrate the feasibility. All that was needed was the knowledge to be able to send a large mass into the appropriate orbit. And then to get everyone well out of the way. That means that our predecessors became space faring and have spent the last twelve thousand years elsewhere, but obviously not too far away if the UFO phenomenon means anything at all. In fact this provides us with a very believable and satisfying UFO paradigm to work with.

We can assume that these hypothetical humans have genetically modified themselves to prosper in space and may not be in any rush to live here. And they have allowed the recovering earth to be an interesting experiment in natural human development.

I have sketched out an unusual paradigm for us to contemplate. It is vastly more real and possible to me than it can ever be to you. It resolves a whole range of unspoken questions that have been hanging over our heads and studiously ignored. Like what was modern man doing for fifty thousand years before the abrupt end of the ice age? Particularly in view of what we have done in the 12,900 years since.

Exploding Asteroid Theory Strengthened by New Evidence Located in Ohio, Indiana

Was the course of life on the planet altered 12,900 years ago by a giant comet exploding over Canada? New evidence found by UC Assistant Professor of Anthropology Ken Tankersley and colleagues suggests the answer is affirmative.

Date: 7/2/2008


By: Carey Hoffman

Geological evidence found in Ohio and Indiana in recent weeks is strengthening the case to attribute what happened 12,900 years ago in North America -- when the end of the last Ice Age unexpectedly turned into a phase of extinction for animals and humans – to a cataclysmic comet or asteroid explosion over top of Canada.

A comet/asteroid theory advanced by Arizona-based geophysicist Allen West in the past two years says that an object from space exploded just above the earth’s surface at that time over modern-day Canada, sparking a massive shock wave and heat-generating event that set large parts of the northern hemisphere ablaze, setting the stage for the extinctions.

Ken Tankersley

Now University of Cincinnati Assistant Professor of Anthropology Ken Tankersley, working in conjunction with Allen West and Indiana Geological Society Research Scientist Nelson R. Schaffer, has verified evidence from sites in Ohio and Indiana – including, locally, Hamilton and Clermont counties in Ohio and Brown County in Indiana – that offers the strongest support yet for the exploding comet/asteroid theory.

Samples of diamonds, gold and silver that have been found in the region have been conclusively sourced through X-ray diffractometry in the lab of UC Professor of Geology Warren Huff back to the diamond fields region of Canada.

The only plausible scenario available now for explaining their presence this far south is the kind of cataclysmic explosive event described by West’s theory. "We believe this is the strongest evidence yet indicating a comet impact in that time period," says Tankersley.

Ironically, Tankersley had gone into the field with West believing he might be able to disprove West’s theory.

Tankersley was familiar through years of work in this area with the diamonds, gold and silver deposits, which at one point could be found in such abundance in this region that the Hopewell Indians who lived here about 2,000 years ago engaged in trade in these items.

Prevailing thought said that these deposits, which are found at a soil depth consistent with the time frame of the comet/asteroid event, had been brought south from the Great Lakes region by glaciers.

"My smoking gun to disprove (West) was going to be the gold, silver and diamonds," Tankersley says. "But what I didn’t know at that point was a conclusion he had reached that he had not yet made public – that the likely point of impact for the comet wasn’t just anywhere over Canada, but located over Canada’s diamond-bearing fields. Instead of becoming the basis for rejecting his hypothesis, these items became the very best evidence to support it."

Additional sourcing work is being done at the sites looking for iridium, micro-meteorites and nano-diamonds that bear the markers of the diamond-field region, which also should have been blasted by the impact into this region.

Photo caption: Ken Tankersley seen working in the field in a cave in this publicity photo from the National Geographic Channel.

Much of the work is being done in Sheriden Cave in north-central Ohio’s Wyandot County, a rich repository of material dating back to the Ice Age.

Tankersley first came into contact with West and Schaffer when they were invited guests for interdisciplinary colloquia presented by UC’s Department of Geology this spring.

West presented on his theory that a large comet or asteroid, believed to be more than a mile in diameter, exploded just above the earth at a time when the last Ice Age appeared to be drawing to a close.

The timing attached to this theory of about 12,900 years ago is consistent with the known disappearances in North America of the wooly mammoth population and the first distinct human society to inhabit the continent, known as the Clovis civilization. At that time, climatic history suggests the Ice Age should have been drawing to a close, but a rapid change known as the Younger Dryas event instead ushered in another 1,300 years of glacial conditions. A cataclysmic explosion consistent with West's theory would have the potential to create the kind of atmospheric turmoil necessary to produce such conditions.

"The kind of evidence we are finding does suggest that climate change at the end of the last Ice Age was the result of a catastrophic event," Tankersley says.

Currently, Tankersley can be seen in a new documentary airing on the National Geographic channel. The film "Asteroids" is part of that network’s "Naked Science" series.

The new discoveries made working with West and Schaffer will be incorporated into two more specials that Tankersley is currently involved with – one for the PBS series "Nova" and a second for the History Channel that will be filming Tankersley and his UC students in the field this summer. Another documentary, this one being produced by the Discovery Channel and the British public television network Channel 4, will also be following Tankersley and his students later this summer.

As more data continues to be compiled, Tankersley, West and Schaffer will be publishing about this newest twist in the search to explain the history of our planet and its climate.

Climate change is a favorite topic for Tankersley. "The ultimate importance of this kind of work is showing that we can’t control everything," he says. "Our planet has been hit by asteroids many times throughout its history, and when that happens, it does produce climate change."

Friday, July 4, 2008

Magnetic Refrigeration Advances

Hot on the news about nano solar, we have this item about advances in magnetic refrigeration. I draw your attention to two points made in this report. The first is that the energy efficiency is at the 60% level, compared to 40% for current technology, which has been extant for almost a hundred years. The fact that mechanical devices such as valves and compressors also disappear is a bonus. This means that a refrigerator will consist, as always, of an insulated box, with a much simpler cooling panel doing the work. The redesign and conversion will be swift.
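As a rough sanity check on those percentages, the sketch below shows what moving from 40% to 60% efficiency means for electricity consumption at the same cooling output. The annual cooling load figure is purely hypothetical, chosen only to make the arithmetic concrete:

```python
# Back-of-envelope comparison of electricity use for the same cooling load,
# using the efficiency figures quoted in the article (illustrative only).

def annual_electricity_kwh(cooling_load_kwh, efficiency):
    """Electricity consumed to deliver a given cooling load at a given
    overall efficiency (expressed as a fraction)."""
    return cooling_load_kwh / efficiency

cooling_load = 1000.0  # hypothetical annual cooling load in kWh

conventional = annual_electricity_kwh(cooling_load, 0.40)  # gas-compression
magnetic = annual_electricity_kwh(cooling_load, 0.60)      # magnetic cooling

savings = 1.0 - magnetic / conventional
print(f"conventional: {conventional:.0f} kWh, magnetic: {magnetic:.0f} kWh")
print(f"relative savings: {savings:.0%}")
```

In other words, the same cooling job would take roughly a third less electricity, before counting any gains from dropping the mechanical parts.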

The other point made is that an operating range of 100 degrees is claimed. When I reviewed work in this field around two years ago, the best operating range for the best magnetics was around 10 to 20 degrees, which represented a major stumbling block. This is even more important than the efficiency gain.

This makes completely new applications of refrigeration feasible and makes the reengineering of old applications a must.

The first and most obvious fix is the creation of solar-powered magnetic air conditioning units. This would eliminate that component of grid load and make most of the benefits freely available during the hottest hours. The resulting units should be cheap enough to sell worldwide. I can almost envisage a third world residence with its obligatory TV being run off the solar panel and air conditioning unit.

The second important fix is the manufacture of stand-alone combined units that simply produce atmospheric water for adjacent crops. These two innovations permit the cost to drop below the thousand-dollar point. And once mass production kicks in, the cost will continue to come down.

The point I am making is that there are now no remaining technology road blocks.

May 19, 2008

Magnetic refrigeration moves on

Nanocomposites produced from metallic glasses could make promising magnetic refrigeration materials according to new work by scientists in France. The materials are as good as the best currently available magnetic refrigerants with some added advantages. Refrigerators using such materials would be environmentally friendly and more efficient than existing devices that rely on a vapour cycle.

"Magnetic refrigeration is an environmentally friendly cooling technology, unlike the gas-compression refrigerators used today," team member Stéphane Gorsse of the Institute of Condensed Matter Chemistry of Bordeaux (ICMCB-CNRS) told nanotechweb.org. It uses no ozone-depleting, hazardous chemicals or greenhouse gases (such as hydrofluorocarbons used in conventional refrigeration systems). Moreover, the energy efficiency can reach up to 60%, compared to just 40% for the best gas-compression refrigerators.

Original material

And that's not all: current magnetic refrigerants are only efficient in a narrow temperature range of a few degrees above and below their transition temperature. The new material is the first to efficiently perform in a wide temperature range of about 100 K. Moreover, the working temperature and operating range can be tailored by tuning the composition and manipulating the microstructure.

"Our material is original because its properties combine advantages of crystallized and amorphous materials due to its unique microstructure: it is a nanocomposite made of gadolinium nanocrystallites embedded in a gadolinium-aluminium-manganese (Gd60Al10Mn30) metallic glass matrix," explained Gorsse.

Unique properties

Metallic glasses are still relatively "immature" materials and have few applications – mainly in sports equipment (zirconium-based metallic glasses in golf clubs, for example). But these materials exhibit unique properties thanks to their disordered atomic structure.

Gorsse and colleagues made their nanocomposite by rapid quenching of melt to avoid crystallization and to form a metastable disordered amorphous solid (the metallic glass). The glass was then subjected to a heat treatment, which needs to be stopped early to prevent the glass from fully crystallizing.

"Our material is as good as the best currently available materials that are crystallized and which exhibit first-order transitions and strong magnetocrystalline coupling," said Gorsse. "These materials present several disadvantages compared to ours – they have highly hysteric and hard magnetic behaviour, which reduces the efficiency of the cooling process since it leads to energy losses. Also, structural changes in these materials promote crack nucleation and propagation that can cause severe damage to the refrigerant material during cycling."

The microstructure of the nanocomposite (volume fraction, size and composition of the nanocrystallites formed in situ), and thus the resulting magnetocaloric properties and refrigeration capacity, depends on the heat treatment temperature and time. The researchers therefore plan to study and model how the microstructure of their material evolves during heat treatment and how the glass composition affects crystallization. "We also hope to identify and produce the ideal microstructure that gives the best material with improved magnetic refrigeration and working temperature," revealed Gorsse.

The work was published in Appl. Phys. Lett.

About the author

Belle Dumé is contributing editor at nanotechweb.org

Thursday, July 3, 2008

Tunguska Implications

This was published in Nature by Duncan Steel to commemorate the Tunguska event centenary. It particularly describes the various attempts made to explain the event over the past century.

The fact not mentioned too loudly is that the vast majority of meteorites are stony and clearly capable of the same behavior. The incoming trajectory was likely comparable to a comet-like orbit coming from the direction of the sun and arriving at speed.

What is important today is that we now know that a rapidly arriving large stony mass will be pulverized as it passes through the atmosphere leading to a massive explosion and heat release. Even without a visible touchdown we have seismic activity.

The big event associated with the Pleistocene nonconformity was vastly larger, but that does not mean that the events of Tunguska were not mirrored. It is now clear that the heat front burned off all of North America, causing the collapse of the mega fauna. The shock wave damaged the whole northern hemisphere, even where heat was not a problem.

The existence of apparent impact craters in the Carolinas without debris in evidence conforms to air bursts along the entry path. The interaction of supersonic shock waves and air bursts with the surface is as yet not understood. We are seeing some rather broad hints.

I have already pointed out that the location of the disturbance was well situated to release the crust allowing it to settle into today’s configuration. This all requires vastly less kinetic energy than I had previously anticipated and described. Why use a hammer when it appears a tap will do? It was still the mother of all meteoric events during human history.

I had not anticipated a properly timed meteor strike with a flight path rather well aimed to achieve maximum effect. That was calling for a remarkable coincidence without any evidence.

Those readers who have not read my article on the Pleistocene Nonconformity are advised to do so since we tackle the key issue of crustal stickiness there.

It is clear that the risk of a meteor striking the earth and causing massive damage is currently underestimated. Piecing together the Tunguska mystery has clarified many likely events in our past.

This also underlines the need for a global meteorite defense net and for kinetic high-orbit missiles able to intercept and disturb a meteor's path. It need not be overbuilt in order to do its job. It just needs to be available for a one-off need every century or so. Our hardware is likely up to the scanning job even now.

The statistical risk is very low but is not zero. The risk that such an event could destroy centers of humanity is also very low but not zero. Integrating a simple defense measure with our ongoing research efforts is a simple method of reducing even those odds. And as we progress in space exploration and exploitation the odds become even longer.

Today our culture is global, thus making a Tunguska like event survivable. Little comfort for the casualties, though.

Duncan Steel // Published online 25 June 2008 | Nature 453, 1157-1159 (2008)

PLANETARY SCIENCE: TUNGUSKA AT 100

"Sooner or later, it was bound to happen. On June 30, 1908, Moscow escaped destruction by three hours and four thousand kilometers - a margin invisibly small by the standards of the universe."

So begins Rendezvous with Rama, a 1972 novel by Arthur C. Clarke in which mankind learns the hard way about the dangers posed by incoming asteroids. The 2077 impact in northern Italy that Clarke goes on to describe is fictional: the 1908 blast was real. The early morning of 30 June 1908 saw, in an area around the Stony Tunguska river, the most explosive cosmic impact in recent history, hundreds of times more powerful than the atomic weapons set off over Hiroshima and Nagasaki. And yet, in part because it happened so far from civilization, and in part because it left no crater, it has not always been recognized as such. For decades it existed in a strange realm between science and pseudoscience, blamed on antimatter, black holes and alien spacecraft as easily as on a very fast bit of interplanetary refuse, and developing a mystique that has seen it associated with everything from energy drinks and rock bands to military missiles and The X-Files.

The approximate site of the blast's epicentre is now marked by a totem pole that researchers have dedicated to Agdy, the god of thunder in local mythology. Getting there is quite a trek, but the fascination of the site still draws an intermittent stream of scientists to the remote wilderness about 1000 kilometres north of Lake Baikal; they leave offerings at the totem pole to commemorate the trek. In the years directly after the blast, though, no one came at all. The first researchers did not arrive until the 1920s.

That does not mean there was no significant contemporary evidence to bring to bear. Siberia was and is an empty place - but a blast which, had it happened over Chicago, would have been heard from Georgia to the Dakotas, still drew a lot of attention. In the days following the blast, A.V. Voznesenskij, the director of the Irkutsk magnetic and meteorological observatory near Lake Baikal, began collecting accounts that are vivid with detail. There are people being knocked off their feet, a man needing to hold onto his plough to avoid being swept away by a powerful wind, the feeling of great heat "as if my shirt had caught fire", herds of hundreds of reindeer being killed, trees set alight by the radiance of the fireball only for the flames to be snuffed out by the subsequent blast wave. And the reports are unequivocal on the source of the blast. G.K. Kulesh, head of a meteorological station at Kurensk, 200 kilometres from the epicentre, told Voznesenskij: "A meteorite of very enormous dimensions had fallen." (G.K. Kulesh: "There appeared in the northwest a fiery column … in the form of a spear. When the column disappeared, there were heard five strong, abrupt bangs, like from a cannon, following quickly and distinctly one after another … there had been a strong shaking of the ground, such that the window glass was broken in the houses … It is probably established that a meteorite of very enormous dimensions had fallen.")

In the days after the blast, much of Europe experienced eerie 'bright nights': readers wrote to The Times in London, remarking that its columns could be read outdoors at midnight. Polarization measurements are consistent with this being due to sunlight scattered by dust in the very high atmosphere; observatories recorded increased atmospheric opacity and scattering across the Northern Hemisphere. This spreading dust may have been due to a plume ejected backwards along the incoming object's path by its explosion. Such plumes were seen on Jupiter when the fragments of Comet Shoemaker-Levy 9 slammed into it in 1994; hydrodynamic modelling by Mark Boslough and his colleagues at Sandia National Laboratories in Albuquerque, New Mexico, indicates that a similar terrestrial plume could be expected for an impact such as that at Tunguska.

There was, however, one good reason to doubt that a small asteroid was involved: the belief of the time that this would deliver a valuable hunk of iron to the surface. The Russian meteorite hunter Leonid Kulik, who led the first expedition to the epicentre in the 1920s, obtained funding from the Soviet government on the basis that he would find a valuable ore body there. But when he reached his goal in 1927 he found no metal. Nor did he find the crater that an impact was expected to leave. (There are now claims that nearby Lake Cheko might be such a crater, but these are widely disputed.) There were clear signs of violence - trees knocked flat over a vast swath of land - but no big hole in the ground. What could have happened?

In 1930, US astrophysicist Harlow Shapley suggested that the lack of a crater was due to the nature of the impactor. If it had been a comet, and comets were light and fluffy, then it would have exploded at altitude. This idea persisted for decades: in 1982 some planetary scientists were willing to postulate the extraordinarily low density of 3 kilograms per cubic metre in order to explain Tunguska in terms of the blast from a disintegrating comet.

Other explanations were even more far fetched than candyfloss comets. Soviet science-fiction author Alexander Kazantsev realized, as Shapley had, that the best explanation involved an explosion at altitude, and suggested in 1946 that a nuclear-powered alien spaceship exploding just before landing might have been the culprit, an idea taken up eagerly and earnestly in the following decades.

A more scientifically promising possibility was naturally occurring antimatter, a suggestion made independently by various people at various times. In 1940, Vladimir Rojansky of Union College, Schenectady, NY, suggested that some meteors and comets might be made of antimatter - 'contraterrene' matter in the terms of the time - and that their odd behaviour might be detectable. (More than 30 years later Rojansky suggested that it would be worth checking if Comet Kohoutek was one of the antimatter ones.) In 1941, Lincoln LaPaz of OSU in Columbus published two articles in the magazine Popular Astronomy that argued that large terrestrial craters and the craterless Tunguska explosion were both due to antimatter meteors; he later wrote to the Soviet Academy of Sciences suggesting a search for anomalous isotopes at the site.

More than a decade later, Philip Wyatt, a graduate student at Florida State University in Tallahassee, and Boris Podolsky, author of a famous paper with Einstein exploring apparent paradoxes of quantum mechanics, went to a movie in which antimatter featured. Podolsky pointed Wyatt towards Rojansky's 1940 paper and suggested he look into the impacts idea. Wyatt - now the chief executive of the Wyatt Technology Corporation in Santa Barbara, California - says that he was "mostly interested in looking for residual radioactivity" and published some ideas on the subject in Nature.

This notion was expanded on by three eminent American scientists (including 1960 Nobel Prize winner Willard Libby and Clyde Cowan, co-discoverer of the neutrino) in 1965. Libby, the original developer of the carbon-14 dating technique, found support for the idea of an antimatter impact from what seemed to be an elevated carbon-14 level in tree rings around the world in 1909, suggesting that significant quantities of the isotope had been created by radiation given off when the antimatter annihilated itself on contact with the thicker layers of the atmosphere. Even at the time, though, there were good arguments against the idea: among other things, the first gamma-ray-detecting satellites were not seeing the tell-tale radiation from antimatter annihilation elsewhere in the nearby cosmos.

Even more extreme, in 1973 two University of Texas physicists suggested that the cause was a black hole passing through Earth. This was nothing if not fashionable: miniature black holes had just been postulated by Stephen Hawking as after-effects of the Big Bang. Again the explanation was incomplete and its implications - an exit on the other side of the planet, and a seismic signal lasting well after the initial impact - unobserved. Similar caveats apply to the intriguing hybrid idea, aired as recently as 1989, that the culprit was a deuterium-rich comet turned into a hydrogen bomb by the heat and pressure of its arrival in the atmosphere.

Another approach has been to suggest that, despite the straightforward implications of eyewitness accounts of a bright object zipping across the sky, the source of the blast was in fact beneath the surface. A recent example is a claim that it was due to a 10-million-tonne belch of methane that subsequently exploded high in the sky. Others see a geophysical source involving peculiar tectonic behaviour.

The fact that such ideas were entertained (and still are, in some circles) speaks both of a certain fascination with the fanciful and the abiding need to explain that confusing lack of a crater. The fact that, by the 1960s, various craters around the world had been accepted as meteorite strikes meant that the anomalous lack seemed all the more confusing. In 1993 that confusion was allayed, at least for most people, by Chris Chyba, Kevin Zahnle and Paul Thomas. With the help of computer simulations derived from nuclear weapons' tests they showed that a solid, stony object about 50 metres across - the most likely sort of thing in that size range to hit the Earth - would not be expected to reach the ground. There was no need to invoke weirdly low cometary densities - at the relevant speeds the shock wave generated within a solid body as it slams into the atmosphere would rip up an everyday rock just fine. Formations such as Meteor Crater in Arizona are left by tougher impactors made of metal; the shock waves don't get the better of them until they've reached the ground.

A similar explanation was arrived at by Jack Hills, working at Los Alamos National Laboratory in New Mexico with Patrick Goda, and both teams had been to some extent pre-empted by a Soviet team led by V.P. Korobeinikov, the work of which had not been widely appreciated in the West. These various models led to an estimate that the blast was equivalent to about 15 megatonnes of high explosive - bigger than all but the very largest thermonuclear weapons. However, work by Boslough indicates that the energy required to fit the observed phenomena could be rather less, around 3 to 5 megatonnes.
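The megatonne figures can be sanity-checked with simple kinetic-energy arithmetic. The sketch below assumes a spherical stony body of about 50 metres, a density of roughly 3000 kg per cubic metre, and entry speeds of 15 to 20 km/s; all three values are illustrative round numbers rather than figures from the article:

```python
import math

MEGATON_TNT_J = 4.184e15  # joules per megatonne of TNT equivalent

def impact_energy_mt(diameter_m, density_kg_m3, speed_m_s):
    """Kinetic energy of a spherical impactor, in megatonnes of TNT."""
    radius = diameter_m / 2.0
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius ** 3
    return 0.5 * mass * speed_m_s ** 2 / MEGATON_TNT_J

# A ~50 m stony body (density ~3000 kg/m^3) at typical entry speeds:
for speed_km_s in (15, 20):
    mt = impact_energy_mt(50, 3000, speed_km_s * 1000)
    print(f"{speed_km_s} km/s -> about {mt:.0f} Mt TNT equivalent")
```

The result lands in the single-digit megatonne range for these inputs, bracketed by Boslough's 3-5 Mt estimate at the low end and the older 15 Mt figure at the high end, which would require a somewhat faster or denser body.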

That analysis assumes that the impactor was a stony asteroid - but a comet is still a possibility. In 1978, L'ubor Kresák suggested the Tunguska impactor was a fragment of Comet Encke. The peak of an annual intense meteor shower associated with dust from Encke occurred around 30 June 1908, but because the meteors arrived from the direction of the Sun, the shower would not have been visible to the naked eye. What the eyewitnesses said about the direction of the Tunguska projectile is consistent with that idea. An analysis of many hundreds of possible pre-impact orbits for the object published in 2001, by a team that had been led by the late Paolo Farinella, indicated that an asteroidal orbit was more likely than a cometary orbit - but using that paper's definitions, Comet Encke, which takes just 40 months to orbit the Sun, has an asteroidal orbit. Another line of evidence, suggested in 1977, was that a comet might explain the carbon-14 signature reported by Cowan in the 1960s; a comet in space might naturally be thoroughly irradiated.

The question of what the object was is not purely academic. If Tunguska was indeed a 15-megaton event, it was rather unlikely - such things are expected only every 1,500 years or so. That calculation, though, assumes that the flux of near-Earth objects is constant over time. If the population of near-Earth objects is replenished from time to time by the break-up of a comet, then shortly after that break-up, impacts from Tunguska-sized fragments will be more likely. Earth may suffer near misses from Tunguska's dark and stealthy cousins every time it passes through Encke's dust stream - fragments too small to be easily observed, but big enough to cause quite a mess if they hit.
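The "every 1,500 years or so" figure translates into a per-century chance, under the same constant-flux assumption the article itself flags as questionable. A minimal sketch, modelling arrivals as a Poisson process:

```python
import math

def prob_at_least_one(mean_interval_years, window_years):
    """Probability of at least one event in a window, assuming events
    arrive as a Poisson process with the given mean interval."""
    rate = 1.0 / mean_interval_years
    return 1.0 - math.exp(-rate * window_years)

# One Tunguska-class event per ~1,500 years on average: the chance of
# at least one such event in any given century is then:
p = prob_at_least_one(1500, 100)
print(f"~{p:.1%} per century")  # roughly 6-7%
```

If the flux is instead clumped around passages through a comet's debris stream, the long-run average stays the same but the odds in any particular century could be considerably higher or lower.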

In Rendezvous with Rama, Clarke's solution to the threat of impacts was an asteroid search programme aimed at ensuring that such a catastrophe could never occur again: he called it Project Spaceguard. This became the name of a real-life programme, and that search continues. But 50-metre objects are too small to spot far in advance of their impact. So although another Tunguska coming out of the blue is not a likely event in any given June, it is not out of the question.

Duncan Steel is an astronomer and writer after whom Arthur C. Clarke once named a robot.


NEO News (now in its fourteenth year of distribution) is an informal compilation of news and opinion dealing with Near Earth Objects (NEOs) and their impacts. These opinions are the responsibility of the individual authors and do not represent the positions of NASA, Ames Research Center, the International Astronomical Union, or any other organization. To subscribe (or unsubscribe) contact dmorrison@arc.nasa.gov. For additional information, please see the website http://impact.arc.nasa.gov. If anyone wishes to copy or redistribute original material from these notes, fully or in part, please include this disclaimer.

Wednesday, July 2, 2008

Judicial Activism and CO2

This short item is not particularly earth shattering, but it sometimes takes a judge to force some common sense into the political debate. Especially when well-funded special interests will be quite happy to grease the opposing position, regardless of how tenable it is. We only need recall the tobacco war that sacrificed millions of citizens behind a tattered veil of junk science. The statistics were there for decades. Just as today the excess use of sucrose is clearly linked to obesity and diabetes.

It is very clear that smoke stacks pollute and dump huge amounts of CO2, and this has always been true. That this is a community issue is a given. That it is currently necessary is also a given. That politicians see no future in touching this issue is also rather obviously a given. Yet the solution is a nationwide standard that is acceptable for the present and with which all new plants must comply. New plants should also be able to challenge existing substandard plants for market share.

That is what could be done, and what the lobbyists will move heaven and earth to avoid. Yet this type of banal ethic is damaging citizens here and elsewhere. The judiciary has a hammer, and at least one judge is beginning to apply it. Let us support him and see if any real law can be promoted out of it.

Breaking News: Georgia judge blocks coal plant over CO2 emissions

The AP has the bombshell news. A judge has finally used the Supreme Court decision that carbon dioxide is a pollutant:

The construction of a coal-fired power plant in Georgia was halted Monday when a judge ruled that the plant’s builders must first obtain a permit from state regulators that limits the amount of carbon dioxide emissions.

The ruling, from Fulton County Superior Court Judge Thelma Wyatt Cummings Moore, is here [big PDF]. What did the judge find?

E&E News (subs req’d) explains:

Permit filings for the 1,200-megawatt Longleaf Energy Station coal plant, to be built by LS Power Group and Dynegy in Early County, Ga., did not include provisions detailing the plant’s CO2 emissions. Yet EPD permitted it anyway on grounds that while CO2 may be a pollutant, the gas was not subject to regulation under the act.

‘No question’ CO2 subject to regulation

Moore disagreed, saying the respondents’ position “is untenable.”

“There is no question that CO2 is subject to regulation under the act,” Moore wrote.

The judge also found that Georgia regulators failed to sufficiently consider best available control technology for the plant by allowing developers to forgo consideration of integrated gasification combined cycle technology that would have allowed for CO2 capture.

Kudos to the judge for bringing some climate sanity back into the coal-plant permitting process.

Tuesday, July 1, 2008

National Energy Program

Summer is clearly here and in full swing. The press is full of economic angst. People are discovering that $140 per barrel oil matters, and a great adjustment is now underway throughout the globe. The most visible effect so far is watching the airline industry downsize. Most of the other costs have yet to push their way through the economy. So far most economic participants are eating some of the losses in order to preserve their markets. This obviously cannot last, yet this slow response is likely the best response, since we are going to see a sharp retreat in oil prices as the decline in consumption bites into the economy. Who wants to shrink demand for a product that you will be struggling to replace next year?

In the meantime, and really in the background, the credit readjustment is also rolling through the US economy and is step by step reducing the supply of lending cash in the global economy. The bubble of excess cash has evaporated and hopefully does not slide below what the market requires. The comment has been made that this crisis is proportionally smaller than the savings and loan debacle of the eighties. We survived that quite handsomely. The important thing to remember is that the subprime disaster is rather localized and is quite open to smart intervention, even though we are on the road to getting FEMA instead.

We have explored every energy option that we could get our hands on in this blog for a reason. The core of our global economy is a sufficient supply of usable energy. Nothing else even counts. All other commodities are quickly replaceable with only a modest price shift. In fact, the commodity boom of the past three years is promoting a rapid expansion in global capacity that will soon place the entire globe into a permanent surplus position. Recall that today perhaps only a third of the global population is not yet fully participating in the global economy, and that third will finish its transition over the next generation.

In fact, we can expect a sharp expansion in food supplies over the next twelve months as farmers rush to take advantage of the current price regime. The same thing has already happened with the other commodities. The only thing that remains surprising is the apparent willingness of participants to maintain the high price structures for so long. The sellers are still making so much money that they do not yet feel the need to dump excess inventories. They may even believe that prices are going higher.

Right now everything is fully priced and suppliers are filling everyone’s boots while the credit contraction in the USA is shrinking surplus leverage out of the system. This whole situation is one headline away from a price breakdown and a global retrenchment. We may have to wait for a strong decline in US demand before the other shoe drops.

Returning to the energy market, my posts late last week should make it extremely clear that the first phase of the solar energy transition is now upon us. The initial price is set at $1.00 per manufactured watt. This is already cheap enough for a rapid rollout, and a single machine can produce the power equivalent of a nuclear power plant each year.
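The "power equivalent of a nuclear power plant" claim is worth a back-of-envelope check, since nameplate watts and delivered energy are not the same thing. A minimal sketch, where the plant size and capacity factors are my own assumed figures, not numbers from the original posts:

```python
# Rough sanity check of the $1.00/watt claim (all figures below are assumptions)
NUCLEAR_MW = 1000        # a typical large nuclear plant, about 1 GW nameplate
NUCLEAR_CF = 0.90        # nuclear plants run near full output year-round
SOLAR_CF = 0.20          # fixed solar panels deliver roughly a fifth of nameplate
COST_PER_WATT = 1.00     # the $1.00 per manufactured watt figure

# Matching nameplate capacity one-for-one
nameplate_cost = NUCLEAR_MW * 1e6 * COST_PER_WATT

# Matching annual energy output: solar needs extra nameplate to offset
# its lower capacity factor
energy_equiv_mw = NUCLEAR_MW * NUCLEAR_CF / SOLAR_CF
energy_equiv_cost = energy_equiv_mw * 1e6 * COST_PER_WATT

print(f"nameplate match: ${nameplate_cost / 1e9:.1f}B of panels")
print(f"energy match: {energy_equiv_mw:.0f} MW of panels, "
      f"${energy_equiv_cost / 1e9:.1f}B")
```

Under these assumptions, matching the plant watt-for-watt costs about a billion dollars of panels, while matching its annual energy takes roughly four and a half times that nameplate. Either way the order of magnitude supports the point that a $1.00/watt price makes a rapid rollout plausible.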

Since manufacturing is now on internet time, the transition will be utterly swift and powerfully supported by frightened Americans, who never again want to be beaten up at the gas pump.

There may have been no political support two years ago for a national energy policy, but I am certain that the presidency will go to the candidate who enunciates a credible energy program, rather than a carbon tax or the like. Americans are scared, and they want relief and reassurance now.