Wednesday, September 24, 2008

NASA Reports 20% Drop in Solar Wind Pressure


I find a twenty percent drop in the pressure of the solar wind unexpected, and one immediately wonders whether it is related to anything or means anything at all. Certainly the little that we know about sunspots seems to support a solar cooling cycle. And as far as we can tell, the effect on earth is a lagging indicator. So the enthusiasts are predicting several years of cold weather.

That means that we should stand by for several more cold, mean winters that bring us back below the twenty-year average. At least it is nice to approach each winter with a heightened sense of curiosity.

In any event this is unique over a fifty year span and the real magnitude is large. That alone is significant.

We now have a unique new variable that may mean absolutely nothing but can be blamed for other later unique events. Oh well.


Sept. 23, 2008: In a briefing today at NASA headquarters, solar physicists announced that the solar wind is losing power.

"The average pressure of the solar wind has dropped more than 20% since the mid-1990s," says Dave McComas of the Southwest Research Institute in San Antonio, Texas. "This is the weakest it's been since we began monitoring solar wind almost 50 years ago."

McComas is principal investigator for the SWOOPS solar wind sensor onboard the Ulysses spacecraft, which measured the decrease. Ulysses, launched in 1990, circles the sun in a unique orbit that carries it over both the sun's poles and equator, giving Ulysses a global view of solar wind activity.

Curiously, the speed of the million mph solar wind hasn't decreased much—only 3%. The change in pressure comes mainly from reductions in temperature and density. The solar wind is 13% cooler and 20% less dense.
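One way to sanity-check the figures: the wind's dynamic (ram) pressure scales as density times speed squared, so the quoted density and speed changes alone account for a drop of more than 20%. A quick sketch of that scaling (my arithmetic, not NASA's):

```python
def ram_pressure_ratio(density_ratio, speed_ratio):
    """Dynamic (ram) pressure scales as rho * v^2."""
    return density_ratio * speed_ratio ** 2

# 20% less dense, 3% slower than the mid-1990s baseline
ratio = ram_pressure_ratio(0.80, 0.97)
print(f"pressure is {ratio:.2f} of the mid-1990s value")  # 0.75, i.e. a drop of ~25%
```

Which is consistent with McComas's "more than 20%" even before the temperature reduction is folded in.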

"What we're seeing is a long term trend, a steady decrease in pressure that began sometime in the mid-1990s," explains Arik Posner, NASA's Ulysses Program Scientist in Washington DC.
How unusual is this event?

"It's hard to say. We've only been monitoring solar wind since the early years of the Space Age—from the early 60s to the present," says Posner. "Over that period of time, it's unique. How the event stands out over centuries or millennia, however, is anybody's guess. We don't have data going back that far."

Flagging solar wind has repercussions across the entire solar system—beginning with the heliosphere.

The heliosphere is a bubble of magnetism springing from the sun and inflated to colossal proportions by the solar wind. Every planet from Mercury to Pluto and beyond is inside it. The heliosphere is our solar system's first line of defense against galactic cosmic rays. High-energy particles from black holes and supernovas try to enter the solar system, but most are deflected by the heliosphere's magnetic fields.

"The solar wind isn't inflating the heliosphere as much as it used to," says McComas. "That means less shielding against cosmic rays."

In addition to weakened solar wind, "Ulysses also finds that the sun's underlying magnetic field has weakened by more than 30% since the mid-1990s," says Posner. "This reduces natural shielding even more."

Unpublished Ulysses cosmic ray data show that, indeed, high energy (GeV) electrons, a minor but telltale component of cosmic rays around Earth, have jumped in number by about 20%.

These extra particles pose no threat to people on Earth's surface. Our thick atmosphere and planetary magnetic field provide additional layers of protection that keep us safe.

But any extra cosmic rays can have consequences. If the trend continues, astronauts on the Moon or en route to Mars would get a higher dose of space radiation. Robotic space probes and satellites in high Earth orbit face an increased risk of instrument malfunctions and reboots due to cosmic ray strikes. Also, there are controversial studies linking cosmic ray fluxes to cloudiness and climate change on Earth. That link may be tested in the years ahead.

Some of the most dramatic effects of the phenomenon may be felt by NASA's two Voyager spacecraft. After traveling outward for 30+ years, the two probes are now at the edge of the heliosphere. With the heliosphere shrinking, the Voyagers may soon find themselves on the outside looking in, thrust into interstellar space long before anyone expected. No spacecraft has ever been outside the heliosphere before and no one knows what the Voyagers may find there.

NASA is about to launch a new spacecraft named IBEX (short for Interstellar Boundary Explorer) that can monitor the dimensions of the heliosphere without actually traveling to the edge of the solar system. IBEX may actually be able to "see" the heliosphere shrinking and anticipate the Voyagers' exit. Moreover, IBEX will reveal how our solar system's cosmic ray shield reacts to changes in solar wind.

"The potential for discovery," says McComas, "is breathtaking."

Tuesday, September 23, 2008

Vanadium Battery

I was reminded recently of the vanadium battery – see attached item from Wikipedia. The low energy density has kept this technology sidelined in the battery business since the apparent demand opportunity is in mobile applications. Now that the solar industry is about to become cheap and mainstream, this demand profile is about to radically change.

These batteries do not consume their own materials and respond quickly. They do not wear out. This makes them excellent for immobile applications such as the solar water generator we have already described. One can live with a few gallons of working fluid.

I am in possession of an article on a large system operated successfully in Japan in an industrial setting. It is still a modest beginning for the industry.

The redox couples are all vanadium species, so mixing is not a problem, and the electrolytes are separated by oxidation state into the appropriate tanks. This becomes very convenient.

The pentoxide form is sold in the $1.00 to $2.00 range, which is certainly cheap enough for the battery business, since it is not consumed.

Vanadium redox battery

From Wikipedia, the free encyclopedia


The vanadium redox (and redox flow) battery in its present form (with sulfuric acid electrolytes) was patented by the University of New South Wales in Australia in 1986 [1]. It is a type of rechargeable flow battery that employs vanadium redox couples in both half-cells, thereby eliminating the problem of cross contamination by diffusion of ions across the membrane. Although the use of vanadium redox couples in flow batteries had been suggested earlier by Pissoort [2], by NASA researchers, and by Pellegri and Spaziante in 1978 [3], the first successful demonstration and commercial development was by Maria Skyllas-Kazacos and co-workers at the University of New South Wales in the 1980s [4]. The vanadium redox battery exploits the ability of vanadium to exist in solution in four different oxidation states, and uses this property to make a battery that has just one electroactive element instead of two.

The main advantages of the vanadium redox battery are that it can offer almost unlimited capacity simply by using larger and larger storage tanks, it can be left completely discharged for long periods with no ill effects, it can be recharged simply by replacing the electrolyte if no power source is available to charge it, and if the electrolytes are accidentally mixed the battery suffers no permanent damage.

The main disadvantages with vanadium redox technology are a relatively poor energy-to-volume ratio, and the system complexity in comparison with standard storage batteries.

Diagram of a Vanadium Flow Battery

Operation

A vanadium redox battery consists of an assembly of power cells in which the two electrolytes are separated by a proton exchange membrane. Both electrolytes are vanadium based: the electrolyte in the positive half-cells contains VO²⁺ and VO₂⁺ ions, the electrolyte in the negative half-cells V³⁺ and V²⁺ ions. The electrolytes may be prepared by any of several processes, including electrolytically dissolving vanadium pentoxide (V2O5) in sulfuric acid (H2SO4). The solution remains strongly acidic in use.

In vanadium flow batteries, both half-cells are additionally connected to storage tanks and pumps so that very large volumes of the electrolytes can be circulated through the cell. This circulation of liquid electrolytes is somewhat cumbersome and does restrict the use of vanadium flow batteries in mobile applications, effectively confining them to large fixed installations, although one company has focused on electric vehicle applications, using rapid replacement of electrolyte to refuel the battery.

When the vanadium battery is charged, the VO²⁺ ions in the positive half-cell are converted to VO₂⁺ ions as electrons are removed from the positive terminal of the battery. Similarly, in the negative half-cell, electrons are introduced, converting the V³⁺ ions into V²⁺. During discharge this process is reversed and results in a typical open-circuit voltage of 1.41 V at 25 °C.
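The 1.41 V figure is the open-circuit value at 50% state of charge; the voltage shifts above and below it with the state of charge, roughly per the Nernst equation. A simplified sketch, assuming ideal behavior and ignoring proton activity (an illustration, not a cell design):

```python
import math

def vrb_ocv(soc, e0=1.41, temp_k=298.15):
    """Estimate open-circuit voltage of a vanadium redox cell.

    Simplified Nernst form: both half-cells shift with state of
    charge (soc), giving a factor of 2 on the RT/F term. Ignores
    proton concentration and activity coefficients.
    """
    R, F = 8.314, 96485.0  # gas constant J/(mol K), Faraday constant C/mol
    return e0 + (2 * R * temp_k / F) * math.log(soc / (1 - soc))

print(round(vrb_ocv(0.5), 2))  # 1.41 at half charge
print(round(vrb_ocv(0.9), 2))  # higher when nearly fully charged (~1.52)
```

The logarithmic term is small relative to e0, which is why the cell voltage stays usefully flat across most of the charge range.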

Other useful properties of Vanadium flow batteries are their very fast response to changing loads and their extremely large overload capacities. Studies by the University of New South Wales have shown that they can achieve a response time of under half a millisecond for a 100% load change, and allowed overloads of as much as 400% for 10 seconds. The response time is mostly limited by the electrical equipment.

Generation 2 Vanadium redox batteries (vanadium/polyhalide) may approximately double the energy density and increase the temperature range in which the battery can operate.

Energy density

Current production Vanadium redox batteries achieve an energy density of about 25 Wh/kg of electrolyte. More recent research at UNSW indicates that the use of precipitation inhibitors can increase the density to about 35 Wh/kg, with even higher densities made possible by controlling the electrolyte temperature. This energy density is quite low as compared to other rechargeable battery types, e.g. Lead-acid (30-40 Wh/kg) and Lithium Ion (80-200 Wh/kg).
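At roughly 25 Wh/kg of electrolyte, the low energy density translates directly into tank size, which is why the technology suits fixed installations. A back-of-envelope sizing sketch, taking the 25 Wh/kg figure from above and assuming an electrolyte density near 1.35 kg per litre (my guess for a sulfuric acid solution, not a figure from the article):

```python
def electrolyte_for(energy_kwh, wh_per_kg=25.0, kg_per_litre=1.35):
    """Rough electrolyte mass and volume for a target storage capacity.

    wh_per_kg: current-production energy density cited above.
    kg_per_litre: assumed electrolyte density (illustrative only).
    """
    mass_kg = energy_kwh * 1000.0 / wh_per_kg
    volume_l = mass_kg / kg_per_litre
    return mass_kg, volume_l

mass, vol = electrolyte_for(2000)  # a 2 MWh load leveler
# 80,000 kg of electrolyte, on the order of 60,000 litres of tankage
```

Trivial for a fixed installation with big tanks, but prohibitive for anything mobile.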

Applications

The extremely large capacities possible from vanadium redox batteries make them well suited to use in large power storage applications such as helping to average out the production of highly variable generation sources such as wind or solar power, or to help generators cope with large surges in demand.

Their extremely rapid response times also make them superbly well suited to UPS type applications, where they can be used to replace Lead-acid batteries and even diesel generators.

Installations

Currently installed vanadium batteries include:

  • A 1.5 MW UPS system in a semiconductor fabrication plant in Japan
  • A 275 kW output balancer in use on a wind power project in the Tomari Wind Hills of Hokkaido
  • A 250 kW, 2 MWh load leveler in use at Castle Valley, Utah

Monday, September 22, 2008

Bailout

It is difficult to stand by watching the American financial system pass through a period of upheaval not seen or even approached since the great depression and not continue to comment. So here goes.

First off, they have grabbed the bull by the horns and stepped up to truly backstop the banking system itself.
Bad loans or paper will be swallowed, restoring the banks' ability to lend to their good customers. And yes, their owners will be getting a haircut, as demonstrated by the haircut handed AIG. It is even conceivable that the portfolio now being acquired by the Federal Government will be unwound very successfully, although it is likely too much to ask for a profit. It could happen, though, that a rejuvenated AIG could quickly regain value on the market.

This is, after all, largely about a collapse in confidence triggered by a shift in mortgage portfolios from an 8% exposure to the present 22%. Boy, was that dumb!

The result, of course, is that at least a trillion dollars has been lost beyond recovery by these lenders. And let us understand this: the money is already in the economy, beyond the reach of the lenders in any way.

Therefore, when the Fed pays out a trillion dollars to bail out the obviously reckless lenders, it is actually preventing those lenders from having to contract their portfolios by forcing the liquidation of their good business, something that is effectively impossible anyway, as was so brilliantly demonstrated during the Depression.

And yes it is going to work. The only improvement to what they are doing would be to follow my mark to market strategy in order to salvage as much individual credit as possible to keep folks from walking away. The banks themselves would be well advised to apply this method since they have the regional footprint necessary.
The result would be a swifter recovery.

After this very sobering event, the real estate market will be good business for a generation as prices consolidate and recover.
They are also confronting the problem of short selling. I do not take it too seriously but perhaps some good may come out of that.

Friday, September 19, 2008

Thermophilic Bacteria Convert Agri-waste to Ethanol

This adds even more to the biofuel story. Suddenly we have a bug able to brew up any agricultural feedstock directly into ethanol. Of course, nothing is said whatsoever about the actual yield. It seems too much to ask for one hundred percent conversion, but from the sound of this, there is a low-temperature cooking process under way, not unlike that for yogurt. And it does sound like a protocol that could drain off the pregnant liquor while adding water until the biomass is completely consumed.

I am particularly pleased to see cardboard included.

This also sounds like it will be amenable to small batch work for a small operation. One can assume that the liquid reaches the ten percent plus level of ethanol concentration before it must be drained into a vaporizer or some other separation tool.

However, the lack of news releases over the past year speaks volumes to the present yield situation. If this is the problem, then we may have to wait a long while for the development to reach commercialization.

In any case, coupled with the other recent developments that I have posted on, it is clear that development of algae based solutions is in full swing and that the results are very tangible and happening very fast.

I was quite negative on attempts to convert cellulose-based materials, primarily because Mother Nature had not already done so. That suggested the problem was likely to be very resistant to resolution. That does not seem to be the case.

Now we are seeing a number of methods emerging.

These all lead to all forms of agricultural and forest waste achieving real economic value to the community at large as a direct source of fuel.

The promise of this technology is to be able to process all organic wastes into ethanol with a minimal input of heat, consuming little of the waste in the process. We may be replacing our landfills with rows of silos brewing at around 65°C. This is certainly superior to what we do now.

This revolution is coming about because of the revolution in real biological engineering. If nature does not supply the perfect organism, then a quick gene splice and we are in business. Nature had little reason to preferentially produce ethanol as a waste material; that it did so with yeast was the exception. That we can take other useful microorganisms down the same path allows us to chew through organic wastes, converting long-chain sugars such as cellulose into ethanol.

It is also allowing other organisms to convert organic feedstock directly into jet fuel.

This has all blown up over the past year or so, pushed by the realization that we need to end the fossil fuel business because it is unbalancing the environmental CO2 content. The price of oil is only a signal of real supply issues and encourages the influx of money. The real driver is the global recognition that environmental impact must be properly managed and that doing so is usually profitable.

I certainly expect to see many more breakthroughs announced in the algae business because it has a rapid research turn around.



September 16, 2008

Bacteria from Compost Could Provide 10% of UK Transport Fuel Needs

Dublin, Ireland [RenewableEnergyWorld.com]
http://www.renewableenergyworld.com/rea/news/story?id=53581

Often found in compost heaps, the bacteria that convert waste plant fiber into ethanol could eventually provide up to 10% of the UK's transport fuel needs, scientists heard last week at the Society for General Microbiology's Autumn meeting being held at Trinity College, Dublin.

Researchers from Guildford, UK have successfully developed a new strain of bacteria that can break down straw and agricultural plant waste, domestic hedge clippings, garden trimmings and cardboard, wood chippings and other municipal rubbish in order to convert them into useful renewable fuels for the transport industry.

"The bioethanol produced in our process can be blended with existing gasoline to reduce overall greenhouse gas emissions, help tackle global warming, reduce dependence upon foreign oil and help meet national and international targets for renewable energy," said Milner, Fermentation Development Manager of TMO Renewables Ltd, based in Surrey Research Park, Guildford.

The new strain of bacteria allows ethanol to be produced much more efficiently and cheaply than in traditional yeast-based fermentation, which forms the basis for most current commercial bioethanol production.

"Conventional ethanol production is energy-intensive, expensive and time-consuming as the barley malt or other material being brewed needs to be heated up as a mash in feedstock pre-treatment. Then it is significantly cooled from that high temperature to a lower temperature for yeast fermentation, only to be re-heated when it is later distilled into ethanol. Our process is much more energy-efficient," said Milner.

TMO's microbiologists screened thousands of different wild types of bacteria, looking for one that could survive high temperatures and feed off a wide variety of plant-based materials.

"We found some heat-loving bacteria in a compost heap, from the Geobacillus family, which in their wild form produce lactic acid as a by-product of sugar synthesis when they break down biomass," said Milner. "We altered their internal metabolism, adapting them to produce substantial amounts of ethanol instead."

"Our new microorganism, called TM242, can efficiently convert the longer-chain sugars from woody biomass materials into ethanol. This thermophilic bacterium operates at high temperatures of 60-70°C and digests a wide range of feedstocks very rapidly," said Milner.

The scientists estimate that some 7 million tons of surplus straw is available in the UK every year. Turning it into ethanol could replace 10% of the gasoline fuel used in this country. "As our process uses agricultural waste materials such as straw, wood, paper and plants and other cellulosic fiber from domestic and municipal waste, it provides significantly greater environmental and economic benefits than crop-derived biofuels, which some believe have contributed to the increased prices of basic food in so many countries," said Milner.

"We have recently completed commissioning the UK's first cellulosic ethanol demonstration facility — one of just a handful worldwide," said Milner. "We are constantly researching new, better ways to produce biofuels. We also believe that our process can be used successfully beyond biofuels to produce other high-value chemicals and drug ingredients that are currently derived from oil."

Thursday, September 18, 2008

Final Sea Ice

Needless to say, I am no longer alone in recognizing that the perennial sea ice will be gone in five years. When I posted the aggressive 2012 prediction late last summer, the consensus was many decades. This item shows that most are now bowing to the inevitable. The areal extent was slightly larger than last year's, but the unusual winds of 2007, which compacted the pack, did not recur.

We have no way of knowing for sure, but I expect that this year's actual ice loss was significant, though not as large as last year's. It is still a loss rather than a gain, and what is now obvious to everyone is that we are observing the dissolution of the perennial Arctic sea ice. As I posted in the past, the average loss per year is linear, inasmuch as roughly the same value M is extracted from the total each year. But now that the exposed area of the Arctic is increasing sharply, the value of M can even be expected to modestly increase as more solar energy is absorbed.

All this adds up to an accelerating collapse of the ice over the next four years. I have yet to see a reason to back off my 2012 prediction. Reversing the trend would require a much colder and much longer winter than last year's.

As I pointed out to my readers last year, this decline is all about the effect of a small incremental increase M in the heat available to the Arctic. As the total ice mass declines, the effect of M steadily increases, until it becomes the dominant factor when there is little of the original ice left. We are obviously there, and it can only get worse until the long-term ice is all gone over the next four years or so.
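The argument above can be made concrete with a toy model: a roughly fixed annual melt M chewing through a shrinking ice total, with M creeping upward as open water absorbs more heat. A sketch under those assumptions (all numbers illustrative, not measurements):

```python
def years_to_ice_free(total_ice, annual_melt, feedback=0.05):
    """Count years until the ice total is exhausted.

    total_ice, annual_melt: arbitrary units (illustrative only).
    feedback: fractional growth of the melt M each year as more
    exposed open water absorbs additional solar energy.
    """
    years = 0
    while total_ice > 0:
        total_ice -= annual_melt
        annual_melt *= 1 + feedback
        years += 1
    return years

# e.g. ice at 4 units with melt at 1 unit/year is gone in about
# four years; a stronger feedback on M brings that date forward
print(years_to_ice_free(4.0, 1.0))  # 4
```

The point is simply that a constant or growing M against a shrinking total produces exactly the end-loaded collapse described above.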

This summer, the melting and warming of the ice mass continued. Major parts of the super-thick floating ice sheets broke free, confirming the ongoing weakening and warming of even this ice. And this item reports that the new ice that is visible is weak and thin.

I would like to believe that the apparent slight reduction in solar energy is sufficient to induce a cooling of the Northern Hemisphere. Right now the evidence is pretty sketchy and not obvious, reports to the contrary notwithstanding.

We have also just had a lively hurricane season which shows that the equatorial heat engine is not shut down and is winding up again. Maybe with this blow out, we will now get a couple of quiet years.

Last year proved that none of this helped in predicting the upcoming winter. However a mild winter seems to presage a warm summer in the Arctic.

Arctic Sea Ice Season Underscores Accelerating Decline
Written by Dana Nuccitelli

Published on September 17th, 2008

Posted in Environmental & Climate Science

According to the National Snow and Ice Data Center (NSIDC), Arctic sea ice cover appears to have reached its minimum extent for the year, the second-lowest extent recorded since the dawn of the satellite era.

While above the record minimum Arctic sea ice extent set on September 16, 2007, this year further reinforces the strong negative trend in summertime ice extent observed over the past thirty years.

Despite overall cooler summer temperatures, the 2008 minimum extent is only 390,000 square kilometers (150,000 square miles), or 9.4%, more than the record-setting 2007 minimum. The 2008 minimum extent is 15.0% less than the next-lowest minimum extent set in 2005 and 33.1% less than the average minimum extent from 1979 to 2000.
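The quoted figures are mutually consistent; the 2007 record minimum they imply can be recovered from the difference and the percentage alone (a quick arithmetic check on the article's own numbers, not NSIDC data):

```python
diff_km2 = 390_000   # 2008 minimum minus 2007 minimum
pct_more = 0.094     # 2008 extent was 9.4% above 2007

min_2007 = diff_km2 / pct_more   # implied 2007 record minimum
min_2008 = min_2007 + diff_km2   # implied 2008 minimum

print(f"2007 minimum approx {min_2007 / 1e6:.2f} million km^2")  # 4.15
print(f"2008 minimum approx {min_2008 / 1e6:.2f} million km^2")  # 4.54
```

Both values sit well below the 1979-2000 average minimum, consistent with the 33.1% figure quoted above.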

This season further reinforces the long-term downward trend of sea ice extent.

Even though the sea ice didn’t retreat this year as much as last summer, “there was no real sign of recovery,” said Walt Meier of NSIDC. This year was cooler and other weather conditions weren’t as bad, he said.

“We’re kind of in a new state of the Arctic basically, and it’s not a good one,” Meier said. “We’re definitely sliding towards a point where the summer sea ice will be gone.”

Scientists have predicted that the Arctic will become ice free in the summer by the year 2013, if not sooner. This also does not bode well for global warming, since ice reflects sunlight whereas dark oceans absorb it.

On top of that, the Arctic ice melting trend has shifted. Normally the ice would reach its minimum extent by early September, but after the record melt of 2007, much of the ice reformed with much less thickness, allowing it to continue to melt through mid-September this year.

The Arctic is warming at a faster rate than the rest of the planet, and can be considered a ‘canary in the coal mine’. Right now, that canary is not in good health.

Wednesday, September 17, 2008

Colony Collapse Disorder Resolved

I got this article over at viewzone.com and am pleased to see an anomaly possibly resolved. Of course we will now have a blast of lawsuits and corporate dodging.

Viewzone is one of the first online magazines and is recommended. They published an article of mine titled ‘Pleistocene Nonconformity’. Go read it.

The collapse of the bee population had the signature of a rogue pesticide. We can see from this article the actual culprit. It is clear from the known distribution of the chemical that the probability of causation is approaching 95 percent, and it will take only a minimal effort to tighten that up.

This chemical needs to be banned immediately without further discussion.

It continues to be clear that the introduction of new products into agriculture must be by planned, step-by-step distribution that allows ample time for environmental impacts to emerge. This also suggests that the proper test area needs to be stepped up from the initial test field to township size.

Field tests almost certainly missed the impact on the bees. That would not have happened in a township sized test run for three years.

Fortunately, most products in use have long benign histories. It is only when we want to launch a totally new product that some thought is needed. This product had the red flags flying; why else would they have bothered with the coating protocol, only to discover later that it failed?

This is likely one more solution that is simply unusable.

And the pest in question is surely being managed by rotation today, as is that other pest, the corn borer (unless we are talking about the same thing), and since rotation is always called for in any event for sound soil management, quit being simply greedy.

At least now that the cause is understood (where is the press?), the stocks can be quickly rebuilt. I wish we could do the same with bird populations that are also hard hit by something other than global warming.

It is noteworthy that the raptor population has hugely recovered over the past thirty years with the change in human attitudes toward them. DDT may well have been the cause of the original decline. The only alternative to the DDT explanation is the mythical farm boy and his rifle.

But even with real predation by native hunters, the eagle populations are now increasing strongly. So it is reasonable to accept the DDT model.

This should mean carefully managed application, if that is even possible. The problems arise from excessive and reckless application of pesticides. It is all too easy to believe that more is better and to over-apply the product.

Honey bees are dying all over the globe. Here's why!

by Dan Eden for Viewzone

For over a year, the media has been reporting about the dramatic loss of bees in Europe and North America. As many as 50% to 90% of the bee populations have simply vanished, leaving their hives empty and forcing farmers to demand investigations to determine the cause.

At first it was only the honeybees that were decimated -- then the bumblebee populations began to disappear. Bumblebees are responsible for pollinating an estimated 15 percent of all the crops grown in the U.S., worth $3 billion, particularly those raised in greenhouses. Those include tomatoes, peppers and strawberries. The crisis was eventually given a name: Colony Collapse Disorder or CCD.

CCD is a "fake disease!"

The most popular theory, aside from the varroa mite and cellphone RF radiation, has been the belief that a virus -- similar to AIDS -- has infected the bees. A team led by scientists from the Columbia University Mailman School of Public Health, Pennsylvania State University, the USDA Agricultural Research Service, University of Arizona, and 454 Life Sciences found a significant connection between the Israeli Acute Paralysis Virus (IAPV) and colony collapse disorder (CCD) in honey bees.

A team of scientists from Edgewood Chemical Biological Center and University of California San Francisco identified both a virus and a parasite that are likely behind the recent sudden die-off of honey-bee colonies. Using a new technology called the Integrated Virus Detection System (IVDS), which was designed for military use to rapidly screen samples for pathogens, ECBC scientists last week isolated the presence of viral and parasitic pathogens that may be contributing to the honeybee loss.

But it now appears that a much more basic culprit has killed the bees -- Bayer Corporation. Colony Collapse Disorder is poisoning by a known insect neurotoxin called clothianidin, a pesticide manufactured by Bayer, which has been clearly linked to massive bee die-offs in Germany and France.
Clothianidin = "Colony Collapse Disorder"

Here's the story. One of the most important crops is corn. It's used as a feed for chickens and pigs and cattle. It's used in flour and in the production of high fructose corn syrup. Just about everything we eat depends on corn. Recently, with the energy crisis, corn has also been pressed to make ethanol to run our cars. But corn has an enemy called the root worm.

This pesky bug, called Diabrotica virgifera virgifera, burrows into the newly forming roots of the corn plant and causes the plant to wither and eventually die. Farmers have long sought some type of pesticide to kill the bug and, in 2003, Bayer Pharmaceutical introduced a new product called Clothianidin. Their own studies showed that this pesticide was highly toxic to bees but justified the widespread use because it could be applied to corn seed and would be buried in the soil where it would presumably be harmless to other creatures.

In theory, farmers were instructed to buy special machines that would coat their seeds multiple times with clothianidin and a special adhesive, dry the seeds, and then plant them. The poison is supposed to stick to the seed coat and to be toxic to the rootworm as it attempts to burrow in to the newly forming roots.

Bayer, who make the pesticide, and Monsanto, who make the adhesive, have patented the method of coating their proprietary seeds with clothianidin; those seeds are now growing all over the globe.

Oooooops!

The first clue that Colony Collapse Disorder was a simple case of poisoning -- similar to the DDT bird kill-off decades ago -- was when clothianidin was used on corn crops in Germany's Baden-Wuerttemberg state.

In July of 2007, the German crop was infested with the rootworm. The German government ordered that every possible method be used to eradicate this pest, including the use of clothianidin. Shortly after the seeds were planted, in May of 2008, some 330 million bees abruptly died!

According to the German Research Center for Cultivated Plants, 29 out of 30 dead bees had been killed by direct contact with clothianidin.

Philipp Mimkes, spokesman for the German-based Coalition Against Bayer Dangers, said: "We have been pointing out the risks of neonicotinoids for almost 10 years now. This proves without a doubt that the chemicals can come into contact with bees and kill them. These pesticides shouldn't be on the market."

An investigation revealed that the seed coating did not stay in the soil but was introduced to the air (and the rest of the plant) by simple abrasion -- the rubbing together of seeds -- as they are stored, moved and injected into the soil by farming machines.

German authorities suggested that the seeds were not treated with a special polymer, called a "sticker," which makes the pesticide adhere to the seed. But it is also noted that the formulation of clothianidin does not require this "sticker" in typical applications, and most farmers find the additional coating cost prohibitive.

The German government quickly banned the pesticide, compensated the farmers, and issued a strong warning against using the chemical in agriculture. According to the German Federal Agriculture Institute,

"It can unequivocally be concluded that poisoning of the bees is due to the rub-off of the pesticide ingredient clothianidin from corn seeds."


According to the U.S. Environmental Protection Agency (May 30, 2003):


"Clothianidin has the potential for toxic chronic exposure to honey bees, as well as other nontarget pollinators, through the translocation of clothianidin residues in nectar and pollen."

[In the same report] "The fate and disposition of clothianidin in the environment suggest a compound that is a systemic insecticide that is persistent and mobile, stable to hydrolysis, and has potential to leach to ground water, as well as runoff to surface waters."

"Clothianidin is highly toxic to honey bees on an acute contact basis (killing 50% of tested populations at greater than 389 mg/kg). It has the potential for toxic chronic exposure to honey bees, as well as other nontarget pollinators, through the translocation of clothianidin residues in nectar and pollen. In honey bees, the effects of this toxic chronic exposure may include lethal and/or sub-lethal effects in the larvae and reproductive effects in the queen."

Clothianidin = neurotoxin

The cigarette industry used to brag that one or two cigarettes doesn't give a person lung cancer. Likewise, the pharmaceutical companies are quick to show that feeding bees a specific amount of neurotoxins, like clothianidin, doesn't kill the bees. And, of course, this is true.

While small traces of clothianidin may not kill bees outright, it can and apparently does interfere with their ability to navigate to and from the hive. The pollen that they manage to bring back to the hive is then further concentrated and exposed to the entire colony, suppressing the bees' immune systems and opening the door to infection by any number of parasites and pathogens. This is exactly what beekeepers and farmers have been reporting -- half-empty, infested hives, or abandoned hives with no dead bodies to be found anywhere. It has also been noted that the empty colonies lack the usual parasitic bugs that typically take advantage of an abandoned hive. The colonies appear sterile.

[Photo caption: People dressed as bees demonstrate against pesticides on 21 April 2007 in front of the Bayer headquarters in Brussels. Three million bees have been dying each year since the Bayer pesticide was introduced to the market. The poster reads: "Gaucho Bayer, only kills if one uses it."]

Not Just Corn

The tragedy in Germany and France showed that bees exposed to clothianidin also infected bee colonies that were not harvesting corn pollen, thus spreading the toxin to regions at some distance from the areas cultivating corn. It is theorized that disoriented bees could have mingled with bees from other colonies, or contaminated the pollen of plants where other colonies were also pollinating.

Same old story...

Money talks. Agro-business is huge and its influence runs deep in the sciences and in politics. Their own scientists must know very well that their product has threatened the global population of bees, yet they allow the conspiracy theories of a mysterious "Colony Collapse Disease" to endure. Clothianidin and imidacloprid (another pesticide also banned by Germany and France) account for much of Bayer's agrochemical profits.

I used to think of Bayer as the company that made aspirin and medicine, but I recently saw a list of poisons that they made and marketed to kill everything from microbes to insects. It seems odd to me that a company that makes poisons also makes medical cures... Is there a link there? Perhaps it's just different sides of the same dollar or Euro.

UPDATE

The U.S. Environmental Protection Agency is refusing to disclose records about a new class of pesticides that could be playing a role in the disappearance of millions of honeybees in the United States, a lawsuit filed Monday charges.

The Natural Resources Defense Council wants to see the studies that the EPA required when it approved a pesticide made by Bayer CropScience five years ago.

The environmental group filed the suit as part of an effort to find out how diligently the EPA is protecting honeybees from dangerous pesticides, said Aaron Colangelo, a lawyer for the group in Washington.

In the last two years, beekeepers have reported unexplained losses of hives - 30 percent and upward - leading to a phenomenon called colony collapse disorder. Scientists believe that the decline in bees is linked to an onslaught of pesticides, mites, parasites and viruses, as well as a loss of habitat and food.
$15 billion in crops

Bees pollinate about one-third of the human diet, $15 billion worth of U.S. crops, including almonds in California, blueberries in Maine, cucumbers in North Carolina and 85 other commercial crops, according to the U.S. Department of Agriculture. Not finding a cause of the collapse could prove costly, scientists warn.

Representatives of the EPA said they hadn't seen the suit and couldn't comment.

Clothianidin is the pesticide at the center of controversy. It is used to coat corn, sugar beet and sorghum seeds and is part of a class of pesticides called neonicotinoids. The pesticide was blamed for bee deaths in France and Germany, which also is dealing with a colony collapse. Those two countries have suspended its use until further study. An EPA fact sheet from 2003 says clothianidin has the potential for toxic chronic exposure to honey bees, as well as other pollinators, through residues in nectar and pollen.

The EPA granted conditional registration for clothianidin in 2003 and at the same time required that Bayer CropScience submit studies on chronic exposure to honeybees, including a complete worker bee lifecycle study as well as an evaluation of exposure and effects to the queen, the group said. The queen, necessary for a colony, lives a few years; the workers live only six weeks, but there is no honey without them.

"The public has no idea whether those studies have been submitted to the EPA or not and, if so, what they show. Maybe they never came in. Maybe they came in, and they show a real problem for bees. Maybe they're poorly conducted studies that don't satisfy EPA's requirement," Colangelo said.

Tuesday, September 16, 2008

Crash and Burn on Wall Street

Almost remote from Main Street and just as remote from the American public, the pillars of the US financial system are in a struggle for life or death. Bear Stearns has been absorbed by JPMorgan, Lehman is entering liquidation, and Merrill is absorbed by Bank of America. Their real failure cannot be contemplated. But we are looking at a contraction in the supply of available credit. This means massive financial losses throughout the global economy, however well sheltered. Today AIG asked the Fed for help.

A crisis in confidence is visibly shrinking the economy, forcing premature liquidation, and scaring sound money away from the table.

CNBC: Warren Buffett No Longer In Talks With AIG

Posted By:Alex Crippen

Sectors:Insurance

Don't count on Warren Buffett to "rescue" AIG.

A few minutes ago, CNBC's David Faber reported on the air that Buffett is "no longer" in talks with the insurer "about an investment or anything else."

AIG is desperately trying to sell assets and raise new capital to avoid what would be a disastrous downgrade of its debt by the credit rating agencies.

Faber reports that people familiar with the situation tell him that talks between Buffett's Berkshire Hathaway and AIG did take place last Friday and Saturday, but there's been nothing since then and nothing is happening now on that front.

Faber says AIG is focusing its attention on getting billions in bridge financing from the Federal Reserve, to allow for massive asset sales.

This is forced liquidation, and it is not the liquidation of AIG itself that is troubling; it is the idea that such liquidation is thought possible in the first place. This is the domino liquidation scenario that is impossible to execute in a crumbling credit market. Again it will be necessary for the Fed to step in and bridge every piece of scrip out there in order to stop the collapse.

And do not get it wrong. A collapse means a collapse in available credit for everyone and a collapse in the value of real assets to match that of the great depression. The ensuing economic collapse will then be driven by a lack of financial liquidity.

Right now this financial disaster is having babies, but the damage so far has been contained to the smart money crowd who are supposed to know better.

There is still time to end this rolling disaster by adopting the mark-to-market strategy partially outlined in earlier posts.

I have been reticent about the current credit situation for fear of adding gasoline to a forest fire. Right now, I fail to see how it could get any hotter.

A full disclosure of real exposure by all participants is needed, followed by a mark-to-market exit strategy promulgated by the Fed and backed by the Fed's guarantee. Right now there is no such strategy and no clarity as to where this will all end. The result is that one failure is feeding the next.

This can continue like a string of dominoes until every mortgage is made almost worthless and every house sells at a fraction of its cost to build. The objective of the Fed is to stop this process, and so far they have been bailing like crazy while every event is a surprise.

The next few short days should tell the story as far as the equity side is concerned. I suspect that another 1500 points of decline is possible. We lost 500 today.

It will then take months for the money managers to get everything rebalanced and count the damage.
In the meantime, oil is heading for $65 to help offset this shock.

Perhaps this spring the public can reenter the housing market and clean up the inventory at double speed and get the big boys off the hook.

We are living through an historic credit contraction, once again brought about by pure negligence. Reckless lending always has champions and naïve lawmakers to play along. Putting armies of financial industry executives on trial may keep them sober for a couple of generations.

Take note that I do not blame greed. That is a constant. What in hell were they thinking when they bought mortgages not guaranteed and managed by a local bank? An acceptable loss ratio for a bank portfolio is half a percent. With no local management you would need an impossible zero percent. It is as if the fix was in and the mafia was running the show laying the paper off onto whoever they could bag.

Monday, September 15, 2008

Solazyme brews Jet Fuel

This item can be described as more good news coming out of the ongoing efforts to harness algae. We are seeing second-, third- and fourth-generation ideas paying off quickly.

I had not hoped for a biological replacement for jet fuel, because I made the natural assumption that, as with oil, a substantial processing phase would need to be engineered once a biological oil source was built out. Instead we clearly have another brewmaster's operation that can use plant material as feedstock without a lot of fuss.

I hope this means that it can be built out in farm sized units to avoid excessive haulage costs. Just as obviously, if they can produce jet fuel or even an unrefined precursor at this scale, it should also be possible to produce from the same system a gasoline and diesel equivalent.

This sounds a lot easier than the many other protocols that we have discussed so far. Pyrolysis was always a nonstarter for the liquid fuel cycle, and so was playing with natural algae. Ethanol was possible if algae or cattails produced the feedstock. The idea that we can sidestep all these issues and natural complexities and brew up jet fuel from plant waste is almost too good to be true. It is certainly a good objective to achieve, and let us hope that this company has not been premature.

The company has focused its research on marine algae and has announced and tested biodiesel produced through that work. My sense is that they are pushing the research envelope to perfect the necessary production protocols. Actual commercialization should be the next step.

It would be a remarkable development if it becomes possible to shift transportation fuel production completely into agriculture while at the same time consuming agricultural waste.

The use of agricultural waste as a feed stock for producing biochar is important for manufacturing high quality soils, but is not necessary once such soils are produced. Conversion to fuel nicely consumes this surplus.


Microbes Grow Jet Fuel in the Dark
September 10, 2008

The South San Francisco company Solazyme announced this week that it has produced the world's first microbial-derived jet fuel to pass the eleven most challenging specifications needed to meet the Aviation Turbine Fuel standards.

Solazyme's algal-derived aviation fuel was analyzed by the Southwest Research Institute, one of the nation's leading fuel analytical laboratories. The tested areas included the key measurements for density, thermal oxidative stability, flashpoint, freezing point, distillation and viscosity, the biggest hurdles to developing a commercial and military jet fuel.

Given Solazyme's excellent cold-temperature performance and the clean characteristics of the oil, former military fuels specialists note that new algae-based fuels could help the DOD comply with recently enacted mandates to reduce our dependence on foreign oil and utilize environmentally friendly fuels.

In the U.S. alone, 1.6 billion gallons of jet fuel are used every month, resulting in significant greenhouse gas emissions. The need for environmentally friendly and sustainable alternatives is growing rapidly. The EU is requiring that every carrier landing there adhere to its emission standards by 2012.

But it's not merely foreign legislative pressure that's driving change. As peak oil nears, jet fuel already accounts for 36 percent of airline industry costs -- up from 13 percent just six years ago -- and could account for 40 percent of industry costs next year.

While algae-based fuel currently costs almost as much to produce as oil, its projected cost going forward is very different: it is made up of cells that double exponentially over time (2, 4, 8, 16, 32, 64...), while oil supplies will be increasingly scarce and expensive to extract over that same period.
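The doubling argument is easy to make concrete. The sketch below is a toy illustration of exponential growth only -- the numbers are not Solazyme data, just the 2, 4, 8, 16... sequence the article cites:

```python
# Toy illustration of the cell-doubling argument: a population that
# doubles each generation grows as start * 2**n, so feedstock
# capacity compounds while a fixed oil reserve only depletes.
def cells_after(doublings: int, start: int = 1) -> int:
    """Population after the given number of doublings."""
    return start * 2 ** doublings

print([cells_after(n) for n in range(1, 7)])  # [2, 4, 8, 16, 32, 64]
print(cells_after(10))                        # 1024
```

Ten doublings already turn one cell into more than a thousand, which is why the cost curve for a grown feedstock is expected to diverge from that of an extracted one.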

Solazyme is currently producing thousands of gallons of oil a month at scale and is the only advanced biofuels company that has produced fuels that have passed specification testing and are compatible with the existing transportation fuel infrastructure. Solazyme uses directed evolution to engineer an organism to perform a desired function -- the same technique farmers have employed since the dawn of civilization to breed higher-yielding strains of grain -- but applied at the gene-sequence level.

Solazyme's process needs no sunlight, unlike other algae-farming startups such as the New Zealand startup that will be flying a Boeing test to San Francisco this month. This lack of a need for sunlight makes for an efficient and fast process, and the feedstock is very sustainable: agricultural waste, cellulosic material such as switchgrass, and industrial byproducts. Algae doesn't require vast amounts of land. You can even grow algae on the roof of a sewage plant.

Unlike the raw material of any other mass-production process we enterprising humans have ever used to make things, algae, by its very nature, just keeps on growing.

Friday, September 12, 2008

Terraforming the Boreal Forest with Cattail Paddies

I have introduced readers to the astounding productivity of the wild cattail. This is possible because it is a wetland plant that never lacks water and is a sponge for dissolved nutrients. It produces biomass at a rate an order of magnitude greater than the most prolific field crop, and much of that is starch-rich rhizomes.

I am revisiting it because its natural range extends deep into the boreal forests of both Canada and Eurasia. This is the single largest woodland wilderness in existence and has always resisted human agricultural methods. The soils, derived from pine forests where they have any depth at all, resist cropping and are best left undisturbed. Much of that is due to the recency of these soils, formed since the end of the Ice Age.

Yet in Canada in particular, vast wetlands remain, as well as multitudes of interconnected lakes. The plant life is well adapted but shows minimal speciation throughout; you see the same handful of species. Some of this is perhaps also a result of the Ice Age and the limited time for local speciation.

The cattail prospers in most of this range. It does not seem to go all the way to the tree line, but then neither do the trees except as stunted remnants. The one range map that I am able to use shows coverage throughout the lands south of a latitude running through James Bay. I suspect that I will be hearing of its presence much further north.

What this means is that land greater in area than the Great Plains can possibly be farmed using a cattail paddy culture. One can even envisage using beaver ponds as the natural cattail field.

Farm preparation will require the leveling and grading of large swathes of wetlands and setting up a system of fall drainage to accommodate pre winter harvesting.

This also means that direct management of the adjacent wildlife becomes possible. Deeper waters are natural fish farms easily isolated for management purposes because of the general slow movement of water through these lands. Non migratory sturgeon are particularly promising.

More obvious is the moose, which will naturally graze these lands and can be harvested in the fall. In fact, it is perhaps reasonable to store the chopped reeds as winter fodder for the herds of moose. The same may also be applicable to the other ruminants, but the moose is surely the best source of commercial meat.

Once the paddy culture is established and the moose husbandry integrated, many other options will become practical. The beaver, in particular, is eminently domesticable and readily adapted to this culture. In fact, both these animals will be around anyway, and one may as well maximize their value. Actual beaver harvesting will take place in the winter when the pelt is at its best. The meat is also of value.

Note that so far I have made sure that harvesting takes place in the late fall and winter. The first reason for this is that the insect burden during the summer is prohibitive. The second reason is that the animal and fish populations must be hugely reduced before full winter kicks in and their food supplies disappear.

In any event we have yet another model farm concept that can work and put boots on the ground everywhere throughout the boreal forest.

Curiously, the Cree, whose ancestral lands these are, likely survived the onslaught of Europeans best of all indigenous peoples. They are well positioned to develop this new form of agriculture and gain an economic base that once seemed improbable.

We have discovered it is possible to turn the boreal forest to agricultural advantage and thus terraform one of the largest biomes on Earth.

Thursday, September 11, 2008

Commodity Decline

While we have been regaled with the ongoing unraveling and reconsolidation of the massive US mortgage market, the rise and fall of the oil market is delivering another casualty. As my readers know, I called both the price run up past $100 per barrel and the turn at $145 per barrel. This price move was necessary to force the public to pay attention to our serious exposure to the presently inelastic condition of the supply side of the oil equation. We now have a global consensus for shifting out of the fossil fuel business and demand has been visibly throttled. I now expect a return to $65 per barrel.

Prior to the oil price run up we had a huge price lift in commodity prices. This created a huge amount of credit and has thoroughly funded the metals industry in a way not possible for generations. The ongoing oil price decline appears to be collapsing that long lasting bubble.

To give my readers a meaningful standard to work with, I will share one fundamental idea. All commodities are normally sold at a price very close to the real cost of production. The rationale is obvious. Higher prices allow all producers to ramp up production and to invest in technologies and new operations that will bring costs down. The only real constraint to this behavior is the time needed to make this happen. Well, guess what? We have had the necessary two to three years to dust off every mothballed project from the past two generations and blast them through permitting.

And now the credit in the commodity markets is evaporating.

Up to about three years ago, all copper mines worked against a copper price of around $0.70 per pound. This had been the average since the sixties! This had actually driven new mine development out of North America. But all mines worked against an operational break even of around that seventy cent mark.

I address copper in particular because it continues to be the leader in terms of mining innovation and cost cutting. When I first got into the business in 1972, it was still possible to contemplate mining a several million ton deposit carrying twenty pounds of copper to the ton over several years. Today that represents a month’s supply in most major mines. Such scale has permitted mining grades to hang around eight pounds to the ton.
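The grades above are easy to sanity-check. The conversions below are a back-of-envelope aid, assuming short tons of 2,000 lb; the prices plugged in are the ones quoted elsewhere in this post, and recovery losses are ignored:

```python
# Back-of-envelope ore-grade arithmetic, assuming short tons (2,000 lb).
# 20 lb of copper per ton is a 1% grade; today's ~8 lb/ton is 0.4%.
LB_PER_TON = 2000

def grade_percent(lb_copper_per_ton: float) -> float:
    """Ore grade as percent copper by weight."""
    return 100 * lb_copper_per_ton / LB_PER_TON

def contained_value(lb_copper_per_ton: float, price_per_lb: float) -> float:
    """Gross metal value of one ton of ore, ignoring recovery losses."""
    return lb_copper_per_ton * price_per_lb

print(grade_percent(20))                        # 1.0  (% Cu, 1972-era ore)
print(grade_percent(8))                         # 0.4  (% Cu, modern bulk mining)
print(contained_value(8, 3.00))                 # 24.0 ($/ton of ore at $3/lb)
print(round(contained_value(8, 0.70), 2))       # 5.6  ($/ton at the old $0.70/lb)
```

The spread between roughly $5.60 and $24 of contained metal per ton of the same ore is the windfall that the text argues new production will compete away.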

What I learned early on was that this technology ultimately came to every other minable commodity. I know of deposits in certain commodities that would idle every other mine in operation if brought on stream.

Returning to copper, we have a commodity that requires three or four years to ramp up but then can be produced for well under $1.00 per pound. Yet we have been forced to pay $4.00 per pound and now are back at $3.00 per pound. This I see returning to around $1.50 to put everything back in balance.

Of course there are an army of analysts who will argue vehemently that this is not so. Oh well!

We have just been through an old-fashioned commodity boom and bust carried out over three years. All the producers are flush with cash and are bringing fresh production on stream. This is also happening throughout the global agricultural business. This next year we will be awash with huge surpluses and a rapid global business recovery driven by suddenly lower costs across the board.

Importantly, the market has decisively signaled the need to vacate the carbon business and governments are getting mandates to do just that. This is giving us the time to do it right.

As I have posted, the simple shift from diesel to LNG in the USA alone will eliminate half of our demand for oil. The advent of THAI (toe-to-heel air injection) will let North America become the globe's strategic oil reserve, with perhaps two trillion barrels of producible reserves booked before we are finished. That is twice all the oil produced to date.
This all means that the economic rebound will be very strong for the next three years.

Wednesday, September 10, 2008

Ocean of Heat

As you probably suspect, I glean reams of news stories related to the Global Warming theory. It goes without saying that much of it is errant nonsense. It is again time to spell out the facts that are facts.


1 The Northern Hemisphere has warmed. We know that because the Arctic has a longer spring and summer. A direct result has been the opening of the two northern sea routes for the first time. Also, glaciers are retreating all over the northern hemisphere. That pretty well confirms that this is a warm spell. Even those who note that 2008 is the coolest year of the past five must concede that it is still above average, and has been above average for five years. We would need seriously below-average years now to reset the averages back to normal. So yes, it is actually much warmer in the Arctic.

2 The Southern Hemisphere has cooled. Glaciers are growing. The amount of cooling involved is either a mirror image of the northern warming or a very large fraction thereof. It is colder and thankfully, we do not try to have billions live there.

3 External drivers such as excess CO2 and solar variation have been presumed to alter global temperatures. The scientific support for either proposition is tenuous. For CO2, the correlation was clearly coincidental for ten years; for the ten years since, it has been contrary. The solar variation idea is also potentially coincidental, but more compellingly so for the two specific cases that exist.

4 It is not obvious that Northern Europe is a viable proxy for the globe. Again we have the present situation in which a clear northern warming trend is being countered by a southern cooling trend. This is an actual expectation for an invariant heat-supply model, which must surely trend in one direction or the other, however slightly.

In the event, we have a serious lack of proxies around the globe over a useful time range to provide confident projections. Historical weather record-keeping, at most two hundred years old, has been vulnerable to the heat-island effect, adding yet another source of error that must be continuously adjusted against unaffected comparables. It has been claimed that this is well handled, but even that has come into question.

We have data over a fairly short period of two centuries in some parts of the globe. We have a lot more data covering the past century, steadily improving up to the beginning of satellite monitoring almost thirty years ago. Present methods can serve as the gold standard and be accepted as having a very low measurable error. Yet that record is only thirty years old, and all preceding proxies, however obtained, will always be at least suspect. Look at the debate over whether the warming of the 1930s is comparable to the warming ending in 1998.

Right now we lack an explanation for the clear decadal heat shift from the south to the north. Or perhaps we mistake the cooling of the south as anomalous. It could well be simply a continuation of normal behavior, perhaps slightly slowed because it is warmer generally. The cooling engine of the south is huge compared to the cooling effects of the northern winter. The case could be made that the south is still cooling while a little more heat keeps the north warmer. Again, we lack the centuries of data needed to address these facts coherently.

My principal hypothesis is that, left to its own devices, the northern hemisphere will slowly warm to Bronze Age conditions in which the winter sea ice is fully removed every year. This implies improved growing conditions in the higher latitudes. The Baltic becomes pleasant again, and the permafrost will disappear in Greenland and over vast swathes of the boreal forest. The tree line will perhaps advance a little.

The evidence shows that this natural warming trend of the atmosphere has been interrupted by volcanic activity again and again. It certainly appears to be the most likely cause of abrupt declines in temperature over the span of the Holocene. Other sources of cooling will not be so obviously abrupt but still must be considered. The reason is that the earth appears to conserve a lot more heat than we have ever given it credit for. The atmosphere is not the sole store of heat. Without doing a calculation, oceanic heat storage is easily as potent, and the ambient heat of the land is also available, very slowly.

We live in an ocean of heat and it is very difficult to determine trends and variances with any confidence let alone assign actual cause and effect.

Tuesday, September 9, 2008

Atmospheric Nanocarbon

This is a welcome article on the fact that the nanocarbon component of the atmosphere is largely unmeasured and barely recognized. My readers know that I have been discussing the importance of nanocarbon from the beginning, particularly as a key constituent of terra preta soils.

What I like to ask my readers to imagine is a small particle of carbon operating at the same scale, or close to it, as an ordinary chemical molecule. This is obviously important. That carbon forms a solid crystalline acid makes it the most important such nanoparticle. The only comparables in nature are some volcanic ashes that are also solid crystalline acids.

Larger porous particles of carbon exhibit similar behavior but are obviously not as prone to remain in the atmosphere.

Understanding this, one certainly does not ignore this component where it is present. Scientific papers that do ignore it are going to be skewed, and such skewing will often be written off as data error.

When you look at the atmosphere above Beijing, which is heavy with dust and smog, it is natural to measure the gross volume of particulate. Yet silica particulate and most other rock-based materials are chemically neutral and generally benign; the human body is very good at removing them. The carbon content is very small by comparison but may be profoundly reactive.

They are called solid crystalline acids for a reason. Some are 10,000 times more reactive than sulphuric acid.

Atmospheric observations around China, and by extension India, show a daily conditioning by the injection of nanocarbon, or brown carbon, as well as other aerosols. This must alter atmospheric processes and surely needs to be accounted for. The normal chemical aerosols that have traditionally attracted attention are troublesome in their own right, but they are also neutralized for much the same reason that toxins can be disposed of at sea: dilution and reactivity will finish them off.

I do not feel so sanguine about nanocarbon. It is reactive but is not neutralized in the process. It is only neutralized when it falls into the sea or onto land and biological agents take it over. That means that an active part of stack gas can persist.

We have already gotten excited about the persistence of CO2, which is on its own essentially harmless in terms of local atmospheres. Persistent nanocarbon and its effects are not understood at all. I suspect that it is hugely linked to smog and likely promotes the persistence of nitric acid in the atmosphere from combustion engines.

Recall that lightning storms produce massive amounts of that same nitric acid, which is washed out of the sky immediately without producing any smog whatsoever. It makes perfect sense that atmospheric nanocarbon will grab nitric acid, just as it does in terra preta soils.

I personally suspect that the persistence of nanocarbon will be found to be generally benign, though not for any lack of reactivity. We just need the research to be done to confirm any of this.

Global Warming Forecasts Not Taking Into Account Nanoscale Atmospheric Aerosols

ScienceDaily (Aug. 9, 2008) — Arizona State University researchers have made a breakthrough in understanding the effect on climate change of a key component of urban pollution. The discovery could lead to more accurate forecasting of possible global-warming activity, say Peter Crozier and James Anderson.

Crozier is an associate professor in ASU's School of Materials, which is jointly administered by the College of Liberal Arts and Sciences and the Ira A. Fulton School of Engineering. Anderson is a senior research scientist in the engineering school's Department of Mechanical and Aerospace Engineering.

As a result of their studies of aerosols in the atmosphere, they assert that some measures used in atmospheric science are oversimplified and overlook important factors that relate to climatic warming and cooling.

The research findings are detailed in the Aug. 8 issue of Science magazine, in the article "Brown Carbon Spheres in East Asian Outflow and Their Optical Properties," co-authored by Crozier, Anderson and Duncan Alexander, a former postdoctoral fellow at ASU in the area of electron microscopy, and the paper's lead author.

So-called brown carbons – a nanoscale atmospheric aerosol species – are largely being ignored in broad-ranging climate computer models, Crozier and Anderson say.

Studies of the greenhouse effect that contribute directly to climate change have focused on carbon dioxide and other greenhouse gases. But there are other components in the atmosphere that can contribute to warming – or cooling – including carbonaceous and sulfate particles from combustion of fossil fuels and biomass, salts from oceans and dust from deserts. Brown carbons from combustion processes are the least understood of these aerosol components.

The parameter typically used to measure degrees of warming is radiative forcing, which is the difference in the incoming energy from sunlight and outgoing energy from heat and reflected sunlight. The variety of gases and aerosols that compose the atmosphere will, under different conditions, lead to warming (positive radiative forcing) or cooling (negative radiative forcing).
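The definition above can be stated as a one-line calculation. This is only an illustration of the sign convention, not measured data; the sample numbers below are placeholders I have made up, not figures from the research.

```python
def net_forcing(incoming_w_m2: float, outgoing_w_m2: float) -> float:
    """Radiative forcing: incoming solar energy minus outgoing energy,
    in watts per square metre. Positive implies warming, negative cooling."""
    return incoming_w_m2 - outgoing_w_m2

# An atmospheric component that traps heat so less energy escapes
# produces positive forcing (warming); one that reflects sunlight
# so more energy leaves produces negative forcing (cooling).
print(net_forcing(341.0, 339.0))  # 2.0  -> warming
print(net_forcing(341.0, 343.0))  # -2.0 -> cooling
```

The complication the ASU researchers point to is that a single aerosol species like brown carbon can contribute terms of both signs at once, cooling the surface while warming the air above it.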

The ASU researchers say the effect of brown carbon is complex because it both cools the Earth's surface and warms the atmosphere.

"Because of the large uncertainty we have in the radiative forcing of aerosols, there is a corresponding large uncertainty in the degree of radiative forcing overall," Crozier says. "This introduces a large uncertainty in the degree of warming predicted by climate change models."

A key to understanding the situation is the light-scattering and light-absorbing properties – called optical properties – of aerosols.

Crozier and Anderson are trying to directly measure the light-absorbing properties of carbonaceous aerosols, which are abundant in the atmosphere.

"If we know the optical properties and distribution of all the aerosols over the entire atmosphere, then we can produce climate change models that provide more accurate prediction," Anderson says.

Most of the techniques used to measure optical properties of aerosols involve shining a laser through columns of air.

"The problem with this approach is that it gives the average properties of all aerosol components, and at only a few wavelengths of light," Anderson says.

He and Crozier have instead used a novel technique based on a specialized type of electron microscope. This technique – monochromated electron energy-loss spectroscopy – can be used to directly determine the optical properties of individual brown carbon nanoparticles over the entire visible light spectrum as well as over the ultraviolet and infrared areas of the spectrum.

"We have used this approach to determine the complete optical properties of individual brown carbon nanoparticles sampled from above the Yellow Sea during a large international climate change experiment," Crozier says.

"This is the first time anyone has determined the complete optical properties of single nanoparticles from the atmosphere," Anderson says.

It's typical for climate modelers to approximate atmospheric carbon aerosols as either non-absorbing or strongly absorbing. "Our measurements show this approximation is too simple," Crozier says. "We show that many of the carbons in our sample have optical properties that are different from those usually assumed in climate models."

Adds Anderson: "When you hear about predictions of future warming or changes in precipitation globally, or in specific regions like the Southwestern United States, the predictions are based on computer model output that is ignoring brown carbon, so they are going to tend to be less accurate."

The research was funded for a six-year period with grants to ASU from the National Science Foundation (NSF) Chemistry Program ($319,000) and the National Aeronautics and Space Administration (NASA) Radiation Science Program ($327,000).

The work is part of the Aerosol Characterization Experiment (ACE) program, which encompasses three projects to date carried out by hundreds of researchers from multiple countries.

Crozier and Anderson have been involved in the U.S. component of the ACE-Asia experiment, a large-scale, multi-agency effort to characterize aerosols from East Asia, involving the NSF, NASA, the National Oceanic and Atmospheric Administration, the Department of Energy and others.

Monday, September 8, 2008

Historic Sarah Palin Rise and Energy Independence

I normally do not comment at all on current news, since it is rarely truly important in the framing of the issues of the day except perhaps through the mists of time. Then there are weeks like the past one that reach a threshold, really stand out, and beg for comment. Last week we had a possible market bottom and the abrupt emergence of Sarah Palin as political kingmaker.

I say possible market bottom because it is the first week of September and the down move is now testing a prior low from two months ago, when the worst financial news was first breaking. This low is also 22% below the all-time market peak. To fall another 10%, all that strongly touted bad news would have to be true. If that bad news is not true, and I am seeing solutions going into place to prevent the worst, then stocks are cheap again and the worst is over.

Remember that the real stupidity has been cleaned out over the past two years. All the holders of bad paper have been working overtime to repair the damage. Stabilizing Fannie Mae and Freddie Mac should mostly end the surprises and also stabilize housing prices. We are almost ready to return to prudent business as usual.

There will now be plenty of cash around to buy equities and this is one of the better times of the year for such buying of bargains. This last brief shakeout sets the stage for a decent bullish quarter offsetting the past nine months of ugly financial news and markets.

The price of oil continues to slip as everyone wakes up to the idea that $145 oil suppresses demand and that Americans want out of the oil business now. The oil price decline has also taken the speculative money out of the commodities markets in general, and they are all in major decline. This was long overdue. Next year everyone will be complaining about surplus everything.

This readjustment in global pricing will have the immediate benefit of improving cash flows as this source of costs is finally abated. This means that business can improve across the board over the next two years.

Then into this mix we have the introduction of Sarah Palin as the vice presidential candidate for the Republican Party. What a week!

I dislike commenting on the US political scene since I am an outsider looking in and can surely only offend everyone. However, I have watched the media give Obama a free pass for the past year and have been disquieted. I am not the only one. He has not been brought to heel on his visible attachment to long-discredited policies of the liberal left. And let us be totally fair: the wacko right has policies that are just as far beyond the pale of acceptability. I simply believe that a country's leader must be pragmatic and an independent thinker, above the simplistic ideas of the ideologues. Obama may well be such a thinker, but he hides it well.

This set him up with a giant bull's eye on his back for the moment that someone changed the story line. You must also admit that the press was getting tired of the great storyline presented by the Obama candidacy. The nomination was wonderfully choreographed as the coronation of the prince, culminating in the most heralded oration since the Gettysburg Address. I think he got twelve hours of coverage before the rug got pulled by McCain. Since then, for the past several days, he has barely gained a mention as the press leapt happily onto the story line of Sarah Palin.

The campaign, or battle, is now truly joined, and he is up against two wonderful scrappers who are surely not going to let go of control of the script for the next sixty days.

More importantly, their success means a probable one-term McCain presidency followed by a probable two-term Palin presidency. Goodbye Hillary.

Vastly more important to the Republican brand itself is that their supporters identify with this renegade style of candidate. Remember that the original success of the party with Newt and the boys was as renegade outsiders. That they became insiders much too fast was their undoing.

I have no doubt that the first task undertaken by a McCain presidency will now be the reforging of the entire US energy business. My readers know what the solutions must be and see in my postings many of the most viable options. These have been mere words to date. In the hands of visionary leadership they can become total reality in the next four years. Sarah was totally right to drive the building of a working gas pipeline from the Alaska North Slope through the Mackenzie Valley (which may turn out to be the greatest store of hydrocarbons on earth) to the pipelines already in place in northern Alberta, and on to the lower forty-eight. A huge, long-lived store of natural gas is a giant boon for the American economy, and this pipeline and others have been dickered over for thirty years. It took political will to finish the job.

Readers already know that North America can be energy independent. It takes immediate conversion of the long-haul transportation industry over to LNG, preferably from Alaska, and rapidly expanding THAI oil production in both the Alberta tar sands and possibly in formerly depleted US oil fields. Cattail ethanol can also be expanded everywhere wetlands exist, at a production rate at least an order of magnitude greater than corn's. Full out, this can all be done in the next four years.

I have seen no evidence that Obama can even imagine change on such a scale. I suspect that McCain/Palin are both able to and believe enough in themselves to go forward.

I think that this week will be remembered as truly historic.

Friday, September 5, 2008

Free Energy

Forty years ago, Arthur C. Clarke made the observation, in a book on his predictions for the future, that the ultimate aim of technology development was super-cheap energy delivered anywhere. He went on to point out that the likeliest solution was somehow harnessing solar energy. At the time such energy cost many dollars per installed watt. He pointed out that the installed cost needed to drop below $1.00 per watt.

I think that it is no accident that Nanosolar chose an opening price of $1.00 per watt.

We are now entering a future of free energy. I am saying this because solar energy needs no fuel and potentially needs almost no upkeep. Nanosolar has announced that their $2,000,000 solar cell printing tool can produce sufficient product to replace a 600 MW nuclear power plant in one year. I do not need a cost analysis to understand that the operating cost for producing that nuclear power plant equivalent is peanuts.

This means that a paid-for installed base will be very cheap to operate long after it is paid for. Nuclear was attractive only because the fuel cost was small relative to plant capacity; that plant costs would rise to hugely offset those efficiencies was never expected. We now have an energy source that can be built out and paid for without any direct fuel costs.
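A quick back-of-envelope check supports the "peanuts" claim. The inputs here are the figures quoted in this post (a $2,000,000 printing tool and 600 MW of output per year at $1.00 per installed watt); they are Nanosolar's claimed numbers, not independently verified.

```python
# Back-of-envelope check of the Nanosolar figures quoted above.
# Assumptions (from the post, not verified): a $2,000,000 printing tool
# producing 600 MW of panel capacity per year, sold at $1.00 per watt.

tool_cost = 2_000_000      # dollars, one-time capital cost
annual_output_w = 600e6    # watts of capacity produced per year
price_per_watt = 1.00      # dollars per installed watt

annual_revenue = annual_output_w * price_per_watt
cost_per_watt = tool_cost / annual_output_w      # tool cost spread over one year's output
tool_share = tool_cost / annual_revenue          # fraction of first-year revenue

print(f"Tool cost per watt of first-year output: ${cost_per_watt:.4f}")   # $0.0033
print(f"Tool cost as a share of first-year revenue: {tool_share:.2%}")    # 0.33%
```

On these numbers the tool pays for itself with roughly a third of one percent of a single year's production, which is why no detailed cost analysis is needed to see the point.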

This is free energy. Even more compelling, it is clearly feasible to build it efficiently down to any scale. That means, folks, that I can walk over to any location on earth and access this energy to drive whatever I want. The grid no longer really matters.

I have already discussed the potential of a solar driven stand alone atmospheric water harvester. Recently, all the necessary breakthroughs have fallen in place to allow this product to be actively pursued. And I have attached a student design concept that employs the key principles to produce an emergency water and power source.

I cannot make it any clearer. Energy storage is around the corner and solid state refrigeration is now also feasible and around that same corner. We have cheap solar power from Nanosolar deliverable shortly.

This allows us to completely reengineer the human environment. The most optimistic imaginings of a utopian future are now possible. We will all live in living spaces able to provide full environmental control while supplied solely by the sun. We will travel in very light electric vehicles constantly recharging. And we can do this while situated anywhere we please.

Ask yourself what you can do with free energy. Six billion people will soon have that option.

Check out this link for images of the proffered device.

Student Designer

Mr Scott Norrie

University

UNSW

Product Description and Principal Function(s)

The harshness of remote areas can often take a toll on their visitors. Ill-conceived plans, misguided adventure and automotive breakdown can intensify the strain of isolation. In these environments, a lack of water can lead to severe dehydration and even death. Ersa is a transportable, renewable source of power and water for the remote 4WD user. The product is stored within the 4WD and used to generate water from the air for survival purposes.

Why does the product represent design excellence and why do you believe it deserves an Australian Design Award?

The design is responding to the problem of severe dehydration and death in remote areas due to lack of water in an innovative way. Ersa allows the user to generate water from the surrounding air in survival situations. The design utilises renewable energy (30W solar module) in creating water and allows for the use of this power to charge portable devices such as GPS units, mobile phones and also allows the user to trickle charge the vehicle’s battery. The product is entirely original and unique.

Currently, becoming stranded in remote areas involves limited access to power and a finite supply of water. Ersa is a solution to this problem. People who use 4WD vehicles in remote areas (such as recreationalists, jackeroos, tourists, aid workers, etc.) can benefit from this product.

The form of the product lends itself to simple operation in emergency situations. The large hinges and flush line between the fold-out wings suggest they should fold out to operate. All buttons and indicators have been simplified and, where possible, use illustrative rather than text-based interfaces, so the product can be used just as easily by a remote aid worker as by a visiting tourist. The form of the product also suggests aspects of its operation; visual and tactile cues indicate where the wings fold out from and suggest the location of the water. The visual language of the product conveys ruggedness and 'transportability' whilst suggesting the product is clean and involves water.

The way in which the design is carried, transported and its operation has utilised ergonomics and semantics in order to simplify its use and to provide access to product functions in emergency situations.

The design has carefully considered environmental issues including sustainability. Ersa is designed to be a standalone, sustainable product with renewable energy integrated within the product in the form of solar cells.
Product footprint, material choice, appropriate wall thickness and design for disassembly are examples of the design’s environmental considerations.

Ersa is manufactured as top and bottom ribbed shells (High Heat ABS) that house an inner assembly that is first assembled to a support piece that is layered and fixed between the two shells during assembly. The design has considered injection moulding parameters and assembly procedures in order to create a long life, durable product whilst reducing manufacturing costs.

The product is conscious of safety and standards and all bought in parts (such as the battery and charge controller) must meet their own relevant standards.

Thursday, September 4, 2008

Garden Biochar Made Easy

We are entering that time of year in which a lot of garden plant waste needs to be disposed of. It may also be a great time to produce a little biochar to fold into the soil.

We have learned that there is a strong likelihood that biochar will turn out to be superior to simple compost as a soil additive, although our initial expectation was quite the opposite. The difference is simply that elemental carbon will hold soluble nutrients in place that are far too mobile when released from decomposing compost. Thus turning a charge of compostable plant waste into biochar should ultimately be a far superior practice.

There has been some observation that the initial soil reaction to receiving biochar is not as vigorous as initially expected but after one season this is overcome. This merely suggests that it may take a growing season to fully integrate the carbon with the soil biota.

The trick is to start a practice that can be easily repeated every year without a lot of fuss.

We have learned that an open-bottomed drum will work fine. First, a layer of branches less than an inch in diameter is laid down so that its edges extend beyond the diameter of the drum. This permits air flow under the edge of the drum. Throw plant waste into the drum; this can even be accumulated over the summer as grass clippings and the like.
A metal lid is kept on top of the drum to prevent rain from getting into the mass and accelerating composting. It should be possible to produce a well-packed, but not tightly packed, charge that will still permit airflow.

When the time comes to fire the charge, I would first throw a layer of soil on top of it, leaving a ten-inch hole in the center. The soil may not even need to be there, but I would still put in three inches. I would then put a charge of barbeque briquettes, preferably already burning, into the center hole. This can be done with any other fuel, of course, but this way we are sure that the coals themselves will last for some time, avoiding a premature failure. It may even be integrated with the weekly barbeque.

What is important is that the fire is strong enough and hot enough to sustain a top down burn. The beauty of this is that the volatiles coming off the burn front under the fire are forced to pass through the flame and produce even more heat as they burn. Tuned properly, and this surely will take a little practice, the process could be fairly smokeless. This could be famous last words of course, but I think that it is very possible although the initial ignition is sure to be anything but.

The lid is put back on, but left with enough of a gap to allow combustion gases to escape. One could also mount a chimney on top.

This system is simple and, with a bit of experimentation, can surely be made very clean burning and satisfactory. Most importantly, it is easy to maintain and operate in a back yard, burn after burn.

Once the burn front has reached the branch floor of the drum, a water hose quickly douses the fire ending the burn. It is then easy to gather the charcoal and spread it onto appropriate beds and fold it into the soil.

Undoubtedly there will be superior, well-engineered solutions available over time as we establish a carbon-making garden culture and promote the merits of the methods. In the meantime we have this as a working method, and it should be possible to run it without smoking up the neighborhood and convincing the fire department that there is a disaster in play.