Friday, August 31, 2007

Methane and pottery

In the end, concerns over methane production are irrelevant. We have doubled production in the last century and yet it is all gone. The reason is ultimately very simple: it migrates to the upper atmosphere and is consumed there. That escape route is simply not available to CO2.

Does the sharp increase in methane reaching the troposphere have any effect whatsoever? The quick answer is nothing that is obvious. It is a little like measuring the effect of the Mississippi on the Atlantic. The practical answer, as always, is to make as much as you desire and see where it takes you. My guess is nowhere.

That means that methane production concerns regarding all forms of biowaste combustion are misplaced. My real concern would be well intentioned government regulation being actively imposed, forcing a larger industrial price onto the use of the method.

The second issue that has attracted comment is the association of pottery shards in the terra preta soils. I naturally postulated that this was partly to do with the disposal of kitchen waste in the corn stover stack kilns as we described in earlier postings in July. I also realized that a large bowl would have to be used to transport hot coals to the top of the stack and perhaps dumped into a prepared chimney.

These bowls are as primitive as you can get and very prone to heat breakage, so the presence of pottery is no surprise. My discomfort came from the fact that they would have normally taken broken pottery away with them for disposal elsewhere. So why not?

The answer came to me this morning. It is natural to take the bowl of coals to the top of the stack, dump them there in the center, and let the coals slowly burn out a chimney. The problem is that you have to cover these coals with dirt to prevent flame out. The best way to do that is to upend the bowl on top of the coals and to throw dirt on top of that. Otherwise, the coals will end up being smothered by the dirt. The bowl would then migrate slowly to the bottom of the stack. In the process the high heat would cause this low quality pottery to break up into very small pieces, not worth recovering and causing no difficulty for cultivation.

Actually a pretty nifty solution to the problem of controlling the ignition coal mass. While this was progressing, the farmer would stand by to throw dirt on any emerging openings in the stack to prevent a flare up.

Thursday, August 30, 2007

Fabled Northwest Passage opens for shipping

This is an historic year. If a fleet of container ships were sitting at anchor in Lancaster Sound now, they could sail in the morning for the Bering Strait. And a fleet at anchor off Banks Island could up anchor and head into the passage going east and just in time for Christmas. 2007 will be known as the year that the Northwest Passage opened for normal navigation.

Most importantly, this has happened with a full month of sailing season ahead. A container ship leaving Europe today can easily make the transit into the Pacific. I assume at this point that we are using the narrow southerly routes to make the passage in the western part of the archipelago, but I suspect that even the western reaches of Lancaster Sound could be open in a couple of weeks.

These conditions will prevail annually now until we get a major cold snap in the winter that at least partially restores the sea ice to previously prevailing levels. Every year, the passage will open up in late August as the annual sea ice retreats. It will also be predictable months in advance, allowing shippers to make their sailing plans.

Since the global climate appears slightly warmer, this new regime can become stable and will be reinforced by the substantial disappearance of permanent sea ice. We can expect that the annual sea ice over all of the Arctic Ocean will almost totally disappear every summer once the permanent ice is gone. The sailing season will still be about thirty days and the Passage will still be the route of choice.

I do not know the precise transit times that will be achieved by shippers at this point. A layover and slow movement through the narrow straits is likely necessary but could be built in as a matter of course. If Lancaster Sound is open, then an extra day or two could be saved.

In the meantime, the map distance is around 4000 kilometers from Shanghai to the Bering Strait. The Arctic leg is an additional 3500 kilometers to the mouth of Lancaster Sound, or 500 less if the sound is open. The rest of the route to Rotterdam is another 4000 kilometers or so. This all suggests that the trip could be comfortably made in fifteen days. In fact, with careful scheduling, it should be possible for a one stop bulk transporter to complete a round trip in a single season.
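As a quick sanity check on the fifteen day figure, here is the arithmetic using the leg distances given above and an assumed service speed of about 20 knots. The speed is my assumption, not a figure from the shipping trade.

```python
# Rough sanity check of the fifteen day Shanghai to Rotterdam transit
# suggested above. Leg distances come from the text; the 20 knot service
# speed is my own assumption.

KM_PER_NM = 1.852  # kilometres per nautical mile

legs_km = {
    "Shanghai to Bering Strait": 4000,
    "Bering Strait to Lancaster Sound": 3500,
    "Lancaster Sound to Rotterdam": 4000,
}

speed_knots = 20
speed_km_per_day = speed_knots * KM_PER_NM * 24  # about 889 km per day

total_km = sum(legs_km.values())
days = total_km / speed_km_per_day
print(f"{total_km} km at {speed_knots} knots takes about {days:.1f} days")
```

At 20 knots the open water legs alone come to about thirteen days, leaving a couple of days of slack for the slow passage of the narrow straits, which is consistent with the fifteen day estimate.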

This will open the door for late season cruise ships between Vancouver or Seattle and Halifax or New York in either direction. Again, it is easy to schedule around four weeks of safe passage time.

Wednesday, August 29, 2007

Methane fears

We have seen a lot of enthusiasm lately for methane's potential as a so called greenhouse gas.

Methane is a very strong greenhouse gas. Since 1750, methane concentrations in the atmosphere have increased by more than 150%. The primary sources for the additional methane added to the atmosphere (in order of importance) are: rice cultivation; domestic grazing animals; termites; landfills; coal mining; and, oil and gas extraction. Anaerobic conditions associated with rice paddy flooding results in the formation of methane gas. However, an accurate estimate of how much methane is being produced from rice paddies has been difficult to ascertain. More than 60% of all rice paddies are found in India and China where scientific data concerning emission rates are unavailable. Nevertheless, scientists believe that the contribution of rice paddies is large because this form of crop production has more than doubled since 1950. Grazing animals release methane to the environment as a result of herbaceous digestion. Some researchers believe the addition of methane from this source has more than quadrupled over the last century. Termites also release methane through similar processes. Land-use change in the tropics, due to deforestation, ranching, and farming, may be causing termite numbers to expand. If this assumption is correct, the contribution from these insects may be important. Methane is also released from landfills, coal mines, and gas and oil drilling. Landfills produce methane as organic wastes decompose over time. Coal, oil, and natural gas deposits release methane to the atmosphere when these deposits are excavated or drilled.
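The "more than 150%" figure quoted above is easy to sanity check in a couple of lines. The two concentrations used here, roughly 700 ppb before 1750 and about 1750 ppb in recent years, are my own round figures, not taken from the quoted passage.

```python
# A two line check of the "more than 150%" increase quoted above, using
# round figures of my own: roughly 700 ppb of methane before 1750 and
# about 1750 ppb in recent years.

preindustrial_ppb = 700
modern_ppb = 1750

increase = (modern_ppb - preindustrial_ppb) / preindustrial_ppb
print(f"increase since 1750: {increase:.0%}")
```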

Table 7a-1: Average composition of the atmosphere up to an altitude of 25 km.

Gas Name          Chemical Formula   Percent Volume
Nitrogen          N2                 78.08%
Oxygen            O2                 20.95%
*Water            H2O                0 to 4%
Argon             Ar                 0.93%
*Carbon Dioxide   CO2                0.0360%
Neon              Ne                 0.0018%
Helium            He                 0.0005%
*Methane          CH4                0.00017%
Hydrogen          H2                 0.00005%
*Nitrous Oxide    N2O                0.00003%
*Ozone            O3                 0.000004%

(* variable gases)
I want you to observe that everything in this list is at its lowest oxygenation level, with the sole exception of methane. Also observe that CO2 is some 200 times more abundant. The reason is that methane is almost as good a rocket fuel as hydrogen; we usually call it natural gas when we use it to heat our homes. It is the one gas that has all the cards stacked against its survival.

Even with all the rice paddies, termites and cows hard at work producing methane, all the plants on earth consuming CO2, and nothing consuming methane except oxidizers, CO2 content still exceeds CH4 content by a factor of 200.
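That factor of 200 can be checked directly from typical mid-2000s concentrations. The inputs here are my own round figures: about 380 ppm of CO2 against roughly 1.8 ppm (1800 ppb) of methane.

```python
# Checking the "factor of 200" with typical mid-2000s concentrations.
# The inputs are my own round figures: about 380 ppm CO2 and about
# 1.8 ppm (1800 ppb) CH4.

co2_ppm = 380
ch4_ppm = 1.8

ratio = co2_ppm / ch4_ppm
print(f"CO2 exceeds CH4 by a factor of about {ratio:.0f}")
```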

This entry also makes the claim that since 1750, methane content has increased 150%. Who was measuring? Most certainly this has to be an educated guess linking human population growth and the related growth in agriculture to the current regime. In other words, our rubbish and our farms have simply discovered new ways to produce methane as our numbers grew.

The point is that CH4 is produced in normal biomass combustion and almost as swiftly consumed. This is not true at all for carbon dioxide.

Yesterday we addressed sustainable biochar production. Much concern was expressed over the production of combustibles like CH4 that escape into the atmosphere. And a natural earthen field kiln will lose a lot of combustibles in this manner, and not just methane. My description of the inexpensive modified incinerator design took advantage of that outgassing to fuel a second high temperature oven whose heat was then used to accelerate the carbonization process.

That solution is possibly available to industrialized agriculture. It is certainly not an option for everyone else, and may be suspect even where the equipment is available. Of course, even more expensive systems can be deployed for a very small incremental gain.

The point that I want to make is that the primitive earthen kiln and my incinerator are separated only by efficiency. I would reasonably expect the incinerator to produce perhaps twice as much product, but that gain comes with a sharp increase in haulage costs. I also point out that the jury is truly out as to the quality of the end product. The kiln promises a more uniform end product, but that may not be as advantageous.

In either case, gases are produced once a year for any plot of land and are then swiftly mopped up by the local environment.

The objective after all is to put carbon into the soil for a long time. Both these techniques do just that. The only other technique that convincingly does the same thing is the growth of new forests. Every other agricultural process that we have created is in a constant struggle to just maintain carbon content and related fertility. Terra preta promises to end this ten thousand year struggle forever.

Every subsistence farmer now has the option of either burning all his plant waste out in the open field as he has done forever, or building an earthen kiln and producing a few tons of biochar as fertilizer for next year's crop. He does not need a dime from his government to do this, nor does he need any special equipment that he does not already have. Until now he had no other choice, and he has been the source of the monster smoke clouds rising out of Asia's rainforests. Now those smoke clouds can become a fraction of what they are today while he mops up carbon for us and strengthens his farm's fertility.

Tuesday, August 28, 2007

Machinery and Biochar

I posted a draft of this on the Terra Preta and Black Earth news groups.

I cannot help but think that the methods used to produce the black soils must have been self sustaining and indigenous to the farm itself. I also fail to see fairly large pieces of wood charcoal, which are difficult to pulverize properly, as a very viable source. Remember that grinding has a natural sizing limit, past which a great deal of effort is needed. It would be much better if the material came already sized.

Without question the use of corn stover to build natural earthen kilns is a great solution when we are relying on hand labor alone.

We also can conclude that corn stover is the best available source of large volume biochar. It needs to be central to any program simply to ensure 100% coverage of the fields with sufficient biochar. There is little enthusiasm for a system that depletes fifty acres to benefit five.

Is there a way to do this in the field with equipment?

Let us return first to best hand practice. From there we can speculate on how this can be made easier with power equipment.

We do not know how the Indians in the Amazon did this but we certainly know how they grew corn everywhere else.

In North America, they used a ternary system.

That meant that they cleared a seed hill, likely two plus feet across and perhaps slightly raised, in which they planted several corn seeds and also several beans. These hills would have been at least two feet apart. This means that twenty five percent of the land was being cropped in this way. They also planted every few hills with a few pumpkins. This provided ground cover for the seventy five percent of the land that was not being directly cropped.
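The twenty five percent figure can be checked with a little arithmetic. The sketch below assumes square two foot hills repeating on four foot centres, which is my reading of "two feet across" and "two feet apart".

```python
# Checking the twenty five percent figure. Assume square seed hills two
# feet on a side with two feet of open ground between them, so the hills
# repeat on four foot centres; that layout is my reading of the text.

hill_width_ft = 2.0
gap_ft = 2.0
pitch_ft = hill_width_ft + gap_ft  # centre to centre spacing

square_fraction = (hill_width_ft / pitch_ft) ** 2  # cropped share of each cell
print(f"fraction of land cropped: {square_fraction:.0%}")
```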

An interesting modern experiment would be to grow alfalfa in between the hills in order to fix nitrogen and provide a late fall crop. It unfortunately would likely take too much water. Recall that one of the major drawbacks of corn culture is its demand for ample water, usually in the form of precipitation. This Indian methodology gave the corn a moisture bank in the adjacent soil.

This Indian corn culture system is ideal for hand work and for the production of terra preta by hand.

In September, after the corn, beans, and pumpkins are picked, it is time to remove the drying corn stover and bean waste. The pumpkin waste will be trampled into the ground fairly easily by now.

Hand pulling the stalks from one seed hill gives you a nice bundle to carry off the field to where an earthen beehive as described in my July posts is built for the production of Terra preta.

How we accomplish the same result with equipment is a more difficult question. Using a stone boat or wagon is obvious. A hydraulic grabber of some sort to pull the bunch associated with a hill would be very helpful. Tying the bundles would also be helpful.

This would allow two workers to clear a larger field quite handily.

After the earthen field stack is set up, the rest is fairly simple. A wagon full of biochar is taken to the field and each hill is replenished with biochar before planting. This is still a lot of labor but much easier than the most basic hand only system.

To do this with row agriculture will mean the creation of some fairly complex lifting and baling machinery. One method is to use a row of spikes that can get down into the soil as the machine advances and then rotate back up, pulling the corn root from its bed. The stalks can be beaten onto a tray and perhaps automatically bundled. It would work a lot like a swather with a root lifting modification and sizing appropriate to corn. It sounds like a bit of a challenge to this old farm boy.

If the machinery is designed to produce tightly packed one ton square bales, then we will make the rest of the handling process much easier. Remember my incinerator design? Otherwise we are still able to contemplate field processing as an option for biochar production.

At least we are on the right track.

I have seen a fair bit of comment regarding the outgassing of open air biochar manufacture, which I will be addressing again in another post.

Monday, August 27, 2007

Watching Sea Ice disappear

Last week we listened to press stories pointing out that the sea ice was already at its lowest level ever recorded. And the season had a few weeks yet to run.

I just looked at the next snapshot from August 22 and am even more startled. Appreciate that I have been looking at these maps now for several years and am fairly comfortable in interpreting them.

Over a one week period, the apparent boundaries retreated about 10 to 20 percent and we also have a large expansion of grey tones within the ice floe. This means that the amount of open water within the floe expanded hugely. This ice is still melting fast.

In fact, this is more noteworthy than the reduction of the boundaries, which has been clipping along for the past several weeks. The thinning of the ice floe was masked up to now by its original thickness. For it to grey up so swiftly tells us that weeks of melting are having their effect and that much of the remaining ice is very close to disappearing. In other words, by the time you have extensive open water, it is close to being over.

Go to: for the current cover and then go back one week to compare.

If we were looking at a glacier, we would be describing this as galloping. I suspect that we are looking almost at the end of the melt for this season, but the movement just in one week is huge and one can readily see that another six weeks at this rate would wipe out the polar sea ice in its entirety. These are pictures that do not lie.
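The "another six weeks" remark is simple linear extrapolation. Here is a trivial sketch of the arithmetic, assuming the 10 to 20 percent weekly retreat described above just continues at a steady rate; it is purely illustrative, not a physical model.

```python
# Linear extrapolation of the retreat described above: if the boundary
# keeps pulling back 10 to 20 percent of the original cover each week,
# how long until it is all gone? Purely illustrative arithmetic.

for weekly_loss_pct in (10, 20):
    weeks = 100 // weekly_loss_pct  # weeks to lose 100% at a steady rate
    print(f"{weekly_loss_pct}% per week means gone in about {weeks} weeks")
```

That brackets the six week guess: five weeks at the fast end of the observed retreat, ten at the slow end.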

It has been stated that three principal melts will clean out the ice. We had number one in 1998, with no recovery in the intervening years. This year we are having number two. After this year, I suspect that there will be almost no multiyear ice of consequence left, setting the stage for a number three melt to come along in the next decade.

It is noteworthy that, however much the amount of ice is reduced, the sea ice continues to span the Arctic, cutting off shipping from the over the pole route through Russian waters. In a way it is a natural fluke that this is so. Even if the Arctic warmed up enough to ensure a clearing of the ice every year, it appears that a northern route will continue to be impractical, or at least a daring gamble at best.

Friday, August 24, 2007

Margin of Error and Global Warming

How do we obtain an accurate measure of the several forces at work affecting our climate? We have just been reminded that cloud cover is impossible to properly model at all. This means that whatever factor or function is assigned to its effect, its statistical error range will be huge.

For any given point of earth, the local temperature can already be safely written as T + or – 50 degrees F. Ocean temperature is very consistent but its volumetric flow rate is anyone’s guess. Remember that we measured the apparent volume of the Gulf Stream recently and found it had apparently declined by around forty percent since 1957. With two data points, we have no clue if it is significant. More recent work suggests that the effect is much more variable than we ever guessed.


I think the important conclusion we can draw is that the globe has several mechanisms whose variation within their natural ranges is quite capable of shifting global temperature around by the orders of magnitude that we are experiencing and have experienced historically. These same mechanisms must also move to moderate any such temperature variation. The sense is that if a trend goes too far in one direction, counterbalancing triggers kick in, and a lot sooner than is obvious.

We recap the mechanisms:

We have the man made direct impact of particulate production that is allowing more heat to be absorbed by the atmosphere.

We have the warming of the North Pole, which, if sustained, will eventually induce a warmer and perhaps wetter Arctic.

We have agriculture, which has historically been a releaser of carbon, perhaps now about to become the major collector of all the carbon ever produced, and perhaps a much larger absorber of solar energy through expansion into the deserts.

We have the speculation that cold water from the South Atlantic has periodically been injected into the Atlantic with major chilling effects on Europe and North America. The south polar sea is the primary engine of cooling on this planet because of the unusual location of Antarctica and the related circumpolar current. Recall that all the cold water available for cooling in the Pacific comes from Antarctica.

Last but not least we may have the possible impact of the greenhouse mechanism.

Those are a lot of levers to juggle in any atmospheric model. And we truly need a thousand years of data to secure any knowledge that we can trust. What we have now is the knowledge that it has been hotter and it has been colder.

In the meantime, the best that we can hope to do is to regulate our global society to eliminate non carbon atmospheric pollution and to sequester carbon in all agricultural croplands as an economic bonus. And as an extra bonus, we want to grow forests on all the dry lands which will nicely double the amount of land under tree cover, absorbing solar energy and using it to sequester carbon.

Thursday, August 23, 2007

Alan Caruba writes on Clouds

I quote in its entirety this article on the state of the art regarding the impact of clouds on global climate. I am actually taken aback by the assertion of this fundamental flaw in climatic modeling. The impact of cloud cover is neither minor nor ignorable. Yet we learn that it has been handled with essentially a fudge factor.

Yet clouds form in response to heat buildup and particulate matter in the atmosphere, transporting this heat downwind at the very least. It is no accident, for example, that cloud formation will often end at the boundary between cropland and forest.

I was uncomfortable with the various assertions emanating from the modelers. I now think the likelihood of the models being little more than an uncomfortable match up with the data is extremely high. You know that it is possible to map the trajectory of a cannon ball with a series of mathematically easy straight lines. It sort of works, but it is still rubbish.

A Cloudy Mystery
By Alan Caruba | August 22, 2007

There's a reason why one should be extremely wary of the computer models that are cited by the endless doomsday predictions of Al Gore, the UN's International Panel on Climate Change, and all the other advocates of "global warming."

The reason is clouds. Computer models simply cannot provide for the constant variability of clouds, so they ignore them.

In a July issue of The Economist there was an article, "Grey-Sky thinking" subtitled, "Without understanding clouds, understanding the climate is hard. And clouds are the least understood part of the atmosphere." Since the increasingly rabid claims of Earth's
destruction from rising temperatures depend on computer modeling, how can they be regarded as accurate if they must largely exempt or deliberately manipulate the impact of clouds?

How can you make predictions, whether it's a week or a decade from now, if you haven't a clue why clouds do what they do?

Tim Garrett, a research meteorologist at the University of Utah, with refreshing candor has said, "We really do not know what's going on. There are so many basic unanswered questions on how they (clouds) work." And that is never mentioned in the great "global warming" debate, one we are continuously told is "decided" and upon which there is a vast scientific "consensus."

This is particularly significant because clouds act to both cool and warm the Earth. It is widely believed that high clouds can reflect solar radiation away from the planet, but they can also serve to trap heat in the atmosphere. New studies, however, have given some cause to reconsider this. Moreover, cloud droplets can last for less than a second while whole clouds can live out their lives in minutes or days. There is no way to integrate such massive, constant change into a computer model that divides the world into boxes up to sixty miles on a side, so they mostly do not.

This is why there are two new missions by the National Aeronautics and Space Administration involving highly sophisticated devices to measure and study the actions of clouds. This is also why, up to now, the computer models on which "global warming" claims have been made have actually been tweaked, adjusted, manipulated-take your choice of terms- to factor in the mystery of clouds.

How wide is the computer modeling gap when it comes to predicting the weather? The Economist reported that, "In a recent paper in Climate Dynamics, Mark Webb of Britain's Hadley Centre for Climate Change and his colleagues reported that clouds account for 66% of the differences between members of one important group of models and for 85% of them in another group." Clouds simply defy the logarithms of computer modelers.

In short, "Too much still remains unknown about the physical mechanisms that determine cloud behavior," said The Economist. Here's a useful scientific definition of the weather: "atmospheric conditions at a given time and a particular location." Drive a few miles in any direction and the weather is likely to be different. Stay put and it will change soon enough. My other favorite definition is "chaos."

In an August 2002 article, "The Trouble with the Weather", the European Space Agency noted that, "Forecasting the weather remains notoriously difficult because the atmosphere is not easy to predict, being affected by such factors as air pressure and temperature, air
movements, the distribution of water in its various states (clouds) in the atmosphere, and static electricity stored in the air."

"Clouds are that 800-pound gorilla," says research meteorologist, Gerald Mace, also of the University of Utah, referring to the critical role they play in the weather on any portion of planet Earth.

That gorilla, however, is never mentioned by the "global warming" propagandists. Neither clouds, nor volcanoes, nor the most important factor, the Sun, is credited as responsible for either the climate or the weather. Instead, we are constantly told that "human activity" is the single cause.

Unmentioned, too, is the fact that water vapor constitutes 95% of all greenhouse gases. Environmentalists insist that carbon dioxide plays a major role. It is well to keep in mind, however, that CO2 is the gas that is vital to the growth of all vegetation on Earth. Nor do global warming advocates remind people that the Earth is at the end of the interglacial period between Ice Ages which suggests another one is due any day now.

Indeed, the only global warming that is occurring has been happening since the end of the last mini-Ice Age in the 1800s. It is a natural response and is not a dramatic rise of four to ten degrees. It doesn't even represent one-half a degree increase.

Following the publication of the results of new study in the journal of the American Geophysical Union revealing that the absence of clouds actually had a cooling affect-the opposite of widely held opinion on the role of clouds-Dr. Roy Spencer of the Earth System Science Center noted that, "To give an idea of how strong this enhanced cooling
mechanism is, if it was operating on global warming, it would reduce estimates of future warming by over 75 percent. The big question that no one can answer right now is whether this enhanced cooling mechanism applies to global warming."

If leading meteorologists remain largely ignorant of why clouds do what they do, why would we pay any attention to those with a financial or ideological incentive to propagate "global warming" claims? There is, however, a difference between being ignorant and being stupid. Believing the "global warming" lies is stupid.

Alan Caruba is a veteran business and science writer. Since founding The National Anxiety Center in 1990 as a clearinghouse for information on "scare campaigns", he has become a nationally known commentator on a wide range of topics of interest and concern to many Americans.

Wednesday, August 22, 2007

Global Dimming

I dug up some numbers for the effect that particulate pollution is having on the amount of sunlight reaching the Earth's surface. It is not trivial. I quote the following piece from an article on the subject.

The effect was first spotted by Gerry Stanhill, an English scientist working in Israel. Comparing Israeli sunlight records from the 1950s with current ones, Stanhill was astonished to find a large fall in solar radiation. "There was a staggering 22% drop in the sunlight, and that really amazed me," he says.

Intrigued, he searched out records from all around the world, and found the same story almost everywhere he looked, with sunlight falling by 10% over the USA, nearly 30% in parts of the former Soviet Union, and even by 16% in parts of the British Isles. Although the effect varied greatly from place to place, overall the decline amounted to 1-2% globally per decade between the 1950s and the 1990s.
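The quoted 1-2% per decade can be compounded over the four decades from the 1950s to the 1990s to get a feel for the global total. The compounding itself is my assumption, since the article gives only the per-decade rate.

```python
# Compounding the quoted 1 to 2 percent per decade over the four decades
# from the 1950s to the 1990s. The compounding is my own assumption; the
# article gives only the per decade rate.

DECADES = 4

for per_decade in (0.01, 0.02):
    remaining = (1 - per_decade) ** DECADES
    total_drop = 1 - remaining
    print(f"{per_decade:.0%} per decade gives a {total_drop:.1%} total decline")
```

That puts the compounded global figure in the 4 to 8 percent range, well below the local extremes of 10 to 30 percent quoted for individual regions.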

Gerry called the phenomenon global dimming, but his research, published in 2001, met with a skeptical response from other scientists. It was only recently, when his conclusions were confirmed by Australian scientists using a completely different method to estimate solar radiation, that climate scientists at last woke up to the reality of global dimming.

Dimming appears to be caused by air pollution. Burning coal, oil and wood, whether in cars, power stations or cooking fires, produces not only invisible carbon dioxide (the principal greenhouse gas responsible for global warming) but also tiny airborne particles of soot, ash, sulphur compounds and other pollutants.

The bottom line is that the radiation is absorbed into the atmosphere rather than by the land and ocean. Although the comment is made that the portion reflected back into space is altered, that is pure speculation at this time. The light is mostly absorbed by the particulates and their associated water droplets and then reradiated as heat.

Although some have been quick to draw conclusions and wave around guesses masquerading as fact, the only thing we can be sure of is that the shift is significant and we surely are to blame for it.

And once again it is wise husbandry to end the causative practices. The atmosphere has turned out to be a far less forgiving sponge than we ever guessed for all our wastes. Just as a factory can destroy the ecology of a river, it appears that industrial man can amazingly disturb the global atmosphere in unwanted ways.

Tuesday, August 21, 2007

Mel Landers and Jacky Foo on Field Testing Corn Stover Stacks

The following posts from the Black Earth Soils newsgroup (Bionet) are well worth reading, since we are addressing the possibility of restarting hand production where it is the only option.

Mel Landers:

arclein wrote: Their principal option was to use corn stover and I show how.

Thank you for sharing this idea. This is very helpful. Having grown Maize for a number of decades now, I can attest to what you have stated about its stacking capacity.

I also know the difficulty in utilizing Maize stumps if you don't burn them (not that I ever thought to do so). The Amazon Basin can grow an amazing amount of biomass in a short period of time. (I have attached a photo I took four decades ago, when I introduced slash and mulch methods to the Urarina of Peru. It shows the abundance of biomass left on the soil after cutting tropical kudzu.) But maize is a challenge to grow in that environment. The fact that maize pollen is so common attests to the value of dark earth soils and their ability to retain nutrients.

It makes sense that women would long ago have turned to firing their pots in order to increase their strength and longevity. The clays in the upper Amazon Basin are high in sand. The area is one big flood plain with continual deposition of sand. If the same is true in the lower Amazon, their pots likely needed firing. I have also attached a photo of an Urarina woman making a pot. Notice the grey color of the clay. The pot I brought back with me had a very rough texture, due to the sand.

Why not turn to maize stumps to produce a high temperature fire? Place that fire under the soil, in an impromptu dirt oven, and you have maize charcoal. It would be easily powdered, and once your soils started improving, it would have been plentiful as well. It is a short step ahead to do the process specifically for soil improvement. If anyone doubts that they might do this, they need to read the information written up by Suzanna Hecht on the practices of the Kayapo.

Here is a type of biomass that is even plentiful in the temperate zone. O.K.....I can hear Bob thinking....But, how can I stack the Maize stumps from a whole section of land. That is where a large scale pyrolysis retort comes in. But, here in Nicaragua and in many other maize growing regions of the world. Stacking by hand makes good sense.

Nicaraguan producers already think I'm crazy for wanting the grass they cut off the fields in preparation for plowing. Now they will think I am totally insane for requesting their maize stumps as well. This should be interesting! Thanks again!


This is an excellent validation of the proposed mechanism.

arclein wrote:

Hi, I did a post describing a method of producing terra preta soils using only primitive stick agriculture. Their principal option was to use corn stover and I show how.

I am hesitant about other feed stocks in general being as forgiving as corn stover, but that has to be shaken out through practice.

I also describe a modified incinerator design to utilize a full range of biomass in later posts.

The astounding revelation is that the Indians sustained continuous agriculture in the Amazon for centuries.

See my post at: carbonizing-corn-in-field.html

This has turned out to be my most popular post to date. Enjoy the site.

From Jacky Foo

Hi arclein

I checked your profile but found no "real name" and therefore I address you as "arclein".

I have not made charcoal nor charred materials before and therefore I ask you.

Q: have you tested your idea of "carbonizing corn in the field" as described (Wednesday, July 4, 2007) in the link provided above? Or is there a drawing of what you described anywhere?
>...the Indians in the Amazon likely created windrows that they
>then lightly buried and set afire. Your idea sounds very logical if the Amazonians were making charred materials (from corn stalks with their roots intact). But did they make charred materials to fertilise their soils, or were charred materials simply a by-product of their burning away of agricultural wastes (corn stalk and tapioca stems)? (your message of Friday, July 6, 2007, "Those amazonian soils")

Given that we now want to make charred materials and we have corn stalks with their roots intact, the idea of stacking a windrow of two rows of corn stalks with their roots to form the outside walls is a good one.

So let's say I have 5 acres of corn where I could get 50 tons of stover. I have no machinery (nor container to make a kiln), just bare hands of the workers e.g. in Kenya.

How big (length and height) would this single windrow be? What materials can I use to make the outer wall? ...etc.

regards jacky foo

arclein wrote:

However they began doing this, the rewards were immediate inasmuch as the soil retained fertility that would have completely disappeared in perhaps three years.

The volume of corn stover made this possible over the whole growing area so that there was no lack of biochar even at the very beginning. Most other likely sources would restrict you to treating a fraction of the original cropped area and likely not be very sustainable.

Right now, we are speculating. I would actually try to build a circle with the roots on the outside and see if it is possible to build a beehive shape as an experiment. I would leave a central chimney, probably because I had to, and fill the bottom of it with a well stamped mass of biowaste.

Once the beehive had reached the point of almost being closed off, I would throw a large mass of glowing coals into the chimney and then fill the chimney with corn stover with a dirt capping. Then I would stand by and shovel dirt on any breakthrough for the next few hours.

We can try other methods of stacking once we have a little experience. And no, no one has done this yet and I am keen to see how it goes.



Monday, August 20, 2007

Solar Output

The one theory that has dogged all debates on global climate has been the idea that solar output has varied far more at other times in our history than it does today. The fact is that the sunspot cycle is clearly associated with a 1.5 watt per square meter variation in the energy arriving at the upper atmosphere, out of a total of 1366 watts per square meter. To dismiss its effect as slight is appropriate and also reassuring. There can be no surprises there. And unless some nuclear mechanism exists for which we have no evidence, the sun will continue to burn hydrogen at the same rate - i.e. full out - for a few billion years more.
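As a sanity check on "slight," here is a back-of-envelope sketch. The irradiance figures are from the paragraph above; the 255 K effective radiating temperature and the Stefan-Boltzmann scaling dT/T = (1/4) dS/S are my own standard-physics assumptions, not claims from the post.

```python
# Back-of-envelope estimate of the sunspot-cycle effect on temperature.
# Assumptions (mine): Earth's effective radiating temperature ~255 K,
# and the Stefan-Boltzmann scaling dT/T = (1/4) * dS/S for small changes.

S = 1366.0     # W/m^2, total solar irradiance quoted above
dS = 1.5       # W/m^2, sunspot-cycle variation quoted above
T_eff = 255.0  # K, assumed effective radiating temperature

fraction = dS / S              # relative variation in solar output
dT = 0.25 * fraction * T_eff   # implied equilibrium temperature swing

print(f"relative variation: {fraction:.2%}")     # about 0.11%
print(f"implied temperature swing: {dT:.2f} K")  # under a tenth of a degree
```

Even before any feedbacks, the direct swing works out to well under a tenth of a degree, which supports dismissing it as slight.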

In fact, the only way to reduce solar output is to remove yourself to very high latitudes where it is possible to grow a polar ice cap for two thirds of the year.

Of this incoming energy, some is reflected away, a lot is absorbed by the atmosphere and the rest is absorbed by land and sea. And yes ladies, I know that if it were not for the biosphere it would be a big ice ball.

What I want everyone to observe is that the only part of this that can truly vary and respond in the short term is the atmosphere. Massive reforestation of the Sahara would change the dynamics but would also require a century, as would similar land based modifications, however desirable. The sea is a sponge for energy that shifts at a rate of perhaps 1500 miles per year. It must take several years for heat at the equator to be exchanged for polar cold.

Now the heat content of the atmosphere in the tropics is already maxed out, or should be. That means that if the atmosphere is to absorb more heat, it must naturally shift that heat towards the poles, as it may be doing.

We now enter the realm of the ongoing debate, driven only by the fact that our apparent climate has warmed up slightly. These current variations are still within what we know of long term historical cycles. The debate is over humanity's contribution.

And here is where I draw the line. The use of the atmosphere as a CO2 dump and a particulate dump is wrong irrespective of any linkages to climate change. We have already discovered that we can resolve both problems completely and economically. We can even provide self sustaining livelihoods for a billion families using what we have discovered.

We need a global regulatory mandate to eliminate stack gas emissions, exclusive of CO2 and to convert agriculture over to Terra Preta practices for soil enhancement and carbon sequestration. And there needs to be no break for anyone. The biggest beneficiaries will be the developing world.

Friday, August 17, 2007

Current Polar Sea Ice Maps

One of my favorite sites is:

We get a map of current sea ice coverage and a second map showing current change against the twenty year average. Don’t miss the second map.

I personally think the sea ice could be mostly gone within the next ten years. When I started watching this several years ago I thought I was being brave to predict as early as fifteen years. I had to wait for confirmation of the current speed of the melt. It is coming in spades.

I reasoned that you do not lose sixty percent of ice thickness over 40 years on a linear basis. Yet we had only two points of reference: the 1958 International Geophysical Year, when an ice thickness survey was conducted by submarine, and a repeat survey at the turn of the century. The difference of sixty percent was unanticipated.

The majority was likely lost in the last third of those 40 years or over a period of about fifteen years. That meant that the remaining forty percent should largely disappear in the succeeding fifteen years. And at some point toward the end it should flush out very quickly like a spring breakup since most of the long term sea ice will have been eliminated.
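The arithmetic behind that extrapolation can be sketched out. The 60% figure and the 15-year window are from the post; the assumption that two-thirds of the loss came in the final 15 years is my illustrative stand-in for "the majority."

```python
# Sketch of the sea ice extrapolation argument. The 60% thickness loss
# between the 1958 and ~2000 surveys is the post's figure; the share of
# that loss assigned to the final 15 years is an illustrative assumption.

loss_total = 0.60         # fraction of original thickness lost 1958 -> ~2000
recent_share = 2.0 / 3.0  # assumed share of the loss in the last 15 years
recent_years = 15.0

recent_rate = loss_total * recent_share / recent_years  # recent loss per year
remaining = 1.0 - loss_total                            # thickness still left

years_left = remaining / recent_rate  # time to lose the rest at that rate
print(f"recent loss rate: {recent_rate:.3f} of original thickness/yr")
print(f"years to lose the remaining 40%: {years_left:.0f}")  # ~15
```

Under that assumption, the recent loss rate alone clears out the remaining 40% in about fifteen years, which is the post's conclusion.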

In the meantime it is fun to watch. Note that grey areas within the strong white areas likely reflect standing water at the least, and semi open water in the main. I think that the grey areas have been growing larger and more widespread each year as well. We still have a month left in this season.

Won't everyone be surprised?

I am looking forward to a cruise through the sea ice to the North Pole and northern Greenland and Ellesmere Island. Since it was this hot in 1421, writer Gavin Menzies' speculation on Zhu Di's Chinese expedition through this area may even have it right. I had thought that prospect a complete stretch, even with the map evidence.

I also would like to note that the polar seas will be covered every winter with seasonal sea ice that will break up and almost disappear over a fairly long summer season. It will still be as inhospitable as ever.

And we seriously need a cold snap up there.

Tom Miles comments on Biochar production costs

This is from the Terra preta site.

Agreed. Production and use of the charcoal on the farm is
not trivial. It's at a different scale than commercial
charcoal production but it is done with a purpose. That
purpose is clearly defined in your case. It is not yet
clear in many cases.
The actual cost may exceed the current returns on the
investment of labor and capital but the value
(cost/benefit) may not be calculated in strictly current
economic terms. That's not uncommon when developing new
technologies or applications, so I jokingly say that it
must be amortized on its entertainment value. The point
is that there must be a purpose, a product and a value.
Serious farm production of biochar in our area will be
regulated in a similar manner as outdoor wood boilers:
systems will have to comply with air, soil and water
quality regulations. The amount of regulation will
depend on the scale of the charcoal production. Let's
look at scale.
In your plots you have used 30 gallons (4 ft3) or 60 lbs
(4 ft3 x 15 lb/ft3) of charcoal in 85 ft2 plots
(5 ft x 17 ft = 85 ft2). 60 lb/85 ft2 = 0.7 lbs/ft2, equal
to about 14 tons of charcoal per acre. If your planted
area is 50% of the total area you would use 7 tons of
charcoal per acre.
If you used a kiln the size of Robert Flanagan's (1.5
tonnes [1.65 t] biomass per charge) you would produce
about 0.66 tons per day (at two charges/day) or 20 tons
of charcoal in 30 days.

So if you ran Flanagan's kiln for 30 days at two charges
per day you could treat about 3 acres per year (20
tons/7 tons per acre). In 15 years you'll cover the
whole 45 acre nursery. 1.65 tons/8 hours with wood
vinegar recovery would exhaust about 3 MMBtuh which is
large enough to be regulated in some states.
If you treat 5 acres per year that's 35 tons of
charcoal per year representing 175 tons of biomass (35
tons charcoal/20%) per year. If you make your own
charcoal at 5 tons of charcoal per day (175 tons/30
days = 5.8t/day) each kiln charge would be about 25
tons of biomass/24 hours or 1 ton per hour (2 big bales).

Your kiln will be rated at about 12 million Btuh(80%
biomass x 15 MM Btu/ton x 1 ton/hour) if no oil is
recovered, or 5 million Btuh if just the offgas is
burned to drive the process of making oil and char.

Either way you have a system large enough that it
will be regulated for particulate, CO and NOx emissions.
A system of this size is likely to be operated as a
stationary production facility operating 250 days per
year (6250 tons biomass or 1250 tpy charcoal). Large
bale combustors of the 1980s (Agrifurnaces, IA) were
rarely moved. Most systems included debalers like the
farm scale straw burning gasifiers and boilers of today.
A farm scale charcoal system might include the same
amount of equipment as Vidir's Greenhouse Gas
Displacement system which gasifies straw to replace
natural gas for heating poultry houses.

Vidir's smallest system consumes 500lb/hr (3 Million
Btuh) of wheat straw. If built as a pyrolyzer it would
produce 100 lb charcoal and 1-2 million Btuh heat. The
system cost is $200,000. Annual operating cost with straw
at $10/bale is estimated at $16,000. Labor is figured at
3 hours per day $15/hr. Economics are based on 6 months
operating time (in Manitoba) or 375 tonnes (752 x 500
kg bales/year).

At 20% yield that would produce 82 tons (75 tonnes) of
charcoal which could treat about 12 acres (at 7 tons/
acre). In four years you would produce enough charcoal
for a 40 acre farm. At $200/ton the charcoal would be
worth about $16,400/year which would just offset the
operating costs but not capital. If you had a use for
the heat (2 million btuh x 70% to hot water = 1.4
MMBtuh, 33.6 MMBtu/day) in 30 days you would recover
more than $12,800 additional revenue to help pay for
the plant. In six months you would recover $16,400 in
charcoal value and $76,800 in heat savings. So the
payback could be 4 years with heat recovery. To see
a system like that in operation would be entertaining.
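For convenience, the key arithmetic in Tom's comment can be gathered into one short script. All of the input figures are his; the rounding differences and the note on payback are mine.

```python
# Gathering the arithmetic from Tom Miles's comment. Input figures are
# his; rounding differences and the payback note are mine.

# Plot application rate
char_lb = 4 * 15                  # 4 ft^3 of charcoal at 15 lb/ft^3 = 60 lb
plot_ft2 = 5 * 17                 # 85 ft^2 plot
rate_lb_ft2 = char_lb / plot_ft2  # ~0.7 lb/ft^2
rate_tons_acre = rate_lb_ft2 * 43560 / 2000  # ~15 tons/acre ("about 14" above)

# Vidir-scale system over a 6-month season
straw_tonnes = 375                 # tonnes of wheat straw per season
yield_frac = 0.20                  # charcoal yield used in the comment
char_tonnes = straw_tonnes * yield_frac  # ~75 tonnes of charcoal
acres_treated = char_tonnes * 1.1 / 7    # ~82 short tons at 7 tons/acre

# Season economics (his figures)
char_value = 16400  # $ of charcoal at $200/ton
heat_value = 76800  # $ of heat savings over six months
operating = 16000   # $ straw cost
capital = 200000    # $ system cost

net = char_value + heat_value - operating
simple_payback = capital / net
# Simple division gives ~2.6 years; his more conservative "4 years"
# presumably allows for labor, downtime and other costs.

print(f"application rate: ~{rate_tons_acre:.0f} tons/acre")
print(f"season output: {char_tonnes:.0f} t char, ~{acres_treated:.0f} acres")
print(f"simple payback: ~{simple_payback:.1f} years")
```

Running the numbers this way makes it easy to swap in local figures for straw cost, yield or charcoal value and see how the payback moves.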

Getting the job done - Biochar on the modern farm

Getting the job done on the modern farm is a challenge that needs to be confronted on a capital sensitive basis. A good analysis of the problems facing us comes from Tom Miles over at the Terra Preta website (see links). I have also posted one of his posts today, and the reader can get a taste of the current debate by visiting the Terra Preta link.

The rest of the world still relying on traditional agriculture can readily use the corn culture biochar stack that we believe created the Terra preta soils in the first place and have described earlier. This requires no capital investment whatsoever and likely achieves satisfactory results. It would be ironic if it turns out to be the best system which it reasonably could.

For the modern farm, I have proposed the application of a modified incinerator to produce Biochar. My first description came in my June post:

And you may wish to review that. What is becoming painfully clear is that the secondary chamber will have to be fabricated inexpensively, eliminating its secondary usefulness as an incinerator and likely eliminating alternative recovery concepts. The machine needs to be basic and cheap because it cannot be operated year round to produce a premium byproduct.

Let us return to the concept of the modified shipping container. The original intent for this design concept was to deliver low impact incineration to a small municipality. This is achieved by the use of a two step burn. The first burn inside of the fire brick lined shipping container is held to just under 600 degrees by controlling the oxygen supply.

The flue gas, containing volatiles and other nasties (municipal waste, remember) is then vented into a separate much smaller chamber. Fresh air is injected, immediately jumping the temperature to 2000 degrees. This technique bypasses the production of intermediary combustion products that will be an emission problem. The high temperature flue gas can then be sent back into the first chamber as needed to increase the heat of its contents.

The system was extremely successful in largely eliminating emission problems surrounding the hospital waste that had driven the original development of this system.

This same system, built around a steel shipping container and perhaps with a little simplification, can be used to produce a range of low temperature carbon based products ranging from biochar to possibly fully activated charcoal. The sizing is also right for agricultural use, and the implied capital cost should come in at under $50,000 with any level of volume production. I anticipate that a manufacturer will simply supply the second chamber and the control system, while the buyer will acquire and line a shipping container. This will reduce costs even further and avoid shipping damage to the firebricks. A warning though: the second chamber, though comparatively small, must withstand very high temperatures and other stresses. The high performance, engineered municipal model of the secondary high temperature burner was costing out at well over $100,000, since it was cylindrical in shape and the bricks were over twice as thick and specially fabricated.

This system can be readily varied under operation in order to achieve the best possible yield of product including the option of not burning anything in the main chamber at all.

A typical charge of biomass will likely be less than ten tons for anything except wood for a twenty foot container. Something like straw could even be blown in.

As we have posted earlier, the one crop that can produce the most biomass per acre is corn. Corn will make ten plus tons of stover, while any grain crop will make at best one plus tons per acre. There is an order of magnitude difference. That also rather obviously implies an order of magnitude difference in haulage costs.

A farm producing enough corn stover to operate the carbonizer for say 40 days is not likely to have produced other types of waste that would need more than several days of additional operation. This means that the facility will be operated in the fall for a little over a month just after harvest. The produced carbon can be readily stored in preparation for being rebroadcast in combination with fertilizer onto the field originally cropped.

Our output is at least a ton of carbon for each acre of corn grown. We can then anticipate that the farm will be able to add a ton of carbon each year to every acre used for corn production. The increased fertility and the improved soil quality will also lead to an increase in corn production accelerating the process.
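A minimal sketch of that per-acre budget. The 10 tons/acre of stover is the post's figure; the roughly 20% char yield is an assumed value borrowed from the Tom Miles comment quoted earlier on this page, not a measured result.

```python
# Per-acre carbon budget sketch. The 10 tons/acre of stover is the post's
# figure; the 20% char yield is an assumption borrowed from the Tom Miles
# comment quoted earlier on this page.

stover_tons_per_acre = 10.0
char_yield = 0.20  # assumed mass fraction of stover retained as char

char_tons_per_acre = stover_tons_per_acre * char_yield
print(f"char returned to the field: {char_tons_per_acre:.1f} tons/acre/yr")
```

At that yield the budget comes out at about two tons per acre, comfortably above the "at least a ton" figure claimed above.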

This new system now calls for a multi year field test aimed at defining costs and operating parameters and should be done soonest under an agricultural extension program. The visible payoff should come in the form of both sharply increased yields and a reduction in chemical inputs. In other words, the economic model is no different than the old traditional manure cycle of a mixed farm. It promises to just be a much better way.

It is clear that we will only achieve capital efficiency if we make the system a biochar only solution and integrate its use into farm operations in a way similar to the manure spreader. We may end up using the manure spreader to distribute the biochar onto the field. That would even clean the damn thing.

Thursday, August 16, 2007

Acid Rain in a Pipe

One of my frustrations watching the so called march of technology is that the whole problem of smoke stack pollution is readily solvable. Yet we have stayed with old systems, if any are used at all, that only partially ameliorate the problem. We have even exported our smelters offshore and wink at the horrific and noxious pollution thrown into the atmosphere.

Our coal burners currently use a fluidized bed that is charged with limestone. The limestone reacts with the sulphur to produce gypsum while absorbing some heat. This is good for about 60% of the sulphur and little else. Most such gypsum ends as waste. Not a great solution.

In the late seventies I met a technologist who had the simple insight that since natural ozone produces acid rain in the first place, it may be possible to achieve the same result in the stack using the best and fastest oxidizer possible. That is chlorine gas. He patented the idea and became its champion.

He met me and I persuaded him to run proper bench tests under the auspices of the University of British Columbia. This ensured that the results would be credible. After that he continued to champion the protocol with little additional progress, in part due to his own business perspective.

What we developed was a very promising protocol.

The flue gas, whether from a smelter or a coal burner, is well over 600 degrees when it exits the combustion chamber. It is also traveling fast. At this point water is sprayed into the gas stream along with chlorine gas, producing hydrochloric acid in the gas stream. This acid reacts preferentially with the SOx first, and secondly with the NOx, converting them into sulphuric acid and then nitric acid. And the surplus hydrochloric acid is sponged up by the CO2 to produce some carbonic acid. These acids continue to react with any metals in the flue stream, converting them into salts that are usually soluble.

Our bench tests confirmed the implied stoichiometry of the reactions and showed a complete reduction of the SOx and NOx in the flue gas.

The spent flue gas was then sent through a water quench to sponge up any excess chlorine and to strip the heat, acids and salts out of the stream. This also would collect most of the particulate. The end result is a clean stack gas that is primarily CO2 and nitrogen.

In the heyday of the Acid Rain scare, a literature search isolated over 150 separate strategies being explored, all stuck with slower reaction speeds than we could easily achieve with chlorine gas.

What I have just described is an aggressive reaction protocol that can be tuned nicely to be fast and efficient. The capital cost to implement this procedure is minor for a new plant and likely very doable as a retrofit for older facilities. We are only engineering a reaction chamber for the flue gas.

The waste is in the form of a hot solution of acids and salts in addition to the particulate already handled. The solution mix would be run through a small acid plant that would recover the chlorine, and produce both sulphuric acid and nitric acids as salable products. The salts would also presumably be recovered at least as a blend for later processing off site.

The total consumables for a typical power plant would be around one carload of chlorine gas per year.

Of course, even the scientifically literate shy away from the word chlorine, making this protocol a hard sell. But it is the real solution to our second major source of atmospheric pollution.

As an aside, ozone is likely as good. The problem is producing pure ozone. The plasma arc produces mostly nitrous oxide rather than ozone, which is very counterproductive. Other methods of producing ozone are costly compared to chlorine.

Wednesday, August 15, 2007

Atmospheric Pollution Solution

I watched again the Nova program on solar dimming that came out last year. As usual I have a few issues with their conclusions, but what else is new.

What has been recognized is that we are putting a great deal of fine particulate in the atmosphere. In fact, it is huge and the effect is global in extent rather than a local nuisance. Although the interpretation proffered is that this actually lowers global temperatures, I beg to differ. This actually turns the atmosphere into a better heat sink. There are simply more opportunities to capture a photon.

Remember that a thundercloud passing overhead will briefly cool off the surface. Yet that same thundercloud has absorbed vast amounts of solar energy that it is transporting back up into the upper atmosphere to release as condensation energy. Which do you think is greater?

They quote the one degree widening of the diurnal temperature range during the 9/11 grounding as evidence, yet this supports my proposition far better. The loss of water vapor in the high atmosphere reduced its moderating effect on the diurnal temperature spread.

This implies that a large portion of global warming may be entirely due to the unwelcome improved capacity of the atmosphere to retain solar energy. This is primarily caused by industrial smokestack pollution and slash and burn agriculture. Remember, we are concerned about particulates here, although we want to control the noxious chemicals as well.

We have already demonstrated that the conversion of slash and burn over to sustaining Terra Preta corn biochar culture can essentially resolve that part of the problem. This is true even if we use the corn stack field system that created the original soils. Only a small amount of particulate will actually escape into the atmosphere.

That leaves only the industrial smokestack, which I will save for a later post. That problem has been partially solved and can easily be well and cheaply solved.

Tuesday, August 14, 2007

THAI proclaimed a success

A couple of bits of rather interesting news on the energy front lately.

Algeria is building out a solar energy plant in the Sahara that will be able to supply power to 4,000,000 homes. The method will be parabolic trough mirrors concentrating the energy on a fluid holding tube. This is very conventional, and it will be combined with a natural gas plant, obviously to even out the energy flow. The array will cover the area of 45 football fields.

The output will also be tied into the European grid. There is plenty of room for expansion and the project is big enough to induce a drive for maximum efficiency.

Much more interesting on the oil front, is that the operators of the THAI pilot test in the Oilsands of Alberta have proclaimed it a success. It has been operating for a year now and many problems have been worked out. The method consists of running a horizontal production well along the bottom of the formation for perhaps a thousand meters and then drilling an air injection well vertically to the toe area or end of pipe. Air is injected under pressure until ignition is achieved. This creates a char front that releases the remaining oil into the production well. I have been watching this with interest for two years.

Its success opens the door for the exploitation of all the deeper oil sands without the need to burn natural gas to produce hot water or steam. And a production rate of 1000 barrels per day suggests that we can go quite deep.

Up to now, published reserves have been limited to oil available to mining and shallow SAGD prospects. We can expect the SAGD prospects to be converted to THAI prospects and a major increase in suddenly economic deeper reserves to be added.

Canada may turn out to have (a fair guess only) a trillion barrels of recoverable oil because of this technique. It will still take decades to roll out. Also there are a lot of abandoned heavy oil discoveries around the globe that can now be revisited with this technique.

We know that oil supply is getting visibly very tight and that we cannot alleviate it any longer by simply pumping faster. This means however, that at least North America can engineer a soft landing. The bad news is that most of those other global resources are in decline or at least on the edge of decline. If you want to scare yourself to death, look at the decline of North American production after the peak in 1972.

Investing full out we might be able to stand still in the current regime. Yes folks, we need to lick the algae problem.

Monday, August 13, 2007

Oceanic Heat Transfer

We know that the Ice Age ended over 10,000 years ago and we know that it took around three thousand years for the ice to melt to current levels. I discuss the likely reasons for the ending of the Ice Age in my chapter Pleistocene Nonconformity, posted earlier.

Amazingly, we also know that the Northern Hemisphere at least experienced a climate warmer than today's for the next 5000 years, up to around 3000 years ago. In an earlier post, I posited that this was a reflection of the Antarctic cold water mass reaching its maximum extent.

We now live in a semi stable regime in which temperatures in the Northern Hemisphere attempt to converge on their Bronze Age highs, yet are constrained by some mechanism that likely injects surplus cold water into the south Atlantic, ultimately chilling the Gulf Stream by a degree or two. An engine of this nature easily accounts for the Little Ice Age and the known variations that have been experienced.

Currents in the Atlantic are driven around two gyres induced by the Coriolis force. It is very much like the footprint of an eggbeater, with the center flow feeding into the Caribbean. Rather obviously, a major increase of cold Antarctic surface water, which is at least 5 degrees colder at the boundary contact even in the southern latitudes, would have a chilling effect on the equatorial waters that ultimately form the Gulf Stream.

At present, each year, the Atlantic equatorial zone receives a quantity of solar energy that we can conveniently name Q. This quantity does not vary. Q is eventually delivered, almost intact, into the Arctic and is discharged melting winter sea ice. The point that I want to make here is that there is no credible or significant mechanism in the Northern Hemisphere capable of altering this exchange. I question whether it can even be varied significantly by any force at work in the Northern Hemisphere.

Yet a little ice age must have been induced by a reduction in Q delivered to the Arctic. Barring extraordinary variation in solar output, which I think is rubbish, we have only one remaining choice: a sharp jump in the amount of cold water injected into the south Atlantic. The temperature differential is so large that only a moderate shift in flow will be sufficient to achieve our ends.

And we do know that the surface waters around Antarctica can be pulsed by shifts in winds alone. Right now, unfortunately we have little knowledge of the workings of the exchange mechanisms between the South Polar current and the various currents interacting with it. However, learning how to measure the current flow rates of ocean currents (particularly the Benguela Current) is timely, and integrating that information into our models rather wise.

A simple change in the mix of source waters for the Benguela Current could have a huge multiplier effect on Equatorial surface temperatures. Scary thought!

Friday, August 10, 2007

Algae and Farmboy Engineering

As I have commented earlier, algae production is in its infancy and needs to be mastered before it has any impact on our lives.

I recently took the trouble to read material from the naysayers, including those currently and previously involved in the 'industry'. What became clear immediately is that the attempts to date have been few and long on the big money approach to science and engineering. This approach is often doomed to failure because flawed assumptions are often impossible to fix on the fly unless a very gifted fixit type is on staff.

Fortunately, enough money is now going to be thrown at the industry, that perhaps we will win through.

In the meantime, you may recall how I engineered a farm based solution to the task of producing biochar that was cheap and very efficient using shipping containers. An engineer would have looked at the budget of a hundred million or so and built out a haulage rich super plant dependent on the farmer paying an uneconomic price, thus proving it impossible. And the natural economic model that drives the engineering profession does not stop this type of unwitting nonsense.

Let us return to our ideal farm. How in hell are we going to successfully grow the right type(s) of algae long enough to produce the desired products? And remember, the more processing that can be achieved on site, the lower the end cost.

The first thing that I am going to admit is that I have not read anything on the current attempts to produce algae efficiently and know nothing about pumping systems. However, all that sounds like something that I would have to buy from a scientific supply shop which is a little out of my budget.

I will use one premise: my task is to emulate the behavior of an almost stagnant pond, which is where I have seen plenty of algae. This also leads me to speculate that the best end environment will consist of a mix of species rather than an attempt at a monoculture. The trick will be to find a way to manage the algae soup.

I have also learned one thing that is important. The best oil producing algae is heavy and will settle on the bottom. This was represented as a problem. To me this is a separation technique. It would be advantageous if most of the remaining lighter algae were rich in starch and could be drawn off for ethanol production. All of a sudden, I am back in the mining business running a flotation cell.

I think that these are reasonable premises and our task is to discover the art necessary to make it into a paying proposition. Hey, we are back in the wine business. Ever wonder what the first barrel of wine tasted like?

We can start out with a fabricated pond, perhaps three feet wide by say fifty long, or whatever length is convenient and a foot deep. It only has to be able to hold water. There we have many options, but today it is pretty easy to frame in the structure and then roughly lay up fiberglass. It may not be pretty, but this is a farm. Of course the base is on the ground saving us any extra expense.

The one other thing we need to do is to place a platform of tight stiff plastic screen about three inches above the bottom. Perhaps 50 to 80 percent shade. We can use bricks to support the screen. We now can also run hoses along the bottom to support both air and water injection, giving us the flexibility to explore various effects.

That leaves one other thing. We should also place transparent plastic sheets on top of the pond. These can be 4X8 sheets for ease of handling. They will allow us to contain and partially isolate the immediate atmosphere, as well as induce strong warming of the water.

This is actually a pretty good pond. Nutrient mixing can be done in a shed and pumped directly into the pond as can air or better still CO2 if available.

Since the oil rich algae will enrich on the screen (at least that is the intent) the screen needs to be occasionally lifted and run through a roller press to extract the oil. This could be set up easily at one end.

The rest of the working fluid will need to be run through a filter press and returned to the pond to continue the process. This does not have to be a hundred percent of the fluid at all. In fact, it is better to simply run the fluids through a recirculating cycle in a way that visibly drops the algae content by 60 to 80%. If the fluid is held briefly in a small separation tank, we should be able to skim off additional oil.
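
That 60 to 80 percent target has a simple consequence for how much fluid must pass through the press. A minimal sketch, assuming a well-mixed pond and a filter that captures every cell reaching it (both idealizations): with filtered fluid continuously returned, the concentration drains away exponentially.

```python
import math

def pond_volumes_to_process(removal_fraction, capture_efficiency=1.0):
    """Pond volumes that must pass through the filter press to remove the
    given fraction of suspended algae, assuming a well-mixed pond whose
    concentration decays exponentially as filtered fluid is returned."""
    remaining = 1.0 - removal_fraction
    return -math.log(remaining) / capture_efficiency

vols_60 = pond_volumes_to_process(0.60)   # about 0.92 pond volumes
vols_80 = pond_volumes_to_process(0.80)   # about 1.61 pond volumes
```

So even at the high end of the range, the press only has to turn the pond over about one and a half times per cycle; an imperfect filter scales the requirement up in inverse proportion to its capture efficiency.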

With this type of setup we can now experiment to our heart's content to see if we can make it pay. All of the hardware is available in some form or other, and once we actually know what we are doing it will be easy for manufacturers to make modifications.

That leaves us the problem of the algae soup. First, there will be a nighttime phase in which the soup produces CO2 and enriches the working fluid, as happens in nature. That means we want to start with as close to a natural mix as possible to get things going. Once that is accomplished, we want to introduce and encourage the success of the oil algae and perhaps several others. In other words, this is a long way from a monoculture.

As a bonus, the filter press should minimize the populations of larger predators and mosquitoes. In the best of all worlds we succeed in making the oil algae dominant and maximizing production. Right now we do not know how, but I think that I have shown a way to produce the end product on the farm very cheaply.

Thursday, August 9, 2007

Cold Water on Global Warming

After alluding to the role of Antarctica yesterday, I think it is appropriate to add this article from 2001.

The climate of the northern hemisphere has experienced several major swings in conditions over the past 10,000 years. The Bronze Age in particular appears to have been hotter than it is now, as was the period before 1500 and the onset of the Little Ice Age. The current hot spell seems to be doing no more than restoring those conditions. I also point out that these warm spells were very stable, while the onset of a cold climate was abrupt. I posit that the only way such a shift is possible is if the surface waters of the ocean were themselves abruptly chilled by perhaps a degree.

And then the question is how? We have been blithely blaming the sun. I suspect that may well be rubbish. On the other hand we have a mechanism large enough in the southern hemisphere capable of doing this. And particularly doing this to the closed off Atlantic.

What would it take? That we do not know. Perhaps a build-up of sea ice, or perhaps a decline in sea ice? That is the one thing capable of a long cycle of variation with periodic discharges into the Pacific and South Atlantic.

A discharge of cold water into the Atlantic would certainly impact the whole of the Atlantic very quickly. It is also totally believable and, I hope, unlikely to happen for a few centuries; at least enough time to get the permafrost out of the soil in Greenland and to reestablish the dairy industry there.

Here is the article:

September 18, 2001 - (date of web publication)

El Niño, La Niña Rearrange South Pole Sea Ice

Scientists have been mystified by observations that when sea ice on one side of the South Pole recedes, it advances farther out on the other side. New findings from NASA's Office of Polar Programs suggest for the first time that this is the result of El Niños and La Niñas driving changes in the subtropical jet stream, which then alter the path of storms that move sea ice around the South Pole.


The results have important implications for understanding global climate change better because sea ice contributes to the Earth's energy balance. The presence of sea ice, which is generated around each pole when the water gets cold enough to freeze, reflects solar energy back out to space, cooling the planet. When there is less sea ice, the ocean absorbs the sun's heat and that amplifies climate warming.

By looking at the relationship between temperature changes in the ocean, atmospheric winds, storms, and sea ice, the new study pinpoints causes for retreating and advancing ice in the Atlantic and Pacific ocean basins on either side of the South Pole, called the "Antarctic dipole."


"El Niños and La Niñas appear to be the originating agents for helping generate the sea ice dipole observed in the ocean basins around the Antarctic," said David Rind, lead author of the study and a senior climate researcher at the NASA Goddard Institute for Space Studies. The study appears in the September 17 issue of Journal of Geophysical Research.

During El Niño years, when the waters of the Eastern Pacific heat up, warm air rises. As the air rises it starts to move toward the South Pole, but the earth's rotation turns the winds eastward. The Earth's rotation is just strong enough to cause this rising air to strengthen the subtropical jet stream, a band of atmospheric wind near the equator that also blows eastward.

When the subtropical jet stream gets stronger over the Pacific basin, it diverts storms away from the Pacific side of the South Pole. Since there are fewer storms near the Pacific-Antarctic region during El Niño years, there is less wind to blow sea ice farther out into the ocean, and the ice stays close to shore.

At the same time, the air in the tropical Atlantic basin sinks instead of rising. That sinking air weakens the subtropical jet stream over the Atlantic, guiding storms towards the South Pole. The storms, which intensify as they meet the cooler Antarctic air, then blow sea ice away from the pole farther into the Atlantic.

During La Niña years, when the Eastern and central Pacific waters cool, there is an opposite effect, where sea ice subsides on the Atlantic side, and advances on the Pacific side.

The study is important because the amount of sea ice that extends out into the ocean plays a key role in amplifying or decreasing the warming effects of the sun on our climate. Also, the study explains causes of the Antarctic sea ice dipole for the first time, and provides researchers with a greater understanding of the effects of El Niño and La Niña on sea ice.

Scientists may use these findings in global climate models to gauge past, present and future climate changes.

"Understanding how changes in the temperature in the different ocean basins will affect sea ice is an important part of the puzzle in understanding climate sensitivity," Rind said.