Friday, March 6, 2009

EEStor Posting of Interest

I grabbed this posting from the blog in the link and it is an excellent effort at describing what is going on with the EEStor protocol. It polishes my understanding of the technology, and I certainly concur that the claims being made are difficult benchmarks to achieve. And yes, we all want to see this succeed.

Personally, I would like to see a working device made in some way with the specific particle sizes, just to see the electromagnetic characteristics test out. How about painting some of it on a conductor? This may be simply naïve, but tricks like that go a long way to making folks comfortable.

I did exactly that once with another product surrounded by even wilder claims that landed on my desk. I designed a simple experiment that showed the claimed behavior and demonstrated it to myself. That simple do-it-yourself experiment eliminated all further objections and the project progressed.

This sucker needs just that.


http://nextbigfuture.com/2009/02/general-motors-admits-to-working-with.html


All very interesting ... non-batteries that achieve energy densities on par with the current practical chemical record holder, Li::MnO (generally known as "Lithium") cells. The common garden-variety AA LiMnO cell (rechargeable) delivers 3.6 volts and holds about 2 amp-hr of juice (thus stores about 7 watt-hours of energy) ... in an AA package of about 5 milliliters capacity. So ... let's see. 7 Wh x 1000/5 = 1.4 kWh per liter. Hey! Similar to the 1.5 kWh/liter of the "full production" EEStor BariumTitanate cell. Cool.
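The back-of-envelope arithmetic above can be checked in a few lines; this sketch uses only the figures quoted in the text:

```python
# Energy density of the quoted AA lithium cell, using the text's figures.
voltage_v = 3.6      # nominal cell voltage
capacity_ah = 2.0    # amp-hours of stored charge
volume_ml = 5.0      # approximate AA package volume

energy_wh = voltage_v * capacity_ah        # ~7.2 Wh, the "about 7" above
density_kwh_per_l = energy_wh / volume_ml  # Wh per mL is the same as kWh per L
print(round(density_kwh_per_l, 2))         # ~1.44 kWh per liter
```

Note that Wh/mL and kWh/L are the same unit, which is why the "x 1000 / 5" shortcut in the text lands on the same number.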

The only difference is, the BaTiO capacitors - even at 1/100th the size of the patent claims - have yet to be produced. EEStor is, to date, a proverbial "announcement engine", sending updates, developments, and endless chitter-chatter to the CEOs of every conceivable target market - then disingenuously claiming to be "working closely with (a list of 100) industries". Right.
I think the idea(s) are great (after all, BASF, Motorola and a number of others filed for patents for their competing ultra-permittivity dielectrics), but practically, the issue is that it is virtually impossible to create void-free (arc-conduit-free) ultra-permittivity dielectrics ... even filled ones. Layering (say, depositing dozens of spin-coated micrometer-scale films) could utilize the statistical effect to decouple vertical voids between electrodes (which is why mica has such a spectacular dielectric strength) ... but in the end it may end up in a cat's game.

YET - and let me just say here that I AM a devotee of this technology - yet, there are three striking advantages of the capacitive approach - even if it "only" achieves energy/weight densities similar to the best chemical methods - that are compelling. (1) two orders-of-magnitude faster charge/discharge cycles. (2) near-100% charge/discharge efficiency. (3) practically infinite lifespan. There are also a trio of secondary - but clearly useful - aspects that gather little press... (4) commonness of materials, (5) low-toxicity materials, (6) no temperature dependencies.

The best batteries (at charging) have endothermic dissolution chemistries that 'suck up thermal energy' as they're being charged. So, while the nature of solute conductivity (versus metal) generates a lot of ohmic heat at high charge rates, the heat is reabsorbed by the endothermic reaction, yielding quite moderate cell heat-up. Of course, it can be pushed to generate heat if you maximize the charge rate - the endothermic reaction only absorbs heat in proportion to the reaction rate, but the heat goes up as the SQUARE of the charging overvoltage. Typically, today's cells are rated between 0.2C and 1.0C charge rate ... e.g. a 2.9 Ah AA cell can be charged maximally in around 2-3 hours (actually somewhat faster, as witnessed in the remote-controlled toy vehicle market).
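The square-law heating claim is just Ohm's law at work: waste power across the cell's internal resistance goes as P = dV^2 / R. A quick illustrative sketch - the resistance value here is an assumption for demonstration, not a measured cell figure:

```python
# Ohmic waste heat P = dV^2 / R across an assumed internal resistance.
r_internal_ohm = 0.05   # hypothetical cell internal resistance

for overvoltage_v in (0.1, 0.2, 0.4):
    p_waste_w = overvoltage_v ** 2 / r_internal_ohm
    print(f"{overvoltage_v} V overvoltage -> {p_waste_w:.1f} W of heat")
# each doubling of the overvoltage quadruples the waste heat
```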

But let's just say that 3 hours (about 10,000 sec) is nominal for batteries. They can be put in series-parallel and still retain the 10,000 sec rating (assuming not too much heat builds up). In order to get 5-minute (300 sec) charging, clearly 1.5 orders of magnitude of charge-rate improvement is needed. This ... they're not going to get, under any circumstance. The internal resistance, delta-V-squared waste heat, and so on ... preclude it. Further, those losses are non-recoverable as power. Megajoules lost to heat are never recovered as power.
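The "1.5 orders of magnitude" figure follows directly from the two charge times:

```python
import math

battery_charge_s = 10_000   # ~3 hour nominal battery charge
target_charge_s = 300       # the 5-minute goal

improvement = battery_charge_s / target_charge_s   # ~33x faster
print(round(math.log10(improvement), 2))           # ~1.52 orders of magnitude
```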

So capacitors have a huge advantage there. This is not to say they're perfect - especially the titanate perovskites (which are piezoelectric, among other things) - but they're pretty damned good.

The patent's 30-farad, 3500-volt capacitor would charge at 3.3 volts per second per 100 charging amps. To charge it in the 5-minute (300-second) window would need a charge rate of about 12 volts/sec ... which would require about 350 amps. Hence the apologetics for "cable heating". Well, thick cables aren't all that expensive, and I could see water-cooled cables in thick black rubber (suitably armored) looking just like the thick hoses on today's filling-station gasoline dispensers. Just about as flexible, tough, and trouble-free. A set of interlocks would keep these instantly-lethal voltages from escaping the housings. Yep, 5-minute charge times. Doable.
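Those charging numbers follow from the capacitor relation I = C · dV/dt, using the patent figures quoted above:

```python
# Constant-current charging of the patent's claimed capacitor.
capacitance_f = 30.0      # claimed capacitance, farads
full_voltage_v = 3500.0   # claimed working voltage
charge_time_s = 300.0     # the 5-minute window

dv_dt = full_voltage_v / charge_time_s    # required voltage ramp, ~11.7 V/s
current_a = capacitance_f * dv_dt         # I = C * dV/dt, the current needed
print(round(dv_dt, 1), round(current_a))  # ~11.7 V/s and ~350 A
```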

The second factor (2), near-100% charge-discharge efficiency, is also critical. Conventional chemical batteries - even the big, mature lead-acid ones - tend to lose from 25% to 65% of the energy 'invested' in them. Put in 10 kWh, and you might only get back 6 kWh. I don't think that is very attractive. By comparison, put 10 kWh into a super-capacitor, and 9.9 kWh comes back out. The 0.1 kWh is lost to resistive and/or dielectric hysteresis losses. That is pretty compelling.
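The round-trip comparison in numbers (the efficiencies are the ones quoted in the text, with 60% as the pessimistic chemical case):

```python
# Round-trip energy: chemical battery vs. super-capacitor, per the text.
invested_kwh = 10.0

battery_out_kwh = invested_kwh * 0.60     # pessimistic chemical efficiency
capacitor_out_kwh = invested_kwh * 0.99   # near-ideal capacitor efficiency

print(battery_out_kwh, capacitor_out_kwh)  # 6.0 vs 9.9 kWh returned
```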

Third (3), practically infinite lifespan. The "other" vexing thing about chemical cells is their almost immediate degradation in capacity, which continues fairly linearly over their useful lives ... then rapidly accelerates at the "end". This is nominally through permanent 'side reactions' that inactivate the electropositive anode materials; modern cell chemistry includes sacrificial compounds that compete for the parasitic oxidation pathways, thus keeping the anode materials from degrading as fast. The 'end point' (rapid decay), though, is inevitably reached when the sacrificial compounds run out - as there is only so much 'space' for them in the galvanic cell before they start to degrade capacity itself.

Not so, theoretically, with solid-state dielectrics that are not run right at their arc-over point (i.e. when charged conservatively, dielectrics have lifetimes approaching centuries, if not millennia). This is a remarkable boon, as it puts electrical energy storage on the same footing as hydrocarbon-based fuel fluids. (A can of gasoline lasts ... the life of the can ... more or less.) Yes, dielectrics still leak ... hence the figure-of-merit of 1.2% per year ... but that isn't at all impractical for vehicular and most other moderate-term energy storage use.
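The quoted 1.2%-per-year leakage compounds gently over time; a sketch of the charge remaining after t years:

```python
# Self-discharge at the text's 1.2%/year figure, compounded annually.
leak_per_year = 0.012

for years in (1, 10, 50):
    remaining = (1.0 - leak_per_year) ** years
    print(years, round(remaining, 3))
# ~98.8% after a year, ~88.6% after a decade, ~54.7% after 50 years
```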

Fourth (4), commonness of materials ... titanium dioxide is one of the most common minerals (rutile, etc). Barium is a common component of sea water, is widely deposited in the arid parts of the planet as baryte, and is presently produced at the 'millions of tons' per year level. These characteristics, along with the other common components (aluminum, alumina, plastics, eutectic ternary glasses), spell a future that shouldn't be clouded by 'limited resource' issues. A demand for 300,000,000 300-kg batteries a year requires on the order of 100 megatons of materials ... or about 10x our present baryte production. Sounds achievable to me.
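The material-demand estimate works out as follows. The ~9 Mt/yr world baryte production figure here is an assumption for illustration; the arithmetic lands at 90 megatons, the same ballpark as the round 100 quoted:

```python
# Annual material demand for a hypothetical fleet of capacitor packs.
packs_per_year = 300_000_000
kg_per_pack = 300

total_megatons = packs_per_year * kg_per_pack / 1e9   # 1 Mt = 1e9 kg
baryte_mt_per_year = 9.0    # assumed current world baryte output, Mt/yr

print(total_megatons)                              # 90.0 Mt of materials
print(round(total_megatons / baryte_mt_per_year))  # ~10x baryte production
```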

Fifth (5), low-toxicity materials. Barium titanate has a very low dissociation constant, meaning that it is soluble in acidic aqueous fluids only in vanishingly small quantities. Barium IS toxic, and the soluble barium compounds are modestly toxic (on the order of lead compounds). But barium SULFATE is so incredibly insoluble that the administration of even a small amount of sulfate as an antidote serves to entomb the barium in vivo, rendering it harmless. Titanates are also non-toxic. They are so non-toxic that the oxides, carbonates and sulfates of titanium are used presently in candy and confection manufacture.

Sixth (6), no temperature dependencies. This is particularly important for practical vehicular use. Cars, trucks, earthmoving equipment and industrial equipment are supposed to work realistically from the Arctic's temperatures to the Sahara's (-50C to +50C), reasonably reliably. The great majority of electrochemical cell chemistries and configurations suffer from electrolytes that "freeze" (or otherwise deactivate) below -10C. The problem is so severe that modern photographers cannot use their cameras below -10C for extended periods without protecting and heating the battery packs. There is enough civilization living at lower-than-20-below temperatures all winter long that a very low operating point is a requirement. Likewise, many electrochemical batteries suffer from significant recharge-rate limitations at elevated temperatures ... since many of the chemistries utilize gaseous hydrogen in nanocrystalline carbon matrices for the cathodic material. At temperatures above +35C, the rate of adherence of gaseous hydrogen to the carbon matrix rapidly diminishes. Again, there are enough people who live in arid, high-heat areas that this too needs addressing.

BaTiO (like virtually all titanates) has an operating range from -75C to +75C. This "solves" the problem of temperature dependency.

In any case, I didn't initially set out to write so much. I am very much interested in seeing this technology actually produce some REAL-WORLD, REAL-SCALE DEMO DEVICES. Not just endless talk about dielectric strength and permittivity measurements, and 6 months between trials. This is getting to be TOO OLD a story. Let's just see impractical-but-notable megacapacitors being built. Showing the world plenty of these - and in particular (since it is patent-protected anyway) making nominal quantities available for hands-on testing - would be a great way for the researchers to "prove it to the world" and get excitement flowing.

For this is the problem with the technology that causes it to fail the GoatTech tests. No product, lots of talk - especially about how it will save the world and solve the problems of every other new-tech/green-tech - endless, microscopically 'incremental' developments (which, taken globally, sound like bullshit), and promises that tomorrow will deliver, but that today way more funds are needed.

I just hate to give such a promising technology the title of "SNAKE OIL", but insofar as this old Missourian is concerned ... until they can show the world (and me) ... the oleaginous distillate of common asps ... it is.

GoatGuy

Super China

Without question, China has been handed a heaven-sent opportunity - not to take advantage of the discomfiture of the developed world's financial system, which would be simply counterproductive, but to use the time-out as wisely as possible to strengthen the internal Chinese economy itself.

This is the time to establish the proper social support systems so clearly missing, to establish a national Medicare system, and to provide more internal capital through the banks to encourage local village development. All those folks who just came back to the village prematurely are a fabulous resource if they are modestly bankrolled. Microcredit is the flavor of the times.

A pause led by the expansion of the internal economy will hugely lower China’s exposure to similar financial shocks in the future. Steady internal growth will sponge up the surplus labor over about three years. After that the economy will be driven by the expansion of the internal consumer economy, already well entrenched.

In fact, since last year, when China reached the point at which there were no more surplus workers to leave the villages, it has actually reached the point at which all able-bodied people are gainfully employable in the modern economy. This will mean rising pay packets over the next decade, with Chinese consumer demand driving Indian economic maturation.

In practice it has taken China thirty years to trace the path Japan traced in forty. It is now entering an era of solid internal growth in which the benefits of the modern economy penetrate to every Chinese household. Economically, China has reached the top of the development S-curve and must now figure out how to maintain growth by reinvention, just like the rest of the developed world.

Super China
Robert Peston
5 Mar 09, 06:25 AM
Much of what the Chinese premier, Wen Jiabao, described this morning to the 11th National People's Congress as his country's programme to combat the evils of global recession would have sounded very familiar to a European or American audience.

What have become the new orthodox policy prescriptions for this time of crisis were all there: tax cuts; big increases in public spending; massive jumps in public-sector borrowing; more lending to business; anti-protectionist rhetoric; calls for improved regulation of banking and financial services.

It could almost have been Gordon Brown addressing the 3000 members of the National People's Congress in the Great Hall of the People under the giant red star.

Except for one glaring and important difference.

The Chinese economy remains - by the standards of the US or the UK - exceptionally strong.

It's true, as I've been pointing out over the past few days, that growth in China has been slowing down - and regions particularly dependent on exports, especially the south, have suffered mass closures of factories and painful rises in unemployment.

But many economists believe that the Chinese economy is still growing, even if they also say that the official statistics overstate that growth.

Thus Stephen Green at Standard Chartered reckons there was 1% growth between the third and fourth quarters of last year, and that there'll be a similar expansion in the first three months of this year.

For 2009 as a whole, he's forecasting GDP growth of between 6 and 7% - which is only a little less than China's official forecast of 8% (which Wen Jiabao repeated today).

That may be a long way from the low teens growth of last year. But it looks pretty amazing compared with the very painful recessions in Japan, the UK, Germany and the US.

And here's another frightening comparison between China on the one hand and the UK and US on the other.

Wen Jiabao announced that China's budget deficit this year will be 950bn yuan. That sounds like a big number - and it is an all-time record for China.

But, in relative terms, it's a flea bite compared with public-sector borrowing in the UK.

Converted to sterling, that 950bn yuan is equivalent to roughly £100bn.
Which is almost 20% less than what the UK government expects to borrow in 2009/10.

When those numbers are expressed as a percentage of GDP, there's an even starker picture of Chinese prudence versus what many would describe as British profligacy.

China's deficit is less than 3% of GDP, compared with 8% in the UK.

And, of course, the US public sector is arguably mortgaged up to an even higher hilt than Britain's.

When you add in the near-crippling indebtedness of businesses, banks and consumers in the UK and the US, well at that point China's financial strength looks almost awesome.

Also, as I've been emphasising, China's giant state-controlled banks have been much more cautiously managed than our commercial banks - and have neither the capital nor the funding constraints of ours.

None of which is to retreat from what I've been highlighting, which is that China faces formidable problems - in particular the challenge of maintaining social stability at a time when wages are being squeezed and millions are losing their jobs.

It's just that - in a way - we'd be fortunate to have their economic problems (if not their social ones).

So what are the big messages I took away from Wen Jiabao's two-hour address (perhaps we should, at the least, be grateful that Gordon Brown shows no sign of adopting Chinese speechmaking habits)?

Well he said some very striking things about allowing inefficient businesses to fail, about reducing the country's reliance on low-cost manufacturing of the basics, and about wanting to stimulate consumer spending.

All of that is both a threat and an opportunity for developed economies like ours.
There should be scope to increase our exports to China. But the competitive threat to the companies of developed economies will - if anything - intensify.

And over time (but it will take years) China's massive financial surplus - which was in part responsible for the glut of cheap money in the US and UK that fed our dangerous addiction to debt - should diminish.

For what it's worth, however, every Chinese person I've met over the past few days - from the lowliest factory worker up to the Chinese Commerce Minister, Chen Deming - lays the blame for the global economic crisis on crazy risk-taking by American banks (Britain's aren't famous enough to register with them) and excessive borrowing in the US.

In that context, here's my favourite quote from my interview with Chen Deming, which pokes gentle fun at those who say China was at fault for saving too much and then lending that surplus to spend-spend-spend consumers in the west:

"Personally I can't agree with some people on their point that they [US households and businesses] borrow money from others, they overly spend this money and they make trouble for the rest of the world, but finally they blame those who lend them money for making these troubles. According to Chinese philosophy this kind of accusation is totally ridiculous and unreasonable."

I suspect that many of you would agree with China's equivalent of Peter Mandelson.

That said, China's leaders recognise that the country's prosperity is wholly dependent on ours.

So even if they believe that our mess is our own fault, they see that they have a powerful interest in helping us to clear it up.

In that context, it was striking that Chen Deming strongly disagreed with me when I described China as an economic superpower, perhaps because of a fear that as such China would have to take on the heavy burden of new responsibilities to the global community.

By contrast, today's rhetoric from Wen Jiabao was all about a more open, outward-looking China.

Wen Jiabao's China seems to want to play an important role in making the global economy safe for all of us - and is not revelling in our economic humiliation.

Thursday, March 5, 2009

Corn Gene Doubling Produces Giant Biomass

This breakthrough is good news, as it allows a biomass-focused growing regime in the Corn Belt. They do not share any expected yield figures, as this is simply too new. Combined with today's report on the advance in using xylose, we see the ingredients for an emergent corn-cellulose-to-biofuel industry that will have a low producer cost base and ample developed land to grow on.

We have never focused on maximizing cellulose production, and this is a good start. Other crops like hemp and tobacco should be just as amenable.

There is obviously still plenty of work to do in this field, but I am now encouraged that the cost in land and space will be completely bearable. Worst-case scenarios were making the ethanol fuel-replacement model look impossible. We now are seeing that full biomass conversion to ethanol, or perhaps biobutanol, will be technically feasible. That means that cost structures and operation scales will be stable everywhere.

Once that is properly settled, it should be one additional step to develop wood waste as a feedstock also. This would permit proper wood management methods to be economically employed in a way that begins to maximize tree size and quality over the decades. All our hardwood forests can become fully managed, and a large section of the boreal forest also, as the cattail trade expands into the forest.

Doubling A Gene In Corn Results In Giant Biomass

by Staff Writers
Champaign IL (SPX) Mar 04, 2009

http://www.biofueldaily.com/reports/Doubling_A_Gene_In_Corn_Results_In_Giant_Biomass_999.html

University of Illinois plant geneticist Stephen Moose has developed a corn plant with enormous potential for biomass, literally. It yields corn that would make good silage, Moose said, due to a greater number of leaves and larger stalk, which could also make it a good energy crop.

The gene known as Glossy 15 was originally described for its role in giving corn seedlings a waxy coating that acts like a sun screen for the young plant.

Without Glossy 15, seedling leaves instead appear shiny and glossy in sunlight. Further studies have shown that the main function of Glossy 15 is to slow down shoot maturation.

Moose wondered what would happen if they turned up the action of this gene. "What happens is that you get bigger plants, possibly because they're more sensitive to the longer days of summer. We put a corn gene back in the corn and increased its activity. So, it makes the plant slow down and it gets much bigger at the end of the season."

The ears of corn have fewer seeds compared to the normal corn plant and could be a good feed for livestock. "Although there is less grain, there is more sugar in the stalks, so we know the animal can eat it and they'll probably like it." This type of corn plant may fit the grass-fed beef standard, Moose said.

"The first time I did this, I thought, well, maybe the seeds just didn't get pollinated very well, so I hand pollinated these ears to make sure. I found that just like the shoot, seed development is also slower and they just don't make it all the way to the end with a plump kernel," Moose said.

He explained that the energy to make the seed goes instead into the stalk and leaves. "We had been working with this gene for awhile. We thought there would be more wax on the leaves and there was. But we also got this other benefit, that it's a lot bigger."

Moose tested his hypothesis with other corn lines and the effect was the same. "We essentially can make any corn variety bigger with this gene. And it can be done in one cross and we know exactly which gene does it."

He noted that if you put too much of the Glossy 15 gene in, it slows down the growth too much and the frost kills the plant before it can grow.

One advantage to growing sugar corn for biomass rather than switchgrass or miscanthus is that sugar corn is an annual. Moose said that if it were to attract a pest or develop a disease, farmers could rotate to a different crop the next year.

Moose said that sugar corn might make a good transition crop.

"We think it might take off as a livestock feed, because it's immediate," Moose said. "This would be most useful for on-farm feeding. So a farmer who has 50 steers could grow this and use the corn as feed and sell the stalks and sugar. It could be an alternative silage, because it has a longer harvest window than regular silage."

For this sugar corn plant to become commercialized, it would have to get government approval, but Moose said that this is about as safe a gene as you can get. "It's a gene that's already in the corn - all we did was to put an extra copy in that amps it up."

Xylose Enzyme Discovered

Step by step we are finding ways to convert cellulose into a usable biofuel like ethanol. This company is also attempting to produce a better biofuel than ethanol itself. Their present focus is on biobutanol.

To date most effort, for good reason, has gone into simply unraveling cellulose into its constituent sugars and lignin. As these are freed up, it then becomes necessary to process the derivative products. We have already reported on work on lignin, and xylose is one of the resulting sugars from the processing of cellulose.

It does look as if we will establish protocols and pathways to convert cellulose into a desirable biofuel. The magic question then becomes whether we can do it profitably.

The multiple processes and separation steps are somewhat discouraging, particularly when you will also have to fine-tune the feedstock. However, uniform feedstocks such as corn stover, cattail waste and bagasse are all excellent renewable sources.


Researcher discovers enzyme to ferment xylose

By Anna Austin

Web exclusive posted Feb. 17, 2009, at 3:45 p.m. CST

http://www.ethanolproducer.com/article.jsp?article_id=5390

Eckhard Boles, cofounder of Swiss biofuel company Butalco GmbH and a professor at Goethe-University in Frankfurt, Germany, has discovered a new enzyme which teaches yeast cells to ferment xylose into ethanol. Xylose is an unused waste sugar in the cellulosic ethanol production process.

According to Boles, one of the major problems with cellulosic ethanol is that when utilizing other parts of plants, which today are considered waste, yeasts are unable to ferment some of the sugars in a majority of the plant material.

Saccharomyces cerevisiae (SC), a yeast commonly used for ethanol production, lacks the ability to ferment some sugars. “Heterologous expression of a xylose isomerase would enable yeast cells to metabolize xylose,” Boles said. “However, many attempts to express a prokaryotic xylose isomerase with high activity in SC have failed so far. We have screened nucleic acid databases for sequences encoding putative xylose isomerases, and finally could clone and successfully express a highly active new kind of xylose isomerase from an anaerobic bacterium in SC.”

The new enzyme was taken from the bacterial organism and inserted into yeast cells that were retrieved from a commercial ethanol plant. “With just a minor effort, we were able to teach the yeast cells how to ferment the xylose into ethanol,” Boles said.

Boles believes the findings may provide an excellent starting point for further improvement of xylose fermentation in industrial yeast strains, and may greatly enhance the development of an efficient biomass-to-ethanol fermentation process. His company, Butalco GmbH, is now working to construct yeast strains to convert plant waste materials into biobutanol.

The research was published in the journal Applied and Environmental Microbiology in February.

Obama Muddles While Rome Burns

The broad markets have now lost a full sixty percent of their peak values. This is as bad as it should get. The vulnerable will be swept away. The housing market has put all mortgages issued since 2003 underwater to some degree or other. It faces an overhang of inventory held by folks suddenly unqualified to borrow the sums they have contracted for.

This wall of inventory is strangling the majority of American consumers and, if allowed to continue, it will succeed in destroying all credit and all financial institutions. The reason for this is that the ultimate strike price for buildings is the depreciated cost of the build-out, and the ultimate strike price for all land is the cost of providing services. All else is intangible value created by property attractiveness.

With an overhang such as we now face, everyone with free capital already owns all the properties they want, while their capacity to borrow has disappeared. In short, actual demand has collapsed far more than anyone understands, and the banks now have to attempt to match the two together to get out from the position they are in, and from there to rebuild their balance sheets.

Most consumers also have to rebuild their personal balance sheets. Without being reckless at all, a person with a good job had a mortgage on a property with a hundred thousand in free equity available and a credit card line that totaled $40,000. Now he has negative equity of $50,000, and his credit card company is no longer revolving his card and is charging usurious rates of interest at the slightest provocation.

He is highly motivated to force his credit card company into a settlement arrangement and to look carefully at the cost benefit arrangement regarding the house.

The problem is the mortgage portfolio and it must be fixed properly now. This is an emergency. Doing nothing is toying with the economic engine that has driven the world.

This is not a problem that the banks can solve by themselves. All they can do is ask for more money to shore up their balance sheets until the price of all underlying land approaches zero, so that trade becomes impossible. All the banks can do by law and the rules that guide them is to chase the price of property downward.

I have already explained what should be done and what will be done will approach that, but likely in a much more ugly fashion. After all it will be managed by folks driven by greed and stupidity.

President Obama is approaching his Herbert Hoover moment and the signs are not positive. Every investor understands that if his friends are losing their homes, his investments will start losing money.

The other shoe that has not yet fully dropped, and it will be crushing, is the pending collapse in all government revenues. We are not talking about a few percentage points. We are talking of a massive revenue shortfall that no tinkering will escape. Nor can it be solved by raising taxes. Hoover tried that, and it suppressed the economy further.

Quite simply, each day that we fail to act will add several days of pain on the road back. Otherwise our hope is - and that is a lousy way to run an economy - that there is enough liquidity in hard cash to suck up enough of the inventory to begin the recovery cycle. Once that happens, the overhang can disappear quickly.

The last article is a timely reminder of how attempted intervention in the thirties was less than helpful. It also provides background to help understand current government thinking.

Poll: Voters Skeptical About Obama's Recovery Plan

WASHINGTON - Americans are skeptical that President Barack Obama will solve the economic crisis within two years but still overwhelmingly approve his job performance, a new poll found Wednesday.

Most voters also support Obama's mortgage rescue plan unveiled last month in a bid to quell a rising tide of home foreclosures, but they think it is unfair to people who played by the rules and met all their payments.

The large snapshot of more than 2,500 voters reveals deep pessimism among U.S. voters about the state of the economy and prospects for a recovery, according to the Quinnipiac University Polling Institute.

Half of the survey sample was asked whether they believed that the federal government could fix the economic crisis within two years and answered no by a margin of 68 to 26 percent.

The other slice of the survey group was asked whether Obama alone would be able to lead the country out of the economic mire within the same time period, and answered no by a 64 to 28 percent margin.

Yet Obama's approval rating, so far at least, seems immune to the impact of the worst economic crisis in decades: 59 percent of those polled said they approved of the job their new president is doing, compared with 25 percent who did not.

Overall, voters approve of Obama's handling of the economy 57 to 33 percent, and give him significantly higher marks on the issue - 56 to 26 percent - than they give Republicans.

"President Barack Obama's approval rating is solid, compared to the historical record of new presidents," said Peter Brown, assistant director of the institute.

"But the lofty numbers he enjoyed after his election are leveling off, largely because of declining support among Republicans," Brown said.

There is also more good news than bad for Obama on three of his key domestic policy priorities, which he has been highlighting in the last two weeks.

By a 55 to 39 percent margin voters believe that he will get healthcare reform, an issue that has bedeviled past Democratic presidents, through the Congress this year.

By a 61 to 35 percent breakdown they also say they believe Obama when he promises not to raise taxes on anyone with a family income under 250,000 dollars a year.

But the president's vow, made last week to cut the ballooning budget deficit in half by the end of his term in 2013, draws more cynicism.

Fifty-five percent of Americans do not believe he can do it, compared to 38 percent who do.

The survey was conducted between February 25 and March 2 among 2,573 voters with a margin of error of plus or minus 1.9 percentage points.
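As a quick sanity check, the quoted margin of error matches the standard 95 percent confidence formula for a proportion at the worst case p = 0.5. A minimal sketch in Python (the sample size is the article's; the 1.96 z-value is the usual 95 percent convention):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for an estimated proportion.

    Worst case p = 0.5 maximizes p*(1-p); z = 1.96 is the usual
    two-sided 95% normal critical value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Sample size from the Quinnipiac survey quoted above.
moe_pct = 100 * margin_of_error(2573)
print(round(moe_pct, 1))  # 1.9 percentage points, matching the article
```

The result, about 1.9 percentage points, agrees with the figure reported for the survey.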

The Government and the Great Depression

by Chris Edwards, Director of Tax Policy, Cato Institute


The economic policies of the 1930s are a continuing source of myth and confusion. Many people believe that capitalism caused the Great Depression and that President Franklin Roosevelt helped to end it. A recent History Channel special on Roosevelt said that his New Deal resulted in “recovery and reform” while creating “millions of jobs.”[1]

Such often-stated claims are incorrect. Misguided federal policies caused the downturn that began in 1929, and they prevented the economy from fully recovering for a decade. Policy blunders by the Federal Reserve, Congress, and Presidents Herbert Hoover and Roosevelt battered the economy on many fronts. The events of the 1930s influence economic policymaking today. Many people think that we need a big government to prevent, or to reverse, recessions. But the 1930s illustrate that activist policies increase, not decrease, economic instability. Government interventions reduce the flexibility that markets need to adjust to shocks and return to growth. This bulletin looks at the 1930s economy and highlights the worst policy failures.[2]

Policy Failures Lead to a Long and Deep Downturn

The Depression was a uniquely severe contraction. Real gross domestic product fell for four years before finally beginning to recover.[3] Real output only regained its 1929 level in 1936, but then output plunged again in 1938. The unemployment rate stayed persistently high at more than 14 percent for 10 years (1931 to 1940).[4]

By contrast, the economy recovered rapidly after a sharp contraction in 1921. Real output fell 9 percent in 1921 and unemployment rose to 11.7 percent.[5] But the economy bounced back with output recovering all its lost ground in 1922. Unemployment fell to 6.7 percent in 1922 and 2.4 percent in 1923. The secret to the quick recovery was that the government generally stood aside and let the market recover by itself—wages and prices adjusted, resources shifted to new areas of growth, profits recovered, business optimism returned, and investment rose. By contrast, government policies in the 1930s prevented the U.S. economy from recovering. The following are some of the key policy mistakes:[6]

Monetary Contraction. The Depression was precipitated by a one-third drop in the money supply from 1929 to 1933, which was mainly the fault of the Federal Reserve. The Fed made further errors that helped put the economy back into recession in 1938. Meanwhile, a flood of bank failures in the early 1930s compounded the money supply shrinkage and heightened economic fears. A key problem was that most states restricted bank branching, which prevented banks from diversifying their portfolios across jurisdictions. By contrast, Canada allowed nationwide branching and did not suffer a single bank failure during the Depression.

Tax Hikes. In the early 1920s, Treasury Secretary Andrew Mellon ushered in an economic boom by championing income tax cuts that reduced the top individual rate from 73 to 25 percent. But the lessons of these successful tax cuts were forgotten as the economy headed downwards after 1929. President Hoover signed into law the Revenue Act of 1932, which was the largest peacetime tax increase in U.S. history. The act increased the top individual tax rate from 25 to 63 percent. After his election in 1932, Roosevelt imposed further individual and corporate tax increases. The highest individual rate was increased to 79 percent. State and local governments also increased taxes during the 1930s, with many imposing individual income taxes for the first time. All these tax increases killed incentives for work, investment, and entrepreneurship at a time when they were sorely needed.

International Trade Restrictions. In 1930, President Hoover signed into law the infamous Smoot-Hawley trade act, which raised import tariffs to an average of 59 percent on more than 25,000 products. More than 60 countries retaliated by slapping new restrictions on imports of U.S. products. As new trade restrictions were imposed around the world, trade plummeted. By 1933, world trade was down to just one-third of the 1929 level.

Keeping Prices High. The centerpiece of the New Deal was the National Industrial Recovery Act of 1933. It created “codes” or cartels in more than 500 industries in order to limit competition. Businesses were told to cut output and maintain high prices and wages. Businessmen who cut prices were cajoled, fined, and sometimes arrested. Fortunately, NIRA was struck down by the Supreme Court in 1935. The Agricultural Adjustment Act of 1933 similarly restricted production to keep prices high. “Excess” output was destroyed or dumped abroad. While millions of Americans were going hungry, the government plowed under 10 million acres of crops, slaughtered 6 million pigs, and left fruit to rot. Production of milk, fruits, and other products was cartelized to boost prices under “marketing orders” begun in 1937. These policies reduced employment and burdened families with higher prices. At a May 1935 press conference, Roosevelt read letters from businessmen thanking him for keeping prices high.[7]

With millions out of work and short of money, Roosevelt thought that his job was to shield high-cost producers from entrepreneurs wanting to offer lower prices to hard-pressed families.

Keeping Employment Costs High. Many New Deal policies raised employer costs, contributing to the extraordinarily high unemployment of the 1930s. NIRA industry codes required high wages. The new Social Security tax increased compensation costs. New minimum wage rules reduced demand for low-skilled workers. The Davis-Bacon Act required the payment of excessively high wages on federal contracts. Compulsory unionism and militant union tactics were encouraged under a series of laws. One result was that U.S. work stoppages soared from an average 980 annually between 1922 and 1932 to a peak of 4,740 in 1937.[8]

While “millions of jobs” were created in the government during the 1930s, private-sector jobs were destroyed. Total U.S. private employment was lower in 1940 than it had been in 1929.[9]

Harassment of Businesses. Investment stagnated in the 1930s as a result of uncertainties in the economy and the new risks of adverse federal actions.[10]

Roosevelt and members of his administration demonized business leaders and investors in their speeches. FDR called them “economic royalists” and “privileged princes” seeking a “new despotism” and “industrial dictatorship.” Laws and regulations poured forth from Washington like never before. Roosevelt issued more executive orders than all presidents from Harry Truman through Bill Clinton combined. Presidents typically issue just a few hundred executive orders, but Roosevelt issued 3,723.[11]

Roosevelt’s antitrust crusade was typical of his antimarket approach. The Justice Department hired hundreds of new attorneys and began a lawsuit blitzkrieg in 1938 against dozens of industries for conspiring to keep prices high. The irony was that Roosevelt had spent his first term encouraging cartels, monopoly unionism, and other policies designed to boost prices and production costs.


Conclusion


New Deal interventions were not only bad for the economy, but favored fat cats over average families. Most farm subsidies went to major land owners, not small-time farmers. Required reductions in farm acreage devastated poor sharecroppers. Efforts to keep farm prices high led to the destruction of food while millions of families went hungry. Compulsory unionism led to discrimination against blacks because it gave monopoly power to union bosses who often didn’t want them hired. NIRA cartels prevented entrepreneurs from cutting prices for consumers. Roosevelt’s strategies of handouts, federal jobs, subsidized loans, demonizing businesses, and public works projects in swing states worked well politically. But economically, Roosevelt and his “brains trust” had no idea what they were doing. They attempted one failed intervention after another. The Great Depression was a disaster, and sadly an avoidable one.

1 The History Channel, “America’s Man of Steel,” advertisement, Washington Post, April 17, 2005, p. R2.

2 See Chris Edwards, Downsizing the Federal Government (Washington: Cato Institute, November 2005), Appendix 1.

3 U.S. Bureau of Economic Analysis, Survey of Current Business, April 2000, p. 15.

4 U.S. Bureau of the Census, Historical Statistics of the United States, 1975, Part 1, p. 135.

5 For the change in output during the 1920s, see U.S. Bureau of the Census, Part 1, p. 224. For unemployment, see p. 135.

6 Most facts are from Jim Powell, FDR’s Folly: How Roosevelt and His New Deal Prolonged the Great Depression (New York: Crown Forum, 2003). See also Alan Reynolds, “What Do We Know About the Great Crash,” National Review, November 9, 1979.

7 http://newdeal.feri.org/court/fdr5_31_35.htm.

8 Powell, p. 204. Powell cites data from Morgan O. Reynolds.

9 www.bea.doc.gov/bea/dn/nipaweb. See Table 6.4A.

10 Real gross private domestic investment did not recover to its 1929 level until 1936. Investment fell again in 1938. U.S. Bureau of Economic Analysis, p. 15.

11 William J. Olson and Alan Woll, “Executive Orders and National Emergencies: How Presidents Have Come to ‘Run the Country’ by Usurping Legislative Power,” Cato Policy Analysis no. 358, October 28, 1999, p. 13.

Wednesday, March 4, 2009

Thirty Year Timeout for Global Warming

Let me get this right. We have a cooling event without an identified cause, and we are asked to discount its significance for thirty years. I am trying to imagine any other field of human endeavor that would accept such a specious argument. It boggles the mind. The heart of good science is to acknowledge the data and ask where it takes you.

In this case, the data is flowing strongly against the proposed theory and its expected pattern. The data is no longer waffling as it did over the previous several years. It has completely undone all the previous warming and is still trending downward.

Attempting to preserve a pet theory by giving it a thirty year time out is unbelievable. It might have been better to do what most other scientists have done by either becoming silent or beating a hasty retreat.

In 2007, the sea ice melt suggested terminal breakup was underway, provided the seven-year temperature regime was maintained. The temperature reversal was almost immediate and is continuing. We have made some progress toward understanding what drives this reversal, and I can assure you it has zip to do with human meddling.

My response was to recognize the new data and explain its meaning. It really is that easy. My continuing concern is that the community has a long lead time in getting data interpretations out into the hands of the public. It took only a minimal understanding of theory to recognize that the earlier trends meant an ice-free 2012. NASA and others were already into the temperature reversal data before their related stories came out. This means that I am generally reading out-of-date news stories on climate subjects.

Today we are continuing to suffer through a cold miserable winter comparable to the low end on the averages for the past fifty years. I also have no reason to anticipate anything else for the next few years.

The only variable able to show significance is solar sunspot activity, and it continues to be very quiet. If it got active tomorrow, the lag time would still be a couple of years.

http://dsc.discovery.com/news/2009/03/02/global-warming-pause-print.html

Global Warming: On Hold?

Michael Reilly, Discovery News

March 2, 2009 -- For those who have endured this winter's frigid temperatures and today's heavy snowstorm in the Northeast, the concept of global warming may seem, well, almost wishful.

But climate is known to be variable -- a cold winter, or a few strung together, doesn't mean the planet is cooling. Still, according to a new study, global warming may have hit a speed bump and could go into hiding for decades.

Earth's climate continues to confound scientists. Following a 30-year trend of warming, global temperatures have flatlined since 2001 despite rising greenhouse gas concentrations, and a heat surplus that should have cranked up the planetary thermostat.

"This is nothing like anything we've seen since 1950," Kyle Swanson of the University of Wisconsin-Milwaukee said. "Cooling events since then had firm causes, like eruptions or large-magnitude La Ninas. This current cooling doesn't have one."

Instead, Swanson and colleague Anastasios Tsonis think a series of climate processes have aligned, conspiring to chill the climate. In 1997 and 1998, the tropical Pacific Ocean warmed rapidly in what Swanson called a "super El Nino event." It sent a shock wave through the oceans and atmosphere, jarring their circulation patterns into unison.

How does this square with temperature records from 2005-2007, by some measurements among the warmest years on record? When added up with the other four years since 2001, Swanson said the overall trend is flat, even though temperatures should have gone up by 0.2 degrees Centigrade (0.36 degrees Fahrenheit) during that time.

The discrepancy gets to the heart of one of the toughest problems in climate science -- distinguishing natural variability (like the occasional March snowstorm) from human-induced change.

But just what's causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun's energy than usual back out into space.

"It is possible that a fraction of the most recent rapid warming since the 1970s was due to a free variation in climate," Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey, wrote in an email to Discovery News, "suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again."

Swanson thinks the trend could continue for up to 30 years. But he warned that it's just a hiccup, and that humans' penchant for spewing greenhouse gases will certainly come back to haunt us.

"When the climate kicks back out of this state, we'll have explosive warming," Swanson said. "Thirty years of greenhouse gas radiative forcing will still be there and then bang, the warming will return and be very aggressive."

Graphene Surprise

We continue to get unexpected results with graphene. Here the surplus heat is bled off directly into the silica substrate by electromagnetic transference. And while we are at it, the graphene fails to permit thermalization.

This is almost too good to be true, and the ramifications are only starting to be understood and even imagined. Is a layer of graphene a heat barrier? Or can heat be directly converted to electron flow?

I once conjectured that it might be possible to produce electron flow from metglass at low temperatures through the absorption of heat energy. This is just a superficial speculation that asks more questions than it answers, but I thought it might lead somewhere. Perhaps once I get my Eden Machine skunk works up and running, we can investigate the limits of that conjecture without spending much coin.

What brought that on at the time was the surprising behavior of electron flow in metglass. It allowed for the manufacture of much smaller starter motors for GM. It was my first eye-opener to how little we understood the behavior of thin films, amorphous materials, and electron flow, and all that this implied for technological advance. We are now starting to see the advances.

It almost makes you want to disappear into a laboratory to see what weird and wonderful things we could mock up. The whole area is still early stages and wide open.

Nanotubes wreak havoc with heat

Physicists in the US have discovered that electrons flowing in carbon nanotube-based circuits dissipate energy in much different ways than electrons flowing through devices made from conventional semiconductors such as silicon. The findings reveal processes of heat conduction that were never previously thought important, and could influence the types of materials chosen for the next generation of electronic devices in order to prevent them from overheating.

In conventional semiconductor devices, different layers of material are always joined by chemical bonds. This provides continuity for heat flowing through such devices, making them relatively easy to cool. Many researchers believe that future generations of electronic devices could be made from carbon nanotubes — tubes with walls just one atom thick — which could enable much smaller feature sizes and hence much better computing performance. However, nanotubes do not bond chemically to adjoining structures, which suggested that it should be very difficult to remove heat from such devices.

Bonding not needed

But now Phaedon Avouris and colleagues at the IBM Thomas J. Watson Research Center in New York and researchers at Duke University in North Carolina have found that electrons in nanotubes can dissipate energy straight to an adjacent substrate even though it is not chemically bonded. The team has also found that current-carrying electrons in nanotube devices do not undergo the normal process of “thermalization”, in which a material’s thermal vibrations reach statistical equilibrium (Nature Nanotechnology doi:10.1038/nnano.2009.22).

Avouris and team studied a carbon nanotube on a silicon-dioxide substrate, an arrangement that acts like the active channel of a field–effect transistor. They have used a variety of techniques, including Raman scattering, in which the energy of scattered light reveals the different temperatures or “modes” of vibration of the nanotube lattice.

Normally when a current passes through a semiconductor the electrons bump into nearby atoms, which begin to vibrate in a certain mode. This mode then gradually transfers its energy to atoms at lower temperature modes until, at thermalization, all atoms are vibrating in statistical equilibrium.

The researchers have shown that, in nanotubes, thermalization does not take place; the atoms continue to vibrate in the same mode and statistical equilibrium is never reached.

Just as surprising, however, is that the lack of chemical bonding to the substrate does not inhibit heat conduction. The team has shown that when the electrons collide with atoms in the silicon dioxide, which is a polar material, the subsequent shift in position of the atoms generates an electric field that extends beyond the substrate and into the nanotube. When the nanotube’s electrons interact with this field, they are able to dissipate energy straight to the substrate.

Overlooked effect

Scientists were already aware of this process of remote heat conduction, but had never considered it important because they had focused on 2D and 3D materials in which the effect is much weaker. But Avouris told physicsworld.com that the other unusual mechanism — the absence of thermalization — could exist in other materials, and that it may have been overlooked because researchers have not had the right observational tools.

Effect of Low Temperature Pyrolysis

The whole article is a bit of heavy reading but it is packed with data. This paper is a welcome addition to the literature that should now turn into a flood. We are beginning to replace educated guesses with hard facts.

The first hard fact that we can accept is that the best process temperature for agricultural biochar is unsurprisingly low, as was certainly the case with classic terra preta. Most of the literature suggested that about 350°C was right. In fact, the nature of an earthen kiln, which was certainly what was used, suggests that a combustion front would pass through the bulk of the material and that the material would be hit with a peak temperature generated by the flame.

Remaining moisture would dampen the final temperature somewhat and perhaps provide a little measure of product control.

We also discover that the agricultural characteristics are markedly superior at the lower temperature. This is important because the tendency with metal kilns is to run to higher temperatures in order to speed the process. That earthen kiln methodology looks more foolproof by the day.

Certainly my intuition told me that an earthen kiln approach was likely the best method and that it was likely to remain a very good option. Once you are into metal, you are no longer losing surplus heat into the atmosphere and you really then need to draw off the volatiles some other way as the pyrolysis boys are trying to do.

The earthen kiln allows all the volatiles to be burned while using only enough heat to char out the feedstock. This is eminently practical for the agricultural industry, from the subsistence farmer up. Corn culture makes it practical for the subsistence farmer, as does elephant grass in Africa. Simply creating a shallow trench by removing topsoil with a blade should allow any other form of farm waste to be packed in and enclosed in dirt to form a similar kiln. Bales can even be set on end, wrapped in a metal sheet, and covered with a layer of dirt before being set afire.

The importance of the dirt is that it will smother the red-hot char as it loses structural integrity. You want it to burn right to that point, and then you need to stop the process as fast as possible.

EFFECT OF LOW TEMPERATURE PYROLYSIS CONDITIONS
ON BIOCHAR FOR AGRICULTURAL USE

http://westinstenv.org/wp-content/Gaskin%20et%20al%202008.pdf

J. W. Gaskin, C. Steiner, K. Harris, K. C. Das, B. Bibens

ABSTRACT. The removal of crop residues for bio‐energy production reduces the formation of soil organic carbon (SOC) and therefore can have negative impacts on soil fertility. Pyrolysis (thermoconversion of biomass under anaerobic conditions) generates liquid or gaseous fuels and a char (biochar) recalcitrant against decomposition. Biochar can be used to increase SOC and cycle nutrients back into agricultural fields. In this case, crop residues can be used as a potential energy source as well as to sequester carbon (C) and improve soil quality. To evaluate the agronomic potential of biochar, we analyzed biochar produced from poultry litter, peanut hulls, and pine chips produced at 400°C and 500°C with or without steam activation. The C content of the biochar ranged from 40% in the poultry litter (PL) biochar to 78% in the pine chip (PC) biochar. The total and Mehlich I extractable nutrient concentrations in the biochar were strongly influenced by feedstock. Feedstock nutrients (P, K, Ca, Mg) were concentrated in the biochar and were significantly higher in the biochars produced at 500°C. A large proportion of N was conserved in the biochar, ranging from 27.4% in the PL biochar to 89.6% in the PC biochar. The amount of N conserved was inversely proportional to the feedstock N concentration. The cation exchange capacity was significantly higher in biochar produced at lower temperature. The results indicate that, depending on feedstock, some biochars have potential to serve as nutrient sources as well as sequester C. Keywords. Agricultural residues, Biochar, Bioenergy, Black carbon, Carbon sequestration, Charcoal, Plant nutrition, Pyrolysis, Soil fertility, Soil organic carbon.

Tuesday, March 3, 2009

China Acts

If there was ever a time to buy assets, it is now. China sat on over two trillion dollars of surplus cash and was niggardly in taking on new projects. Now it can load up without using extensive banking lines to complete acquisitions. That means that no deal is conditional on banking support, something difficult to find today.

The global banking system is going through a major deleveraging of its balance sheets and what that means is that the good assets will be let go only because they are the only ones that can be sold.

This opens the doors for Chinese sovereign wealth to be plowed into assets that will support the continued expansion of the Chinese economy. This story is only beginning, and the tragedy is that this was made possible by the disastrous promotion of the mortgage credit bubble.

From Calgary to Caracas, China snapping up resources

http://www.terradaily.com/reports/From_Calgary_to_Caracas_China_snapping_up_resources_999.html

by Staff Writers
Shanghai (AFP) March 1, 2009

Resource-hungry China has seized upon the financial crisis to sign billions of dollars in deals in a buying spree that is set to pick up pace and reshape the global economic landscape, analysts say.

From Calgary to Caracas, China has hammered out an unprecedented series of agreements over the past month as plummeting energy and commodity prices have left once mighty producers over-extended and short on funds.

"Obviously there are heaps of opportunities out there, given the low asset values," said Sherman Chan, a Sydney-based economist for Moody's economy.com. "They're assessing all these investment opportunities."

In February alone, a six-billion-dollar cash injection secured up to 200,000 barrels of oil per day from Venezuela.

A 25-billion-dollar loan also locked in 15 million tonnes of petrol a year for 20 years from Russia's Rosneft and Transneft. And 400 million dollars bought a Canadian firm's promising Libyan oil field.

In Australia, state-owned aluminium firm Chinalco inked the largest foreign deal ever by a Chinese company putting 19.5 billion dollars into troubled Australian mining giant, Rio Tinto, to increase its stake to 19 percent.

State-owned Hunan Valin Iron and Steel Group offered 650 million dollars for a 16.5 percent stake in Australian iron ore miner Fortescue Metals, while China's Minmetals offered 1.7 billion dollars to take over debt-laden OZ Minerals.

"We will see more and more Chinese companies making overseas deals and they will probably also be on a very big scale," said Shi Jianxun, a professor at Shanghai's Tongji University.

Shi is part of a growing chorus within China arguing the government should now use a big portion of its 1.95 trillion-dollar foreign exchange reserves, the world's largest, to buy assets that can give the nation greater security.

Oil, mining and high-tech companies should be priorities, he said.

"Crude and mineral resources are quite scarce and precious to China. It's worth it to get them because these are important strategic resources that will not depreciate easily," Shi said.

The buy-up began after Chinese state banks extended a record 1.2 trillion yuan (175 billion dollars) in loans in January, as part of the government's economic stimulus plan launched late last year, Moody's Chan said.

The resource acquisitions will complement other aspects of China's infrastructure heavy stimulus plan, which is expected to lead to significant increases in resource exports in the second quarter, Chan said.

Chen Bin, head of the industry department in the National Development and Reform Commission, or NDRC -- the super-ministry that plans China's economy -- told a briefing on Friday the acquisitions were companies acting independently and not government-driven.

But the Russian, Rio Tinto and OZ Minerals deals were financed by the China Development Bank, a policy bank with no deposit base that only deals with projects assigned by the NDRC.

Meanwhile, the head of the sovereign wealth fund, China Investment Corporation, met with Fortescue executives Wednesday to discuss financing.

China's acquisitions are also attracting close scrutiny from politicians abroad. Australian lawmakers, for instance, have raised concerns about China's state-owned entities buying into Australia's resource sector.

Zhang Ming, an economist with the Chinese Academy of Social Sciences, pointed to a range of upsides for China as it goes on its spending spree.

China's reliance on resource imports had left China's low-cost manufacturers vulnerable when prices skyrocketed, but gaining stakes in resource suppliers will help China hedge against future price increases, Zhang said.

"And the relatively low costs in the gloomy market could bring substantial profits once the market heats up," Zhang said.

China's economy 'shows signs of recovery': Wen

A four trillion yuan (585 billion dollar) stimulus package is starting to boost China's economy, but the government is prepared to take stronger measures if needed, Prime Minister Wen Jiabao said.

"The stimulus measures have shown initial effects and produced good results in certain areas," Wen said in an online chat with web users over the weekend, which was widely circulated in the state media Sunday. "We must fully realize we are facing a long-term and arduous task... we are ready to take firmer and stronger actions whenever necessary."

Wen cited rising loans, retail sales in January and increasing power output and consumption since the middle of February as signs of relief. The export-dependent Chinese economy expanded by nine percent in 2008, down steeply from 13 percent growth the year before. In the fourth quarter of last year, it grew by just 6.8 percent. China has released only limited data about how its economy has performed since the beginning of 2009, but some of the figures have been slightly more positive than expected.

During his webchat, Wen pledged to support small- and medium-sized companies and expressed concern for the hardships suffered by up to 20 million migrant workers who have already been laid off by factories hit by the global crisis. He said there would be broad financial support for China's vast rural areas, details of which are expected to be given in Wen's annual work report to parliament on Thursday.

China has set an official economic growth target of eight percent in 2009, considered by the government to be the minimum needed to prevent unemployment reaching a level where social unrest breaks out. In an effort to limit the domestic impact of the global financial crisis, Wen in November announced a four trillion yuan stimulus package, largely aimed at pump-priming consumption and stimulating growth in rural areas.
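As an aside, the currency conversions quoted in these articles can be cross-checked against each other: the stimulus (4 trillion yuan, 585 billion dollars) and the January lending (1.2 trillion yuan, 175 billion dollars) both imply roughly the same exchange rate of about 6.8 yuan per dollar, which is about right for early 2009. A quick sketch, using only the figures quoted above:

```python
# Implied yuan-per-dollar rates from the two yuan/USD pairs quoted above.
stimulus_rate = 4_000_000_000_000 / 585_000_000_000   # 4 trillion yuan / 585 bn USD
loans_rate = 1_200_000_000_000 / 175_000_000_000      # 1.2 trillion yuan / 175 bn USD

print(round(stimulus_rate, 2))  # 6.84
print(round(loans_rate, 2))     # 6.86
```

The near-agreement of the two implied rates suggests the reported figures are internally consistent.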

Sea Ice Underestimated

I grabbed this out of Newsmax, and it is more a report on bad luck than anything else. I do not bother to track this data that closely, but I had noticed that the reports made little sense in view of apparent conditions and simply ignored them. That just shows that my own confidence in the data is low and that I know it is all subject to correction. This is fine, because the important numbers are the final seasonal calculations. This year we should have a substantial sea ice recovery relative to the past two years.

Our winter was clearly cold and long lasting. The new sea ice is maxed out and will take longer to destroy. I also suspect that spring is coming to the Arctic in its traditional time slot.

As I have been posting, this emphatically ends the 1990 – 2007 Northern Hemispheric warming cycle and returns us to weather comparable to the seventies and the eighties. The remaining open question is whether we are facing further cooling. History says we are, and the ongoing lack of sunspots is not a comfort, because that suggests that we may lose a little bit each year until it finally kicks back in.

In the meantime, the global warming fanatics will have their work cut out for them as this so far modest cooling cycle asserts itself.

All we need now is a volcano to do its thing and give us a wrecked growing season. It has happened and it will happen again.

If we have learned anything, though, it is that a handful of factors really drive the final climatic output. They include sunspots (reflecting solar output) and macroscopic decadal climate shifts, with everything else that folks get excited about contributing only small doses.

Those macroscopic decadal shifts are very important because they are the mechanism by which surplus heat is shifted from the equator to the poles for final disposition. The size and duration of these events are such as to make efforts to fine-tune the effect of CO2, if any, utterly meaningless.
The effect is reduced to the impact of a wind-driven cross current on the tide.

The shifts that are apparent include the Pacific Decadal Shift and the forty-year hurricane cycle.

Arctic Sea Ice Underestimated Due to Sensor Glitch

Climate change alarmists are quick to point to diminishing Arctic sea ice as an indicator of global warming. But a faulty sensor led scientists to underestimate the extent of the ice — by an area larger than California.

The error began in early January and persisted until mid-February, according to the National Snow and Ice Data Center (NSIDC) at the University of Colorado, which releases estimates of Arctic sea ice.

The problem was caused by the malfunction of a satellite sensor used for daily updates on the extent of Arctic sea ice.

The NSIDC explained on its Web site: “On February 16, 2009, as e-mails came in from puzzled readers, it became clear that there was a significant problem — sea-ice-covered regions were showing up as open ocean . . .

“Upon further investigation, we found that data quality had begun to degrade over the month preceding the catastrophic failure.

“As a result, our processes underestimated total sea ice extent for the affected period. Based on comparisons with sea ice extent derived from the NASA Earth Observing System Advanced Microwave Scanning Radiometer sensor, this underestimation grew from a negligible amount in early January to about 500,000 square kilometers (193,000 square miles) by mid-February.”

The area of California is about 163,700 square miles.
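The article's comparison is easy to check with a quick unit conversion (a rough sketch; the only input beyond the article's own figures is the standard square-kilometre-to-square-mile conversion factor):

```python
# Sanity check on the NSIDC sea-ice underestimate reported above.
KM2_TO_MI2 = 0.386102  # one square kilometre expressed in square miles

underestimate_km2 = 500_000      # NSIDC's stated error by mid-February 2009
california_mi2 = 163_700         # approximate area of California, sq mi

underestimate_mi2 = underestimate_km2 * KM2_TO_MI2
print(f"Underestimate: {underestimate_mi2:,.0f} sq mi")   # about 193,000 sq mi
print(f"Larger than California? {underestimate_mi2 > california_mi2}")
```

The conversion lands at roughly 193,000 square miles, matching the NSIDC figure and confirming the error was indeed larger than California.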

The NSIDC uses Department of Defense satellites to obtain its Arctic sea ice data, rather than more accurate National Aeronautics and Space Administration equipment, Bloomberg.com reported.

The Arctic ice cap retreated to its smallest extent on record in 2007, then posted its second-lowest annual minimum at the end of last year’s melt season, and the NSIDC said the recent error does not change its view that the ice is retreating.

Plastic Solar Advance

This is another approach on the solar energy problem. By now everyone has figured out that it is desirable to produce a solar cell by printing technology. It is of course possible to do everything crudely in a proof of concept mock up. The hard part is to produce something that solves the manufacturing challenge.

Here they are trying to lay down layers that are bound and communicate properly.

We are overdue for a survey article that explains each and every protocol that is being worked on. It has long been obvious what the limitations were with silicon and why it is necessary to work hard on these alternatives. It would be nice to have as clear a picture on the alternatives. It is not really good enough to mention mysterious layers without saying enough to give us a sense of comfort or to address the options available to be studied.

I am sure that the interest is there.

Googling the topic turns up the fact that the protocol relies on nano-sized zinc oxide. I recall that prior work was thwarted by the difficulty in getting particle sizes small enough to maximize efficiency. Of course, this work is not addressing that issue directly.


University Of Alberta And NINT Researchers Make Solar Energy Breakthrough

by Staff Writers
The University of Alberta and the National Research Council's National Institute for Nanotechnology (NINT) have engineered an approach that is leading to improved performance of plastic solar cells (hybrid organic solar cells).

The development of inexpensive, mass-produced plastic solar panels is a goal of intense interest for many of the world's scientists and engineers because of the high cost and shortage of the ultra-high purity silicon and other materials normally required.

Plastic solar cells are made up of layers of different materials, each with a specific function, called a sandwich structure. Jillian Buriak, a professor of chemistry at the U of A, NINT principal investigator and member of the research team, uses a simple analogy to describe the approach:

"Consider a clubhouse sandwich, with many different layers. One layer absorbs the light, another helps to generate the electricity, and others help to draw the electricity out of the device.

Normally, the layers don't stick well, and so the electricity ends up stuck and never gets out, leading to inefficient devices. We are working on the mayonnaise, the mustard, the butter and other 'special sauces' that bring the sandwich together, and make each of the layers work together. That makes a better sandwich, and makes a better solar cell, in our case".

After two years of research, these U of A and NINT scientists have, by only working on one part of the sandwich, seen improvements of about 30 per cent in the efficiency of the working model.

Michael Brett, professor of electrical and computer engineering, NINT principal investigator and member of the research team is optimistic: "our team is so incredibly cross-disciplinary, with people from engineering, physics and chemistry backgrounds all working towards this common goal of cheap manufacturable solar cells. This collaboration is extremely productive because of the great team with such diverse backgrounds, [although] there is still so much more for us to do, which is exciting." This multidisciplinary approach, common at the National Institute for Nanotechnology, brings together the best of the NRC and the University of Alberta.

The team estimates it will be five to seven years before plastic solar panels will be mass produced but Buriak adds that when it happens solar energy will be available to everyone. She says the next generation of solar technology belongs to plastic.

"Plastic solar cell material will be made cheaply and quickly and in massive quantities by ink jet-like printers."

Monday, March 2, 2009

Dr. Hansen Over the Top

Dr. Hansen’s utterances have recently gone way over the edge into the land of fanaticism and faith-based positions, and they are certainly not the voice of science. His colleagues are quite right to call him out on this, and he would be quite right to back off and apologize.

Of course Mother Nature has called him on the weakness of his science and his explanations have collapsed with each passing month of really lousy weather as he finds himself supporting the unsupportable.

The global temperature needs to rebound and we have seen no evidence of that this winter. It was worse than last winter, and I think that we are going to have a good growing season this year across North America.

Oh well, he probably thinks he is important enough to challenge someone who will bite his head off.

NASA's Chief Climate Scientist Stirs Controversy with Call for Civil Disobedience

Thursday, February 26, 2009
By Joshua Rhett Miller

Dr. James Hansen at a Capitol Hill press conference in 2008.

NASA's chief climate scientist is in hot water with colleagues and at least one lawmaker after calling on citizens to engage in civil disobedience at what is being billed as the largest public protest of global warming ever in the United States.

In a video on capitolclimateaction.org, Dr. James Hansen is seen urging Americans to "take a stand on global warming" during the March 2 protest at the Capitol Power Plant in Southeast Washington, D.C.

"We need to send a message to Congress and the president that we want them to take the actions that are needed to preserve climate for young people and future generations and all life on the planet," says Hansen, who has likened coal-fired power plants to "factories of death" and claims he was muzzled by the Bush administration when he warned of drastic climate changes.

"What has become clear from the science is that we cannot burn all of the fossil fuels without creating a very different planet. The only practical way to solve the problem is to phase out the biggest source of carbon — and that's coal."


But critics say Hansen's latest call to action blurs the line between scientist and activist and may violate the Hatch Act, which prohibits federal employees from participating in partisan political activity.

"Oh my goodness," one of Hansen's former supervisors, Dr. John Theon, told FOXNews.com when informed of the video. "I'm not surprised ... The fact that Jim Hansen has gone off the deep end here is sad because he's a good fellow."

Theon, a former senior NASA atmospheric scientist, rebuked Hansen last month in a letter to the Senate's Environment and Public Works Committee, saying Hansen had violated NASA's official position on climate forecasting without sufficient evidence and embarrassed the agency by airing his claims before Congress in 1988.

"Why he has not been fired I do not understand," Theon said. "As a civil servant, you can't participate in calling for a public demonstration. You may be able to participate as a private citizen, but when you go on the Internet and call for people to break the law, that's a problem."

Officials at the U.S. Office of Special Counsel, which investigates possible Hatch Act violations, disagreed, saying Hansen is in the clear since it's an "issue-oriented activity," according to Hatch Unit attorney Erica Stern Hamrick.

The majority of federal government employees are allowed to take an active part in political activities, while workers at other departments like the FBI, Secret Service and National Security Council are subject to more restrictions on their political activities.

NASA spokesman Mark Hess also defended Hansen.

"He's doing this as a private citizen on his own time and there's nothing wrong with that," Hess told FOXNews.com. "There's nothing partisan here. You don't give up your rights to free speech by becoming a government employee."

Matt Leonard, a project coordinator for Greenpeace, one of more than 90 organizations endorsing the protest, said several thousand people are expected to participate and "peacefully disrupt operations" at the plant just blocks from Capitol Hill.

Participants are willing to "put their bodies on the line to stop climate change," including risking arrest, Leonard said.

"Our intention is to completely surround the facility, basically sending a message that these types of power plants can't be a part of our future," Leonard said. "They're destroying our environment."

Hansen will be in attendance and is expected to speak at the "completely nonviolent, peaceful" protest, Leonard said.

Meanwhile, Rep. Dana Rohrabacher, R-Calif., urged Hansen to rethink his plans.

"If he wants to have a demonstration concerning global warming, coming to the Capitol is not a right choice," Rohrabacher told FOXNews.com. "The bottom line is if Hansen wants to protest global warming, he should go to the National Cathedral and take it up with God rather than going to Capitol Hill."

Rohrabacher, a member of the House's Committee on Science and Technology, called on Hansen to "step out" of his role.

"He obviously doesn't feel comfortable with the restraints that come with being a scientist rather than a political activist," Rohrabacher said. "Most of us have always thought he has been hiding behind a scientific facade, and really, he was a political activist all along."

Chris Horner, author of "Red Hot Lies: How Global Warming Alarmists Use Threats, Fraud, and Deception to Keep You Misinformed," also denounced Hansen's latest call to arms against climate change.

"He's providing ample cause to question his employment on the taxpayer dime," Horner told FOXNews.com. "He's clearly abused his platform provided to him by the taxpayer, principally by the way he's been exposed of manipulating and revising data with the strange coincidence of him always found on the side of exaggerating the warming."

Horner claimed that Hansen doctored temperature data on two occasions in 2001 and once in 2007 in attempts to show an impending climate catastrophe.

"He's creating an upward slope that really wasn't there," Horner said. "At some point you have to say these aren't mistakes."

Hansen, who did not respond to repeated requests for comment on this story, was most recently honored for his work last month with the 2009 Carl-Gustaf Rossby Research Medal, the highest honor bestowed by the American Meteorological Society.

"Jim Hansen is performing a tremendous job at communicating our science to the public and, more importantly, to policymakers and decision-makers," Franco Einaudi, director of the Earth Sciences Division at NASA's Goddard Space Flight Center, said in a press release.

"The debate about global change is often emotional and controversial, and Jim has had the courage to stand up and say what others did not want to hear. He has acquired a credibility that very few scientists have. His success is due in part to his personality, in part to his scientific achievements, and in part to his refusing to sit on the sidelines of the debate."

Former Vice President Al Gore, who toured with Hansen while promoting "An Inconvenient Truth," did not return repeated requests for comment for this article.

Batibe in Cameroon

I have here a report from West Africa in which indigenous peoples produce biochar from elephant grass, which is an ample source of biomass. This is another example of indigenous ingenuity that has produced productive soils comparable to the terra preta soils of the Amazon. It is easy to see corn stover being fitted into this method for the same reason.

I have recently been able to discount the use of pottery as an important active factor in the Amazon. It simply does not show up in terra mulata. That makes the simple earthen kiln the best possible explanation. Here in Cameroon we have a field-length earthen kiln produced and then lit. It is a good bet that the earth collapses behind the burn front, helping smother the produced char.

Corn is obviously much more bulky but the same method could well apply. I still think that building a vertical stack with the root balls forming the outer shell is likely to be much more effective for corn.

The important point though is that biochar is a living indigenous practice in this part of West Africa.

http://e-terrapreta.blogspot.com/2009/02/soils-near-batibo-cameroon.html

The Batibe technique was described to us as working as follows: before the planting season, farmers collect big piles of elephant grass or any other type of savannah grass, which they spread out over their fields to dry it. After the grass has dried, they pile it so as to make long strips, on which they will grow their crops. Then they cover the big rows of grass with a layer of mud, which they leave to dry again. After the mud has dried and hardened, they open one part of the strip and set fire to the grass contained in this "container". The fire travels slowly through this "kiln", providing a low oxygen environment, and chars all the biomass. After this operation, they crush the mud layer, and the char beneath it. They repeat the effort several times to create layers of char and crushed mud. This then becomes their soil bed, on which they start planting crops when the rains arrive. The rains turn this soil layer into an apparently fertile soil. To our own amazement, the farmers of our workshop in Kendem immediately understood the biochar concept, because of their knowledge of this Batibe technique.

Laurens Rademakers

Winter 1709

Around the end of the Little Ice Age, Europe experienced an awful winter that was recognized at the time as the worst on record, a judgment we can accept because of the living knowledge of that time and place. This article gives us a good description, plenty of data, and some suggestive ideas.

The default explanation for the Little Ice Age has been a drop in solar output. The only thing that makes me uncomfortable with that explanation is that we lack good global confirmation. In fact one gets the sense that the rest of the globe was simply unaffected and that the European chill resulted from a confluence of bad luck in the year’s weather.

We had both Vesuvius and Fuji banging away, but they are really insufficient. The predominance of southerly winds is notable only inasmuch as their expected warming effect failed to materialize.

Once again, we have no knowledge of the activity of the Alaskan volcanoes, which could well have the capacity to impact northern climate. And that really says it all. We have blank spots in the global record book that need to be filled somehow. We need to properly determine the ejection history of all active volcanoes around the world, even if it is only to eliminate their influence.

A more important observation is that the reported temperatures are not particularly dangerous. Cold, yes, but certainly well within our working comfort level for all of Europe. It was mostly an inconvenience to most who lived through it. The temperatures quoted are comparable to the continental weather of the Midwest every year. They are not comparable to those of Northern Alberta.

Another observation is that the rebound was rapid over the next two decades. This is the mirror of the present-day reversal being experienced. Whatever confluence of atmospheric dynamics takes place, it seems to recoil immediately toward the equilibrium position.

1709: The year that Europe froze

07 February 2009 by
Stephanie Pain

People across Europe awoke on 6 January 1709 to find the temperature had plummeted. A three-week freeze was followed by a brief thaw - and then the mercury plunged again and stayed there. From Scandinavia in the north to Italy in the south, and from Russia in the east to the west coast of France, everything turned to ice. The sea froze. Lakes and rivers froze, and the soil froze to a depth of a metre or more. Livestock died from cold in their barns, chicken's combs froze and fell off, trees exploded and travellers froze to death on the roads. It was the coldest winter in 500 years.

IN ENGLAND they called the winter of 1709 the Great Frost. In France it entered legend as Le Grand Hiver, three months of deadly cold that ushered in a year of famine and food riots. In Scandinavia the Baltic froze so thoroughly that people could walk across the ice as late as April. In Switzerland hungry wolves crept into villages. Venetians skidded across their frozen lagoon, while off Italy's west coast, sailors aboard English men-of-war died from the cold. "I believe the Frost was greater (if not more universal also) than any other within the Memory of Man," wrote William Derham, one of England's most meticulous meteorological observers. He was right. Three hundred years on, it holds the record as the coldest European winter of the past half-millennium.

Derham was the Rector of Upminster, a short ride north-east of London. He had been checking his thermometer and barometer three times a day since 1697. Similarly dedicated observers scattered across Europe did much the same and their records tally remarkably closely. On the night of 5 January, the temperature fell dramatically and kept on falling. On 10 January, Derham logged -12 °C, the lowest temperature he had ever measured. In France, the temperature dipped lower still. In Paris, it sank to -15 °C on 14 January and stayed there for 11 days. After a brief thaw at the end of that month the cold returned with a vengeance and stayed until mid-March.

Later that year, Derham wrote a detailed account of the freeze and the destruction it caused for the Royal Society's Transactions. Fish froze in the rivers, game lay down in the fields and died, and small birds perished by the million. The loss of tender herbs and exotic fruit trees was no surprise, but even hardy native oaks and ash trees succumbed. The loss of the wheat crop was "a general calamity". England's troubles were trifling, however, compared to the suffering across the English Channel.
In France, the freeze gripped the whole country as far as the Mediterranean. Even the king and his courtiers at the sumptuous Palace of Versailles struggled to keep warm. The Duchess of Orleans wrote to her aunt in Germany: "I am sitting by a roaring fire, have a screen before the door, which is closed, so that I can sit here with a sable fur piece around my neck and my feet in a bearskin sack and I am still shivering with cold and can barely hold the pen. Never in my life have I seen a winter such as this one."

In more humble homes, people went to bed and woke to find their nightcaps frozen to the bed-head. Bread froze so hard it took an axe to cut it. According to a canon from Beaune in Burgundy, "travellers died in the countryside, livestock in the stables, wild animals in the woods; nearly all the birds died, wine froze in barrels and public fires were lit to warm the poor". From all over the country came reports of people found frozen to death. And with roads and rivers blocked by snow and ice, it was impossible to transport food to the cities. Paris waited three months for fresh supplies.

People went to bed and woke to find their nightcaps frozen to the bed-head
There was worse to come. Everywhere, fruit, nut and olive trees died. The winter wheat crop was destroyed. When spring finally arrived, the cold was replaced by worsening food shortages. In Paris, many survived only because the authorities, fearing an uprising, forced the rich to provide soup kitchens. With no grain to make bread, some country people made "flour" by grinding ferns, bulking out their loaves with nettles and thistles. By the summer, there were reports of starving people in the fields "eating grass like sheep". Before the year was out more than a million had died from cold or starvation.

The fact that so many people left accounts of the freeze suggests the winter of 1708/1709 was unusually bad, but just how extraordinary was it?
In 2004, Jürg Luterbacher, a climatologist at the University of Bern in Switzerland, produced a month-by-month reconstruction of Europe's climate since 1500, using a combination of direct measurements, proxy indicators of temperature such as tree rings and ice cores, and data gleaned from historical documents (Science, vol 303, p 1499). The winter of 1708-1709 was the coldest. Across large parts of Europe the temperature was as much as 7 °C below the average for 20th-century Europe.

Why it was quite so cold is harder to explain. The Little Ice Age was at its climax and Europe was experiencing climatically turbulent times: the 1690s saw a string of cold summers and failed harvests, while the summer of 1707 was so hot people died from heat exhaustion. Overall, the climate was colder, with the sun's output at its lowest for millennia. There were some spectacular volcanic eruptions in 1707 and 1708, including Mount Fuji in Japan and Santorini and Vesuvius in Europe. These would have sent dust high into the atmosphere, forming a veil over Europe. Such dust veils normally lead to cooler summers and sometimes warmer winters, but climatologists think that during this persistent cold phase, dust may have depressed both summer and winter temperatures.

None of these things accounts for the extremity of that particular winter, however. "Something unusual seems to have been happening," says Dennis Wheeler, a climatologist at the University of Sunderland, UK. As part of the European Union's Millennium Project, which aims to reconstruct the past 1000 years of Europe's climate, Wheeler is extracting data from Royal Navy logbooks, which provide daily observations of wind and weather. "With daily data you can produce very reliable monthly averages but you can also see what happened from one day to the next," says Wheeler. He and his colleagues have now compiled a database of daily observations stretching back to 1685 from the English Channel area. "This is a key climatic zone. The weather there reflects wider conditions across the Atlantic, which is where in normal circumstances much European weather originates."

The most immediate cause of cold winters in Europe is usually an icy wind from Siberia. "What you would expect would be long runs of easterly winds with a well-developed anticyclone over Scandinavia sucking in cold air from Siberia," says Wheeler. Instead, his data show a predominance of southerly and westerly winds - which would normally bring warm air to Europe. "There were only occasional northerlies and easterlies and those were never for more than a few days," says Wheeler. Another odd finding was that January was unusually stormy. Winter storms tend to bring milder, if wilder, weather to Europe. "This combination of cold, storms and westerlies suggests some other mechanism was responsible for that winter."

There may be no easy explanation for the Great Frost of 1709, but unexpected weather patterns revealed by Wheeler's data underline why climate reconstructions are so important. "We need to explain the natural variation in climate over past centuries so that we can tease apart all those factors that contribute to climate change. But before we can do that we need to nail down those changes in detail," says Wheeler. "Climate doesn't behave consistently and warmer and colder, drier and wetter periods can't always be explained by the same mechanisms." In the two decades after that terrible winter, the climate warmed very rapidly. "Some people point to that and say today's warming is nothing new. But they are not comparable. The factors causing warming then were quite different from those operating now."