Tuesday, February 3, 2009

Whirlpool Power Device Solves Power Dam Salmon Transit Problem

Sometimes we come across a bit of new technology that is far more important than even its developer imagined. This is one of those. Before we get to that, though, this is an important new innovation because it permits easy exploitation of run-of-the-river power opportunities where reservoirs are inconvenient. Also, nearly every river offers a one meter drop just about anywhere, and you do not have to put the whole river through the mill.

The picture shows a millrace feeding a prop that spins slowly with high torque, allowing fish to pass down through it. I do not expect fish to swim up through the prop, but the drop is only one meter, and less likely things sometimes work. More likely, and what is shown in the drawings on the home site, is a bypass channel that can obviously be replaced by a fish ladder to reduce wastage.

Most important is that a fish can swim upstream past this system. On top of that, the system as illustrated can obviously be stacked as a double silo to any necessary height. This means we have a feasible way to allow migrating fish to bypass a tall power dam safely.

We currently truck fish past dams, and it is unsatisfactory. Here we have a solution that also produces power to support its own upkeep.

This directly solves the problem of sharing a salmon fishery with a dam. We still have to work to maintain appropriate streams and gravel washes, but that is the easy part, needing only regulation and a little care. This promises to even be beneficial to the salmon, since a dammed river system has eliminated most of the natural hazards and has opened up ample potential spawning beds and rearing areas.



Capturing the Power of Whirlpools

http://www.ecogeek.org/content/view/2488/85/

Written by Philip Proefrock

Monday, 19 January 2009

An Austrian engineer, Franz Zotlöterer, has developed a new method for small scale hydropower by creating a whirlpool that avoids many of the problems typically associated with hydroelectric generation.

In terms of green power generation, solar and wind get much of the attention. Hydropower is as green as wind and solar in terms of limiting emissions, but some of the ecosystem disruption associated with large-scale hydro has taken it off the table as a choice for good green power. However, smaller scale hydropower options can provide electrical power and provide additional benefits to the waterway.

The original idea behind the vortex was, in fact, not power generation, but water purification. A vortex in the water serves to efficiently aerate the water and to aid in more rapid breakdown of contaminants.

The idea to use a continuous, strong vortex flow actually occurred to Franz Zotlöterer while trying to solve the inherent problems with water quality of the natural swimming pond he had set up in his own garden. He finally decided to build a small rotation basin to aerate the water – and it worked. He then began to think about other potential fields of application for his aeration concept: drinking water supply, wastewater treatment, electricity generation.
- Aquamedia

Instead of channeling the water directly through a turbine, vortex hydropower creates a spinning vortex, and draws energy from the swirling water. This approach makes it possible to generate energy without completely blocking the waterway and eliminates the need for much screening or filtration. Small debris is not a problem for vortex generation as it would be for a conventional turbine. Furthermore, fish are able to pass by the vortex chamber without harm. The vortex operates at slower speeds, and the large open chamber makes it possible for fish to pass even going upstream. The vortex also reduces the temperature change of the water, and can more readily be integrated into the natural river environment.

The pilot plant only needs a fall of 1.3 meters (4.25 feet) and, with a flow rate of 1 cubic meter per second (about 265 gallons per second), produces 8 kW of electricity, enough for about 14 average European homes. A head of as little as 0.7 meters (2.25 feet) is possible for a vortex generator. The much lower rise makes it easier to locate a vortex generator on a smaller waterway, without the need for high dams and other interventions typically associated with hydropower.

The vortex system is about 80% efficient, comparable to a standard turbine. However, the vortex cannot scale as large as a turbine power plant. A vortex has a range of performance up to about 150 kW, while a traditional turbine can reach up to 100 MW.
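The quoted figures are easy to sanity-check with the standard hydropower formula P = η·ρ·g·Q·H, which is not stated in the article but is the usual way to relate head, flow, and output. A minimal sketch, assuming that formula and the article's numbers:

```python
# Back-of-envelope check of the quoted vortex plant figures using the
# standard hydropower formula P = eta * rho * g * Q * H.
# The formula and the implied-efficiency comparison are my own addition,
# not part of the article.

RHO = 1000.0  # density of water, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydro_power_kw(head_m: float, flow_m3s: float, efficiency: float) -> float:
    """Electrical output in kW for a given head, flow rate, and efficiency."""
    return efficiency * RHO * G * flow_m3s * head_m / 1000.0

# Theoretical maximum for the pilot plant: 1.3 m head at 1 m^3/s flow.
theoretical = hydro_power_kw(1.3, 1.0, 1.0)
print(f"Theoretical maximum: {theoretical:.1f} kW")

# The reported 8 kW output implies an overall efficiency of roughly 63%,
# somewhat below the quoted 80% (which may refer to the turbine alone).
implied_eff = 8.0 / theoretical
print(f"Implied overall efficiency: {implied_eff:.0%}")
```

The theoretical maximum works out to about 12.8 kW, so the 8 kW figure and the 80% efficiency claim cannot both describe the whole plant; the 80% plausibly refers to the rotor alone, with the balance lost elsewhere.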

A system that both generates electricity and helps to clean and purify the water is a great technology of the kind we like to see.

Thanks for the tip, VikingHouse

For much more technical detail see:

Monday, February 2, 2009

Catalyst Cracks Ethanol for Fuel Cells

This obscure item is actually rather good news. Available methods, as mentioned, were chemically slow and also likely very inefficient. My own exposure to the problem was discovering that the best chemical method filled the membrane up with chalk. Oh well.

We have already understood that the best available liquid fuel for long term transportation will be ethanol. We can produce it easily today using cattail starch and possibly will also be able to convert the cellulose. This can be accomplished in the volumes necessary without disturbing food production.

The idea of using an ethanol fuel converter to produce hydrogen and to use that hydrogen to produce current through a fuel cell is attractive and likely very efficient.

Most important, it can step into situations where an EEStor style battery based system will likely never be acceptable such as long haul trucking in general.

I particularly draw attention to the following quote:

"The ability to split the carbon-carbon bond and generate CO2 at room temperature is a completely new feature of catalysis," Adzic said. "There are no other catalysts that can achieve this at practical potentials."

If we can extend a similar capability to the conversion of methane, other hydrocarbons, and even other organic molecules, then we have an actual shortcut in the production of hydrogen for powering fuel cells. We certainly have it now for ethanol, which is a giant first step.

Up to this point one was forced to imagine needle forges pyrolyzing methane, perhaps, to break out the hydrogen. It would plausibly work for hydrazine but for nothing else. Now we have a room temperature process that produces CO2 and hydrogen. Does it get any better?


New Catalyst Paves The Path For Ethanol-Powered Fuel Cells

http://www.biofueldaily.com/reports/New_Catalyst_Paves_The_Path_For_Ethanol_Powered_Fuel_Cells_999.html

http://www.energy-daily.com/images/ternary-electrocatalyst-ethanol-oxidation-sm.jpg

Model of a ternary electrocatalyst for ethanol oxidation consisting of platinum-rhodium clusters on a surface of tin dioxide. This catalyst can split the carbon-carbon bond and oxidize ethanol to carbon dioxide within fuel cells.

by Staff Writers
Upton NY (SPX) Jan 29, 2009

A team of scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, in collaboration with researchers from the University of Delaware and Yeshiva University, has developed a new catalyst that could make ethanol-powered fuel cells feasible. The highly efficient catalyst performs two crucial, and previously unreachable, steps needed to oxidize ethanol and produce clean energy in fuel cell reactions. Their results are published online in the January 25, 2009 edition of Nature Materials.

Like batteries that never die, hydrogen fuel cells convert hydrogen and oxygen into water and, as part of the process, produce electricity.

However, efficient production, storage, and transport of hydrogen for fuel cell use is not easily achieved. As an alternative, researchers are studying the incorporation of hydrogen-rich compounds, for example, the use of liquid ethanol in a system called a direct ethanol fuel cell.

"Ethanol is one of the most ideal reactants for fuel cells," said Brookhaven chemist Radoslav Adzic. "It's easy to produce, renewable, nontoxic, relatively easy to transport, and it has a high energy density. In addition, with some alterations, we could reuse the infrastructure that's currently in place to store and distribute gasoline."

A major hurdle to the commercial use of direct ethanol fuel cells is the molecule's slow, inefficient oxidation, which breaks the compound into hydrogen ions and electrons that are needed to generate electricity. Specifically, scientists have been unable to find a catalyst capable of breaking the bonds between ethanol's carbon atoms.

But at Brookhaven, scientists have found a winner. Made of platinum and rhodium atoms on carbon-supported tin dioxide nanoparticles, the research team's electrocatalyst is capable of breaking carbon bonds at room temperature and efficiently oxidizing ethanol into carbon dioxide as the main reaction product. Other catalysts, by comparison, produce acetaldehyde and acetic acid as the main products, which makes them unsuitable for power generation.

"The ability to split the carbon-carbon bond and generate CO2 at room temperature is a completely new feature of catalysis," Adzic said. "There are no other catalysts that can achieve this at practical potentials."
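The article does not spell out why complete oxidation matters so much, but the standard direct-ethanol-fuel-cell stoichiometry makes it concrete: full oxidation (C2H5OH + 3H2O → 2CO2 + 12H+ + 12e−) transfers 12 electrons per molecule, while partial oxidation to acetic acid transfers only 4. A quick sketch of the theoretical charge yield via Faraday's law (my own illustration, assuming these textbook half-reactions):

```python
# Theoretical charge yield of ethanol oxidation via Faraday's law.
# Complete oxidation to CO2 transfers 12 electrons per molecule;
# partial oxidation to acetic acid transfers only 4.
# These are textbook half-reactions, not figures from the article.

F = 96485.0        # Faraday constant, C per mol of electrons
M_ETHANOL = 46.07  # molar mass of ethanol, g/mol

def charge_ah_per_gram(electrons: int) -> float:
    """Theoretical charge in amp-hours per gram of ethanol."""
    return electrons * F / M_ETHANOL / 3600.0

full = charge_ah_per_gram(12)    # complete oxidation to CO2
partial = charge_ah_per_gram(4)  # partial oxidation to acetic acid
print(f"Complete oxidation: {full:.2f} Ah/g")
print(f"Partial oxidation:  {partial:.2f} Ah/g")
```

Complete oxidation yields roughly three times the charge per gram of fuel, which is why a catalyst that stops at acetaldehyde or acetic acid is unsuitable for power generation.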

Structural and electronic properties of the electrocatalyst were determined using powerful x-ray absorption techniques at Brookhaven's National Synchrotron Light Source, combined with data from transmission electron microscopy analyses at Brookhaven's Center for Functional Nanomaterials.

Based on these studies and calculations, the researchers predict that the high activity of their ternary catalyst results from the synergy between all three constituents - platinum, rhodium, and tin dioxide - knowledge that could be applied to other alternative energy applications.

"These findings can open new possibilities of research not only for electrocatalysts and fuel cells but also for many other catalytic processes," Adzic said.

Next, the researchers will test the new catalyst in a real fuel cell in order to observe its unique characteristics first hand.

This work is supported by the Office of Basic Energy Sciences within DOE's Office of Science.

Comet Impact Theory Challenged

This is one of those items where the headline promises more than it actually delivers. Once again, lack of evidence is hardly evidence of lack. This is about evidence supporting an increase in fires as a result of an improving climate. That is all well and good. It does not disprove the existence of a comet explosion, or an asteroid impact, leading to the initiation of the climatic change.

That headline writers were extrapolating fires from the shock wave was optimistic. In fact, what evidence existed was in the form of charcoal, and that is not evidence of burning so much as of a blast of heat that toasted large areas.

If significant charcoal survived wildfires then all our soils would be terra preta. The reality is that the charcoal succumbs to fire sooner or later.

The evidence of a blast is in the rock scatter throughout the Ohio Valley. That others have suggested a Tunguska-like event would have ignited a continental wildfire is pure speculation and, as suggested, not supported. There are related charcoal deposits that are not lake-sediment related and likely need to be understood.

A charring event following a shock wave passing over grassland would likely produce a carbon layer if the grass did not ignite, except that layer would be consumed during the next season's grassfire. The same event passing over a forest might just leave enough carbon buried in fallen trees to protect the fresh carbon layer. This tells us that a lot more data is needed in order to put the early claims in proper perspective.

That a cosmic event took place is not in dispute, although this headline implies otherwise. The event put a lot of dust, rock and ice on a trajectory toward the Carolinas from just west of Hudson Bay. It created a shock front that flattened exposed ground but likely left huge tracts untouched.

It was a large object whose energy was largely absorbed by the crust. Much of the shock was absorbed by the ice. And it may have broken up just before impact.

It would be nice if the continent had burned over also, but that begs the question of why it should have. It is amazing how a good tale runs ahead of facts that are already dramatic enough.

They did prove that a warmer climate produces more biomass and that this produces more fuel which produces more fire.


Comet Impact Theory Disproved

http://www.spacedaily.com/reports/Comet_Impact_Theory_Disproved_999.html

by Staff Writers
Bristol, UK (SPX) Jan 28, 2009

New data disproves the recent theory that a large comet exploded over North America 12,900 years ago, causing a shock wave that travelled across North America at hundreds of kilometres per hour and triggering continent-wide wildfires.

Dr Sandy Harrison from the University of Bristol and colleagues tested the theory by examining charcoal and pollen records to assess how fire regimes in North America changed between 15 and 10,000 years ago, a time of large and rapid climate changes.

Their results provide no evidence for continental-scale fires, but support the fact that the increase in large-scale wildfires in all regions of the world during the past decade is related to an increase in global warming.

Fire is the most ubiquitous form of landscape disturbance and has important effects on climate through the global carbon cycle and changing atmospheric chemistry. This has triggered an interest in knowing how fire has changed in the past, and particularly how fire regimes respond to periods of major warming.

The end of the Younger Dryas, about 11,700 years ago, was an interval when the temperature of Greenland warmed by over 5 degrees C in less than a few decades. The team used 35 records of charcoal accumulation in lake sediments from sites across North America to see whether fire regimes across the continent showed any response to such rapid warming.

They found clear changes in biomass burning and fire frequency whenever climate changed abruptly, and most particularly when temperatures increased at the end of the Younger Dryas cold phase. The results were published January 26 in the Proceedings of the National Academy of Sciences.

Understanding whether rapid changes in climate have caused wildfires in the past will help us understand whether current changes in global temperatures will cause more frequent fires at the present time. Such fires have a major impact on the economy and health of the population, as well as feeding into the increase in global warming.

Polls Turn Against Global Warming

I am personally disturbed that the Global Warming debate has degenerated into blind propaganda and deliberate intimidation when challenged. Everyone involved is acting more and more desperate and strident, particularly the media crowd who are finding themselves riding a dying horse.

Now we have polls showing that the great unwashed is no longer buying the party line and is getting downright surly.

The Media needs a new story to sell. Global Warming has lost its legs and will not now be back for decades.
Perhaps we are hearing the howls of chagrin as every reporter loses his best excuse for a story spin on any mundane subject. Maybe we can get them back to chasing volcanoes and hurricanes.

It is still remarkable that the unrelenting drumbeat of pro-global-warming material is actually driving support in reverse. A very real lesson for those who wish to shape public opinion is that at some point you begin to work against yourself. This is not obvious, but the harder you work to catch everybody, the more you force your soft support to review their position. And minds thus engaged do change.

This is an interesting curve. As funds expended climb, the support percentage also climbs quite briskly at first, then levels off, and then begins to decline slightly. Obviously, once you hit a peak, it is best to smartly drop expenditures to maintenance levels.

This should apply nicely to advertising campaigns.

By Bob Ellis on January 19th, 2009

Al Gore and his disciples are losing the war to convince the American people that human beings are responsible for any warming of planet earth.

Forty-four percent (44%) of U.S. voters now say long-term planetary trends are the cause of global warming, compared to 41% who blame it on human activity.

This is down from 47% last year, so the Gore-ites are progressively losing their grip on the minds of the people.

This is especially remarkable given that the “mainstream” media has been 110% in the tank for Gore and his outrageous claims for years. They have been totally complicit in pushing this notion despite the fact that there is little science to back it up, and a mountain of conflicting evidence.

NASA temperature records have been found to be erroneous and have been revised…with little or no fanfare in the “mainstream” media.

Many of the weather stations gathering evidence have also been compromised and do not provide reliable data.

Believers in this theory also ignore evidence going back thousands of years (some say even farther) showing natural climate cycles.

They ignore the most obvious possibility: that the huge star in the middle of our solar system (known commonly as “the sun”) may be driving temperature change on earth. Warming has been detected on Mars and Jupiter, where there are no SUVs or power plants, yet we are expected to believe humans and their capitalistic progress are the villains on this planet.

The “mainstream” media has also attempted to create a perception that there is no dissent in the scientific community to this wacky idea, despite the fact that thousands of scientists have officially registered their disagreement, and many continually go on record (in the off-the-beaten-path venues where they are allowed to go on record) with their skepticism.

It’s heartening to see the ability of the American people to see through the environmentalist garbage they’re fed from the “mainstream” media and others on the Left.

Too bad that wasn’t the case in the recent election.

Friday, January 30, 2009

Death Throes of Global Warming Hypothesis

A lot of the members of the scientific community are finding they are sorry that they ever signed on to the global warming enthusiasm.

When I began this blog, the fact that the climate had warmed up a degree or so and had then held on to that increase for a decade was obviously true. The idea that this was an event outside the normal range of climatic variation was not obviously true. In fact, the apparent cyclic time spans involved supported the idea that this was a natural recurrence.

For that reason, from the beginning, I explicitly separated my concerns over the accumulation of CO2 from my ongoing interest in the climate and my principal theme of terraforming the Earth, and yes, actually warming up the Northern Hemisphere in the process.

History will show that the atmosphere discharged surplus heat into the Arctic in 2007 and that global temperatures have since fallen back abruptly by a degree or so. In short, the warmth that took a decade or more to accumulate, and that was sustained for an additional decade, was lost almost overnight. We also have a much better understanding of the mechanism.

In the summer of 2007, we were on the road to an ice-free summer Arctic by 2012. Three months later, the switch had visibly been pulled and we had started down the slope. That has continued through this winter. We are literally back to the worst of it and hoping that some nasty volcano does not pick this time to blow its top.

Right now, the folks who should have known better, or were simply too intimidated to speak their minds, are standing up and kicking this dead horse to death.

It will be fun to promote my program for employing two billion people and warming the Earth while the media gets back to promoting the next ice age. Maybe someone will listen.

The turning point—it’s becoming chic to be a skeptic

This must be it, surely, the point where being a skeptic has more scientific cachet than being a believer. The trickle is becoming a flood. We are reaching the stage where independent scientists will want to make sure they are known to be on the skeptical side of the fence.

None other than James Hansen’s former supervisor at NASA has just announced that not only is he a skeptic, but that Hansen is an embarrassment to NASA and was never muzzled. In a message to the Minority Office at the Environment and Public Works Committee, Theon wrote:

“I appreciate the opportunity to add my name to those who disagree that global warming is man made, …I was, in effect, Hansen’s supervisor because I had to justify his funding, allocate his resources, and evaluate his results”

“Hansen was never muzzled even though he violated NASA’s official agency position on climate forecasting (i.e., we did not know enough to forecast climate change or mankind’s effect on it). Hansen thus embarrassed NASA by coming out with his claims of global warming in 1988 in his testimony before Congress,”

Retired senior NASA atmospheric scientist, Dr. John S. Theon

Theon joins a growing list of over 650 prominent skeptics. Here’s how the list is becoming a story all of its own, and the drive to publicly announce skepticism is picking up pace.

Dec 11, 2008: Marc Morano released an updated list of 650 skeptics; it’s a 230-page PDF with quotes and qualifications from prominent skeptical scientists that even includes past and present IPCC authors. As people became aware of the list, the clamour began from those who wanted to join in. 11 scientists joined the list in the next two weeks, including Dr Schaffer and Dr Happer (below).

Dec 19, 2008: Dr Schaffer, Professor of Ecology & Evolutionary Biology at the Uni of Arizona, has authored more than 80 scientific publications, including the paper “Human Population and Carbon Dioxide.”

“The recent lack of warming in the face of continued increases in CO2 suggests (a) that the effects of greenhouse gas forcing have been over-stated; (b) that the import of natural variability has been underestimated and (c) that concomitant rises of atmospheric CO2 and temperature in previous decades may be coincidental rather than causal,” he added. “I fear that things could easily go the other way: that the climate could cool, perhaps significantly; that the consequences of a new Little Ice Age or worse would be catastrophic and that said consequences will be exacerbated if we meanwhile adopt warmist prescriptions. This possibility, plus the law of unintended consequences, leads me to view proposed global engineering ‘solutions’ as madness.”

Dr. W. M. Schaffer, Professor, Uni of Arizona

Dec 22, 2008: Dr Will Happer, Professor at Princeton University and former Director of Energy Research at the Department of Energy from 1990 to 1993, has published over 200 scientific papers.

“I had the privilege of being fired by Al Gore, since I refused to go along with his alarmism. I did not need the job that badly… I have spent a long research career studying physics that is closely related to the greenhouse effect, for example, absorption and emission of visible and infrared radiation, and fluid flow… Fears about man-made global warming are unwarranted and are not based on good science.”

Dr Will Happer, Professor, Princeton

January 7th, 2009: Jack Schmitt, the last man to walk on the moon, announced he was a skeptic.

“As a geologist, I love Earth observations,” Schmitt wrote, “But, it is ridiculous to tie this objective to a ‘consensus’ that humans are causing global warming when human experience, geologic data and history, and current cooling can argue otherwise. ‘Consensus,’ as many have said, merely represents the absence of definitive science. You know as well as I, the ‘global warming scare’ is being used as a political tool to increase government control over American lives, incomes and decision making…”

Jack Schmitt, Geology PhD, Harvard, NASA Astronaut

This is probably the sweetest of the lot in a way. As Marc Morano points out, back in 2006 Al Gore said “The debate’s over. The people who dispute the international consensus on global warming are in the same category now with the people who think the moon landing was staged on a movie lot in Arizona.”

Back then, being a skeptic was supposedly equivalent to being a nut-case. Now, even those who’ve landed on the moon dispute the consensus.

The momentum is growing. History will record this cold northern winter as the season when being known as a skeptic became scientifically hip, and being labelled a ‘believer’—scientifically uncool (as it should be).

I say: scientists everywhere, be proud of our standards, stand up and be counted. Rise against Dark-Age-reasoning, political pressure and the call of government grants.

Iranian Rocketry

Iran’s enthusiasm for nuclear weapons and rockets is not the type of good news any of us want to read about. We used to dream that such weapons could be kept out of the hands of apparent sociopaths and that it would be possible to keep us all safe.

What we learn instead is that if the will and the money exist, then these systems can be built and put in place by any country. And President Whack Job is doing a fairly good imitation of a sociopath.

The only good news is that their capability is not as imminent as some would have you believe. We actually have time to allow natural regime change to work its way through the Iranian government.

We still have the present capability of the Pakistanis and the pretense of the North Koreans.

The real problem is that Islam permits individuals to preach a doctrine that would have shamed Hitler. There will always be someone who will strive to act on these teachings, to satisfy his blood cravings under cover of this groupthink. You only have to look at the ease with which suicide bombers are recruited.

That means we must address the problem of effective counter measures.

We have accorded the possessors of nuclear weapons a respect that has reinforced their behavior. In fact it has legitimized their acquisition.

The only effective and lasting solution is and will be the outright deNazification of Islam. It took five to ten years in Germany to drain the poison. It will take just as long to do the same for any Islamic country.

The non-Islamic world needs to unite in this program and challenge Islam everywhere, at home and abroad. Most of Islam will cooperate and support this, because Islam will prosper once Islamic barbarism itself is extinct.

The program entails the arrest and imprisonment of all extreme interpreters of the Koran for a minimum of five years, and permanent probation blocking them from any position of power. Liberal clerics would be organized to supply command and control, filling any positions thus opened in society.

Down this road all of Islam can prosper and survive. The other road is a perpetuation of Gaza unto the seventh generation and the murder of millions.

Regimes that oppose this process can be embargoed step by step. There is plenty to do and we have all the time in the world to see it through. Fortunately, most of Islam can make this happen very easily, and much more easily still if it is part of a global cleansing supported by a united globe.

Iran Set to Launch First Domestic Satellite by March 20

by Staff Writers
Tehran (RIA Novosti) Jan 29, 2009

http://www.spacedaily.com/reports/Iran_Set_To_Launch_First_Domestic_Satellite_By_March_20_999.html

Iran plans to put its first domestically made communication satellite into orbit by March 19-20, the head of the Iranian space agency has said.

"If we do not run into problems, the first domestic satellite will be put in orbit by the end of this [Iranian solar calendar] year," Reza Taqipour said.

He said that technical experts were working to complete the preparations, adding that the precise launch date for the Omid (Hope) satellite would be announced as it drew nearer.

In November, Iran launched a carrier space rocket, Kavoshgar 2 (Explorer 2), which returned to earth after completing its mission.

The project was part of the country's "strategic space program" and "preparation for scientific and technological developments in space," according to Iran's state TV IRIB.

Last August, Iran successfully launched a carrier rocket Safir (Messenger), capable of putting lightweight satellites into low-earth orbit.

Iran has said it plans to put a "series of satellites" into space by 2010 to aid natural disaster management programs and improve telecommunications.

The launches have aroused concerns throughout the world that Tehran is developing long-range ballistic missile technology that could be used to launch nuclear weapons.

Shadi Bushra on Death of Monroe Doctrine

This article is a wake-up call for all of us in North America. We have been utterly complacent in our private and public dealings with the southern continent.

Curiously, I have just recently read a biography of Lula, and if you do not have a clue, please google Lula. I took strong encouragement from his emergence. Real democracy is working in Brazil, and real progressive steps are being taken. It is never fast enough, but the potential is palpable.

Brazil is rising as another super economy, not unlike India and China, with the appropriate political reliability to go with it. Brazil’s rising middle class will soon dominate the remainder of South America and establish a pan-South American consensus in dealing with the North and elsewhere.

The day of heavy-handed US interventionism is fast disappearing, and this is for the better. The US never wanted it, but confronted by yet another version of broken governance, it was always unavoidable.

We should do all that is possible to welcome a Brazil-led South America onto the world stage and into greater international responsibility. The sooner Obama embraces Lula, the better for all.

The Death of the Monroe Doctrine

January, 2009
by Shadi Bushra

Nearly two hundred years ago, President James Monroe declared the entire Western Hemisphere the dominion of the United States of America. His none too subtle warnings to the European powers to keep out of Latin American affairs led to the budding of policies that would use enormous military, economic, and political pressure from Washington to dictate to Latin America the terms of their relationship.
Today, however, we are seeing a Latin America whose weariness and frustration with the United States is amounting to a sharp divergence of policies as our neighbors to the south pursue their national and regional interests with little regard for the United States’ opinion.
From Venezuela’s confrontational Chavez to Brazil’s more moderate Lula, leaders throughout Latin America are finding new partners on issues from development to defense. This November, Russian warships engaged in naval exercises with Venezuela followed by a visit to Cuba, the first such excursion to the Caribbean since the Cold War. In the same period of time, Chinese President Hu Jintao signed a free-trade agreement with Peru, and Brazil invited Iranian President Mahmoud Ahmadinejad for a state visit.
In response to these and other diplomatic overtures from rivals, Thomas Shannon, Assistant Secretary of State for Western Hemisphere Affairs said, “We don’t subscribe to the hydraulic theory of diplomacy that when one country is up, the other is down - that if China and Russia are in the area our influence has somehow waned.” In any case, it is no longer possible to say that the United States is the only power which Latin America can, or particularly wants to do business with.
As our international image, and with it our global influence, has decreased, many countries are realizing that overly friendly or close relations with the United States are no longer necessary to their survival. Beyond such overt encroachments by competitors, much of America’s own policy towards Latin America has led to a cooling of relations between Washington and much of the region.
We have continued to insist that NAFTA and CAFTA have promoted growth despite evidence to the contrary. Meanwhile, bilateral agreements with Colombia, Panama, and Peru are pending specific conditions or Congressional approval.
Despite the free trade agreements in place and under negotiation, the de facto death of the US-backed Free Trade Area of the Americas (FTAA) seems to be indicative of a broader trend in Latin America which recognizes that we live in an incredibly globalized world where capital and markets can be found just as easily in China or Russia as they can in the United States.
While Russia has been focused largely on military sales, China has kept itself open to any and all opportunities. Two-way trade with the region shot up twelvefold since 1995 to $110 billion last year, according to the Inter-American Development Bank. China’s share of the region’s imports also jumped, to 24 percent from 9.8 percent in 1990, while the U.S. share shrank to 34 percent from 43 percent.
Additionally, China has offered loans with far fewer conditions than international or American institutions. However, Latin America is not only looking for assistance from America’s global competitors; the region is increasingly turning inward to solve its problems. The economic crisis and the global powers’ uncoordinated reactions reaffirmed that developing countries would have to unite if they were to weather this economic storm and the political and social challenges that came with it.
A recent regional summit put a much finer point on it by excluding the United States and inviting Cuba, which Washington has attempted to make a pariah in the hemisphere. Raul Castro’s presence and the warm welcome extended to him by the summit’s host, Brazil’s President Lula, and their counterparts from the 31 other countries represented marked a sharp deviation from Cuba’s 1961 expulsion from the Organization of American States (OAS) at the United States’ behest.
The summit called for greater economic integration (including very vocal and at times confrontational calls to lift the embargo on Cuba), a regional bank for countries to turn to before the despised IMF and World Bank, and reduced dependence on US arms through improved South American defense industries. To some, this summit gives the image of a region that is turning increasingly hostile towards the US.
But in reality, no leader in the area is demanding anything from the United States beyond respect of their sovereignty and their right to make decisions in the best interest of their people. Where those interests are in agreement with ours, as they often are, we can work in concert to achieve common goals. Where they conflict, we must realize that we can no longer impose our will upon the Latin American people. That, at the very least, has been made thoroughly clear.

Thursday, January 29, 2009

Laffer Curve Applied and Oil Rationing

The original article by Laffer in 2004 is much longer and investigates a great deal of the history of tax cuts in general and of what is now known as the Laffer Curve. The curve’s prehistory may be long, but it was Arthur Laffer and, importantly, Ronald Reagan who finally rewired finance ministers worldwide to understand the meaning of this curve as applied to public policy.

Historic resistance to this paradigm came from the simple inability of most of the population to intuit second-order mathematics. A linear argument is always more beguiling, even if it is the road to wrack and ruin. The paradigm’s successful application has delivered twenty-five years of sustained economic growth here and abroad, as well as government budget surpluses. It has only been halted because the bankers got drunk one night, listened to their greed, and threw away the regulatory governors on a financial system that was poorly structured to start with.

However, dear reader, understanding this curve and matching it to the current state of taxation policy will inform you as well as any finance minister of the future condition of the economy.

We have a progressive income tax system, and we have a series of turnover taxes. The latter respond directly to the ebb and flow of the economy, and recently in Canada’s case led directly to the ballooning of governmental revenues and the paying down of deficits for almost the first time in living memory.

However, the progressive nature of the income tax has led to tax creep as incomes have risen. This has also happened elsewhere, including the USA. That is why it is possible for politicians to offer tax cuts every couple of years or so.

To bail us out of the current disaster, the policy calls for strong deficit funding of present capital programs, including bailouts and financial infrastructure recapitalization, which is clearly falling into place just as fast as possible. It also calls for a cut in taxation ahead of the expected economic rebound to give the economy the maximum running room. We can expect a rapid recovery of government revenue surpluses as the economy swings back into motion.

The Oil Problem

The problem we do not control is the price of oil, which is draining liquidity out of the North American economy. Much of this is a massive tax on US business that is not being repatriated and is instead sloshing around looking for a home rather than being directly reinvested. We have lived with a lot of this for a long time. What we cannot live with is volatility such as drained the blood out of the country last summer and accelerated the developing crash.

All solutions are ugly. The best solution is to put a ceiling on the amount of money that will be spent on foreign oil. That guarantees shortages, so the ceiling must be coordinated with a simple rationing system that gives supply priority to industry and transportation. Last on the list will be the private automobile for luxury driving.

A couple of important things happen, though. The turnover price becomes a de facto ceiling that everyone recognizes as the point of long-term diminishing returns. It should make a run-up such as was delivered last summer nearly impossible, and certainly far more costly to sustain.

Much more importantly, American industry will have no choice but to move heaven and earth to replace oil with a secure energy source. The gasoline car will make its exit from the mass market in a couple of years instead of slowly, with lots of foot dragging.

And because the USA has so far to go, it will become the global standard for solutions to the energy problem.




June 1, 2004

The Laffer Curve: Past, Present, and Future

by Arthur B. Laffer
Backgrounder #1765

http://www.heritage.org/Research/Taxes/bg1765.cfm

The story of how the Laffer Curve got its name begins with a 1978 article by Jude Wanniski in The Public Interest entitled, "Taxes, Revenues, and the `Laffer Curve.'"
As recounted by Wanniski (associate editor of The Wall Street Journal at the time), in December 1974, he had dinner with me (then professor at the University of Chicago), Donald Rumsfeld (Chief of Staff to President Gerald Ford), and Dick Cheney (Rumsfeld's deputy and my former classmate at Yale) at the Two Continents Restaurant at the Washington Hotel in Washington, D.C. While discussing President Ford's "WIN" (Whip Inflation Now) proposal for tax increases, I supposedly grabbed my napkin and a pen and sketched a curve on the napkin illustrating the trade-off between tax rates and tax revenues. Wanniski named the trade-off "The Laffer Curve."

I personally do not remember the details of that evening, but Wanniski's version could well be true. I used the so-called Laffer Curve all the time in my classes and with anyone else who would listen to me to illustrate the trade-off between tax rates and tax revenues. My only question about Wanniski's version of the story is that the restaurant used cloth napkins and my mother had raised me not to desecrate nice things.

The Historical Origins of the Laffer Curve

The Laffer Curve, by the way, was not invented by me. For example, Ibn Khaldun, a 14th century Muslim philosopher, wrote in his work The Muqaddimah: "It should be known that at the beginning of the dynasty, taxation yields a large revenue from small assessments. At the end of the dynasty, taxation yields a small revenue from large assessments."

A more recent version (of incredible clarity) was written by John Maynard Keynes:

When, on the contrary, I show, a little elaborately, as in the ensuing chapter, that to create wealth will increase the national income and that a large proportion of any increase in the national income will accrue to an Exchequer, amongst whose largest outgoings is the payment of incomes to those who are unemployed and whose receipts are a proportion of the incomes of those who are occupied...

Nor should the argument seem strange that taxation may be so high as to defeat its object, and that, given sufficient time to gather the fruits, a reduction of taxation will run a better chance than an increase of balancing the budget. For to take the opposite view today is to resemble a manufacturer who, running at a loss, decides to raise his price, and when his declining sales increase the loss, wrapping himself in the rectitude of plain arithmetic, decides that prudence requires him to raise the price still more--and who, when at last his account is balanced with nought on both sides, is still found righteously declaring that it would have been the act of a gambler to reduce the price when you were already making a loss.

Theory Basics

The basic idea behind the relationship between tax rates and tax revenues is that changes in tax rates have two effects on revenues: the arithmetic effect and the economic effect. The arithmetic effect is simply that if tax rates are lowered, tax revenues (per dollar of tax base) will be lowered by the amount of the decrease in the rate. The reverse is true for an increase in tax rates. The economic effect, however, recognizes the positive impact that lower tax rates have on work, output, and employment--and thereby the tax base--by providing incentives to increase these activities. Raising tax rates has the opposite economic effect by penalizing participation in the taxed activities. The arithmetic effect always works in the opposite direction from the economic effect. Therefore, when the economic and the arithmetic effects of tax-rate changes are combined, the consequences of the change in tax rates on total tax revenues are no longer quite so obvious.

Figure 1 is a graphic illustration of the concept of the Laffer Curve--not the exact levels of taxation corresponding to specific levels of revenues. At a tax rate of 0 percent, the government would collect no tax revenues, no matter how large the tax base. Likewise, at a tax rate of 100 percent, the government would also collect no tax revenues because no one would willingly work for an after-tax wage of zero (i.e., there would be no tax base). Between these two extremes there are two tax rates that will collect the same amount of revenue: a high tax rate on a small tax base and a low tax rate on a large tax base.
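The shape just described can be sketched with a toy revenue function. The quadratic form below is an illustrative assumption of mine, not Laffer's model; it is chosen only because it is zero at both endpoints and pairs a high rate on a small base with a low rate on a large base:

```python
# Toy Laffer curve: revenue(rate) = rate * base(rate), with the tax base
# assumed to shrink linearly as the rate rises. Purely illustrative.

def revenue(rate):
    """Tax revenue at a given rate, on a 0.0-1.0 scale."""
    base = 1.0 - rate      # assumed tax base: erodes as the rate climbs
    return rate * base

assert revenue(0.0) == 0.0                      # 0% rate: no revenue
assert revenue(1.0) == 0.0                      # 100% rate: no tax base
# A high rate on a small base and a low rate on a large base
# collect the same revenue:
assert abs(revenue(0.9) - revenue(0.1)) < 1e-12
```

Any concave function vanishing at both endpoints reproduces the qualitative picture; the real curve's peak location is an empirical question the article deliberately leaves open.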

http://www.heritage.org/Research/Taxes/images/35038257.gif



The Laffer Curve itself does not say whether a tax cut will raise or lower revenues. Revenue responses to a tax rate change will depend upon the tax system in place, the time period being considered, the ease of movement into underground activities, the level of tax rates already in place, the prevalence of legal and accounting-driven tax loopholes, and the proclivities of the productive factors. If the existing tax rate is too high--in the "prohibitive range" shown above--then a tax-rate cut would result in increased tax revenues. The economic effect of the tax cut would outweigh the arithmetic effect of the tax cut.

Moving from total tax revenues to budgets, there is one expenditure effect in addition to the two effects that tax-rate changes have on revenues. Because tax cuts create an incentive to increase output, employment, and production, they also help balance the budget by reducing means-tested government expenditures. A faster-growing economy means lower unemployment and higher incomes, resulting in reduced unemployment benefits and other social welfare programs.

Over the past 100 years, there have been three major periods of tax-rate cuts in the U.S.: the Harding-Coolidge cuts of the mid-1920s; the Kennedy cuts of the mid-1960s; and the Reagan cuts of the early 1980s. Each of these periods of tax cuts was remarkably successful as measured by virtually any public policy metric.

Prior to discussing and measuring these three major periods of U.S. tax cuts, three critical points should be made regarding the size, timing, and location of tax cuts.

Size of Tax Cuts

People do not work, consume, or invest to pay taxes. They work and invest to earn after-tax income, and they consume to get the best buys after tax. Therefore, people are not concerned per se with taxes, but with after-tax results. Taxes and after-tax results are very similar, but have crucial differences.

Using the Kennedy tax cuts of the mid-1960s as our example, it is easy to show that identical percentage tax cuts, when and where tax rates are high, are far larger than when and where tax rates are low. When President John F. Kennedy took office in 1961, the highest federal marginal tax rate was 91 percent and the lowest was 20 percent. By earning $1.00 pretax, the highest-bracket income earner would receive $0.09 after tax (the incentive), while the lowest-bracket income earner would receive $0.80 after tax. These after-tax earnings were the relative after-tax incentives to earn the same amount ($1.00) pretax.

By 1965, after the Kennedy tax cuts were fully effective, the highest federal marginal tax rate had been lowered to 70 percent (a drop of 23 percent--or 21 percentage points on a base of 91 percent) and the lowest tax rate was dropped to 14 percent (30 percent lower). Thus, by earning $1.00 pretax, a person in the highest tax bracket would receive $0.30 after tax, or a 233 percent increase from the $0.09 after-tax earned when the tax rate was 91 percent. A person in the lowest tax bracket would receive $0.86 after tax or a 7.5 percent increase from the $0.80 earned when the tax rate was 20 percent.

Putting this all together, the increase in incentives in the highest tax bracket was a whopping 233 percent for a 23 percent cut in tax rates (a ten-to-one benefit/cost ratio) while the increase in incentives in the lowest tax bracket was a mere 7.5 percent for a 30 percent cut in rates--a one-to-four benefit/cost ratio. The lessons here are simple: The higher tax rates are, the greater will be the economic (supply-side) impact of a given percentage reduction in tax rates. Likewise, under a progressive tax structure, an equal across-the-board percentage reduction in tax rates should have its greatest impact in the highest tax bracket and its least impact in the lowest tax bracket.
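The bracket arithmetic above can be checked directly; every input below is a number quoted in the text:

```python
# The Kennedy-cut incentive arithmetic from the text, verified.

def after_tax(rate):
    """After-tax return on $1.00 of pretax earnings."""
    return 1.0 - rate

def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Top bracket: 91% -> 70%
top_incentive_gain = pct_change(after_tax(0.91), after_tax(0.70))   # ~233%
top_rate_cut = (0.91 - 0.70) / 0.91 * 100                           # ~23%

# Bottom bracket: 20% -> 14%
bottom_incentive_gain = pct_change(after_tax(0.20), after_tax(0.14))  # 7.5%
bottom_rate_cut = (0.20 - 0.14) / 0.20 * 100                          # 30%

print(round(top_incentive_gain), round(top_rate_cut))           # 233 23
print(round(bottom_incentive_gain, 1), round(bottom_rate_cut))  # 7.5 30
```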

Timing of Tax Cuts
The second, and equally important, concept of tax cuts concerns the timing of those cuts. In their quest to earn after-tax income, people can change not only how much they work, but when they work, when they invest, and when they spend. Lower expected tax rates in the future will reduce taxable economic activity in the present as people try to shift activity out of the relatively higher-taxed present into the relatively lower-taxed future. People tend not to shop at a store a week before that store has its well-advertised discount sale. Likewise, in the periods before legislated tax cuts take effect, people will defer income and then realize that income when tax rates have fallen to their fullest extent. It has always amazed me how tax cuts do not work until they actually take effect.

When assessing the impact of tax legislation, it is imperative to start the measurement of the tax-cut period after all the tax cuts have been put into effect. As will be obvious when we look at the three major tax-cut periods--and even more so when we look at capital gains tax cuts--timing is essential.

Location of Tax Cuts

As a final point, people can also choose where they earn their after-tax income, where they invest their money, and where they spend their money. Regional and country differences in various tax rates matter.

Thorium Energy Paradigm

I posted last year on thorium, and here we have a much better bit of intelligence on the prospects of a thorium industry arising. As I pointed out earlier, uranium has dominated because it occasionally produces high-grade deposits and can be used to make nuclear bombs. That last consideration is slowly unwinding, and that resource is flowing back to the market. The high-grade deposits will dominate for a couple more generations as far as we can tell from here.

This article shows us where the thorium reserves are and just how huge they really are. It also explains India’s long development of thorium reactors.

This article describes a fifty percent thermal efficiency, which is excellent. Again, no mention is made of the reverse Rankine cycle engine as a cooling system. That method can deliver an additional 37 ½ percent brake horsepower on top of the fifty percent already produced. In short, it is plausible that an LFTR can achieve 87 ½ percent brake horsepower, which is surely optimistic. It does make the protocol very attractive.

You cannot come away from this article without being convinced that the systems described will nicely consume our entire uranium waste problem while supplying massive grid power anywhere it is needed.

The Liquid Fluoride Thorium Paradigm

Posted by
Gail the Actuary on January 20, 2009 - 9:05am

This is a guest post by Charles Barton. Charles is a retired counselor who writes the
Energy from Thorium blog. His father Dr. Charles Barton, Senior, worked at Oak Ridge National Laboratory for 28 years. He was a reactor chemist, who worked on the Liquid-Fluoride Thorium Reactor (LFTR) concept for about 2/3 of his ORNL career. Charles Barton, Junior gained his knowledge of the LFTR concept from his familiarity with his father's work. Neither his father nor Mr. Barton will gain financially from the advancement of this idea.

The Liquid Fluoride Thorium Paradigm

Excitement has recently been rising about the possibility of using thorium as a low-carbon way of generating vast amounts of electricity. The use of thorium as a nuclear fuel was extensively studied by Oak Ridge National Laboratory between 1950 and 1976, but was dropped because, unlike uranium-fueled Light Water Reactors (LWRs), it could not generate weapons-grade plutonium. Research on the possible use of thorium as a nuclear fuel has continued around the world since then. Famed climate scientist James Hansen recently spoke of thorium's great promise in material that he submitted to President-elect Obama:

The Liquid-Fluoride Thorium Reactor (LFTR) is a thorium reactor concept that uses a chemically-stable fluoride salt for the medium in which nuclear reactions take place. This fuel form yields flexibility of operation and eliminates the need to fabricate fuel elements. This feature solves most concerns that have prevented thorium from being used in solid-fueled reactors. The fluid fuel in LFTR is also easy to process and to separate useful fission products, both stable and radioactive. LFTR also has the potential to destroy existing nuclear waste.

(The) LFTR(s) operate at low pressure and high temperatures, unlike today’s LWRs. Operation at low pressures alleviates much of the accident risk with LWR. Higher temperatures enable more of the reactor heat to be converted to electricity (50% in LFTR vs 35% in LWR). (The) LFTR (has) the potential to be air-cooled and to use waste heat for desalinating water.

LFTR(s) are 100-300 times more fuel efficient than LWRs. In addition to solving the nuclear waste problem, they can operate for several centuries using only uranium and thorium that has already been mined. Thus they eliminate the criticism that mining for nuclear fuel will use fossil fuels and add to the greenhouse effect.

The Obama campaign, properly in my opinion, opposed the Yucca Mountain nuclear repository. Indeed, there is a far more effective way to use the $25 billion collected from utilities over the past 40 years to deal with waste disposal. This fund should be used to develop fast reactors that consume nuclear waste, and thorium reactors to prevent the creation of new long-lived nuclear waste. By law the federal government must take responsibility for existing spent nuclear fuel, so inaction is not an option. Accelerated development of fast and thorium reactors will allow the US to fulfill its obligations to dispose of the nuclear waste, and open up a source of carbon-free energy that can last centuries, even millennia.
It is commonly assumed that 4th generation nuclear power will not be ready before 2030. That is a safe assumption under "business-as-usual”. However, given high priority it is likely that it could be available sooner. It is specious to argue that R&D on 4th generation nuclear power does not deserve support because energy efficiency and renewable energies may be able to satisfy all United States electrical energy needs. Who stands ready to ensure that energy needs of China and India will be entirely met by efficiency and renewables?

Development of the first large 4th generation nuclear plants may proceed most rapidly if carried out in China or India (or South Korea, which has a significant R&D program), with the full technical cooperation of the United States and/or Europe. Such cooperation would make it much easier to achieve agreements for reducing greenhouse gases.

Uranium-235 is the only fissionable material that is observed in usable amounts in nature. Thus pioneering nuclear physicists like Enrico Fermi and Eugene Wigner had no choice but to use U-235 to create their first chain reaction under the bleachers of the University of Chicago’s unused football field.

But Fermi and Wigner knew early on that once a reactor was built, it was possible to create other fissionable substances with the excess neutrons produced by a U-235 chain reaction. Thus if U-238 absorbed a neutron, it became the unstable U-239, which through a two stage nuclear process was transformed into plutonium-239. Plutonium-239 is very fissionable. The physicists also calculated that if thorium-232 was placed inside a reactor and bombarded with neutrons, it would be transformed into U-233. Their calculations also revealed that U-233 was not only fissionable, but had properties that made it in some respects a superior reactor fuel to U-235 and Pu-239.

During World War II, Fermi and Wigner, who were geniuses with active and far ranging minds, collected around themselves a group of brilliant scientists. Fermi, Wigner and their associates began to think about the potential uses of the new energy they were discovering--uses that would improve society rather than destroy it.

The capture of nuclear energy and its transformation into electrical energy became a central focus of discussions among early atomic scientists. They were not sure how long the uranium supply would last, so Fermi proposed that reactors be built that would breed plutonium from U-238. Wigner countered that thorium was several times as plentiful as uranium, and that it could produce an even better nuclear fuel than Pu-239.

The first nuclear era was dominated by uranium technology, a technology that was derived from military applications, and carried with it, rightly or wrongly, the taint of association with nuclear weapons. As it turned out, there was far more uranium available than Fermi or Wigner had originally feared, but other rationales propelled scientific interest in developing thorium fuel cycle reactors. First, Pu-239 was not a good fuel for most reactors. It failed to fission 1/3 of the time when it absorbed a neutron in a conventional Light Water Reactor (LWR). This led to the most difficult part of the problem of nuclear waste. Plutonium made excellent fuel for fast neutron reactors, but the fast neutron reactor that Fermi liked used dangerous liquid sodium as its coolant, and would pose a developmental challenge of enormous proportions.
Advocates of the thorium fuel cycle point to its numerous advantages over the uranium-plutonium fuel cycle. B.D. Kuz’minov, and V.N. Manokhin, of the Russian Federation State Science Centre, Institute of Physics and Power Engineering at Obninsk, write:

Adoption of the thorium fuel cycle would offer the following advantages:

- Increased nuclear fuel resources thanks to the production of 233U from 232Th;
- Significant reduction in demand for the enriched isotope 235U;
- Very low (compared with the uranium-plutonium fuel cycle) production of long-lived radiotoxic wastes, including transuraniums, plutonium and transplutoniums;
- Possibility of accelerating the burnup of plutonium without the need for recycling, i.e. rapid reduction of existing plutonium stocks;
- Higher fuel burnup than in the uranium-plutonium cycle;
- Low excess reactivity of the core with thorium-based fuel, and more favourable temperature and void reactivity coefficients; . . .

Thorium could replace U-238 in conventional LWRs, and could be used to breed new nuclear fuel in specially modified LWRs. This technology was successfully
tested in the Shippingport reactor during the late 1970’s and early 1980’s.

WASH-1097 remains a good source of information on the thorium fuel cycle. In fact, some major recent studies of the thorium fuel cycle rely heavily on WASH-1097. A recent IAEA report on Thorium appears to have been prepared without overt reliance on WASH-1097.

One of the first things physicists discovered about chain reactions was that slowing down the neutrons involved in the process promoted the chain reaction. Kirk Sorensen discusses slow or thermal neutrons in one of his early posts.

Under low energy neutron conditions, Th232 can be efficiently converted to U233. The conversion process works like this. Th232 absorbs a neutron, becoming Th233, which emits a beta ray: a neutron switches to being a proton, and the atom is transformed into Protactinium 233. After a period averaging a little less than a month, Pa 233 emits a second beta ray and is transformed into U233. U233 is fissionable, and is a very good reactor fuel. When a U233 atom encounters a low energy neutron, chances are 9 out of 10 that it will fission.
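As a rough sketch of that conversion step, the Pa-233 stage can be modeled as simple exponential decay. The ~27-day half-life used below is the standard published figure behind "a little less than a month"; it is my input, not stated in the text:

```python
# Pa-233 -> U-233 modeled as exponential decay with a 27-day half-life
# (the standard figure; the text's "a little less than a month").

PA233_HALF_LIFE_DAYS = 27.0

def fraction_converted(days):
    """Fraction of an initial Pa-233 inventory decayed to U-233 after `days`."""
    return 1.0 - 0.5 ** (days / PA233_HALF_LIFE_DAYS)

print(round(fraction_converted(27), 2))   # 0.5 -- half converted per half-life
print(round(fraction_converted(90), 2))   # 0.9 -- ~90% after three months
```

The month-scale delay matters in reactor design: Pa-233 sitting in the core can absorb neutrons before it finishes decaying, which is one reason fluid-fuel designs that can set the protactinium aside are attractive.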

Since U233 produces an average of 2.4 neutrons every time it fissions, this means that each neutron that strikes U233 produces an average of 2.16 new neutrons. If you carefully control those neutrons, one neutron will continue the chain reaction. That leaves an average of 1.16 neutrons to generate new fuel.
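The neutron bookkeeping in this paragraph works out as follows; both inputs are the figures given in the text:

```python
# Neutron economy of U-233 under thermal neutrons, using the text's figures.

P_FISSION = 0.9             # chance an absorbed neutron causes fission
NEUTRONS_PER_FISSION = 2.4  # average neutrons released per fission

neutrons_per_absorption = P_FISSION * NEUTRONS_PER_FISSION          # 2.16
spare_for_breeding = neutrons_per_absorption - 1.0  # one sustains the chain

print(round(neutrons_per_absorption, 2))  # 2.16
print(round(spare_for_breeding, 2))       # 1.16
```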

Unfortunately the fuel generation process cannot work with 100% efficiency. The leftover U-234 that was produced when U-233 absorbed a neutron and did not fission will sometimes absorb another neutron and become U-235. Xenon-135, an isotope that is often produced after U-233 splits, is far more likely to capture neutrons than U233 or Th232. This makes Xenon-135 a fission poison. Because Xenon in a reactor builds up during a chain reaction, it tends to slow the nuclear process as the chain reaction continues. The presence of Xenon creates a control problem inside a reactor. Xenon also steals neutrons needed for the generation of new fuel.

In conventional reactors that use solid fuel, Xenon is trapped inside the fuel, but in a fluid fuel Xenon is easy to remove because it is what is called a noble gas. A noble gas does not bond chemically with other substances, and can be bubbled out of fluids where it has been trapped. Getting Xenon 135 out of a reactor core makes generating new U233 from Th232 a whole lot easier.

It is possible to bring about 1.08 neutrons into the thorium conversion process for every U-233 atom that splits. This means that reactors that use a thorium fuel cycle are not going to produce a large excess of U-233, but if carefully designed, they can produce enough U233 that burned U233 can be easily replaced. Thus a well designed thorium cycle reactor will generate its own fuel indefinitely.

Research continues on a thorium cycle LWR fuel that would allow for the breeding of thorium in LWRs. There is however a problem which makes the LWR a less than ideal breeding environment for thorium. Elisabeth Huffer, Hervé Nifenecker, and Sylvain David note:

Fission products are much more efficient in poisoning slow neutron reactors than fast neutron reactors. Thus, to maintain a low doubling time, neutron capture in the fission products and other elements of the structure and coolant have to be minimized.

India has only a small uranium supply, but an enormous thorium reserve. Millions of tons of thorium ore lie on the surface of Indian beaches, waiting to be scooped up by front loaders and hauled away to potential thorium reactors for a song. (For those of you who are interested in the EROEI concept, the EROEI for the recovery of thorium from Indian beaches would be almost unbelievably high, and the energy extracted could power the Indian economy for thousands of years, potentially making India the richest nation in the world.)

India has for 50 years been following a plan to
gradually switch from uranium to thorium cycle reactors. That plan is expected to finally come to fruition by the end of the next decade. At that point India will begin the rapid construction of a fleet of thorium fuel cycle reactors.

A commercial business,
Thorium Power, Limited, continues research based on the Shippingport Reactor experiment. Thorium Power plans to offer a thorium cycle based nuclear fuel with a starting charge of enriched U-235 for modified LWRs. Thorium Power has sponsored thorium fuel research at the Kurchatov Institute in Moscow, and a Russian VVER has been used to conduct thorium cycle fuel experiments.

Research on thorium cycle liquid fuel reactors is ongoing world-wide. The best-known effort is being performed in Grenoble, France at the
Laboratoire de Physique Subatomique et de Cosmologie. The Reactor Physics Group there is the only one in the world that has the resources and backing needed to actually develop a fluid core thorium cycle reactor that can be commercialized. In terms of organization size, the Thorium Molten Salt Reactor research group is much smaller than would be required to sustain a full-scale rapid development of thorium cycle reactor technology. The LPSC group thus is working in a business as usual time frame, and has no urgent motivation to do otherwise. After all, 80% of French electricity already comes from nuclear power plants.

Thorium fuel cycle research is also being carried on in the Netherlands, Japan, and the Czech Republic. There is also presently a small-scale effort in the United States.

Thorium is extremely abundant in the earth's crust, which appears to contain somewhere around 120 trillion tons of it. In addition to 12% thorium monazite sands, found on Indian beaches and in other places, economically recoverable thorium is found virtually everywhere. For example, large-scale recovery of thorium from granite rocks is economically feasible with a very favorable EROEI. Significant recoverable amounts of thorium are present in mine tailings. These include the tailings of ancient tin mines, rare earth mine tailings, phosphate mine tailings and uranium mine tailings. In addition to the thorium present in mine tailings and in surface monazite sands, burning coal at the average 1000 MWe power plant
produces about 13 tons of thorium per year. That thorium is recoverable from the power plant’s waste ash pile.

One ton of thorium will produce nearly 1 GW of electricity for a year in an efficient thorium cycle reactor. Thus current coal energy technology throws away over 10 times the energy it produces as electricity. This is not the result of poor thermodynamic efficiency; it is the result of a failure to recognize and use the energy value of thorium. The amount of thorium present in surface mining coal waste is enormous and would provide all the power human society needs for thousands of years, without resorting to any special mining for thorium, or the use of any other form of energy recovery.
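The "over 10 times" claim follows directly from the two figures quoted above (13 tons of thorium per plant-year, roughly 1 GW-year of electricity per ton):

```python
# Energy in the thorium a 1000 MWe coal plant discards each year, versus
# the electricity the plant itself generates. Both figures from the text.

GW_YEARS_PER_TON_THORIUM = 1.0     # efficient thorium-cycle reactor
THORIUM_TONS_PER_PLANT_YEAR = 13.0
COAL_PLANT_OUTPUT_GW_YEARS = 1.0   # 1000 MWe running for one year

discarded_energy = THORIUM_TONS_PER_PLANT_YEAR * GW_YEARS_PER_TON_THORIUM
ratio = discarded_energy / COAL_PLANT_OUTPUT_GW_YEARS
print(ratio)  # 13.0 -- over 10x the electricity actually produced
```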

Little attention is paid to the presence of thorium in mine tailings. In fact it would largely be passed over in silence except that radioactive gases from thorium are a health hazard for miners and ore processing workers.

Thorium is present in phosphate fertilizers because fertilizer manufacturers do not wish to pay the recovery price prior to distribution. Gypsum present in phosphate tailings is unusable in construction because of the presence of radioactive gases associated with the thorium that is also present in the gypsum. Finally, organic farmers use phosphate tailings to enrich their soil. This has the unfortunate side effect of releasing thorium into surface and subsurface waters, as well as leading to the potential contamination of organic crops with thorium and its various radioactive daughter products. Thus the waste of thorium present in phosphate tailings has environmental consequences.

The world’s real thorium reserve is enormous, but also hugely underestimated. For example the USGS reports that the United States has a thorium reserve of 160,000 tons, with another 300,000 tons of possible thorium reserve. But Alex Gabbard
estimates a reserve of over 300,000 tons of recoverable thorium in coal ash associated with power production in the United States alone.

In 1969, WASH-1097 noted a report that had been presented to President Johnson estimating the United States thorium reserve at 3 billion tons that could be recovered for the price of $500 a pound – perhaps $3000 today. Lest this sound like an enormous amount of money to pay for thorium, consider that one pound of thorium contains the energy equivalent of 20 tons of coal, which in mid-January would sell on the spot market for around $1500. The price of coal has been somewhat depressed by the economic downturn. Last year coal sold on the spot market for as much as $300 a ton, yielding a price for 20 tons of coal of $6000. How long would 3 billion tons last the United States? If all of the energy used in the United States were derived from thorium for the next two million years, there would still be several hundred thousand years of thorium left that could be recovered for the equivalent of $3000 a pound in January 2009 dollars.
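The coal-equivalence pricing works out as a one-line multiplication. A sketch, where the per-ton coal prices of $75 and $300 are inferred from the $1500 and $6000 totals in the text rather than quoted directly:

```python
# Energy-equivalent pricing: the text's figure of 20 tons of coal
# per pound of thorium, at two coal price scenarios.
COAL_TONS_PER_LB_THORIUM = 20   # article's equivalence figure

# ~$75/ton matches the mid-January 2009 spot total of ~$1500;
# $300/ton was the 2008 peak, giving $6000 (both inferred, see lead-in).
for coal_price_per_ton in (75, 300):
    equivalent_value = COAL_TONS_PER_LB_THORIUM * coal_price_per_ton
    print(f"Coal at ${coal_price_per_ton}/ton -> "
          f"${equivalent_value} per pound-of-thorium equivalent")
```

On these numbers, the estimated $3000/lb recovery cost sits between the two coal-equivalent values, which is the point of the comparison.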

Nor would exhausting the USAEC’s 1969 estimated thorium reserve exhaust the American thorium supply. Even at average concentrations in the earth’s rocks, thorium can be recovered with a good EROEI, without making the cost of electricity impossibly expensive.

Louis Sheehan on Early Terra Preta

A quick review here on the subject of terra preta; we are about due for one. The new information is this tale about Confederate soldiers who took up farming on such soils, discovered their value, and evidently passed the story on to a research expedition.

They pass over the making of the biochar as if it were a simple matter of smoldering wood and brush. If only that were true, everyone worldwide would have been doing this for thousands of years.

And no, the microorganisms do not turn organic matter into dark earth. If that were true, we would all be living on miles of dark earth. They turn it into food, which they consume. In fact, the problem with tropical soils is that organic matter is rapidly degraded by the biology, leaving depleted, nutrient-poor soils. Terra preta intercepts that process and holds the nutrients.

We associate slash and burn with primitive agriculture. That is quite true as far as it goes. What is not understood is that a primitive lifestyle is the result of slash and burn. Slash and burn was not very easy until it was possible to buy a steel axe and a machete.

Thus earlier cultures were static and mastered their soils while building up huge communities.

By Louis Sheehan esquire

Shortly after the U.S. Civil War, a research expedition encountered a group of Confederate expatriates living in Brazil. The refugees had quickly taken to growing sugarcane on plots of earth that were darker and more fertile than the surrounding soil, Cornell University’s Charles Hartt noted in the 1870s.

The same dark earth, terra preta in Portuguese, is now attracting renewed scientific attention for its high productivity, mysterious past, and capacity to store carbon. Researchers on Feb. 18 at the annual meeting of the American Association for the Advancement of Science in St. Louis presented evidence that new production of the fertile soil could aid agriculture and limit global greenhouse-gas emissions.

Prehistoric farmers created dark earth, perhaps intentionally, when they worked charcoal and nutrient-rich debris into Amazonian soils, which are naturally poor at holding nutrients. The amendments produced “better nutrient retention and soil fertility,” says soil scientist Johannes Lehmann of Cornell.

Charcoal forms when organic matter smolders, or burns at low temperatures and with limited oxygen. Nutrients such as phosphorus and potassium readily adhere to charcoal, and the combination creates a good habitat for microorganisms. The soil microbes transform the materials into dark earth, says geographer William I. Woods of the University of Kansas in Lawrence.

If some of today’s Amazonian farmers were to use smoldering fires to produce dark earth rather than clear fields with common slash-and-burn methods, they “would not only dramatically improve soil and increase crop production but also could provide a long-term sink for atmospheric carbon dioxide,” says Lehmann.

Slash-and-burn land clearing releases about 97 percent of the carbon that’s in vegetation. Smoldering the same fuel to form charcoal releases only about 50 percent of the original carbon, Lehmann previously reported. The rest of that carbon remains in dark earth for centuries.
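The carbon comparison in that paragraph can be restated as retained fractions. A minimal sketch using only the two percentages quoted above:

```python
# Carbon retained in the soil under the two land-clearing methods,
# from the article's figures: slash-and-burn releases ~97% of the
# vegetation's carbon, smoldering to charcoal releases ~50%.
SLASH_AND_BURN_RELEASED = 0.97
CHARCOAL_RELEASED = 0.50

retained_burning = 1 - SLASH_AND_BURN_RELEASED   # ~3% stays behind
retained_charring = 1 - CHARCOAL_RELEASED        # ~50% stays behind
print(f"Carbon retained: slash-and-burn {retained_burning:.0%}, "
      f"charring {retained_charring:.0%}")
```

On these figures, charring keeps roughly sixteen times as much of the original carbon in the soil as open burning does, which is the basis of the carbon-sink claim.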

However, dark earth requires extra nutrients, such as those in compost. International agreements on greenhouse gases don’t provide financial incentives for farmers to make the effort to create dark earth, Woods says.

Nevertheless, ongoing field experiments in Brazil suggest that the fertility associated with terra preta could provide its own incentive, reports Beáta Madari of the Brazilian Agricultural Research Corporation in Rio de Janeiro.

Brazil contains a wide range of dark earths with varying compositions. The scientists found differences between the soils used for ancient backyard gardens, which received more nutrients, and soils from distant fields.

Farmers of the time “certainly would have immediately learned about the properties of that soil, however [it] formed,” says anthropologist Michael J. Heckenberger of the University of Florida in Gainesville. But the knowledge about how to make dark earth disappeared after contact with Europeans decimated the indigenous population.

Wednesday, January 28, 2009

Roy Spencer on Al Gore's Propaganda

Roy Spencer wades into the fray again, this time pointing out the reality of the global warming debate. As he points out, this is no longer a scholarly debate so much as a PR campaign. These methods are often used piecemeal and quite innocently. Here that is simply not the case.

Gore and Hansen have grabbed the ring and are not backing down even as it becomes apparent that their arguments are fading in a morass of negative evidence.

This is becoming a clinical study on the application of propaganda and likely nothing else as Mother Nature tramples the position.

This summer we will be awash with news about the extent of fresh sea ice and the general coolness in the Northern Hemisphere.

It has become unfortunately clear that the organized party of the scientifically willing is standing by its position for reasons other than science. Most likely it is good old lucre: they understand that they will get research grants forever if they stand by their hero. The only thing unusual in this revealing demonstration of feet of clay is that they were ever so dumb as to tie the process to the weather. That may still derail the rush to underwrite large research budgets.

We are having a winter that now compares to the worst, at least so far. I have every reason to expect it to pile up the snow until March 31. In short, we are having complete confirmation that the global temperature has dropped back to the natural low. A few more storms are pending.

Al Gore’s Propaganda

January 27th, 2009 by Roy W. Spencer, Ph. D.

The methods used by global warming alarmists to convince you that more carbon dioxide is going to ruin the Earth are increasingly laced with insults and attacks directed toward anyone who might disagree with them. For instance, one of the many intellectually lazy (& false) claims is that I am paid by Big Oil.

Mr. Gore’s tactics have been a little more subtle, and reminiscent of propaganda methods which have proved to be effective throughout history at influencing public opinion. One should keep in mind that his main scientific adviser, NASA’s James Hansen, has the most extreme views of any climate researcher when it comes to predicting a global warming induced Armageddon.

Listed below are ten propaganda techniques I have excerpted from Wikipedia. Beneath each are one or more examples of Mr. Gore’s rhetoric as he has attempted to goad the rest of us into reducing our CO2 emissions. Except where indicated, most quotes are from his testimony before the U.S. Senate Environment and Public Works Committee, March 21, 2007. (Mr. Gore is scheduled to testify again tomorrow, January 28, 2009, before the Senate’s Foreign Relations Committee…if the cold and snowy weather doesn’t cause them to reschedule.)

Appeal to fear: Appeals to fear seek to build support by instilling anxieties and panic in the general population.

“I want to testify today about what I believe is a planetary emergency—a crisis that threatens the survival of our civilization and the habitability of the Earth.”

Appeal to authority: Appeals to authority cite prominent figures to support a position, idea, argument, or course of action. Also, Testimonial: Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited.

“Just six weeks ago, the scientific community, in its strongest statement to date, confirmed that the evidence of warming is unequivocal. Global warming is real and human activity is the main cause.”

“The scientists are virtually screaming from the rooftops now. The debate is over! There’s no longer any debate in the scientific community about this.” (from An Inconvenient Truth)

Bandwagon: Bandwagon and “inevitable-victory” appeals attempt to persuade the target audience to join in and take the course of action that “everyone else is taking”. Also, Join the crowd: This technique reinforces people’s natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their best interest to join.

“Today, I am here to deliver more than a half million messages to Congress asking for real action on global warming. More than 420 Mayors have now adopted Kyoto-style commitments in their cities and have urged strong federal action. The evangelical and faith communities have begun to take the lead, calling for measures to protect God’s creation. The State of California, under a Republican Governor and a Democratic legislature, passed strong, economy-wide legislation mandating cuts in carbon dioxide. Twenty-two states and the District of Columbia have passed renewable energy standards for the electricity sector.”

Flag-waving: An attempt to justify an action on the grounds that doing so will make one more patriotic, or in some way benefit a group, country, or idea. Also, Inevitable victory: invites those not already on the bandwagon to join those already on the road to certain victory. Those already or at least partially on the bandwagon are reassured that staying aboard is their best course of action.

“After all, we have taken on problems of this scope before. When England and then America and our allies rose to meet the threat of global Fascism, together we won two wars simultaneously in Europe and the Pacific.”

Ad Hominem attacks: A Latin phrase which has come to mean attacking your opponent, as opposed to attacking their arguments. Also Demonizing the “enemy”: Making individuals from the opposing nation, from a different ethnic group, or those who support the opposing viewpoint appear to be subhuman.

“You know, 15 percent of people believe the moon landing was staged on some movie lot and a somewhat smaller number still believe the Earth is flat. They get together on Saturday night and party with the global-warming deniers.” (October 24, 2006, Seattle University)

Appeal to Prejudice: Using loaded or emotive terms to attach value or moral goodness to believing the proposition.

“And to solve this crisis we can develop a shared sense of moral purpose.” (June 21, 2006, London, England)

Black-and-White fallacy: Presenting only two choices, with the product or idea being propagated as the better choice.

“It is not a question of left vs. right; it is a question of right vs. wrong.” (July 1, 2007, New York Times op-ed)

Euphoria: The use of an event that generates euphoria or happiness, or using an appealing event to boost morale:

Live Earth concerts organized worldwide in 2007 by Al Gore.

Falsifying information: The creation or deletion of information from public records, in the purpose of making a false record of an event or the actions of a person or organization. Pseudo-sciences are often used to falsify information.

“Nobody is interested in solutions if they don’t think there’s a problem. Given that starting point, I believe it is appropriate to have an over-representation of factual presentations on how dangerous (global warming) is, as a predicate for opening up the audience to listen to what the solutions are, and how hopeful it is that we are going to solve this crisis.” (May 9, 2006 Grist interview)

Stereotyping or Name Calling or Labeling: This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable. Also, Obtain disapproval: This technique is used to persuade a target audience to disapprove of an action or idea by suggesting that the idea is popular with groups hated, feared, or held in contempt by the target audience

“There are many who still do not believe that global warming is a problem at all. And it’s no wonder: because they are the targets of a massive and well-organized campaign of disinformation lavishly funded by polluters who are determined to prevent any action to reduce the greenhouse gas emissions that cause global warming out of a fear that their profits might be affected if they had to stop dumping so much pollution into the atmosphere.” (January 15, 2004, New York City)

Global Magnetic Field

I am hesitant to say much on this until I am in possession of a lot more information. It is easy to imagine a cause and effect relationship between the waxing and waning of the magnetic field and a cosmic ray–precipitation connection. Except that our real data is a mere thirty years old. During this time there was a real uptrend in apparent global heat, now recently reversed.

Over such a short span, the magnetic field will have changed at much the same pace, so some correlation is unavoidable. And it is a compelling hypothesis.

I just think that the time spans are far too short, or subject to eyeball selection, which makes the results suspect. We will be seeing more of this.

http://www.spacedaily.com/2006/090112183735.ojdq7esu.html

The earth's magnetic field impacts climate: Danish study COPENHAGEN, Jan 12 (AFP) Jan 12, 2009

The earth's climate has been significantly affected by the planet's magnetic field, according to a Danish study published Monday that could challenge the notion that human emissions are responsible for global warming.

"Our results show a strong correlation between the strength of the earth's magnetic field and the amount of precipitation in the tropics," one of the two Danish geophysicists behind the study, Mads Faurschou Knudsen of the geology department at Aarhus University in western Denmark, told the Videnskab journal.

He and his colleague Peter Riisager, of the Geological Survey of Denmark and Greenland (GEUS), compared a reconstruction of the prehistoric magnetic field 5,000 years ago based on data drawn from stalagmites and stalactites found in China and Oman.

The results of the study, which has also been published in US scientific journal Geology, lend support to a controversial theory published a decade ago by Danish astrophysicist Henrik Svensmark, who claimed the climate was highly influenced by galactic cosmic ray (GCR) particles penetrating the earth's atmosphere.

Svensmark's theory, which pitted him against today's mainstream theorists who claim carbon dioxide (CO2) is responsible for global warming, involved a link between the earth's magnetic field and climate, since that field helps regulate the number of GCR particles that reach the earth's atmosphere.

"The only way we can explain the (geomagnetic-climate) connection is through the exact same physical mechanisms that were present in Henrik Svensmark's theory," Knudsen said.

"If changes in the magnetic field, which occur independently of the earth's climate, can be linked to changes in precipitation, then it can only be explained through the magnetic field's blocking of the cosmic rays," he said.

The two scientists acknowledged that CO2 plays an important role in the changing climate, "but the climate is an incredibly complex system, and it is unlikely we have a full overview over which factors play a part and how important each is in a given circumstance," Riisager told Videnskab.