Thursday, April 9, 2009

Improving Wind Generation Efficiency

That this has not been implemented long ago says more about the generator manufacturing industry than anything else. The method described is the brute-force approach to the optimization problem: simply make it possible to access parts of the coil as needed and rely on a nifty controller program.

It is not a direct-response system that senses the changes and alters its actions accordingly. Fortunately, wind speed varies fairly smoothly, so it should be possible to do almost as well with a predictive software package, and blade momentum will also work in your favor.

No one discusses the likely magnitude of the efficiency increase. I suspect it is a lot less than one would think and that the better initial gains will come from design savings as hardware is minimized.

Others are working on more sophisticated approaches to the same problem and we certainly will see power production optimized now that the blade technology is pretty mature.

Efficient Power at Any Wind Speed

New engine technology makes wind power more efficient in any weather
By Steven Ashley
http://www.sciam.com/article.cfm?id=efficient-power-at-any-wind-speed&sc=CAT_ENV_20090407

One of wind power’s drawbacks is its variability: sometimes the breeze is weak; other times it is strong. To convert the rotation of wind turbines into electricity efficiently, however, generators require a single turning speed. Run faster or slower than this “sweet spot” and efficiency falls off fast. To compensate, engineers design turbine hardware with adjustable blade angles to shed surplus wind energy or to capture more. Wind turbines often also employ a transmission to gear the shaft speed up or down to the sweet spot. But both mechanisms add weight, complexity and cost.

ExRo Technologies in Vancouver is commercializing what should be a better idea: a generator that operates efficiently over a wide speed range. Retrofitted wind turbines could produce as much as 50 percent more power over time, CEO John McDonald states.

The device works much the same as a traditional generator, except that fast-acting electronic switches can engage individual generator coils as needed to harvest energy effectively at different wind speeds. An intelligent controller turns on only a few coils at low speed and connects more at higher velocities. “This means that the generator has many sweet spots,” says McDonald, who likens the concept to a car engine that saves fuel by shutting down cylinders when the driver demands less power.
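The coil-switching idea described above can be sketched as a simple lookup: the controller engages more coils as rotor speed rises, so each operating range gets its own sweet spot. The coil counts and speed thresholds below are purely illustrative assumptions, not ExRo's actual figures.

```python
# Hypothetical sketch of a coil-engagement controller. The bank table
# (threshold RPM -> coils engaged) is made up for illustration; ExRo's
# real controller and numbers are not public in the article.

COIL_BANKS = [  # (minimum rotor speed in RPM, coils to engage)
    (0, 2),
    (10, 4),
    (20, 8),
    (30, 12),
]

def coils_to_engage(rotor_rpm: float) -> int:
    """Return how many coils this sketch controller switches in at a given speed."""
    engaged = 0
    for min_rpm, n_coils in COIL_BANKS:
        if rotor_rpm >= min_rpm:
            engaged = n_coils  # highest threshold met wins
    return engaged
```

At 5 RPM only the smallest bank is active; past 30 RPM all coils are harvesting, which is the "many sweet spots" behavior McDonald describes.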

ExRo has successfully tested a prototype generator. The company and an industrial partner expect to start side-by-side trials of turbines with and without the new generators soon and plan to commercialize their product by the end of 2009.

Geological Climate View

This is a geological viewpoint of man’s impact on the climate and it is naturally very conservative.

The one event not recognized is the unusual temperature collapse coinciding with the end of the Bronze Age that has never been reversed. My conjecture is that this was caused directly by the deforesting of the whole Sahara resulting in the reflectance of incoming solar energy back out into space. This represents a huge chunk out of the global energy budget.

That is a man made climatic change event that puts current activities in the shade. We can and will reverse that event and in the process provide home and agriculture for billions of people.

Otherwise, the claims of significant human impact in the modern era are hard to substantiate or quantify and appear to be a distraction from real efforts aimed at terraforming the earth.

As I have stated, it is becoming possible to convert all the deserts into productive farmland and woodland. It is also massively beneficial to do so.

Man's contribution to climate change is negligible in geologic time

http://www.examiner.com/x-2950-Denver-Energy-Industry-Examiner~y2009m3d21-Mans-contribution-to-climate-change-is-negligible-in-geologic-time

March 21, 11:49 AM

Most geologists, including those in the energy business, take a REALLY long view of the earth's history including global warming and cooling cycles. Within the framework of geologic time, i.e. the earth's history, man is a very late entry and relatively small contributor to climate changes.

The current debate concerning global warming is well publicized. It features histrionic presentations of data on both sides of the issue usually by writers or politicians, with no scientific background, "interpreting" volumes of data gathered by true scientists. The arguments, for and against, have been going on for about 40 years. The earth is about 4.6 billion (4,600,000,000) years old so the debate has been going on for about 0.000001% of geologic time. Man, or at least our earliest demonstrable "human" ancestors, arrived about 2.3 million (2,300,000) years ago so "man" has been an observer of climate change for about 0.05% of geologic time.

Climate change, as measured and recorded in the fossil and rock record as "ice ages" (global cooling) and ocean expansion (global warming), has been occurring periodically but erratically throughout geologic time from about 3.3 billion (3,300,000,000) years ago, or approximately 1 billion years after the earth formed. The earth basically "cooled" from its nuclear, "Big Bang", inception for over 1 billion years. At least two multi-million-year "ice ages" occurred before the first signs of organic, carbon-based life in the form of algae or pond scum. At least four more ice ages occurred from the age of pond scum, through the age of creepy crawlers, fishes, amphibians, reptiles (dinosaurs) and early mammals. In the last 1 million (1,000,000) years, during the age of man, at least 10 well documented periods of cooling have occurred. The last "ice age" lasted about 60,000 years, from approximately 70,000 years ago until about 10,000 years ago. In North America, the timing and duration are determined by measuring the advance and retreat of glaciers in the fossil plant and rock records. Within these overall "ice ages", there are also shorter cycles of warming and cooling. The warmer periods, in today's vernacular, would be called "global warming."

Without question, man's use of fire (wood), dating from 1.5 million years ago; coal, from about 3000 years ago; and petroleum for the last 150 years have contributed to the most recent cycle of warming. The significance of man's activity is a part of the ongoing debate. The CO2 emissions and ozone layer changes are measurable phenomena. The so called "greenhouse effect" is an unproven theory. At worst, however, man's contribution looks to have only "sped up" the earth's natural cycles by a few decades. Obviously, a "few decades" are significant to the earth's current human population but not in terms of impacting the earth's climate history. If this speeding up process began with the first burning of petroleum 150 years ago, man's activities have affected 0.000003% of the earth's history; 0.0065% of man's history; and 1.5% of the time since the end of the last ice age.
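The percentages the article quotes can be checked directly from its own figures. The values below are the article's ages and durations, in years; the function just turns them into percentages.

```python
# Sanity-checking the article's geologic-time percentages. All inputs
# are figures stated in the article itself.

EARTH_AGE = 4.6e9        # age of the earth, years
HUMAN_AGE = 2.3e6        # earliest "human" ancestors, years ago
DEBATE = 40              # years of global-warming debate
PETROLEUM = 150          # years of burning petroleum
SINCE_ICE_AGE = 10_000   # years since the last ice age ended

def percent_of(part: float, whole: float) -> float:
    """What percentage of `whole` does `part` represent?"""
    return 100.0 * part / whole

# These reproduce the article's rounded claims:
#   debate    ~ 0.000001% of geologic time
#   humanity  = 0.05% of geologic time
#   petroleum ~ 0.000003% of earth's history, ~0.0065% of man's history,
#               and 1.5% of the time since the last ice age
```

The arithmetic holds up to the rounding the article uses, which is the author's point: the denominators are so large that any human-era interval becomes a vanishing fraction.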

Some evidence exists suggesting that the current phase of warming MAY have peaked in the 1970s and the earth MAY be returning to a cooling phase. Regardless of the rhetoric on either side of the arguments, man's total contribution to global climate change is negligible and probably not measurable within the context of geologic time. Instantaneous events like the asteroid or meteor strike that ended the age of dinosaurs by creating a global wide "dust cloud" or continuous volcanic eruptions that have also shrouded the earth with ash and smoke clouds have had a far greater and long lasting effect on climates. If all of man's "contribution" were to cease immediately, the net effect, measured in geologic time, on the earth's natural warming and cooling cycles would not be measurable.

Wednesday, April 8, 2009


Hydrogen Breakthrough

This is a major development in the hydrogen story. Suddenly we can use solar energy to directly produce both hydrogen and oxygen from water while keeping them separate. Plants are unable to do this because they produce the oxygen by consuming the hydrogen. Up to now, when the question of producing hydrogen came up at all, we were treated to a hand wave because we had no satisfactory protocols that did not entail huge energy expenditures.

This breakthrough needs a quick work up on a practical production tool, just to get a handle on the practical issues involved. It sounds extremely promising at this point.

The light step uses very little energy to produce free oxygen that can then be removed. The concentrated hydrogen is liberated by heating the solution to 100°C, so the energetics of the system call for a solution circulating between room temperature and 100°C in concert with a heat exchanger, while absorbing sunlight to release the oxygen. It is efficient on paper but sounds a lot like refrigeration.

However, it surely will be massively efficient compared to the cost of electrolysis that has crippled the use of hydrogen from the beginning.

Having a simple and cheap mechanism for the production of both hydrogen and oxygen from water is a huge boon to industrial processes. Metallurgy in particular will be revolutionized as pure oxygen substitutes for the present regimes. We are going to see factories using this technology to produce the two gases on a massive scale.

Even if we never use hydrogen as a transportation fuel, and there is good reason not to, we can use massive amounts in industry itself, where transporting the gas is not an issue.

Unique Approach For Splitting Water Into Hydrogen And Oxygen

http://www.energy-daily.com/reports/Unique_Approach_For_Splitting_Water_Into_Hydrogen_And_Oxygen_999.html

http://www.energy-daily.com/images/hydrogen-bonding-water-h2o-molecule-bg.jpg




by Staff Writers
Rehovot, Israel (SPX) Apr 07, 2009

The design of efficient systems for splitting water into hydrogen and oxygen, driven by sunlight, is among the most important challenges facing science today, underpinning the long-term potential of hydrogen as a clean, sustainable fuel.

But man-made systems that exist today are very inefficient and often require additional use of sacrificial chemical agents. In this context, it is important to establish new mechanisms by which water splitting can take place.

Now, a unique approach developed by Prof. David Milstein and colleagues of the Weizmann Institute's Organic Chemistry Department provides important steps toward overcoming this challenge. During this work, the team demonstrated a new mode of bond generation between oxygen atoms and even defined the mechanism by which it takes place.

In fact, it is the generation of oxygen gas by the formation of a bond between two oxygen atoms originating from water molecules that proves to be the bottleneck in the water splitting process. Their results have recently been published in Science.

Nature, by taking a different path, has evolved a very efficient process:
photosynthesis - carried out by plants - the source of all oxygen on Earth.

Although there has been significant progress towards the understanding of photosynthesis, just how this system functions remains unclear; vast worldwide efforts have been devoted to the development of artificial photosynthetic systems based on metal complexes that serve as catalysts, with little success. (A catalyst is a substance that is able to increase the rate of a chemical reaction without getting used up.)

The new approach that the Weizmann team has recently devised is divided into a sequence of reactions, which leads to the liberation of hydrogen and oxygen in consecutive thermal- and light-driven steps, mediated by a unique ingredient - a special metal complex that Milstein's team designed in previous studies.

Moreover, the one that they designed - a metal complex of the element ruthenium - is a 'smart' complex in which the metal center and the organic part attached to it cooperate in the cleavage of the water molecule.

The team found that upon mixing this complex with water the bonds between the hydrogen and oxygen atoms break, with one hydrogen atom ending up binding to its organic part, while the remaining hydrogen and oxygen atoms (OH group) bind to its metal center.

This modified version of the complex provides the basis for the next stage of the process: the 'heat stage.' When the water solution is heated to 100 degrees C, hydrogen gas is released from the complex - a potential source of clean fuel - and another OH group is added to the metal center.

'But the most interesting part is the third 'light stage,'' says Milstein. 'When we exposed this third complex to light at room temperature, not only was oxygen gas produced, but the metal complex also reverted back to its original state, which could be recycled for use in further reactions.'
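The three stages just described form a closed loop on the ruthenium complex. A minimal bookkeeping sketch, using only what the article states; the state labels are illustrative shorthand, not real chemical formulas, and nothing here simulates actual chemistry.

```python
# Bookkeeping sketch of the Milstein team's three-stage water-splitting
# cycle, per the article. State strings are informal labels only.

def water_splitting_cycle():
    """Run one turn of the cycle; return the gases liberated and the final state."""
    gases = []

    # Stage 1 (room temperature): the Ru complex cleaves H2O; one H binds
    # to the organic arm, the OH group binds to the metal center.
    state = "Ru(H)(OH)"

    # Stage 2 (heat to 100 C): hydrogen gas is released and a second OH
    # group adds to the metal center.
    gases.append("H2")
    state = "Ru(OH)(OH)"

    # Stage 3 (light, room temperature): the two OH groups couple via
    # H2O2, which decomposes to O2 and water; the complex reverts to its
    # original form, ready to be recycled.
    gases.append("O2")
    state = "Ru (original)"

    return gases, state
```

The key feature the article emphasizes is the last line of stage 3: the catalyst ends the cycle where it started, with no sacrificial agents consumed.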

These results are even more remarkable considering that the generation of a bond between two oxygen atoms promoted by a man-made metal complex is a very rare event, and it has been unclear how it can take place. Yet Milstein and his team have also succeeded in identifying an unprecedented mechanism for such a process.

Additional experiments have indicated that during the third stage, light provides the energy required to cause the two OH groups to get together to form hydrogen peroxide (H2O2), which quickly breaks up into oxygen and water. 'Because hydrogen peroxide is considered a relatively unstable molecule, scientists have always disregarded this step, deeming it implausible; but we have shown otherwise,' says Milstein.

Moreover, the team has provided evidence showing that the bond between the two oxygen atoms is generated within a single molecule - not between oxygen atoms residing on separate molecules, as commonly believed - and it comes from a single metal center.

Discovery of an efficient artificial catalyst for the sunlight-driven splitting of water into oxygen and hydrogen is a major goal of renewable clean energy research.

So far, Milstein's team has demonstrated a mechanism for the formation of hydrogen and oxygen from water, without the need for sacrificial chemical agents, through individual steps, using light. For their next study, they plan to combine these stages to create an efficient catalytic system, bringing those in the field of alternative energy an important step closer to realizing this goal.

Participating in the research were former postdoctoral student Stephan Kohl, Ph.D. student Leonid Schwartsburd and technician Yehoshoa Ben-David all of the Organic Chemistry Department, together with staff scientists Lev Weiner, Leonid Konstantinovski, Linda Shimon and Mark Iron of the Chemical Research Support Department.

Prof. David Milstein's research is supported by the Mary and Tom Beck-Canadian Center for Alternative Energy Research; and the Helen and Martin Kimmel Center for Molecular Design. Prof. Milstein is the incumbent of the Israel Matz Professorial Chair of Organic Chemistry.

Stone Age Forestry

I am reading a book published in 2003 by Nigel Randell titled “The White Headhunter” about a British sailor marooned on the island of Malaita in the Solomons for eight years, ending in the early 1870s. The society he entered and impacted was Stone Age, well organized, and remarkably similar to the society of pre-Columbian Brazil, including the ritual sacrifice of enslaved captives. This book informs us rather well of the lifeways of these societies and their stresses. These are not the eyes of a trained anthropologist but of a captive who needed to make himself valuable to the tribe and had little hope of escape.

Most valuable, we get a report on the felling of a tree using Stone Age technique. This had been discussed by those debating the origins of terra preta soils in the Amazon. Many had expected that the carbon had come from the reduction of the forest itself. I had argued that Stone Age technology lacked the necessary productivity, but had no referents to support that position, unless common sense can be scientifically quoted. It is noteworthy, however, that once the steel ax became available, bush natives switched immediately to slash and burn agriculture.

I quote as follows;

“Bush life was shaped around the wrestling of subsistence from the land. Families depended upon a continuing harvest of starchy food with the staple diet consisting of taro, with yams providing a seasonal change. Gardening was exhausting labour as the land had to be cleared. The bases of the huge hardwood trees were burned and stone adzes used to chip away at the charred wood. This process of burning and chipping was slow work; each tree would take four days to fell. The cleared land could only be used for one planting, then left to lie fallow to renew the feeble fertility of the soil. There was little in the forest to supplement their diet except tree grubs, frogs, and lizards. What bush people always craved was fish.”

The advent of terra preta soil culture in exactly this type of tropical subsistence environment was a productivity revolution. The use of biomass, and more specifically the use of maize stover as a biochar feedstock, provides sufficient product to immediately plant a successor crop using the three sisters protocol or even just more corn. Two days' work would surely gather the stover and generate the necessary earthen kiln in the garden patch. Obviously such a bush tribe would base themselves close to good fishing, and a lot of fish waste would also accumulate.

This quote clearly establishes the forest management limits of a Stone Age society and shows us that the use of wood for charcoaling was massively labour intensive and not a practical option.

It is also noteworthy that the best Stone Age adze technology of the North West Indians allowed fairly modest totem pole work. It was only the advent of the steel ax that allowed the art to blossom into today’s forms and size.

Viral Fabrication at MIT

It is a measure of just how fast things are moving at the lab that this piece of science fiction is being demonstrated in principle. It certainly will not be quick to make its way into everyday technology, but they can actually do it. One more tool has been added to the kit of homo scientificus.

A virus, or in this case a part of a virus, is small enough to work with on forming metallic structures. Cells and the like are simply way too big for any of this. This certainly is a proof of principle that should encourage plenty more effort.

The point is that the proteins are producing nano wires out of the metals. That it should come this quickly in the development of nano materials is surprising and encouraging. One can dare to dream impossible structures and then set about the task of fabrication using proteins as the manufacturing machines.

MIT Made a Virus Make a Battery

Written by Hank Green
Thursday, 02 April 2009

Those crazy lab rats at MIT are attempting to radically diminish the cost of producing sophisticated nanotech cathodes and anodes by enlisting viruses to do the hard work for them.

New lithium ion batteries are being designed with increasingly sophisticated cathodes and anodes that allow fast charge, controlled discharge, longer lifetimes and higher power densities. The problem is, as these components become more advanced, so too does the cost of the batteries. Which is why practical electric vehicles (now that they're feasible at all) seem to be generally out of my price range.

The team at MIT genetically engineered viruses to excrete certain proteins. Those proteins then react with chemicals introduced to the environment to create complicated structures. Proteins are very good at directing compounds to create complicated structures...like life forms.

The viruses, in effect, pull the needed compounds (gold and cobalt for the anode, and iron phosphate and carbon for the cathode) into nanowires. Both the cathode and the anode for the battery were constructed by viruses. Though the battery created is only big enough to power a calculator, the same technique could be used to make batteries for cars.

The best thing is, all you need is the viruses (which are easy enough to multiply exponentially in a lab) and the raw materials to create these sophisticated components. So the cost of advanced battery production could drop like a rock.

Unfortunately, the batteries being produced are not up to the standards of traditionally designed nanotech batteries. They can only go through about 100 cycles (vs. more than 1000 for today's batteries) before starting to lose their charge. Of course, the team is confident that they can direct the viruses more effectively and increase that number significantly.

This technique could also mean a more cost-effective way to build and test new battery chemistries. The team is already experimenting with slightly different cathodes and anodes to attempt to increase power density.

And so maybe soon viruses will be doing all our work for us.
Via
GreenTechMedia

Tuesday, April 7, 2009

Algae Biodiesel Status

These two articles give us another snapshot of the algae biodiesel industry which has been built out mostly to absorb a lot of cheap capital ahead of any convincing technical success. That really is the problem.

That we have a breakthrough in harvesting and drying is good news and an issue I addressed months back because no one had much to say about this most fundamental issue. That this is recent news is bad news. It takes years to polish technology like this.

I think we will master this technology and that it will become our best source of biodiesel, ethanol and other organic feed stocks. The problem is that we have begun a serious twenty year development cycle that is way more difficult than wind or geothermal which are both thirty years down the same road.

I would like to see work done on integrating algae production with some form of shallow sea system. We certainly get the requisite algae blooms with a minimum of encouragement. Converting that into a form of husbandry should be possible.

I believe algae are suited to operation in the sea and on non-arable lands with excessive sunlight. That prevents competition for valuable agricultural lands.

I would also like to see a productivity comparison between optimal algae production and optimal cattail production. Right now, I think cattails win.

When Will Algal Fuels Be Plentiful and Cheap?

The algae industry is getting there – growing, harvesting, separating and converting to useful oils is nearing completion and the ideas are proving up nicely, which should trigger competition in ideas for the process steps in controlling production costs.


Lots of claims have been made over the years for algae energy efficiency: Some experts say each acre given over to algae cultivation could theoretically produce the equivalent of thousands of gallons of oil per year, compared with an estimated yield of 18 to 335 gallons of ethanol per acre for traditional biofuel crops. Others claim that algae-growing systems could be tweaked to yield as much as 100,000 gallons per acre annually.


There are four important steps in the production of algal biofuels: growing the algae, harvesting the crop, separating the oil, and refining the oil to useful fuels. Each step in the process is the focus of intense study by scientists, engineers, and technologists across the developed world. We have already seen a very significant breakthrough in harvesting and drying of algae.


Technologists tend to overestimate what can be accomplished in two years and underestimate what can be accomplished in ten to twenty years. Algae as biofuel looks more like a ten to twenty year project. DARPA is betting on three to five years, VCs are betting on three to five years, the algae roadmap from DOE takes a decade.
Greentech

A better method of making fuel from algal oil has got a lot of biofuel analysts excited:

"This is the first economical way to produce biodiesel from algae oil," according to lead researcher Ben Wen, Ph.D., vice president of United Environment and Energy LLC, Horseheads, N.Y. "It costs much less than conventional processes because you would need a much smaller factory, there are no water disposal costs, and the process is considerably faster."

A key advantage of this new process, he says, is that it uses a proprietary solid catalyst developed at his company instead of liquid catalysts used by other scientists today. First, the solid catalyst can be used over and over. Second, it allows the continuously flowing production of biodiesel, compared to the method using a liquid catalyst.

WaterandWastewater

A continuous process using solid catalyst is potentially more efficient and productive, compared to batch processing. Also more scalable.

Currently, producing biodiesel from algal oil costs about $20 a gallon. But with all the attention being given to each of the multiple steps in the fuel production process, some producers are projecting production costs as low as $1.50 a gallon. If costs drop that low within the next 10 years, algal biodiesel will begin to place an effective ceiling on the cost of petroleum diesel. It will take time to scale up production, of course.

Eric Wesoff

Slimed, Pt. 1: Biofuels and the Aquatic Species Program April 3, 2009 at 1:02 AM

http://greenlight.greentechmedia.com/2009/04/03/slimed-part-1-biofuels-and-the-aquatic-species-program-1313/

Scores of firms, startups and Fortune 500 companies alike, are working on algae-based biofuels. Hundreds of millions of dollars have been invested. And so far, maybe a few thousand gallons of algae oil have been produced.

The question is: Can algae be economically cultivated and commercially scaled to make a material contribution to mankind’s liquid fuel needs? The jury is still out.

Ghosts of NREL Algae Programs Past

The basement of the marine biology department at the University of Hawaii has a hallway lit by a dim incandescent bulb. At the end of the hallway is a cardboard sign with the faded letters “ASP” written on it. A creaky door leads to a dank-smelling room crowded with beakers and algae scientists, milling aimlessly. They share the same slightly green tinge and defeated look.

This is the last remains of the Aquatic Species Program or ASP. These letters are spoken in hushed reverence by today’s crop of phycologists, NRELians and algae-fuel entrepreneurs.

The Program identified hundreds of algae species that could potentially be farmed and cultivated for their lipids — lipids that could be converted to biodiesel and used to wean the U.S. from its dependence on foreign oil.

The Aquatic Species Program was launched in 1978 by President Jimmy Carter to explore the potential of algae as an energy source. About $25 million was put into the program until it was shelved by the Clinton administration in 1996. They never found the "lipid trigger."

The echoes of that program reverberate in today’s algae fuel renaissance.

Why Algae?

On paper, algae is perhaps the perfect feedstock for biofuels. It grows in a wide variety of climates. It can be used to mitigate carbon dioxide. The liquid fuels produced by these single-celled creatures are only one of their byproducts, and potentially not even the most valuable. Cosmetic supplements, nutraceuticals, pet food additives, animal feed, and specialty oils for human consumption may well fetch higher per-gallon prices.

The tantalizing quality of algae is that some algal species contain up to 40 percent lipids by weight. And therefore, according to some sources, an acre of algae could yield 5,000 to 10,000 gallons of oil a year, making algae far more productive than soy (50 gallons per acre), rapeseed (110 to 145 gallons), mustard (140 gallons), jatropha (175 gallons), palm (650 gallons) or cellulosic ethanol from poplars (2,700 gallons).
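The per-acre figures scattered through this article line up into a simple ranking. The numbers below are the article's own (where it gives a range, the upper end is used); the "realistic algae" entry is Benemann's 2,000-gallon estimate quoted further down, and the helper function is just illustration.

```python
# Annual fuel yields per acre as quoted in the article, gallons/acre/year.
# Where the article gives a range, the upper end is taken.

YIELDS_GAL_PER_ACRE = {
    "soy": 50,
    "mustard": 140,
    "rapeseed": 145,
    "jatropha": 175,
    "palm": 650,
    "algae (realistic, per Benemann)": 2000,
    "cellulosic ethanol from poplars": 2700,
    "algae (optimistic claims)": 10000,
}

def ranked_yields():
    """Return (crop, yield) pairs ordered from least to most productive."""
    return sorted(YIELDS_GAL_PER_ACRE.items(), key=lambda kv: kv[1])
```

Even taking Benemann's conservative number, algae outproduces every conventional oil crop here by a factor of three or more, which is why the "jury is still out" verdict concerns economics rather than raw productivity.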

More optimistic data from less informed people indicate the theoretical biodiesel yield from microalgae is in the range of 11,000 to 20,000 gallons per acre per year.

But according to Dr. John Benemann, a cantankerous algae consultant whose research is widely cited in the field, the realistic potential production level (despite claims to the contrary) is about 2,000 gallons of algal oil per acre per year.

VCs and Algae Farmers

“VCs cannot come in here and just harvest ripened fruit; this is not shovel-ready technology,” said Dr. John Benemann.

Considering the immense technical risks and daunting capital costs of building an algae company, it doesn’t seem like a reasonable venture capital play. And most if not all of the VCs I’ve spoken with categorize these investments as the longer-term, long-shot bets in their portfolio. But given the size of the liquid fuels market, measured in trillions of dollars, not the customary billions of dollars, it makes some sense to take the low-percentage shot.

These firms are going to continue to need capital. According to Jennifer Fonstad of VC investor, Draper Fisher Jurvetson: “The current strategy of many of these companies has been to turn to the government stimulus plan – this is the risk capital we can rely on today.”

A Few Conclusions

We need lots more time and more money

Technologists tend to overestimate what can be accomplished in two years and underestimate what can be accomplished in ten to twenty years. Algae as biofuel looks more like a ten to twenty year project. DARPA is betting on three to five years, VCs are betting on three to five years, the algae roadmap from DOE takes a decade.

The scope of the algae to large-scale biodiesel effort is more along the lines of the Manhattan Project or the Apollo moon shot, which cost $24 billion and $360 billion respectively. A $25 million Aquatic Species Program or $300 million in venture capital is not going to get it done. It will take tens of billions of dollars and decades.

All of the process steps need to be addressed

In the words of Courtney McColgan of DFJ, “There are many pieces to the algae puzzle that seem like afterthoughts, but are actually crucial to the economics — co-products, nutrients, harvesting, drying, and conversion technology. System design and algae strain (which seem to be the focus of most discussions) are important, but not the only components.”

Algae producers admit that there’s a massive difference between growing large, consistent quantities of algae versus growing it on a fish tank wall. Standards for growth, strain selection, breeding, genetic modification, water extraction, oil extraction, and oil refining have yet to be established.

Set realistic expectations for the technology

Exploit near term, intermediate technology deployment opportunities such as wastewater treatment. Cost constraints restrict consideration to the simplest possible devices, which are large unlined, open, mixed raceway ponds.

And finally a word from our favorite curmudgeon…

“Engineering studies do not conclude that we can or will actually be able to produce algal oil/biodiesel. They conclude that the R&D to develop such processes can be justified, at least until it can be demonstrated to be impossible,” said Dr. John Benemann.

This is a small excerpt from the April issue of the Greentech Innovations Report which dives deep into the algae pond. You can subscribe to it here.

Navy Carrier Role

This article on the utility of the carrier fleet challenges naval doctrine by playing up the apparent vulnerability of carriers. To the degree that the concerns have any merit whatsoever, it is true that any carrier is easily destroyed if it encounters simple ballistic firepower; even a battery of World War One howitzers would get the job done. It is just that this is not the role of an aircraft carrier.

Its role is to project force throughout the world while standing off from any such threat. It is not a bombardment platform; it is a mobile air power launch platform. As a force in place, it has been able to suppress military adventurism for over fifty years and to occasionally support US military action in pursuit of geopolitical goals.

It is not a platform that can be applied easily against major states with deep hinterlands such as Russia, China, India and Brazil. And in the end, it is deployed in support of landing operations.
Regardless, today the USA is the great power whose effective borders extend to the two hundred mile limit of every country on Earth. The high seas are dominated by this large carrier fleet. The fleet is also still being added to and will apparently climb into the twenties. This is likely way too large. A shooting war seems to demand having five on station with all the aircraft that entails.

Then why are the Chinese building anti-carrier missiles? Why not? They want to enforce territorial control over their share of the waters around China, and blocking any form of USA adventurism is surely part of that agenda.

I will go a little further. A close reading of Chinese history since 1800 would lead you to the same decisions.
Every nation prepared to build a modern navy showed up and forced concessions from the Chinese government, and that included the very dangerous Japan, which was local and had real plans of hanging around for a while. If that were your history, you too would want a robust coastal defense.

As an aside, a close reading of Chinese history reveals that in the mid-nineteenth century, a failed scholar rewrote missionary tracts about Christianity and inspired a mass movement that succeeded in grabbing the southern half of China away from the central government, precipitating a civil war that lasted well over a decade. Maoist communism was another Western-inspired yet distinctly Chinese ideology that also turned the country upside down. Is it any wonder that the leadership is less than sympathetic to the emergence of the Falun Gong? How many lessons do they need?

Navy's Big Weakness: Our Aircraft Carriers Are (Expensive) Defenseless Sitting Ducks

Every single change in technology in the past 50 years has had "Stop building carriers!" written all over it. But the Navy paid no attention.

I've been saying for a long time that aircraft carriers are just history's most expensive floating targets and that they were doomed.

But now I can tell you exactly how they're going to die. I've just read one of the most shocking stories in years. It comes from the U.S. Naval Institute, not exactly an alarmist or anti-Navy source. And what it says is that the U.S. carrier group is scrap metal.

The Chinese military has developed a ballistic missile, Dong Feng 21, specifically designed to kill U.S. aircraft carriers:

"Because the missile employs a complex guidance system, low radar signature and a maneuverability that makes its flight path unpredictable, the odds that it can evade tracking systems to reach its target are increased. It is estimated that the missile can travel at Mach 10 and reach its maximum range of 2,000 kilometers in less than 12 minutes."
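As a back-of-the-envelope check (mine, not the U.S. Naval Institute's), the quoted speed and range figures are internally consistent. Using the sea-level speed of sound (about 343 m/s) as a rough reference value for Mach 10:

```python
# Sanity check: how long does a Mach 10 missile take to cover 2,000 km?
# Speed of sound varies with altitude; sea-level value used as a rough proxy.

speed_of_sound = 343.0   # m/s at sea level
mach = 10
range_m = 2_000_000      # 2,000 km expressed in meters

flight_time_min = range_m / (mach * speed_of_sound) / 60
print(f"{flight_time_min:.1f} minutes")  # about 9.7 minutes
```

That is comfortably under the "less than 12 minutes" the article claims, so the quoted numbers hang together.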

That's the U.S. Naval Institute talking, remember. They're understating the case when they say that, with speed, satellite guidance and maneuverability like that, "the odds that it can evade tracking systems to reach its target are increased."

You know why that's an understatement? Because of a short little sentence I found further on in the article -- and before you read that sentence, I want all you trusting Pentagon groupies to promise me that you'll think hard about what it implies. Here's the sentence: "Ships currently have no defense against a ballistic missile attack."

That's right: no defense at all. The truth is that they have very feeble defenses against any attack with anything more modern than cannon. I've argued before no carrier group would survive a saturation attack by huge numbers of low-value attackers, whether they're Persians in Cessnas and cigar boats or mass-produced Chinese cruise missiles.

But at least you could look at the missile tubes and Phalanx gatlings and pretend that you were safe. But there is no defense, none at all, against something as obvious as a ballistic missile.

So it doesn't matter one goddamn whether the people in the operations room of a targeted carrier could track the Dong Feng 21 as it lobbed itself at them. They might do a real hall-of-fame job of tracking it as it goes up and comes down. But so what? Let me repeat the key sentence here: "Ships currently have no defense against a ballistic missile attack."

Think back a ways. How old is the ballistic missile? Kind of a trick question; a siege mortar is a ballistic missile, just unguided. A trebuchet on an upslope outside a castle is a ballistic weapon.

But serious long-range, rocket-powered ballistic weapons go back at least to the V-2. A nuclear-armed V-2 would have been a pretty solid way of wiping out a carrier group, and both components, the nuke and the ballistic missile, were available as long ago as 1945.

A lot has happened since then, like MIRVs, mobile launchers, massively redundant satellite guidance -- but the thing to remember is that every single change has favored the attacker. Every single goddamn change.

You know that Garmin satellite navigation you use to find the nearest Thai place when the in-laws are visiting? If you were the Navy brass, that should have scared you to death. The Mac on your kid's bedroom desk should have scared you.

Every time electronics got smaller, cheaper and more efficient, the carrier became more of a death trap. Every time stealth tech jumped another step, the carrier was more obviously a bad idea. Smaller, cooler-running engines: another bad sign for the carrier.

Every single change in technology in the past half-century has had "Stop building carriers!" written all over it. And nobody in the Navy brass paid any attention.

The lesson here is the same one all of you suckers should have learned from watching the financial news this year: the people at the top are just as dumb as you are, just meaner and greedier. And that goes for the ones running the U.S. surface fleet as much as it does for the GM or Chrysler honchos. Hell, they even look the same.

Take that Wagoner ass who just got the boot from GM and put him in a tailored uniform, and he could walk on as an admiral in any officer's club from Guam to Diego Garcia. You have to stop thinking somebody up there is looking out for you.

Remember that one sentence, get it branded onto your arm: "Ships currently have no defense against a ballistic missile attack."

What does that tell you about the distinguished gentlemen with all the ribbons on their chests who've been standing up on carrier bridges looking like they know what they're doing for the past 50 years?
They're either stupid or so sleazy they're willing to make a career commanding ships they goddamn well know are floating coffins for thousands of ranks and dozens of the most expensive gold-plated airplanes in the history of the world.

That's why it's so sickening to read shit like the following:

"The purpose of the Navy," Vice Admiral John Bird, commander of the 7th Fleet, tells me, "is not to fight." The mere presence of the Navy should suffice, he argues, to dissuade any attack or attempt to destabilize the region.

From Yokosuka, Guam, and Honolulu, the Navy is sending its ships on missions to locales as far away as Madagascar. On board the Blue Ridge, the vice admiral's command ship anchored at Yokosuka, huge display screens allow officers to track the movements of any country's military vessels cruising from the international date line in the east to the African coast in the west -- the range of the 7th Fleet's zone of influence.

That's the kind of story people are still writing. It's so stupid, that first line, I won't even bother with it: "The purpose of the Navy is not to fight." No kidding. The 7th Fleet covers the area included in that 2,000 kilometer range for the new Chinese anti-ship weapons, so I guess it's a good thing they're not there to fight.

Stories like this were all over the place in the last days of the British empire. For some dumb-ass reason, these reporters love the Navy. They were waving flags and feeling good about things when the Repulse and the Prince of Wales steamed out with no air cover to oppose Japanese landings. Afterward, when both ships were lying on the sea floor, nobody wanted to talk about it much.

What I mean to say here is, don't be fooled by the happy talk. That's the lesson from GM, Chrysler and the Navy: These people don't know shit.

And they don't fucking care either. They're going to ride the system and hope it lasts long enough to see them retire to a house by a golf course, get their daughters married and buy a nice plot in an upscale cemetery. They could give a damn what happens to the rest of us.

OED Hyphens

If there were ever any doubt that the OED is a living document, this should dispel it. I have noticed this particular phenomenon in my own writing, where I make decisions on hyphenation for stylistic reasons rather than on whether the spell checker accepts it. The writer makes a judgment rather than blindly following some rule, and I think the reader benefits from this in improved clarity.

What this really says is that the use of hyphens is very much at the writer’s discretion as a convenience to establish necessary clarity.

Thousands of hyphens perish as English marches on

By Simon Rabinovitch

LONDON (Reuters) - About 16,000 words have succumbed to pressures of the Internet age and lost their hyphens in a new edition of the Shorter Oxford English Dictionary.

Bumble-bee is now bumblebee, ice-cream is ice cream and pot-belly is pot belly.

And if you've got a problem, don't be such a crybaby (formerly cry-baby).

The hyphen has been squeezed as informal ways of communicating, honed in text messages and emails, spread on Web sites and seep into newspapers and books.

"People are not confident about using hyphens anymore, they're not really sure what they are for," said Angus Stevenson, editor of the Shorter OED, the sixth edition of which was published this week.

Another factor in the hyphen's demise is designers' distaste for its ungainly horizontal bulk between words.

"Printed writing is very much design-led these days in adverts and Web sites, and people feel that hyphens mess up the look of a nice bit of typography," he said. "The hyphen is seen as messy looking and old-fashioned."

The team that compiled the Shorter OED, a two-volume tome despite its name, only committed the grammatical amputations after exhaustive research.

"The whole process of changing the spelling of words in the dictionary is all based on our analysis of evidence of language, it's not just what we think looks better," Stevenson said.

Researchers examined a corpus of more than 2 billion words, consisting of full sentences that appeared in newspapers, books, Web sites and blogs from 2000 onwards.

For the most part, the dictionary dropped hyphens from compound nouns, which were unified in a single word (e.g. pigeonhole) or split into two (e.g. test tube).

But hyphens have not lost their place altogether. The Shorter OED editor commended their first-rate service rendered to English in the form of compound adjectives, much like the one in the middle of this sentence.

"There are places where a hyphen is necessary," Stevenson said. "Because you can certainly start to get real ambiguity."

Twenty-odd people came to the party, he said. Or was it twenty odd people?

Some of the 16,000 hyphenation changes in the Shorter Oxford English Dictionary, sixth edition:

Formerly hyphenated words split in two:

fig leaf, hobby horse, ice cream, pin money, pot belly, test tube, water bed

Formerly hyphenated words unified in one:

bumblebee, chickpea, crybaby, leapfrog, logjam, lowlife, pigeonhole, touchline, waterborne

Monday, April 6, 2009

Let Us Call it Fraud

The idea that a lack of regulation was possibly a good thing was never true. Many thought that certain changes were acceptable and that the change makers were simply not smart enough to actually exploit them. We could call it a dumbing down driven by the honest competition for better returns on the capital deployed.

Recall that the profitability of S&Ls was low, which made the industry a slow-growth sector that found it difficult to attract new capital and management. The Canadian banking industry had the same rap.

The only way to juice earnings in the financial industry, or in any industry with a mature business model, is by fraud. That means using confidence to support the financing of lousy investments until the confidence evaporates. If you are the first to play and deftly pass the train wreck off to the greediest of your successors, you can walk away very rich.

This economist is right. It was fraud, and what is more, they all knew it was a pending disaster. The only ones who could be excused would be the salesmen who trusted their bosses and partners. You do not manufacture a package of crap mortgages, get it rated AAA, and not know that you are constructing a fraud and that everyone is going to wink-wink as it is peddled off to the hedge funds desperate for returns.

It is fitting that these fools were forced to hold so much of it in inventory, but that is likely a result of the last desperate moves to retain confidence. The players are all dead while their competitors are very much in business, since the clique could hardly let them play; they would have asked all the right questions.

As I have previously posted, all participants should be put on notice of a full investigation and asked to turn in their passports. And the crime is also treason, because these greedy men put their interests ahead of their institutions' and their country's interests.

Economist: US collapse driven by 'fraud,' Geithner covering up bank insolvency

In an explosive interview on PBS' Bill Moyers Journal, William K. Black, a professor of economics and law with the University of Missouri, alleged that American banks and credit agencies conspired to create a system in which so-called "liars loans" could receive AAA ratings and zero oversight, amounting to a massive "fraud" at the epicenter of US finance.
But worse still, said Black, Timothy Geithner, President Barack Obama's Secretary of the Treasury, is currently engaged in a cover-up to keep the truth of America's financial insolvency from its citizens.
The interview, which aired Friday night, is carried on the Bill Moyers Journal Web site. Black's most recent published work, "The Best Way to Rob a Bank is to Own One," released in 2005, was hailed by Nobel-winning economist George A. Akerlof as "extraordinary." "There is no one else in the whole world who understands so well exactly how these lootings occurred in all their details and how the changes in government regulations and in statutes in the early 1980s caused this spate of looting," he wrote. "This book will be a classic." But that book only covers the fallout from the 1980s Savings & Loan crisis, Black's first-hand involvement in that scandal being the ensuing liquidation of bad banks.

"A single bank, IndyMac, lost more money than the entire Savings and Loan Crisis," reported PBS. The difference between now and then, explains Black, is a drastic reduction in regulation and oversight: "We now know what happens when you destroy regulation. You get the biggest financial calamity of anybody under the age of 80." That financial calamity, he explained, was brought about not by mishap or accident, but only after a concerted effort to undermine and remove all regulations, allowing a creditor free-for-all that hinged on fraudulent risk ratings for bad loans.

"[T]he way that you do it is to make really bad loans, because they pay better," he told Moyers. "Then you grow extremely rapidly; in other words, you're a Ponzi-like scheme. And the third thing you do is we call it leverage. That just means borrowing a lot of money, and the combination creates a situation where you have guaranteed record profits in the early years. That makes you rich, through the bonuses that modern executive compensation has produced. It also makes it inevitable that there's going to be a disaster down the road."

"This stuff, the exotic stuff that you're talking about, was created out of things like liars' loans that were known to be extraordinarily bad," he continued. "And now it was getting triple-A ratings. Now a triple-A rating is supposed to mean there is zero credit risk. So you take something that not only has significant risk, it has crushing risk. That's why it's toxic. And you create this fiction that it has zero risk. That itself, of course, is a fraudulent exercise. And again, there was nobody looking during the Bush years. So finally, only a year ago, we started to have a Congressional investigation of some of these rating agencies, and it's scandalous what came out. What we know now is that the rating agencies never looked at a single loan file. When they finally did look, after the markets had completely collapsed, they found, and I'm quoting Fitch, the smallest of the rating agencies, 'the results were disconcerting, in that there was the appearance of fraud in nearly every file we examined.'"

He equated the entire US financial system to a giant "Ponzi scheme" and charged that Treasury Secretary Timothy Geithner, like Secretary Henry Paulson before him, is "covering up" the truth.
"Are you saying that Timothy Geithner, the Secretary of the Treasury, and others in the administration, with the banks, are engaged in a cover up to keep us from knowing what went wrong?" asked Moyers.
"Absolutely, because they are scared to death," he said. "All right? They're scared to death of a collapse.
They're afraid that if they admit the truth, that many of the large banks are insolvent. They think Americans are a bunch of cowards, and that we'll run screaming to the exits.
And we won't rely on deposit insurance. And, by the way, you can rely on deposit insurance. And it's foolishness. All right? Now, it may be worse than that. You can impute more cynical motives.
But I think they are sincerely just panicked about, 'We just can't let the big banks fail.' That's wrong."

Ultimately, said Black, the financial downfall of the United States in the wake of the Bush years is due to "the most elite institutions in America engaging in or facilitating fraud."
"When will Americans wake up and hold the real criminals - Banksters - accountable for their actions, and pressure the government to enact systemic changes to prevent future abuses?" asked Huffington Post blogger Mike Garibaldi-Frick.

Atlantic Ocean Warming Dust Driven

This is a little unexpected. It is not unexpected that dust is a contributor but the magnitude certainly is. It appears that two thirds of the regional temperature gain can be attributed to dust and volcanic ash.

This also clarifies the feedback mechanism. We forget the heat-gathering capacity of ocean water as compared to land because it is attenuated through a thick upper layer. Land either absorbs and uses the energy chemically or reflects it back into the atmosphere. Thus the ocean has a stable temperature regime whose variation is minor, as remarked on here. However, a one degree rise represents a major jump in heat content in the water that will discharge into the atmosphere, producing storms.

A modest amount of dust in the atmosphere generated a major reduction in this driving heat engine.

The clear lesson is that a major volcanic interlude will be felt strongly in terms of climate change.

This returns me to contemplation on the possible causes of the worst climate experiences of the little ice age.
Assuming that we were on the low end of the normal Holocene climate variation, the injection of volcanic dust would have been rather damaging. And it need not be an overly big event. I recall that Fuji erupted in a timely manner, and its position is such as to possibly affect European climate through high-level aerosols and dust.

Thus, provided that we already had a modest reduction in temperature due to a very slightly cooler sun, an inconvenient volcano could easily wreck a year's climate while hardly being noticed.

Dust Responsible for Most of Atlantic Warming

posted: 26 March 2009 02:15 pm ET

The warming of Atlantic Ocean waters in recent decades is largely due to declines in airborne dust from African deserts and lower volcanic emissions, a new study suggests.

Since 1980, the tropical North Atlantic has been warming by an average of a half-degree Fahrenheit (a quarter-degree Celsius) per decade.

While that number may sound small, it can translate to big impacts on hurricanes, which are
fueled by warm surface waters, said study team member Amato Evan of the University of Wisconsin-Madison. For example, the ocean temperature difference between 1994, a quiet hurricane year, and 2005's record-breaking year of storms (including Hurricane Katrina), was just 1 degree Fahrenheit.

Evan and his colleagues had previously shown that African dust and other airborne particles can suppress hurricane activity by reducing how much sunlight reaches the ocean and keeping the sea surface cool. Dusty years predict mild hurricane seasons, while years with low dust activity — including 2004 and 2005 — have been linked to stronger and more frequent storms.

In the new study, the researchers investigated the exact effect of dust and volcanic emissions on ocean temperatures. They combined satellite data of dust and other particles with existing climate models and calculated how much of the Atlantic warming observed during the last 26 years could be accounted for by simultaneous changes in African dust storms and tropical volcanic activity, primarily the eruptions of El Chichón in Mexico in 1982 and
Mount Pinatubo in the Philippines in 1991.

The results: More than two-thirds of this upward trend in recent decades can be attributed to changes in African dust storm and tropical volcano activity during that time.

This was a surprisingly large amount, Evan said.

The results, detailed in the March 27 issue of the journal Science, suggest that only about 30 percent of the observed Atlantic temperature increases are due to other factors, such as
a warming climate.

"This makes sense, because we don't really expect global warming to make the ocean [temperature] increase that fast," Evan said.

This adjustment brings the estimate of global warming's impact on the Atlantic more in line with the smaller degree of ocean warming seen elsewhere, such as the Pacific.

Of course, this doesn't discount the importance of global warming, Evan said, but indicates that newer climate models will need to include dust storms as a factor to accurately predict how ocean temperatures will change.

Satellite research of dust-storm activity is relatively young, and no one yet understands what drives dust variability from year to year. And volcanic eruptions are still relatively unpredictable.

"We don't really understand how dust is going to change in these climate projections, and changes in dust could have a really good effect or a really bad effect," Evan said.

More research and observations of the impact of dust will help answer that question.

EMP Munitions

Following up on the subject of EMP technology, this informs us that we need not rely on a nuclear blast out in space to be able to use it effectively. Instead we will now have a battlefield device able to deny the enemy the use of modern electronics.

It does not stop the AK-47 and other conventional hardware, but it certainly degrades effectiveness back to World War Two levels, making such a force a sitting duck for the side able to retain modern capability. The good news is that we have lost the appetite to fight each other, so maybe we will never see this tested in battle against an equally equipped foe.

This also may be the beginning of an effective anti-ballistic missile strategy. Interception with a hot warhead hitting the kill zone with EMP and shrapnel should succeed. It certainly would be difficult to shield against properly.

We have entered a world in which threat levels, as well as hot wars, have been in steady decline; they are merely better reported. No small part of this has been the advent of overwhelming military power. No possible opponent doubts that reality, and the only remaining strategy against any organized national power today is a hot insurgency against the opponent's will.

All major powers have long since abandoned aggressive militarism, and in most cases internal militarism. This removes much of the threat from political adventurism.

New E-Bomb and EMP Device Details

1. There are shockwave ferromagnetic generators for e-bomb and EMP devices. This is a magnet that blows up and spontaneously demagnetizes, releasing energy as a pulse of power. The effect is known as pressure-induced magnetic phase transition, and only occurs with some types of magnets in certain situations.

2. The researchers moved on to more exotic lead zirconate titanate, a ferroelectric material. This enabled them to reduce the volume of the power generator from 50 cu. cm (3 cu. in.) to 3 cu. cm, excluding explosives. Army requirements call for assembly of the power generator, power conditioning and aerial in a 1-in. space. Power output will be measured in hundreds of megawatts for microseconds.

3. Allen Stults of AMRDEC is using the jet of ionized plasma produced by the explosion as an antenna.

The new munitions will have two crucial advantages over previous e-bombs: they are small, and should not cause electronic "friendly fire" casualties hundreds of meters away. And because they still have the same blast, fragmentation and armor-piercing properties as they did, commanders can be confident that they're not wasting space carrying rounds that might have no effect.

An enhanced warhead could knock out a tank even if it did not penetrate. The vehicle would be left without ignition, communications or other electronics. A warhead would also knock out other electronic systems, including mobile phones used by insurgents to detonate bombs and circuitry in rocket-propelled grenades.

Two candidate munitions for upgrade are the Tow missile and 2.75-in. rockets fired by helicopter. This is unlike previous e-bomb efforts, which have focused on large air-delivered bombs or unitary artillery munitions that cover a large area, what Kopp terms “weapons of electrical mass destruction.”

A small e-bomb will be qualitatively different than larger versions. Radiated power falls off with the square of distance, so a target 3 meters (10 ft.) away receives 100 times the effect of one 30 meters away. An EMP-enhanced Tow missile would produce a pulse strong enough to destroy what it hits, but should not disrupt electronics over a wide area.
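The inverse-square relationship described above is easy to verify. A minimal sketch (my illustration, not from the article; the function name is hypothetical):

```python
# Relative EMP exposure under the inverse-square law: radiated power
# density falls off with the square of distance from the source, so a
# near target receives (d_far / d_near)^2 times the effect of a far one.

def relative_exposure(d_near_m: float, d_far_m: float) -> float:
    """Ratio of received power density at d_near versus d_far."""
    return (d_far_m / d_near_m) ** 2

print(relative_exposure(3, 30))    # prints 100.0 -- the article's 100x figure
print(relative_exposure(3, 300))   # prints 10000.0
```

This is why a small warhead can destroy what it hits while leaving electronics a few dozen meters away largely untouched.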

The smallest weapon that the Army is looking to upgrade is the M77 bomblet fired by the Multiple Launch Rocket System (MLRS). A bomblet has a shaped-charge warhead and throws out antipersonnel fragments. Bomblets cover a wide area—one launcher can fire a 12-rocket salvo blanketing an area the size of six football fields—and are used against soft targets. An EMP-enhanced version would cover the same area, providing even destruction over the target zone.

If the M77 can be upgraded, shoulder-launched rockets and similar weapons could be modified to produce an EMP. Small infantry rockets have limited effectiveness against modern armor. An EMP-enhanced round might not penetrate but could provide a “soft kill” capability that immobilizes a vehicle. This damage is hard to repair and would probably require the replacement of electronic systems.

Friday, April 3, 2009

Current Sea Ice Report 2009

I really hate it when report writers pick and choose data points and ignore the forest. Of course, we are all guilty of that one way or the other. What this item is showing us is that the sea ice collapse that was well underway in the summer of 2007 was halted that winter, and the process of claw back has begun. The open waters are allowing higher heat absorption, so the claw back is necessarily slow. Right now, we can expect another incremental increase over last year because we have had a cold winter and the new sea ice will be much thicker than in the past two years.

We lost a lot of ice this time around, so there is a good chance that when the next wave of solar inspired warming hits in two years, it will quickly reassert the collapse of the sea ice. I expect another winter as cold as the one we just had for 2010, but after that we should catch a warming trend again with significant and accelerating declines every year.

This report certainly suggests that a lot of surplus heat has remained in the arctic undischarged. How true that is should become clear this year. I always get nervous when reports state 5 degrees over five years. That often hides a recent precipitous decline.

I am including the link to my most useful current sea ice map here for future reference. It usually updates every week or so, and the lower map is a comparison map. It makes for a great reality check.

http://www.socc.ca/seaice/seaice_current_e.cfm
Or alternately their new web page for this. Updating is also lagging as of June.

"Study: Arctic sea ice melting faster than expected"
(Source: AP, 4/2/09)


http://tech.groups.yahoo.com/group/MAWS_General_List/message/3241
WASHINGTON - Arctic sea ice is melting so fast that most of it could be gone in 30 years. A new analysis of changing conditions in the region, using complex computer models of weather and climate, says conditions that had been forecast by the end of the century could occur much sooner.

A change in the amount of ice is important because the white surface reflects sunlight back into space. When ice is replaced by dark ocean water, that sunlight can be absorbed, warming the water and increasing the warming of the planet.

The finding adds to concern about climate change caused by human activities such as burning fossil fuels, a problem that has begun receiving more attention in the Obama administration and is part of the G20 discussions under way in London.

"Due to the recent loss of sea ice, the 2005-2008 autumn central Arctic surface air temperatures were greater than 5 degrees Celsius (9 degrees Fahrenheit) above" what would be expected, the new study reports.

That amount of temperature increase had been expected by the year 2070.

The new report by Muyin Wang of the Joint Institute for the Study of Atmosphere and Ocean and James E. Overland of the National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory appears in Friday's edition of the journal Geophysical Research Letters.

They expect the area covered by summer sea ice to decline from about 2.8 million square miles normally to 620,000 square miles within 30 years.

Last year's summer minimum was 1.8 million square miles in September, second lowest only to 2007, which had a minimum of 1.65 million square miles, according to the National Snow and Ice Data Center.

The Center said Arctic sea ice reached its winter maximum for this year at 5.8 million square miles on Feb. 28. That was 278,000 square miles below the 1979-2000 average, making it the fifth lowest on record. The six lowest maximums since 1979 have all occurred in the last six years.
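For perspective on the figures quoted above, a quick arithmetic check (my calculation, not the article's):

```python
# Rough arithmetic on the sea-ice areas reported by the AP story.
# All areas in millions of square miles, taken from the article text.

normal_summer = 2.8          # typical summer ice cover
projected_summer = 0.62      # projected remaining within 30 years

decline = 1 - projected_summer / normal_summer
print(f"projected summer decline: {decline:.0%}")   # prints "projected summer decline: 78%"

winter_max_2009 = 5.8        # Feb. 28, 2009 maximum
deficit = 0.278              # reported shortfall below the 1979-2000 average
avg_max = winter_max_2009 + deficit

print(f"2009 maximum vs. average: {winter_max_2009 / avg_max:.1%}")  # prints "2009 maximum vs. average: 95.4%"
```

So the projection amounts to losing nearly four fifths of the normal summer cover, while the record-setting winter maximum is only about five percent below the long-term average, which shows how much more dramatic the summer trend is.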

Overland and Wang combined sea-ice observations with six complex computer models used by the Intergovernmental Panel on Climate Change to reach their conclusions. Combining several computer models helps avoid uncertainties caused by natural variability.

If you believe that is anything other than a direct admission that our models are unreliable, then I have this wonderful stock picking program for sale. Oh well.

Much of the remaining ice would be north of Canada and Greenland, with much less between Alaska and Russia in the Pacific Arctic.

"The Arctic is often called the Earth's refrigerator because the sea ice helps cool the planet by reflecting the sun's radiation back into space," Wang said in a statement. "With less ice, the sun's warmth is instead absorbed by the open water, contributing to warmer temperatures in the water and the air."

The study was supported by the NOAA Climate Change Program Office, the Institute for the Study of the Ocean and Atmosphere and the U.S. Department of Energy.

Cold Fusion Vindication Heralded

This article is beating the cold fusion drum a little more loudly in light of the recent news that we posted a few days ago.

Fleischmann and Pons were debunked unmercifully in North America, and it was wrong. A scientist must be allowed to become enthusiastic when a new line of research proves enticing. Their curious results, however interpreted, have since triggered thousands of hours of good scientific lab work that needed to be done.

We are obviously gaining on the problem and are possibly inching toward a working device.

Rather more importantly, we are slowly succeeding in the task of seeing the related particles and this leads to opportunities to do clever things.

Cold fusion, when announced, revealed our profound ignorance regarding atomic structure and, more pertinently, the nature of atomic curvature in and about the atom. All this plays a part in any prospective reaction. Cold fusion was the apple falling from the tree that signaled the need to take a long hard look.

Research Vindicates Cold Fusion
Cold Fusion Proven True by U.S. Navy Researchers - by Mike Adams, NaturalNews Editor

http://www.westender.com.au/news/466

(NaturalNews) The world owes Fleischmann and Pons a huge apology: The cold fusion technology they announced in 1989 -- which was blasted by arrogant hot fusion scientists as a fraud -- has been proven true once again by U.S. Navy Researchers. In papers presented at this year's American Chemical Society meeting, scientist Pamela Mosier-Boss presented data supporting the reality of cold fusion, declaring the report, "the first scientific report of highly energetic neutrons from low-energy nuclear reactions."

Technically, it's not the first report at all, however. It might be the five-hundredth report, given how many people have been working on cold fusion since 1989 in laboratories across the world. Following the politically-motivated assassination of cold fusion credibility in 1989, the cold fusion movement went underground, renaming itself to LENR (Low Energy Nuclear Reactions). As LENR, cold fusion has been proven true in literally thousands of experiments conducted over the past two decades.

I first went public with the true story about the conspiracy against cold fusion in 1998. It described this classic conspiracy against a new technology, schemed up by desperate defenders of old technology -- hot fusion researchers who, after hundreds of billions of dollars in research money, have yet to produce a single sustainable hot fusion reaction that produces more energy than it consumes. The arrogant hot fusion researchers have the same snooty attitude as cancer researchers: "Just give us another billion dollars," they say, "and we'll find a cure!"

It's been the same story for nearly three decades now, and hot fusion still doesn't work. A working cold fusion unit, however, can be built on a kitchen countertop for less than $2,000, and it doesn't require a doctorate in physics to pull it off, either. It is precisely this simplicity that offends the arrogant hot fusion pushers who act much like medical doctors in the vicious defense of their territory.

Cold fusion applications

Cold fusion isn't some magical free energy machine. It produces excess heat, but slowly. So don't go thinking this is some kind of Mr. Fusion device that you can feed some banana peels and expect to get clean electricity out the other end.

Rather, cold fusion converts mass to heat energy, slowly losing a bit of mass through very low-energy nuclear reactions (hence the LENR name) that generate excess heat. In practical terms, cold fusion produces hot water.

And why is hot water useful? Because with hot water, you can produce steam. Steam turns turbines that generate electricity. This is how coal power plants work, too, except they're burning coal to heat water instead of using cold fusion. Conventional nuke plants work the same way, too, using much higher-energy nuclear reactions to heat vast amounts of water that drive electricity-generating turbines.
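The heat-to-electricity chain sketched above can be put in rough numbers. The following is an illustrative back-of-the-envelope sketch with my own assumed values (one tonne of water and a typical steam-cycle efficiency of about 35 percent; none of these figures come from the article):

```python
# Rough energy bookkeeping for "hot water -> steam -> turbine -> electricity".
# All values below are illustrative assumptions, not figures from the article.
SPECIFIC_HEAT_WATER = 4186      # J/(kg*K), specific heat of liquid water
mass_kg = 1000                  # one tonne (about one cubic metre) of water
delta_t = 80                    # heating from 20 C to 100 C

heat_joules = mass_kg * SPECIFIC_HEAT_WATER * delta_t

# Real steam cycles convert roughly a third of the heat to electricity;
# assume 35% here.
electrical_joules = heat_joules * 0.35

print(f"Heat input:  {heat_joules / 1e6:.0f} MJ")
print(f"Electricity: {electrical_joules / 1e6:.0f} MJ (~{electrical_joules / 3.6e6:.0f} kWh)")
```

The point is only that the plumbing downstream of the heat source is conventional; whatever supplies the hot water, the steam-turbine math is the same.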

So heating water with cold fusion is a big deal. If the technology can be scaled up and applied properly, it could spell an end to the era of dirty coal power plants.

And that, friends, could mean a very big deal for reducing CO2 emissions and avoiding a worsening of global warming. It will even help global warming skeptics, too, because even if you don't believe global warming is real, the climate still changes on you. Mother Nature can't be debated. It just reacts.

Whether you recognize the reality of global warming or not, cold fusion technology could reduce air pollution due to coal power plant emissions. Coal power plants are the No. 1 source of mercury pollution on our planet, in case you didn't know. That's because burning coal spews mercury into the air, which then settles over oceans and land masses, contaminating the world with mercury.

(Perhaps there are mercury skeptics who do not believe coal power plants spew mercury at all, or that mercury is safe for human consumption. The mercury skeptics are probably dentists, come to think of it...)

No radioactive waste

Cold fusion, by the way, does not produce radioactive waste. So it's not like a world full of cold fusion power plants would create yet another radioactive waste problem. It might cause a shortage of palladium, though, which is one of the metals typically used in cold fusion devices.

Some of the more astute readers of this website will probably figure out that investing in palladium futures ahead of any widespread production of cold fusion devices would no doubt be extremely profitable. But that kind of product rollout is likely years away, at best.

And that's assuming that this latest round of cold fusion announcements won't get clobbered yet again by the hot fusion conspirators. I'm half expecting an updated news announcement in a day or two, with a headline like, "U.S. Navy Retracts Cold Fusion Announcement, Scientists Accused of Fraud" or some such nonsense. If you see such a headline, remember what you're reading here, and you'll know it's all been manipulated to erase the reality of cold fusion from the sphere of public knowledge.

Cold fusion, after all, could revolutionize the energy industry and spell doom for coal and natural gas. I know a bunch of executives in Wyoming who are shaking in their (insulated) boots right now at the thought of cold fusion sidelining natural gas.

Authors' Quotes on Cold Fusion

Below, you'll find selected quotes from noted authors on the subject of Cold Fusion. Feel free to quote these in your own work provided you give proper credit to both the original author quoted here and this NaturalNews page.

Nowhere are the resistance to and promise of a new energy technology more dramatically revealed than those of the case of cold fusion. This well-researched approach has the potential of reversing much of the pollution while turning the interests of the energy monopolies upside down. Unfortunately, even the environmentalists haven't yet given new energy alternatives a fair look. The cold fusion Revolution: The unfolding cold fusion saga has provided us with an illustrious thirteen year history that would make the suppression of Tesla seem like a school exercise.

- Reinheriting the Earth: Awakening to Sustainable Solutions and Greater Truths by Brian O'Leary - Available on Amazon.com

The coup de grace was delivered to cold fusion when the US House committee formed to examine the claims for cold fusion came down on the side of the skeptics. 'Evidence for the discovery of a new nuclear process termed cold fusion is not persuasive,' said its report. 'No special programmes to establish cold fusion research centers or to support new efforts to find cold fusion are justified.'

- Alternative Science: Challenging the Myths of the Scientific Establishment by Richard Milton - Available on Amazon.com

Cold fusion The fusion of hydrogen atoms into helium at room temperature. In 1989 two scientists announced that they had produced cold fusion in their laboratory, an achievement that if true would have meant a virtually unlimited cheap energy supply for humanity. When other scientists were unable to reproduce their results, the scientific community concluded that the original experiment had been flawed.

- The New Dictionary of Cultural Literacy: What Every American Needs to Know by E. D. Hirsch, Joseph F. Kett, James Trefil - Available on Amazon.com

Thus within two months of its original announcement, cold fusion had been dealt a fatal blow by two of the world's most prestigious nuclear research centres, each receiving millions of pounds a year to fund atomic research. The measure of MIT's success in killing off cold fusion is that still today, the US Department of Energy refuses to fund any research into it while the US Patent Office relies on the MIT report to refuse any patents based on or relating to cold fusion processes even though hundreds have been submitted.

- Alternative Science: Challenging the Myths of the Scientific Establishment by Richard Milton - Available on Amazon.com

Patent Office of any application mentioning cold fusion; 3) Suppression of research on the phenomenon in government laboratories; 4) Citation of cold fusion as "pathological science" or "fraud" in numerous books and articles critical of cold fusion in general, and of Fleischmann and Pons in particular." One of the DOE panel members, Prof. Steven Koonin of Caltech (and now Provost there), said, "My conclusion is that the experiments are just wrong and that we are suffering from the incompetence and delusion of Doctors Pons and Fleischmann...

- Reinheriting the Earth: Awakening to Sustainable Solutions and Greater Truths by Brian O'Leary - Available on Amazon.com

Six months after cold fusion was announced, the American Department of Energy denounced it. In Japan, the people who are considered authorities blindly emulated the attitude of the Americans, as they invariably do, and they too pontificated against cold fusion. Perhaps it was inevitable that most people would assume the claims are cock and bull nonsense. In keeping with the tide of the times, countless books and articles have been published attacking cold fusion. The very act of researching cold fusion has become scandalous.

- Alternative Science: Challenging the Myths of the Scientific Establishment by Richard Milton - Available on Amazon.com

Equally illuminating were the remarks of Professor John Huizenga, who was co-chairman of the US Department of Energy's panel on cold fusion and who came down against the reality of the process. In a recent book on the subject, Professor Huizenga observed that 'The world's scientific institutions have probably now squandered between $50 and $100 million on an idea that was absurd to begin with.' The question is, what were his principal reasons for rejecting cold fusion.

- Alternative Science: Challenging the Myths of the Scientific Establishment by Richard Milton - Available on Amazon.com

This was perhaps the high-water mark of cold fusion. Scores of organisations over the world were actively working to replicate cold fusion in their laboratories, and although many reported difficulties a decent number reported success. And by the end of April, Fleischmann and Pons were standing before the US House Science, Space and Technology committee asking for a cool $25 million to fund a centre for cold fusion research at Utah University. Then things began to go wrong.

Pleistocene Mega Lions

I will make this simple. It was theorized that Clovis man was able to wipe out the Pleistocene megafauna. This discovery makes that rubbish. Man cannot even properly challenge the much smaller African lion in Africa, where we have just as good a reason to see them off into extinction. So long as we knew little about saber-toothed tigers the theory was safe. Now we have something we know plenty about.

I have posted extensively on the Pleistocene Nonconformity and the targeted polar blast that appears to have triggered it. I will also soon be posting a short update on the conjecture itself.

One other important result is also obvious. African lions are hunters of men, and the Africans have barely kept the modern lions at bay. These suckers were impossible to keep at bay. That explains the limited human involvement with Pleistocene fauna. Strong defensive strategies were mandatory, and this surely meant maximum use of thorn thickets and travelling in well-armed groups. The Clovis point becomes very necessary, not for bringing down game but to confront these carnivores.

The existence of these lions is in fact additional indirect confirmation of the unusual nature of the Pleistocene Nonconformity itself. These predators were adapted to the climatic conditions and stayed within their northern niche. Had they had time to readapt to more southerly climes, they would certainly have shifted south and hunted elephants. Instead they hunted musk oxen, black bears, grizzlies and, at the least, smaller mammoths.

For the record, we have determined the nature of the Pleistocene Nonconformity in a number of earlier posts as a crustal shift triggered by a targeted blast-impact close by the pole, whose blast-shock wave shattered the Pleistocene world. Ample evidence of time, direction and location has emerged and been posted on. And yes, we deal with the obvious objections.

ABSTRACT

Lions were the most widespread carnivores in the late Pleistocene, ranging from southern Africa to the southern USA, but little is known about the evolutionary relationships among these Pleistocene populations or the dynamics that led to their extinction. Using ancient DNA techniques, we obtained mitochondrial sequences from 52 individuals sampled across the present and former range of lions. Phylogenetic analysis revealed three distinct clusters: (i) modern lions, Panthera leo; (ii) extinct Pleistocene cave lions, which formed a homogeneous population extending from Europe across Beringia (Siberia, Alaska and western Canada); and (iii) extinct American lions, which formed a separate population south of the Pleistocene ice sheets. The American lion appears to have become genetically isolated around 340 000 years ago, despite the apparent lack of significant barriers to gene flow with Beringian populations through much of the late Pleistocene. We found potential evidence of a severe population bottleneck in the cave lion during the previous interstadial, sometime after 48 000 years, adding to evidence from bison, mammoths, horses and brown bears that megafaunal populations underwent major genetic alterations throughout the last interstadial, potentially presaging the processes involved in the subsequent end-Pleistocene mass extinctions.

From Scientific American:

Mar 30, 2009 05:25 PM in
Archaeology & Paleontology

Massive lions prowled North America not so long ago

By
Katherine Harmon

Large lions roamed North America and Europe as recently as 13,000 years ago, according to
a new study published in Molecular Ecology. "These ancient lions were like a super-sized version of today's lions, up to 25 percent bigger," study co-author Ross Barnett, a researcher at the Ancient Biomolecules Centre at the University of Oxford's department of zoology, said in a statement. The extinct big, big cats turned out to be much more closely related to today's lions than jaguars or other living contemporaries in North America, according to the study.

To trace the genetic tree of these fearsome felines, researchers analyzed DNA from fossils from across the Northern Hemisphere – from Germany to Wyoming. They found that the Pleistocene-period (1.8 million - 10,000 years ago)
lions living in Europe and Alaska were closer cousins than those living farther south in North America. The Oxford research team explains this by pointing to the Bering land bridge, which connected Siberia and Alaska during the last ice age, allowing the cats to travel to North America from Eurasia. Ice sheets later cut off a path southward from Canada's Yukon, isolating the population to the south and eventually rendering it genetically distinct.

But after tens of thousands of years of hunting prey on the tundra, the lions, along with the mammoth and other massive mammals, died out about 13,000 years ago. "We still don't know what caused this
mass extinction," study co-author Nobby Yamaguchi, a researcher at Oxford's Wildlife Research Conservation Unit, said in a statement, "although it is likely that early humans were involved one way or another."

They are surely joking. Humans are going to hunt out a one-ton lion that routinely takes down mammoths? Before the advent of the gun, the grizzly hunted past the Mississippi and routinely hunted men. The mega lion is bigger, faster and stronger. I do not see how they arrived at the 25 percent larger figure. The skull size difference suggests something far larger, certainly able to crush the vertebrae of a large animal.
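One way to see why a "25 percent bigger" linear measure understates the difference in the animal itself: assuming isometric scaling (my assumption, not the study's), body mass grows roughly with the cube of a linear dimension such as skull length.

```python
# Isometric scaling check (my own assumption, not from the study):
# mass scales roughly as the cube of a linear dimension like skull length.
linear_ratio = 1.25            # skull 25% longer
mass_ratio = linear_ratio ** 3
print(f"Implied body-mass ratio: {mass_ratio:.2f}x")   # about 1.95x
```

On that crude reckoning, a cat 25 percent larger in linear terms would be nearly twice the mass of a modern lion.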