Thursday, November 20, 2008

Diane Francis and Economic Reform

This article by Diane Francis rather nicely outlines what went wrong and how financial leadership within the USA bought into and overextended its confidence game. That they are all dead now and governments are stuck with picking up the pieces is hardly a satisfactory outcome.

It really has yet to hit home. We are faced with a huge credit contraction that is going to take ongoing intervention by unhappy governments to keep it from forcing a contraction of the real economy.

China is doing its bit by funding a portfolio of long-overdue infrastructure improvements, and we can be sure that this will stimulate a rise in China’s housing market. A decade of this and the country will be fully modernized.

It is a good bet that India will follow suit.

This financial shakeout is allowing China and India to build an internal consumer society. This surely is a good thing for the globe as it will reorder those societies and largely dispel chronic inequities.

Developed economies lack this obvious escape door. We have to do a lot more and we must use imagination. The obvious fix is to strongly stimulate the windmill industry now, along with the whole North American power grid, in preparation for supporting electric auto carts.

This program is large enough and fast enough to make a real difference and we will then be positioned to use any energy system that enters the market.

And obviously submitting your financial system to a global system of standards is a must, or quite simply, no foreign lender will ever be prepared to provide the same capital to the US banking system. It has been shown that politicians and greedy managers cannot keep their hands out of the cookie jar and that gaming the system is acceptable in the US. The US has damaged its credit standing in the eyes of the global economy.

And the Chinese in particular, but almost every country out there, must also stop gaming their currencies for short-term comparative advantage. This is a sticky problem because developing a proper solution without damaging trade is difficult. In fact, solutions that damage trade are what always suggest themselves.

Right now, and for the next several years, the global reorganization will likely make these concerns moot. Chinese surpluses will evaporate.

America: Smarten up

By Diane Francis

America must smarten up. The global economy has crashed and the cause must be determined to prevent another. My concern is that the big problem facing the world’s leaders, who are trying to fix the global economy, will be America’s denial of its responsibility for this mess and its resistance to submitting to supranational institutions, solutions or oversight.

Such American policy isolationism could sink the planetary ship. It's obvious that the biggest misdeeds were perpetrated by American financial institutions worldwide. This makes the argument by Americans that they don't want international regulatory oversight the financial equivalent of Iran’s isolationism.

Put another way, it would be as irrational to allow the Ayatollahs to build bombs as it would be to allow Wall Streeters, hedge funds and rogue insurance underwriters to run amok again building their weapons of financial mass destruction.

The fact is the catastrophe is mostly America’s fault and here's why:

Successive American regimes allowed the Federal Reserve Bank to print money without responsible controls.

This currency debasement was camouflaged because China, Saudi Arabia and Asian export nations bought massive amounts of T-bills. They did so to artificially reduce the value of their currencies in order to cheat trade-wise and sell more oil, cars and manufactured goods.

Washington’s Treasury Department turned a blind eye to this currency manipulation cheating even though it has the power to impose tariffs against countries who do this. Washington was co-opted and allowed its dollar to be higher than it deserved to be and others to be too low.

The U.S.-controlled International Monetary Fund also reneged on its responsibility to watch currency cheating and, instead, morphed into a slush fund for infrastructure projects in poor countries.

Washington made matters worse by encouraging the over-leveraging, or weakening, of the American economy. Policies promoted borrowing and credit, in large measure through gigantic, regressive entitlements for rich people or homeowners such as unlimited mortgage interest deductions against taxes.
These incentives, easy credit and soaring money supply policies fostered the tech and real estate bubbles.
America also failed to be prudent and ran huge fiscal and trade deficits financed by foreigners. It did not encourage savings rates, in order to meet its looming demographic time-bomb of pension and senior healthcare liabilities.

American mortgage and banking regulation was an oxymoron. Fraudsters, mafias and terrorists were easily able to defraud borrowers and lenders. These bad debts were packaged, fraudulently re-rated and exported around the world by Wall Street, bundled along with good debts.

American insurance and securities regulation was also an oxymoron. There wasn’t enough regulation, nor was there sufficient money to support sophisticated or streetsmart regulatory bodies. The result was the credit default swap tsunami which capsized Wall Street, AIG and the globe. There is no excuse for this, given that these swaps were merely gigantic insurance policies issued by public companies which knew that insurance and financial products must have sufficient capital and assets to cover claims. Their managers and boards turned a blind eye.

America’s financial institution boards of directors, shareholder activists and other watchdogs did not do their jobs.

Of course, the U.S. isn’t the only culprit, but it's by far the biggest.

That’s why the world must push back vigorously against any American resistance to supranational regulations and reforms. Every economy in the world must sign a non-proliferation treaty and submit to policing to avoid the financial skullduggery that has brought the world to its knees.

Wednesday, November 19, 2008

Lost Tribes of Israel

The writers of the King James Bible were contemporaries of Shakespeare. Today you can read that same Bible with no difficulty whatsoever, yet Shakespeare is a challenge needing a lot of help. Thanks to the Scots and Calvinism, for most of the past several centuries our children have grown up being taught years of reading, writing and arithmetic so that every plowboy could read for himself the words of God. That Bible set the standard and crystallized our language, which even today is still progressively unifying us.

For most of the nineteenth century and the early twentieth century, every child got at least six years of the three R’s. In many cases, the only book was the Bible and little other entertainment existed. Therefore, it comes as no surprise that events and signs of the past were interpreted in terms of an apparent Biblical history. That was the only real framework shared and understood before the rise of the modern era including the ready availability of even newspapers.

I heard the tales of and interest in the lost tribes of Israel that flowed through the discourse of the nineteenth century and simply dismissed them as wishful thinking, expecting no material cause. Again I reacted to the metaphor and assumed the intangibility of the metaphor reflected the reality of the causation.

After all, lost tribes are never so much lost as absorbed into the next tribe, often brutally, through the separation and destruction of the men and the remarriage of the women. The polite ways of civilization are the exception in human history.

Yet in North America there was an undercurrent of discoveries of ancient sites with inscriptions identified with early Phoenician or Canaanite script. The small cadre of scholars was at best dismissive and at worst outright destroyers.

Yet all the difficulties disappear if we accept one operating conjecture. The conjecture is that those European Bronze Age merchants were able to travel just once to the Americas and return to tell about it.

I have already described the European side of the equation in fair detail in my postings.

What is not obvious is that once the trade prospects were opened up, just as with the Spanish, it would have taken the European adventurers a mere generation to explore the Americas and their possibilities.

Their currency was copper. That was the royal road to wealth and state-like enterprise. Other colonies may have been created, but they would always be vulnerable to being submerged in the sea of intertribal warfare of the Indians. Thus we need to go to the known copper fields to find their centers of influence.

It is there that the most evidence has emerged for a European presence in the Americas, one lasting over the millennia of the Bronze Age itself and finally ending perhaps with the rise of Rome in Western Europe. The major collapse took place in 1159 BCE with the Hekla blast, but there is reason to believe that some form of commerce, severely reduced, continued for another thousand years. Certainly the capability existed.

We have a thousand-year mining history associated with Lake Superior, with millions of pounds of copper clearly removed. We have the extensive stonework associated with the Anasazi in New Mexico, another well-established copper province. And we also have the rich copper and tin mines of the Altiplano in Bolivia. Regardless, we have the rise of organized antique civilizations throughout the Americas whose origins are being driven back to the dawn of the European Bronze Age with continued archeological discoveries.

These civilizations always were suggestive of significant contact. We just never had a reason and proper time frames. We are now seeing both converging to our simple conjecture.

Their existence in time and place coincident with the end of the Bronze Age and the simultaneous rise of the Canaanite kingdom of Israel explains nicely the conforming evidence of similar scripts.

This also suggests that the Atlanteans were the inventors of phonetic scripts, whose successors were the Greek scripts in particular. I think it is fair to describe this civilization as Atlantean even though the civilization itself was radically different from the expectations we impose on it. It was a palace-centered mercantilism tied to the sea trade. It moved and marketed bronze. These merchant princes inspired societies in the Americas and throughout the Mediterranean. This meant that the actual footprint was small as compared to more direct colonizing events. A thousand years of success ended with the tsunami of Hekla and the resulting collapse of supply from the Americas, controlled for centuries by Atlantis.

The Bronze Age ended, not because iron was better (it wasn't for another thousand years), but because the supply was cut off. Very possibly, their American intermediaries lost control after the disaster that ended Atlantis.

Early report on ruins of New Mexico

The evidences of an ancient civilization which are met with in the ruins scattered throughout the Territory, have long been a subject of earnest inquiry among savants and explorers. The character and extent of these ruins prove conclusively that the region now known as the Territory of Arizona was, at some period in the past, the seat of a civilization much further advanced than that which occupied the land when first discovered by Europeans.

First among these prehistoric relics, both in its extent and state of preservation, is the Casa Grande, about six miles below Florence, in the valley of the Gila, and about five miles south of that stream. This ruin was discovered by Coronado's expedition in 1540. It was then four stories high, with walls six feet in thickness. Around it were several other ruins, some with the walls yet standing, which have since succumbed to time and the elements. The Pima Indians, who, then as now, were living in the immediate vicinity, had no knowledge of the origin or history of the structure. It had been a ruin as far back as tradition extended in their tribe, and when or by whom erected was as much a mystery to the dusky natives as to their European visitors. Father Pedro Font examined the Casa Grande in 1775, and describes the main building as ‘‘an oblong square, facing to the cardinal points of the compass. The exterior wall extends from north to south four hundred and twenty feet, and from east to west two hundred and sixty feet. The interior of the house consists of five halls, the three middle ones being of one size, and the extreme ones longer. The three middle ones are twenty-six feet in length from north to south, and ten feet in breadth from east to west, with walls six feet thick. The two extreme ones measure twelve feet from north to south, and thirty-eight feet from east to west.’’ At present the ruins are about two stories high, and are rapidly crumbling away. The walls are composed of a material looking like concrete or grout. The dimensions of the ruin still standing are about 50 by 30 feet. It is divided into many small rooms, and plastered with a reddish cement. The walls still show small round holes where the rafters had entered, charred pieces of which are yet found imbedded in the adobe. The interior room is the largest, and is still in a fair state of preservation.
All around the main building are mounds and traces of ruins, which go to show that a large city existed here at one time. The course of an immense irrigating canal, which watered the plain where the ruins now stand, has been followed to the Gila above Florence, forty miles distant.

Near Tempe, in the Salt-river valley, are found the ruins of extensive buildings, which are supposed to have been even larger than the Casa Grande. The foundations of one have been traced, which measures 275 feet in length and 130 feet in width. Excavations made in these mounds have brought to light several ollas which were filled with charred bones. The remains of a large irrigating canal are traced near the ruins. The road from Phœnix to Tempe follows the bed of the ancient water-course for a considerable distance; it is much larger than any in use by the modern occupants of the valley. The ruins of canals and buildings which are yet found in the plain between the Gila and the Salt rivers go to show that this region, now so desolate, was at one time thickly inhabited. At many other points in the Salt-river valley the marks of a civilization which once flourished here and made the desert to smile with industry, are yet plainly traced. All about the ruins are found fragments of pottery, painted in various colors and highly glazed.

In the valley of the Upper Gila, known as Pueblo Viejo, are found extensive mounds similar to those of the Salt river. Traces of buildings, irrigating canals, broken pottery, etc., are met with in every direction. Ruins of a like character are encountered at different points all along the Gila river. On the San Pedro, near its junction with the Gila, are remains of what must have been a large city. The foundations were of stone, laid in a coarse cement. Numerous ruins are found along the Verde and its tributaries, in the Agua Fria valley, and in the mountains and valleys extending for fifty miles in every direction from Prescott. Some of the structures on the Verde and Beaver creek, are among the most interesting in the Territory. On a hill overlooking the river, below Chino valley, is a series of ruins of stone houses; on another hill, about three miles east, are found the remains of many other stone buildings. In the valley of the Verde, traces of its early inhabitants are found in every direction. Opposite Camp Verde are a number of stone ruins, overlooking the river. Two miles down the stream, on an elevated mesa, an ancient burial ground has been discovered. On Beaver creek, a tributary of the Verde, are found many interesting cave dwellings. They are walled up in front, and look like the rocky bluffs out of which they have been excavated. Cisterns made of cement, and in a remarkable state of preservation, are found near many of these dwellings. One of these caves is eighty feet across its front, and nearly one hundred feet above the base of the cliff. The interior is divided into many rooms, the height of the roof being about fifty feet. The wall in front is pierced by two loopholes, through which a view of the country for some distance around, can be had.

In Chino valley, twenty miles north of Prescott, are found many interesting stone ruins. Large ollas, filled with charred corn and beans, have been unearthed from these mounds. Several skeletons have been discovered, and also a number of stone hammers and axes. There is every reason to believe that the inmates died by violence, the doors and windows being walled up, evidently as a protection against a hostile foe.

In the vicinity of Walnut Grove, twenty-five miles south of Prescott, are found the ruins of large stone structures crowning elevated mountain-tops, some of them from twenty to thirty feet square. On the Hassayampa, and the mountainous country south from Prescott, these ruins are numerous, and were evidently built on their commanding positions by people who were constantly harassed by savage foes. That the bed of the Hassayampa has been washed for gold in ages past, is proven by the large pines, whose age is numbered by hundreds of years, found growing where the ancient miner once searched for the precious metal. Prescott, the modern capital of Arizona, occupies, it is believed, the site of an ancient city, and many relics of its former inhabitants which have been brought to light, go to strengthen this theory.

Near Fort McDowell are found the remains of a large fortification, and of an immense irrigating canal. The bones of a man, supposed to be seven feet high, were unearthed near this point. On the Rio Bonito and other branches of the Salt river, numerous cave dwellings are found. The Colorado Chiquito valley exhibits traces of mounds and irrigating ditches, showing that this region was at one time densely populated. All over the Territory, north from the Casa Grande on the Gila, and extending into New Mexico and Southern Colorado, the ruins of stone buildings, large towns, cave dwellings, and immense canals are met with in the valleys and on the mountain-tops and hillsides, near the principal water-courses.

Nothing is left to tell the story of the people who constructed them, save the few earthen vessels which have been found in the ruins, the stone hammers and axes occasionally met with, and the fragments of broken pottery which lie scattered about their former abode. From the charred remains of human bones taken from the ruins, it has been supposed that the ancient people cremated their dead; and, from the few hieroglyphics which they have left behind, it has been thought they were sunworshipers. As to their pursuits and mode of life, it is generally believed they followed the business of mining, as well as agriculture. As has been before alluded to, the evidence is conclusive that many of the gulches in the Sierra Prieta range were worked for the golden treasures hundreds of years ago. That this ancient race, who have left such massive monuments of their skill and industry behind them, had made rapid progress in the arts of an advanced civilization, there can be no doubt. Who were those people who erected imposing structures, opened canals, and brought immense stretches of land under cultivation? From whence did they come, and what has been the cause of their extinction, so complete that nothing is left to tell the story? Many theories have been advanced as to their origin and history, but nothing definite is yet known of one of the most remarkable of prehistoric races of the American continent.

Here is a wide field for the savant who desires to trace the evidences of a civilization whose origin is lost in the mists of antiquity, and whose crumbling monuments yet proclaim its ancient vigor and wide extent. Perhaps the key to unlock the barred and bolted chambers of prehistoric American history may yet be found in the ruins of Arizona.

Make Way for the Buffalo

It is good to see this topic get a little boost. There has been scant coverage this past year, but the industry itself is steadily growing. In fact, it would take only a modest effort to expand the current North American herd tenfold over the next several years. It might be as simple as funding a marketing board or a dedicated meat production facility in the right places.

All the right things are happening. Now we just need to get the herds big enough to put buffalo meat on everyone’s table. We may even see pemmican restored as a staple.

The latest burst is promoted by an enthusiast who sees merit in selling far western land to provide capital to buy down the failing lands of the buffalo commons. I guess the idea is a giant unfenced open plain covered with huge herds. I will pass on that, thank you.

We need those fences to manage herd size and provide grazing control. We may never fully restore the prairie to the original buffalo grass of the past, but we can certainly try. In the meantime, a monoculture of buffalo is not smart either. We have already shown success in mixing cattle and buffalo. And once the deep-rooted grasses are reestablished we may even have luck running a few goats and sheep on those grasslands.
Recognizing that these lands are superb fodder lands for well-managed animal husbandry, one that survives principally on live fodder, even in winter with appropriate augmentation from hay, is the only viable protocol for these poorly watered lands.

Make Way for Buffalo

By NICHOLAS D. KRISTOF

Published: October 29, 2003

This forlorn farm town -- Rawson, population 6 -- is a fine place to contemplate the boldest idea in America today: rescuing the rural Great Plains by returning much of it to a vast ''Buffalo Commons.''

The result would be the world's largest nature park, drawing tourists from all over the world to see parts of 10 states alive again with buffalo, elk, grizzlies and wolves. Restoring a large chunk of the plains -- which cover nearly one-fifth of the lower 48 states -- to their original state may also be the best way to revive local economies and keep hamlets like Rawson from becoming ghost towns.

Rawson used to be a bustling town with a railroad depot, two stores, a hotel, a bank, a post office, a gas station, a Lutheran church, a lumber yard, a grain elevator and a school. It had its own newspaper, The Rawson Tribune, and its slogan was ''Rawson, where opportunity awaits you.''

It has been downhill ever since. Two years ago, after the election for mayor ended in an exact tie (one vote for Nels Heggen and one vote for Garvin Gullickson), and after the four adult residents tired of taxing themselves to pay for seven streetlights, they dissolved the city and turned it into an unincorporated village.

''My children won't come back here to live,'' admitted Mr. Heggen, whose grandfather ran the hotel in town.
''There isn't much to do here. Right around here, it's kind of desolate.'' (Some journalists reach judgments about a place after interviewing just a few inhabitants; I boast that I talked to half the town.)

It sounds cruel to say so, but towns like Rawson are a reminder that the oversettlement of the Great Plains has turned out to be a 150-year-long mistake, one of the longest-running and most costly errors in American history. Families struggled for generations to survive droughts and blizzards, then finally gave up and moved on. You can buy a home out here for $3,000, and you can sometimes rent one for nothing at all if you promise to mow the lawn and keep up the house.

The rural parts of the Great Plains are emptying, and in some cases reverting to wilderness.

It's immensely sad to travel through the Dakotas' ghost towns or Nebraska's cattle country -- where Loup is the poorest county in America -- because they are full of warm, hard-working, honest farmers and ranchers who are having their hearts broken. How can one not admire the people of Sentinel Butte, N.D., where there is no attendant at the gasoline station but the townspeople all have keys and pay on the honor system?

Yet honesty and sweat aren't enough to make farming and ranching successful in marginal lands. The farms produce plenty of grain and beef, but they will never make much money, even with billions of dollars in agricultural subsidies. The economic model will be even less viable as underground aquifers run out in the next two or three decades. Much plains farming relies on the vast Ogallala aquifer, which is dropping at a rate of four feet per year.

So it's time to reach for something bold, like the Buffalo Commons idea, proposed in 1987 by Frank and Deborah Popper, two New Jersey social scientists. This would be the biggest step to redefine America since the Alaska purchase. Pushing it would give the environmental movement a chance to be known mainly by what it's for instead of for what it's against. But it would take close cooperation with the people with the most at stake: struggling farmers and ranchers, who for now are irritated by East Coast city slickers trying to turn their land into a buffalo playground.

''Why not let us manage our own affairs, just as people in New York would want to manage their own affairs?'' asked Keith Winter, a veteran rancher, during a break from working with his calves.

It's a fair question, and a Buffalo Commons can be achieved only if it benefits North Dakotans more than New Yorkers. That should be possible, for states like Colorado, Utah and Idaho have boomed by branching out from their traditional economic base to embrace tourism and recreation, and Buffalo Commons would become one of the world's wonders.

If Buffalo Commons comes about, perhaps a hotel can reopen in Rawson, and Mr. Winter's ranch could draw German tourists who would pay to herd cattle. If the thunder of buffalo hooves is again heard on the open plains, that will not be the death knell for towns like Rawson -- it will be their last, best hope.

E-mail: nicholas@nytimes.com

Obama Addresses Global Warming

It is fascinating that the first non-partisan address by president-elect Barack Obama should be about global warming and be couched in the language of a true believer.

At least we will be on a global drive to eliminate CO2 dumping into the atmosphere. That is good and necessary policy, and I think its implementation will be good business however it is justified. I only hope the next few years of lousy weather do not cool everyone’s ardor.

The science is certainly beyond dispute, as the temperature has dropped most of a degree this past year, for the first time in a couple of decades. His speech writers are clearly as out of touch with the science as he is. In the meantime we are setting up for a bitter cold winter that is sure to mock all his efforts on the subject and certain to trigger a blow-up in the grand tradition of scientific dispute.

Oh well, let him have his warm-ups for the inauguration. His speech writers may even get someone scientifically literate on board who knows the appropriate weasel words.

Report on Video Address

"The science is beyond dispute and the facts are clear," Mr. Obama says. "Sea levels are rising. Coastlines are shrinking. We've seen record drought, spreading famine, and storms that are growing stronger with each passing hurricane season."

Obama says the White House has often failed to show leadership on the issue. "That will change when I take office," he says. "My presidency will mark a new chapter in America’s leadership on climate change that will strengthen our security and create millions of new jobs in the process."

He proposes a federal cap and trade system to reduce emissions to their 1990 levels by 2020 and a further 80 percent by 2050, plus an annual $15 billion investment in private sector efforts to build a clean energy future: solar power, wind power, next generation biofuels, safe nuclear power, and clean coal technologies.
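Those targets are easy to sanity-check with a little arithmetic. The 1990 baseline below is an assumed, illustrative figure, not a number given in the address:

```python
# Rough arithmetic behind the proposed cap-and-trade targets.
# BASELINE_1990 is an assumed, illustrative 1990 US emissions figure (Gt CO2e).
BASELINE_1990 = 6.1

target_2020 = BASELINE_1990               # return to 1990 levels by 2020
target_2050 = BASELINE_1990 * (1 - 0.80)  # a further 80 percent cut by 2050

print(target_2020, round(target_2050, 2))
```

Under that assumption the 2050 target is roughly a fifth of the 1990 figure, which gives a sense of how aggressive the proposal is.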

"Delay is no longer an option," he says. "Denial is no longer an acceptable response."

Good Comment Here

President-elect Obama was very explicit in his intention to implement a carbon cap-and-trade system to reduce CO2 emissions, with an 80% reduction goal by 2050, and it should be mentioned that John McCain also deserves credit for supporting cap-and-trade. To me, these developments are a clear sign of how far the world, following the universal lead of science, has moved past arguing whether CO2 reduction is necessary, and is discussing how to accomplish it. It's probably unrealistic to expect that all voices resisting a transition from fossil fuels to an economy reliant on renewable energy will immediately fall silent, but those of us who want our voices heard rather than ignored would be wise to engage in the discussion of carbon remediation options. The upcoming meeting in Poland will be another important step, although it's regrettable that China will not be participating from what I understand. Even so, China has already begun to match rhetoric with some constructive actions to reduce the magnitude of its CO2 emissions while it continues to promote its industrialization efforts, and its desire to receive help from the West in implementing carbon control technologies deserves a favorable response from the industrialized nations.

Fred Moolten

Tuesday, November 18, 2008

Big Battery Boom

This short article surveys the expansion of effort on the creation of industrial-sized batteries. It is not a business for the small, light start-up company, but it is certainly within the purview of the larger concerns. What this is telling us is that while the demand is there, no clear winner stands out.

The advantages of Vanadium can be offset by the superior energy density of the alternatives. That opens the door for plenty of competition. This means that solar and wind energy can be combined with onsite storage for ultimate release into the grid itself.

Conventional energy producers can also benefit from energy storage established throughout the grid, provided the turnaround cost is low enough. Major industrial users can tap the grid during off-peak times to reload their battery banks in preparation for the day's work.
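A rough sketch of why that turnaround cost matters: storage only pays if the peak/off-peak price spread exceeds the round-trip loss. All the rates and the efficiency below are hypothetical assumptions, not figures from the article:

```python
# Off-peak charging economics (all numbers are assumed, for illustration).
OFF_PEAK_RATE = 0.05  # $/kWh, assumed overnight price
PEAK_RATE = 0.15      # $/kWh, assumed daytime price
EFFICIENCY = 0.75     # assumed round-trip (turnaround) efficiency

def daily_savings(bank_kwh):
    """Savings from charging a battery bank off-peak and drawing it down at peak.
    Charging must buy extra energy to cover the round-trip loss."""
    cost_to_charge = bank_kwh / EFFICIENCY * OFF_PEAK_RATE
    value_delivered = bank_kwh * PEAK_RATE
    return value_delivered - cost_to_charge

savings = daily_savings(1000)  # a 1 MWh bank under these assumptions
```

If the efficiency drops far enough, the savings go negative, which is the sense in which the turnaround cost gates the whole scheme.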

These batteries are a necessary major step in smoothing out the energy delivery system and many will survive well past their original need.

What this article reveals is that the development of this infrastructure is now underway and will shortly be at a factory near you. Obviously the installed industrial battery bank will support the build out of adjacent windmills and solar plants.

NOVEMBER 16, 2008, 9:19 P.M. ET

Building a Better Battery

Finding alternative sources of energy is only part of the battle. You also need to store it.

The industry thinks it can make wind and solar power a lot more useful -- by building a better battery. One of the big problems with wind and solar is that they're often not generated when they're needed. Winds are usually strongest at night, for instance, when demand for power is at its lowest. That makes it tough for utilities to effectively integrate alternative power sources into their energy mix.


Now companies across the globe are working on a potential solution: batteries that can store wind and solar power and release it onto the grid at times of heavy demand. Developers are investing millions -- and in some cases billions -- of dollars into a slew of promising technologies.

The battery industry "is going through a major growth phase," says Craig Irwin, vice president of equity research at New York-based financial-services firm Merriman Curhan Ford Group Inc.

Powering Up

With the growing push toward alternative energy and away from fossil fuels, the market for batteries is potentially huge. According to a report from Lux Research Inc., which tracks emerging technologies, the batteries could represent a $50 billion market if only 10% of wind-power plants installed them. However, because of the long planning cycles and risk-aversion of utilities, Lux predicts the market will reach only about $600 million by the end of 2012.

Some in the industry -- unsurprisingly -- are anticipating a much stronger market. Premium Power Corp., a North Reading, Mass., battery maker, envisions a day in the not-too-distant future when large-scale batteries and other forms of energy storage are ubiquitous.

"In 10 years, you'll see every renewable-energy source be tightly integrated with an energy-storage system and be controlled by the grid," says Bic Stevens, senior vice president of business development at Premium Power.

Utilities and project developers are already beginning to deploy battery technologies. One device, which companies are hooking up to wind farms, is called a sodium sulfur battery.

In this technology, the electrochemicals that create the reactions that store the energy are housed inside the battery. That's roughly how traditional batteries work, but these are much bigger and have more reactive chemicals. For instance, one utility is employing a sodium sulfur battery system that's 30 feet wide and 15 feet high.

Right now, these batteries are getting a lot of attention from companies like Japanese wind-project developer Japan Wind Development Co. In May, the company started a 51-megawatt wind farm and linked it to a 34-megawatt battery system developed by NGK Insulators Ltd. of Nagoya, Japan. The energy-storage system will have enough capacity to power approximately 26,000 homes, by storing the energy generated by the wind farm and then redistributing that power during the day.

Utilities like American Electric Power Co. of Columbus, Ohio, are also working with NGK, although on a much smaller scale. According to Ali Nourai, AEP's executive in charge of distributed power generation and energy storage, the company has installed five NGK batteries with 7 megawatts of capacity in total, enough to power approximately 5,400 homes. AEP's batteries are already up and running in Ohio, and others in Indiana and West Virginia will be operational by the end of the year. The utility also has a 4-megawatt battery set to be installed in Texas.
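As a rough sanity check on the figures quoted above, both NGK deployments imply about the same average draw per household, roughly 1.3 kilowatts. A minimal sketch, pairing capacity and homes served exactly as the article states them:

```python
# Capacity (kW) and homes served, as quoted in the article.
deployments = {
    "Japan Wind Development (NGK)": (34_000, 26_000),
    "AEP Ohio (NGK)": (7_000, 5_400),
}

# Both deployments imply roughly the same average draw per home,
# which suggests the two figures were computed the same way.
for name, (kw, homes) in deployments.items():
    print(f"{name}: {kw / homes:.2f} kW per home")
```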

Going With the Flow

Sodium sulfur batteries can store a lot of energy, which is why utilities like them. But because of the reactivity of the chemicals involved and the ceramic separator required to keep the chemicals apart, the batteries are expensive to manufacture. One estimate pegs the cost at $2,500 per kilowatt -- which means even a small-scale battery could run more than $10 million.

And that gives some utilities pause. "The technology is beautiful, but it is not inexpensive," Mr. Nourai says.

So, he's hunting for alternatives. He believes that one of the most promising is a technology called flow batteries.

Unlike sodium sulfur batteries or other traditional batteries, flow batteries have their chemical reactants stored in external containers. That means that the batteries can be easily tailored to any size to store more energy. What's more, Mr. Nourai says that the batteries may be able to deliver power more inexpensively than sodium sulfur models; flow technology uses less-reactive chemicals and thus should be easier to manufacture.

Mr. Irwin, at Merriman Curhan Ford, pegs the cost of flow batteries at roughly $1,000 per kilowatt -- less than half the cost of sodium sulfur models. But he points out that cost of materials for a flow battery is only about $200 per kilowatt. So the overall price tag for the technology could drop dramatically if companies find a way to bring down the cost of manufacturing.
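The arithmetic behind these cost figures is easy to check. A minimal sketch using only the per-kilowatt numbers quoted in the article; the 4 MW size is the scale of the Texas battery mentioned earlier:

```python
# Per-kilowatt cost figures quoted in the article.
SODIUM_SULFUR_PER_KW = 2_500   # dollars per kW
FLOW_TOTAL_PER_KW = 1_000      # dollars per kW
FLOW_MATERIALS_PER_KW = 200    # dollars per kW

# A battery in the 4 MW class, like the one AEP plans for Texas.
size_kw = 4_000
print(f"4 MW sodium sulfur battery: ${size_kw * SODIUM_SULFUR_PER_KW:,}")

# For flow batteries, everything above materials is manufacturing cost,
# which is where Mr. Irwin sees room for the price to drop dramatically.
overhead = FLOW_TOTAL_PER_KW - FLOW_MATERIALS_PER_KW
print(f"Flow battery manufacturing share: ${overhead}/kW "
      f"({overhead / FLOW_TOTAL_PER_KW:.0%} of the total)")
```

The "more than $10 million" figure in the article corresponds to exactly this 4-megawatt scale.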

Pilot Projects

To get a sense of how the batteries could work, utilities are rolling out pilot projects. AEP is working with Premium Power, which is backed by $21 million from investors including VantagePoint Venture Partners, according to data from Dow Jones VentureSource. Publicly held companies like VRB Power Systems Inc. and ZBB Energy Corp. are developing similar technologies.

The companies' offerings have some important differences. Both Premium Power and ZBB Energy use zinc bromide electrolytes in their flow batteries, while VRB Power Systems uses vanadium.

The benefits of the different chemistries are still being proved, but some analysts and industry observers say that the zinc bromide batteries have cost advantages over the vanadium batteries. Advocates for vanadium batteries argue that they can be recycled more easily and have faster response times, which can be helpful for better regulating the flow of energy onto the grid.

It's far too early to tell which battery manufacturer will win out at the large scale -- and another big entrant is about to complicate the picture even further. In late October, Intel Capital, the venture arm of the chip-manufacturing giant, put its money behind yet another player in the market. Intel backed Beijing-based Net Power Holdings Ltd., which is developing its own version of the flow battery, potentially with a greater cost advantage, given the ability to capitalize on more inexpensive Chinese manufacturing capacity.

—Mr. Shieber is a staff reporter for Dow Jones Clean Technology Insight in Jersey City, N.J.

Graphene Production

This is another bit of good news coming out of the labs. I am extremely optimistic regarding the future importance of graphene in technology development. What is there not to like about a substance that is hyper-strong and able to handle the temperatures of all molten metals?

All other materials pale in comparison, and I am sure that we will have graphene as a working substrate for single-layered metallic composites, including superconductors and magnetic coolants.

2008 will surely be known as the year in which nanotechnology research reached full stride. Several labs have rushed into production their version of low cost solar cells. We are also due to see reports on some clever work involving surface structure management.

What most astonishes me is how the press has utterly missed the Nanosolar story. The sheer size and quality of its corporate sponsorship is unprecedented. The fact that Nanosolar can announce a two-million-dollar tool able to produce, in one year, cells with the power capacity of a nuclear power plant, and that they are shipping now, has barely created a ripple.

We are entering one of the greatest five year technological transitions in human history. The starter gun was fired with $140 oil. The Global financial system was shaken out and soundly disciplined shortly thereafter.

We are now going to rebuild the global energy economy at breakneck speed. In five years we will all be driving electric autocarts because we have no choice. And North America will be well on the way to energy self-sufficiency.

This lab work is showing us the way to make large pieces of graphene. This will surely be worked on intensely over the next two years in order to commercialize it. The second article gives us an excellent snapshot of the actual processes.

http://www.eurekalert.org/pub_releases/2008-11/uoc--rdm111008.php

Researchers discover method for mass production of nanomaterial graphene

Process has already produced the largest graphene sample reported

Graphene is a perfect example of the wonders of nanotechnology, in which common substances are scaled down to an atomic level to uncover new and exciting possibilities.

Graphene is created when graphite — the mother form of all graphitic carbon, which is used to make the pigment that allows pencils to write on paper — is reduced down to a one-atom-thick sheet. Graphene is among the strongest materials known and has an attractive array of benefits. These sheets — single-layer graphene — have potential as electrodes for solar cells, for use in sensors, as the anode electrode material in lithium batteries and as efficient zero-band-gap semiconductors.

Research on graphene sheets has been restricted, though, due to the difficulty of creating single-layer samples for use in experiments. But in a study published online Nov. 9 in the journal Nature Nanotechnology, researchers from UCLA's California NanoSystems Institute (CNSI) propose a method which can produce graphene sheets in large quantities.

Led by Yang Yang, a professor of materials science and engineering at the UCLA Henry Samueli School of Engineering, and Richard Kaner, a UCLA professor of chemistry and biochemistry, the researchers developed a method of placing graphite oxide paper in a solution of pure hydrazine (a chemical compound of nitrogen and hydrogen), which reduces the graphite oxide paper into single-layer graphene.
Such methods have been studied by others, but this is the first reported instance of using hydrazine as the solvent. The graphene produced from the hydrazine solution is also a more efficient electrical conductor. Field-effect devices display output currents three orders of magnitude higher than previously reported using chemically produced graphene. Kaner and Yang's co-authors on the research were doctoral students Vincent Tung, from Yang's lab, and Matthew Allen, from Kaner's lab.

"We have discovered a route toward solution processing of large-scale graphene sheets," Tung said. "These breakthroughs represent the future of graphene nanoelectronic research."

The coverage of the graphene sheets can be controlled by altering the concentration and composition of the hydrazine solution. This hydrazine method also preserves the integrity of the sheets, producing the largest-area graphene sheet yet reported, 20 micrometers by 40 micrometers. A micrometer is one-millionth of a meter, while a nanometer is one billionth of a meter.
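To put that record size in perspective, the decomposition route described later in this post yields samples of less than a square micron. A quick comparison, my own arithmetic using the dimensions quoted above:

```python
# Record sheet reported here: 20 micrometers by 40 micrometers.
sheet_area_um2 = 20 * 40   # square micrometers

# Decomposition-method samples are described as under a square micron.
decomposition_area_um2 = 1.0

print(f"Hydrazine-route sheet: {sheet_area_um2} square micrometers,")
print(f"at least {sheet_area_um2 / decomposition_area_um2:.0f}x the area "
      f"of decomposition samples")
```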

"These graphene sheets are by far the largest produced, and the method allows great control over deposition," Allen said. "Chemically converted graphene can now be studied in depth through a variety of electronic tests and microscopic techniques not previously possible."

"Interdisciplinary research of this sort is a benefit of collaborative institutes like the CNSI," said Kaner, who is also an associate director of the CNSI. "Graphene is a cutting-edge nanomaterial and one which has great potential to revolutionize electronics and many other fields."

There are two methods currently used for graphene production — the drawing method and the reduction method, each with its own drawbacks. In the drawing method, layers are peeled off of graphite crystals until one is produced that is only one atom thick. When likely graphene suspects are identified from the peeled layers, they must be extensively studied to conclusively prove their identity. In the reduction method, silicon carbide is heated to high temperatures (1100° C) to reduce it to graphene. This process produces a small sample size and is unlikely to be compatible with fabrication techniques for most electronic applications.

"This technology (hydrazine reduction) utilizes a true solution process for graphene, which can dramatically simplify preparing electronic devices," said Yang, who is also faculty director of the Nano Renewable Energy Center at the CNSI. "It thus holds great promise for future large-area, flexible electronics."

###

The California NanoSystems Institute at UCLA is an integrated research center operating jointly at UCLA and the University of California, Santa Barbara, whose mission is to foster interdisciplinary collaborations for discoveries in nanosystems and nanotechnology; train the next generation of scientists, educators and technology leaders; and facilitate partnerships with industry, fueling economic development and promoting the social well-being of California, the United States and the world. The CNSI was established in 2000 with $100 million from the state of California and an additional $250 million in federal research grants and industry funding. At the institute, scientists in the areas of biology, chemistry, biochemistry, physics, mathematics, computational science and engineering are measuring, modifying and manipulating the building blocks of our world - atoms and molecules. These scientists benefit from an integrated laboratory culture enabling them to conduct dynamic research at the nanoscale, leading to significant breakthroughs in the areas of health, energy, the environment and information technology. For more information, visit
www.cnsi.ucla.edu.

http://arstechnica.com/journals/science.ars/2008/11/12/rocket-fueled-graphene-production-promises-higher-volume-ready-for-edits

Rocket-fueled graphene production promises higher volume

By Todd Morton Published: November 12, 2008 - 08:22AM CT

Graphene has shown great potential in the short time that it has been at the forefront of materials research. It's essentially an unrolled carbon nanotube, so graphene shares many of the unique electrical and physical properties that have made carbon nanotubes the poster child of materials research in the last decade. Production of graphene is still decidedly archaic, though.

Nobel Intent covered a decomposition technique that allowed for more accurate deposition. At the time, this was a "high volume" technique, but it provided nothing close to the volume needed for any industrial application or larger-scale research effort. Recent research, published in Nature Nanotechnology, demonstrated a solution-based technique that has the promise of both large-scale production and bigger samples, which should open the door for more extensive characterization efforts.

When graphene was first discovered, the best method available for making it was simply taking graphite and peeling it apart with cellophane tape until you had a monolayer of graphene that you could transfer to a substrate (science at its finest, my friends). The arduous part came in determining just what exactly you had produced—scanning electron microscopy, which uses electrons instead of photons to resolve an image, would reveal candidate graphene sites, while atomic force microscopy (think of it as a record player that reads individual atoms instead of your worn-out copy of Led Zeppelin III) would confirm that it was, in fact, a perfectly flat single layer of carbon. To say that this method doesn't lend itself to large-scale production, or even large-scale laboratory work, would be an understatement.

Decomposition methods involving baking silicon carbide, which are also used to produce carbon nanotubes, often yield misshapen, mutant sheets of graphene and demand high temperatures that rule out any sort of in-line processing with traditional electronics manufacturing equipment. They often yield materials that are less than a square micron, which rules out several characterization techniques that require a larger mass of material.

Researchers have continued working with hydrazine (a well-known rocket propellant), using it as a solvent for graphite oxide that can also strip off the oxygen, preparing the graphene for deposition. The resulting process could be controlled to make samples of graphene as large as 40 microns square, as well as smaller samples if required. The graphene was reasonably high quality, although testing revealed a lack of n-type semiconductor behavior that might have resulted from residual hydroxyl groups left by the hydrazine.

Hydrazine processing eliminates several problems associated with using water as a solvent, such as agglomerations of graphene during drying. The dissolved graphene oxide could also be transferred to a less toxic solvent for deposition.

A truly bulk process, such as the solution process demonstrated in this research, is a big step towards making graphene devices a reality. A chemical process like this, although incredibly toxic, is much cheaper for a research institution to deal with than tying up expensive and specialized lithography equipment, which represented the previous state of the art for graphene production. Keep checking Nobel Intent, and we will continue to bring you news on studies of graphene's unique properties, and the efforts to produce it en masse.
Nature Nanotechnology, 2008. DOI: 10.1038/nnano.2008.329

Early Onset of Winter Sea Ice

I am sure that few have noticed, but the Arctic sea ice has appeared a bit ahead of schedule. This conforms with reports that the globe itself dropped most of a degree this past year. A much thicker layer of first-year ice fits this pattern, and it may approach what historical records consider normal.

Right now the folks supporting the Pacific Decadal Oscillation (PDO) model are making the global warming enthusiasts into purveyors of nonsense. Sorry, Al Gore.

Another tidbit that I picked up on recently was that the era of the Little Ice Age was unprecedented during the Holocene. I presume this is meant in terms of its duration, since volcanic events popped up and provided cool-offs and equally swift recoveries, while the Little Ice Age was sustained over decades. I am not sure whether to believe that is a valid observation and will now try to get a handle on its validity. Help is welcome, since an unprecedented event in the Holocene begs explanation.

We are now having our second cold year, and this one is looking like it is heading toward the lower third of observed temperatures. The IPCC and Jim Hansen are actually looking a little rattled over this one.

As I stated when I began this blog, I believed that monitoring changing climatic conditions is a worthy endeavor, and that involving ourselves in the effort to sequester CO2 was simply good husbandry. I also stated that the linkage between these two phenomena was suspect and quite likely within the band of data noise and thus indeterminate. I thought that attempting to support linkage as a truth was simply going to become untenable as soon as the weather decided to change. It appears that is what has now happened.

Unless something surprising comes along, the only interpretation that can now be supported is a recovery of perennial sea ice in the Arctic. We came very close to a warmer Arctic, but not close enough.

An ice free summer Arctic is still a worthy objective but must surely wait for a human led restoration of the Bronze Age Sahara. That would collect and retain the extra heat that is necessary.

Monday, November 17, 2008

Australian Low CO2 Coal Burner

We have here an interesting project from Australia. We start with compressing air in order to separate out the nitrogen. This does not have to be a perfect separation; yields of 98% are likely, and much less would do.

The oxygen is then blended with exhaust gases to give the same ratios associated with air. This will ensure that a more complete burn is achieved as this gas blend is fed into the powdered-coal burner. Carbon monoxide could be sharply reduced.

A portion of the exhaust gas is taken off and also compressed in order to liquefy the carbon dioxide.

Of course, this is all doable. The main trick is to get the nitrogen out of the system before it reaches the burner itself, hugely reducing the amount of exhaust gas needing compression. Something like this is able to remove most of the produced carbon dioxide in a transportable form for disposal.
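The payoff of removing the nitrogen up front can be sketched with an idealized mole balance. Assuming pure carbon fuel, a complete burn with no excess oxygen, and a dry flue gas (all simplifications of mine, not project figures), the CO2 concentration in the exhaust tracks the purity of the oxidizer directly:

```python
# Idealized dry-flue-gas comparison: air-fired versus oxyfuel combustion
# of pure carbon (C + O2 -> CO2). Complete burn, no excess oxygen.
# These simplifying assumptions are mine, not the Callide project's figures.

def co2_fraction(o2_purity: float) -> float:
    """CO2 mole fraction in dry flue gas when the oxidizer is
    o2_purity oxygen and the balance is inert nitrogen."""
    n2_per_o2 = (1 - o2_purity) / o2_purity   # mol N2 carried in per mol O2
    return 1 / (1 + n2_per_o2)                # each mol O2 yields one mol CO2

print(f"Air (21% O2):     {co2_fraction(0.21):.0%} CO2")
print(f"Oxyfuel (98% O2): {co2_fraction(0.98):.0%} CO2")
```

With air, most of what reaches the compressors would be nitrogen; with a 98% oxygen feed, nearly the whole exhaust stream is CO2, so far less gas has to be compressed per tonne captured.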

Much is made of geological disposal, and my question is whether it is possible to get deep enough that liquid pressures keep the CO2 liquid in the ground, facilitating easy disposal.

This is a coal thermal plant so the heat of compression should be put back into the various heating processes employed by the plant.

I would like to see the reverse Rankine cycle engine integrated with this.

Then we get to the real problem that has made ideas like this fail for decades. It is the problem of compressor efficiency. It has never been good enough to avoid large systemic losses and we are asking here for high volume compressors. It sounds like an engineering nightmare from this distant remove.

I got a solid introduction to the technology thirty years ago and, not to be too picky, I see minimal evidence of real improvement. I think that the industry has just never been sexy enough, and besides, folks are prepared to make do with designs lifted straight out of engine design in particular.

I really would like to find evidence of major improvement, but the only evidence that I note is an emphasis by the organizations on their own reliability. Somehow I suspect that the equipment itself continues to be maintenance demanding.

A lot of energy is currently lost by compressors, and we are sticking them on both the front and the back. Marrying this up with the reverse Rankine cycle could help a bit, but developing a new super-efficient compressor for industrial applications was grossly overdue decades ago and is even more compelling today.
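The scale of those losses is easy to illustrate. Here is a minimal sketch of ideal-gas compression work for CO2 up to roughly liquefaction pressure; the temperature, pressures and efficiency range are my illustrative assumptions, not measured data:

```python
import math

R = 8.314           # J/(mol K), universal gas constant
T = 293.0           # K, near-room-temperature compression with intercooling
P1, P2 = 1.0, 60.0  # bar: flue-gas pressure up to roughly liquefaction pressure

# Minimum (isothermal, ideal-gas) compression work per mole of CO2.
w_ideal = R * T * math.log(P2 / P1)

# Real compressors deliver only a fraction of this ideal performance;
# the shortfall is rejected as heat, which a thermal plant could reclaim.
for eff in (0.60, 0.75, 0.90):
    w_actual = w_ideal / eff
    print(f"efficiency {eff:.0%}: {w_actual / 1000:.1f} kJ/mol CO2 "
          f"({(w_actual - w_ideal) / 1000:.1f} kJ/mol lost as heat)")
```

Even at 90% efficiency the losses are material, and recovering the heat of compression for the plant's own heating processes, as suggested above, is what would make the numbers workable.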

Maybe someone has done it?

One other reason I would like something like this to actually work is that grabbing the flue gas gives you an opportunity to collect the heavy metals. That would eliminate the primary source of mercury entering the environment. This is not a minor problem: it is impacting us through our consumption of fish and is endemic in high-fish diets, although that knowledge is at best suppressed.

I posted last year on the utility of stripping the flue gas stream of SOx and NOx and heavy metal using a chlorine cycle.

The resulting gas was CO2 stripped of its heat and compressor ready.

In the event, these folks are taking a run at using a lot of compression and it will be worth following to see if it has a happy outcome.

Xstrata, J-Power in low-emission coal tech first

SYDNEY (Bloomberg) J-Power and Xstrata PLC have started a 206 million Australian dollar, or $137 million, project in Australia that will be the first in the world to use a low-emission coal-fired generating technology.

The technology may cut typical carbon dioxide emissions from coal-fired generation by about 90 percent.

Schlumberger Ltd., Mitsui & Co. and IHI Corp. are also in the group funding the venture in Queensland state, the Callide Oxyfuel Project said Friday in a statement. The 30-megawatt plant is due to start operating in 2011.

The federal and state governments are contributing A$85 million to the Callide project. The Japanese government and the Australian Coal Association are also providing money for the plant.

"This project will lay the foundation for the widespread deployment of low-emission coal technology so essential for Australian power generation and for the millions of people across the world relying on Australian coal," Australian Resources and Energy Minister Martin Ferguson said Friday in a separate statement.

The venture's oxyfuel technology involves burning coal in a mix of oxygen and recirculated waste gases, instead of air, resulting in higher concentrations of carbon dioxide that can be more easily captured from the exhaust gases.

The carbon waste is then liquefied and buried underground. The technology may cut carbon dioxide emissions from coal-fired generators by about 90 percent, the venture said.

The technology can be fitted at existing coal-fired generators instead of building a new low-emissions plant.

Australia, which plans to introduce emissions trading in 2010 to fight global warming, depends on coal for more than 80 percent of its power supplies.

Barry Fell and Atlantis

It has been thirty years since Barry Fell stirred up a hornet’s nest regarding his thesis of a long lasting European interaction with the Americas that began perhaps around the beginning of the Bronze Age and had died out before the current era, probably as a result of Roman suppression of the Celts and their deep sea fleet by Julius Caesar.

His strength was the interpretation of inscriptions in particular. To put it bluntly, he was the first to take them seriously and attempt their interpretation. He was also a linguist and expert in the art of epigraphy as well as a trained scientist in marine biology.

His death caused his initiative to be shelved in the late eighties. He made extraordinary claims, showed extraordinary proof and was denounced in an extraordinary manner. They are still badmouthing him to this day. He never deserved that, and for that matter no scholar deserves that.

I have observed that most scholars are successful because of their trained memories or unusual memory talent, usually in the form of an actual eidetic memory. I have also observed that when an idea is entered into that world that conflicts with their memory patterns, the natural instinct is to dismiss the idea. This makes it very difficult to pursue new ideas with this style of scholarship.

Not surprisingly, these folks usually avoid those areas of scholarship demanding sophisticated mathematical thinking and the like, although a friend of mine did dive into that world with an obviously eidetic memory and aced everything he touched. His measured IQ was at 185. I once introduced an idea to him that had been incredibly fruitful, yet he dismissed it immediately with a label and went elsewhere in his thinking. I was startled and I never disabused him.

My conjecture is that the more powerful the memory function, the more difficult it is to shift mental gears and follow curiosity outside one's defined expertise, which also excludes new material not already part of one's world.

In any event, there is currently limited material support for work in this field. The recent recognition of ancient city remnants outside the Straits of Gibraltar that conform in detail and locale with the legend of Atlantis is another example. It took a non-specialist (a mathematician) to tell the story. I also have the advantage of not having to protect an academic reputation, so I can shout as loud as I like.

This piece here is the Los Lunas inscription of the Ten Commandments, located in New Mexico, using a script that conforms to scripts in use around 1000 BCE.

http://www.asis.com/users/stag/Las%20Lunas%20Ten%20commandments

Of course the naysayers have shouted hoax. And perhaps a scholar steeped in Canaanite scripts took a trip into the desert sometime in the nineteenth century and set up this hoax. The hoax claim is thrown automatically against every inscription found everywhere.

The problem, of course, is that the hoaxer would need incredible knowledge of the script and sentence structure in order to preclude errors that forthcoming discoveries would expose. The fact is that an inscription hoax fails to hold up in the face of new inscriptions; it starts to become the odd man out. I am very uncomfortable with the hoax argument. It still requires a crooked scholar who failed to profit from his discovery.

More compelling, the script is reminiscent of Celtic Ogam to the extent that it can be easily chiseled into stone. Both scripts are phonetic based and recently it has been established that Mayan script was also phonetic based.

As an aside, I never understood what the Mayans were doing until I realized their script was painted with a rather fat brush. The tools and media determine the nature of your script. Ogam was designed to be cut into a branch.

This is an example of Phoenician script, surely from Tyre, and perhaps the principal script of the city of Atlantis.

Recall also, like Lake Superior, that New Mexico is copper country and is accessible via the Rio Grande.

Plainly, the sooner Atlantis and its Atlantic mercantile empire is dug up and acknowledged, the better for scholarship everywhere. What impresses me most is how accurate Plato's report turned out to be. I am developing a healthy respect for ancient sources, even if they lacked our vocabulary. By the time something was consigned to paper through many successive rewritings, it was thoroughly attested, unlike today, when low cost makes this aspect far less important.

Friday, November 14, 2008

Chapter 11 Solution for USA Auto Industry

A question now needs to be asked about the current state of the global economy. The banking system has been stabilized because it had to be. The magic of reserve banking is that if your reserves expand, you are allowed to expand your loan portfolio by the appropriate multiple. This all works because the banks have a monopoly on deposit management, and it is all tightly regulated, or at least used to be, in order to prevent the type of crisis we just got hit with. It is the nature of competition that forces the imposition of regulation on this particular business. After all, if you lend recklessly for a short while, your earnings performance will give you bragging rights among your peers.

Unfortunately the reverse magic happens when confidence fails. The limited cash evaporates and the loan contracts can never be sold for immediate liquidity. In these circumstances, governments must step in and provide liquidity. Once confidence returns, the multiples will re-expand and the banks will repay the bailout loans. And everyone will wonder what it was all about.
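The multiplier arithmetic behind this can be sketched in a few lines, assuming a simple 10% reserve requirement (an illustrative figure of mine, not a quoted regulation):

```python
# Minimal sketch of fractional-reserve expansion and contraction.
# The 10% reserve ratio is an illustrative assumption.
reserve_ratio = 0.10
multiplier = 1 / reserve_ratio   # maximum deposit expansion per unit of reserves

new_reserves = 100.0  # e.g. $100 of government-injected liquidity
max_new_deposits = new_reserves * multiplier
print(f"${new_reserves:.0f} of fresh reserves can support up to "
      f"${max_new_deposits:.0f} of deposits and lending")

# The same multiple works in reverse: a loss of reserves, or a freeze in
# confidence, forces the loan book to contract by the full multiple.
```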

The best assurance that an investor can have is to hear bankers complaining about their rules.

That begs the next question: what about the auto industry? Their lease portfolio is good and likely needs a mere assist to get access to cheap money. After all, they are not deposit-taking institutions.

The manufacturing situation is a vastly different story. This industry has moved heaven and earth to accommodate the cost structures imposed by their employee unions and the grossly distorted medical insurance system. They have shifted as much manufacturing off site as possible and they have demanned as much as possible. The bottom line is that they are paying premium prices for labor in a market where their competitors are not, and that includes their newly built North American competitors. It must be fixed, and fixed now.

The simplest solution is not to write large government checks, unless it is to provide bridge financing while the industry passes through Chapter 11. That puts all union contracts, debt obligations and other such contracts into the proverbial cocked hat for a complete restructuring, and puts all stakeholders on the same side, working together to stave off sudden death.
And done properly, there is no reason to destroy the shareholders or the debt holders since the industry will be immediately profitable, though needing time to rebuild reserves and reduce debt.

The point that I am making is that the woes of the auto industry stem from its structural problems, not visibly shared by their competitors. Restructuring now will end the charade that they can compete against competitors on the world stage where their cost structures do not apply.

Starting immediately, the US auto industry needs to manufacture small electric auto carts as fast as possible. Their range may be only forty miles, but everywhere except the USA and Europe, this will not matter much, and we will get over it.

The real emergency is that the USA must begin reducing oil consumption very aggressively, and our personal transportation is essentially obsolete. Very likely you will receive a fuel ration that will let you drive the old SUV 5,000 miles per year. And second cars will become hangar queens. An electric auto cart solves the problem by providing urban transportation. And surprise, surprise, you will discover that occasional long-haul travel will generally not exceed that 5,000 miles for the majority.

Bioplastics

This short article on the current state of the emergent bioplastics industry is useful. The amount involved may not seem large but it certainly has the largest value added on resale and is able to absorb the cost of a new feedstock. Knocking plastics out of the fossil fuel business will help both industries in the face of the oil rationing that is now inescapable.

The links include a good article on the subject that covers the industry well.

I do not know how degradable this form of plastic actually is. You can be sure that the promoters will suggest far more than can be delivered. I suspect that it is not degradable unless that is designed in.

This is a cause that all the globe can get behind and implement by the simple process of phasing out the old over ten years. The ocean problem informs us that it must be done.

One way we can conserve oil and solve a big pollution problem is by switching to bioplastics. We now use from 5% to 10% of our oil to make plastics. Plastics are a huge environmental problem. Many of the products made from plastics are disposables. Much of it is packaging. It ends up in our landfills and polluting the land and sea. Plastics are generally not biodegradable.

The amount of plastic pollution in the oceans is astounding. There is a sea of plastics as large as the United States in the Pacific Ocean. The winds and currents in the north pacific circulate clockwise, gathering bits of plastic into the center. The plastic breaks down into tiny bits small enough to be inadvertently ingested by creatures at the bottom of the food chain.

Here's what Dr. Marcus Ericson of the Algalita Marine Research Foundation has to say about this phenomenon.

"Most plastic floats near the sea surface where some is mistaken for food by birds and fishes. Plastics are carried by currents and can circulate continually in the open sea. Broken, degraded plastic pieces outweigh surface zooplankton in the central North Pacific by a factor of 6-1. That means six pounds of plastic for every single pound of zooplankton." "Storms flush plastics down stream and ultimately into the ocean. Plastic debris looks bad, but it behaves worse. Far worse! Plastic pollution negatively effects trillions upon trillions of ocean inhabitants and ultimately humans."

"Plastic pieces can attract and hold hydrophobic elements like PCB and DDT up to one million times background levels. As a result, floating plastic is like a poison pill. As a result, new research regarding endocrine disrupters in floating plastic debris is being planned by the Algalita Marine Research Foundation. "Synthetic Sea" is a documentary based on scientific findings backed by published scientific papers."

http://www.algalita.org/research.html#plastic
http://www.algalita.org/pelagic_plastic.html

There are bioplastics already on the market on a small scale. Some of the big chemical companies are working on bioplastics, as well as some smaller companies. Some of these products are blends of petroleum based and bio based plastics.

By far the best and most promising technology is from a company called Metabolix. They are building their first factory in Iowa in a joint venture with Archer Daniels Midland, one of the large purveyors of corn products. The joint venture is called Mirel. The plastic they make is called PHA. This plastic has better biodegradability than any other, and their manufacturing process is more efficient and has fewer steps than those used by other companies.
From: The Independent

"But what's really clever about Mirel is the way it is "grown". Most modern bioplastics are manufactured by extracting starch from maize or other crops and fermenting it to produce an acid, which then undergoes a series of chemical treatments to create a plastic polymer (lactic acid, hence polylactic acid or PLA). The scientists at Metabolix have engineered microscopic bacteria to do all that work for them. They add sugar from the maize, as well as oxygen, and watch the microbes swell as tiny plastic particles form inside them. Using a secret process, the particles are then harvested to create the pellets that can be moulded into a range of products."

Metabolix is currently using corn as the feedstock, but is looking to non-food crops for the future. What is really impressive about Mirel plastics is that they are not just biodegradable, they are actually compostable. Other bioplastics need to be heated to 150°F or so before they will compost. This is not the case with Mirel bioplastics.

"Mirel has the physical properties to be a useful alternative to most traditional plastics," says Barber.
"But initially we're focusing on disposable items, such as razors, plastic bags and packaging, which use so much plastic and just get thrown away." Mirel is different. "It will break down in almost any environment, including soil, in industrial or domestic compost, or even in rivers and seas," says Barber.

From the Metabolix website: "Metabolix today produces a broad family of these natural plastics through the fermentation of plant sugars and oils using microbial biofactories. These materials range in properties from stiff thermoplastics suitable for molded goods, to highly elastic grades, to grades suitable for adhesives and coatings. In some cases, Metabolix natural plastics offer combinations of properties not available in synthetic materials. For example, the combination of excellent water resistance with biodegradability allows flushable personal hygiene products and wet wipes."

That's not all. What is truly amazing is that Metabolix has developed a process for growing plants with the plastic already in the stems and leaves of the plant! They have done this using switchgrass on a small scale and are further developing this process with a grant from the federal government. My first reaction, when hearing about this, was that they must be genetically modifying the plants. We all know how controversial GM crops are. That is not what they are doing. They are genetically modifying the bacteria that break down the sugars and starch, producing the plastic.

"In the future, Metabolix natural plastics will be produced directly in plants, making them cost-competitive with even general purpose resins such as polyethylene, and environmentally friendly alternatives to over half of the plastics used today."

"Metabolix will produce PHA Natural Plastics directly in non-food crop plants to provide cost-competitive alternatives to such widely used plastics as polyethylene, polystyrene, polypropylene, and PET, and useful raw materials for a variety of currently important chemicals. These new plastics, however, will be agriculturally produced from annually renewable resources and can be incinerated or composted with no net increase in atmospheric carbon dioxide over their lifecycle, including harvesting, isolation, and incineration or composting."

Metabolix is also developing technology to produce feedstock for biofuels as a byproduct of their bioplastics processing.

http://www.metabolix.com/

Plastics Engineering's June 2007 issue had a cover story about the bioplastics industry.

Here's a link.
http://www.cereplast.com/pdfs/1185862661.pdf
Cereplast is another bioplastics company.
http://www.cereplast.com/homepage.php
The article gives overviews of many more companies involved with bioplastics.

Chris Nelder on IEA report

This is Chris Nelder's take on the IEA report.

http://www.energyandcapital.com/articles/iea-oil-report/782

Read it and weep. The folks over at the IEA soft-pedalled this pending disaster for the past several years. Now they are backpedalling their way out of the corner.

George Bush’s greatest historical failure will be his inertness in the face of this. It was no secret. I personally figured, as much as four years ago, that markets and oil were heading for a crisis toward the middle to late 2008. The only question was whether he could wiggle out of office soon enough.

Had he followed Arnie’s lead and pushed hard to commence the national shift over to LNG, we would be in position to handle the pending oil supply crisis. I suspect Arnie tried to get his attention.

We are now going to have a crash program not unlike the transition to a wartime economy. It feels like it already.

A global oil rationing authority is coming with a price lock around $70 to $100 per barrel. That is high enough to maximize investment effort and encourage replacement with alternatives.

It will also bring THAI/CAPRI oil production expansion as fast as possible since their cost structure will drop toward thirty dollars per barrel.

We cannot transfer energy at a higher price level without destroying the customers, as the Saudis learned in the seventies.

No one has calculated the impact of THAI/CAPRI as yet. A production well pair can be put on line and producing a thousand BPD in perhaps twelve months, or will at least be building up to that level by then. That means a thousand drill rigs can likely build out thousands of such pairs inside of a year. It is within our capacity to expand heavy oil production by several millions of barrels over the next several years just in Alberta.
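The build-out arithmetic can be sketched directly. The rig count and the per-pair output are the figures above; the pairs-per-rig-per-year rate is my own assumption for illustration:

```python
# Back-of-envelope sketch of the THAI/CAPRI build-out described above.

rigs = 1000                 # drill rigs, per the paragraph above
pairs_per_rig_per_year = 3  # ASSUMED: well pairs each rig can drill in a year
bpd_per_pair = 1000         # barrels per day once a pair is fully on line

pairs = rigs * pairs_per_rig_per_year
added_bpd = pairs * bpd_per_pair

print(f"{pairs} well pairs -> {added_bpd / 1e6:.1f} million barrels per day")
# -> 3000 well pairs -> 3.0 million barrels per day
```

Even at a modest assumed drilling rate, the arithmetic lands in the "several millions of barrels" range the paragraph claims, which is why the technology matters in a supply crisis.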

In crisis, the same technology will add millions more around the globe.

The fixes are there and are known. We just need recognition and support to see it through as quickly as can be.

Thursday, November 13, 2008

Pleistocene Evidence

My articles on the Pleistocene Nonconformity and the likely prior establishment of a developed human society over the previous several thousand years fill a large gap in the story of the emergence of mankind. They also recognize that human occupation of the temperate zone was not a viable option, as any survey of ice age conditions confirms.

The population had the great coastal plains, now submerged, and a couple of good zones in the tropics.

For these conjectures to stand up and bark, there still has to be viable evidence. And in fact the evidence exists and is out there to be recognized. But without these conjectures in place and part of the mental tool box, all eyes are blind.

With these tools, it is now possible to look at old evidence and provide a superior interpretation. This will not always work, and the number of possible artifacts to be recovered declines rapidly as we go back in time. I also suggest we recall that our own artifacts are not surviving for very long at all; they are now being vigorously recycled and will soon all be recycled unless they make it to an antique store.

As an example, my working conjecture on the presence of Bronze Age traders in the Mississippi valley has allowed a rereading of old reports whose evidentiary content could not easily have been fabricated at the time they were published, and which conforms to established Bronze Age communities overlain on the indigenous societies. Now we need more informed eyeballs, even if they are trying to prove that conjecture wrong.

There is evidence, controversial of course, from the time frames that matter and in the one place that they could be expected to exist. Scattered occurrences of unexpected artifacts have been found in mining locales. Most have a recent genesis as expected but a few simply do not.

The problem is sufficiently troublesome as to bring a whole range of aging methods into question. We are not just talking about the substantial readjustment brought to the science of carbon dating. Radioactive aging has always relied on the assumption that the process is independent of external effects.

If anything, the carbon fiasco should have cured us, but instead we have a situation in which data is often fudged or ignored if it goes against prejudice. This means that although most dating is valuable, it needs to be confirmed by some physical method such as checking strata.

Aging is still a young science and we do not know what can alter radiometric readings although we certainly have evidence that it is possible. It is prudent to be overly cautious.

From my articles we have established that the crust shifted, with the original pole migrating thirty degrees south along the longitude running through Hudson Bay. This shifted material at the equator and led to compressive forces that lifted both the Andes and the Himalayas. It also led to a lot of additional alteration at the same time that we do not easily recognize.

One of those events may be the Columbia River basalt flows. It is exactly the sort of thing that could have happened then. The aging studies were saying that the flows were far older than that, and I would have been happy to simply leave it at that, except that human artifacts were then located below the basalt itself. When that happens, something has to give, and ignoring the evidence is utterly unacceptable.

At which point everyone remembers that the basalts look fresh. That is also my beef about a lot of the mountains in the Andes and Himalayas and along the ring of fire. There are simply way too many surfaces defying gravity for a few million years to be very believable. If anything, the record shows an eruption of activity perhaps fifteen thousand years ago, followed by a steady settling down of such activity.

This also suggests that we should look to the northern points of weakness, which quickly gives us the hyperactive Alaska volcanoes and Iceland itself, astride the crustal divergence rifts. Iceland was likely built during this era. The oldest rocks in Iceland are a meager 23,000 years old, at least the last time I checked the literature.

It always amazes me that when you have a successful conjecture, how easily evidence falls into place. Right now we know specifically where to look.

I will make one additional comment. The early proponents of a crustal shift, including Hapgood and Einstein, opened the door to a couple of additional shifts, which I dismissed as unlikely. With the advent of direct human causation, additional shifts become feasible and may simply have been necessary to achieve the final configuration that has given us the Holocene.

This is just a beginning. There are many reports out there that have been shelved that suddenly make a lot of sense if we are using my conjectures. If you see something in an odd location that I should see, let me know.

Don Easterbrook on Global Cooling

This is an excellent article presenting the evidence for the onset of global cooling. That it rests on an extrapolation of past cycles over a claimed period of several hundred years is unconvincing, since our data is very spotty once we go more than a hundred and fifty years back. I would even assert it is spotty right up to the beginning of the satellite age.

We comfortably know a few things as we go back in time and very little outside of major centers. I am very conscious of how local weather is in the face of minimal communication between observers.

Therefore I get nervous as we attempt to link various proxies together.

In the meantime the arguments for global cooling are quite solid compared to the IPCC projections which have been just plain wrong. They did not even get lucky. I swear that I could give them three to one odds for flipping coins and still win. Ten years of flat temperature change is a pretty good endorsement for the failure of a model claiming an uptrend.

We have now had a major downdraft in the apparent temperature regime that must mean something, if it is not a herald of many cold winters. Also, the Alaska glaciers were net gainers of snow this past season, so we can expect those glaciers to advance shortly.

I am going to miss seeing an ice free Arctic it seems.

I must remark that the evidence is certainly mounting that the deep freeze is returning to the Arctic. It has not actually hit the sea ice yet, but it should deliver a sharp increase in thickness this winter if it has actually gotten colder. Again, all we can do is wait and see.



Global Cooling is Here
Evidence for Predicting Global Cooling for the Next Three Decades

by Prof. Don J. Easterbrook

Global Research, November 2, 2008

Department of Geology, Western Washington University

Global Research Editor's note

The following article represents an alternative view and analysis of global climate change, which challenges the dominant Global Warming Consensus.

Global Research does not necessarily endorse the proposition of "Global Cooling", nor does it accept at face value the Consensus on Global Warming. Our purpose is to encourage a more balanced debate on the topic of global climate change.

INTRODUCTION

Despite no global warming in 10 years and record-setting cold in 2007-2008, the Intergovernmental Panel on Climate Change (IPCC) and computer modelers who believe that CO2 is the cause of global warming still predict the Earth is in store for catastrophic warming in this century. IPCC computer models have predicted global warming of 1° F per decade and 5-6° C (10-11° F) by 2100 (Fig. 1), which would cause global catastrophe with ramifications for human life, natural habitat, energy and water resources, and food production. All of this is predicated on the assumption that global warming is caused by increasing atmospheric CO2 and that CO2 will continue to rise rapidly.

http://www.globalresearch.ca/articlePictures/glcool1.jpg
http://www.globalresearch.ca/articlePictures/globalcool2.jpg


Figure 1. A. IPCC prediction of global warming early in the 21st century. B. IPCC prediction of global warming to 2100. (Sources: IPCC website)

However, records of past climate changes suggest an altogether different scenario for the 21st century. Rather than drastic global warming at a rate of 0.5 ° C (1° F) per decade, historic records of past natural cycles suggest global cooling for the first several decades of the 21st century to about 2030, followed by global warming from about 2030 to about 2060, and renewed global cooling from 2060 to 2090 (Easterbrook, D.J., 2005, 2006a, b, 2007, 2008a, b); Easterbrook and Kovanen, 2000, 2001). Climatic fluctuations over the past several hundred years suggest ~30 year climatic cycles of global warming and cooling, on a general rising trend from the Little Ice Age.

PREDICTIONS BASED ON PAST CLIMATE PATTERNS

Global climate changes have been far more intense (12 to 20 times as intense in some cases) than the global warming of the past century, and they took place in as little as 20–100 years. Global warming of the past century (0.8° C) is virtually insignificant when compared to the magnitude of at least 10 global climate changes in the past 15,000 years. None of these sudden global climate changes could possibly have been caused by human CO2 input to the atmosphere because they all took place long before anthropogenic CO2 emissions began. The cause of the ten earlier ‘natural’ climate changes was most likely the same as the cause of global warming from 1977 to 1998.

Figure 2. Climate changes in the past 17,000 years from the GISP2 Greenland ice core. Red = warming, blue = cooling. (Modified from Cuffy and Clow, 1997)

Climatic fluctuations over the past several hundred years suggest ~30 year climatic cycles of global warming and cooling (Figure 3) on a generally rising trend from the Little Ice Age about 500 years ago.

Figure 3. Alternating warm and cool cycles since 1470 AD. Blue = cool, red = warm. Based on oxygen isotope ratios from the GISP2 Greenland ice core.

Relationships between glacial fluctuations, the Pacific Decadal Oscillation, and global climate change.
After several decades of studying alpine glacier fluctuations in the North Cascade Range, my research showed a distinct pattern of glacial advances and retreats (the Glacial Decadal Oscillation, GDO) that correlated well with climate records. In 1992, Mantua published the Pacific Decadal Oscillation curve showing warming and cooling of the Pacific Ocean that correlated remarkably well with glacial fluctuations. Both the GDO and the PDO matched global temperature records and were obviously related (Fig. 4). All but the latest 30 years of changes occurred prior to significant CO2 emissions so they were clearly unrelated to atmospheric CO2.

http://www.globalresearch.ca/articlePictures/globalcool5.jpg

Figure 4. Correspondence of the GDO, PDO, and global temperature variations.

The significance of the correlation between the GDO, PDO, and global temperature is that once this connection has been made, climatic changes during the past century can be understood, and the pattern of glacial and climatic fluctuations over the past millennia can be reconstructed. These patterns can then be used to project climatic changes in the future. Using the pattern established for the past several hundred years, in 1998 I projected the temperature curve for the past century into the next century and came up with curve ‘A’ in Figure 5 as an approximation of what might be in store for the world if the pattern of past climate changes continued. Ironically, that prediction was made in the warmest year of the past three decades and at the acme of the 1977-1998 warm period. At that time, the projected curve indicated global cooling beginning about 2005 ± 3-5 years until about 2030, then renewed warming from about 2030 to about 2060 (unrelated to CO2—just continuation of the natural cycle), then another cool period from about 2060 to about 2090. This was admittedly an approximation, but it was radically different from the 1° F per decade warming called for by the IPCC. Because the prediction was so different from the IPCC prediction, time would obviously show which projection was ultimately correct.

Now a decade later, the global climate has not warmed 1° F as forecast by the IPCC but has cooled slightly until 2007-08 when global temperatures turned sharply downward. In 2008, NASA satellite imagery (Figure 6) confirmed that the Pacific Ocean had switched from the warm mode it had been in since 1977 to its cool mode, similar to that of the 1945-1977 global cooling period. The shift strongly suggests that the next several decades will be cooler, not warmer as predicted by the IPCC.

http://www.globalresearch.ca/articlePictures/globalcool61.jp

Figure 5. Global temperature projection for the coming century, based on warming/cooling cycles of the past several centuries. ‘A’ projection based on assuming next cool phase will be similar to the 1945-1977 cool phase. ‘B’ projection based on assuming next cool phase will be similar to the 1880-1915 cool phase. The predicted warm cycle from 2030 to 2060 is based on projection of the 1977 to 1998 warm phase and the cooling phase from 2060 to 2090 is based on projection of the 1945 to 1977 cool cycle.

Implications of PDO, NAO, GDO, and sun spot cycles for global climate in coming decades

The IPCC prediction of global temperatures, 1° F warmer by 2011 and 2° F by 2038 (Fig. 1), stands little chance of being correct. NASA’s imagery showing that the Pacific Decadal Oscillation (PDO) has shifted to its cool phase is right on schedule as predicted by past climate and PDO changes (Easterbrook, 2001, 2006, 2007). The PDO typically lasts 25-30 years and assures North America of cool, wetter climates during its cool phases and warmer, drier climates during its warm phases. The establishment of the cool PDO, together with similar cooling of the North Atlantic Oscillation (NAO), virtually assures several decades of global cooling and the end of the past 30-year warm phase. It also means that the IPCC predictions of catastrophic global warming this century were highly inaccurate.


The Pacific Ocean has a warm temperature mode and a cool temperature mode, and in the past century, has switched back forth between these two modes every 25-30 years (known as the Pacific Decadal Oscillation or PDO). In 1977 the Pacific abruptly shifted from its cool mode (where it had been since about 1945) into its warm mode, and this initiated global warming from 1977 to 1998. The correlation between the PDO and global climate is well established. The announcement by NASA’s Jet Propulsion Laboratory that the Pacific Decadal Oscillation (PDO) had shifted to its cool phase is right on schedule as predicted by past climate and PDO changes (Easterbrook, 2001, 2006, 2007). The PDO typically lasts 25-30 years and assures North America of cool, wetter climates during its cool phases and warmer, drier climates during its warm phases. The establishment of the cool PDO, together with similar cooling of the North Atlantic Oscillation (NAO), virtually assures several decades of global cooling and the end of the past 30-year warm phase.

Figure 6. Switch of PDO cool mode to warm mode in 1977 initiated several decades of global warming. The PDO has now switched from its warm mode (where it had been since 1977) into its cool mode. As shown on the graph above, each time this has happened in the past century, global temperature has followed. The upper map shows cool ocean temperatures in blue (note the North American west coast). The lower diagram shows how the PDO has switched back and forth from warm to cool modes in the past century, each time causing global temperature to follow. Projection of the past pattern (right end of graph) assures 30 yrs of global cooling

Comparisons of historic global climate warming and cooling over the past century with PDO and NAO oscillations, glacial fluctuations, and sun spot activity show strong correlations and provide a solid data base for future climate change projections. As shown by the historic pattern of GDOs and PDOs over the past century and by corresponding global warming and cooling, the pattern is part of ongoing warm/cool cycles that last 25-30 years. The global cooling phase from 1880 to 1910, characterized by advance of glaciers worldwide, was followed by a shift to the warm-phase PDO for 30 years, global warming and rapid glacier recession. The cool-phase PDO returned in ~1945 accompanied by global cooling and glacial advance for 30 years. Shift to the warm-phase PDO in 1977 initiated global warming and recession of glaciers that persisted until 1998. Recent establishment of the PDO cool phase appeared right on target and assuming that its effect will be similar to past history, global climates can be expected to cool over the next 25-30 years. The global warming of this century is exactly in phase with the normal climatic pattern of cyclic warming and cooling and we have now switched from a warm phase to a cool phase right at the predicted time (Fig. 5).

The ramifications of the global cooling cycle for the next 30 years are far reaching―e.g., failure of crops in critical agricultural areas (it’s already happening this year), increasing energy demands, transportation difficulties, and habitat change. All this while global population increases from six billion to about nine billion. The real danger in spending trillions of dollars trying to reduce atmospheric CO2 is that little will be left to deal with the very real problems engendered by global cooling.

CONCLUSIONS

Global warming (i.e., the warming since 1977) is over. The minute increase of anthropogenic CO2 in the atmosphere (0.008%) was not the cause of the warming—it was a continuation of natural cycles that occurred over the past 500 years.

The PDO cool mode has replaced the warm mode in the Pacific Ocean, virtually assuring us of about 30 years of global cooling, perhaps much deeper than the global cooling from about 1945 to 1977. Just how much cooler the global climate will be during this cool cycle is uncertain. Recent solar changes suggest that it could be fairly severe, perhaps more like the 1880 to 1915 cool cycle than the more moderate 1945-1977 cool cycle. A more drastic cooling, similar to that during the Dalton and Maunder minimums, could plunge the Earth into another Little Ice Age, but only time will tell if that is likely.

Don J. Easterbrook is Professor of Geology at Western Washington University, Bellingham, WA.