Thursday, April 23, 2009

Cyclones Water Jet Stratosphere

This is a neat bit of science. The link to global warming is a lot more tenuous. Any water in the stratosphere is absorbing heat, but you are certainly well above the working atmosphere. So is the upper troposphere for that matter, and it is where all the excess methane goes to die.

We have a mechanism for putting water vapor into the stratosphere. So far, so good. The amount is said to have increased by fifty percent over fifty years, though I suspect we have one measurement from 1958 and a few recent results. It has also been warmer, so we should expect higher moisture.

However, just how much are we talking about? The mechanism alone suggests that the amount is negligible. At least we know why it is there. Otherwise there should be none.

Absolutely none of this has anything to do with the real atmosphere, since these amounts are by mass way too small and do not mix in any significant manner.

There is no plausible link to global warming in what we have just described.

Cyclones Spurt Water Into The Stratosphere And Feed Global Warming

by Staff Writers
Cambridge MA (SPX) Apr 22, 2009

http://www.terradaily.com/reports/Cyclones_Spurt_Water_Into_The_Stratosphere_And_Feed_Global_Warming_999.html
Scientists at Harvard University have found that tropical cyclones readily inject ice far into the stratosphere, possibly feeding global warming.

The finding, published in Geophysical Research Letters, provides more evidence of the intertwining of severe weather and global warming by demonstrating a mechanism by which storms could drive climate change. Many scientists now believe that global warming, in turn, is likely to increase the severity of tropical cyclones.

"Since water vapor is an important greenhouse gas, an increase of water vapor in the stratosphere would warm the Earth's surface," says David M. Romps, a research associate in Harvard's Department of Earth and Planetary Sciences.

"Our finding that tropical cyclones are responsible for many of the clouds in the stratosphere opens up the possibility that these storms could affect global climate, in addition to the oft-mentioned possibility of climate change affecting the frequency and intensity of tropical cyclones."

Romps and co-author Zhiming Kuang, assistant professor of climate science in Harvard's Faculty of Arts and Sciences, were intrigued by earlier data suggesting that the amount of water vapor in the stratosphere has grown by roughly 50 percent over the past 50 years.
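For scale, a 50 percent rise spread over 50 years implies only a very modest annual change. A quick sketch of the compounding arithmetic (the 50 percent / 50-year figures are from the article; the calculation itself is just illustrative):

```python
# Implied average annual growth rate for a 50% rise over 50 years.
total_growth = 1.50   # a 50% increase expressed as a growth factor
years = 50
annual_rate = total_growth ** (1 / years) - 1
print(f"Implied compound growth: {annual_rate * 100:.2f}% per year")
```

That works out to under one percent per year, small enough that sparse measurements (as speculated above) could easily blur the trend.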

Scientists are currently unsure why this increase has occurred; the Harvard researchers sought to examine the possibility that tropical cyclones might have contributed by sending a large fraction of their clouds into the stratosphere.

Using infrared satellite data gathered from 1983 to 2006, Romps and Kuang analyzed towering cloud tops associated with thousands of tropical cyclones, many of them near the Philippines, Mexico, and Central America.

Their analysis demonstrated that in a cyclone, narrow plumes of miles-tall storm clouds can rise so explosively through the atmosphere that they often push into the stratosphere.

Romps and Kuang found that tropical cyclones are twice as likely as other storms to punch into the normally cloud-free stratosphere, and four times as likely to inject ice deep into the stratosphere.

"It is ... widely believed that global warming will lead to changes in the frequency and intensity of tropical cyclones," Romps and Kuang write in Geophysical Research Letters.

"Therefore, the results presented here establish the possibility for a feedback between tropical cyclones and global climate."

Typically, very little water is allowed passage through the stratosphere's lower boundary, known as the tropopause. Located some 6 to 11 miles above the Earth's surface, the tropopause is the coldest part of the Earth's atmosphere, making it a barrier to the lifting of water vapor into the stratosphere: As air passes slowly through the tropopause, it gets so cold that most of its water vapor freezes out and falls away.
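The cold-trap argument above can be made quantitative with a rough Clausius-Clapeyron estimate. The 195 K temperature and 100 hPa pressure below are illustrative textbook values for the tropical tropopause, not figures from the article:

```python
import math

# Rough estimate of why the tropopause acts as a "cold trap": saturation
# vapor pressure over ice collapses at tropopause temperatures.
L_SUB = 2.83e6           # latent heat of sublimation of ice, J/kg
R_V = 461.5              # gas constant for water vapor, J/(kg K)
E0, T0 = 611.0, 273.15   # reference saturation pressure (Pa) at 273.15 K

def sat_vapor_pressure_ice(T):
    """Saturation vapor pressure over ice (Pa), integrated Clausius-Clapeyron."""
    return E0 * math.exp(-(L_SUB / R_V) * (1.0 / T - 1.0 / T0))

# Assumed tropical tropopause: ~195 K at ~100 hPa (10,000 Pa)
e_trop = sat_vapor_pressure_ice(195.0)
# Saturation mixing ratio (kg water per kg air): w = 0.622 * e / p
w_trop = 0.622 * e_trop / 10_000.0
print(f"Saturation vapor pressure at 195 K: {e_trop:.3f} Pa")
print(f"Max water vapor mixing ratio: {w_trop * 1e6:.1f} ppm by mass")
```

The answer comes out at a few parts per million, which is why air ascending slowly through the tropopause arrives in the stratosphere nearly dry, and why fast convective ice injection is an interesting exception.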

But if very deep clouds, such as those in a tropical cyclone that can rise through the atmosphere at speeds of up to 40 miles per hour, can punch through the tropopause too quickly for this to happen, they can deposit their ice in the warmer overlying stratosphere, where it then evaporates.

"This suggests that tropical cyclones could play an important role in setting the humidity of the stratosphere," Romps and Kuang write.

Wednesday, April 22, 2009

Cold Fusion or LENR

Went to the website of Energetics Technologies. I am posting some of their material, but do look at the second link and run the show. The approach is a definite improvement over early experiments that indicated a reaction too weak to be definitive. This clever bit of design has the merit that it emulates the original design while causing a major expansion of the effect. That makes it as clear as a bell. And their cheering is even clearer, as it should be.

They are already talking of turning this into a heat-production machine to sell even to the householder. Nice story, but a little short. I suspect we are a long way from a good, steady, reliable heat flow, and I suspect the device does not like being turned on and off. Other than that, how about replacing the anodes at least with something a lot cheaper? And now that we have a protocol, let us try it on a few other materials. Palladium is wonderful, but do we have a good reason that makes it unique? How about gold, graphite, and every other rare earth?

The bottom line is that this is a major step forward, and the jump in output has silenced the crowd yelling bunkum.

Whatever the heat flux and whatever the capital cost, we are now in possession of a device that produces heat without significant input beyond the original capital. It remains to be seen whether, after a lot more tweaking, it is good enough to be cost effective. It will need to break the twenty-year payback threshold to avoid being a curiosity.
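To make the payback-threshold point concrete, here is a minimal sketch. Every number in it (capital cost, heat output, price of displaced energy) is a hypothetical assumption for illustration, not a figure from Energetics Technologies:

```python
# Simple payback sketch for a hypothetical home heat unit.
# All inputs are assumptions, not vendor figures.
capital_cost = 5_000.0     # USD, assumed installed cost
heat_output_kw = 1.0       # assumed continuous thermal output, kW
hours_per_year = 8_760
displaced_price = 0.04     # USD per kWh of heat displaced (assumed)

annual_savings = heat_output_kw * hours_per_year * displaced_price
payback_years = capital_cost / annual_savings
print(f"Annual savings: ${annual_savings:.0f}")
print(f"Simple payback: {payback_years:.1f} years")
print("Viable" if payback_years <= 20 else "A curiosity")
```

With these made-up inputs the unit just clears the twenty-year bar; halve the heat output or double the capital cost and it does not.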


http://www.energeticstechnologies.com/

http://superwavefusion.com/popups/process.html

Cold Fusion is a Low-Energy Nuclear Reaction (LENR) occurring at near room temperature and pressure using relatively simple and low-energy-input devices to produce excess energy. When Albert Einstein devised his now famous formula E=mc² he realized that a tiny amount of mass could produce extraordinary amounts of energy, thereby mimicking the sun’s energy production mechanism of fusion. Some 40 years later, scientists began attempting “hot” fusion, which requires extreme temperatures and pressures found inside stars. This research is still underway. In the 1980’s, scientists also began investigating “cold” fusion, which unlike the Sun’s “hot” fusion, does not require extreme temperature and pressure to achieve.

Cold fusion appears to be the fusion, or combining of nuclei of a naturally found hydrogen isotope called Deuterium. This fusing of Deuterium results in the release of excess heat. The major product of this reaction is an isotope of helium called helium-4 which is harmless and found throughout nature. One helium-4 nucleus is produced from the fusion of two Deuterium nuclei, although this appears to occur as a “many-body process” instead of “hot” fusion, where two isolated Deuterium nuclei fuse. Helium-4 nucleus is slightly less massive than the two Deuterium nuclei that combined to form it, so the difference in mass is converted to energy in the form of heat.
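The mass-to-energy bookkeeping described above can be checked directly from standard atomic masses via E = mc², using the usual energy equivalent of the unified mass unit:

```python
# Mass defect for 2 D -> 4He, converted to energy with E = mc^2.
# Atomic masses are standard reference values in unified mass units (u).
D_MASS = 2.014101778     # deuterium, u
HE4_MASS = 4.002603254   # helium-4, u
U_TO_MEV = 931.494       # energy equivalent of 1 u, MeV

mass_defect = 2 * D_MASS - HE4_MASS   # mass lost in the fusion, u
energy_mev = mass_defect * U_TO_MEV   # energy released per event, MeV
print(f"Mass defect: {mass_defect:.6f} u -> {energy_mev:.2f} MeV per event")
```

About 24 MeV per event, which is millions of times the energy of any chemical bond, is why even tiny reaction rates would show up as measurable excess heat.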

While cold fusion creates no dangerous by-products, it is, nevertheless often confused with both the “hot” fusion process found inside stars, and nuclear fission, the splitting of heavy nuclei used in atomic weapons and mainstream nuclear power. A nuclear fission reaction leaves behind harmful radioactive waste products. These misconceptions are just a few of the uphill battles that cold fusion researchers have had to face in their pursuit for a safe and plentiful source of new energy.

The Process

The SuperWave™ Fusion Process

SuperWave™ Fusion is an excess-heat producing reaction created by a SuperWave™-induced interaction of palladium and deuterium. Click on the video to view an animation depicting how this process is believed to work.

This energy producing interaction is driven by a complex, nested, “waves-waving-within-waves” signal discovered by the company’s Chief Visionary Officer, Dr. Irving Dardik. In the current apparatus, this proprietary SuperWave™ signal is delivered via an electric current to a custom module containing a palladium cathode and D2O (deuterium instead of hydrogen in the water molecule). The end result is the release of energy as the deuterium atoms disassociate from the heavy water and load into the palladium lattice, allowing their wave-based energy structures to interact. The principal outputs from this interaction are heat and apparently small quantities of 4He, a non-radioactive isotope of Helium. Research to verify the 4He is currently underway.

Energetics Technologies’ SuperWave™ Fusion has the potential to:

Provide an inexpensive, inexhaustible fuel source

Produce no significantly measurable hazardous by-products

Revolutionize the concept of energy production

Be a groundbreaking Green Energy source

Senegal Green Charcoal

Green charcoal is being made from cattails, also known as typha. This is great news, and it appears that a whole lot of production issues have been solved to make it available as a cooking fuel.

I hope that they are also grabbing the starch for food from the roots. However, converting the available stalks into biochar that is packed as fuel cubes is a good way to make it all economic. If some of the produced biochar also finds itself onto the land, so much the better.

The compression tool likely squeezes out most of the moisture, and with the material in cubes, a very efficient carbonization cycle can be set up. This obviously lends itself to low-cost mass production with a secure feedstock in the typha reeds. In fact it sounds superior to wood charcoal manufacture because the step of crushing and sorting is omitted.

Until electrical heating is available, charcoal will remain the fuel of choice. Production from plant waste other than wood is preferred because of the jump in efficiency. It is certainly possible to expand this sector hugely once farmers see a profit.

SENEGAL: Can "green charcoal" help save the trees?

http://www.irinnews.org/Report.aspx?ReportId=84015

ROSS-BETHIO, 20 April 2009 (IRIN) - An environmental NGO in northern Senegal is about to go to market with “green charcoal” – a household fuel produced from agricultural waste materials to replace wood and charcoal in cooking stoves.

Given that Senegal’s trees are disappearing, finding viable alternatives is a must, a Ministry of Energy official says. At least half of Senegal’s 13 million people rely on wood and charcoal for household fuel, while 40 percent rely on petrol products like butane gas, the ministry says.

“You need to cut down 5kg of wood to produce only 1kg of [conventional] charcoal,” said Ibrahima Niang, alternative household energies specialist at the Energy Ministry.

“Less than 30 years ago, charcoal consumed in [the capital] Dakar came from 70km away, from the Thiès region. Now you have to go 400km from Dakar to find forests.”

According to Senegal’s Department of Water and Forestry, 40,000 hectares of forest are cut every year for fuel and other commercial uses.

Deforestation is said to exacerbate soil erosion – already a considerable problem in parts of Senegal. The country is part of the Sahel, a region where erratic rainfall, land degradation and desertification are constant challenges for a population largely dependent on agriculture and livestock.

The “green charcoal” is produced by compressing agricultural waste, like the invasive typha weed, into briquets and then carbonising them using a machine. The product has the look and feel of traditional charcoal and burns similarly.

“The technology is efficient, effective and economical because we can produce a substitute for charcoal at half the price,” said Guy Reinaud, director of Pro Natura International, the French NGO that has partnered with the Senegalese government on the green charcoal project. The project is based in Ross-Bethio, a town 300km north of Dakar in the Saint-Louis region.

Environmental firms and governments have long been working to transform plants and natural waste materials into energy, such as water lilies in the Philippines.

Tough sell?

Despite the apparent advantages, marketing the green charcoal in Senegal is a challenge, according to Mireille Ehemba, specialist in alternative household fuels at PERACOD, a Senegalese-German renewable energy initiative that is also a partner in the green charcoal project.

“We have not been able to penetrate the charcoal market in urban areas. People are very attached to charcoal,” Ehemba told IRIN. “Much more [education] is needed, including cooking demonstrations that explain how this new fuel works, if we want people to make the switch.”

Not only buyers need to be convinced. Identifying distribution networks and responding to the needs of charcoal vendors are also major challenges, Ehemba said. For 1kg of green charcoal, a vendor receives 5 US cents, whereas conventional charcoal brings in almost 20 cents per kilogram.

“We must talk to producers to get them to increase the scale of their operation in order to increase the profit for vendors if this is to work.”

Affordable

Senegalese consumers may be tempted to switch to the new product because it is cheaper than charcoal and butane gas. One kilogram of green charcoal sells for just 20 cents, whereas traditional charcoal currently costs three times that. A 6-kg bottle of butane gas costs about $5.
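Putting the quoted prices side by side makes the affordability case plain. The per-kilogram prices and the one-kilogram-per-meal figure come from the article; the meals-per-butane-bottle number is an assumption for illustration:

```python
# Cost-per-meal comparison using prices quoted in the article.
green_per_kg = 0.20      # USD, green charcoal (article)
charcoal_per_kg = 0.60   # USD, "three times" the green price (article)
butane_bottle = 5.00     # USD for a 6 kg bottle (article)
meals_per_bottle = 30    # assumed number of meals per bottle

# The article cites ~1 kg of charcoal per family meal.
print(f"Green charcoal per meal:  ${green_per_kg:.2f}")
print(f"Wood charcoal per meal:   ${charcoal_per_kg:.2f}")
print(f"Butane per meal (approx): ${butane_bottle / meals_per_bottle:.2f}")
```

Even under a generous assumption for butane, green charcoal is the cheapest option per meal, which supports the vendors-need-volume point made above.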

Fatou Camara, 40, from Ross-Bethio, has tested the new fuel when cooking for her family of 10. “I can use 1kg of green charcoal and that will cook the dinner. It is cheaper than normal charcoal.”

Camara told IRIN she used to use butane gas for cooking, but recurrent gas shortages pushed her to switch to green charcoal.

In the past, butane gas was heavily subsidised and promoted by the government as an alternative to charcoal. But such measures are no longer sustainable, according to the Energy Ministry’s Niang. The government plans to phase out butane subsidies in July.

PERACOD’s Ehemba is concerned the move will put more pressure on Senegal's forests as poorer households return to traditional fuels like charcoal. “It is now very important that we propose alternatives like improved stoves and bio-charcoal so that people have affordable ways to cook cleanly,” she said.

ProNatura and the Senegalese government plan to turn the project into a profit-making venture called “Green Charcoal Senegal” that will produce up to 800 tons of the green fuel a year for sale in the Saint-Louis region.

ProNatura will soon start a project in Mali, transforming cotton stems into green charcoal, and plans similar projects in Niger, Madagascar, China, India and Brazil.

“It has global potential in terms of its adaptability to different local environments, and it uses local waste materials,” said Reinaud.

The Energy Ministry’s Niang said: “It is not possible to completely replace charcoal [in Senegal]. But even if we can replace 10 or 15 percent [of it] that is good. It will preserve the forests.”

Holy See Plans Largest European Solar Plant

This piece shows that the institution has decided it makes good sense to lead on energy conservation. The move is economically sound and makes a point to those who listen. It reinforces other similar messages and thus must be welcomed.

Those that follow this blog know that the whole question of how to supply energy is in flux like never before and everyone is affected. Locking in renewables gives the user an effective safety net against an energy shock that has become all too visible.

Oil production is geared to decline from the present 85 million barrels per day to under 50 million. The decline looks to be as bumpy and rough as we could imagine. It is thus prudent and eminently sensible to come off-grid and become an energy supplier.

The church is moving ahead of the curve as strongly as it can in light of current knowledge. Breakthroughs are popping up everywhere today, but they are too far behind the curve to possibly obviate the pending supply crisis.

Pope to Pursue Heavenly Power in Europe’s Biggest Solar Plant

By Flavia Krause-Jackson and Flavia Rotondi

April 17 (Bloomberg) -- On pasture land a day’s walk north of Rome, the inventor of radio Guglielmo Marconi set up a broadcasting service in 1931 for the Vatican.

The world’s smallest state now intends to build the biggest solar plant in Europe for 500 million euros ($660 million) on those same 740 acres near the medieval village of Santa Maria di Galeria, project engineer Mauro Villarini said in an interview.

Advised by German solar-panel maker Solarworld AG, the Holy See is running counter to many governments that say harnessing sunlight on a grand scale is too costly to help curb global warming, especially in the deepest recession since World War II.

“Now is the time to strike,” Cardinal Giovanni Lajolo, the Vatican City’s governor, said in an interview from his study overlooking the Michelangelo-designed Basilica of St. Peter’s. “One should take advantage of the crisis to try and develop these renewable-energy sources to the maximum, which in the long run will reap incomparable rewards.”

European nations, daunted by spending needed to stimulate their economies, find it harder to invest in clean power generation required to meet the European Union’s target to cut greenhouse-gas emissions 20 percent by 2020. Italy is balking.

Italian Prime Minister Silvio Berlusconi in December threatened to play “bad guy” by resisting demands to trim emissions. The nation already is set to miss its Kyoto treaty target, which calls for releasing 6.5 percent less heat-trapping gases by 2012 from 1990 levels, European Union filings show.

Subsidized by Italy

By contrast, the Roman Catholic city-state is going greener. Advantaged by its small size, the pope will also count on revenue and solar aid from Italy after 2014. That’s when the new plant is scheduled to turn the Vatican into an electricity exporter to the nation that surrounds it.

“Certainly we will try to get what we can in a fair, friendly way with Italy, considering that the State and the Holy See have expenses” to cover by generating revenue, Lajolo said.

The 100 megawatts unleashed by the station will supply about 40,000 households. That will far outstrip demand by Pope Benedict XVI and the 900 inhabitants of the 0.2 square-mile country nestled across Rome’s Tiber River. The plant will cover nine times the needs of Vatican Radio, whose transmission tower is strong enough to reach 35 countries including Asia.
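A back-of-envelope check on the "100 megawatts supplies about 40,000 households" claim. The 15 percent capacity factor below is an assumed value for fixed panels at Rome's latitude, not a figure from the article:

```python
# Sanity check: does 100 MW of solar plausibly serve 40,000 households?
capacity_mw = 100          # nameplate capacity (article)
capacity_factor = 0.15     # assumed annual average for PV near Rome
households = 40_000        # households served (article)

annual_mwh = capacity_mw * capacity_factor * 8_760
per_household_kwh = annual_mwh * 1_000 / households
print(f"Annual output: {annual_mwh:,.0f} MWh")
print(f"Implied use per household: {per_household_kwh:,.0f} kWh/year")
```

Roughly 3,300 kWh per household per year falls in the right range for Italian domestic consumption, so the article's claim is internally consistent under this assumed capacity factor.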

“It’s a wise investment in every way we look at it,” said Umberto Bertelé, chairman of the management school at Politecnico di Milano, in an interview. He said the Vatican will benefit from Italy’s solar incentives that include requiring local utilities to buy sun power at above-market prices.

Inside Track?

The Vatican hasn’t decided how much to rely on photovoltaic panels, which turn sunlight directly into electricity, and on thermal devices that heat water for generators, Solarworld Chief Executive Officer Frank Asbeck said in a telephone interview from a holiday on the Mediterranean island of Ibiza.

The Bonn-based company in 2008 may have gained an inside track for a future contract by donating $1.5 million of panels for a 6,300-seat dome used for the pope’s weekly audiences to the world’s Roman Catholics.

A public tender for suppliers and builders is likely, Asbeck said. If the project goes ahead, “We’re quite confident we’ll get the job,” he said. German peer SMA Solar Technology AG, based in Niestetal, also donated equipment last year -- rooftop collectors for the audience hall. That benefited SMA Solar’s brand recognition in Italy, CEO Guenther Cramer said.

Including installation costs and products from other suppliers as a “turn-key project,” a 100-megawatt plant would cost 350 million euros to 400 million euros to build if it only used Solarworld modules, Asbeck said.

Polluting, Sinning

The Germany-born Benedict has been outspoken on environmental issues since becoming pope in 2005. During an address for World Peace Day in 2006, he said: “The destruction of the environment, its improper or selfish use, and the violent hoarding of the Earth’s resources cause grievances, conflicts and wars, precisely because they are the consequences of an inhumane concept of development.”

The Vatican listed pollution as one of seven “social” sins in an effort last year to update the cardinal vices that date to the 6th century.

“You offend God not only by stealing, taking the Lord’s name in vain or coveting your neighbor’s wife but also by wrecking the environment,” Bishop Gianfranco Girotti, head of the Apostolic Penitentiary, said then.

More recently the Vatican has put words into actions.

The 5,000-square-meter roof of the Paul VI auditorium -- built in 1971 by Pier Luigi Nervi, the architect who designed Milan’s Pirelli Tower -- was covered with 2,400 solar panels to produce 300 megawatt hours of energy a year, enough for 100 households, cutting carbon-dioxide emissions by about 225 tons.

Sun-Powered Cafeteria

A large electronic board hangs by the entrance of the Nervi hall that counts the kilowatt hours generated and amount of carbon dioxide saved.

The Vatican’s 300-seat cafeteria for staff will be decked out this summer with a solar-heating system to provide air conditioning and heating for the whole building. Kloben Solar Evolution, based in Verona, won the 300,000-euro contract to install thermal collectors, Villarini said.

The pope’s own Castel Gandolfo summer residence, a 17th- century palace south of Rome in the Alban hills, may be the site of a renewable-energy project to break down biodegradable waste material to produce methane and gas. The Vatican’s engineers are conducting a feasibility study on this.

Horse-Stable Power

“We are not thinking only in terms of solar energy but also energy that can be produced from the gasification of natural products,” Lajolo said in a March 20 interview. “So everything that comes from the stable or from hay.”

The most ambitious plan is for the countryside around Santa Maria di Galeria, land that Italy donated to the church and is about 10 times the size of the Holy See. The solar station planned there should reduce about 91,000 tons of carbon-dioxide emissions a year that otherwise would have been produced by fossil-fuel generators, Villarini told Bloomberg.

“If we solve the environmental problem, the benefits are immeasurable,” Lajolo said. “They can cost a bit to be implemented but when they are, they generate incomparable savings if you consider the expense needed to produce oil.”


Tuesday, April 21, 2009

Graphene Ribbons Mastered

More very good news here regarding the ongoing development of graphene technology. It now appears plausible to mass-produce graphene ribbons.

Although not yet in sheets, this may be just as good for most applications.

Now if we can figure out how to directionalize them and link edges, we might yet be able to produce fibers that could form an incredibly strong cable.

This is still a good start and it certainly opens the door for computer chip manufacture.

It is remarkable that we have come so far from the first recognition of Buckyballs in candle soot.



Making Nanoribbons From Sliced Open Nanotubes


http://www.spacemart.com/reports/Making_Nanoribbons_From_Sliced_Open_Nanotubes_999.html

by Staff Writers
Stanford CA (SPX) Apr 20, 2009

A world of potential may lie tied up in graphene nanoribbons, particularly for electronics applications. But researchers have been hampered in their efforts to fully explore that potential because they had no reliable way of creating the large quantities of uniform nanoribbons needed to conduct extensive studies.

Now a team at Stanford University under Hongjie Dai has developed a new method that will allow relatively precise production of mass quantities of the tiny ribbons by slicing open carbon nanotubes.

It is relatively easy to produce fairly uniform carbon nanotubes in large numbers. But being the tiny, delicate structures that they are, slicing open nanotubes requires a tender touch. "The key is to be able to open up the tubes without destroying the whole structure," Dai said. "I mean, it doesn't have any zipper on it, right?"

Dai's method effectively creates the needed zipper. Carbon nanotubes are placed on a substrate, then coated with a polymer film. The film covers the entire surface of each nanotube, save for a thin strip where the nanotube is in contact with the substrate.

The film is easily peeled off from the substrate, taking along all the nanotubes and exposing the thin strip of polymer-free surface on each of them. A chemical etching process using plasma can then slice open each nanotube along that narrow strip. It's not unlike generating flat linguini noodles by slicing open bucatini, a long tubular pasta.

The process works not only on single-layer carbon nanotubes, but also on nanotubes with concentric layers of nanotubes, allowing each layer to be sliced open along the same "dotted line." The work is detailed in a paper published in the April 16, 2009 issue of Nature. Dai, the J.G. Jackson and C.J. Wood Professor of Chemistry, is the senior author of the paper.

Given all the other methods of nanoribbon production that have been tried - lithography, chemical reactions and ultrasound-influenced chemistry - all of which failed to produce the needed quantity or quality of graphene nanoribbons, Dai's method is surprisingly simple. "Once we overcame the hurdle of how to unzip the nanotubes, everything seemed so obvious," he said. "It is one of those things where you go, 'why didn't I think of that earlier?'"

In addition to being fairly straightforward and easy to do, the process can be extremely efficient. "We can open up every carbon nanotube at the same time and convert many nanotubes into ribbons at the same time," Dai said.

Depending on how large a surface they cover with nanotubes - anything from a chip to a wafer - Dai said his team can create anywhere from one to tens of thousands of graphene nanoribbons at a time. The ribbons can easily be removed from the polymer film and transferred onto any other substrate, making it easy to create items such as graphene transistors, which may hold promise as a way to possibly make high performance electronic devices.

"How much better computer chips using graphene nanoribbons would be than silicon chips is an open question," Dai said. "But there is definite potential for them to give a very good performance."

Another advantage of Dai's method is that the edges of the nanoribbons produced are fairly smooth, which is critical to having them perform well in electronics applications.

The next step in the team's research is to better characterize the ribbons and try to refine their control of the production process. Dai said it is important to control the width of the ribbon and the edges of the structure of the ribbon, as those things could potentially affect the electrical properties of the ribbons and any device in which they are used.

Dai said that graphene nanoribbons have other uses in addition to potential electronics applications.

"It is a very nice system to study nanoscale phenomena, in general," he said. "This method now opens up all these things that we can explore."

Liying Jiao, a postdoctoral researcher in the chemistry department, and Li Zhang, a graduate student in chemistry, are co-first authors of the Nature paper and contributed equally to this work. Xinran Wang, a graduate student in physics, and Georgi Diankov, a graduate student in chemistry, also are authors of the paper.

Early Oxygen Reappraisal

This is a surprise.

This is telling us that the earliest life forms on Earth were not barred from producing oxygen.

Perhaps this is just as well. I am more and more inclined to suspect that seed lifeforms presently fill the universe and get free passage in interstellar dust. When this was first suggested, we knew nothing about the real limits of such lifeforms. After all, if you assume that everything evolved on Earth, there is no need to push the limits. Yet we keep finding life in impossible places, even on Earth.

It makes things a lot easier if, as a new planet settles down, it becomes infected by incoming dust that quickly delivers a palette of useful terraforming cells. The low-oxygen conjecture did not quite prevent this from happening, but it surely made it a lot harder and distorted things toward an alternative chemistry.

So far we have had sniffs from space suggesting that this might be the way of things. It really makes too much sense, and even galaxies share dust. This way, active life forms would need to evolve only once at the beginning of the universe in order to dominate throughout a large fraction of it thereafter.

Although this is proof of nothing, it argues that a seed kit is sufficient on every planet and no special evolution is needed after the seeding step.

When mankind terraforms Venus we will certainly speed the process up in order to see the benefits sooner rather than later. However, it seems likely that the machinery is already in place, awaiting some water and methane.



Origins Of Sulfur In Rocks Tells Early Oxygen Story

by Staff Writers
Baltimore MD (SPX) Apr 20, 2009

http://www.terradaily.com/reports/Origins_Of_Sulfur_In_Rocks_Tells_Early_Oxygen_Story_999.html

Sedimentary rocks created more than 2.4 billion years ago sometimes have an unusual sulfur isotope composition thought to be caused by the action of ultraviolet light on volcanically produced sulfur dioxide in an oxygen-poor atmosphere.

Now a team of geochemists can show an alternative origin for this isotopic composition that may point to an early, oxygen-rich atmosphere.

"The significance of this finding is that an abnormal isotope fractionation (of sulfur) may not be linked to the atmosphere at all," says Yumiko Watanabe, research associate, Penn State. "The strongest evidence for an oxygen poor atmosphere 2.4 billion years ago is now brought into question."

The researchers, who also include James Farquhar, associate professor of geology, University of Maryland and Hiroshi Ohmoto, professor of geoscience, Penn State, present the possibility that the rocks with an anomalous sulfur isotope fractionation came from locations on the ocean floor where hydrothermal fluids seeped up from submarine vents through organic carbon rich sediments and mixed with the ocean water.

Watanabe used laboratory experimentation to test their theory and report on the results in Science.

Chemical elements often have more than one form. While the numbers of protons and electrons are the same, the element may have forms with a greater or lesser number of neutrons and consequently a different atomic weight. Sulfur has four naturally occurring isotopes, none of which are radioactive.

Although 95 percent of sulfur has an atomic weight of 32, the other 5 percent is composed of sulfur with atomic weights of 33, 34 or 36.

The relationship between the amounts of 33, 34 and 36 is predictable based on the differences in their weights, but in the early rocks examined, the relationship was often anomalous. Other scientists have previously determined that the sulfur dioxide-ultraviolet light reaction in the absence of oxygen can produce the anomalous isotope fractionation.
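The "anomalous" relationship the article describes is conventionally quantified as a departure from the mass-dependent fractionation line relating the delta values. A sketch of that bookkeeping, with invented sample values for illustration:

```python
# Quantifying a sulfur isotope anomaly: mass-dependent processes keep
# d33S tied to d34S through an exponent of ~0.515; capital Delta-33S
# measures the departure from that reference curve.
def cap_delta_33(d33s, d34s):
    """Delta-33S in per mil, relative to the mass-dependent reference curve."""
    return d33s - 1000.0 * ((1.0 + d34s / 1000.0) ** 0.515 - 1.0)

# Invented sample values (per mil):
mass_dependent = cap_delta_33(d33s=5.15, d34s=10.0)  # sits on the 0.515 line
anomalous = cap_delta_33(d33s=8.0, d34s=10.0)        # Archean-style departure
print(f"Mass-dependent sample: Delta-33S = {mass_dependent:+.2f} per mil")
print(f"Anomalous sample:      Delta-33S = {anomalous:+.2f} per mil")
```

A Delta-33S near zero is the "predictable" case; the nonzero value is the kind of signal the Penn State experiments reproduced without any atmospheric photochemistry.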

Watanabe looked at samples of amino acids and sodium sulfur compounds to try to recreate the anomalous sulfur isotope composition in another way. She chose amino acids as a proxy for organic material because the anomalous sulfur isotopes often come from sedimentary rock, black shale, that also contains abundant mature kerogen - a mixture of organic compounds. She chose sodium compounds because of the large amounts of sodium and sulfate in the ocean.

Initial experiments used two amino acids - alanine and glycine - and sodium sulfite, which is less oxidized than sulfate. When heated, these did not produce abnormal fractionation.

Watanabe then tested five amino acids, adding histidine, arginine and tryptophan, and mixed them with sodium sulfate. In this case, alanine and glycine produced the anomalous isotope composition found in the rocks. In all, she ran 32 series of experiments with more than 100 individual samples.

"At high temperatures it sometimes took 24 hours for the sulfate to reduce to sulfide," said Watanabe. "At lower temperatures it took about two months, 1,000 hours. I ran the experiments until I had enough product to test the isotopic distribution."

Although Watanabe captured the sulfur from the experiments as hydrogen sulfide gas, she converted it to silver sulfide for analysis because it is easier to work with a solid than a gas.

"People never thought that anomalous sulfur isotope fractionation could be caused by a process other than atmospheric reactions," said Ohmoto. "Our study significantly shifts possibilities to something different, to a biological and thermal regime. There are now at least two ways that the anomalous sulfur isotope fractionation seen in some rocks could be achieved."

While sulfate-reducing bacteria do not produce anomalous isotope relationships, the remains of simple organisms coupled with thermal sulfate reduction does produce the anomalous isotope signature.

The researchers plan to look next at dead cyanobacteria - blue-green algae - to see if their organic material will fuel the thermal reaction to produce anomalous sulfur isotope relationships.

Antarctic Ice is Growing

Both stories are short, so I am quoting both to get the most information. It boils down to the fact that the bulk of our land ice is stable and even getting colder. It will take a great deal more real global warming to melt enough ice to ever affect sea levels, and this ice mass looks very much like the last that would ever be affected.

Anyway, this is a welcome update to the data, particularly in view of the commotion of the past few years. I had little doubt that the outcome would be exactly as advertised, but it is always nice to see it confirmed.

Now we will wait to see if the Arctic sea ice makes a significant recovery this year. The press is still reacting to the retreat brought on in 2007. This winter was cold and we should get significant regrowth unless open-water warming is much more effective than expected.

Western Antarctica and Greenland represent only a small fraction of the available land ice and are obviously the only blocks realistically exposed to any melting whatsoever. Thus, whatever the pundits have to say, the chance of a significant rise in sea level is presently remote.

Antarctic ice growing, not shrinking

18 Apr 2009

http://economictimes.indiatimes.com/Earth/Antarctic-ice-growing/articleshow/4418558.cms


SYDNEY: New analysis has indicated that contrary to the belief that there is large-scale melting of ice over most of Antarctica, ice is actually expanding in a large portion of the continent.
Antarctica has 90 per cent of the Earth's ice and 80 per cent of its fresh water. Extensive melting of Antarctic ice sheets would be required to raise sea levels substantially, and ice is melting in parts of west Antarctica.
The destabilization of the Wilkins ice shelf generated international headlines this month.
However, according to a report in the Australian, the picture is very different in east Antarctica, which includes the territory claimed by Australia. East Antarctica is four times the size of west Antarctica and parts of it are cooling.
The Scientific Committee on Antarctic Research report noted that the South Pole had shown "significant cooling in recent decades". According to Australian Antarctic Division glaciology program head Ian Allison, sea ice losses in west Antarctica over the past 30 years had been more than offset by increases in the Ross Sea region, just one sector of east Antarctica.
"Sea ice conditions have remained stable in Antarctica generally," Dr Allison said. The melting of sea ice - fast ice and pack ice - does not cause sea levels to rise because the ice is in the water.
Sea levels may rise with losses from freshwater ice sheets on the polar caps. In Antarctica, these losses are in the form of icebergs calved from ice shelves formed by glacial movements on the mainland.
Dr Allison said there was not any evidence of significant change in the mass of ice shelves in east Antarctica nor any indication that its ice cap was melting. "The only significant calvings in Antarctica have been in the west," he said.
Ice core drilling in the fast ice off Australia's Davis Station in East Antarctica by the Antarctic Climate and Ecosystems Co-Operative Research Centre shows that last year, the ice had a maximum thickness of 1.89m, its densest in 10 years. The average thickness of the ice at Davis since the 1950s is 1.67m.
A paper to be published soon by the British Antarctic Survey in the journal Geophysical Research Letters is expected to confirm that over the past 30 years, the area of sea ice around the continent has expanded.

Revealed: Antarctic ice growing, not shrinking

Greg Roberts April 18, 2009

Article from:
The Australian

ICE is expanding in much of Antarctica, contrary to the widespread public belief that global warming is melting the continental ice cap.

The results of ice-core drilling and sea ice monitoring indicate there is no large-scale melting of ice over most of Antarctica, although experts are concerned at ice losses on the continent's western coast.

Antarctica has 90 per cent of the Earth's ice and 80 per cent of its fresh water. Extensive melting of Antarctic ice sheets would be required to raise sea levels substantially, and ice is melting in parts of west Antarctica. The destabilisation of the Wilkins ice shelf generated international headlines this month.

However, the picture is very different in east Antarctica, which includes the territory claimed by Australia.

East Antarctica is four times the size of west Antarctica and parts of it are cooling. The Scientific Committee on Antarctic Research report prepared for last week's meeting of Antarctic Treaty nations in Washington noted the South Pole had shown "significant cooling in recent decades".

Australian Antarctic Division glaciology program head Ian Allison said sea ice losses in west Antarctica over the past 30 years had been more than offset by increases in the Ross Sea region, just one sector of east Antarctica.

"Sea ice conditions have remained stable in Antarctica generally," Dr Allison said.

The melting of sea ice -- fast ice and pack ice -- does not cause sea levels to rise because the ice is in the water. Sea levels may rise with losses from freshwater ice sheets on the polar caps. In Antarctica, these losses are in the form of icebergs calved from ice shelves formed by glacial movements on the mainland.

Last week, federal Environment Minister Peter Garrett said experts predicted sea level rises of up to 6m from Antarctic melting by 2100, but the worst case scenario foreshadowed by the SCAR report was a 1.25m rise.

Mr Garrett insisted global warming was causing ice losses throughout Antarctica. "I don't think there's any doubt it is contributing to what we've seen both on the Wilkins shelf and more generally in Antarctica," he said.

Dr Allison said there was not any evidence of significant change in the mass of ice shelves in east Antarctica nor any indication that its ice cap was melting. "The only significant calvings in Antarctica have been in the west," he said. And he cautioned that calvings of the magnitude seen recently in west Antarctica might not be unusual.

"Ice shelves in general have episodic calvings and there can be large icebergs breaking off -- I'm talking 100km or 200km long -- every 10 or 20 or 50 years."

Ice core drilling in the fast ice off Australia's Davis Station in East Antarctica by the Antarctic Climate and Ecosystems Co-Operative Research Centre shows that last year, the ice had a maximum thickness of 1.89m, its densest in 10 years. The average thickness of the ice at Davis since the 1950s is 1.67m.

A paper to be published soon by the British Antarctic Survey in the journal Geophysical Research Letters is expected to confirm that over the past 30 years, the area of sea ice around the continent has expanded.

Monday, April 20, 2009

Bussard Polywell Fusion Funded

The news is that the Bussard polywell fusion device is now attracting a major (for it) tranche of funding. This must also be an indicator that every possible fusion energy strategy is now getting a fair hearing and real support. The polywell pioneered by the late Dr Bussard had, over a twenty-year span, never received more than a pittance, and that perhaps two or three times. The sheer weight of time ate up any available capital.

This report tells us that Bussard is sorely missed. It also tells us that scaling is difficult, although I am not sure whether having a larger device makes things easier, harder, or just different, with more variation emerging.

We now have several fusion programs modestly funded, including work on cold fusion. We are going to have news on fusion work streaming out over the next twelve months. This is a radical departure from past practice in physics, which has obviously denigrated small-budget attempts.

This is welcome news, as is the sudden burst of interest emerging around cold fusion. The lack of neutrons had killed that approach more surely than any other issue. The polywell is a simple device that needs to be tested over several size configurations in order to perfect the theory itself. That could then lead to an efficiency breakthrough.

In other words, it is the type of program that you feed two million dollars plus fresh talent every year to progressively advance the knowledge. Past work has consisted of perhaps two such rounds stretched out over twenty years. I hate when that happens. I have a drawer full of such projects, mostly mundane, but needing just that.

I can see the navy being very keen on this technology working. In retrospect they are even the best partner. After all, they have a natural heat sink available to dispose of surplus energy.

April 16, 2009

Plasma Fusion (Polywell) Demonstrate fusion plasma confinement system for shore and shipboard applications; Joint OSD/USN project. 2.0 [million]

Introduction to Bussard Fusion

This site has covered IEC (Bussard) Fusion many times. Bottom line is that it is one of the most promising technologies for achieving cheap, clean and non-controversial energy within ten years. Success would alter energy production, the world economy, propulsion of ships and other vehicles and enable inexpensive access to space.

IEC fusion uses magnets to contain an electron cloud in the center. It is a variation on the electron gun and vacuum tube of television technology. The fuel (deuterium, or lithium or boron) is then injected as positive ions. The positive ions are attracted to the high negative charge and accelerate to a speed sufficient for fusion. Speed and electron-volt energy can be converted to temperature; the energy of the electrons hitting a TV screen, converted from electron volts, is equivalent to roughly 200 million degrees.
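The electron-volt-to-temperature conversion mentioned above can be sketched directly; a minimal sketch using T = E / k_B, where 1 eV corresponds to about 11,605 K (the beam energy below is a hypothetical figure chosen to land near the quoted 200 million degrees, not a value from the article):

```python
# Sketch: converting particle energy in electron volts to an equivalent
# temperature via T = E / k_B; 1 eV corresponds to roughly 11,605 K.
EV_TO_KELVIN = 11_605.0

def ev_to_kelvin(energy_ev):
    """Equivalent temperature (K) of a particle with the given energy (eV)."""
    return energy_ev * EV_TO_KELVIN

# A ~17 keV electron beam (hypothetical TV-gun-scale energy) comes out
# near the "200 million degrees" quoted in the text:
print(round(ev_to_kelvin(17_000) / 1e6))   # ~197 million K
```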

The old problem was that with a physical grid in the center, ions would collide with the grid, and the very best you could do was about 2% electron losses (a 98% efficiency limit). With those kinds of losses net power is impossible. Losses have to get below 1 part in 100,000 (99.999% efficiency) to get net power.

Bussard's system uses magnets on the outside to contain the electrons and have the electrons go around and around 100,000 times before being lost outside the magnetic field.

The fuel either comes in as ions from an ion gun, or it comes in without a charge and some of it is ionized by collisions with the madly spinning electrons. The fuel is affected by the same forces as the electrons but a little differently, because it is going much slower: about 64 times slower in the case of deuterium fuel (a hydrogen with one neutron). These positively charged deuterium ions are attracted to the virtual electrode (the electron cloud) in the center of the machine, so they come rushing in. If they come rushing in fast enough and hit each other just about dead on, they join together to make a He3 nucleus (two protons and a neutron) and give off a high-energy neutron.

Ions that miss will go rushing through the center and then head for one of the grids. When the voltage field they have traveled through equals the energy they had at the center of the machine, the ions have given up their energy to the grids (which repel them). They then head back to the center of the machine, where they have another chance at hitting another ion at high enough speed, and close enough, to cause a fusion.
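The energy released by the D + D reaction described above can be checked from the mass defect; a minimal sketch using standard reference atomic masses:

```python
# Sketch: energy released by the D + D -> He-3 + n branch described above,
# computed from the mass defect. Atomic masses are standard reference
# values in unified mass units (u).
U_TO_MEV = 931.494          # energy equivalent of 1 u, in MeV
m_D   = 2.014102            # deuterium atom
m_He3 = 3.016029            # helium-3 atom
m_n   = 1.008665            # free neutron

delta_m = 2 * m_D - (m_He3 + m_n)   # mass defect in u
energy_mev = delta_m * U_TO_MEV
print(round(energy_mev, 2))         # about 3.27 MeV per reaction
```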

Discussion Board Technical Details From IEC Fusion Research Lead Dr Nebel

Some technical comments from Dr Nebel

A few comments on scaling laws….


To a certain extent we are in the same boat as everyone else as far as the previous experiments go, since Dr. Bussard’s health was not good when we started this program and he died before we had a chance to discuss the previous work in any detail. Consequently, we have had to use our own judgement as to what we believe from the earlier experiments and what we think may be questionable. Here’s how we look at it:

1. We don’t rely on any scaling results from small devices. The reason for this is that these devices tend to be dominated by surface effects (such as outgassing) and it’s difficult to control the densities in the machines. This is generally true for most plasma devices, not just Polywells.


2. Densities for devices prior to the WB-7 were surmised by measuring the total light output with a PMT and assuming that the maximum occurred when beta = 1. We’re not convinced that this is reliable. Consequently, we have done density interferometry on the WB-7. We chose this diagnostic for the WB-7 because we knew through previous experience that we could get it operational in a few months (unlike Thomson scattering, which by our experience takes more than a man-year of effort and requires a laser, which was outside of our budget), and density is always the major issue with electrostatic confinement. This is particularly true for Polywells, which should operate in the quasi-neutral limit where Debye lengths are smaller than the device size.

3. As discussed by several people earlier, power output for a constant beta device should scale like B**4*R**3. All fusion machines scale this way at constant beta. Input power scales like the losses. This is easy to derive for the wiffleball, and I’ll leave that as an “exercise to the reader”. This is the benchmark that we compare the data to.
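The constant-beta scaling Dr. Nebel cites can be made concrete with a one-line sketch (the reference device values are arbitrary normalizations, not WB-7 figures):

```python
# Sketch of the constant-beta scaling quoted above: fusion power output
# scales like B**4 * R**3 for magnetic field B and device radius R.
def relative_power(B, R, B0=1.0, R0=1.0):
    """Power of a device (B, R) relative to a reference device (B0, R0)."""
    return (B / B0) ** 4 * (R / R0) ** 3

# Doubling both field and radius multiplies output by 2**4 * 2**3 = 128:
print(relative_power(2.0, 2.0))   # 128.0
```

This steep scaling is why results from small devices say so little about a reactor-scale machine.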


4. As for Mr. Tibbet’s questions relating to alpha ash, these devices are non-ignited (i.e. very little alpha heating) since the alpha particles leave very quickly through the cusps. If you want to determine if the alphas hit the coils, the relevant parameter is roughly the comparison of the alpha Larmor radius to the width of the confining magnetic field layer. I’ll leave that as an “exercise to the reader” as well.


Loss fraction = (summation (pi*rl**2))/(4*pi*R**2) where rl is the electron gyroradius and R is the coil radius. The summation is a summation over each of the point cusps. If you calculate rl from one of the coil faces, then there are "effectively" ~ 10 point cusps (fields are larger in the corners than the faces). The factor that your observed confinement exceeds this model is then lumped together as the cusp recycle factor.
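The loss-fraction formula above transcribes directly into code; a minimal sketch in which the gyroradius and coil radius inputs are placeholder illustrations, not measured WB-7 values:

```python
import math

# Direct transcription of the point-cusp loss model quoted above:
#   loss fraction = sum(pi * rl**2) / (4 * pi * R**2)
# with ~10 effective point cusps, electron gyroradius rl, coil radius R.

def cusp_loss_fraction(rl, R, n_cusps=10):
    """Fraction of the bounding sphere area covered by cusp 'holes'."""
    return (n_cusps * math.pi * rl ** 2) / (4.0 * math.pi * R ** 2)

# e.g. a 1 mm gyroradius on a 0.15 m coil radius (placeholder numbers):
print(cusp_loss_fraction(1e-3, 0.15))   # ~1.1e-4
```

Observed confinement better than this estimate gets lumped into the cusp recycle factor, as Dr. Nebel notes.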

The other model is to look at mirror motion along field lines. For this model you look at loss cones and assume that the electrons effectively scatter every time they pass through the field null region. This model describes the confinement which was observed on the DTI machine in the late 80s.

I don't know how to predict cross-field diffusion on these devices. The gradient scale lengths of the magnetic fields are smaller than the larmor radii and the electrostatic fields should give rise to large shear flows. On top of that, the geometry is 3-D.


The mirror model is a bit of a handwaving model that I believe Nick Krall came up with. The mirror ratio is calculated from the field where the electron Larmor radius is on the order of the device size. Any smaller field than that will not have adiabatic motion. If particles enter the field null region, it is assumed that they effectively scatter. I believe that Dave Anderson at LLNL did a fair amount of particle tracing calculations for FRMs in the late 70s, and not surprisingly saw jumps in the adiabatic invariants when moving through field null regions. I presume similar behavior was observed on FRC simulations. Anyway, it's a ballpark model.

My other comment was related to electrons trapped in the wiffleball. Over most of their orbit there is little or no magnetic field (i.e. Larmor radius bigger than the device size) with the electrons turning when they hit the barrier magnetic field. The electron behavior is stochastic since there are no invariants. We don't have any direct measure of the internal magnetic fields, but we do know the density and have a pretty good idea what the electron energy is. High beta discharges should expel the magnetic field. The vacuum fields should be in a mirror regime (as was the DTI device) while the wiffleball fields should transition to better confinement. There is about 3 orders of magnitude difference in the predicted confinement times so it's pretty easy to see which regime the device operates in (unless, of course, the cusp recycle is truly enormous).


As you suggest, Bohm diffusion is kind of a catch-all for any kind of confinement you don't understand. We hope we don't end up there, and so far we're OK.


If you are interested in pumps, the specifications for ITER can be found at:

http://www.iter.org/a/index_nav_4.htm. If I am reading this correctly, the pumping speed is about 60,000 liters/second. This is ~30 times more than the WB-7. It doesn't take a lot of power. Our system takes ~500 watts of power. ITER probably requires 10-20 kW.

Tower of Basel

The global economy has suffered from a halving of total credit, largely brought on by the collapse of confidence in the US credit system, which also revealed the profound weakness of the EU credit system. The mechanism for creating fiat currency and credit was thereby choked, and this has led to a collapse in money velocity.


This article reveals that need is forcing the creation of a fresh fiat currency system, and ultimately a fresh fiat credit system. That established institutions are stepping into the breach is more an indication of the need to fill the vacuum, and must not be mistaken for a well-thought-out long-term plan.


The USA is working through an internal credit disaster. The problem is that an equally large disaster was manufactured offshore on the basis of securitized US credit. Their system disappeared also.


We need a global credit system that accepts an institutional multiple of twelve to one, and that means accepting a banking system with a modest internal rate of return. No owner will ever gladly accept that, but that must be the price of a government guarantee.

And the price of being too big to fail, because that is also a government guarantee.


While the system is contracting, we will be seeing these moves to shore up the remnants of the old credit system to give us time to sort everything out.


The Tower of Basel: Secretive Plans for the Issuing of a Global Currency

Do we really want the Bank for International Settlements (BIS) issuing our global currency?


By Ellen Brown

Global Research, April 18, 2009

URL of this article:
www.globalresearch.ca/index.php?context=va&aid=13239

In an April 7 article in The London Telegraph titled “The G20 Moves the World a Step Closer to a Global Currency,” Ambrose Evans-Pritchard wrote:

“A single clause in Point 19 of the communiqué issued by the G20 leaders amounts to revolution in the global financial order.

“We have agreed to support a general SDR allocation which will inject $250bn (£170bn) into the world economy and increase global liquidity,” it said. SDRs are Special Drawing Rights, a synthetic paper currency issued by the International Monetary Fund that has lain dormant for half a century.


“In effect, the G20 leaders have activated the IMF's power to create money and begin global ‘quantitative easing'. In doing so, they are putting a de facto world currency into play. It is outside the control of any sovereign body. Conspiracy theorists will love it.”


Indeed they will. The article is subtitled, “The world is a step closer to a global currency, backed by a global central bank, running monetary policy for all humanity.” Which naturally raises the question, who or what will serve as this global central bank, cloaked with the power to issue the global currency and police monetary policy for all humanity? When the world's central bankers met in Washington last September, they discussed what body might be in a position to serve in that awesome and fearful role. A former governor of the Bank of England stated:


“[T]he answer might already be staring us in the face, in the form of the Bank for International Settlements (BIS).... The IMF tends to couch its warnings about economic problems in very diplomatic language, but the BIS is more independent and much better placed to deal with this if it is given the power to do so.”[1]



And if that vision doesn't alarm conspiracy theorists, it should. The BIS has been called “the most exclusive, secretive, and powerful supranational club in the world.” Founded in Basel, Switzerland, in 1930, it has been scandal-ridden from its beginnings. According to Charles Higham in his book Trading with the Enemy, by the late 1930s the BIS had assumed an openly pro-Nazi bias. This was corroborated years later in a BBC Timewatch film titled “Banking with Hitler,” broadcast in 1998.[2] In 1944, the American government backed a resolution at the Bretton Woods Conference calling for the liquidation of the BIS, following Czech accusations that it was laundering gold stolen by the Nazis from occupied Europe; but the central bankers succeeded in quietly snuffing out the American resolution.[3]




In Tragedy and Hope: A History of the World in Our Time (1966), Dr. Carroll Quigley revealed the key role played in global finance by the BIS behind the scenes. Dr. Quigley was Professor of History at Georgetown University, where he was President Bill Clinton's mentor. He was also an insider, groomed by the powerful clique he called “the international bankers.” His credibility is heightened by the fact that he actually espoused their goals. He wrote:



“I know of the operations of this network because I have studied it for twenty years and was permitted for two years, in the early 1960's, to examine its papers and secret records. I have no aversion to it or to most of its aims and have, for much of my life, been close to it and to many of its instruments. ... [I]n general my chief difference of opinion is that it wishes to remain unknown, and I believe its role in history is significant enough to be known.”




Quigley wrote of this international banking network:



“[T]he powers of financial capitalism had another far-reaching aim, nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole. This system was to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements arrived at in frequent private meetings and conferences. The apex of the system was to be the Bank for International Settlements in Basel, Switzerland, a private bank owned and controlled by the world's central banks which were themselves private corporations.”



The key to their success, said Quigley, was that the international bankers would control and manipulate the money system of a nation while letting it appear to be controlled by the government. The statement echoed an often-quoted one made by the German patriarch of what would become the most powerful banking dynasty in the world. Mayer Amschel Bauer Rothschild famously said in 1791:




“Allow me to issue and control a nation's currency, and I care not who makes its laws.”



Mayer's five sons were sent to the major capitals of Europe – London, Paris, Vienna, Berlin and Naples – with the mission of establishing a banking system that would be outside government control. The economic and political systems of nations would be controlled not by citizens but by bankers, for the benefit of bankers. Eventually, a privately-owned “central bank” was established in nearly every country; and this central banking system has now gained control over the economies of the world. Central banks have the authority to print money in their respective countries, and it is from these banks that governments must borrow money to pay their debts and fund their operations. The result is a global economy in which not only industry but government itself runs on “credit” (or debt) created by a banking monopoly headed by a network of private central banks; and at the top of this network is the BIS, the “central bank of central banks” in Basel.



Behind the Curtain



For many years the BIS kept a very low profile, operating behind the scenes in an abandoned hotel. It was here that decisions were reached to devalue or defend currencies, fix the price of gold, regulate offshore banking, and raise or lower short-term interest rates. In 1977, however, the BIS gave up its anonymity in exchange for more efficient headquarters. The new building has been described as “an eighteen-story-high circular skyscraper that rises above the medieval city like some misplaced nuclear reactor.” It quickly became known as the “Tower of Basel.” Today the BIS has governmental immunity, pays no taxes, and has its own private police force.[4] It is, as Mayer Rothschild envisioned, above the law.




The BIS is now composed of 55 member nations, but the club that meets regularly in Basel is a much smaller group; and even within it, there is a hierarchy. In a 1983 article in Harper's Magazine called “Ruling the World of Money,” Edward Jay Epstein wrote that where the real business gets done is in “a sort of inner club made up of the half dozen or so powerful central bankers who find themselves more or less in the same monetary boat” – those from Germany, the United States, Switzerland, Italy, Japan and England. Epstein said:



“The prime value, which also seems to demarcate the inner club from the rest of the BIS members, is the firm belief that central banks should act independently of their home governments... . A second and closely related belief of the inner club is that politicians should not be trusted to decide the fate of the international monetary system.”



In 1974, the Basel Committee on Banking Supervision was created by the central bank Governors of the Group of Ten nations (now expanded to twenty). The BIS provides the twelve-member Secretariat for the Committee. The Committee, in turn, sets the rules for banking globally, including capital requirements and reserve controls. In a 2003 article titled “The Bank for International Settlements Calls for Global Currency,” Joan Veon wrote:




“The BIS is where all of the world's central banks meet to analyze the global economy and determine what course of action they will take next to put more money in their pockets, since they control the amount of money in circulation and how much interest they are going to charge governments and banks for borrowing from them... .

“When you understand that the BIS pulls the strings of the world's monetary system, you then understand that they have the ability to create a financial boom or bust in a country. If that country is not doing what the money lenders want, then all they have to do is sell its currency.”[5]




The Controversial Basel Accords



The power of the BIS to make or break economies was demonstrated in 1988, when it issued a Basel Accord raising bank capital requirements from 6% to 8%. By then, Japan had emerged as the world's largest creditor; but Japan's banks were less well capitalized than other major international banks. Raising the capital requirement forced them to cut back on lending, creating a recession in Japan like that suffered in the U.S. today. Property prices fell and loans went into default as the security for them shriveled up. A downward spiral followed, ending with the total bankruptcy of the banks, which had to be nationalized – although that word was not used, in order to avoid criticism.[6]
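The arithmetic behind the lending squeeze described above is simple to sketch; a minimal illustration in which the capital figure is an arbitrary hypothetical, not data about Japan's banks:

```python
# Sketch of why raising a capital ratio from 6% to 8% forces lending
# cuts: maximum risk-weighted assets = capital / required ratio.
capital = 60.0                        # billions, hypothetical figure

max_assets_at_6pct = capital / 0.06   # 1000.0
max_assets_at_8pct = capital / 0.08   # 750.0

# A quarter of the lending capacity disappears with no loss of capital:
print(max_assets_at_6pct - max_assets_at_8pct)   # 250.0
```

The same capital supports 25% fewer assets, so banks must either raise capital or shrink their loan books.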



Among other collateral damage produced by the Basel Accords was a spate of suicides among Indian farmers unable to get loans. The BIS capital adequacy standards required loans to private borrowers to be “risk-weighted,” with the degree of risk determined by private rating agencies; and farmers and small business owners could not afford the agencies' fees. Banks therefore assigned 100 percent risk to the loans, and then resisted extending credit to these “high-risk” borrowers because more capital was required to cover the loans. When the conscience of the nation was aroused by the Indian suicides, the government, lamenting the neglect of farmers by commercial banks, established a policy of ending the “financial exclusion” of the weak; but this step had little real effect on lending practices, due largely to the strictures imposed by the BIS from abroad.[7]



Similar complaints have come from Korea. An article in the December 12, 2008 Korea Times titled “BIS Calls Trigger Vicious Cycle” described how Korean entrepreneurs with good collateral cannot get operational loans from Korean banks, at a time when the economic downturn requires increased investment and easier credit:



“‘The Bank of Korea has provided more than 35 trillion won to banks since September when the global financial crisis went full throttle,' said a Seoul analyst, who declined to be named. ‘But the effect is not seen at all with the banks keeping the liquidity in their safes. They simply don't lend and one of the biggest reasons is to keep the BIS ratio high enough to survive,' he said... .



“Chang Ha-joon, an economics professor at Cambridge University, concurs with the analyst. ‘What banks do for their own interests, or to improve the BIS ratio, is against the interests of the whole society. This is a bad idea,' Chang said in a recent telephone interview with Korea Times.”



In a May 2002 article in The Asia Times titled “Global Economy: The BIS vs. National Banks,” economist Henry C K Liu observed that the Basel Accords have forced national banking systems “to march to the same tune, designed to serve the needs of highly sophisticated global financial markets, regardless of the developmental needs of their national economies.” He wrote:



“[N]ational banking systems are suddenly thrown into the rigid arms of the Basel Capital Accord sponsored by the Bank of International Settlement (BIS), or to face the penalty of usurious risk premium in securing international interbank loans... . National policies suddenly are subjected to profit incentives of private financial institutions, all members of a hierarchical system controlled and directed from the money center banks in New York. The result is to force national banking systems to privatize ... .


“BIS regulations serve only the single purpose of strengthening the international private banking system, even at the peril of national economies... . The IMF and the international banks regulated by the BIS are a team: the international banks lend recklessly to borrowers in emerging economies to create a foreign currency debt crisis, the IMF arrives as a carrier of monetary virus in the name of sound monetary policy, then the international banks come as vulture investors in the name of financial rescue to acquire national banks deemed capital inadequate and insolvent by the BIS.”



Ironically, noted Liu, developing countries with their own natural resources did not actually need the foreign investment that had trapped them in debt to outsiders:



“Applying the State Theory of Money [which assumes that a sovereign nation has the power to issue its own money], any government can fund with its own currency all its domestic developmental needs to maintain full employment without inflation.”



When governments fell into the trap of accepting loans in foreign currencies, however, they became “debtor nations” subject to IMF and BIS regulation. They were forced to divert their production to exports, just to earn the foreign currency necessary to pay the interest on their loans. National banks deemed “capital inadequate” had to deal with strictures comparable to the “conditionalities” imposed by the IMF on debtor nations: “escalating capital requirement, loan writeoffs and liquidation, and restructuring through selloffs, layoffs, downsizing, cost-cutting and freeze on capital spending.” Liu wrote:



“Reversing the logic that a sound banking system should lead to full employment and developmental growth, BIS regulations demand high unemployment and developmental degradation in national economies as the fair price for a sound global private banking system.”



The Last Domino to Fall



While banks in developing nations were being penalized for falling short of the BIS capital requirements, large international banks managed to escape the rules, although they actually carried enormous risk because of their derivative exposure. The mega-banks succeeded in avoiding the Basel rules by separating the “risk” of default out from the loans and selling it off to investors, using a form of derivative known as “credit default swaps.”



However, it was not in the game plan that U.S. banks should escape the BIS net. When they managed to sidestep the first Basel Accord, a second set of rules was imposed known as Basel II. The new rules were established in 2004, but they were not levied on U.S. banks until November 2007, the month after the Dow passed 14,000 to reach its all-time high. The economy was all downhill from there. Basel II had the same effect on U.S. banks that Basel I had on Japanese banks: they have been struggling ever since to survive. [8]



Basel II requires banks to adjust the value of their marketable securities to the “market price” of the security, a rule called “mark to market.” [9] The rule has theoretical merit, but the problem is timing: it was imposed ex post facto, after the banks already had the hard-to-market assets on their books. Lenders that had been considered sufficiently well capitalized to make new loans suddenly found they were insolvent. At least, they would have been insolvent if they had tried to sell their assets, an assumption required by the new rule. Financial analyst John Berlau complained:



“The crisis is often called a ‘market failure,' and the term ‘mark-to-market' seems to reinforce that. But the mark-to-market rules are profoundly anti-market and hinder the free-market function of price discovery... . In this case, the accounting rules fail to allow the market players to hold on to an asset if they don't like what the market is currently fetching, an important market action that affects price discovery in areas from agriculture to antiques.” [10]
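The solvency flip described above is easy to see numerically. A bank carrying securities at historical book value can look comfortably capitalized, while marking the very same assets to a depressed market price pushes it below the regulatory minimum, or into outright insolvency. All figures, including the 15 percent haircut, are invented assumptions for illustration, not data from any actual bank.

```python
# Sketch: one balance sheet under book-value vs. mark-to-market accounting.
# All figures are invented for illustration.

MIN_RATIO = 0.08        # regulatory minimum of capital / assets (simplified)

liabilities = 920.0     # deposits and other obligations (millions)
book_value = 1000.0     # securities carried at historical cost (millions)
market_haircut = 0.15   # assume the market now bids 15% below book value

def capital_ratio(assets: float) -> float:
    """Capital ratio, treating all assets as 100% risk-weighted."""
    capital = assets - liabilities
    return capital / assets

for label, assets in [("book value", book_value),
                      ("mark-to-market", book_value * (1 - market_haircut))]:
    ratio = capital_ratio(assets)
    status = "adequate" if ratio >= MIN_RATIO else "INADEQUATE"
    print(f"{label:>15}: assets {assets:7.1f}M, ratio {ratio:6.1%} -> {status}")
```

At book value the bank shows 80M of capital against 1,000M of assets, exactly the 8 percent minimum; marked to market, its assets fall to 850M against 920M of liabilities, so capital is negative and the bank is insolvent on paper, even if it never intended to sell.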




Imposing the mark-to-market rule on U.S. banks caused an instant credit freeze, which proceeded to take down the economies not only of the U.S. but of countries worldwide. In early April 2009, the mark-to-market rule was finally softened by the U.S. Financial Accounting Standards Board (FASB); but critics said the modification did not go far enough, and it was done in response to pressure from politicians and bankers, not out of any fundamental change of heart or policies by the BIS.



And that is where the conspiracy theorists come in. Why did the BIS not retract or at least modify Basel II after seeing the devastation it had caused? Why did it sit idly by as the global economy came crashing down? Was the goal to create so much economic havoc that the world would rush with relief into the waiting arms of the BIS with its privately-created global currency? The plot thickens ... .




Ellen Brown developed her research skills as an attorney practicing civil litigation in Los Angeles. In Web of Debt, her latest book, she turns those skills to an analysis of the Federal Reserve and “the money trust.” She shows how this private cartel has usurped the power to create money from the people themselves, and how we the people can get it back. Her earlier books focused on the pharmaceutical cartel that gets its power from “the money trust.” Her eleven books include Forbidden Medicine, Nature's Pharmacy (co-authored with Dr. Lynne Walker), and The Key to Ultimate Health (co-authored with Dr. Richard Hansen). Her websites are www.webofdebt.com and www.ellenbrown.com.

Rasmussen Poll on Global Warming

Public opinion polls are a curious beastie that means little as far as the science itself is concerned. What is remarkable here is that the public is moving firmly away from any support for the global warming theory and is certain to punish politicians who hang onto the program. This has occurred in the face of almost uniformly supportive media coverage of the theory and no end of pro-theory propaganda. The anti-theory crowd has been fighting an uphill battle to get its side out, and its efforts have in fact been feeble.

Yet the public is in complete revolt if these numbers hold up. Of course, the cause of public skepticism is the unrelenting lack of cooperation from the weather. Two cold winters in a row and a complete reversal of the warming trend line are impossible to paper over, and the explanations offered have been shallow and not strongly promoted, since the proponents are now also covering their backsides.

I always thought that linking the weather to CO2 pollution was a strategic mistake. If the linkage visibly fails to hold up, as has abruptly happened, then it damages the underlying theory, which still needs support.

The arguments for managing CO2 production are excellent and well received. Yet we are all starting back at square one in building support.

There is a lesson here about the power of the media, and its real limitations, that has yet to become clear.

Energy Update
Only 34% Now Blame Humans for Global Warming

Friday, April 17, 2009

http://www.rasmussenreports.com/public_content/politics/environment/energy_update

Just one-out-of-three voters (34%) now believe global warming is caused by human activity, the lowest finding yet in Rasmussen Reports national surveying. However, a plurality (48%) of the Political Class believes humans are to blame.

Forty-eight percent (48%) of all likely voters attribute climate change to long-term planetary trends, while seven percent (7%) blame some other reason. Eleven percent (11%) aren’t sure.

These numbers reflect a reversal from a year ago, when 47% blamed human activity while 34% said long-term planetary trends.

Most Democrats (51%) still say humans are to blame for global warming, the position taken by former Vice President Al Gore and other climate change activists. But 66% of Republicans and 47% of adults not affiliated with either party disagree.

Sixty-two percent (62%) of all Americans believe global warming is at least a somewhat serious problem, with 33% who say it’s Very Serious. Thirty-five percent (35%) say it’s not a serious problem. The overall numbers have remained largely the same for several months, but the number who say Very Serious has gone down.

Forty-eight percent (48%) of Democrats say global warming is a Very Serious problem, compared to 19% of Republicans and 25% of unaffiliateds.


President Obama has made global warming a priority for his administration. Half (49%) of Americans think the president believes climate change is caused primarily by human activity. This is the first time that belief has fallen below 50% since the president took office. Just 19% say Obama attributes global warming to long-term planetary trends.

Forty-eight percent (48%) rate the president good or excellent on energy issues. Thirty-two percent (32%) give him poor grades in this area.

Sixty-three percent (63%) of adults now say finding new sources of energy is more important than reducing the amount of energy Americans currently consume. However, 29% say energy conservation is the priority.

A growing number of Americans (58%) say the United States needs to build more nuclear plants. This is up five points from last month and the highest finding so far this year. Twenty-five percent (25%) oppose the building of nuclear plants.

While the economy remains the top issue for most Americans, 40% believe there is a conflict between economic growth and environmental protection. Thirty-one percent (31%) see no such conflict, while 29% are not sure.


Sunday, April 19, 2009

Cold Fusion on 60 Minutes Tonight

As I posted a couple of weeks back, the recent announcements regarding what has been popularly described as cold fusion were game changers. An experiment was run that showed the existence of daughter neutrons in association with an unexplained increase in heat.


Having all this land on 60 Minutes so quickly will neutralize the decades-long bias working against cold fusion research and should spur an upsurge in new work. I for one would like to see dozens of programs launched.


Even if a working device is never made, the advance in knowledge will be awesome. Most of it will be cheap.


When the first announcements were made twenty years ago, the surplus heat production was small enough to allow for other alternative explanations including simple experimental error. Today we have deuterium, a metal powder and way too much heat and no chemistry. If we did not have a fusion pathway, this result would be a mystery for the ages and would turn over everything.


That we are looking at 2,500 times the input energy is a good sign that we will yet get a heat engine out of this. Yes, maybe it is time to get excited again.
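The "too much heat and no chemistry" argument comes down to a back-of-envelope energy scale: chemical reactions release at most a few electron-volts per atom, so if the unexplained heat works out to far more than that per atom of fuel, chemistry cannot be the source. The heat and loading figures below are invented for illustration, not taken from the Energetics experiments.

```python
# Back-of-envelope: why large excess heat rules out a chemical explanation.
# Chemistry tops out at a few eV per atom; D+D fusion releases ~24 MeV per
# reaction. The excess-heat and loading figures are illustrative assumptions.

EV = 1.602e-19            # joules per electron-volt
AVOGADRO = 6.022e23       # atoms per mole

excess_heat_j = 50e3      # assume 50 kJ of unexplained heat over a run
deuterium_mol = 0.01      # assume 0.01 mol of deuterium loaded in the palladium

atoms = deuterium_mol * AVOGADRO
ev_per_atom = excess_heat_j / atoms / EV

# Roughly 50 eV per atom with these numbers: an order of magnitude above
# any chemical bond energy, but still tiny on the nuclear (MeV) scale.
print(f"Unexplained energy per deuterium atom: {ev_per_atom:.0f} eV")
```

With these assumed numbers, every deuterium atom would have had to release tens of eV, which no chemical process can do; either the calorimetry is wrong or something beyond chemistry is happening, which is exactly the choice the researchers pose.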


Try to catch the show tonight. With Robert Duncan doing an audit of the experimental protocols, I see little room for the deniers here, although I am sure someone will attempt to toss a little skepticism into the stew.

I will also be posting a comment on the Bussard Polywell fusion experiment update shortly.

Energetics Technologies and Other New Cold Fusion Research Will Be on CBS 60 Minutes April 19, 2009



COLD FUSION IS HOT AGAIN - Presented in 1989 as a revolutionary new source of energy, cold fusion was quickly dismissed as junk science. But today, the buzz among scientists is that these experiments produce a real physical effect that could lead to monumental breakthroughs in energy production. Scott Pelley reports. Denise Schrier Cetta is the producer.




At the present time, using the approaches described above, and thanks in large part to these unique relationships, Energetics Technologies is able to produce excess heat in a significant percentage of the experiments. Extraordinary breakthroughs have been accomplished, backed by tested reproducibility through the multiple independent channels of SRI, and ENEA. With proof of principle, it is now time to accelerate the work, leading to the commercialization of this promising technology.

The Promise of SuperWave™ Fusion / Dr. Irving Dardik
Is It Cold Fusion? / Dr. Irving Dardik [the fusion is motion]






CBS asked Robert Duncan, vice chancellor for research at the University of Missouri and an expert in low-temperature physics, to look into the LENR research. Duncan was referred to CBS by Allen Goldman, the head of the condensed matter physics group at the American Physical Society. Duncan spent several weeks (on his own time) investigating LENR in October. CBS paid his travel expenses to meet with researchers at Energetics' laboratory in Omer, Israel, and observe a working LENR excess-heat experiment. Duncan emphasized to New Energy Times his objectivity and independence regarding the research. "‘60 Minutes’ asked the American Physical Society for a reference for someone like myself who’s done very careful measurements in related fields but not specifically in LENR," Duncan said. "I've never been involved in any 'cold fusion' research in the past, nor am I involved in any now."

Duncan also met with researchers at NRL in Washington, D.C., and the SPAWAR researchers when they were in Salt Lake City at the American Chemical Society meeting in March. He was skeptical of the LENR excess heat before his investigation. New Energy Times spoke with Duncan today. "Sam Hornblower of CBS asked me to read some papers and talk to some of the scientists, and it quickly became clear to me that it was a very interesting result. After I saw some of the hardware, I had a chance to ask about the experimental configurations and dig in deeper, and now I am convinced that this excess-heat effect is real." Duncan was particularly impressed with the SPAWAR research because of its clear evidence for nuclear reactions.