Friday, February 6, 2009

Nanodot Advance Threatens Moore's Law

This item from the Edmonton Journal is of great interest. We are suddenly able to trap one or two electrons, or whatever is meant by a particle in this story. This is a long-awaited threshold, because with a small number of active electrons the ability to manipulate them rises sharply.

Also it suggests that we have finally found the limits of Moore’s Law, at least for devices that work with electrons. It has been an amazing fifty years or so. And now for our next trick?

This is a fine result and I hope that we are also picking up fabrication methods. We now need an overview article on where we stand with exactly that. Single-layer graphene sheets with electron holes sound possible today. I would like to see a carbon lattice produced with cutouts that can be stacked with successively smaller cutouts to create wells. I would like to see these wells interacting with photons. I fear we are still a ways off yet.

It is clear that we are now on the way to truly superior processors that will have the capacity to support the holodeck of Star Trek fame.

Find may revolutionize computers

U of A breakthrough in nanotechnology redefines 'small'
By Keith Gerein, The Edmonton Journal February 3, 2009


Robert Wolkow led a team of U of A nanotechnology experts that discovered new, energy-efficient quantum dots.

Scientists at Edmonton's National Institute for Nanotechnology have made a significant breakthrough that could help pave the way for new generations of smaller, more energy-efficient computers.

The team, led by Robert Wolkow, has invented the world's smallest quantum dots, atom-sized devices capable of controlling electrons, using a fraction of the power of current computer technology.

"Roughly speaking, we predict there could be a 1,000-time reduction in power consumption with electronic computers built in this new way," said Wolkow, a physicist at the University of Alberta.

"And they could be something like 1,000 times smaller in size. So it's reaching the very limit as far as anyone could imagine of how small things could get."

The team's work is published in the latest edition of Physical Review Letters, considered the world's premier physics journal.

Current computers use transistors, which are essentially valves for flowing streams of electrons around a circuit. In recent years, engineers have found ways to make these devices smaller, but pushing electrons through narrow spaces raises the danger of the machines overheating and failing.

"So the problem is no longer how do we make it smaller, it's how do we consume less power," Wolkow said.

His team's development is timely, because the new technology largely eliminates the need for electron flow and instead makes use of a wave-like phenomenon to transmit information, he said.

As a crude analogy, Wolkow described a scenario in which two people are standing at opposite ends of a calm pool and one person drops a pebble in the water. A tiny wave eventually moves across the pool, sending information about the pebble to the other person.

"The energy required to do that would be rather minimal instead of hefting a pipe full of water and pushing the water through it," Wolkow said.

"You don't have to flow millions and millions of uncountable electrons through wires or pipes to transmit information."

This is where the quantum dots, which are essentially vessels or "bottles" for electrons, come in.

Previously developed quantum dots range in size from two to 10 nanometres -- a nanometre is one-billionth of a metre -- and contain groupings of 1,000 or more atoms. Although very tiny, those vessels are still large enough to allow the electrons to dash around in ways too varied to be controlled. Only lowering the temperature to ultra-cold extremes provides any measure of control, and this is impractical.

The quantum dot developed by Wolkow's team is much smaller; less than a nanometre in diameter and containing only one or two particles.

The advantage of such a minuscule environment is that the electrons have few choices in how to behave. As such, their motions and interactions can be more easily harnessed to transmit information, even at room temperature.

"By having this tiniest quantum dot, we have created a way that allows the normally complex interactions among electrons to be distilled down to a controllable and useful level."

The discovery is a highly anticipated milestone in nanotechnology circles.

kgerein@thejournal.canwest.com

Solar Power Game Changer

These folks have achieved near-perfect light absorption for silicon-based solar energy and, obviously, for any other optical device using silicon. This will surely lead to many device improvements, probably including eyeglasses.

It should also improve general efficiency of a range of solar energy devices. Hopefully conversion improves also or we will end up with simply warmer glass.

And the ability to absorb easily from a much wider range of angles is clearly a huge breakthrough for the technology, one that will slash costs.

It is marvelous how the capability to extract results from a layer of atoms is now racing ahead everywhere.

Solar Power Game-Changer: “Near Perfect” Absorption of Sunlight, From All Angles

Researchers at Rensselaer Polytechnic Institute have discovered and demonstrated a new method for overcoming two major hurdles facing solar energy. By developing a new antireflective coating that boosts the amount of sunlight captured by solar panels and allows those panels to absorb the entire solar spectrum from nearly any angle, the research team has moved academia and industry closer to realizing high-efficiency, cost-effective solar power.

“To get maximum efficiency when converting solar power into electricity, you want a solar panel that can absorb nearly every single photon of light, regardless of the sun’s position in the sky,” said Shawn-Yu Lin, professor of physics at Rensselaer and a member of the university’s Future Chips Constellation, who led the research project. “Our new antireflective coating makes this possible.”

Results of the year-long project are explained in the paper “Realization of a Near Perfect Antireflection Coating for Silicon Solar Energy,” published this week by the journal Optics Letters.

An untreated silicon solar cell only absorbs 67.4 percent of sunlight shone upon it — meaning that nearly one-third of that sunlight is reflected away and thus unharvestable. From an economic and efficiency perspective, this unharvested light is wasted potential and a major barrier hampering the proliferation and widespread adoption of solar power.

After a silicon surface was treated with Lin’s new nanoengineered reflective coating, however, the material absorbed 96.21 percent of sunlight shone upon it — meaning that only 3.79 percent of the sunlight was reflected and unharvested. This huge gain in absorption was consistent across the entire spectrum of sunlight, from UV to visible light and infrared, and moves solar power a significant step forward toward economic viability.
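As a sanity check, the 67.4 percent figure for untreated silicon is roughly what a back-of-the-envelope Fresnel calculation predicts for a bare air/silicon interface at normal incidence. A minimal sketch follows; the silicon index of 3.5 is an illustrative broadband average (the true index varies with wavelength, and the measured figure integrates over the whole spectrum, so the numbers differ slightly):

```python
# Normal-incidence Fresnel reflectance at a single interface between
# two non-absorbing media: R = ((n1 - n2) / (n1 + n2)) ** 2.
def fresnel_reflectance(n1, n2):
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air = 1.0
n_silicon = 3.5  # illustrative broadband average; the real index is wavelength-dependent

R = fresnel_reflectance(n_air, n_silicon)
print(f"reflected: {R:.1%}, absorbed/transmitted: {1 - R:.1%}")
```

Running this gives roughly 31 percent reflected and 69 percent absorbed or transmitted, in the same ballpark as the article's 32.6 percent reflected (100 minus 67.4).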

Lin’s new coating also successfully tackles the tricky challenge of angles.

Most surfaces and coatings are designed to absorb light — i.e., be antireflective — and transmit light — i.e., allow the light to pass through it — from a specific range of angles. Eyeglass lenses, for example, will absorb and transmit quite a bit of light from a light source directly in front of them, but those same lenses would absorb and transmit considerably less light if the light source were off to the side or on the wearer’s periphery.

The same is true of conventional solar panels, which is why some industrial solar arrays are mechanized to slowly move throughout the day so their panels are perfectly aligned with the sun’s position in the sky. Without this automated movement, the panels would not be optimally positioned and would therefore absorb less sunlight. The tradeoff for this increased efficiency, however, is the energy needed to power the automation system, the cost of upkeeping this system, and the possibility of errors or misalignment.

Lin’s discovery could antiquate these automated solar arrays, as his antireflective coating absorbs sunlight evenly and equally from all angles. This means that a stationary solar panel treated with the coating would absorb 96.21 percent of sunlight no matter the position of the sun in the sky. So along with significantly better absorption of sunlight, Lin’s discovery could also enable a new generation of stationary, more cost-efficient solar arrays.

“At the beginning of the project, we asked ‘would it be possible to create a single antireflective structure that can work from all angles?’ Then we attacked the problem from a fundamental perspective, tested and fine-tuned our theory, and created a working device,” Lin said. Rensselaer physics graduate student Mei-Ling Kuo played a key role in the investigations.

Typical antireflective coatings are engineered to transmit light of one particular wavelength. Lin’s new coating stacks seven of these layers, one on top of the other, in such a way that each layer enhances the antireflective properties of the layer below it. These additional layers also help to “bend” the flow of sunlight to an angle that augments the coating’s antireflective properties. This means that each layer not only transmits sunlight, it also helps to capture any light that may have otherwise been reflected off of the layers below it.

The seven layers, each with a height of 50 nanometers to 100 nanometers, are made up of silicon dioxide and titanium dioxide nanorods positioned at an oblique angle — each layer looks and functions similar to a dense forest where sunlight is “captured” between the trees. The nanorods were attached to a silicon substrate via chemical vapor deposition, and Lin said the new coating can be affixed to nearly any photovoltaic materials for use in solar cells, including III-V multi-junction and cadmium telluride.
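The layered idea can be illustrated with a standard thin-film transfer-matrix calculation. This is only a sketch, not Lin's actual design: the layer indices (graded in steps from near air toward silicon) and the uniform 70 nm thicknesses are made-up illustrative values, but the calculation shows how a graded stack suppresses reflection relative to bare silicon:

```python
import cmath

def stack_reflectance(layers, wavelength, n_in=1.0, n_sub=3.5):
    """Normal-incidence reflectance of a thin-film stack (transfer-matrix method).

    layers: (refractive_index, thickness_m) pairs ordered from the incident
    (air) side toward the substrate; all media assumed non-absorbing.
    """
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0  # start with the identity matrix
    for n, d in layers:
        delta = 2 * cmath.pi * n * d / wavelength          # phase thickness of the layer
        c, s = cmath.cos(delta), cmath.sin(delta)
        a11, a12, a21, a22 = c, 1j * s / n, 1j * n * s, c  # layer characteristic matrix
        m11, m12, m21, m22 = (m11 * a11 + m12 * a21, m11 * a12 + m12 * a22,
                              m21 * a11 + m22 * a21, m21 * a12 + m22 * a22)
    B = m11 + m12 * n_sub
    C = m21 + m22 * n_sub
    r = (n_in * B - C) / (n_in * B + C)  # amplitude reflection coefficient
    return abs(r) ** 2

# Seven layers with the index graded from near-air toward silicon
# (illustrative values, not the published design).
graded = [(n, 70e-9) for n in (1.3, 1.6, 1.9, 2.2, 2.5, 2.8, 3.1)]

bare = stack_reflectance([], 550e-9)    # no coating: reduces to the plain Fresnel result
coated = stack_reflectance(graded, 550e-9)
print(f"bare silicon R = {bare:.1%}, graded stack R = {coated:.1%}")
```

The point of the grading is that light never meets one abrupt index jump; each small step reflects only weakly and the partial reflections tend to cancel, which is also why the effect is broadband and angle-tolerant.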

Along with Lin and Kuo, co-authors of the paper include E. Fred Schubert, Wellfleet Senior Constellation Professor of Future Chips at Rensselaer; Research Assistant Professor Jong Kyu Kim; physics graduate student David Poxson; and electrical engineering graduate student Frank Mont.

Funding for the project was provided by the U.S. Department of Energy’s Office of Basic Energy Sciences, as well as the U.S. Air Force Office of Scientific Research.

Published November 3, 2008. Contact: Michael Mullaney, Phone: (518) 276-6161, E-mail: mullam@rpi.edu

Niche of the Burrunjor






The possible presence of a living theropod in the tropics of Northern Australia was a bit of a shock. I have since had time to reflect, and some interesting inferences suggest themselves.

The continent of Australia included Papua New Guinea until about twelve thousand years ago. This block was and is cut off by deep waters from the remainder of the globe, and has been since the age of the dinosaurs. That means that the only place on Earth that had any chance of retaining a markedly different suite of plants and animals was this region. Everywhere else was eventually overrun by the Eurasian biome.

There were four separate tropical zones suitable for the survival of theropods: obviously Africa and South America, and then the island arc of Indonesia and the continent of Australia, neatly cut into two separate zones. Even more important, the Yucatan, which was the site of the asteroid impact that ended the age of dinosaurs, was on the opposite side of the globe from the jungles of northern Australia. That was simply the best place to survive the initial shock and, as it turned out, the place where a more restrained evolutionary development took place.

The evidence so far is also very instructive. The animal is non migratory and generally slow moving compared to its obvious upland competitors. It did not run down game, but certainly was able to take out a cow and carry it off. That is no big trick. So could you. It probably runs as fast as we do.

That means that it operates inside a hunting range, and this is supported by the pattern of losses and by the behavior of dogs, which immediately recognized the range of a superior predator. The terrain descriptions so far are of tropical woodlands with a low canopy and dense ground cover. This is terrain that is difficult to travel through and see anything in.

That leaves only the important question that should have been understood a century ago. What does it eat? It cannot run down herds of Kangaroos, nor can it prey on large herds of grazing dinosaurs that these hard cases likely hunted to extinction in their refuge a few million years ago. However, it is the perfect hunting machine for eating crocodiles and alligators. With its huge legs and weight, it can leap onto the back of such a reptile and use its jaws to crush the prey to death.

The rivers are full of such game and the hunting area can be small. You can also be sure that nosy little primates will stay far away. Thus we have almost unlimited food and secure hunting areas along rivers. No wonder they are still with us. That begs the question of survival in the Congo and the Amazon. They could have survived there too, and we were certainly avoiding them anyway.

Now that we understand their biological niche, it should be possible to stake out districts and to run one down. Carefully, of course, as no human in their right mind will go personally into such an area on foot.

Thursday, February 5, 2009

The Burrunjor

I bought a copy of Nexus magazine today and, after reading several articles, came across one on the Burrunjor. This was my first introduction to the word ever, and I should have come across it long ago. So I was surprised. I suspect that it is also your first introduction.

I learned a long time ago that eyewitness reports must be gathered when a new phenomenon arises. In this case, in remote country, we have a full range of reports. I will go further here and state simply that the phenomenon not only exists but is certainly what we think we are seeing.

What is astonishing is that we have not had much more press coverage telling this wonderfully sensational story.

A comparable case is Bigfoot, which has received continuous coverage for decades.

I admit that, like most, I dismissed reports on Bigfoot. Then I investigated. I discovered a library of nearly 10,000 unique sightings, unmistakable ones, by individuals from all walks of life, mostly experienced observers in a woodland situation.

What was so shocking to me was that Bigfoot numbers in the thousands and is distributed throughout North America. He avoids easy discovery because his biological niche never clashes with our own. He lives in the forest, avoids clearings, and is primarily nocturnal. Yet he still generated 10,000 sightings.

Here we have a theropod distributed in tropical conditions in northern Australia and New Guinea. It is living in refuges created by the rise of sea levels some ten thousand years ago. It is in the right place and is truly separate from the rest of Asia.

The good news is that he should not be too hard to track down and properly film, with quiet air support. He is not overly aggressive or too quick. He probably hunts by waiting for opportunity and relying on camouflage. Once again this is an expensive proposition to pursue, but will be much easier than Bigfoot.

The Burrunjor

There is an aboriginal legend of a frightening creature called the Burrunjor. The Burrunjor is said to be bipedal and have 2 short, almost useless arms - a good description of a carnivorous theropod.
From Rex Gilroy's book, Out of the Dreamtime:

My readers might think I am going “off the deep end” for what I am about to propose, but there have been, since long before the coming of the White Man, traditions among the Aboriginal tribes of Australia’s ‘Red Centre’ to the Gulf Country and Kimberley region, of a ferocious giant reptilian carnivore that roams the landscape day and night in search of food, both animal and human.

Known as ‘Burrunjor’, the mere mention of his name is often guaranteed to send a shiver down the spine of any Aboriginal. Yet it is not only Aborigines who have claimed to have seen these monsters, but also many Europeans, stockmen, residents of remote cattle stations and 4-wheel drive travellers.

Burrunjor can best be described as a huge, bipedal-walking reptilian monster. Tyrannosaurus comes to mind. Whatever Burrunjor is, ‘he’ leaves huge three-toed tracks behind him wherever he strides. This is significant, because there have been suggestions that Burrunjor could be based upon the ‘extinct’ giant Australian monitor lizard, Megalania prisca, which reached up to 30ft [9.14m], which is almost the height reached by the Burrunjors claimed to have been seen by both Aborigines and Europeans.
“Out of the Dreamtime – The Search for Australasia’s Unknown Animals” contains three chapters on the subject of giant monitor lizards, not just in Australia, but throughout Australasia. Burrunjor is, however, something else, for while even a giant monitor might copy its smaller relatives and adopt a bipedal stance, even to run a short distance, Burrunjor is said to maintain a bipedal stance for considerable distances.
If indeed Burrunjor is a surviving form of dinosaur, ‘he’ belongs to the Theropods, the group of reptilian bipeds that flourished throughout the Cretaceous period, becoming extinct by 65 million or so years ago. Perhaps Burrunjor is a ‘neodinosaur’, that is an evolved latter-day offshoot from this group.

Campfire stories substantiating Aboriginal claims are commonplace across the far north. Back in 1978, a Northern Territory bushman and explorer, Bryan Clark, related a story to me of his own that had taken place some years before. While mustering cattle in the Urapunji area, he became lost in the remote wilderness of that part of Arnhem Land. It took him three days to find his way out of the region and back to the homestead from where he originally set out.

He had not known at the time, but his footprints had been picked up and followed by two Aboriginal trackers and a mounted policeman. On the first night of their search they camped on the outskirts of the Burrunjor scrub, even though the two trackers protested strongly against doing so. The policeman hobbled his horse, cooked their meal, then climbed into his swag and went to sleep.

Later that night he was suddenly woken by the two Aborigines, shouting unintelligibly and grabbing for their packs and saddles. The policeman also realised at this moment that the ground appeared to be shaking. Hurriedly getting to his feet, he too gathered up his belongings, and shortly afterwards, the three galloped away. As he told Bryan Clark later at the Urapunji homestead, he had also heard a sound, somewhat like a loud puffing or grunting noise, certainly loud enough to be coming from some large animal.

When asked if he intended to include this incident in his report, he replied he would not because he feared no one would believe him.

The policeman warned Bryan never again to return to that area, because if he got lost there again he’d be “on his own”, as he would not come looking for him!

The region’s cave art, thousands of years old, depicts these monstrous animals. Many Aborigines believe these monsters wander back and forth across the Gulf country and Cape York to this day.

Back in 1950, cattlemen lost stock to some mysterious beast that left the mutilated, half-eaten remains of cows and bulls in its wake over a wide area, stretching between the border country and Burketown. Searchers on horseback found huge reptilian tracks of some bipedal-walking beast. They followed these three-toed tracks with their cattle dogs through some rough jungle terrain until they entered swampland, beyond which was more dense scrub. However, it was at this point that the cattle dogs became uneasy and ran off. The horses were also uneasy and obviously did not want to cross the swamp. While most of the cattlemen decided their animals knew best, two men set off on foot with their carbines.

The story goes that they soon came across further tracks in an open area beyond the swamp. While his mate searched about, one man briefly spotted the dark form of an enormous creature, perhaps 30ft in height, further off in dense timber. The men left the scene in haste.

Johnny Mathews, a part-Aboriginal tracker, claimed to have seen a 25ft tall bipedal reptilian monster moving through scrub near Lagoon Creek on the Gulf coast one day in 1961. “Hardly anyone outside my own people believes my story, but I know what I saw”, he said to me in 1970.

In 1985 a 4-wheel-drive vehicle and its family of travellers, the Askeys, heading for Roper River Mission, happened to take a back road for some sightseeing. Just before they were to pull up and turn around to resume their journey to the mission, they all saw, moving together across an open plain some distance away, two bipedal-walking reptilian creatures, each a good 20ft tall.

“The monsters were a greyish-brown colour and dinosaur-like in appearance. We didn’t wait around”, said the father, Mr Greg Askey.

Intellectual Lapse on Climate Debate

This spat is quite delightful. The scientific community has not quite woken up to the reality that many new ideas are going to be published first on the internet, to establish priority and to expose them to an initial peer review. They can all be improved on by work and editing, but they are there.

Regardless, there are not two classes of people: those permitted to have an idea and those not.

In this case, the internet has bared a prime example of intellectual dishonesty that was simply intentional. It is no different than thinking you are a better human being because you have a bit of money in your jeans.

A true scholar must abhor the idea of not crediting an idea that he is working on.

This has been going on for several days, and the culprit is getting roasted by his peers.

A major part of scholarship is to chronicle work and contributions by others. This is important because an informant’s context is often very critical in later review.

Scholarship is today devolving to virtual teams of informed participants who access each other through the internet. It is allowing scholarship to be super efficient and it is allowing other informants to be involved. The fact that brainstorming the issue provided a new insight for Mr. Schmidt to build into his paper was a benefit of the internet to him. It is only honorable to give at least a passing nod for the idea. And it strikes right at the heart of the scientific method, which depends on the trustworthiness of the observer(s).
Update: Prominent Scientist Again Challenges Schmidt on Climate Models!

Atmospheric scientist Dr. Hendrik Tennekes slammed RealClimate.org’s Gavin Schmidt on January 29, 2009: See:
Prominent Scientist ‘Appalled’ By Gavin Schmidt’s ‘lack of knowledge’ – ‘Back to graduate school, Gavin!’ – Climate Science Blog

On February 4, 2009, Tennekes posted a follow-up report on Schmidt’s scientific views. Tennekes is a scientific pioneer in the development of numerical weather prediction, former director of research at The Netherlands' Royal National Meteorological Institute, and an internationally recognized expert in atmospheric boundary layer processes. Tennekes is featured in the U.S. Senate Minority Report Update: More Than 650 International Scientists Dissent Over Man-Made Global Warming Claims.

Tennekes wrote on Feb. 4: “I understand that Gavin Schmidt was upset by my essay of January 29. [...] So why should one base climate policy on forecasts made by climate models? Curiously, Gavin’s text is conceptually vague. He should be able to do better. It is up to you, Gavin. I am waiting.”
http://climatesci.org/2009/02/04/dissecting-a-real-climate-text-by-hendrik-tennekes

February 4, 2009
Dissecting a Real Climate Text by Hendrik Tennekes

I understand that Gavin Schmidt was upset by my essay of January 29. I admit that I neglected to mention that I responded to his long exposition of January 6 on Real Climate. The part of his text that deals with the difference between weather models and climate models reads:

“Conceptually they are very similar, but in practice they are used very differently. Weather models use as much data as there is available to start off close to the current weather situation and then use their knowledge of physics to step forward in time. This has good skill for a few days and some skill for a little longer. Because they are run for short periods of time only, they tend to have much higher resolution and more detailed physics than climate models (but note that the Hadley Centre for instance, uses the same model for climate and weather purposes). Weather models develop in ways that improve the short term predictions, though the impact for long term statistics or the climatology needs to be assessed independently. Curiously, the best weather models often have a much worse climatology than the best climate models. There are many current attempts to improve the short-term predictability in climate models in line with the best weather models, though it is unclear what impact that will have on projections.”

What to make of this? I will dissect this paragraph line by line.

“Conceptually they are very similar……”

In practice, they are. However, as I have argued time and again, this apparent similarity is a serious defect. A crude representation of the ocean is all that is needed for a weather model, but in a climate model the ocean should share center stage with deforestation and other land use changes.

“Weather models …use their knowledge of physics to step forward in time.”

What Gavin leaves unsaid here is that most of the physics in a weather model deals with the atmosphere. Also, most of the physics is parameterized and the reliability of the parameterizations continues to be debated. I don’t want to pick nits, else I would query how models can possess knowledge of any kind.

“This has good skill for a few days…….”

Yes, Gavin is aware of Lorenz’ butterfly. He fails to state, however, that the average prediction horizon of weather forecasts is comparable to the lifetime of synoptic weather systems. I would not mind this omission, were it not for the fact that the (unknown) prediction horizon of climate models is determined in part by the life time of circulation systems in the ocean, such as the Pacific Decadal Oscillation. Since weather models and climate models are conceptually similar, one must expect similar predictability problems.

“Because they are run for short periods of time only……”

The logic in this sentence is inverted. The development of weather and climate models is driven by the desire to employ the latest supercomputers available. It is conceptually a small matter to fill these computers with parameterizations operating at higher resolution. My interactions with Tim Palmer of ECMWF (see my weblog of June 24, 2008) focused on his claim for Seamless Prediction Systems. His advocacy boiled down to a quest for a computer facility that could run climate models at the resolution now feasible for weather models. I submit that no conceptual progress can be expected if the modeling community fails to reconsider the architecture of their software.

“Weather models develop in ways that improve…….”

This line ends with the need to independently assess the impact of model improvements on long-term statistics. I agree with the need, but not with Gavin’s off-hand way of letting this problem pass by without explaining how such assessments can or should be performed. Throughout this text Gavin avoids matters of methodology. That, to me, misleads all readers who are not professionals themselves.

“Curiously, the best weather models…….”

At this point, a Dutchman would say “Nu breekt mijn klomp” (now my clog breaks). Gavin Schmidt is a professional climate modeler, but he appears surprised that the climatology of weather models is inferior. Of course it is. Weather models deal with the atmosphere, climate models with the entire climate system.

“There are many current attempts to improve the short-term predictability …….”

Climate modelers are responding to public opinion and have chosen to develop “seamless” or “unified” prediction systems. The present skill of seasonal forecasts is marginal at best; why should the public and their governments have confidence in forecasts many tens of years ahead? Conceptually, this is indeed a crucial question. It cannot be answered by increasing computer power. Gavin admits as much:

“….. it is unclear what impact that will have on projections.”

So why should one base climate policy on forecasts made by climate models?

Curiously, Gavin’s text is conceptually vague. He should be able to do better.

It is up to you, Gavin. I am waiting.

Subject: Update: Gavin Schmidt demands Pielke Jr. Pull Critical Blog!! - Schmidt "Uses terms like 'slander' and 'abuse'"

[Note: My last email alert on the unfolding Gavin Schmidt comedy show is now posted here: http://hallofrecord.blogspot.com/2009/02/remember-antarctic-warming-that.html - Real Climate does not appear to like criticism. Real Climate’s Eric Steig, like Schmidt, has also recently been throwing around phrases such as “fraud” and “libel” after receiving critical analysis of his work. See: First Author of ‘Antarctic Warming Paper’ Claims Libel - http://co2sceptics.com/news.php?id=2670 ]

Update: Gavin Schmidt demands Pielke Jr. Pull Critical Blog!! – Schmidt "Uses terms like 'slander' and 'abuse'" – February 4, 2009 – By Roger Pielke, Jr.

Excerpt: Gavin Schmidt at NASA has just now written an email to the director of CIRES and the Director of the Center for Science and Technology Policy Research (but not to me), where I work at the University of Colorado, demanding that we take down this post and extend to him an apology. If Gavin wants, he is free to respond on this blog. I have not posted his email, though if he wants, I’d be happy to post that up as well. He does use terms like “slander” and “abuse.” I think my comments in the posting are a fair representation of the pickle Gavin has gotten himself into. When will these guys learn that bullying and bluster is not going to win them any respect or friends?

http://sciencepolicy.colorado.edu/prometheus/gavin-schmidts-demands-4931

Pielke Jr. added: “Gavin got caught out. I feel bad for the guy. But writing screeds to my superiors at the University won’t help him move past this episode. He should just say ‘whoops, my bad, learn and move on.’”

Gavin Schmidt’s Demands
February 4th, 2009
Posted by:
Roger Pielke, Jr.

Gavin Schmidt at NASA has just now written an email to the director of CIRES and the Director of the Center for Science and Technology Policy Research (but not to me), where I work at the University of Colorado, demanding that we take down this post and extend to him an apology.

If Gavin wants, he is free to respond on this blog. I have not posted his email, though if he wants, I’d be happy to post that up as well. He does use terms like “slander” and “abuse.” I think my comments in the posting are a fair representation of the pickle Gavin has gotten himself into.

When will these guys learn that bullying and bluster is not going to win them any respect or friends?


Alert: Real Climate Woes: Pielke Jr.: 'Gavin Schmidt admits to stealing a scientific idea from his arch-nemesis, Steve McIntyre' – February 4, 2009

Excerpt: This is not a hypothetical example, but a caricature of real goings on with our friends over at Real Climate . . . Due to an inadvertent release of information, NASA’s Gavin Schmidt (a “real scientist” of the Real Climate blog) admits to stealing a scientific idea from his arch-nemesis, Steve McIntyre (not a “real scientist” of the Climate Audit blog) and then representing it as his own idea, and getting credit for it. (Details here and here.) In his explanation why this is OK, Gavin explains that he did some work on his own after getting the idea from Steve’s blog, and so it was OK to take full credit for the idea. I am sure that there are legions of graduate students and other scientific support staff who do a lot of work on a project, only to find their sponsor or advisor, who initially proposed the idea, as first author on the resulting paper, who might have empathy for Gavin’s logic. […] But let’s be clear: in science, the ethical thing to do is to give full credit to the origination of an idea, even if it comes from your arch-enemy. Gavin’s outing is remarkable because it shows him not only stealing an idea, but stealing from someone who he and his colleagues routinely criticize as being wrong, corrupt, and a fraud. Does anyone wonder why skepticism flourishes? When evaluations of expertise hinge on trust, stealing someone’s ideas and taking credit for them does not help.

http://sciencepolicy.colorado.edu/prometheus/alls-fair-in-love-war-and-science-4929

Gavin's "Mystery Man" Revealed - by Steve McIntyre on February 4th, 2009

Excerpt: On Monday, Feb 2, Gavin Schmidt explained some "ethics" to realclimate readers as follows: [Response: People will generally credit the person who tells them something. BAS were notified by people Sunday night who independently found the Gill/Harry mismatch. SM could have notified them but he didn’t. My ethical position is that it is far better to fix errors that are found than play around thinking about cute names for follow-on blog posts. That might just be me though. - gavin] As readers know, I was interested in who was the scientist that, unbeknownst to me, had "independently" identified the problem with Harry - a problem overlooked by BAS, NASA GISS for a year or so anyway; and a problem which had been missed by his realclimate coauthors, Steig and Mann, during their preparation of Steig et al 2009, and which had been missed by the Nature peer reviewers. And remarkably this had been "independently" identified just after I had noted the problem at Climate Audit and Climate Audit readers had contributed ideas on it, even during the Super Bowl. Yesterday, I inquired about the identity of Gavin's "mystery man". Today (Feb 4) the British Antarctic Survey revealed the identity of Gavin's "mystery man". It was… GAVIN.

http://www.climateaudit.org/?p=5093

Schmidt’s Antics Prompt Laughter From Scientist: “How am I supposed to get any work done when I am laughing so hard?”

Reaction By Climate researcher Dr. Craig Loehle, formerly of the department of Energy Laboratories and currently with the National Council for Air and Stream Improvements, who has published more than 100 peer-reviewed scientific papers.
“How am I supposed to get any work done when I am laughing so hard?”
http://www.climateaudit.org/?p=5093#comment-324316

Report: Error in Antarctic Warming Paper? Warming trend 'arises entirely from the impact of splicing the two data sets together' – Australia’s Herald Sun – February 4, 2009

Excerpt: But Steve McIntyre, who did most to expose Mann’s “hockey stick”, now notices a far more embarrassing problem with Steig’s paper. Previous researchers hadn’t overlooked the data. What they’d done was to ignore data from four West Antarctic automatic weather stations in particular that didn’t meet their quality control. As you can see above, one shows no warming, two show insignificant warming, and a fourth - from a station dubbed “Harry” - shows a sharp jump in temperature that helped Steig and his team discover their warming Antarctic. Uh oh. Harry in fact is a problematic site that was buried in snow for years and then re-sited in 2005. But, worse, the data that Steig used in his modelling, which he claimed came from Harry, was actually old data from another station on the Ross Ice Shelf known as Gill with new data from Harry added to it, producing the abrupt warming. The data is worthless. Or as McIntyre puts it: Considered by itself, Gill has a slightly negative trend from 1987 to 2002. The big trend in “New Harry” arises entirely from the impact of splicing the two data sets together. It’s a mess.

Read this link and this to see McIntyre’s superb forensic work. Why wasn’t this error picked up earlier? Perhaps because the researchers got the results they’d hoped for, and no alarm bell went off that made them check. Now, wait for the papers to report the error with the zeal with which they reported Steig’s “warming”.

http://blogs.news.com.au/heraldsun/andrewbolt/index.php/heraldsun/comments/going_cold_on_antarctic_warming#48360

Prominent Scientist ‘Appalled’ By Gavin Schmidt’s ‘lack of knowledge’ – ‘Back to graduate school, Gavin!’ – Climate Science Blog – January 29, 2009

By Atmospheric scientist Dr. Hendrik Tennekes, a scientific pioneer in the development of numerical weather prediction and former director of research at The Netherlands' Royal National Meteorological Institute, and an internationally recognized expert in atmospheric boundary layer processes. Tennekes is featured in the U.S. Senate Minority Report Update: More Than 650 International Scientists Dissent Over Man-Made Global Warming Claims.

Excerpt: Roger Pielke Sr. has graciously invited me to add my perspective to his discussion with Gavin Schmidt at RealClimate. If this were not such a serious matter, I would have been amused by Gavin’s lack of knowledge of the differences between weather models and climate models. As it stands, I am appalled. Back to graduate school, Gavin! [...] Gavin Schmidt is not the only meteorologist with an inadequate grasp of the role of the oceans in the climate system. In my weblog of June 24, 2008, I addressed the limited perception that at least one other climate modeler appears to have. A few lines from that essay deserve repeating here.” [...] From my perspective it is not a little bit alarming that the current generation of climate models cannot simulate such fundamental phenomena as the Pacific Decadal Oscillation. I will not trust any climate model until and unless it can accurately represent the PDO and other slow features of the world ocean circulation. Even then, I would remain skeptical about the potential predictive skill of such a model many tens of years into the future.

http://climatesci.org/2009/01/29/real-climate-suffers-from-foggy-perception-by-henk-tennekes/

[Note: for more analysis of the warming partisans at Real Climate, see these links from Israeli Astrophysicist Nir Shaviv’s website: “The aim of RealClimate.org is not to engage a sincere scientific debate. Their aim is post a reply full of a straw man so their supporters can claim that your point ‘has been refuted by real scientists at ReaClimate.org.’” Shaviv calls the website “Wishfulclimate.org”.]

Green Comet Lulin's Passage

This is a great story reminding us that the human eyeball is out there working no matter what else we might be doing. This is a great discovery and the green color is something I have never before been made aware of.
It is times like this that I personally miss the old farm that I grew up on. The midnight sky in the month of January was spectacular. Orion filled the south and the Milky Way poured across the heavens in a dash of cream. In such environs this comet will soon be visible.

A surprise for me is the apparent scale of the comet described. It is as big as the atmosphere of Jupiter.

This means that when it becomes time to terraform Venus, one such comet will be perfect. This is a much more tractable problem. Find the right size far enough out and give it a nudge in the right direction. The gravity well of Venus will do the rest.

It occurs to me that each passage through the sun’s gravity well should strip off a lot of the associated gas and even leave a lot behind through interaction with the solar flux, incrementally altering the orbit. I know that this has been postulated before, but this particular comet is ideal for study of initial effects. Too bad we do not have a craft to track its passage.

02.04.2009
Green Comet Approaches Earth

February 4, 2009: In 1996, a 7-year-old boy in China bent over the eyepiece of a small telescope and saw something that would change his life--a comet of flamboyant beauty, bright and puffy with an active tail. At first he thought he himself had discovered it, but no, he learned, two men named "Hale" and "Bopp" had beat him to it. Mastering his disappointment, young Quanzhi Ye resolved to find his own comet one day.

And one day, he did.

Fast forward to a summer afternoon in July 2007. Ye, now 19 years old and a student of meteorology at China's Sun Yat-sen University, bent over his desk to stare at a black-and-white star field. The photo was taken nights before by Taiwanese astronomer Chi Sheng Lin on "sky patrol" at the Lulin Observatory. Ye's finger moved from point to point--and stopped. One of the stars was not a star, it was a comet, and this time Ye saw it first.

Comet Lulin, named after the observatory in Taiwan where the discovery-photo was taken, is now approaching Earth. "It is a green beauty that could become visible to the naked eye any day now," says Ye.

Amateur astronomer Jack Newton sends this photo from his backyard observatory in Arizona:

"My retired eyes still cannot see the brightening comet," says Newton, "but my 14-inch telescope picked it up quite nicely on Feb. 1st."

The comet makes its closest approach to Earth (0.41 AU) on Feb. 24, 2009. Current estimates peg the maximum brightness at 4th or 5th magnitude, which means dark country skies would be required to see it. No one can say for sure, however, because this appears to be Lulin's first visit to the inner solar system and its first exposure to intense sunlight. Surprises are possible.
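To put the 0.41 AU closest-approach figure in everyday units (and to sanity-check the "38 million miles" quoted later in the article), a quick conversion sketch:

```python
# Convert Comet Lulin's closest-approach distance from astronomical units to miles.
AU_IN_MILES = 92_955_807  # one astronomical unit, the mean Earth-Sun distance

def au_to_miles(au: float) -> float:
    """Distance in miles for a distance given in AU."""
    return au * AU_IN_MILES

closest_approach_au = 0.41
miles = au_to_miles(closest_approach_au)
print(f"{closest_approach_au} AU is about {miles / 1e6:.1f} million miles")
```

The result, roughly 38.1 million miles, matches the "38 million miles" figure cited for the late-February approach.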

Lulin's green color comes from the gases that make up its Jupiter-sized atmosphere. Jets spewing from the comet's nucleus contain cyanogen (CN: a poisonous gas found in many comets) and diatomic carbon (C2). Both substances glow green when illuminated by sunlight in the near-vacuum of space.

In 1910, many people panicked when astronomers revealed Earth would pass through the cyanogen-rich tail of Comet Halley. False alarm: The wispy tail of the comet couldn't penetrate Earth's dense atmosphere; even it if had penetrated, there wasn't enough cyanogen to cause real trouble. Comet Lulin will cause even less trouble than Halley did. At closest approach in late February, Lulin will stop 38 million miles short of Earth, utterly harmless.

To see Comet Lulin with your own eyes, set your alarm for 3 am. The comet rises a few hours before the sun and may be found about 1/3rd of the way up the southern sky before dawn. Here are some dates when it is especially easy to find:

Feb. 6th: Comet Lulin glides by Zubenelgenubi, a double star at the fulcrum of Libra's scales. Zubenelgenubi is not only fun to say (zuBEN-el-JA-newbee), but also a handy guide. You can see Zubenelgenubi with your unaided eye (it is about as bright as stars in the Big Dipper); binoculars pointed at the binary star reveal Comet Lulin in beautiful proximity. [sky map]

Feb. 16th: Comet Lulin passes Spica in the constellation Virgo. Spica is a star of first magnitude and a guidepost even city astronomers cannot miss. A finderscope pointed at Spica will capture Comet Lulin in the field of view, centering the optics within a nudge of both objects. [sky map]

Feb. 24th: Closest approach! On this special morning, Lulin will lie just a few degrees from Saturn in the constellation Leo. Saturn is obvious to the unaided eye, and Lulin could be as well. If this doesn't draw you out of bed, nothing will. [sky map]

Ye notes that Comet Lulin is remarkable not only for its rare beauty, but also for its rare manner of discovery. "This is a 'comet of collaboration' between Taiwanese and Chinese astronomers," he says. "The discovery could not have been made without a contribution from both sides of the Strait that separates our countries. Chi Sheng Lin and other members of the Lulin Observatory staff enabled me to get the images I wanted, while I analyzed the data and found the comet."

Somewhere this month, Ye imagines, another youngster will bend over an eyepiece, see Comet Lulin, and feel the same thrill he did gazing at Comet Hale-Bopp in 1996. And who knows where that might lead...?
"I hope that my experience might inspire other young people to pursue the same starry dreams as myself," says Ye.

Wednesday, February 4, 2009

EMP Threatens Pearl Harbor-like Surprise

I have been aware of the EMP device protocol for some time, but it had become hazy. This article on the subject is enough to make you an unwilling expert. The vulnerability is real, and it is a weapon that sidesteps the genocidal consequences that have prevented all other types of nuclear weapons from ever being fired. It also makes exactly those weapons completely obsolete.

Very simply, by detonating a nuclear device that is gamma ray rich at a high altitude, you fry all electrical systems in line of sight. This does not kill anyone outright, which sidesteps the ethical issues surrounding deployment. It puts a real premium on first strike ability and it tips the scales in all sorts of other ways.

This article goes on to show that we can make smaller non nuclear devices with the same capability. It is all real and available in the here and now. In fact, the illustration is a primitive photon torpedo.

This is the one device possibly able to take out an aircraft carrier on the high seas, even when mounted on a small vessel. I am pretty sure that carriers are sufficiently shielded, but it is time to conduct stress tests.

What this all means though is that a national initiative needs to be launched that progressively shields vulnerabilities. It is neither easy nor impossible.

The military has its priorities and they are much more demanding. However, a first priority should be to protect our long-haul fleet and work down into smaller vehicles. Fifty pounds of lead is an option on a transport truck.

The second priority is to shield transformers and other vulnerable components of our power grid. It needs to be done by regulation and over the years to avoid unnecessary outlay. This also allows the pending integration of solar energy into that system to be done at the same time. Again fifty pounds of lead is a good solution. More practical will be the simple placement of these devices under a goodly layer of concrete and dirt.

It is surely impossible to shield consumer electronics but it is certainly easy to shield the infrastructure sufficiently to prevent major losses. The easiest way to do all this is to mandate placing vulnerable hardware in the deep cellars of office towers or even better in underground facilities.

Maybe the folks in Utah have it right with their huge underground data storage facilities.

This also may make it a good idea to have shielded warehouses for electronic inventory. The point of this article is that industry and government need to begin working this risk into their considerations. Start now and in a decade we can have a majority of our infrastructure hardened.

That way we will never discover the effectiveness of this weapon by way of a Pearl Harbor event. And if we ever decide to deploy this weapon ourselves, it will be useless unless the opponent actually has electronics. It unfortunately will not fry an AK-47.

EMP -- a deadly kink in our armor!

by Dan Eden for Viewzone

http://www.viewzone.com/emp.html

As usual, viewzone gave me a writing assignment about something of which I knew very little -- this time it was something called the electromagnetic pulse, or "EMP" weapon. As usual, what I learned from my research scared the hell out of me.

When I first sat down to write this story, America had no real enemy capable of using an EMP weapon. That's a good thing because, as you will learn, this is far more dangerous and potentially more lethal than a nuclear bomb. But now, as I am forced to re-write this article for publication, our old nemesis -- Russia -- has emerged as a potential foe. Not only do the Russians have EMP weapons but the likelihood of their use, especially if the geo-political tensions continue, is almost certain.

First, let me emphasize this: Unlike nuclear bombs that can destroy a large city like New York or Chicago, an EMP weapon is so powerful that just one -- detonated over the central USA -- could knock out the entire country. That's right -- just one!

EMP -- the worst weapon you will ever survive.

When I was a teenager I was a ham radio operator. I listened to my short wave radio for hours just about every day. I remember one summer in 1962 when I thought my receiver was broken; I even bought new tubes for it. On every band, on every frequency, there was a loud and annoying hiss that lasted for a couple of days. Communication was impossible.

No one spoke about it on the news, nothing was written -- it was "top secret" at the time -- but I later learned that it was the result of an atmospheric nuclear explosion. The 1.4 megaton detonation, called "Operation Starfish Prime" [right] occurred 250 miles above Johnston Island in the Pacific. Not only did the explosion shut down short wave radio communications in America, but it also overloaded and blew out over 300 streetlights in Oahu, Hawaii -- some 740 miles away!

Scientists learned that nuclear explosions, in addition to the blast, heat and deadly gamma radiation, also produce a strong electromagnetic pulse that is capable of passing thousands of volts of electrical energy to just about anything that conducts electricity.

The pulse is generated by the interaction of gamma rays, released by a nuclear explosion, with the atmosphere and the Earth. Called the "Compton effect", the resulting pulse is like a radio signal and any metallic object acts like a receiving antenna. Whereas a radio signal might produce a thousandth of a volt or less in a receiving antenna, an EMP pulse produces thousands of volts. Antennae, pipelines, underground gas and water pipes, metal fences, iron railroad tracks, power lines, telephone wires, electric cables -- all suddenly act like antennae and become the recipients of huge currents, starting fires, explosions and frying all forms of circuitry beyond repair.

Because our bodies do not conduct this current, the EMP does not directly harm humans or other living organisms. But, as you will see, that's not necessarily good news.

Back to the stone age

Look around you. Right now you are using a computer. You probably also depend on a cellphone and, most likely, you have a car to get you from home to work or to the market.

Most vulnerable to this high current of EMP are transistors, those small devices that are inside computer chips, radios, cellphones, ATM machines, automotive ignition systems, calculators, clocks and just about everything that requires batteries. All of these would be useless after an EMP and repairing them would not be an option. They would all have to be replaced.

An EMP lasts just 1 nanosecond! It radiates at 50 kV/m at the speed of light. At such a high altitude, it could happen in the blink of an eye and might not be immediately associated with any type of explosion or blast.
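To see why a field of that strength matters, note that the peak voltage induced on a conductor scales roughly with field strength times the length the conductor presents to the pulse. This is only a back-of-envelope upper bound (real coupling depends on orientation, frequency content and geometry), and the conductor lengths below are hypothetical examples:

```python
# Back-of-envelope EMP coupling: V ~ E * L for a conductor of length L
# immersed in a field of strength E. Real-world coupling is geometry-
# dependent; this crude bound just shows the orders of magnitude involved.
E_FIELD_V_PER_M = 50_000  # the 50 kV/m peak field quoted in the article

def induced_voltage(length_m: float, e_field: float = E_FIELD_V_PER_M) -> float:
    """Rough upper bound on voltage induced across a conductor."""
    return e_field * length_m

# Hypothetical conductor lengths: a chip trace, an appliance cord, a wiring run.
for length in (0.1, 1.0, 10.0):
    print(f"{length:>5} m conductor -> up to {induced_voltage(length):,.0f} V")
```

Even the 5,000 V bound for a 10 cm trace dwarfs the few volts transistor junctions tolerate, which is why semiconductor electronics fare so badly.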
Imagine surviving an EMP only to find you cannot call your family to see if they are safe, cannot get cash from your ATM or bank (since they cannot access your records), cannot buy food at a supermarket (no scanners) or even drive your car (ignition destroyed). Everything relying on modern electrical technology will be useless.

When you finally walk to your home, city water will not be pumped, lights will not come on, sewage treatment will be off-line, heat and air conditioning will be gone and the production, harvesting and transport of food will have come to a grinding halt. What's even worse is that the lack of radio and television news will keep most people from understanding what has happened or how to respond. Confusion and eventually anarchy would reign. This is the reality of an EMP weapon.

It is possible to shield electronic equipment from EMP radiation, but it is extremely difficult. The protected components must be completely enclosed in a copper shield. Even a small hole will allow the radiation to enter and any wires extending outside the shield will defeat the protection. For all practical purposes, this is not possible.

Air Force One, the President's jet, and certain other strategic aircraft have been equipped with special shields to reduce the effects of the radiation. Oddly, old radio equipment using vacuum tubes is less likely to be hurt by the radiation than transistor circuitry, and many existing Russian aircraft still maintain these old systems for this reason.

Just one is enough

This threat is not imaginary. The House Armed Services Committee issued its warning to Congress on July 10, 2008, citing the fact that the Russians have already tested small yield EMP weapons of 300 kilotons at altitudes of 60, 150 and 300 kilometers and produced the effects described above at distances up to 600 kilometers from the test site. An actual weapon would be many times stronger.

The scientists describe three possible EMP effects, depending on the height of the detonation. Because the gamma rays travel in a line of sight path, the most dramatic effect comes from a high altitude blast. Here, the gamma rays can travel further and the Earth's magnetic field deflects the pulse back to the surface.
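The line-of-sight geometry is easy to check: a burst at altitude h can reach out to the geometric horizon, a straight-line distance of sqrt((R + h)^2 - R^2) for Earth radius R. A minimal sketch, using the test altitudes cited above and ignoring atmospheric refraction:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def horizon_distance_km(altitude_km: float) -> float:
    """Straight-line distance from a burst at the given altitude to the
    farthest point on Earth's surface within its line of sight."""
    r, h = EARTH_RADIUS_KM, altitude_km
    return math.sqrt((r + h) ** 2 - r ** 2)

# Altitudes from the Russian tests cited above (km).
for h in (60, 150, 300):
    print(f"burst at {h:>3} km -> line of sight out to ~{horizon_distance_km(h):,.0f} km")
```

For every one of those test altitudes the geometric horizon (roughly 880 to 1,980 km) lies well beyond the 600 km effect distance reported, so line of sight was not the limiting factor; field strength falls off before the horizon is reached.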

Such an attack could easily be launched from a submarine, just off the coast, and need not be accurate. The altitude and coordinates of detonation can vary widely and the effects will be just as lethal to the electronics. If several are launched, the chances of one reaching its optimum height are almost guaranteed. Once the EMP has been discharged, the circuitry of subsequent missile electronics will be useless, leaving us vulnerable to more attacks.

The House Armed Services Committee confirmed that these weapons do not need to be accurately targeted to destroy our infrastructure. A medium size nuke, detonated from between 150 and 500 miles anywhere over Kansas, would be enough to destroy a hundred years of technological development. By comparison, the space shuttle and space station orbit about 110 to 150 miles above Earth, well below the optimal detonation of a potential EMP weapon.

An orbiting nuke could possibly already be poised, waiting to be used in a future conflict. If such a weapon is already up there, it is completely beyond our current capability to destroy it. Once it is detonated, our remaining electronic defense systems would be relatively useless. Checkmate.

But wait -- there's more!

Recently, scientists have developed a way to produce an EMP weapon without fissionable material. These non-nuclear devices are smaller than nuclear fueled pulse generators, but they can be just as effective over a limited geographic area.

In the apparatus above, power is supplied by a bank of capacitors, similar to the system used to power a strobe flash. The ignition first charges the large coil, producing a magnetic field. Then an explosive charge begins at one end of the coil, shorting the coil by compressing it against the metallic shield. The effect changes the inductance of the coil, essentially shortening the field as the explosion advances down the tube.

The resulting electromagnetic pulse has a wide spectrum of wavelengths, capable of energizing metallic objects of varied lengths, from hundreds of feet to a few inches. All of the artificial "antennae" then become charged with enormous electrical currents which burn and render their components useless.

This type of non-nuclear apparatus is almost impossible to detect but is limited in scope and power by the size of the charging capacitors.

Some smaller devices could easily be made by terrorists. In fact, a simple search of the internet yields plans and specifications for constructing small EMP devices from readily available components such as the magnetrons from industrial microwave ovens, rated at 800 to 1200 watts, and several automotive batteries. These devices are said to be able, with the addition of a parabolic reflector, to disable the electronics of a modern car or truck within 200 meters.

Commercially sold shock wave generators are capable of producing focused acoustic or electromagnetic energy that can break up objects such as kidney stones and other similar materials. EMP generators can produce pulses of electromagnetic energy that can destroy the sensitive electronics in computers and microprocessors. Destabilized LCR circuits can produce multi megawatt pulses by using an explosive wire disruptive switch to achieve an effect similar to larger EMP devices.

The point here is not to make you go out and build an EMP weapon. It is to make you aware that the technology is very real and that we are currently extremely vulnerable to the effects of this type of weapon. Nations like the US and Russia have already developed these weapons on a scale capable of devastating our modern, technology-dependent civilization.

Should you be paranoid? Well, I know some people who have stopped putting their money in banks and have started storing food and water. Collecting books on farming and gardening is another good idea. Some are even exercising more, getting ready for long walks along deserted highways and roads.

What will you do?

Graphene and Graphane

News out of the labs on graphene is coming fast and furious. We are learning to manipulate it and to fabricate characteristics. By simply adding hydrogen (and we have no clue as to how), a single layer is turned into an insulator. Already, we can think semiconductors.

The newly announced single layer material is called graphane, which is going to cause no end of confusion.

We very suddenly can imagine a layer of graphene that is altered chemically into various devices, perhaps by laser printing. Again, the speed of research here is breathtaking.

I have posted two recent articles that demonstrate what is taking place. I wonder if graphene demonstrates any superconducting characteristics.

New products are being created by simple bombardment of the original graphene with a range of other elements.

The second article describes controlling outcomes by altering the substrate on which the graphene is deposited.

Scientists Invent One-Atom-Thick Crystal

Graphane was obtained from graphene
By Tudor Vieru

http://news.softpedia.com/news/Scientists-Invent-One-Atom-Thick-Crystal-103388.shtml

31 January, 2009

Scientists at the University of Manchester, who first discovered graphene, were also behind the creation of the new material, which differs from its predecessor through the fact that it also incorporates hydrogen atoms, which it attracts towards its ultra-thin structure. Basically, one could say that graphane, the new material, and graphene only have two sides, because their depth is only one atom, which by all accounts means it's invisible. Professor Andre Geim and Dr. Kostya Novoselov, the experts behind the 2004 research that yielded graphene, published the discoveries related to the new material on Friday, January 30th, in the prestigious journal Science.

The experts announced that the new material was obtained by making graphene react with various other substances, and added that the ones containing hydrogen were just one of many possibilities. Already, the material discovered in 2004 opened up new avenues for physics and material sciences, through the fact that it behaved like a never-before-encountered semiconductor. Its potential for transporting electrical current is impressive, which makes graphene one of the revolutionary materials to be employed in the construction of many types of innovative electronics.

On the other hand, the new material, graphane, exhibits insulating properties, which means that it could be used as an isolation around the circuits made from its predecessor. “Our work proves that this is a viable route and hopefully will open the floodgates for other graphene-based chemical derivatives. This should widen the possible applications dramatically,” Dr. Novoselov said about the new material.

“The modern semiconductor industry makes use of the whole periodic table: from insulators to semiconductors to metals. But what if a single material is modified so that it covers the entire spectrum needed for electronic applications? Imagine a graphene wafer with all interconnects made from highly conductive, pristine graphene whereas other parts are modified chemically to become semiconductors and work as transistors,” adds professor Geim, who is also a scientist at the University’s School of Physics and Astronomy.

The team plans to develop other crystalline compounds of graphene by using roughly the same technique used to create graphane, namely bombarding the 2004 compound with atoms of various substances. This might make them become insulators or semiconductors, or any other type of material the researchers could think of.


Researchers Discover Method for Controlling Graphene

New method allows researchers to make graphene with semiconductor or metallic properties at will

Today, the interconnects in a CPU or any other electronic device using a semiconductor are made from copper. Scientists are looking for new interconnect materials that are faster and produce less heat.

One of the new materials with the most potential is called graphene. Researchers at the Rensselaer Polytechnic Institute claim that they have discovered a new method for controlling graphene's nature. Graphene is a one-atom thin sheet of carbon that was discovered in 2004.

Graphene is being used by researchers at Rice University to make a new type of memory that could one day replace flash storage. Before graphene memory and other nanoelectronics using graphene can become a reality, researchers have to find more effective methods of producing graphene with the properties they need.

Rensselaer researcher Saroj Nayak and a postdoctoral research associate have demonstrated a new method that can be used to control the nature of graphene. According to the pair, the nature of graphene can be controlled depending on the substrate on which it is grown, thus shaping its conductive properties.
Results based on large-scale quantum mechanical simulations show that graphene deposited on a surface treated with oxygen results in semiconductor properties while graphene deposited on a surface treated with hydrogen exhibits metallic properties.

This is a key discovery according to the researchers because when a conventional batch of graphene is produced some of it has semiconductor properties and some has metallic properties. The researchers say that using conventional methods it would be impossible to extract one form of graphene. Devices based on graphene would need to have only one graphene form in order to function.

Nayak said in a statement, "Depending on the chemistry of the surface, we can control the nature of the graphene to be metallic or semiconductor. Essentially, we are ‘tuning’ the electrical properties of material to suit our needs."

The reason researchers are pushing so hard to discover better methods for the production of graphene is that the substance could one day replace silicon and copper as the building blocks of electronics. Graphene has excellent conductive properties and at room temperature, electrons can pass through it at close to the speed of light with very little resistance.

Interconnects made from graphene would therefore create much less heat and would be able to run cooler. Cooler interconnects are important because heat can have a negative effect on a CPU's speed and performance. Just consider the huge increases overclockers are able to get out of a CPU when it is cooled more efficiently.

The results of the study by the Rensselaer researchers were published this week in a paper titled "Electronic structure and band-gap modulation of graphene via substrate surface chemistry" in Applied Physics Letters' January issue.

It is interesting to note that the researchers who were trying to make memory using graphene were using ten atom thick sheets of graphene, whereas the researchers at Rensselaer are using single atom thick graphene sheets.

Growing Nile Delta Fishery

Sometimes, Mother Nature has a lesson to teach us. When the dam was built, the fishery collapse and the loss of the annual floods were quite rightly seen as an environmental disaster and must still be seen as such.

That modern fertilization has sustained the fertility of the farmland was a resulting necessity. That improved methods will serve to surpass the best that the Nile floods could do is inevitable.

That this should result in a tripling of the delta fishery has made everyone happy.

There is still plenty to do with the Nile. Sooner or later Lake Nasser will silt up completely, the Aswan High Dam will have to be abandoned, and the annual floods will again dominate the lower Nile.

Alternatively, a barrage could be built upriver that diverts the surplus flood waters into the Qattara Depression, in the desert west of the Nile, thereby creating a second Nile valley and a large new lake and delta system.

That would be a fitting monument for the descendants of the pyramid builders.

Nile Delta Fishery Grows Dramatically
by Staff Writers

Narragansett RI (SPX) Jan 27, 2009

http://www.seeddaily.com/reports/Nile_Delta_Fishery_Grows_Dramatically_999.html

While many of the world's fisheries are in serious decline, the coastal Mediterranean fishery off the Nile Delta has expanded dramatically since the 1980s.

The surprising cause of this expansion, which followed a collapse of the fishery after completion of the Aswan High Dam in 1965, is run-off of fertilizers and sewage discharges in the region, according to a researcher at the University of Rhode Island Graduate School of Oceanography.

Autumn Oczkowski, a URI doctoral student, used stable isotopes of nitrogen to demonstrate that 60 to 100 percent of the current fishery production is supported by nutrients from fertilizer and sewage. Her research will be reported in the Jan. 21 online edition of the Proceedings of the National Academy of Sciences.

"This is really a story about how people unintentionally impact ecosystems," Oczkowski said.

Historically, the Nile would flood the delta every fall, irrigate nearby agricultural land, and flow out to the Mediterranean, carrying with it nutrients to support a large and productive fishery. Construction of the dam stopped the flooding, and the fishery collapsed.

"That's when fertilizer consumption in the country skyrocketed," said Oczkowski. "The Egyptians were fertilizing the land, and then fertilizing the sea with the run-off. It also corresponded with a population boom and the expansion of the public water and sewer systems."

As a result, landings of fish in coastal and offshore waters are more than three times pre-dam levels. While increased fishing effort in recent years may have played some role in the recovery, Oczkowski's findings indicate that anthropogenic nutrient sources have now more than replaced the fertility carried by the historical flooding.

Oczkowski and colleagues from URI, the U.S. Environmental Protection Agency, and the University of Alexandria collected more than 600 fish in 2006 and 2007 from four regions that received run-off from the delta and two areas not affected by the Nile drainage. Stable isotopes of nitrogen in each fish were measured and compared.

She found that the isotope signatures in the fish reflected two distinct sources of nitrogen: anthropogenic nitrogen from fertilizers and sewage in the fish caught in coastal and offshore areas of the delta, and nitrogen values consistent with the middle of the Mediterranean in fish caught in waters that were not affected by the delta drainage.
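The two-source attribution Oczkowski describes can be sketched with a standard two-endmember isotope mixing model. The δ15N endmember values and fish measurement below are hypothetical placeholders for illustration, not figures from her study:

```python
def anthropogenic_fraction(d15n_fish, d15n_anthro, d15n_marine):
    """Two-endmember mixing model: estimate the fraction of a fish's
    nitrogen that came from the anthropogenic source, given the fish
    tissue d15N and the two endmember signatures (all in permil)."""
    if d15n_anthro == d15n_marine:
        raise ValueError("endmember signatures must differ")
    f = (d15n_fish - d15n_marine) / (d15n_anthro - d15n_marine)
    return max(0.0, min(1.0, f))  # clamp to the physical range [0, 1]

# Hypothetical example: fertilizer/sewage endmember at 12 permil,
# open-Mediterranean endmember at 4 permil, fish measured at 10.4:
f = anthropogenic_fraction(10.4, 12.0, 4.0)
print(f"{f:.0%} of nitrogen from anthropogenic sources")  # prints "80%"
```

In practice such models need well-constrained endmembers and corrections for trophic fractionation, which is part of why the study compared fish from drainage-affected and unaffected waters.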

These results have raised questions among many scientists about the value of anthropogenic sources of nutrients to ecosystems.

"We're programmed in the West to think of nutrient enrichment of coastal systems as bad," Oczkowski said. "Here in Rhode Island we've spent hundreds of millions of dollars to upgrade sewage plants to reduce nutrient loading into Narragansett Bay. And it's a major issue in the Chesapeake Bay and in the Gulf of Mexico, where run-off of fertilizers from the country's breadbasket into the Mississippi River has caused a dead zone in the Gulf.

"But the Egyptians don't think it's a bad thing. For them, it's producing tons of fish and feeding millions of hungry people. It's forcing us to reconsider whether we can say that nutrient inputs are always a bad thing."

Tuesday, February 3, 2009

Australia Advancing Biochar

From this distant perspective, Australia is well ahead of everyone in learning how to use and apply biochar. The results being reported are significant and emphatic, and they hold up for every soil tested, although I am sure someone somewhere went out and managed to fail anyway.

A few notes here that I would like to add.

Firstly, Australian soils are famously poor at holding nutrients and have troubled Australian agriculture from the beginning. This is the one country in the world that is highly aware of soil fertility and deficiencies. It is thus no surprise whatsoever that they will lead the pack on this.

Secondly, a few technical misperceptions linger. Additional water retention is a result of increased soil biomass encouraged by the presence of elemental carbon.

Everyone sees a forest as a visible source of carbon. That is wrong. It produces carbon in a very inconvenient form for agricultural use and usually quite slowly, or at least no faster than corn or tobacco.

The Wollongbar test used maize because it is ideal for agricultural biochar production. It produces a huge tonnage per acre, perhaps sufficient to deliver one to three tons of biochar per acre. More critically, after the harvest, it also dries out while standing in the field.

Most other plant waste has to be dried, or it will simply rot in the field. Corn is infamous for resisting rotting in the field, which is one reason it has to be gathered and burned anyway.

Conversations regarding cost concerns are misplaced. A simple sheet metal kiln will easily replicate the conditions of an earthen kiln used by the Indians and quickly produce biochar from any and all dry plant waste.

Of course, if you insist on operating a tightly controlled modern kiln to produce so-called syngas along with your completely reduced biomass, it will be much more costly. The yield is naturally just as expensive, and possibly less satisfactory as a soil additive.

A lower temperature biochar will still be mostly elemental carbon but will also have a lot of remaining complex organic matter that will degrade slowly.

Why biochar?

29/01/2009 10:34:00 PM

Tim Flannery loves it, Malcolm Turnbull wants it on the political agenda, and ancient Amazonian cultures used it to make soil that is still fertile after hundreds of years. Why aren't we knee-deep in biochar?

Biochar is the charcoal created by burning organic waste without oxygen—a charring process that also delivers a biofuel, syngas—to produce a very stable form of carbon that can persist, unchanged, for hundreds or thousands of years.

The technology's fans point out that unlike the end-product of the still-theoretical "carbon capture" technology being proposed for coal power stations, biochar both stabilizes carbon and enhances the biological cycle that humanity depends on.

Studies around the world, including in Australia, have shown that adding the char to agricultural soils can boost water and nutrient retention and crop yields, and lower nitrous oxide emissions from fertilizer by 50-80pc.

The history of biochar helps explain what it is and why it is beneficial.

Hundreds, sometimes thousands of years after they were created by people living in the Amazon basin of South America, the black soils known as 'terra preta' are still fertile, to the point that some, like the Magic Pudding, regenerate after being harvested for potting mix.

Part of terra preta's secret, researchers believe, is the big quantities of slow-burned charcoal that people from ancient cultures dug into these soils.

Science is trying to recreate terra preta soils, so far unsuccessfully. But it seems that some of terra preta's qualities can be recaptured using biochar.

Biochar contains valuable nutrients that help plant growth, but its primary long-term benefit lies in its complex structure that holds big quantities of nutrients, moisture and microbes in a way that is still accessible to plants.

Biochar trials on maize at NSW Department of Agriculture's Wollongbar facility found that when applied at 10 tonnes per hectare, the char tripled the biomass of wheat and doubled that of soybean, while lifting soil pH and calcium levels and reducing aluminum toxicity.

There was more soil biology in the soil containing biochar, better water retention, and less carbon dioxide and nitrous oxide emissions.
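The sequestration side of an application rate like the trial's 10 tonnes per hectare can be put in CO2-equivalent terms with simple stoichiometry (carbon to CO2 scales by the molar mass ratio 44/12). The 80% carbon content assumed below is an illustrative round number, not a figure reported from the Wollongbar trial:

```python
# Back-of-envelope CO2-equivalent of biochar dug into soil.
CO2_PER_C = 44.0 / 12.0  # molar mass ratio of CO2 to elemental carbon

def co2_equivalent(biochar_tonnes, carbon_fraction=0.8):
    """Tonnes of CO2 kept out of the atmosphere per application,
    assuming the char's carbon fraction remains stable in the soil.
    The default 80% carbon content is a hypothetical figure."""
    return biochar_tonnes * carbon_fraction * CO2_PER_C

# 10 tonnes of char per hectare, as in the Wollongbar trial:
print(f"{co2_equivalent(10):.1f} t CO2-e per hectare")  # prints "29.3 t CO2-e per hectare"
```

Real accounting would also net off the emissions from producing and transporting the char, which is exactly the greenhouse balance research Ms Downie says is currently unfunded.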

However, CSIRO soil nutrient researcher Dr Evelyn Krull warns against treating biochar as a magic bullet for either carbon sequestration or agricultural productivity.

"Malcolm Turnbull is jumping the gun here a little bit, by saying that we need to do it now. I say there is potential, but let's get the fundamentals right," she said.

The qualities of biochar differ depending on its parent material, Dr Krull said, and the effects of biochar in different soil types are still to be established.

Nor has it been established that biochar can be produced cost-effectively enough to be attractive to farmers.

On Monday's edition of the ABC’s 7.30 report, climate change campaigner and chairman of the Copenhagen Climate Council, Professor Tim Flannery, questioned why the Federal government was throwing $600 million at developing coal carbon capture technology, yet failed to recognize biochar.

"You can quantify (biochar) to the nearest kilogram, you can put it in the soil and know it will stay there for thousands of years, we know it's safe and it's good for agriculture—why wouldn't we recognize that when we're happy to recognize a technology that isn't in existence yet?" Professor Flannery said.

Ironically, biochar development seems to have suffered in Australia because it has been closely associated with agriculture.

"Because we were lumped in with the agricultural sector, biochar hasn't been included in the Carbon Pollution Reduction Scheme (CPRS)," said Adriana Downie, technical manager with BEST Energies at Somersby, NSW.

"At the moment there's no motivation and no funding for anyone to research the true greenhouse accounting balance of this technology."

Using home-grown technology, BEST have developed a pilot plant that demonstrates the feasibility of biochar’s mass production from organic waste as diverse as poultry litter, nut shells and woody weeds.

Adding biochar to farm soils is ideal because it creates additional production benefits, Ms Downie said, but the product could equally be put in a hole in the ground: the primary benefit of biochar is that it creates highly stable carbon from organic matter that would otherwise decompose and return carbon dioxide to the atmosphere.

"If you can grow a forest, stabilize that forest carbon into biochar, and then regrow that forest, that’s a way to really bank those stocks of carbon," Ms Downie said.

"The fact that we were lumped in with agriculture (outside the CPRS) really isn’t fair. We want to see it go into farm soils, but the policy makers have got caught up in that and are missing the big picture."

BEST will continue to develop its home-grown biochar technology, but outcomes will be “perverted” in favour of syngas, the biofuel produced during biochar production, because that part of the technology can secure carbon credits to fund ongoing research.

Cellulosic Ethanol Again

This is a brave pronouncement by a clear leader in the field of cellulosic ethanol. One is almost able to accept that they are on the way to being able to convert the globe's unlimited supply of waste plant material into ethanol. I was sufficiently encouraged to review the company's website. I have rarely met a more determined tap dance in my life.

You know folks, if you can accept a cellulose feedstock, separate out the cellulose, convert that cellulose to its constituent sugars, and then convert those sugars into ethanol, you say so, and you specify the yield achieved.
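The yield figure asked for above has a well-defined theoretical ceiling from standard stoichiometry: hydrolysis adds water (162 g of glucan yields 180 g of glucose), and fermentation converts glucose to ethanol at a theoretical 0.511 g/g. A minimal sketch of that arithmetic, with the efficiency parameter as a hypothetical knob:

```python
# Theoretical ethanol yield from cellulose:
# hydrolysis:   (C6H10O5)n + n H2O -> n C6H12O6   (180/162 mass gain)
# fermentation: C6H12O6 -> 2 C2H5OH + 2 CO2       (92/180 = 0.511 g/g)
HYDROLYSIS_GAIN = 180.0 / 162.0
FERMENTATION_YIELD = 92.0 / 180.0

def max_ethanol_kg(cellulose_kg, process_efficiency=1.0):
    """Theoretical maximum ethanol mass from a given mass of cellulose;
    process_efficiency (a placeholder) scales for real-world losses."""
    return cellulose_kg * HYDROLYSIS_GAIN * FERMENTATION_YIELD * process_efficiency

# 1 tonne of pure cellulose at the theoretical maximum:
print(f"{max_ethanol_kg(1000):.0f} kg ethanol")  # prints "568 kg ethanol"
```

A company with a working process could quote its achieved fraction of this ceiling in one line, which is precisely what the website does not do.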

Otherwise you are building a perfectly good ethanol plant able to accept only molasses and corn starch.

I have come around to thinking that the problem of cellulose and lignin is not entirely intractable. On the other hand, my appreciation of the difficulties suggests that it is a problem highly likely to keep an army of scientists gainfully employed for a very long time even after they get it working.

No mention is made on the website of a working process for directly converting cellulose into glucose, so it is reasonable to assume that they do not have it all together either. At this point they seem to be relying on microbes to do the job, and that job is likely handled slowly and problematically. This suggests that yields are still a long way from commercial necessity.

Once again, it appears easier to accept money to build a factory than to continue funding the research grind. I hope that I am pleasantly surprised and this facility belts out the ethanol.

Verenium Announces First Commercial Cellulosic Ethanol Project

by Staff Writers
Tallahassee Fl (SPX) Jan 29, 2009

Verenium has announced plans to build its first commercial-scale cellulosic ethanol facility in Highlands County, Florida. The Company has entered into long-term agreements with Lykes Bros. Inc., a multi-generation Florida agri-business, to provide the agricultural biomass for conversion to fuel.

Verenium also announced that the Highlands Ethanol project has been awarded a $7 million grant as part of Florida's "Farm to Fuel" initiative. These announcements were discussed at a press briefing at the Florida Department of Agriculture and Consumer Services in Tallahassee, Florida.

Verenium's planned commercial facility will be the first in the State of Florida to use next-generation cellulosic ethanol technology to convert renewable grasses to fuel, rather than processing food crops. The plant will be constructed on fallow land, and is expected to produce up to 36 million gallons of cellulosic ethanol per year and provide the region with about 140 full-time jobs, once commercial operations begin.

Verenium anticipates breaking ground on this facility in the second half of this year, and expects to start producing fuel in 2011. Additional jobs will be created during the 18-to-24 months of construction on the plant, which is estimated to cost between $250 and $300 million to build.

Verenium recently received a special use permit from Highlands County for this facility, located in South Central Florida, and is in the process of finalizing other necessary permit applications.

"This plant, the first of many we anticipate building in the years ahead, will help fulfill the U.S. government's mandate for advanced, sustainable biofuels to meet America's energy needs," said Carlos A. Riva, Verenium's President and Chief Executive Officer.

"The facility will serve as a blueprint for how we develop future projects. This milestone is just the beginning."

Riva said the strategic partnership with Lykes Bros. provides the basis for a long-term supply of agricultural feedstock, essential to ensuring next-generation biofuels are cost-efficient. The Florida project is the first of several the Company has under development.

The Highlands Ethanol Project

The agreements between Verenium and Lykes Bros. include a facility site option and a long-term farm lease. Under these agreements, Lykes will provide the necessary feedstock from approximately 20,000 farmable acres adjacent to the site.

The project has been awarded a $7 million grant under Florida Agriculture and Consumer Services Commissioner Charles H. Bronson's "Farm to Fuel" initiative, designed to stimulate the development of a renewable energy industry in Florida.

This $25 million program provides matching grants to bio-energy firms for demonstration, commercialization and research and development projects utilizing Florida-grown biomass or crops. Verenium was also awarded an additional incentive package from the State of Florida.

"The message today is that Florida's agricultural industry can produce fuel crops on a major commercial scale without sacrificing food crops," Bronson said.

"This is a major step forward for our 'Farm to Fuel' program and hopefully will serve as a catalyst for additional investment by companies interested in producing renewable energy in Florida."

Howell Ferguson, CEO and Chairman of the Board of Lykes Bros. commented, "We foresee great potential for fuel production from biomass, and we believe that it will offer significant benefits for the U.S. and for agriculture throughout Florida.

It is exciting to join in a project using cellulosic ethanol technology pioneered at the University of Florida Institute of Food and Agricultural Science, and we are very pleased to work with Verenium on this project."

Verenium's conversion process originated from the landmark technology developed by a team led by Dr. Lonnie Ingram at the University of Florida.

The grant agreement, which Verenium and Florida Commissioner of Agriculture and Consumer Services Charles Bronson will discuss at a press briefing at the Florida State Capitol in Tallahassee today, follows Verenium's success at its pilot- and demonstration-scale plants in Jennings, La., where the Company has been developing and testing processes to optimize production and lower the cost of making cellulosic ethanol.

Separately, in August Verenium announced a strategic partnership with BP, an international energy company and leader in alternative energy, to speed the development of its cellulosic ethanol technology. Verenium and BP are currently focusing on a second phase of collaboration surrounding the development and deployment of commercial-scale cellulosic ethanol production facilities.