Friday, July 30, 2010

Potentially Hazardous Asteroid Might Collide With The Earth In 2182





This asteroid will be a great object to work with as we learn how to alter orbits effectively and eliminate such threats.  It is one task that mankind can agree to do, and do well.

Sometime in the next century, everything in the inner solar system will be fully mapped, and perhaps every comet in the outer solar system can be mapped as well.  Few folks truly understand just how huge comets actually are.  They contain a lot of carbon dust, and on the way in they charge up and produce the huge tail that we see.

We are going to want the ready ability to intercept all these objects whenever necessary in order to perturb their orbits.  This is not a big trick if the intercept occurs at orbital apogee, where a small disturbance produces a large perturbation by the time the object approaches Earth's orbit.
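The leverage of an early nudge can be seen with a back-of-envelope calculation (a sketch added here for illustration; the numbers are round figures, not from the article):

```python
SECONDS_PER_YEAR = 3.156e7
EARTH_RADIUS_KM = 6371.0

def deflection_km(delta_v_m_s, lead_time_years):
    """Back-of-envelope miss-distance change: a small along-track
    velocity nudge accumulates roughly linearly over the lead time
    (orbital resonances can amplify it much further)."""
    return delta_v_m_s * lead_time_years * SECONDS_PER_YEAR / 1000.0

# A 1 cm/s nudge applied 30 years before the encounter
shift = deflection_km(0.01, 30)        # roughly 9,500 km of displacement
clears_earth = shift > EARTH_RADIUS_KM # more than an Earth radius
```

A centimeter per second, applied decades out, is enough to turn a direct hit into a clean miss; wait until the final approach and the required push grows enormously.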

This object is perfect to experiment with.



Potentially Hazardous Asteroid Might Collide With The Earth In 2182

by Staff Writers

Madrid, Spain (SPX) Jul 28, 2010


These are asteroids and comets visited by spacecraft. Credit: ESA, NASA, JAXA, RAS, JHUAPL, UMD, OSIRIS



"The total impact probability of asteroid '(101955) 1999 RQ36' can be estimated at 0.00092 - approximately a one-in-a-thousand chance - but what is most surprising is that over half of this chance (0.00054) corresponds to 2182," Maria Eugenia Sansaturio, co-author of the study and researcher at the Universidad de Valladolid (UVA), explains to SINC.

The research also involved scientists from the University of Pisa (Italy), the Jet Propulsion Laboratory (USA) and INAF-IASF-Rome (Italy).

Scientists have estimated and monitored the potential impacts for this asteroid through 2200 by means of two mathematical methods (the Monte Carlo method and line-of-variations sampling).

Thus, the so-called Virtual Impactors (VIs) have been searched for. VIs are subsets of the statistical uncertainty region whose orbits lead to collisions with the Earth on different dates in the 22nd century. Two VIs appear in 2182, together accounting for more than half of the total impact chance.
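The virtual-impactor search can be caricatured in a few lines of Monte Carlo code. This is a toy sketch of the sampling idea only: the one-dimensional Gaussian miss-distance model and all the numbers are illustrative assumptions, not the actual method of the paper, which propagates full orbits:

```python
import random

EARTH_RADIUS_KM = 6371.0

def impact_probability(nominal_miss_km, sigma_km, n_samples=200_000, seed=42):
    """Monte Carlo estimate: sample 'virtual asteroids' from the
    orbital-uncertainty distribution (here a toy 1-D Gaussian on the
    close-approach miss distance) and count how many hit the Earth."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n_samples)
        if abs(rng.gauss(nominal_miss_km, sigma_km)) < EARTH_RADIUS_KM
    )
    return hits / n_samples

# Illustrative numbers: a distant nominal miss with a large uncertainty
# still leaves a small but non-zero chance of impact.
p = impact_probability(nominal_miss_km=250_000, sigma_km=75_000)
```

The virtual asteroids that end up inside the Earth's radius are the "virtual impactors"; their fraction of the total sample is the impact probability.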

Asteroid '(101955) 1999 RQ36' belongs to the Potentially Hazardous Asteroids (PHAs), objects that could hit the Earth because of how closely their orbits approach ours, and that could cause damage if they did. This PHA was discovered in 1999 and is around 560 meters in diameter.

The Yarkovsky effect

In practice, its orbit is well determined thanks to 290 optical observations and 13 radar measurements, but there is a significant "orbital uncertainty" because, besides gravity, its path is influenced by the Yarkovsky effect.

This disturbance slightly modifies the orbits of the Solar System's small bodies because, as they rotate, they re-radiate from one side the solar energy they absorb on the other, producing a tiny but persistent thrust.

The research, published in the journal Icarus, predicts what could happen in the coming years when this effect is taken into account. Up to 2060 the divergence of the impacting orbits is moderate; between 2060 and 2080 it increases by four orders of magnitude, because the asteroid makes a close approach to the Earth in those years; it then increases again slightly until another approach in 2162, then decreases, and 2182 emerges as the most likely year for a collision.

"The consequence of this complex dynamic is not just the likelihood of a comparatively large impact, but also that a realistic deflection procedure (path deviation) could only be carried out before the impact in 2080, and more easily before 2060," Sansaturio points out.

The scientist concludes: "If this object had been discovered after 2080, the deflection would require a technology that is not currently available. This example therefore suggests that impact monitoring, which to date does not cover more than 80 or 100 years, may need to encompass more than one century. The efforts to deflect this type of object could then be conducted with moderate resources, from a technological and financial point of view."

A Decade of Declining Home Prices Ahead





I have already posted on this particular topic and its solutions.  It can be solved, but the present players simply have no clue.  They have never faced a problem of anything like this magnitude and danger.

What has happened is that prices have declined to a natural floor at which the lenders actually lose more cash than they gain if they accept a lower price.  It works like this: say a bank carries ten million dollars of inventory at current pricing and sells one property for $100,000, which is say 10% below its inventory pricing.  The whole inventory must then be marked down by that 10%, and the bank must put up cash to cover the $1,000,000 capital shortfall it created by accepting the sale.
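The arithmetic of that argument can be sketched as follows (a simplified model of the reasoning above; real bank accounting is far messier):

```python
def mark_to_market_shortfall(total_book_value, carried_value, sale_price):
    """Capital shortfall when one discounted sale forces the entire
    inventory to be re-marked at the same discount (the simplified
    argument above; real bank accounting is more complicated)."""
    discount = 1.0 - sale_price / carried_value
    return total_book_value * discount

# One property carried at about $111,111 sells for $100,000: a 10%
# discount that, applied across $10M of inventory, opens a ~$1M hole.
shortfall = mark_to_market_shortfall(10_000_000, 111_111.11, 100_000)
```

A single $11,000 concession on one house thus costs the lender a million dollars of capital, which is why no bank wants to be first to cut prices.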

Fortunately, no one is likely to create actual selling pressure, because everyone is in the same boat and all are forced to wait for bona fide buyers at present pricing, which is clearly fair enough anyway.  Obviously interest rates will not rise until enough buyers qualify to clear the bulk of the inventory.

For just this reason, I do not think we will see actual declines.

This report shows us how damaging this has all been.  The reason we are not having a Great Depression rerun is that we are not forcing this massive inventory onto the market by liquidating the banks on the sheriff’s block.  However, we have had a massive shrinkage of equity, leaving owners to rely on cash flow.  The party is so over.

The fix is a rewrite of the foreclosure laws, as suggested in earlier posts.  It would work even now, because it would empower the necessary twenty million or so folks to pull us out of this morass and swiftly send the housing market into an orderly recovery.


A Decade of Declining Home Prices Ahead

By Mike Whitney



The housing depression will last for a decade or more. This is by design. The Fed has been working with the banks to withhold inventory so prices do not fall too fast or too far. That way the banks can manage their write-downs without slipping into insolvency. But what's good for the banks is bad for the country. Capital impairment at the banks, means no credit expansion in the near-term. It means the economy will continue to contract, unemployment will remain high, and deflation will push down wages and prices.

Everyone will pay for the mortgage-backed securities scam that was engineered by the banks.

Typically, personal consumption expenditures (PCE) and real estate lead the way out of recession. But not this time. Both PCE and RE will stay depressed and act as a drag on employment and growth. Last week, in testimony before Congress, Fed chair Ben Bernanke made it clear that the Central Bank has no intention of providing extra monetary stimulus to make up for rapidly dissipating fiscal stimulus or the winding down of government subsidies for auto, home, and appliance purchases. The economy must muddle through on its own. But without additional pump-priming, disinflation will turn to outright deflation and the economy will sink into negative territory. Bernanke knows this, but he's absolved himself of any further responsibility. It's just a matter of time before the next slump.

Look at housing. The facts are grim. This is from Charles Hugh Smith:

About two-thirds of U.S. households own a house (75 million); 51 million have a mortgage and 24 million own homes free and clear (no mortgage). Most of the other 36 million households are moderate/low income and have limited or no access to credit and limited or no assets.

If we look up all the gory details in the Fed's Flow of Funds, we find that household real estate fell from $23 trillion in 2006 to $16.5 trillion at the end of 2009. That is a decline of $6.5 trillion, more than half the total $11 trillion lost in the credit/housing bust. Home mortgages have fallen a negligible amount, from $10.48 trillion in 2007 to $10.26 trillion at the end of 2009. As of the end of 2009, total equity in household real estate was a paltry $6.24 trillion, of which about $5.25 trillion was held in free-and-clear homes (32% of all household real estate, i.e. 32% of $16.5 trillion).

That leaves about $1 trillion--a mere 1.85% of the nation's total net worth--of equity in the 51 million homes with mortgages. ...$6 trillion in wealth is gone ("What we know--and don't want to know--about housing", Charles Hugh Smith, oftwominds.com)
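The arithmetic in the quoted passage checks out directly (all figures in trillions of dollars, as given above):

```python
# Household real estate figures (trillions of USD), as quoted above.
value_2006 = 23.0
value_2009 = 16.5
mortgage_debt_2009 = 10.26
free_and_clear_share = 0.32  # share of value in homes with no mortgage

decline = value_2006 - value_2009                          # $6.5T lost
total_equity = value_2009 - mortgage_debt_2009             # $6.24T
free_and_clear_equity = free_and_clear_share * value_2009  # about $5.28T
mortgaged_equity = total_equity - free_and_clear_equity    # about $1T
```

Values fell $6.5 trillion while debt barely moved, so nearly the entire loss came straight out of equity, and almost all of what remains sits in homes that carry no mortgage at all.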

The bursting of the housing bubble wiped out the middle class. Now--even in the best case scenario--private sector deleveraging will continue for years to come. Baby boomers are not nearly as wealthy as they believed; they must slash spending and save for the future. US household debt as a share of disposable income, remains historically high  (122%) and will have to return-to-trend (100%) before consumers loosen the purse-strings and resume spending. Repeat: 51 million homeowners have a meager $1 trillion in home equity. We're a nation of paupers.

More than 7 million homeowners are presently in some stage of foreclosure.

Obama's mortgage modification program (HAMP) has been an utter failure. More than half the applicants default within the year. At the same time, mortgage purchase applications have fallen off a cliff. "The weekly applications index is at the lowest level since December 1996, and the four week average is at the lowest level since September 1995 - almost 15 years ago." (Calculated Risk)

This is from the Wall Street Journal:

"How much should we worry about a new leg down in the housing market? If the number of foreclosed homes piling up at banks is any indication, there’s ample reason for concern.

As of March, banks had an inventory of about 1.1 million foreclosed homes, up 20% from a year earlier, according to estimates from LPS Applied Analytics. Another 4.8 million mortgage holders were at least 60 days behind on their payments or in the foreclosure process, meaning their homes were well on their way to the inventory pile. That "shadow inventory" was up 30% from a year earlier.

Based on the rate at which banks have been selling those foreclosed homes over the past few months, all that inventory, real and shadow, would take 103 months to unload. That’s nearly nine years. Of course, banks could pick up the pace of sales, but the added supply of distressed homes would weigh heavily on prices — and thus boost their losses." ("Number of the Week: 103 Months to Clear Housing Inventory", Mark Whitehouse Wall Street Journal)
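The 103-month figure is a simple inventory-to-sales ratio. A quick sketch, where the monthly sales pace is an inference from the quoted numbers rather than a figure given in the article:

```python
def months_of_supply(inventory_homes, monthly_sales_pace):
    """Months to clear = inventory divided by the monthly sales pace."""
    return inventory_homes / monthly_sales_pace

# From the article: 1.1M bank-owned plus 4.8M shadow inventory.
total_inventory = 1_100_000 + 4_800_000
# The sales pace is not given in the excerpt; ~57,000 homes/month is
# roughly the pace implied by the 103-month figure (an inference).
months = months_of_supply(total_inventory, 57_000)  # about 103.5
years = months / 12                                 # nearly 9
```

For comparison, a healthy market carries about five to six months of supply, which is why a 100-plus-month overhang is so extraordinary.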

A 9-year backlog of homes. No wonder the yield on the 10-year Treasury is under 3%. The country is in a Depression.

Housing prices have already fallen 30% from their peak in 2006, but they temporarily stabilized during the period that the Fed was exchanging toxic mortgage-backed securities (MBS) for $1.25 trillion in reserves. The banks collaborated with the Fed (I believe) to hold back supply so the public would be duped into thinking that Bernanke's cash-for-trash (Quantitative Easing) program was actually supporting the market. But it wasn't. Prices stayed flat because the banks were deliberately withholding supply. The Fed's action did nothing. Now that Bernanke has ended the program, inventory is rising.

How far prices drop will depend on the rate at which the banks dump their backlog of homes onto the market. The longer the process is dragged out, the longer the recession will persist.

The housing market has been nationalized. More than 95% of the funding for new mortgages comes from the government--mainly Fannie Mae, Freddie Mac, FHA guarantees or VA loans. There is no market in housing--it's all central planning with the Fed acting as the financial Politburo. It's all designed to stealthily transfer the losses of the Kleptocrats onto the taxpayer. Subprime lending continues behind the mask of FHA-backed mortgages. FHA underwrites mortgages with as little as 3.5% down and credit scores in the high 500-range. It's a joke. The lending system is designed to implode and it will, leaving more red ink for the public to mop up. Nothing has changed.

Anyone who is thinking about buying a house should mull over the facts before making a final decision. The market is so distorted by the buildup of shadow inventory there's no way of knowing whether prices are fair or not. It's a crapshoot. An article in businessinsider.com titled "Banks can't hold back high-end mortgage repos for long" is a "must read" for anyone presently looking to buy. Here's an excerpt:

"Let's begin with Chicago.....As of July 15, RealtyTrac listed 28,829 properties which had been foreclosed and repossessed by lenders. Some have been owned by the bank as long as 2½ years without having been placed on the market. Roughly half have been repossessed by the lender since late January 2010.

This year, banks in the Chicago area have foreclosed on a huge number of expensive homes. RealtyTrac lists 2,650 repossessed homes for more than $300,000 and 169 for more than $1 million.... Out of 28,829 repossessed properties, there were only 1,292 listed by lenders as "for sale." The vast majority of these available homes were inexpensive. A mere 29 homes over $300,000 were for sale. In other words, the banks have withheld from the market 2,621 properties listed at $300,000 or higher." ("Banks can't hold back high-end mortgage repos for long", Keith Jurow, businessinsider.com)

We can see that the banks are deliberately keeping homes off the market to keep prices artificially high so they don't have to write down the losses. Clearly, the Fed knows what's going on.

Here's more from businessinsider.com:

In Miami-Dade County, the same thing--"Out of 10,858 bank-owned homes, a mere 983 were listed for sale....

Orange County, same deal--"As of July 16, RealtyTrac listed 6,270 repossessed properties.... very few foreclosed homes in Orange County are listed for sale - 227. (and even more interesting) "650 of these repossessed homes are priced at more than $1 million. Yet not a single home over $1 million is currently on the market." ("Banks can't hold back high-end mortgage repos for long", Keith Jurow, businessinsider.com)

Now that the Fed's mortgage-backed securities buyback program (QE) is over, the banks are stepping up foreclosures and short sales. Expect more homes to flood the market pushing down prices. But whether the banks release more of their shadow inventory or not, it will still take years before the market returns to a (normal) 5 to 6 month backlog. Take a look at this chart and see the extent to which the banks are deceiving the public.

http://boombustblog.com/media/wpmu/uploads/blogs.dir/1/files/2010/07/image0012.jpg

There are remedies for our housing woes, but they require massive government intervention. Mortgages must be restructured in a way that keeps as many people as possible in their homes. That means bondholders and banks will have to take a sizable haircut, which is the way capitalism is supposed to work when risky investments blow up. The write-downs will force many of the banks into bankruptcy, so the Obama administration will have to resurrect the Resolution Trust Corporation (RTC) to resolve the banks, replace management, and auction off their downgraded assets. It's all been done before. When the toxic assets and non-performing loans have been purged from bank balance sheets, the banks will be able to fulfill their function as providers of credit to consumers, households and small businesses. Credit expansion will lower unemployment, reduce excess capacity and increase GDP. The economy will begin to grow again. Regrettably, Bernanke has chosen the path of deception and deflation, which is why there won't be any real recovery until he is removed.

Roy Spencer on Global Warming Skepticism for Dummies




Dr. Spencer has weighed in with a great summary of the relevant issues in climate science, and it is a must-read for those wanting background.  At least one can then debate with others from the same basic page of knowledge.

My own interpretation of general climate change is that it is adjusted by a millennium-long oceanic current shift that cycles back and forth, causing the types of changes we see.  The recent disconnect in the tree ring data needs to be understood, and it may reflect something other than the data simply being totally wrong, which no one wants to believe, including myself.

The geological record calls for this type of adjusting mechanism.  It is a good bet that the ocean easily trumps atmospheric heat energy.


SATURDAY, JULY 17, 2010


In a new highly recommended post today by Dr. Roy Spencer he states, "Adding more [CO2] “should” cause warming, with the magnitude of that warming being the real question. But I’m still open to the possibility that a major error has been made on this fundamental point. Stranger things have happened in science before."  


Might the "possible major errors" be violations of the 1st and 2nd laws of thermodynamics? Also today, see the new posts by Professor of Applied Mathematics Claes Johnson, author of several books on thermodynamics, illustrating the errors in climate science's imaginary "back radiation" and "radiative forcing." Also today, see John O'Sullivan's new post "End of the road for Greenhouse Gas Theory: Bogus Budget Busted"
 

My Global Warming Skepticism, for Dummies



I receive many e-mails, and a recurring complaint is that many of my posts are too technical to understand. This morning’s installment arrived with the subject line, “Please Talk to Us”, and suggested I provide short, concise, easily understood summaries and explanations “for dummies”.


So, here’s a list of basic climate change questions, and brief answers based upon what I know today. I might update them as I receive suggestions and comments. I will also be adding links to other sources, and some visual aids, as appropriate. 


Deja vu tells me I might have done this once before, but I’m too lazy to go back and see. So, I’ll start over from scratch. (Insert smiley)


It is important to understand at the outset that those of us who are skeptical of mankind’s influence on climate have a wide variety of views on the subject, and we can’t all be right. In fact, in this business, it is really easy to be wrong. It seems like everyone has a theory of what causes climate change. But it only takes one of us to be right for the IPCC’s anthropogenic global warming (AGW) house of cards to collapse.


As I like to say, taking measurements of the climate system is much easier than figuring out what those measurements mean in terms of cause and effect. Generally speaking, it’s not the warming that is in dispute…it’s the cause of the warming.


If you disagree with my views on something, please don’t flame me. Chances are, I’ve already heard your point of view; very seldom am I provided with new evidence I haven’t already taken into account. 


1) Are Global Temperatures Rising Now? There is no way to know, because natural year-to-year variability in global temperature is so large, with warming and cooling occurring all the time. What we can say is that surface and lower atmospheric temperature have risen in the last 30 to 50 years, with most of that warming in the Northern Hemisphere. Also, the magnitude of recent warming is somewhat uncertain, due to problems in making long-term temperature measurements with thermometers without those measurements being corrupted by a variety of non-climate effects. But there is no way to know if temperatures are continuing to rise now…we only see warming (or cooling) in the rearview mirror, when we look back in time.


2) Why Do Some Scientists Say It’s Cooling, while Others Say that Warming is Even Accelerating? Since there is so much year-to-year (and even decade-to-decade) variability in global average temperatures, whether it has warmed or cooled depends upon how far back you look in time. For instance, over the last 100 years, there was an overall warming which was stronger toward the end of the 20th Century. This is why some say “warming is accelerating”. But if we look at a shorter, more recent period of time, say since the record warm year of 1998, one could say that it has cooled in the last 10-12 years. But, as I mentioned above, neither of these can tell us anything about whether warming is happening “now”, or will happen in the future.


3) Haven’t Global Temperatures Risen Before? Yes. In the longer term, say hundreds to thousands of years, there is considerable indirect, proxy evidence (not from thermometers) of both warming and cooling. Since humankind can’t be responsible for these early events, this is evidence that nature can cause warming and cooling. If that is the case, it then opens up the possibility that some (or most) of the warming in the last 50 years has been natural, too. While many geologists like to point to much larger temperature changes that are believed to have occurred over millions of years, I am unconvinced that this tells us anything of use for understanding how humans might influence climate on time scales of 10 to 100 years. 


4) But Didn’t the “Hockey Stick” Show Recent Warming to be Unprecedented? The “hockey Stick” reconstructions of temperature variations over the last 1 to 2 thousand years have been a huge source of controversy. The hockey stick was previously used by the IPCC as a veritable poster child for anthropogenic warming, since it seemed to indicate there have been no substantial temperature changes over the last 1,000 to 2,000 years until humans got involved in the 20th Century. The various versions of the hockey stick were based upon limited amounts of temperature proxy evidence — primarily tree rings — and involved questionable statistical methods. In contrast, I think the bulk of the proxy evidence supports the view that it was at least as warm during the Medieval Warm Period, around 1000 AD. The very fact that recent tree ring data erroneously suggests cooling in the last 50 years, when in fact there has been warming, should be a warning flag about using tree ring data for figuring out how warm it was 1,000 years ago. But without actual thermometer data, we will never know for sure.


5) Isn’t the Melting of Arctic Sea Ice Evidence of Warming? Warming, yes…manmade warming, no. Arctic sea ice naturally melts back every summer, but that meltback was observed to reach a peak in 2007. But we have relatively accurate, satellite-based measurements of Arctic (and Antarctic) sea ice only since 1979. It is entirely possible that late summer Arctic Sea ice cover was just as low in the 1920s or 1930s, a period when Arctic thermometer data suggests it was just as warm. Unfortunately, there is no way to know, because we did not have satellites back then. Interestingly, Antarctic sea ice has been growing nearly as fast as Arctic ice has been melting over the last 30+ years. 


6) What about rising sea levels? I must confess, I don’t pay much attention to the sea level issue. I will say that, to the extent that warming occurs, sea levels can be expected to also rise to some extent. The rise is partly due to thermal expansion of the water, and partly due to melting or shedding of land-locked ice (the Greenland and Antarctic ice sheets, and glaciers). But this says nothing about whether or not humans are the cause of that warming. Since there is evidence that glacier retreat and sea level rise started well before humans can be blamed, causation is — once again — a major source of uncertainty. 


7) Is Increasing CO2 Even Capable of Causing Warming? There are some very intelligent people out there who claim that adding more carbon dioxide to the atmosphere can’t cause warming anyway. They claim things like, “the atmospheric CO2 absorption bands are already saturated”, or something else very technical. [And for those more technically-minded persons, yes, I agree that the effective radiating temperature of the Earth in the infrared is determined by how much sunlight is absorbed by the Earth. But that doesn't mean the lower atmosphere cannot warm from adding more greenhouse gases, because at the same time they also cool the upper atmosphere]. While it is true that most of the CO2-caused warming in the atmosphere was there before humans ever started burning coal and driving SUVs, this is all taken into account by computerized climate models that predict global warming.
Adding more “should” cause warming, with the magnitude of that warming being the real question. But I’m still open to the possibility that a major error has been made on this fundamental point. Stranger things have happened in science before.


8) Is Atmospheric CO2 Increasing? Yes, and most strongly in the last 50 years…which is why “most” climate researchers think the CO2 rise is the cause of the warming. Our site measurements of CO2 increase from around the world are possibly the most accurate long-term, climate-related, measurements in existence.


9) Are Humans Responsible for the CO2 Rise? While there are short-term (year-to-year) fluctuations in the atmospheric CO2 concentration due to natural causes, especially El Nino and La Nina, I currently believe that most of the long-term increase is probably due to our use of fossil fuels. But from what I can tell, the supposed “proof” of humans being the source of increasing CO2 — a change in the atmospheric concentration of the carbon isotope C13 — would also be consistent with a natural, biological source. The current atmospheric CO2 level is about 390 parts per million by volume, up from a pre-industrial level estimated to be around 270 ppm…maybe less. CO2 levels can be much higher in cities, and in buildings with people in them.


10) But Aren’t Natural CO2 Emissions About 20 Times the Human Emissions? Yes, but nature is believed to absorb CO2 at about the same rate it is produced. You can think of the reservoir of atmospheric CO2 as being like a giant container of water, with nature pumping in a steady stream into the bottom of the container (atmosphere) in some places, sucking out about the same amount in other places, and then humans causing a steady drip-drip-drip into the container. Significantly, about 50% of what we produce is sucked out of the atmosphere by nature, mostly through photosynthesis. Nature loves the stuff. CO2 is the elixir of life on Earth. Imagine the howls of protest there would be if we were destroying atmospheric CO2, rather than creating more of it.
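The container analogy reads naturally as a one-box flux model. The flux numbers below are illustrative round figures supplied here, not values from the post:

```python
def co2_after_one_year(atmos_gtc, natural_in, natural_out,
                       human_emissions, uptake_fraction=0.5):
    """One-year update of a toy atmospheric-CO2 box: nature pumps in
    and sucks out nearly equal amounts, humans add a small drip, and
    about half of the human drip is absorbed by nature as well."""
    return (atmos_gtc
            + (natural_in - natural_out)
            + human_emissions * (1.0 - uptake_fraction))

# Illustrative fluxes in GtC/yr: natural flows ~20x the human drip.
atmos = co2_after_one_year(800.0, natural_in=200.0, natural_out=200.0,
                           human_emissions=10.0)
# The atmosphere grows by only half of what humans emitted.
```

The point of the analogy survives the arithmetic: the natural flows are huge but nearly balanced, so it is the small unbalanced drip that raises the level.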


11) Is Rising CO2 the Cause of Recent Warming? While this is theoretically possible, I think it is more likely that the warming is mostly natural. At the very least, we have no way of determining what proportion is natural versus human-caused.


12) Why Do Most Scientists Believe CO2 is Responsible for the Warming? Because (as they have told me) they can’t think of anything else that might have caused it. Significantly, it’s not that there is evidence nature can’t be the cause, but a lack of sufficiently accurate measurements to determine if nature is the cause. This is a hugely important distinction, and one the public and policymakers have been misled on by the IPCC.


13) If Not Humans, What could Have Caused Recent Warming? This is one of my areas of research. I believe that natural changes in the amount of sunlight being absorbed by the Earth — due to natural changes in cloud cover — are responsible for most of the warming. Whether that is the specific mechanism or not, I advance the minority view that the climate system can change all by itself. Climate change does not require an “external” source of forcing, such as a change in the sun.


14) So, What Could Cause Natural Cloud Changes? I think small, long-term changes in atmospheric and oceanic flow patterns can cause ~1% changes in how much sunlight is let in by clouds to warm the Earth. This is all that is required to cause global warming or cooling. Unfortunately, we do not have sufficiently accurate cloud measurements to determine whether this is the primary cause of warming in the last 30 to 50 years.
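To see why a ~1% change in cloud cover is not a small thing, compare the implied forcing with the conventional CO2-doubling figure (round numbers supplied here for illustration, not taken from the post):

```python
# Global-mean absorbed solar radiation is roughly 240 W/m^2.
ABSORBED_SOLAR_WM2 = 240.0

cloud_change_forcing = 0.01 * ABSORBED_SOLAR_WM2  # 1% cloud change: 2.4 W/m^2
co2_doubling_forcing = 3.7                        # conventional CO2 figure
fraction = cloud_change_forcing / co2_doubling_forcing  # about 0.65
```

A mere 1% cloud change thus yields a forcing roughly two-thirds as large as doubling CO2, which is why the cloud-measurement gap matters so much to this argument.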


15) How Significant is the Climategate Release of E-Mails? While Climategate does not, by itself, invalidate the IPCC’s case that global warming has happened, or that humans are the primary cause of that warming, it DOES illustrate something I emphasized in my first book, “Climate Confusion”: climate researchers are human, and prone to bias.


16) Why Would Bias in Climate Research be Important? I thought Scientists Just Follow the Data Where It Leads Them When researchers approach a problem, their pre-conceived notions often guide them. It’s not that the IPCC’s claim that humans cause global warming is somehow untenable or impossible, it’s that political and financial pressures have resulted in the IPCC almost totally ignoring alternative explanations for that warming. 


17) How Important Is “Scientific Consensus” in Climate Research? In the case of global warming, it is nearly worthless. The climate system is so complex that the vast majority of climate scientists — usually experts in a variety of specialized fields — assume there are more knowledgeable scientists, and they are just supporting the opinions of their colleagues. And among that small group of most knowledgeable experts, there is a considerable element of groupthink, herd mentality, peer pressure, political pressure, support of certain energy policies, and desire to Save the Earth — whether it needs to be saved or not.


18) How Important are Computerized Climate Models? I consider climate models as being our best way of exploring cause and effect in the climate system. It is really easy to be wrong in this business, and unless you can demonstrate causation with numbers in equations, you are stuck with scientists trying to persuade one another by waving their hands. Unfortunately, there is no guarantee that climate models will ever produce a useful prediction of the future. Nevertheless, we must use them, and we learn a lot from them. My biggest concern is that models have been used almost exclusively for supporting the claim that humans cause global warming, rather than for exploring alternative hypotheses — e.g. natural climate variations — as possible causes of that warming.


19) What Do I Predict for Global Temperature Changes in the Future? I tend to shy away from long-term predictions, because there are still so many uncertainties. When pressed, though, I tend to say that I think cooling in our future is just as real a possibility as warming. Of course, a third possibility is relatively steady temperatures, without significant long-term warming or cooling. Keep in mind that, while you will find out tomorrow whether your favorite weather forecaster is right or wrong, no one will remember 50 years from now a scientist today wrongly predicting we will all die from heat stroke by 2060.


Concluding Remarks


Climate researchers do not know nearly as much about the causes of climate change as they profess. We have a pretty good understanding of how the climate system works on average…but the reasons for small, long-term changes in climate system are still extremely uncertain. 


The total amount of CO2 humans have added to the atmosphere in the last 100 years has upset the radiative energy budget of the Earth by only 1%. How the climate system responds to that small “poke” is very uncertain. The IPCC says there will be strong warming, with cloud changes making the warming worse. I claim there will be weak warming, with cloud changes acting to reduce the influence of that 1% change. The difference between these two outcomes is whether cloud feedbacks are positive (the IPCC view), or negative (the view I and a minority of others have).
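The two outcomes can be written with the standard zero-dimensional feedback relation, warming = lambda0 * forcing / (1 - f), where f > 0 amplifies the response and f < 0 damps it. The numerical values below are illustrative choices, not figures from the post:

```python
def equilibrium_warming(forcing_wm2, feedback_factor, lambda0=0.3):
    """Zero-dimensional climate response: the no-feedback sensitivity
    lambda0 (K per W/m^2) is amplified when f > 0 (the IPCC view of
    clouds) and damped when f < 0 (the negative-feedback view)."""
    return lambda0 * forcing_wm2 / (1.0 - feedback_factor)

forcing = 3.7  # W/m^2, the conventional figure for doubled CO2
strong = equilibrium_warming(forcing, feedback_factor=0.6)   # ~2.8 K
weak = equilibrium_warming(forcing, feedback_factor=-0.5)    # ~0.7 K
```

The same forcing produces either strong or weak warming depending only on the sign of f, which is exactly the disagreement described above.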


So far, neither side has been able to prove their case. That uncertainty even exists on this core issue is not appreciated by many scientists!


Again I will emphasize, some very smart people who consider themselves skeptics will disagree with some of my views stated above, particularly when it involves explanations for what has caused warming, and what has caused atmospheric CO2 to increase. 


Unlike the global marching army of climate researchers the IPCC has enlisted, we do not walk in lockstep. We are willing to admit, “we don’t really know”, rather than mislead people with phrases like, “the warming we see is consistent with an increase in CO2”, and then have the public think that means, “we have determined, through our extensive research into all the possibilities, that the warming cannot be due to anything but CO2”.


Skeptics advancing alternative explanations (hypotheses) for climate variability represent the way the research community used to operate, before politics, policy outcomes, and billions of dollars got involved.



Weaponized Chillies





One wonders what took them so long.  This stuff is clearly dangerous, and if blended with a visible smoke bomb, potential victims should reasonably be able to move off.  It is needed for dealing with both mobs and by police.  Everybody is confronted with a no-go zone.

I also suspect it is oil based, so it will linger a long time, making it impossible to wait out.

Gas masks will limit breathing problems but will do little to prevent skin irritation.  One will be unwilling to be immersed in such gas.

Beyond all that, this is simply pepper spray made a little more scary and dangerous if one is not careful.

Indian military to weaponize hot chilies
Monday, July 26, 2010 by: David Gutierrez, staff writer



(NaturalNews) The Indian military has announced a plan to convert the world's hottest chili pepper into a weapon.


"The chili grenade has been found fit for use after trials in Indian defense laboratories, a fact confirmed by scientists at the Defense Research and Development Organization (DRDO)," said defense spokesperson Col. R. Kalia.


The government is also developing a version of the weapon for crowd-control by police and for self-defense use by women.


The weapons will be made from the bhut jolokia, also known as the "ghost chili" and acknowledged by Guinness World Records as the most potent chili in the world. The potency of the bhut jolokia measures more than one million Scoville units, compared with the 2,500 to 8,000 Scoville units of the typical jalapeno pepper, or the 2,500 to 5,000 Scoville units of Classic Tabasco sauce.
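As a rough check on those figures, here is a minimal Python sketch using only the numbers quoted above; the one-million Scoville figure is treated as a lower bound:

```python
# Pungency figures quoted in the article, in Scoville heat units (SHU).
# The >1,000,000 value for bhut jolokia is treated as a lower bound.
bhut_jolokia = 1_000_000
jalapeno_max = 8_000        # top of the quoted 2,500-8,000 range
tabasco_max = 5_000         # top of the quoted 2,500-5,000 range

# Even against the hottest quoted jalapeno, the ghost chili is two
# orders of magnitude more pungent.
ratio_vs_jalapeno = bhut_jolokia / jalapeno_max   # 125.0
ratio_vs_tabasco = bhut_jolokia / tabasco_max     # 200.0
print(f"At least {ratio_vs_jalapeno:.0f}x the hottest quoted jalapeno")
print(f"At least {ratio_vs_tabasco:.0f}x classic Tabasco sauce")
```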


The peppers are grown and predominantly consumed in northeast India, and are also used as a cure for stomach problems. They are small, wrinkled and deep maroon in color, resembling prunes. Their flesh is so potent that it can burn the skin, forcing farmers, grocers and chefs to handle them with gloves on.


"This is definitely going to be an effective nontoxic weapon because its pungent smell can choke terrorists and force them out of their hide-outs," R. B. Srivastava, the director of the DRDO's Life Sciences Department in New Delhi.


But chili broker Ashit Mehta questioned the government's characterization of the pepper as non-toxic, calling for rigorous safety tests on the aftereffects of exposure to weaponized bhut jolokia. In contrast to tear gas, which produces "only ... this eye burning sensation," the ghost chili can burn the skin itself, he said. Although people who are used to consuming and handling the pepper can tolerate it in small amounts, its effect on most people is far more severe.


"People should not get burned," he said. "Normal people will have a lot of problems."

Thursday, July 29, 2010

Predictions of Human Extinction







Every once in a while, another disciple of Malthus springs up and proclaims the end of humanity on Earth because of overpopulation and resource starvation.  Most are characterized by a narrow viewpoint, seen through the lens of their own specialty, which is often blinding.

A bushman survives by tapping Mother Nature with no significant prior intervention. This means he must harvest occasional wild plants, and the animals that consume other wild plants, over hundreds of acres.

An operator of a rice paddy can do the same thing using one hectare of land and a chicken coop and if lucky a cow.  In Northern Europe, a family survived on several acres and a cow or two and access to grassland.

We have become much better than all that.  Where we still have work to do is in the elimination of the oil economy and its replacement with grid power, and possibly biodiesel, to supply energy for equipment.  Soils also need to be augmented with biochar in order to eliminate the need for significant chemical fertilization.

After that, it is a matter of mastering the Eden machine to fully populate the semi-tropical desert lands.  I have also posted on a protocol to convert the boreal forest into productive protein production, able to support additional billions of population as well.

In short, I can add ten billion to our present population, while reducing their individual consumptive footprint to a fraction of today’s if not almost eliminating it.

During the next one hundred years, we will also master the magnetic exclusion vessel (MEV) and set about building space habitats able to house a hundred million at a time.  We actually could build thousands of those since once built they consume nothing except fuel for fusion energy.

The reality is that mankind is on the road to becoming self sustaining on Earth, in preparation for space borne habitats.  Thinking otherwise is outright nonsense.


Human race ‘will be extinct within 100 years’, claims leading scientist
JULY 25, 2010
by Stacey


As the scientist who helped eradicate smallpox, he certainly knows a thing or two about extinction.

And now Professor Frank Fenner, emeritus professor of microbiology at the Australian National University, has predicted that the human race will be extinct within the next 100 years.

He has claimed that the human race will be unable to survive a population explosion and ‘unbridled consumption.’

Fenner told The Australian newspaper that ‘homo sapiens will become extinct, perhaps within 100 years.’

‘A lot of other animals will, too,’ he added.

‘It’s an irreversible situation. I think it’s too late. I try not to express that because people are trying to do something, but they keep putting it off.’

Since humans entered an unofficial scientific period known as the Anthropocene – the time since industrialisation – we have had an effect on the planet that rivals any ice age or comet impact, he said.

Fenner, 95, has won awards for his work in helping eradicate the variola virus that causes smallpox and has written or co-written 22 books.

He announced the eradication of the disease to the World Health Assembly in 1980 and it is still regarded as one of the World Health Organisation’s greatest achievements.

He was also heavily involved in helping to control Australia’s myxomatosis problem in rabbits.

Last year official UN figures estimated that the world’s population is currently 6.8 billion. It is predicted to exceed seven billion by the end of 2011.

Fenner blames the onset of climate change for the human race’s imminent demise.

He said: ‘We’ll undergo the same fate as the people on Easter Island.

‘Climate change is just at the very beginning. But we’re seeing remarkable changes in the weather already.’

‘The Aborigines showed that without science and the production of carbon dioxide and global warming, they could survive for 40,000 or 50,000 years.

‘But the world can’t. The human species is likely to go the same way as many of the species that we’ve seen disappear.’

Retired professor Stephen Boyden, a colleague of Professor Fenner, said that while there was deep pessimism among some ecologists, others had a more optimistic view.

‘Frank may well be right, but some of us still harbour the hope that there will come about an awareness of the situation and, as a result, the revolutionary changes necessary to achieve ecological sustainability.’

Simon Ross, the vice-chairman of the Optimum Population Trust, said: ‘Mankind is facing real challenges including climate change, loss of bio-diversity and unprecedented growth in population.’

Professor Fenner’s chilling prediction echoes recent comments by Prince Charles who last week warned of ‘monumental problems’ if the world’s population continues to grow at such a rapid pace.

And it comes after Professor Nicholas Boyle of Cambridge University said that a ‘Doomsday’ moment will take place in 2014 – and will determine whether the 21st century is full of violence and poverty or will be peaceful and prosperous.

In the last 500 years there has been a cataclysmic ‘Great Event’ of international significance at the start of each century, he claimed.

In 2006 another esteemed academic, Professor James Lovelock, warned that the world’s population may sink as low as 500 million over the next century due to global warming.

He claimed that any attempts to tackle climate change will not be able to solve the problem, merely buy us time.

Superconducting at Room Temperature Indicated








This paper is worth reading slowly.  The results indirectly show the possibility of superconductance in the sandwich of metals.  As told, the setup is crude enough to suggest an underlying principle that can be winkled out and applied effectively.

Again I see the possibility of a single-layer, possibly amorphous, metal controlling the activity.  All this calls for a huge amount of empirical effort and careful record keeping in order to discover what is happening.  At least we are encouraged, and may be able to alter variables easily.

Let us hope this actually is room-temperature superconductance.  It is a necessary first step in producing a magnetic exclusion vessel.



JULY 26, 2010

From the University of Bengal, India



We report the observation of an exceptionally large room-temperature electrical conductivity in silver and aluminum layers deposited on a lead zirconate titanate (PZT) substrate. The surface resistance of the silver-coated samples also shows a sharp change near 313 K. The results are strongly suggestive of a superconductive interfacial layer, and have been interpreted in the framework of Bose-Einstein condensation of bipolarons as the suggested mechanism for high-temperature superconductivity in cuprates.

The samples used for current-voltage measurements were (i) thin strips 2 cm × 2 mm cut off from commercial PZT discs (supplied by Central Electronics Ltd., New Delhi, India) 0.3 mm thick and with an average grain size of 1 μm, which were supplied in the poled state with a 0.1 mm silver coating on both faces, and (ii) the same type of strips with the original silver coating removed and 4000 Å of aluminum deposited by vacuum evaporation. The Curie temperature of the material was 360 °C as specified in the manufacturer’s data sheet.

Measurements were carried out at room temperature using a four-probe arrangement with the sample placed inside a double permalloy magnetic shield, the residual magnetic field inside the enclosure being less than 10^−5 tesla. The output voltage, which was of the order of microvolts, was measured using a home-built instrumentation amplifier based on an Analog Devices AD620 chip. Data were recorded in an Agilent 54622A digital storage oscilloscope by using a sawtooth current excitation at a frequency of 20 Hz from a function generator. It was found that scanning near this rate yielded the most consistent and reproducible data, least affected by fluctuations and noise. 
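To picture what a “sharp change near 313 K” in surface resistance would look like in such a temperature sweep, here is a minimal Python sketch on synthetic data — the resistance values, noise level, and step-detection method are illustrative assumptions, not the authors’ analysis:

```python
import numpy as np

# Synthetic temperature sweep around the reported transition (values are
# illustrative only; the paper reports a sharp change near 313 K).
temps = np.linspace(290.0, 330.0, 401)            # kelvin, 0.1 K steps
rng = np.random.default_rng(0)

# Toy model: very low resistance below the transition, a normal-state
# value above it, plus small measurement noise (arbitrary units).
resistance = np.where(temps < 313.0, 0.02, 1.0) + rng.normal(0.0, 0.005, temps.size)

# A sharp transition shows up as the largest jump between adjacent points.
jumps = np.diff(resistance)
t_transition = temps[np.argmax(jumps)]
print(f"Sharpest resistance change near {t_transition:.1f} K")
```

On real sweep data one would fit the location and width of the step rather than take a single largest difference, but the idea is the same: a near-discontinuity in resistance versus temperature marks the transition.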

The experimental results reported here strongly suggest the presence of a superconducting layer near room temperature in the interface between a metal film and a PZT substrate. The data have been interpreted in the framework of the above model in terms of the experimentally-observed inhomogeneous charge patterns in high-temperature superconductors.

The fact that the results described above have been obtained from very simply-fabricated systems, without the use of any sophisticated set-up and any special attention being given to crystal purity, atomic perfection, lattice matching, etc. suggests that the physical process is a universal one, involving only an interface between a metal and an insulator with a large low-frequency dielectric constant. We note in passing that PZT and the cuprates have similar (perovskite or perovskite-based) crystal structures. This resemblance may provide an added insight into the basic mechanism of high-temperature superconductivity.