All posts by admin

Overestimating Climate Feedback

I can never make this point too often:  When considering the scientific basis for climate action, the issue is not the warming caused directly by CO2.  Most scientists, even the catastrophists, agree that this direct effect is on the order of 1C per doubling of CO2 from the pre-industrial 280ppm to 560ppm (a level to be reached sometime late this century).  The catastrophe comes entirely from assumptions of positive feedback, which multiplies what would be nuisance-level warming to catastrophic levels.

My simple analysis shows positive feedbacks appear to be really small or non-existent, at least over the last 120 years.  Other studies show higher feedbacks, but Roy Spencer has published a new study showing that these studies are over-estimating feedback.

And the fundamental issue can be demonstrated with this simple example: When we analyze interannual variations in, say, surface temperature and clouds, and we diagnose what we believe to be a positive feedback (say, low cloud coverage decreasing with increasing surface temperature), we are implicitly assuming that the surface temperature change caused the cloud change — and not the other way around.

This issue is critical because, to the extent that non-feedback sources of cloud variability cause surface temperature change, it will always look like a positive feedback using the conventional diagnostic approach. It is even possible to diagnose a positive feedback when, in fact, a negative feedback really exists.

I hope you can see from this that the separation of cause from effect in the climate system is absolutely critical. The widespread use of seasonally-averaged or yearly-averaged quantities for climate model validation is NOT sufficient to validate model feedbacks! This is because the time averaging actually destroys most, if not all, evidence (e.g. time lags) of what caused the observed relationship in the first place. Since both feedbacks and non-feedback forcings will typically be intermingled in real climate data, it is not a trivial effort to determine the relative sizes of each.

While we used the example of random daily low cloud variations over the ocean in our simple model (which were then combined with specified negative or positive cloud feedbacks), the same issue can be raised about any kind of feedback.

Notice that the potential positive bias in model feedbacks can, in some sense, be attributed to a lack of model “complexity” compared to the real climate system. By “complexity” here I mean cloud variability which is not simply the result of a cloud feedback on surface temperature. This lack of complexity in the model then requires the model to have positive feedback built into it (explicitly or implicitly) in order for the model to agree with what looks like positive feedback in the observations.

Also note that the non-feedback cloud variability can even be caused by…(gasp)…the cloud feedback itself!

Let’s say there is a weak negative cloud feedback in nature. But superimposed upon this feedback is noise. For instance, warm SST pulses cause corresponding increases in low cloud coverage, but superimposed upon those cloud pulses are random cloud noise. That cloud noise will then cause some amount of SST variability that then looks like positive cloud feedback, even though the real cloud feedback is negative.

I don’t think I can over-emphasize the potential importance of this issue. It has been largely ignored, although Bill Rossow has been preaching on this same issue for years, phrasing it in terms of the potential nonlinearity of, and interactions between, feedbacks. Similarly, Stephens’ 2005 J. Climate review paper on cloud feedbacks spent quite a bit of time emphasizing the problems with conventional cloud feedback diagnosis.
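To make the mechanism concrete, here is a minimal toy sketch in Python (my own construction with made-up parameters, not Spencer's actual model).  A simple energy-balance system has a genuinely stabilizing (negative) feedback but is also driven by random, non-feedback cloud forcing; regressing time-averaged radiation against time-averaged temperature then diagnoses a much weaker feedback than the one actually built in:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 6000
heat_capacity = 30.0     # toy ocean mixed-layer heat capacity (arbitrary units)
lam_true = 3.0           # true feedback parameter, W/m2/K; positive = stabilizing (net negative feedback)
cloud_noise = rng.normal(0.0, 2.0, n_days)   # random, non-feedback daily cloud forcing, W/m2

T = np.zeros(n_days)
for i in range(n_days - 1):
    # energy balance: cloud noise forces the surface, the radiative feedback damps it
    T[i + 1] = T[i] + (cloud_noise[i] - lam_true * T[i]) / heat_capacity

# what a satellite would see as the outgoing radiation anomaly
R_out = lam_true * T - cloud_noise

# conventional diagnosis: regress monthly-mean radiation against monthly-mean temperature
T_month = T.reshape(-1, 30).mean(axis=1)
R_month = R_out.reshape(-1, 30).mean(axis=1)
lam_diagnosed = np.polyfit(T_month, R_month, 1)[0]

print(f"true feedback parameter:      {lam_true:.2f} W/m2/K")
print(f"diagnosed from monthly means: {lam_diagnosed:.2f} W/m2/K (smaller, i.e. looks like more positive feedback)")
```

The averaging mixes cause and effect exactly as described above: the cloud noise both drives the temperature and shows up in the radiation budget, so the regression understates the true restoring feedback.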

This is Science??

This, incredibly, comes from the editor of Science magazine:

With respect to climate change, we have abruptly passed the tipping point in what until recently has been a tense political controversy. Why? Industry leaders, nongovernmental organizations, Al Gore, and public attention have all played a role. At the core, however, it’s about the relentless progress of science. As data accumulate, denialists retreat to the safety of the Wall Street Journal op-ed page or seek social relaxation with old pals from the tobacco lobby from whom they first learned to "teach the controversy." Meanwhile, political judgments are in, and the game is over. Indeed, on this page last week, a member of Parliament described how the European Union and his British colleagues are moving toward setting hard targets for greenhouse gas reductions.

Guess we can certainly expect him to be thoughtful and balanced in his evaluation of submissions for the magazine.  "Seek social relaxation with old pals from the tobacco lobby"??  My god, that is over the top.

Possibly the Most Important Climate Study of 2007

I have referred to it before, but since I have been posting today on surface temperature measurement, I thought I would share a bit more on "Quantifying the influence of anthropogenic surface processes and inhomogeneities on gridded global climate data" by Patrick Michaels and Ross McKitrick that was published two weeks ago in Journal of Geophysical Research – Atmospheres (via the Reference Frame).

Michaels and McKitrick found what nearly every sane observer of surface temperature measurement has known for years:  that surface temperature readings are biased by urban growth.  The temperature measurement station I documented in Tucson has been taking readings for 100 years or so.  A century ago, it was out alone in the desert in a one-horse town.  Today, it sits in the middle of an asphalt parking lot in the dead center of a city of over 500,000 people.

Here is what they did and found:

They start with the following thesis. If the temperature data really measure the climate and its warming and if we assume that the warming has a global character, these data as a function of the station should be uncorrelated to various socioeconomic variables such as the GDP, its growth, literacy, population growth, and the trend of coal consumption. For example, the IPCC claims that less than 10% of the warming trend over land was due to urbanization.

However, Michaels and McKitrick do something with the null hypothesis that there is no correlation – something that should normally be done with all hypotheses: to test it. The probability that this hypothesis is correct turns out to be smaller than 10⁻¹³. Virtually every socioeconomic influence seems to be correlated with the temperature trend. Once these effects are subtracted, they argue that the surface warming over land in the last 25 years or so was about 50% of the value that can be uncritically extracted from the weather stations.

Moreover, as a consistency check, after they subtract the effects now attributed to socioeconomic factors, the data from the weather stations become much more compatible with the satellite data! The first author thinks that it is the most interesting aspect of their present paper and I understand where he is coming from.

What they are referring to in this last paragraph is the fact that satellites have been showing a temperature anomaly in the troposphere about half the size of the surface temperature readings, despite the fact that the theory of global warming says pretty clearly that the troposphere should warm from CO2 more than the surface.

I will repeat what I said before:  The ONLY reason I can think of that climate scientists still eschew satellite measurement in favor of surface temperature measurement is because the surface readings are higher.  Relying on the likely more accurate satellite data would only increase the already substantial divergence problem they have between their models and reality.

Temperature Measurement Fact of the Day

Climate scientists know this of course, but there is something I learned about surface temperature measurement that really surprised me when I first got into this climate thing.  Since this is a blog mainly aimed at educating the layman, I thought some of you might find this surprising as well.

Modern temperature sensors, like the MMTS used at many official USHCN climate stations, can theoretically read temperatures every hour or minute or even continuously.  I originally presumed that these modern devices arrived at a daily temperature reading by continuously integrating the temperature over a 24-hour day, or at worst by averaging 24 hourly readings.

WRONG!  While in fact many of the instruments could do this, in reality they do not.  The official daily temperature in the USHCN and most other databases is based on the average of that day’s high and low temperatures.  "Hey, that’s crazy!" you say.  "What if the temperature hovered at 50 degrees for 23 hours, and then a cold front came in the last hour and dropped the temperature 10 degrees.  Won’t that show the average for the day as around 45 when in fact the real average is 49.8 or so?"  Yes.  All true.  The method is coarse and it sucks.
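Here is that hypothetical day worked out in a few lines of Python (my own illustration, with the cold front modeled as a linear drop over the final hour):

```python
import numpy as np

# minute-resolution temperatures: 50F for 23 hours, then a front drops the reading to 40F over the last hour
minutes = np.concatenate([np.full(23 * 60, 50.0), np.linspace(50.0, 40.0, 60)])

true_mean = minutes.mean()                              # ~49.8F, the integrated daily average
min_max_mean = (minutes.max() + minutes.min()) / 2.0    # 45.0F, the official high/low method
print(f"integrated daily mean: {true_mean:.1f} F")
print(f"(high + low) / 2:      {min_max_mean:.1f} F")
```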

Surface temperature measurements are often corrected if the time of day that a "day" begins and ends changes.  Apparently, a shift from a midnight to, say, a 3 PM day break can make a difference of several tenths of a degree in the daily averages.  This made no sense to me.  How could this possibly be true?  Why should an arbitrary beginning or end of a day make a difference, assuming that one is looking at a sufficiently large number of days?  That is how I found out that the sensors were not integrating over the day but just averaging highs and lows.  The latter methodology CAN be biased by the time selected for a day to begin and end (though I had to play around with a spreadsheet for a while to prove it to myself).  Stupid. Stupid. Stupid.
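If you don't want to fiddle with a spreadsheet, here is a quick simulation that shows the same thing.  It is a toy sketch with made-up numbers: a sinusoidal daily cycle peaking mid-afternoon plus random day-to-day weather, with the high/low average computed for different choices of when the "day" resets:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 3650
h = np.arange(n_days * 24) % 24
diurnal = -8.0 * np.cos(2 * np.pi * (h - 3) / 24)          # coolest ~3 AM, warmest ~3 PM
synoptic = np.repeat(rng.normal(0.0, 5.0, n_days), 24)     # day-to-day weather variability
temps = 50.0 + diurnal + synoptic

def min_max_mean(temps, reset_hour):
    """Long-term mean of daily (high + low)/2 when the observational 'day' resets at reset_hour."""
    days = np.roll(temps, -reset_hour).reshape(-1, 24)
    return ((days.max(axis=1) + days.min(axis=1)) / 2.0).mean()

print(f"true hourly mean:            {temps.mean():.2f}")
print(f"high/low mean, midnight day: {min_max_mean(temps, 0):.2f}")
print(f"high/low mean, 3 PM day:     {min_max_mean(temps, 15):.2f}")
print(f"high/low mean, 7 AM day:     {min_max_mean(temps, 7):.2f}")
```

An afternoon day break lets one hot afternoon count as the high for two observational days in a row, so it reads warm; a morning break double-counts cold mornings and reads cool.  The true hourly mean, of course, does not care where you cut the day.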

It is just another reason why the surface temperature measurement system is crap, and we should be depending on satellites instead.  Can anyone come up with one single answer as to why climate scientists eschew satellite measurements for surface temperatures EXCEPT that the satellites don’t give the dramatic answer they want to hear?  Does anyone for one second imagine that any climate scientist would spend 5 seconds defending the surface temperature measurement system over satellites if satellites gave higher temperature readings?

Postscript:  Roger Pielke has an interesting take on how this high-low average method introduces an upwards bias in surface temperatures.

Cargo Cult Science

A definition of "cargo cult" science, actually in the context of particle and high-energy physics, but the term will feel very familiar to those of us who try to decipher climate science:

Her talk is a typical example of cargo cult science. They use words that sound scientific, they talk about experiments and about their excitement. Formally, everything looks like science. There is only one problem with their theoretical work: the airplanes don’t land and the gravitons don’t scatter. It is because they are unable to impartially evaluate facts, to distinguish facts from wishful thinking and results from assumptions, and to abandon hypotheses that have been falsified.

They play an "everything goes" game instead. In this game, nothing is ever abandoned because it would apparently be too cruel for them. They always treat "Yes" and "No" as equal, never answer any question, except for questions where their answers seem to be in consensus – and these answers are usually wrong.

The Technocratic Trap

Technocrats tend to hate and/or distrust bottom-up economic solutions that emerge spontaneously from changing price signals and incentives.  As a result, technocrats in government tend not only to want a problem solved, but to want it solved their way.  They get just as mad, or even more upset, at a problem solved by the market in a way they don’t approve of as they do at the problem going unsolved altogether.

TJIC brings us a great example of this technocratic trap as applied to CO2 abatement:

http://www.boston.com/news/local/article…

Greenhouse gas emissions from Northeast power plants were about 10 percent lower than predicted during the last two years…

But the decrease may have some unanticipated consequences for efforts to combat global warming: It could have the perverse effect of delaying more lasting reductions, by undercutting incentives intended to spur power plants to invest in cleaner technologies and energy efficiency…

I wonder if environmentalists are really as pathetic and perpetually grumpy as they always sound, or if that’s just some sort of kabuki political theater?

Massachusetts and nine other Northeast states are part of a landmark pact called the Regional Greenhouse Gas Initiative that is designed to cap power plant emissions in 2009 and then gradually reduce them by 10 percent over the next decade. Power plants will have to buy emission allowances…

But if emissions are significantly lower than the cap, there would be less demand for allowances, driving down their price and giving power plants little financial incentive to invest in cleaner and more efficient technologies…

It’s almost as if people are hung up on the means, and not the ends.

Oh noz, the industry has realized that the cheapest way (which is to say “the way that best preserves living standards”) to cut carbon emissions is to switch from coal to natural gas…which means that they’re not taking the more expensive way (which is to say “way that destroys living standards”) that we want them to. Boo hoo!

“If the cap is above what power plants are emitting, we won’t see any change in their behavior,” said Derek K. Murrow, director of policy analysis for Environment Northeast, a nonprofit research and advocacy organization. “They just continue business as usual.”

(a) Umm…you’ve already seen a change in their behavior

(b) what do you want? Lower carbon emissions, or to force them to use some pet technology?…

Officials of states involved in RGGI and energy specialists are discussing ways to ensure that allowances have enough value to spark investments in cleaner technologies.

Again, the insistence on technologies. Why?

One solution would be to lower the cap, but that’s likely to be politically difficult…

Laurie Burt, commissioner of the Massachusetts Department of Environmental Protection, said she and other state officials are aware of the problem and discussing ways to solve it.

What problem?

Read it all.

Update: False Sense of Security

A while back I wrote about a number of climate forecasts (e.g. for 2007 hurricane activity) wherein we actually came in in the bottom 1% of forecasted outcomes.  I wrote:

If all your forecasts are coming out in the bottom 1% of the forecast range, then it is safe to assume that one is not forecasting very well.

Well, now that the year is over, I can update one of those forecasts, specifically the forecasts from the UK government Met office that said:

  • Global temperature for 2007 is expected to be 0.54 °C above the long-term (1961-1990) average of 14.0 °C;
  • There is a 60% probability that 2007 will be as warm or warmer than the current warmest year (1998 was +0.52 °C above the long-term 1961-1990 average).

Playing around with the numbers, and assuming a normal distribution of possible outcomes, this implies that their forecast mean is 0.54C and the implied standard deviation is about 0.0785C.  That combination gives a 60% probability of temperatures at or above 0.52C.
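For anyone who wants to check the arithmetic, here is the back-of-envelope calculation (a sketch assuming, as above, that the Met Office forecast distribution is normal):

```python
from scipy import stats

mean = 0.54                     # Met Office central forecast for 2007, C anomaly
# "60% probability that 2007 will be as warm or warmer than 1998 (+0.52C)"
# => for a normal forecast distribution, 0.52C sits at the 40th percentile
z = stats.norm.ppf(0.40)        # about -0.253
sigma = (0.52 - mean) / z
print(f"implied standard deviation: {sigma:.4f} C")   # about 0.079C
```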

The most accurate way to measure the planet’s temperature is by satellite.  This is because satellite measurements include the whole globe (oceans, etc) and not just a few land areas where we have measurement points.  Satellites are also free of things like urban heat biases.  The only reason catastrophists don’t agree with this statement is because satellites don’t give them the highest possible catastrophic temperature reading (because surface readings are, in fact, biased up).  Using this satellite data:

[Chart: RSS MSU global temperature anomaly]

and scaling the data to a 0.52C anomaly in 1998 gets a reading for 2007 of 0.15C.  For those who are not used to the magnitude of anomalies, 0.15C is WAY OFF from the forecasted 0.54C.  In fact, using the mean and standard deviation of the forecast we derived above, the UK Met office basically was saying that it was 99.99997% certain that the temperature would not go so low.  Another way of saying this is that the Met office forecast implied 2,958,859:1 odds against the temperature being this low in 2007.
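Continuing the same back-of-envelope sketch, the tail probability of a 0.15C outcome under that forecast distribution works out like this:

```python
from scipy import stats

mean, sigma = 0.54, 0.0785      # forecast mean and implied standard deviation from above
observed = 0.15                 # 2007 anomaly, scaled from the satellite data
p = stats.norm.cdf(observed, loc=mean, scale=sigma)
print(f"P(anomaly <= {observed}C) = {p:.2e}")            # a few parts in ten million
print(f"implied odds against:    {1 / p - 1:,.0f} : 1")  # roughly three million to one
```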

What are the odds that the Met office needs to rethink its certainty level?

Much more from The Reference Frame

Burying Trees

[Cross-posted from Coyote Blog]

A few weeks ago I argued that if we really thought that CO2 was the biggest threat to the environment (a proposition with which I do not agree) we should not recycle paper or Christmas trees – we should wrap them in Saran Wrap and bury them.  Earlier I wrote this:

Once trees hit their maturity, their growth slows and therefore the rate they sequester CO2 slows.  At this point, we need to be cutting more down, not less, and burying them in the ground, either as logs or paper or whatever.  Just growing forests is not enough, because old trees fall over and rot and give up their carbon as CO2.  We have to bury them.   Right?

I was being a bit tongue-in-cheek, trying to take CO2 abatement to its illogical extreme, but unfortunately the nuttiness of the environmental movement can outrun satire.  These folks advocate going into the forests and cutting down trees and burying them:

Here a carbon sequestration strategy is proposed in which certain dead or live trees are harvested via collection or selective cutting, then buried in trenches or stowed away in above-ground shelters. The largely anaerobic condition under a sufficiently thick layer of soil will prevent the decomposition of the buried wood. Because a large flux of CO2 is constantly being assimilated into the world’s forests via photosynthesis, cutting off its return pathway to the atmosphere forms an effective carbon sink….

Based on data from North American logging industry, the cost for wood burial is estimated to be $14/tCO2 ($50/tC), lower than the typical cost for power plant CO2 capture with geological storage. The cost for carbon sequestration with wood burial is low because CO2 is removed from the atmosphere by the natural process of photosynthesis at little cost. The technique is low tech, distributed, easy to monitor, safe, and reversible, thus an attractive option for large-scale implementation in a world-wide carbon market
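As a quick units check on the two quoted cost figures (one tonne of carbon corresponds to 44/12, or about 3.67, tonnes of CO2):

```python
# $50 per tonne of carbon expressed per tonne of CO2
tonnes_co2_per_tonne_c = 44.0 / 12.0   # molecular weight of CO2 over atomic weight of carbon
cost_per_tc = 50.0
print(f"${cost_per_tc / tonnes_co2_per_tonne_c:.2f} per tonne CO2")   # about $13.60, i.e. roughly $14
```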

It’s a little scary to me that I can anticipate this stuff.

More on Feedback

James Annan, more or less a supporter of catastrophic man-made global warming theory, explains in an email to Steve McIntyre how the typical climate sensitivities used by catastrophists (on the order of 3C or more) are derived.  As a reminder, climate sensitivity is the amount of temperature rise we would expect on earth from a doubling of CO2 from the pre-industrial 280ppm to 560ppm.

If you want to look at things in the framework of feedback analysis, there’s a pretty clear explanation in the supplementary information to Roe and Baker’s recent Science paper. Briefly, if we have a blackbody sensitivity S0 (~1C) when everything else apart from CO2 is held fixed, then we can write the true sensitivity S as

S = S0 / (1 - Sum(f_i))

where the f_i are the individual feedback factors arising from the other processes. If f_1 for water vapour is 0.5, then it only takes a further factor of 0.17 for clouds (f_2, say) to reach the canonical S=3C value. Of course to some extent this may look like an artefact of the way the equation is written, but it’s also a rather natural way for scientists to think about things and explains how even a modest uncertainty in individual feedbacks can cause a large uncertainty in the overall climate sensitivity.

This is the same classic feedback formula I discussed in this prior article on feedback.  And Dr. Annan basically explains the origins of the 3C sensitivity the same way I have explained it to readers in the past:  sensitivity from CO2 alone is about 1C (that is S0), and feedback effects from things like water vapour and clouds triple this to 3C.  The assumption is that the climate has very strong positive feedback.
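The arithmetic is easy to play with yourself.  Here is a minimal sketch of the formula above, using the example feedback factors from Annan's email (the numbers are illustrative, not measurements):

```python
def sensitivity(s0, feedback_factors):
    """Climate sensitivity from the classic feedback formula S = S0 / (1 - sum(f_i))."""
    return s0 / (1.0 - sum(feedback_factors))

s0 = 1.0                                 # no-feedback sensitivity, C per doubling of CO2
print(sensitivity(s0, []))               # 1.0  -> no feedbacks at all
print(sensitivity(s0, [0.5, 0.17]))      # ~3.0 -> water vapour at 0.5 plus clouds at 0.17
print(sensitivity(s0, [0.5, -0.17]))     # ~1.5 -> same water vapour, weak negative cloud feedback
```

Notice how sensitive the answer becomes to small changes in the feedback factors as their sum approaches 1, which is exactly why modest uncertainty in individual feedbacks produces huge uncertainty in the headline sensitivity number.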

Note the implications.  Without any feedback, or with feedback that was negative, we would not expect the world to heat up much more than a degree with a doubling of CO2, of which we have already seen perhaps half.  This means we would only experience another half degree of warming in the next century or so.  But with feedbacks, this half degree of future warming is increased to 2.5 or 3.0 or more degrees.  Essentially, assumptions about feedback are what separate trivial nuisance levels of warming from forecasts that are catastrophic.

Given this, it is instructive to see what Mr. Annan has to say in the same email about our knowledge of these feedbacks:

The real wild card is in the behaviour of clouds, which have a number of strong effects (both on albedo and LW trapping) and could in theory cause a large further amplification or suppression of AGW-induced warming. High thin clouds trap a lot of LW (especially at night when their albedo has no effect) and low clouds increase albedo. We really don’t know from first principles which effect is likely to dominate, we do know from first principles that these effects could be large, given our current state of knowledge. GCMs don’t do clouds very well but they do mostly (all?) suggest some further amplification from these effects. That’s really all that can be done from first principles.

In other words, scientists don’t even know the SIGN of the most important feedback, i.e. clouds.  Of course, in the race to build the most alarming model, they all seem to rush to the assumption that it is positive.  So, yes, if the feedback is a really high positive number (something that is very unlikely in natural, long-term stable physical processes) then we get a climate catastrophe.  Of course, if it is small or negative, we don’t get one at all.

Mr. Annan points to studies he claims show climate sensitivity net of feedbacks in the past to be in the 2-3C range.  Note that these are studies of climate changes tens or hundreds of thousands of years ago, as recorded imperfectly in ice and other proxies.  The best data we have are of course for the last 120 years, when we have measured temperature with thermometers rather than ice crystals, and the evidence of this data points to a sensitivity of at most about 1C net of feedbacks.

So to summarize:

  • Climate sensitivity is the temperature increase we might expect with a doubling of CO2 to 560 ppm from a pre-industrial 280ppm
  • Nearly every forecast you have ever seen assumes the effect of CO2 alone is about a 1C warming from this doubling.  Clearly, though, you have seen higher forecasts.  All of the "extra" warming in these forecasts comes from positive feedback.  So a sensitivity of 3C would be made up of 1C from CO2 directly, tripled by positive feedbacks.  A sensitivity of 6 or 8 still starts with the same 1C but assumes even stronger feedbacks.
  • Most thoughtful climate scientists will admit that we don’t know what these feedbacks are — in so many words, modelers are essentially guessing.  Climate scientists don’t even know the sign (positive or negative) much less the magnitude.  In most physical sciences, upon meeting such an unknown system that has been long-term stable, scientists will assume neutral to negative feedback.  Climate scientists are the exception — almost all their models assume strong positive feedback.
  • Climate scientists point to studies of ice cores and such, which serve as proxies for climate hundreds of thousands of years ago, to justify positive feedbacks.  But for the period for which we have the best data, i.e. the last 120 years, actual CO2 and measured temperature changes imply a sensitivity net of feedbacks closer to 1C, about what a reasonable person would assume from a stable process not dominated by positive feedbacks.

This is Science?

It is just amazing to me that the press has granted statements like the one below the imprimatur of being scientific while labeling folks like me "anti-science" for calling them out:

Previously it was assumed that gradual increases in carbon dioxide (CO2) and other heat-trapping gases in the atmosphere would produce gradual increases in global temperatures. But now scientists predict that an increase of as little as 2˚C above pre-industrial levels could trigger environmental effects that would make further warming—as much as 8˚C—inevitable.

Worse still, a 2˚C increase is highly likely if greenhouse gas concentrations reach 450 parts per million (ppm). They presently stand at 430ppm and are increasing by 2-2.5 ppm per year.

Gee, where do I start?  Well, first, the author can’t even get the simplest facts correct.  World CO2 concentrations hover in the 380s (the amount varies seasonally) and are not anywhere near 430.  Second, I have demonstrated any number of times that our history over the past 120 years would lead us to expect at most a 1 degree rise over pre-industrial levels at 560ppm, and thus a 2 degree rise by 450 is not "highly likely."  Third, just look at the author’s numbers at face value.  Catastrophists believe temperatures have risen (reason disputed) about 0.6-0.7 degrees in the last century or so.  If we are really at 430 ppm, then that means the first 150ppm rise (pre-industrial CO2 was about 280ppm) caused at most 0.6C, but the next 20 ppm to 450 would have to cause 1.4C, this despite the fact that CO2 concentrations have a diminishing-return relationship to temperature.  Yeah, I understand time delays and masking, but really — whoever wrote these paragraphs can’t possibly have understood what he was writing.
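To put rough numbers on that diminishing-return point, here is a back-of-envelope sketch.  It takes the quoted author's own figures at face value (430 ppm today, roughly 0.65C of warming so far) and uses the standard first-order approximation that equilibrium warming scales with the logarithm of CO2 concentration:

```python
import math

def warming(c_ppm, sens_per_doubling, c0_ppm=280.0):
    """Warming above pre-industrial, assuming temperature scales with the log of CO2 concentration."""
    return sens_per_doubling * math.log(c_ppm / c0_ppm) / math.log(2.0)

# calibrate a per-doubling sensitivity to the article's own claims: ~0.65C of warming at 430 ppm
sens = 0.65 / (math.log(430.0 / 280.0) / math.log(2.0))
print(f"implied sensitivity: {sens:.2f} C per doubling")
print(f"extra warming, 430 -> 450 ppm: {warming(450.0, sens) - warming(430.0, sens):.2f} C")
```

On those assumptions the last 20 ppm to 450 adds well under a tenth of a degree, not 1.4C, which is the internal inconsistency I am pointing at.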

But I have not even gotten to the real whopper — that somehow, once we hit 2 degrees of warming, the whole climate system will run away and temperatures will "inevitably" rise another 8C. Any person who tells you this, including Al Gore and his "tipping points," is either an idiot or a liar.  Long-term stable systems do not demonstrate this kind of radically positive feedback-driven runaway behavior (much longer post on climate and positive feedbacks here).  Such behavior is so rare in physical systems anyway, much less ones that are demonstrably long-term stable, that a scientist who assumes it without evidence has to have another agenda, because it is not a defensible assumption (and scientists have no good evidence as to the magnitude, or even the sign, of feedbacks in the climate system).

By the way, note the source and remember my and others’ warnings.  A hard core of global warming catastrophists are socialists who support global warming abatement not because they understand or agree with the science, but because they like the cover the issue gives them to pursue their historic goals.

HT:  Tom Nelson

Working This Week On A Book Project

I am working on a submission (outline and several chapters) for a book prize that is due December 31, so I may not be posting much over the next week.  The contest is for a novel that promotes the principles of freedom, capitalism, and individual responsibility (hopefully without 120-page John Galt radio speeches).

My project is one I have been tinkering with for a while, an update of the Marshall Jevons economist mysteries from the 1980s.  If you are not familiar with this series, Marshall Jevons was a pseudonym for a couple of economists who wrote several murder mysteries that included a number of expositions on how economics applies to everyday life.  Kind of Agatha Christie meets Freakonomics.  I found the first book, Murder at the Margin, to be disappointing, but the second book, The Fatal Equilibrium, was pretty good.  I think the latter was a better book because the setting was university life and the murder revolved around a tenure committee decision, topics the authors could write about closer to their experience.  The books take a pro-free-market point of view (which already makes them unique), and it is certainly unusual to have the solution to a murder turn on how search costs affect pricing variability.

Anyway, for some time, I have been toying with a concept for a young adult book in roughly the same tradition.  I think the Jevons novels are a good indicator of how a novel can teach some simple economics concepts, but certainly the protagonist as fusty stamp-collecting Harvard professor would need to be modified to engage young adults. 

My new novel (or series of novels, if things go well) revolves around a character named Adam Smith.  Adam is the son of a self-made immigrant and heir to a nearly billion-dollar fortune.  At the age of twenty, he rejects his family and inheritance in a wave of sixties rebellion, joins a commune, and changes his name to the unfortunate "Moonbeam."  After several years, he sours on commune life, puts himself through graduate school in economics, and eventually reclaims his family fortune.  Today, he leads two lives:  Adam Smith, eccentric billionaire, owner of penthouses and fast cars, and leader of a foundation [modeled after the IJ]; and Professor Moonbeam, aging hippie high school economics teacher who drives a VW Beetle and appears to live in a trailer park.  There is a murder, of course, and the fun begins when three of his high school students start to suspect that their economics teacher may have a second life.  As you might expect, the kids help him solve the murder while he teaches them lessons about life and economics.  The trick is to keep the book light and fun rather than pedantic, but since one business model in my last novel revolved around harvesting coins in fountains, I think I can do it.

Anyway, wish me luck and I will be back in force come the new year. 

Climate Biases Still Solidly Frozen in Place

I thought this Google screen shot from Tom Nelson was pretty funny.  You actually don’t even need the second article to see the irony.  The Daily Green headline reads "Arctic Sea Ice Freezes Slowly" while the actual text reads, as you can see even in the Google excerpt, "Arctic sea ice refroze at a record pace."

[Screenshot: Google result showing the Daily Green headline and excerpt]

By the way, since this story came out, freezing no longer lags history and Northern Hemisphere snow and ice coverage exceeds the averages of past Decembers.  More on Arctic ice here and here.

Northwest Passage

The other day, when I mentioned the irony of the AP publishing a story about Arctic ice melting on the same day it was announced that Arctic ice was growing at a record rate, I forgot to deal with the bogus claim in the article that this was the first time the Northwest Passage had ever opened.

This myth is discussed here.   In summary:

the Northwest Passage was successfully navigated in 1906, 1940, 1941, 1942, 1944, 1957, 1969, 1977, 1984, 1988, and 2000.

Global Warming Solutions

Via Tom Nelson, comes this helpful list of proposals offered to date to help reverse global warming.  Note that these were presented by their authors as serious proposals.  A couple of examples:

1. Get rid of humans.

Greenpeace co-founder Paul Watson insists we "reduce human populations to fewer than one billion".

2. Put a carbon tax on babies.

Prof Barry Walters, of the University of Western Australia, says families with more than, say, two children should be charged a carbon tax on their little gas emitters.

3. Cull babies.

Toni Vernelli, of green group PETA, says she killed her unborn child because of its potential emissions: "It would have been immoral to give birth to a child that I felt strongly would only be a burden to the world."

4. Sterilise us all.

Dr John Reid, a former Swinburne University academic, gave a lecture on ABC radio recommending we "put something in the water, a virus that would be specific to the human reproductive system, and would make a substantial proportion of the population infertile".

5. Ban second children.

Says Melbourne University population guru Prof Short: "We need to develop a one-child family policy because we are the global warmers."

Read them all.

Another Government Boondoggle

The NY Times has an interesting article on the government-subsidized boondoggle to build a zero-emissions coal plant.  First, we see all the usual aspects of what happens when the government steps in to fund non-economic projects no private investor would touch:

But choosing the location was perhaps the least daunting step. The project, announced by President Bush in 2003, seems to be in perpetual creep mode. The budget, as Matt Wald wrote yesterday, has ballooned 50 percent (because of the worldwide shortage of basics like cement). The timetable has slid. Components are being shed. The portion of the eventual $1.8 billion cost paid by the government is shrinking.

Though I have never studied the numbers in depth, it has been my intuition (from years working as an engineer with power plants and oil refineries) that it will be almost impossible to make this technology economic.  Apparently, others agree:

There are plenty of experts who still doubt that capturing carbon dioxide and putting it in cold storage will ever work at a meaningful scale. Vaclav Smil at the University of Manitoba has calculated that capturing, compressing and storing just 10 percent of current CO2 emissions — here and now — would require as much pipeline and plant infrastructure as are now used worldwide to extract oil from the ground. And oil is a pricey commodity while carbon dioxide is a waste gas.

Handling and compressing gases is a lot more difficult than handling liquids, and liquefying them may take almost as much energy as you get out of the combustion.  So, what does the government do if a technology is so uneconomic and unworkable that it can’t even scrape up enough money to subsidize one plant?  Why, it mandates the technology:

Mr. Hawkins said adequate construction of CO2-trapping plants would happen more swiftly if a “performance standard” requiring this technology were added to climate legislation like the Warner-Lieberman bill being considered in the Senate. Such a provision would require new coal-plant construction to incorporate such systems and spread the cost over the economy so utilities aren’t hit too hard.

Such a mandate would effectively end the construction of new coal plants, which may in fact be the motive of supporters.

This is what happens when government tries to pick winning technologies.  If carbon combustion is really bad (a proposition with which I do not agree) then a carbon tax needs to be instituted, so that individuals can figure out on their own which carbon combustion is the most economic to eliminate and with what technologies.  If the government insists on picking winners with subsidized technology programs, then nuclear strikes me as much more fruitful, as it is already proven and close to economic, and investment can be concentrated on marginal problems like waste disposal rather than fundamental problems like "will this approach even work?"

Global Warming Trojan Horse

Investors Business Daily has a great article reinforcing the point many of us have been making for a while:  The Marxists and anti-globalization rioters and other left-wing extremists that have seemed awfully quiet of late have not disappeared;  they have repackaged themselves as global warming activists, but their goals are exactly the same.

The driving force of the environmental movement is not a cleaner planet — or a world that doesn’t get too hot, in the case of the global warming issue — but a leftist, egalitarian urge to redistribute wealth. A CO2 tax does this and more, choking economic growth in the U.S. and punishing Americans for being the voracious consumers that we are.

Eco-activists have been so successful in distracting the public from their real intentions that they’re becoming less guarded in discussing their ultimate goal.

"A climate change response must have at its heart a redistribution of wealth and resources," Emma Brindal, a "climate justice campaign coordinator" for Friends of the Earth Australia, wrote Wednesday on the Climate Action Network’s blog.

In this case, redistribution would be handled by the Multilateral Adaptation Fund, an agency that would use the carbon tax receipts to help countries that are having to deal with climate change.

Since the "complete list of things caused by global warming" now exceeds 600 (see our "Chilled By The Heat" editorial, Dec. 13), there would be few if any limits on the U.N.’s ability to move riches from countries that have created and earned them to those that have done neither.

Still think this is all about halting climate change? We would go as far as to say that anyone who does is either naive or a dupe. Both the rhetoric and the behavior of the eco-activists back us up.

Protein Wisdom adds this:

The “Greens” are no more interested in clean air and water today than the Soviets were in liberty when they rolled tanks into Prague in 1968. We dismiss them as “silly” at our own peril.

A New Political Gambit

It used to be that the Presidential trick was to set goals, such as "balancing the budget," targeted at a date several years after the President had left office.  Now, political candidates are going to the next step:  setting goals targeted at a date several years after they will be safely dead:

…[Clinton’s] plan would reduce greenhouse gas emissions by 80 percent from 1990 levels by 2050 to avoid the worst effects of global warming…Hillary would increase fuel efficiency standards to 55 miles per gallon by 2030…

By the way, 80 percent below 1990 levels is what? The bottom of the Great Depression?

First Against the Wall

It appears that civil discourse on climate science may soon not be possible, as folks like this are discussing the use of violence against those who do not support the religion of catastrophic man-made global warming.

These are words to contemplate as we head into a 2008 without any significant action taken by the US government (to say nothing of other countries) on climate change. We are in critical battle for this planet, and we need to think seriously about doing whatever it takes to stop the actions which are destroying the land and seas…and contributing to snowballing (or, more appropriately, snow-melting) climate collapse. Are petitions, lobby days, call-ins, protests, and nonviolent civil disobedience enough?

I hear Galileo had the same problem.  By the way, I certainly found it entertaining that the author signs his call to violence against people who do not share his view of the science with "in good heart."

(via Tom Nelson)