Tuesday, February 05, 2013



Where has that consensus gone?

In good Warmist style, the paper below polls "experts" rather than looking at any climate data.  Sadly, however, they found that their "experts" disagreed about almost everything.  The one consolation was that most of the experts allowed some possibility (a probability greater than 0.17) that climate sensitivity exceeds 4.5°C, i.e. that it could get VERY warm in future.
Expert judgments about transient climate response to alternative future trajectories of radiative forcing

By Kirsten Zickfeld et al.

Abstract

There is uncertainty about the response of the climate system to future trajectories of radiative forcing. To quantify this uncertainty we conducted face-to-face interviews with 14 leading climate scientists, using formal methods of expert elicitation. We structured the interviews around three scenarios of radiative forcing stabilizing at different levels. All experts ranked “cloud radiative feedbacks” as contributing most to their uncertainty about future global mean temperature change, irrespective of the specified level of radiative forcing. The experts disagreed about the relative contribution of other physical processes to their uncertainty about future temperature change. For a forcing trajectory that stabilized at 7 W/m² in 2200, 13 of the 14 experts judged the probability that the climate system would undergo, or be irrevocably committed to, a “basic state change” as ≥0.5. The width and median values of the probability distributions elicited from the different experts for future global mean temperature change under the specified forcing trajectories vary considerably. Even for a moderate increase in forcing by the year 2050, the medians of the elicited distributions of temperature change relative to 2000 range from 0.8–1.8 °C, and some of the interquartile ranges do not overlap. Ten of the 14 experts estimated that the probability that equilibrium climate sensitivity exceeds 4.5°C is > 0.17, our interpretation of the upper limit of the “likely” range given by the Intergovernmental Panel on Climate Change. Finally, most experts anticipated that over the next 20 years research will be able to achieve only modest reductions in their degree of uncertainty.
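As a rough illustration of the kind of arithmetic behind a statement like "the probability that sensitivity exceeds 4.5°C is > 0.17", one can fit a simple distribution to an expert's elicited quantiles and read off the tail probability. The quantiles and the lognormal choice below are this blog's assumptions for illustration only; they are not values taken from the paper.

```python
# Illustrative only: turn an elicited median and 95th percentile for climate
# sensitivity into a lognormal distribution and compute P(S > 4.5 K).
# The quantiles below are made up for the example, not taken from Zickfeld et al.
from math import log, sqrt, erf

median, q95 = 3.0, 6.0          # K, hypothetical elicited quantiles
mu    = log(median)             # lognormal location parameter
sigma = (log(q95) - mu) / 1.645 # 1.645 = standard normal 95th percentile

z = (log(4.5) - mu) / sigma
p_exceed = 0.5 * (1 - erf(z / sqrt(2)))   # P(S > 4.5 K)
print(f"P(sensitivity > 4.5 K) ~ {p_exceed:.2f}")
```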

SOURCE








Some criticism of the IPCC by a Warmist

He thinks the IPCC are deliberately exaggerating the warming that would result from their prophecies coming to pass.  In passing he (rightly) dismisses the paper above as a lightweight contribution to understanding and notes that one of the pollees had openly admitted to deliberately lying about the likelihood of warming.

So, sensitivity has been in the climate blogosphere a bit recently. Just a few days ago, that odd Norwegian press release got some people excited, but it's not clear what it really means. There is an Aldrin et al paper, published some time ago, which gave a decent constraint on climate sensitivity, though nothing particularly surprising or interesting IMO. We thought we had sorted out the sensitivity kerfuffle several years ago, but it seems that the rest of the world still hasn't yet caught up. As I said to Andy Revkin (and he published on his blog), the additional decade of temperature data from 2000 onwards (even the AR4 estimates typically ignored the post-2000 years) can only work to reduce estimates of sensitivity, and that's before we even consider the reduction in estimates of negative aerosol forcing, and additional forcing from black carbon (the latter, being very new, is not included in any calculations AIUI). It's increasingly difficult to reconcile a high climate sensitivity (say, over 4°C) with the observational evidence for the planetary energy balance over the industrial era. But the Norwegian press release seems to refer to as yet unpublished research, and some of the claims seem a bit hard to credit. So we will have to wait for more details before drawing any more solid conclusions.
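The energy-balance reasoning in that last point can be sketched in a few lines. The round numbers below (industrial-era warming, net forcing, ocean heat uptake, forcing for doubled CO2) are illustrative assumptions chosen for this post, not figures from any of the papers mentioned, so the output is a sketch of the logic rather than an estimate.

```python
# Simple equilibrium-sensitivity estimate from the industrial-era energy balance:
#   S ~= F_2x * dT / (dF - dQ)
# where dT is observed warming, dF is net radiative forcing, dQ is the
# planetary heat uptake (mostly ocean), and F_2x is the forcing from doubled CO2.
# All numbers below are illustrative assumptions, not results from the cited papers.

F_2X = 3.7   # W/m^2, canonical forcing for a doubling of CO2
dT   = 0.8   # K, rough industrial-era surface warming
dQ   = 0.5   # W/m^2, rough current planetary heat uptake

for dF in (1.5, 2.0, 2.5):          # net forcing, W/m^2 (depends strongly on aerosols)
    S = F_2X * dT / (dF - dQ)
    print(f"net forcing {dF:.1f} W/m^2  ->  implied sensitivity ~ {S:.1f} K")
```

The direction of the effect is the point: the smaller the negative aerosol offset, the larger the net forcing and the lower the implied sensitivity, which is why the revised aerosol and black-carbon estimates push the number down.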

Before then, there was the minor blogstorm (at least in some quarters) surrounding Nic Lewis' criticism of the IPCC's stubborn adherence to their old estimate of climate sensitivity. This, of course, being despite the additional evidence which I've just mentioned above.

When I looked at the IPCC drafts, I didn't actually notice the substantial change in estimated aerosol uncertainty that Nic focussed on. With limited time and energy to wade through several hundred pages of draft material, I mostly looked for how and where they had (or had not, but perhaps should have) referred to my work, to make sure it was fairly and accurately represented. I was pretty unimpressed with some parts of the first draft, actually, and made a number of suggestions. Of course in line with the IPCC conditions, I'm not going to say what was or was not in any draft. According to IPCC policy, my comments will all be available in the fullness of time, but I have also criticised this delayed release, so in the spirit of openness here is one comment I made about their discussion of sensitivity in Chapter 12 (p55 in the first order draft):

    "It seems very odd to portray our work as an outlier here. Sokolov et al 2009, Urban and Keller 2010, Olson et al (in press JGR) have also recently presented similar results (and there may be more as yet unpublished, eg Aldrin at the INI meeting back in 2010). Such "observationally constrained pdfs" were all the rage a few years ago and featured heavily in the last IPCC report, there is no clear explanation for your sudden dismissal of them in favour of what seems to be a small private opinion poll. A more balanced presentation could be: "Annan and Hargreaves (2011a) criticize the use of uniform priors and argue that sensitivities above 4.5°C are extremely unlikely (less than 5%). Similar results have been obtained by a number of other researchers [add citations from the above]."

Note, for the avoidance of any doubt, that I am not quoting directly from the unquotable IPCC draft, but only repeating my own comment on it. However, those who have read the second draft of Chapter 12 will realise why I previously said I thought the report was improved :-) Of course there is no guarantee as to what will remain in the final report, which, for all the talk of extensive reviews, is not even seen by the proletariat, let alone opened to their comments, prior to its final publication. The paper I refer to as a "small private opinion poll" is of course the Zickfeld et al PNAS paper. The pollees listed in the Zickfeld paper are largely the self-same people responsible for the largely bogus analyses that I've criticised over recent years, and which, even if they were valid then, are certainly outdated now. Interestingly, one of them stated quite openly in a meeting I attended a few years ago that he deliberately lied in these sorts of elicitation exercises (i.e. exaggerating the probability of high sensitivity) in order to help motivate political action. Of course, there may be others who lie in the other direction, which is why it seems bizarre that the IPCC appeared to rely so heavily on this paper to justify their choice, rather than relying on published quantitative analyses of observational data. Since the IPCC can no longer defend their old analyses in any meaningful manner, it seems they have to resort to an unsupported "this is what we think, because we asked our pals". It's essentially the Lindzen strategy in reverse: having firmly wedded themselves to their politically convenient long tail of high values, their response to new evidence is little more than sticking their fingers in their ears and singing "la la la I can't hear you".

Of course, this still leaves open the question of what the new evidence actually does mean for climate sensitivity. I have mentioned above several analyses that are fairly up to date. I have some doubts about Nic Lewis' analysis, as I think some of his choices are dubious and will have acted to underestimate the true sensitivity somewhat. For example, his choice of ocean heat uptake is based on taking a short-term trend over a period in which the observed warming is markedly lower than the longer-term multidecadal value. I don't think this is necessarily a deliberate cherry-pick, any more than previous analyses running up to the year 2000 were (the last decade is a natural enough choice to have made), but it does have unfortunate consequences. Irrespective of what one thinks about aerosol forcing, it would be hard to argue that the rate of net forcing increase and/or overall radiative imbalance has actually dropped markedly in recent years, so any change in net heat uptake can only be reasonably attributed to a bit of natural variability or observational uncertainty. Lewis has also adjusted the aerosol forcing according to his opinion of which values are preferred - coincidentally, he comes down on the side of an answer that gives a lower sensitivity. His results might be more reasonable if he had at least explored the sensitivity of his result to the assumptions made. Using the last 30y of ocean heat data and simply adopting the official IPCC forcing values rather than his modified versions (since after all, his main point is to criticise the lack of coherence in the IPCC report itself) would add credibility to his analysis. A still better approach would be to use a model capable of representing the transient change, and fitting it to the entire time series of the various relevant observations. Which is what people like Aldrin et al have done, of course, and which is why I think their results are superior.
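The suggestion to "explore the sensitivity of his result to the assumptions made" amounts to repeating the same energy-balance arithmetic over a grid of heat-uptake and aerosol choices. A minimal sketch, again with purely illustrative numbers, might look like this:

```python
# Sketch of an assumption sweep for the energy-balance estimate.
# dQ spans a low (recent short-term trend) to a higher (multidecadal) heat uptake;
# dF_aero spans a strongly negative to a weakly negative aerosol forcing.
# All values are illustrative assumptions, not the figures used by Lewis or the IPCC.

F_2X, dT = 3.7, 0.8              # W/m^2 and K, as in the sketch above
dF_ghg   = 2.8                   # W/m^2, rough greenhouse-gas forcing (assumed)

for dQ in (0.3, 0.5, 0.7):                    # W/m^2, heat uptake assumptions
    for dF_aero in (-1.2, -0.9, -0.6):        # W/m^2, aerosol forcing assumptions
        dF = dF_ghg + dF_aero                 # net forcing
        S = F_2X * dT / (dF - dQ)
        print(f"dQ={dQ:.1f}, aerosol={dF_aero:+.1f} -> S ~ {S:.1f} K")
```

The spread across that small grid is itself the informative quantity: if it is wide, the headline sensitivity depends heavily on choices that need to be defended explicitly.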

But the point stands, that the IPCC's sensitivity estimate cannot readily be reconciled with forcing estimates and observational data. All the recent literature that approaches the question from this angle comes up with similar answers, including the papers I mentioned above. By failing to meet this problem head-on, the IPCC authors now find themselves in a bit of a pickle. I expect them to brazen it out, on the grounds that they are the experts and are quite capable of squaring the circle before breakfast if need be. But in doing so, they risk being seen as not so much summarising scientific progress, but obstructing it.

SOURCE   






New paper finds tree-ring studies underestimate climate extremes of the past

A new paper published in Nature Climate Change finds that tree-ring reconstructions of temperature, such as Mann's infamous hockey stick, "underestimate climate fluctuations of, for example, air temperature," because the ring data are complicated by "the climate of past years and other factors like tree age" and by precipitation. "Our results point to uncertainties in the global climate system that were previously not recognized," says David Frank, co-author of the study.

Understanding Earth's Climate Prior to the Industrial Era

Climate signals locked in the layers of glacial ice, preserved in the annual growth rings of trees, or fingerprinted in other so-called proxy archives such as lake sediments, speleothems, and corals allow researchers to quantify climate variation prior to instrumental measurements. An international research team has now investigated hundreds of these proxy records from across the globe and compared them with both simulations of the Earth’s climate and instrumental measurements of temperature and precipitation.

Climate extremes not always recognized in proxy archives

The scientists learned that these proxy archives provide an incomplete record of climate variation. The annual width or density of tree-rings is influenced not only by temperature while the ring is developing, but also by the climate of past years and other factors like tree age. This makes it difficult to extract pure temperature signals from these natural archives.

Importantly, the researchers found out that proxy data underestimate climate fluctuations of, for example, air temperature over the land surface where large year-to-year variability is common. In contrast, long-term trends in precipitation tend to be exaggerated by the proxy records. These findings indicate that the proxy data often result in a “blurry picture” of climate variation. The researchers were able to conclude from their work that short-term extreme climate events, such as individual years with hot summers, are not well captured by the proxy reconstructions.

Temperature trends can’t be used to understand rainfall

Investigations of the individual factors and processes fingerprinted in tree-ring, ice-core and speleothem records are needed to develop a more accurate history and understanding of the climate system. The authors explicitly warn that proxy records that predominantly reflect temperature variation should not be used to draw conclusions about precipitation change and vice-versa. "Our results point to uncertainties in the global climate system that were previously not recognized," says David Frank, co-author of this study. He continues: "This might be surprising because we know more about the Earth’s climate now than, say, 20 years ago. Part of the scientific process is to confront and uncover these unknowns while developing climate reconstructions." There is still a lot of basic research needed to reduce uncertainties about how the Earth’s climate system operated prior to the industrial era and how it may operate in the future.
Spectral biases in tree-ring climate proxies

Jörg Franke et al.

Abstract

External forcing and internal dynamics result in climate system variability ranging from sub-daily weather to multi-centennial trends and beyond (refs 1, 2). State-of-the-art palaeoclimatic methods routinely use hydroclimatic proxies to reconstruct temperature (for example, refs 3, 4), possibly blurring differences in the variability continuum of temperature and precipitation before the instrumental period. Here, we assess the spectral characteristics of temperature and precipitation fluctuations in observations, model simulations and proxy records across the globe. We find that whereas an ensemble of different general circulation models represents patterns captured in instrumental measurements, such as land–ocean contrasts and enhanced low-frequency tropical variability, the tree-ring-dominated proxy collection does not. The observed dominance of inter-annual precipitation fluctuations is not reflected in the annually resolved hydroclimatic proxy records. Likewise, temperature-sensitive proxies overestimate, on average, the ratio of low- to high-frequency variability. These spectral biases in the proxy records seem to propagate into multi-proxy climate reconstructions for which we observe an overestimation of low-frequency signals. Thus, a proper representation of the high- to low-frequency spectrum in proxy records is needed to reduce uncertainties in climate reconstruction efforts.
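The "ratio of low- to high-frequency variability" mentioned in the abstract can be estimated from the power spectrum of an annual series. A minimal sketch of such a comparison, using synthetic data in place of real instrumental and proxy series and standard SciPy/NumPy routines, might look like this:

```python
# Sketch: compare the low-/high-frequency variance ratio of two annual series.
# Real analyses use instrumental temperature and proxy reconstructions;
# here two synthetic series stand in purely for illustration.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
years = 300
instrumental = rng.standard_normal(years)                    # mostly year-to-year noise
proxy = np.convolve(instrumental, np.ones(10) / 10, "same")  # smoothed: inflated low frequencies

def low_high_ratio(series, split_period=10.0):
    """Spectral power at periods > split_period divided by power at shorter periods."""
    freqs, power = welch(series, fs=1.0, nperseg=128)         # annual sampling
    low = power[(freqs > 0) & (freqs < 1.0 / split_period)].sum()
    high = power[freqs >= 1.0 / split_period].sum()
    return low / high

print("instrumental:", round(low_high_ratio(instrumental), 2))
print("proxy-like:  ", round(low_high_ratio(proxy), 2))
```

Because the smoothed series has had its year-to-year variance damped, its low- to high-frequency ratio comes out higher, which is the kind of spectral bias the paper reports for temperature-sensitive proxies.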

SOURCE

SOURCE




 

Is clean energy an impossible dream?

"Don’t get us wrong: If low-polluting renewable energy sources could displace fossil fuels without massive taxpayer subsidies that would harm the economy, you’d find us at the front of the parade. But President Obama’s undying devotion to clean energy — memorably invoked in his inaugural address — should trouble anyone who does not believe in showering public money on industries with no hope of a back-end payoff for the taxpayer or consumer.

On the steps of the Capitol, Obama again spoke of clean energy as the energy of the future, intoning: “We cannot cede to other nations the technology that will power new jobs and new industries.” He also repeated the argument that clean energy is a necessary prerequisite for saving the world from catastrophic global warming. Let’s look at both points.

If clean energy is the energy of the future, then it’s news to the analysts within the Obama administration. The U.S. Energy Information Administration (EIA) — the analytic arm of the U.S. Department of Energy — predicts that renewable energy (excluding liquid biofuels like ethanol, which are, at present, as carbon-intensive as crude oil) will rise from 8 percent of total U.S. energy consumption today to a grand total of 11 percent in 2040. Moreover, that modest gain in market share is not expected to come from improvements in clean energy’s ability to compete with fossil fuels. No, the EIA believes that this anemic growth stems “mainly from the implementation of … state renewable portfolio standard (RPS) programs for electricity generation” (that is, state programs that simply dictate that a certain amount of renewables are produced regardless of cost).

If this is the main pillar of the president’s plan to create jobs, then we’re in big trouble. First of all, there’s no evidence to suggest that “clean” energy is more labor intensive than “brown” energy. After all, once the wind farms or solar facilities are built, it doesn’t take a lot of employees to fuel them or run them unless they happen to break down. If plant construction is the main source of job creation, then we could accomplish the same end by building museums, highways, oil refineries, or a few dozen Egyptian-style pyramids for that matter.

To be fair, forecasting future energy market shares is a problematic and — if past is prologue — nearly pointless exercise. Much hinges on technological innovations and breakthroughs that have yet to occur (and may never occur). Even on the eve of a revolution in hydraulic fracturing, few forecasters saw anything but sky-high natural gas prices as far as the eye could see. Still, the EIA’s forecasts represent our most educated guesses about where the future will take us — and alas, even those who draw paychecks from the Obama administration believe that clean energy will remain a bit player in energy markets despite the myriad tax credits, loan guarantees, and government production mandates to change that reality.

It’s difficult to believe that this modest gain in market share is going to do much to reduce the impact of climate change. Happily, hydraulic fracturing is doing that environmental job for us. As Mitt Romney and his cohorts on the right were fond of telling us during the recent presidential campaign, 135 coal-fired power plants have already closed during the Obama administration and another 175 are scheduled to close by 2016. But what Romney & co. didn’t tell us is that low-cost natural gas — courtesy of hydraulic fracturing — was the main reason for those plant closures. Jesse Ausubel, director of the Program for the Human Environment at Rockefeller University, argues persuasively that this will continue as carbon-rich fuels continue to give way — as they have historically — to hydrogen-rich fuels. Yesterday, it was coal displacing biomass, then oil displacing coal. Today, it’s natural gas displacing oil and coal. Tomorrow, it will likely be hydrogen displacing natural gas.

Would a more aggressive set of government policies succeed on the clean energy front? One never knows, but it’s worth noting that the two instances in which the federal government has made Herculean efforts to turn ugly energy ducks into beautiful economic swans — nuclear energy and corn ethanol — have failed spectacularly despite decades of concentrated political effort and tens of billions of dollars of taxpayer assistance. Nuclear energy and corn ethanol continue to be so uncompetitive that, absent continuing government subsidy, those industries would largely disappear. There’s no reason to think that throwing the same effort into clean energy will turn out any differently.

Environmentalists remain wedded to clean energy subsidies because they fear that, even if we are correct, no better policy avenue exists to address climate change. This approach is likely to yield next to nothing, although it does provide the illusion that climate risks are being addressed. But they aren’t. Far better, we think, for environmental voters to have no such illusions about what the president is delivering.

SOURCE





Glencore director says corn use in biofuel questionable

A director of commodity trading giant Glencore on Sunday questioned the conversion of corn into ethanol biofuel, saying it can contribute to higher prices.

Critics of using foodstuffs to make fuel say the process can drive up food prices by reducing available supplies, hitting the world's poorest people hardest.

Responding to a question in a panel discussion at the Kingsman Dubai sugar conference, Chris Mahoney, director of agricultural products at Glencore, said: "Ethanol production from grains and from edible oil is questionable."  He added, "It has been a factor in creating a higher price environment."

Sunny Verghese, CEO of commodity merchant Olam International Ltd, which trades a range of agricultural commodities, was more critical of the use of corn to make ethanol.

"It is inappropriate. It does not make sense to convert corn to ethanol," Verghese told delegates.  "But it makes sense to convert sugarcane to ethanol."

Later Verghese told Reuters: "I don't believe that converting corn into ethanol helps the food complex. I don't think, given the input-output usage efficiency, it makes a lot of sense to do this."  He did not elaborate.

The February 2-5 Kingsman sugar conference has gathered more than 600 sugar trade leaders from around the world.

SOURCE




 

Another Made Up Mandate on Energy that Doesn't Exist

Requiring the citizens of the kingdom to purchase something that doesn’t exist, and then fining them for not doing it, sounds more like the behavior of a tinhorn dictator than the actions of a global superpower—but then, maybe the “superpower” status has led the US government to believe that it can “let the wish be father to the thought.”

Perhaps Congress, the authors of the Clean Air Act, and, more specifically, the Environmental Protection Agency (EPA)—and even biofuel lobbyists—have attended too many motivational seminars in which they were taught: “If you can dream it, then you can achieve it.”

The dream to “achieve” is cellulosic biofuel or ethanol—which has an admirable goal of producing a renewable transportation fuel without impacting the world’s food supply. Different from corn- or sugar-based ethanol—which is technologically achievable (with questionable benefits)—cellulosic ethanol is made from wood chips, switchgrass, and agricultural waste, such as corn cobs.

The problem is the dream doesn’t match reality.

Through the Clean Air Act, the EPA can mandate a set volume of cellulosic biofuels that refiners must blend into gasoline based on “the projected volume available.” In 2007, the Energy Independence and Security Act (EISA) established annual renewable fuel volume targets. The “targets” increase each year to reach 36 billion gallons by 2022. The EISA’s original cellulosic biofuel expectation for 2013 was 1 billion gallons.

The targets gave birth to a new cellulosic ethanol industry. Thanks to the government mandates, start-ups such as Range Fuels and Cello Energy were born. They cranked out press releases touting a potential for millions of gallons of the biofuel. Based on optimistic projections aimed at attracting investors, the EPA set its targets.

In 2006, President Bush pledged government funding for the nascent industry—declaring that cellulosic ethanol would be “practical and competitive within six years.” In March 2007, Range Fuels received a $76 million grant from the Department of Energy and another $80 million from the Obama administration in 2009. According to the Wall Street Journal, in May 2009, “Range's former CEO, Mitch Mandich, explained that the problem was that nobody had figured out how to produce cellulosic ethanol in commercial quantities.” Despite the approximately $300 million in a combination of private, state, and federal funding, Range Fuels never produced cellulosic ethanol. The company filed for bankruptcy in December 2011. Cello Energy filed for bankruptcy in October 2010.

So much for “If you can dream it, then you can achieve it.”

But the lack of cellulosic ethanol didn’t deter the EPA. While it did dial back the 100-million-gallon 2010 mandate to 6.5 million gallons, the EPA didn’t give up on its “dream.” It has continued to predict fantastical production volumes: approximately 6.5 million gallons in 2010, 6.6 in 2011 and 8.7 in 2012. These predictions establish the volumes that refiners must blend into our gasoline—regardless of whether or not the cellulosic ethanol is available. These mandates are called the Renewable Fuel Standards (RFS). If refiners fail to meet the mandate, they have to purchase waiver credits—essentially a fine that serves as a hidden tax on consumers. The startups that failed to meet their projections, and therefore didn’t provide the fuel stock to the blenders, weren’t penalized; the refiners, who are generally in no position to ensure the fuel’s availability, were. Addressing the fines, Stephen Brown, VP for Federal Government Affairs at Tesoro, which operates seven refineries in the western United States, says: “An unavoidable fine levied by the government sounds an awful lot like a tax.”
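For a sense of how the “hidden tax” arises: the waiver-credit price is commonly described as the greater of 25 cents per gallon or $3.00 minus the average wholesale gasoline price, adjusted for inflation, and a refiner short of the mandate buys one credit per missing gallon. The simplified formula and all the inputs below are assumptions for illustration; they ignore the inflation adjustment and other accounting details, so they will not reproduce the figures cited later in this article.

```python
# Rough sketch of the cellulosic waiver-credit cost, under the commonly cited
# "greater of $0.25 or $3.00 minus wholesale gasoline price" rule.
# Inflation adjustment is ignored and all inputs are illustrative assumptions.

def waiver_credit_price(wholesale_gasoline: float) -> float:
    """Approximate per-gallon credit price in dollars."""
    return max(0.25, 3.00 - wholesale_gasoline)

mandated_gallons = 8_650_000      # e.g. the later-vacated 2012 cellulosic volume
produced_gallons = 0              # essentially none was available
wholesale_price  = 2.50           # $/gal, assumed average wholesale gasoline price

shortfall = mandated_gallons - produced_gallons
cost = shortfall * waiver_credit_price(wholesale_price)
print(f"credit price ~ ${waiver_credit_price(wholesale_price):.2f}/gal, "
      f"total ~ ${cost/1e6:.1f} million")
```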

To date, virtually no cellulosic ethanol has been produced. Despite the demise of Range and Cello, there are still a few other companies, which have received taxpayer funding, that claim to be near commercial production. Bill Day, Executive Director of Media Relations for Valero—which had made some investments in cellulosic ethanol (though none are currently active) and is the world’s largest independent petroleum refiner and marketer—told me cellulosic ethanol was five years away five years ago, and is still five years away today. Companies like Tesoro and Valero still have to pay the fines, even though the product doesn’t exist. Those costs are passed on to us, the consumers, in the form of higher gasoline prices.

It is the absurdity of these credit purchases that prompted the American Petroleum Institute (API) to file a lawsuit last year challenging the EPA’s rulemaking. API petitioned the court to review the EPA’s January 2012 RFS—which API’s Group Downstream Director Bob Greco claims would have cost more than $8 million in credit purchases.

On January 25, the US Court of Appeals for the District of Columbia, in a unanimous decision, rejected the 8.65 million gallon cellulosic ethanol target—finding that the EPA was projecting far too much production of cellulosic ethanol for 2012.

API’s Greco was optimistic about the decision: “The court has provided yet another confirmation that EPA’s renewable fuels program is unworkable and must be scrapped.” He said: “This decision relieves refiners of complying with the unachievable 2012 mandate and forces EPA to adopt a more realistic approach for setting future cellulosic biofuel mandates.” But Valero’s Day is more cautious: “It is too early to say what the result will be. This issue is likely to continue beyond this one decision. Valero will be watching closely.”

US Senator David Vitter (R-LA), top Republican of the Environment and Public Works Committee, is also enthusiastic: “The EPA has been playing games with made-up standards on renewable fuels, but the recent appellate court decision should be their first clue that it needs to stop. I applaud the DC Circuit Court of Appeals for recognizing how ludicrous the situation is to force refiners to either purchase amounts of a product that doesn’t even exist or pay a hefty fine.”

Likewise, Charles T. Drevna, President of the American Fuel & Petrochemical Manufacturers (AFPM), believes: “The court’s decision provides welcome relief and puts the EPA on notice that it must act as a neutral arbiter rather than as a promoter of cellulosic fuel.” AFPM has a pending petition for a waiver of the 2012 mandate. A similar 2011 waiver was denied.

While the court vacated the 2012 mandate, it was not a total win for industry. The court rejected API’s arguments that the EPA had to follow the US Energy Information Administration’s cellulosic biofuel volume projections in setting its own, and that the EPA was not entitled to consider information from cellulosic biofuel producers in setting its projection. Biofuel lobbying groups didn’t see the court’s ruling as a total setback: “Today’s decision, once again, rejects broad-brushed attempts to effectively roll back the federal Renewable Fuel Standard.”

A January 29 Bloomberg report predicts: “Tossing out the 2012 standard for those cellulosic fuels, … leaves the 2013 standard in doubt, as well. The EPA is overdue to issue its mandate for 2013, and this decision may further complicate that process.” On January 30, Vitter said: “The EPA has been getting away with mandating exaggerated fuel standards based on a pie-in-the-sky wish, but now they’ll actually have to use some cold, hard facts.”

Apparently the EPA didn’t allow the decision to “complicate” their 2013 standards. On January 31, the EPA released its 2013 requirements: 14 million gallons of cellulosic biofuels—60% more than 2012. The Fuels America Coalition continues to dream. It reports: “Cellulosic biofuels are being produced now and millions of gallons of cellulosic fuel are expected to come online in the next two years.”

“If you can dream it, then you can achieve it” hasn’t worked so far.

In response to the EPA’s 2013 proposed mandate, Greco suggests that the EPA needs to “provide a more realistic assessment of potential future production rather than simply relying on the assertions of companies whose self-interest is to advertise lofty projections of their ability to produce the cellulosic biofuel.” API recommends basing predictions on the previous year of actual production.

Once again, it looks like the consumers will be paying the price for the EPA’s incompetence in mandating something that doesn’t exist—and for more lawsuits.

Maybe if we dream about a reasonable EPA Administrator to replace the ideologically blinded Lisa Jackson, we can achieve it.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL  and EYE ON BRITAIN.   My Home Pages are   here or   here or   here.  Email me (John Ray) here

Preserving the graphics:  Graphics hotlinked to this site sometimes have only a short life and if I host graphics with blogspot, the graphics sometimes get shrunk down to illegibility.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here and here


*****************************************

