Thursday, February 07, 2013



Scientific tests of global warming

Common ground amongst climate protagonists

Though you wouldn’t know it from the antagonistic nature of public discussions about global warming, a large measure of scientific agreement and shared interpretation exists amongst nearly all scientists who consider the issue. The common ground includes:

*   that climate has always changed and always will,

*   that carbon dioxide is a greenhouse gas and warms the lower atmosphere,

*   that human emissions are accumulating in the atmosphere,

*   that a global warming of around 0.5°C occurred in the 20th century, but

*   that global warming has ceased over the last 15 years.

The scientific argument over DAGW is therefore about none of these things. Rather, it is almost entirely about three other, albeit related, issues. They are:

*   the amount of net warming that is, or will be, produced by human-related emissions,

*   whether any actual evidence exists for dangerous warming of human causation over the last 50 years, and

*   whether the IPCC’s computer models can provide accurate climate predictions 100 years into the future.

How does science work?

Arguments about global warming, or more generally about climate change, are concerned with a scientific matter. Science deals with facts, experiments and numerical representations of the natural world around us. Science does not deal with emotions, beliefs or politics, but rather strives to analyse matters dispassionately and in an objective way, such that in consideration of a given set of facts two different practitioners might come to the same interpretation; and, yes, I am aware of the irony of that statement in the present context.

Which brings us to the matter of Occam’s Razor and the null hypothesis. William of Occam (1285-1347) was an English Franciscan monk and philosopher to whom is attributed the saying ‘Pluralitas non est ponenda sine necessitate’, which translates as ‘Plurality should not be posited without necessity.’ This is a succinct statement of the principle of simplicity, or parsimony, that was first developed by Aristotle and which has today come to underlie all scientific endeavour.

The phrase ‘Occam’s Razor’ is now generally used as shorthand to represent the fundamental scientific assumption of simplicity. To explain any given set of observations of the natural world, scientific method proceeds by erecting, first, the simplest possible explanation (hypothesis) that can explain the known facts. This simple explanation, termed the null hypothesis, then becomes the assumed interpretation until additional facts emerge that require modification of the initial hypothesis, or perhaps even invalidate it altogether.

Given the great natural variability exhibited by climate records, and the failure to date to compartmentalize or identify a human signal within them, the proper null hypothesis – because it is the simplest consistent with the known facts – is that global climate changes are presumed to be natural, unless and until specific evidence is forthcoming for human causation.

It is one of the more extraordinary facts about the IPCC that the research studies it favours mostly proceed using an (unjustified) inversion of the null hypothesis – namely that global climate changes are presumed to be due to human-related carbon dioxide emissions, unless and until specific evidence indicates otherwise.

What hypothesis do we wish to test?

Though climate science overall is complex, the greenhouse hypothesis itself is straightforward and it is relatively simple to test it, or its implications, against the available data. First, though, we need to be crystal clear about precisely what we mean by the term.

In general communication, and in the media, the terms greenhouse and greenhouse hypothesis have come to carry a particular vernacular meaning – almost independently of their scientific derivation. When an opinion poll or a reporter solicits information on what members of the public think about the issue, they ask questions such as “Do you believe in global warming?”, “Do you believe in climate change?” or “Do you believe in the greenhouse effect?”

Leaving aside the issue that science is never about belief, all such questions are actually coded ones, being understood by the public to mean “is dangerous global warming being caused by human-related emissions of carbon dioxide”. Needless to say, this is a different, albeit related, question. These and other sloppy ambiguities (“carbon” for “carbon dioxide”, for example) are in daily use in the media, and they lead to great confusion in the public discussion about climate change; they also undermine the value of nearly all opinion poll results.

The DAGW hypothesis that I want to test here is precisely and only “that dangerous global warming is being caused, or will be, by human-related carbon dioxide emissions”. To be “dangerous”, at a minimum the change must exceed the magnitude or rate of warmings that are known to be associated with normal weather and climatic variability.

What evidence can we use to test the DAGW hypothesis?

Many different lines of evidence can be used to test the DAGW hypothesis. Here I have space to present just five, all of which are based upon real world empirical data. For more information, please read both Dr. Hayhoe’s and my book.

Consider the following tests:

(i)     Over the last 16 years, global average temperature, as measured by both thermometers and satellite sensors, has displayed no statistically significant warming; over the same period, atmospheric carbon dioxide has increased by 10%.

Large increases in carbon dioxide have therefore not only failed to produce dangerous warming, but failed to produce any warming at all. Hypothesis fails.
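
For readers who want to see what such a trend-significance test involves, here is a minimal Python sketch that fits an ordinary least squares line to a 16-year series of annual temperature anomalies and reports the p-value of the slope. The anomaly numbers are synthetic placeholders, not real observations, and a serious analysis would also have to allow for autocorrelation in the residuals, which this sketch ignores.

```python
# Minimal sketch of a trend-significance test on a temperature series.
# The anomalies below are synthetic placeholders, not real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1997, 2013) + 0.5                    # 16 hypothetical annual points
anomalies = 0.45 + rng.normal(0.0, 0.1, years.size)    # flat baseline plus noise

# Ordinary least squares fit of anomaly against time.
slope, intercept, r_value, p_value, std_err = stats.linregress(years, anomalies)

print(f"fitted trend : {slope * 100:+.2f} degrees C per century")
print(f"p-value      : {p_value:.3f}")
print("verdict      :", "statistically significant at the 5% level"
      if p_value < 0.05 else "not statistically significant")
```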

(ii)   During the 20th century, a global warming of between 0.4°C and 0.7°C occurred, at a maximum rate (in the early decades of the century) of about 1.7°C/century. In comparison, our best regional climate records show that over the last 10,000 years natural climate cycling has resulted in temperature highs at least 1°C warmer than today, at rates of warming of up to 2.5°C/century.

In other words, both the rate and magnitude of 20th century warming fall well within the envelope of natural climate change. Hypothesis fails, twice.

(iii)  If global temperature is controlled primarily by atmospheric carbon dioxide levels, then changes in carbon dioxide should precede parallel changes in temperature.

In fact, the opposite relationship applies at all time scales. Temperature change precedes carbon dioxide change by about 5 months during the annual seasonal cycle, and by about 700-1000 years during ice age climatic cycling. Hypothesis fails.
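
The lead/lag claim can be illustrated with a simple lagged-correlation calculation. The Python sketch below generates two synthetic monthly series, one of which is a delayed copy of the other, and recovers the lag at which their correlation peaks. The data and the five-month offset are assumptions chosen for illustration, not measurements.

```python
# Illustrative lead/lag estimation via lagged correlation on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 240, 5                                   # 20 years of monthly samples
temp = np.cumsum(rng.normal(0.0, 0.05, n))             # random-walk "temperature"
co2_like = np.roll(temp, true_lag) + rng.normal(0.0, 0.02, n)  # delayed copy + noise

def lead_lag(x, y, max_lag=24):
    """Return the lag (in samples) at which x best leads y (positive: x leads)."""
    m = len(x)
    best, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        a, b = (x[:m - lag], y[lag:]) if lag >= 0 else (x[-lag:], y[:m + lag])
        c = np.corrcoef(a, b)[0, 1]
        if c > best_corr:
            best, best_corr = lag, c
    return best

print("estimated lead of temperature over the CO2-like series (months):",
      lead_lag(temp, co2_like))
```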

(iv)  The IPCC’s computer general circulation models, which factor in the effect of increasing carbon dioxide, project that global warming should be occurring at a rate of +2.0°C/century.

In fact, no warming at all has occurred in either the atmosphere or the ocean for more than the last decade. The models are clearly faulty, and allocate too great a warming effect for the extra carbon dioxide (technically, they are said to overestimate the climate sensitivity). Hypothesis fails.
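
The point about sensitivity can be made concrete with the standard logarithmic relation ΔT = S × ln(C/C0)/ln(2), where S is the equilibrium warming per doubling of carbon dioxide. The sketch below applies it to the roughly 10% rise in CO2 mentioned in test (i), for a few assumed values of S. The sensitivities are illustrative brackets rather than figures from any particular model, and the relation gives the eventual equilibrium response rather than the warming realised to date.

```python
# Sketch of the standard logarithmic forcing relation
#   dT = S * ln(C / C0) / ln(2)
# where S is the equilibrium climate sensitivity (warming per CO2 doubling).
# The concentrations and sensitivities below are illustrative assumptions.
import math

C0, C = 370.0, 407.0          # ppm: a roughly 10% rise, as in test (i)
for S in (1.0, 1.5, 3.0):     # low, moderate and IPCC-style central sensitivity
    dT = S * math.log(C / C0) / math.log(2)
    print(f"S = {S:.1f} C per doubling  ->  equilibrium warming of about {dT:.2f} C")
```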

(v)    The same computer models predict that a fingerprint of greenhouse-gas-induced warming will be the creation of an atmospheric hot spot at heights of 8-10 km in equatorial regions, and enhanced warming also near both poles.

Given that we already know that the models are faulty, it shouldn’t surprise us to discover that direct measurements by both weather balloon radiosondes and satellite sensors show the absence of surface warming in Antarctica, and a complete absence of the predicted low latitude atmospheric hot spot. Hypothesis fails, twice.

One of the 20th century’s greatest physicists, Richard Feynman, observed about science that:

“In general we look for a new law by the following process. First we guess it. Then we compute the consequences of the guess to see what would be implied if this law that we guessed is right. Then we compare the result of the computation to nature, with experiment or experience; compare it directly with observation, to see if it works.

It’s that simple statement that is the key to science. It does not make any difference how beautiful your guess is. It does not make any difference how smart you are, who made the guess, or what his name is. If it disagrees with experiment it is wrong.”

None of the five tests above supports or agrees with the predictions implicit in the greenhouse hypothesis as stated above. By Richard Feynman’s criterion, therefore, the hypothesis is invalid, and that many times over.

Summary

The current scientific reality is that the IPCC’s hypothesis of dangerous global warming has been repeatedly tested, and fails. Despite the expenditure of large sums of money over the last 25 years (more than $100 billion), and great research effort by IPCC-related and other (independent) scientists, to date no scientific study has established a certain link between changes in any significant environmental parameter and human-caused carbon dioxide emissions.

In contrast, the null hypothesis that the global climatic changes that we have observed over the last 150 years (and continue to observe today) are natural in origin has yet to be disproven. As summarised in the reports of the Nongovernmental International Panel on Climate Change (NIPCC), literally thousands of papers published in refereed journals contain facts or writings consistent with the null hypothesis, and plausible natural explanations exist for all the post-1850 global climatic changes that have been described so far.

SOURCE





Consensus science is bad science

When scientists wish to speak with one voice, they typically do so in a most unscientific way: the consensus report. The idea is to condense the knowledge of many experts into a single point of view that can settle disputes and aid policy-making. But the process of achieving such a consensus often acts against these goals, and can undermine the very authority it seeks to project.

My most recent engagement with this form of penance is marked this week with the release of Geoengineering: A National Strategic Plan for Research on Climate Remediation. Sponsored by the Bipartisan Policy Center in Washington DC, the report reflects more than a year of discussion between 18 experts from a diverse range of fields and organizations. It sets out, I think, many valuable principles and recommendations.

The discussions that craft expert consensus, however, have more in common with politics than science. And I don't think I give too much away by revealing that one of the battles in our panel was over the term geoengineering itself.

This struggle is obvious in the report's title, which begins with 'geoengineering' and ends with the redundant term 'climate remediation'. Why? Some of the committee felt that 'geoengineering' was too imprecise; some thought it too controversial; others argued that it was already commonly used, and that a new term would create confusion.

I didn't have a problem with 'geoengineering', but for others it was a do-or-die issue. I yielded on that point (and several others) to gain political capital to secure issues that had a higher priority for me. Thus, disagreements between panellists are settled not with the 'right' answer, but by achieving a political balance across many of the issues discussed.

This political essence of consensus leads to other difficulties. Ask a panel to address broad questions — future directions for a field, say, or ways to improve a government programme — and the recommendations that come back are typically bland and predictable. New and controversial ideas are inherently difficult for experts to agree on. In the absence of consensus, the default position is simply to call for more research — the one recommendation that most scientists can get behind.

Sometimes, expert panels are asked to find consensus on narrow technical questions at the heart of public controversies. The hope is that a unified scientific voice will resolve the dispute, but it rarely works out that way. In 2000, the US National Academies assembled climate experts to resolve discrepancies between the surface and satellite temperature records, as if this would help to settle the political debate. A decade on, it is clear that the goal was not met.

And in 2009, at the height of the US debate on health-care reform, the US Preventive Services Task Force released a consensus report on the risks and benefits of mammograms. Rather than clarifying anything, the key recommendation — that mammograms were being overutilized — became instant ammunition for reform opponents, who viewed it as a threat to patient autonomy.

The fuss over mistakes in the 2007 reports by the Intergovernmental Panel on Climate Change highlights a related problem: a claim of scientific consensus creates a public expectation of infallibility that, if undermined, can erode public confidence. And when expert consensus changes, as it has on health issues from the safety of hormone replacement therapy to nutritional standards, public trust in expert advice is also undermined.

The very idea that science best expresses its authority through consensus statements is at odds with a vibrant scientific enterprise. Consensus is for textbooks; real science depends for its progress on continual challenges to the current state of always-imperfect knowledge. Science would provide better value to politics if it articulated the broadest set of plausible interpretations, options and perspectives, imagined by the best experts, rather than forcing convergence to an allegedly unified voice.

Yet, as anyone who has served on a consensus committee knows, much of what is most interesting about a subject gets left out of the final report. For months, our geoengineering group argued about almost every issue conceivably related to establishing a research programme. Many ideas failed to make the report — not because they were wrong or unimportant, but because they didn't attract a political constituency in the group that was strong enough to keep them in. The commitment to consensus therefore comes at a high price: the elimination of proposals and alternatives that might be valuable for decision-makers dealing with complex problems.

Some consensus reports do include dissenting views, but these are usually relegated to a section at the back of the report, as if regretfully announcing the marginalized views of one or two malcontents. Science might instead borrow a lesson from the legal system. When the US Supreme Court issues a split decision, it presents dissenting opinions with as much force and rigour as the majority position. Judges vote openly and sign their opinions, so it is clear who believes what, and why — a transparency absent from expert consensus documents. Unlike a pallid consensus, a vigorous disagreement between experts would provide decision-makers with well-reasoned alternatives that inform and enrich discussions as a controversy evolves, keeping ideas in play and options open. That is something on which we should all agree.

SOURCE





Spreading an Energy Revolution

ONLY two or three years ago, consensus was building among pundits that we had reached peak oil, that the fossil fuel industry was in its dotage and that the world would suffer repeated energy price shocks in the transition to a post-fossil fuel economy. Many people in the oil industry were skeptical of this dire prognosis, and the extraordinary recent expansion of unconventional gas and oil production in North America proved the optimists to be correct.

What many fail to recognize, however, is that North America’s oil and gas renaissance, which has the potential to fuel a U.S. industrial recovery with cheaper energy, is not a happy accident of geology and lucky drilling. The dramatic rise in shale-gas extraction and the tight-oil revolution (mostly crude oil that is found in shale deposits) happened in the United States and Canada because open access, sound government policy, stable property rights and the incentive offered by market pricing unleashed the skills of good engineers.

Last year, in BP’s Energy Outlook 2030, we hailed the prospect of North American energy self-sufficiency. With the incentive of high oil prices and the application to oil of drilling techniques mastered for shale gas, we now estimate that tight oil will account for almost half of the 16 million barrel per day increase in the world’s oil output by 2030. Almost two thirds of the new oil will come from the Americas, mainly U.S. tight oil and oil sands from Canada. The United States is likely to surpass Saudi Arabia in daily output very soon, and non-OPEC production will dominate global supply growth over the coming decade.

Policy, not geology, is driving the extraordinary turn of events that is boosting America’s oil industry. East Asia boasts shale and tight-oil resources greater than those of the United States. Latin America and Africa too have very substantial endowments. However, the competitive environment, government policy and available infrastructure mean that North America will dominate the production of shale gas and tight oil for some time to come.

Markets play a dual role in changing the landscape. A decade of high prices spurred on the technological change that is now restoring North America’s energy crown. And the same market forces are promoting efficiency and curtailing energy demand in countries where the price mechanism is allowed to do its job. Consumption of liquid fuel among member countries of the Organization for Economic Cooperation and Development will continue to fall and be overtaken in 2014 by demand from emerging market nations, where fuel prices are still often subsidized.

One consequence, not often discussed, is the impact of these changes on today’s oil market. The tight-oil revolution poses a challenge to the OPEC nations and their national oil companies. We predict that all of the additional oil supplied to the market over the next decade will come from unconventional sources outside OPEC.

The expected surge of new oil will lead to increased supply overall and continued market volatility. If history is any guide, OPEC will cut production and forego market share in favor of price stability. But as so often before, its policy response and its ability and willingness to manage spare capacity will be crucial in determining market conditions in the medium and longer term.

In the 1980s, oil from the North Sea and Alaska transformed the market. Today, market-led innovation has brought us to a crossroads again, and the time has come to make critical decisions about energy. Nations with abundant resources must decide whether to follow the path of open markets, including foreign access and competitive pricing. Alternatively, they can opt for restrictive investment regimes that risk becoming less rewarding.

Communities must decide whether the carbon-reducing benefit of using natural gas in power generation outweighs the fear of new drilling technologies. Europe too must grasp the market nettle. Without a clear signal that carbon has a price, European power utilities will be charmed by the cheapness of coal, increasingly available thanks to America’s embrace of shale gas.

A surge in shale gas and tight-oil production is transforming our energy landscape. Forecasts of its potential differ widely. What is certain, however, is that our energy future is not wholly at the mercy of geology. The speed at which we can bring this useful resource to market will depend to a great extent on issues that will be decided by our governments, in our parliaments and in our town halls.

SOURCE




Study: Work less, save the planet

Americans should work less, play more -- and in doing so, save the planet. That’s the basic formula a Washington think tank is shopping around as a way to cut down on global warming.

The shift from a U.S. work model to a more "European" one – which includes shorter work weeks and more vacation time -- could cut as much as half of the expected global temperature rise by 2100, according to a new report from the Center for Economic and Policy Research.  The study claims that scaling back on work hours could bring down greenhouse gases.

“The calculation is simple: fewer work hours means less carbon emission, which means less global warming,” economist David Rosnick said.

Assuming that 40 to 60 percent of potential global warming is already locked in, scaling back work hours could cut about one-quarter to one-half of the warming that is not yet locked in, said Rosnick, who wrote the study.
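
The arithmetic behind that claim is easy to reproduce. The sketch below simply combines the two ranges quoted in the article (40 to 60 percent of warming already locked in, and one-quarter to one-half of the remainder avoidable) to show what fraction of total projected warming each combination would imply. It uses only the article's figures and is illustrative only.

```python
# Back-of-envelope combination of the ranges quoted in the article.
locked_in_fractions = (0.40, 0.60)        # share of warming already locked in
cuts_of_remainder = (0.25, 0.50)          # share of the rest avoided by shorter hours

for locked in locked_in_fractions:
    for cut in cuts_of_remainder:
        avoided = (1 - locked) * cut      # fraction of total projected warming avoided
        print(f"locked in {locked:.0%}, cut {cut:.0%} of the rest "
              f"-> avoid about {avoided:.0%} of total projected warming")
```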

“For many years, European countries have been reducing work hours -- including by taking more holidays, vacation and leave – while the United States has gone the route of increased production, but it might be time for a change,” Rosnick told FoxNews.com Tuesday.

But it's not all umbrella-drinks and beaches. There is a trade-off that many people may not like, he said.

“Obviously, if you are working less, you’re buying less,” he said. “That’s the flip side.”

Rosnick’s study released this week is similar to a 2006 one he co-wrote with economist Mark Weisbrot that looked at the potential environmental effects of European and other countries adopting the American ethic of longer work hours.

In that report, the authors found that “Old Europe” currently consumes about half as much energy per person as the U.S. does. If Europe upped its production levels, it would most likely consume 30 percent more energy. The report outlined how worldwide energy patterns could depend on which model developing countries choose to copy in the coming years. If they choose to follow the U.S. model, they would apparently consume 30 percent more energy than they do now.

SOURCE




Electric cars not viable

Hybrid car pioneer and “father of the Prius” Takeshi Uchiyamada says the billions poured into developing battery electric vehicles have ultimately been in vain. "Because of its shortcomings--driving range, cost and recharging time--the electric vehicle is not a viable replacement for most conventional cars," said Uchiyamada. "We need something entirely new."

Uchiyamada’s comments come as the U.S. Department of Energy announced Thursday that the government is backing off President Barack Obama’s promise to put one million electric cars on American roads by 2015. As Breitbart News reported last September, there are just 30,000 electric cars on American roads.

"Whether we meet that goal in 2015 or 2016, that's less important than that we're on the right path to get many millions of these vehicles on the road," said an Energy Department official.

President Obama made promotion of electric vehicles a key component of his green initiative. Last September, the Congressional Budget Office reported that federal policies to prop up and promote electric cars will cost taxpayers $7.5 billion through 2019.

Several of the electric car companies Obama has funneled taxpayer funds to have floundered. U.S. electric battery maker A123 Systems, which received a $249 million taxpayer-funded government loan, announced last year its decision to sell a controlling stake to Wanxiang, a Chinese company, for $450 million.

Similarly, lithium-ion battery manufacturer Ener1, Inc., which received a $118.5 million taxpayer-funded grant, filed for bankruptcy. And another company, Aptera Motors, has already folded.

“The electric car, after more than 100 years of development and several brief revivals, still is not ready for prime time--and may never be,” concludes Reuters.

SOURCE




World’s first methane hydrate mining to begin off central Japan coast

The Japanese government has revealed that its Japan Oil, Gas, and Metals National Corp. has dispatched a mining ship that will begin the world’s first offshore test to excavate methane hydrate from the seabed. The search for methane hydrate, a potential new energy source, will take place in the eastern Nankai Trough, roughly 70 kilometers (43.5 miles) off Aichi Prefecture‘s Atsumi Peninsula, in central Japan.

The oil, gas, and metals company’s deep-sea drilling ship Chikyu set sail last week for an offshore well that was drilled last year. Measuring 1,000 meters (0.6 miles) deep, the well reaches a 300 meter (980 foot) layer of methane hydrate under the seabed, where the testing is to take place. Also known as “burning ice,” methane hydrate has attracted much attention as a plentiful new natural fuel resource.

The next step will involve inserting a large pipe down into the well in order to separate the methane hydrate into methane gas and water. If everything goes smoothly and there are no delays in scheduling, the extraction will begin in March, with the removal of as much as 10,000 cubic meters of gas per day over a two-week period. Estimates say that Japan’s coastal waters hold nearly 100 times the amount of natural gas that the country uses per year, and the government’s Industry Ministry plans to eventually survey the Sea of Japan for methane hydrate as well.
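
For a sense of scale, the short sketch below does the arithmetic on the quoted figures: the total gas recovered by a two-week test at 10,000 cubic meters per day, and the size of a resource said to be nearly 100 times annual national consumption. The annual consumption value used is an assumed placeholder, since the article does not give one.

```python
# Quick arithmetic on the figures quoted in the article.
test_rate_m3_per_day = 10_000
test_days = 14
test_total_m3 = test_rate_m3_per_day * test_days
print(f"gas recovered in the two-week test: about {test_total_m3:,} cubic meters")

annual_use_m3 = 100e9                 # assumed ~100 billion m3/year (placeholder, not from the article)
resource_m3 = 100 * annual_use_m3     # article: "nearly 100 times" annual use
print(f"implied offshore resource: about {resource_m3:.1e} cubic meters")
print(f"test output as a share of annual use: {test_total_m3 / annual_use_m3:.1e}")
```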

SOURCE

***************************************



