Thursday, July 03, 2014


Goddard was right after all

Anthony Watts climbs down in the article below. A climate record that people rely on to justify billions of dollars of panic spending is now admitted to contain extensive "zombie" data.  Watts is still apologizing for the Warmist "scientists", as he obviously wants to be loved.  But he has no explanation for the fact that most of the "errors" are in the Warmist direction.  If you are aware of the extensive exposition of crookedness at NOAA by Roger Pielke Sr., you will be much less optimistic about the Warmist "scientists" than Watts is

Sometimes, you can believe you are entirely right while simultaneously believing that you’ve done due diligence. That’s what confirmation bias is all about. In this case, a whole bunch of people, including me, got a severe case of it.

I’m talking about the claim made by Steve Goddard that 40% of the USHCN data is “fabricated”, which I and a few other people thought was clearly wrong.

Dr. Judith Curry and I have been conversing a lot via email over the past two days, and she has written an illuminating essay that explores the issue raised by Goddard and the sociology going on. See her essay:

http://judithcurry.com/2014/06/28/skeptical-of-skeptics-is-steve-goddard-right/

Steve Goddard aka Tony Heller deserves the credit for the initial finding, Paul Homewood deserves the credit for taking the finding and establishing it in a more comprehensible way that opened closed eyes, including mine, in his post entitled Massive Temperature Adjustments At Luling, Texas.  Along with that is his latest followup, showing the problem isn’t limited to Texas, but also in Kansas. And there’s more about this below.

Goddard early on (June 2) gave me his source code that made his graph, but I couldn’t get it to compile and run. That’s probably more my fault than his, as I’m not an expert in the C++ programming language. Had I been able to, things might have gone differently. Then there was the fact that the problem Goddard noted doesn’t show up in GHCN data, and I didn’t see the problem in any of the data we had for our USHCN surface stations analysis.

But, the thing that really put up a wall for me was this moment on June 1st, shortly after getting Goddard’s first email with his finding, which I pointed out in On ‘denying’ Hockey Sticks, USHCN data, and all that – part 1.

Goddard initially claimed 40% of the STATIONS were missing, which I said right away was not possible. It raised my hackles, and prompted my “you need to do better” statement. Then he switched the text in his post from stations to data while I was away for a couple of hours at my daughter’s music recital. When I returned, I noted the change, with no acknowledgment of it on his post, and that is what really put up the wall for me. He probably looked at it as if he were just fixing a typo; I looked at it as sweeping an important distinction under the rug.

All of that added up to a big heap of confirmation bias. I was so used to Goddard being wrong that I expected it again, but this time Steve Goddard was right, and my confirmation bias prevented me from seeing that there was in fact a real issue in the data and that NCDC has dead stations that are reporting data that isn’t real: mea culpa.

But that’s the same problem many climate scientists have: they are used to some skeptics being wrong on some issues, so they put up a wall. That is why the careful and exacting analyses we see from Steve McIntyre should be a model for us all. We have to “do better” to make sure that the claims we make are credible, documented, phrased in non-inflammatory language, understandable, and, most importantly, right.

Otherwise, walls go up, confirmation bias sets in.

Now that the wall is down, NCDC won’t be able to ignore this. Even John Nielsen-Gammon, who was critical of Goddard along with me in the PolitiFact story, now says there is a real problem. So does Zeke, and we have all sent or forwarded email to NCDC advising them of it.

I’ve also been on the phone Friday with the assistant director of NCDC and chief scientist (Tom Peterson), and also with the person in charge of USHCN (Matt Menne). Both were quality, professional conversations, and both thanked me for bringing it to their attention.  There is lots of email flying back and forth too.

They are taking this seriously. They have to, as the final data as currently presented for USHCN is clearly wrong. John Nielsen-Gammon sent me a cursory analysis for Texas USHCN stations, noting he found a number of stations that had “estimated” data in place of actual good data that NCDC has in hand and that appears in the RAW USHCN data file on their FTP site.

What is going on is that while the RAW data file has the actual measurements, for some reason the code that produces the final published USHCN data doesn’t get the memo that good data is actually present for these stations, so it “infills” them with estimated data derived from surrounding stations. It’s a bug, a big one. And when Zeke did a cursory analysis Thursday night, he discovered it was systemic to the entire record, with up to 10% of stations having “estimated” data spanning over a century:

And here is the real kicker: “zombie weather stations” exist in the USHCN final data set that are still generating data, even though they have been closed.

Remember Marysville, CA, the poster child for bad station siting? It was the station that gave me my “light bulb moment” on the issue of station siting.

It was closed just a couple of months after I introduced it to the world as the prime example of “How not to measure temperature”. The MMTS sensor was in a parking lot, with hot air blowing on it from the a/c units on the nearby electronics sheds for the cell phone tower:

Guess what? Like Luling, TX (which is still open, but getting estimated data in place of the actual data in the final USHCN data file), Marysville is still producing estimated monthly data, marked with an “E” flag, even though NOAA’s own metadata show it was closed in 2007:

There are quite a few “zombie weather stations” in the USHCN final dataset, possibly up to 25% of the 1218 stations that make up the network. In my conversations with NCDC on Friday, I was told these were kept in and “reporting” as a policy decision, to provide “continuity” of data for scientific purposes. While there “might” be some justification for that sort of thinking, few people know about it: there’s no disclaimer or caveat in the USHCN FTP folder at NCDC or in the readme file that describes this. They only “hint” at it, saying:

"The composition of the network remains unchanged at 1218 stations"

But that really isn’t true, as some USHCN stations out of the 1218 have been closed and are no longer reporting real data, but instead are reporting estimated data.
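For readers who want to check this sort of thing themselves, here is a minimal sketch of how a USHCN-style published monthly file could be screened for estimated ("E"-flagged) values that sit where a real raw measurement exists. The file names, fixed-width column layout, flag convention and missing-value sentinel below are assumptions made purely for illustration; NCDC's readme is the authority on the real format.

```python
# Sketch: find months in a USHCN-style "final" file that carry the "E"
# (estimated) flag even though the matching "raw" file holds a real measurement.
# File names, column widths, the flag convention and the missing-value sentinel
# are assumptions for illustration only; check NCDC's readme for the real layout.

MISSING = -9999           # assumed missing-value sentinel
ID_LEN, YEAR_LEN = 11, 4  # assumed widths: station id, then year
FIELD = 9                 # assumed: 6-character value + 3 flag characters per month

def parse_line(line):
    """Return (station_id, year, [(value, flags)] * 12) from one assumed-format record."""
    sid = line[:ID_LEN].strip()
    year = int(line[ID_LEN:ID_LEN + YEAR_LEN])
    months = []
    for m in range(12):
        start = ID_LEN + YEAR_LEN + m * FIELD
        value_text = line[start:start + 6]
        flags = line[start + 6:start + FIELD]
        value = int(value_text) if value_text.strip() else MISSING
        months.append((value, flags))
    return sid, year, months

def load(path):
    """Index a whole file by (station, year)."""
    table = {}
    with open(path) as fh:
        for line in fh:
            sid, year, months = parse_line(line.rstrip("\n"))
            table[(sid, year)] = months
    return table

final = load("ushcn_final.tavg.txt")  # hypothetical file names
raw = load("ushcn_raw.tavg.txt")

infilled_over_real = 0
for key, final_months in final.items():
    raw_months = raw.get(key)
    if raw_months is None:
        continue
    for (f_val, f_flags), (r_val, _) in zip(final_months, raw_months):
        # estimate published even though a raw measurement exists
        if "E" in f_flags and r_val != MISSING:
            infilled_over_real += 1

print("Months where an estimate replaced an existing raw value:", infilled_over_real)
```

A run over the real files would, if the reports above are right, turn up both infilled months at open stations like Luling and wholly estimated records at closed "zombie" stations.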

NCDC really should make this clear. While it “might” be OK to produce a data file that has estimated data in it, not everyone is going to understand what that means, or that stations that have long been dead are producing estimated data. NCDC has failed to notify the public, and even their colleagues, of this. Even the Texas State Climatologist, John Nielsen-Gammon, didn’t know about these “zombie” stations until I showed him. If he had known, his opinion might have been different on the Goddard issue. When even professional people in your sphere of influence don’t know you are infilling dead weather station data like this, you can be sure that your primary mission to provide useful data is FUBAR.

NCDC needs to step up and fix this along with other problems that have been identified.

And they are. I expect some sort of statement, and possibly a correction, next week. In the meantime, let’s let them do their work and go through their methodology. It will not be helpful to ANYONE if we start beating up the people at NCDC ahead of such a statement and/or correction.

And there is yet another issue: the recent change to using something called “climate divisions” to calculate the national and state temperatures.

Certified Consulting Meteorologist and Fellow of the AMS Joe D’Aleo writes in with this:

"I had downloaded the Maine annual temperature plot from NCDC Climate at a Glance in 2013 for a talk. There was no statistically significant trend since 1895. Note the spike in 1913 following super blocking from Novarupta in Alaska (similar to the high latitude volcanoes in late 2000s which helped with the blocking and maritime influence that spiked 2010 as snow was gone by March with a steady northeast maritime Atlantic flow). 1913 was close to 46F. and the long term mean just over 41F.

Then late this frigid winter, seemingly in a panic, big changes occurred at NCDC. I wanted to update the Maine plot for another talk and got this from NCDC CAAG.

Note that 1913 was cooled nearly 5 degrees F and does not stand out. There is a warming of at least 3 degrees F since 1895 (they list 0.23/decade) and the new mean is close to 40F.

Does anybody know what the REAL temperature of Maine is/was/is supposed to be? I sure as hell don’t. I don’t think NCDC really does either."

In closing…

Besides moving toward a more accurate temperature record, the best thing about all this hoopla over the USHCN data set is the PolitiFact story where we have all these experts lined up (including me as the token skeptic) who stated without a doubt that Goddard was wrong and rated the claim “pants on fire”.

They’ll all be eating some crow, as will I, but now that I have Gavin for dinner company, I don’t really mind at all.

More HERE  (See the original for links, graphics etc.)




What a nit!

"There is no such thing as right and wrong"  -- except when it suits Leftists, of course.  And climate change is WRONG!

Abstract:

The prominent Australian earth scientist, Tim Flannery, closes his recent book Here on Earth: A New Beginning with the words “… if we do not strive to love one another, and to love our planet as much as we love ourselves, then no further progress is possible here on Earth”. This is a remarkable conclusion to his magisterial survey of the state of the planet. Climatic and other environmental changes are showing us not only the extent of human influence on the planet, but also the limits of programmatic management of this influence, whether through political, economic, technological or social engineering. A changing climate is a condition of modernity, but a condition which modernity seems uncomfortable with. Inspired by the recent “environmental turn” in the humanities—and calls from a range of environmental scholars and scientists such as Flannery—I wish to suggest a different, non-programmatic response to climate change: a reacquaintance with the ancient and religious ideas of virtue and its renaissance in the field of virtue ethics. Drawing upon work by Alasdair MacIntyre, Melissa Lane and Tom Wright, I outline an apologetic for why the cultivation of virtue is an appropriate response to the challenges of climate change.

SOURCE





Swapping Climate Models for a Roll of the Dice

One of the greatest failures of climate science has been the dismal performance of general circulation models (GCMs) in accurately predicting Earth's future climate. For more than three decades huge predictive models, run on the biggest supercomputers available, have labored mightily and turned out garbage. Their most obvious failure was missing the now almost eighteen-year “hiatus,” the pause in temperature rise that has confounded climate alarmists and serious scientists alike. So poor has been the models' performance that some climate scientists are calling for them to be torn down and built anew, this time using different principles. They want to adopt stochastic methods—so-called Monte Carlo simulations based on probabilities and randomness—in place of today’s physics-based models.

It is an open secret that computer climate models just aren't very good. Recently scientists on the Intergovernmental Panel on Climate Change (IPCC) compared the predictions of 20 major climate models against the past six decades of climate data. According to Ben Kirtman, a climate scientist at the University of Miami in Florida and IPCC AR5 coordinating author, the results were disappointing. According to a report in Science, “the models performed well in predicting the global mean surface temperature and had some predictive value in the Atlantic Ocean, but they were virtually useless at forecasting conditions over the vast Pacific Ocean.”

Just how bad the models are can be seen in a graph that has been widely seen around the Internet. Generated by John Christy, Richard McNider, and Roy Spencer, the graph has generated more heat than global warming, with climate modeling apologists firing off rebuttal after rebuttal. Problem is, the models still suck, as you can see from the figure below.



Regardless of the warmists' quibbles, the truth is plain to see: climate models miss the mark. But then, this comes as no surprise to those who work with climate models. In the Science article, “A touch of the random,” science writer Colin Macilwain lays out the problem: “researchers have usually aimed for a deterministic solution: a single scenario for how climate will respond to inputs such as greenhouse gases, obtained through increasingly detailed and sophisticated numerical simulations. The results have been scientifically informative—but critics charge that the models have become unwieldy, hobbled by their own complexity. And no matter how complex they become, they struggle to forecast the future.”

Macilwain describes the current crop of models this way:

"One key reason climate simulations are bad at forecasting is that it's not what they were designed to do. Researchers devised them, in the main, for another purpose: exploring how different components of the system interact on a global scale. The models start by dividing the atmosphere into a huge 3D grid of boxlike elements, with horizontal edges typically 100 kilometers long and up to 1 kilometer high. Equations based on physical laws describe how variables in each box—mainly pressure, temperature, humidity, and wind speed—influence matching variables in adjacent ones. For processes that operate at scales much smaller than the grid, such as cloud formation, scientists represent typical behavior across the grid element with deterministic formulas that they have refined over many years. The equations are then solved by crunching the whole grid in a supercomputer."

It's not that the modelers haven't tried to improve their play toys. Over the years all sorts of new factors have been added, each adding more complexity to the calculations and hence slowing down the computation. But that is not where the real problem lies. The irreducible source of error in current models is the grid size.

Indeed, I have complained many times on this blog that the fineness of the grid is insufficient for the problem at hand. This is because many phenomena are much smaller than the grid boxes: tropical storms, for instance, represent huge energy transfers from the ocean surface to the upper atmosphere and can be totally missed. Other factors—things like rainfall and cloud formation—also happen at sub-grid scales.

“The truth is that the level of detail in the models isn't really determined by scientific constraints,” says Tim Palmer, a physicist at the University of Oxford in the United Kingdom who advocates stochastic approaches to climate modeling. “It is determined entirely by the size of the computers.”

The problem is that halving the size of the grid divisions requires an order-of-magnitude increase in computer power. Making the grid fine enough is just not possible with today's technology.
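A rough back-of-the-envelope sketch of why that is: halving the horizontal spacing doubles the number of boxes in each horizontal direction, and the time step usually has to shrink in proportion to keep the numerics stable, so each halving multiplies the work roughly eightfold, before any vertical refinement or memory and communication overheads. The exponents below are the standard textbook scaling assumptions, not figures taken from any particular model.

```python
# Back-of-the-envelope cost of refining a grid-point model.
# Assumption: cost scales with (boxes in x) * (boxes in y) * (number of time steps),
# and the time step must shrink in step with the horizontal spacing to stay stable.

def relative_cost(refinement_factor):
    """Cost multiplier when the horizontal grid spacing is divided by refinement_factor."""
    horizontal_boxes = refinement_factor ** 2   # finer in both x and y
    time_steps = refinement_factor              # proportionally shorter time step
    return horizontal_boxes * time_steps

for factor in (2, 4, 10):
    print(f"{factor}x finer grid -> roughly {relative_cost(factor):,}x the computation")

# 2x finer -> ~8x the work; 10x finer -> ~1,000x the work, before vertical refinement.
```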

In light of this insurmountable problem, some researchers go so far as to demand a major overhaul, scrapping the current crop of models altogether. Taking cues from meteorology and other sciences, the model reformers say the old physics-based models should be abandoned and new models, based on stochastic methods, need to be written from the ground up. Pursuing this goal, a special issue of the Philosophical Transactions of the Royal Society A will publish 14 papers setting out a framework for stochastic climate modeling. Here is a description of the topic:

"This Special Issue is based on a workshop at Oriel College Oxford in 2013 that brought together, for the first time, weather and climate modellers on the one hand and computer scientists on the other, to discuss the role of inexact and stochastic computation in weather and climate prediction. The scientific basis for inexact and stochastic computing is that the closure (or parametrisation) problem for weather and climate models is inherently stochastic. Small-scale variables in the model necessarily inherit this stochasticity. As such it is wasteful to represent these small scales with excessive precision and determinism. Inexact and stochastic computing could be used to reduce the computational costs of weather and climate simulations due to savings in power consumption and an increase in computational performance without loss of accuracy. This could in turn open the door to higher resolution simulations and hence more accurate forecasts."

In one of the papers in the special edition, “Stochastic modelling and energy-efficient computing for weather and climate prediction,” Tim Palmer, Peter Düben, and Hugh McNamara state the stochastic modeler's case:

"[A] new paradigm for solving the equations of motion of weather and climate is beginning to emerge. The basis for this paradigm is the power-law structure observed in many climate variables. This power-law structure indicates that there is no natural way to delineate variables as ‘large’ or ‘small’—in other words, there is no absolute basis for the separation in numerical models between resolved and unresolved variables."

In other words, we are going to estimate what we don't understand and hope those pesky problems of scale just go away. “A first step towards making this division less artificial in numerical models has been the generalization of the parametrization process to include inherently stochastic representations of unresolved processes,” they state. “A knowledge of scale-dependent information content will help determine the optimal numerical precision with which the variables of a weather or climate model should be represented as a function of scale.” It should also be noted that these guys are pushing “inexact” or fuzzy computer hardware to better accommodate their ideas, but that does not change the importance of their criticism of current modeling techniques.
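To make the contrast concrete, here is a toy sketch (not any model's actual scheme) of a deterministic sub-grid closure versus a stochastically perturbed one. The form of the "closure" and every number in it are invented; the point is only that the stochastic version returns a different, randomly perturbed tendency each time it is called, so repeated runs sample a spread of outcomes instead of a single deterministic answer.

```python
import random

def deterministic_tendency(box_temperature_k):
    """Invented toy closure: sub-grid processes nudge the box back toward 288 K."""
    return -0.1 * (box_temperature_k - 288.0)

def stochastic_tendency(box_temperature_k, noise_amplitude=0.3):
    """Same mean behaviour, but the tendency is multiplied by a random factor
    (loosely in the spirit of perturbed-tendency schemes; the details are invented)."""
    perturbation = 1.0 + random.gauss(0.0, noise_amplitude)
    return deterministic_tendency(box_temperature_k) * perturbation

box_temperature = 293.0  # kelvin, arbitrary illustrative value
print("deterministic tendency:", deterministic_tendency(box_temperature))
print("five stochastic samples:",
      [round(stochastic_tendency(box_temperature), 3) for _ in range(5)])
```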

So what is this “stochastic computing” that is supposed to cure all of climate modeling's ills? It is actually something quite old, often referred to as Monte Carlo simulation. In probability theory, a purely stochastic system is one whose state is non-deterministic—in other words, random. The subsequent state of the system is determined probabilistically using randomly generated numbers, the computer equivalent of throwing dice. Any system or process that must be analyzed using probability theory is stochastic at least in part. Perhaps the most famous early use was by Enrico Fermi in the 1930s, when he used a random method to calculate the properties of the newly discovered neutron. Nowadays, the technique is used by professionals in such widely disparate fields as finance, project management, energy, manufacturing, engineering, research and development, insurance, oil & gas, transportation, and the environment.

Monte Carlo simulation generates a range of possible outcomes and the probabilities with which they will occur. Monte Carlo techniques are quite useful for simulating systems with many coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and weather forecasts. Other examples include modeling phenomena with significant uncertainty in inputs, which certainly applies to climate modeling. Unlike current GCM, this approach does not seek to simulate natural, physical processes, but rather to capture the random nature of various factors and then make many simulations, called an ensemble.
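As a minimal illustration of the technique itself, the sketch below pushes randomly drawn inputs through a deliberately trivial "model" many times and reads off the spread of outcomes; that range of outcomes and their relative frequencies is the whole product of a Monte Carlo simulation. The toy relation and the input distributions are invented purely for illustration.

```python
import random
import statistics

def toy_model(sensitivity, forcing):
    """Invented toy relation: response = sensitivity * forcing. Not a real climate model."""
    return sensitivity * forcing

def sample_inputs():
    """Uncertain inputs drawn from assumed probability distributions (values illustrative)."""
    sensitivity = random.uniform(0.3, 1.2)  # response per unit of forcing
    forcing = random.gauss(3.0, 0.5)        # some forcing index
    return sensitivity, forcing

# The "ensemble": many rolls of the dice through the same model.
outcomes = sorted(toy_model(*sample_inputs()) for _ in range(10_000))

print(f"mean outcome        : {statistics.mean(outcomes):.2f}")
print(f"5th-95th percentile : {outcomes[500]:.2f} to {outcomes[9500]:.2f}")
```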

Since the 1990s, ensemble forecasts have been used routinely to account for the inherent uncertainty of weather processes. This involves analyzing multiple forecasts created with an individual forecast model by using different physical parameters and/or varying the initial conditions. Such ensemble forecasts have been used to help define forecast uncertainty and to extend forecasting further into the future than otherwise possible. Still, as we all know, even the best weather forecasts are only good for five or six days before they diverge from reality.
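That limited forecast horizon is easy to reproduce with a toy chaotic system. The sketch below integrates the textbook Lorenz-63 equations for a small ensemble of nearly identical starting points: the members agree closely at first and then spread apart, which is exactly why operational centres run perturbed-initial-condition ensembles and quote a forecast spread rather than a single answer. The integration scheme, step size and perturbation size here are illustrative choices, not anything taken from a forecasting centre.

```python
import random

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the classic Lorenz-63 system (textbook parameters)."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def integrate(x, y, z, steps):
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x  # track one variable as a stand-in for "the forecast"

# An ensemble: the same model started from almost identical initial conditions.
base = (1.0, 1.0, 20.0)
members = [(base[0] + random.gauss(0.0, 1e-4), base[1], base[2]) for _ in range(20)]

for steps in (100, 500, 2000):
    finals = [integrate(x, y, z, steps) for x, y, z in members]
    spread = max(finals) - min(finals)
    print(f"after {steps:4d} steps the ensemble spread is {spread:8.4f}")
# Early on the members barely differ; after enough steps they disagree wildly.
```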

An example can be seen in the tracking of Atlantic hurricanes. It is now common for the nightly weather forecast during hurricane season to include a probable track for a hurricane approaching the US mainland. The probable track is derived from many individual model runs.

Can stochastic models be successfully applied to climate change? Such models are based on a current state which is the starting point for generating many future forecasts. The outputs are based on randomness filtered through observed (or guessed at) probabilities. This, in theory, can account for such random events as tropical cyclones and volcanic eruptions more accurately than today's method of just applying an average guess across all simulation cells. The probabilities are based on previous observations, which means that the simulations are only valid if the system does not change in any significant way in the future.

And herein lies the problem with shifting to stochastic simulations of climate change. It is well known that Earth's climate system is constantly changing, creating what statisticians term nonstationary time series data. You can fit a model to previous conditions by tweaking the probabilities and inputs, but you cannot make it forecast the future because the future requires a model of something that has not taken form yet. Add to that the nature of climate according to the IPCC: “The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible.”

If such models had been constructed before the current hiatus—the 17+ year pause in rising global temperatures that nobody saw coming—they would have been as much at a loss as the current crop of GCMs. You cannot accurately predict that which you have not previously experienced, measured, and parametrized, and our detailed climate data are laughably limited. With perhaps a half century of detailed measurements, there is no possibility of constructing models that would encompass the warm and cold periods of the Holocene interglacial, let alone the events that marked the last deglaciation (or those that will mark the start of the next glacial period).

Economists have been forced to deal with this type of system because the economic system of the world is not static but always changing (see “Econometrics vs Climate Science”). They have developed a number of tools that can provide some insight, but not a solution, to this situation. While economists have led the way for climate forecasters, look at how untrustworthy economic forecasts remain. The sad truth is that this effort will also not work for long-range prediction, any more than economists can tell us what the economic outlook is for 2100. It is time for climate scientists to get out of the forecasting game and go back to doing real, empirically based science.

Be safe, enjoy the interglacial and stay skeptical.

SOURCE




Jim Hansen's 400,000 Hiroshima bombs' worth of heat per day produces RECORD Antarctic sea ice

Amid much wriggling by the Warmists

The sea ice surrounding Antarctica, which, as I reported in my book, has been steadily increasing throughout the period of satellite measurement that began in 1979, has hit a new all-time record high for areal coverage.

The new record anomaly for Southern Hemisphere sea ice, the ice encircling the southernmost continent, is 2.074 million square kilometers and was posted for the first time by the University of Illinois at Urbana-Champaign’s The Cryosphere Today early Sunday morning.

It was not immediately apparent whether the record had occurred on Friday or Saturday. Requests for comment to Bill Chapman, who runs The Cryosphere Today, were not immediately returned.

The previous record anomaly for Southern Hemisphere sea ice area was 1.840 million square kilometers and occurred on December 20, 2007.

Global sea ice area, as of Sunday morning, stood at 0.991 million square kilometers above average. (The figure was arrived at by adding the Northern Hemisphere anomaly and the Southern Hemisphere anomaly. A graph provided by The Cryosphere Today showed the global anomaly as 1.005 million square kilometers.)

Although early computer models predicted a diminishment of both Northern Hemisphere and Southern Hemisphere sea ice due to anthropogenic global warming, subsequent modeling has posited that the results of warming around Antarctica would, counter-intuitively, generate sea ice growth.

A freshening of the waters surrounding the southernmost continent as well as the strengthening of the winds circling it were both theorized as explanations for the steady growth of Antarctica’s sea ice during the period of satellite measurement.

A number of prominent climatologists have discounted the growth of Antarctic sea ice, arguing that it is less significant to global circulation than ice in the Arctic basin.

Walt Meier, formerly of the National Snow and Ice Data Center and currently of NASA’s Goddard Institute for Space Studies, has previously said that Antarctic sea ice, which has little ice that survives year to year, is less significant than Arctic sea ice to the climate system.

“While the Arctic has seen large decreases through the year in all sectors, the Antarctic has a very regional signal – with highs in some areas and lows in others,” Meier said in 2013. “And of course, the Arctic volume is decreasing substantially through the loss of old ice. The Antarctic, which has very little old ice, hasn’t much of a volume change, relatively speaking.”

The new Antarctic record anomaly was more than 10 percent greater than the previous record.

The steady growth of Antarctic sea ice and its influence on global sea ice appeared to provide a public relations problem, at a minimum, for those warning of global warming’s menace. According to Meier and some other climatologists, global sea ice area is simply not a metric to consider when examining the climate system.

“A plot of global sea ice is just not informative or useful,” Meier said.

Global sea ice, during the course of the last year and a half, has seen its most robust 18-month period of the last 13 years, maintaining, on average, a positive anomaly for an 18-month period for the first time since 2001.

Phil Jones, of the Climatic Research Unit at the University of East Anglia, waded into the global sea ice analysis in 2013 as well.

“Adding the Arctic and Antarctic sea ice extents doesn’t make that much sense as the two regions are at opposite ends of the world, and the seasons are opposite,” Jones said at the time.

As I also reported in Don’t Sell Your Coat, the temperature at the South Pole has been declining during the past four decades as well.

SOURCE





What Is The Right Level Of Response To Anthropogenic Induced Climate Change?

DEBATE SUMMARY

What Is The Right Level Of Response To Anthropogenic Induced Climate Change?

Held at The Royal Society on 16th June, 2014

Chair: The Earl of Selborne GBE FRS
Chairman, The Foundation for Science and Technology

Speakers:

Sir Mark Walport FRS FMedSci
Government Chief Scientific Adviser

David Davies MP
MP for Monmouth

Professor Jim Skea CBE
Imperial College London and Committee on Climate Change

The Rt Hon Peter Lilley MP
MP for Hitchin and Harpenden

THE EARL OF SELBORNE opened the debate by explaining that the Foundation welcomed the opportunity to provide a neutral platform for both sides of the climate change debate to come together. He hoped that the debate would help to identify common ground.

SIR MARK WALPORT said that it was clear that climate change was happening; the question was ‘what should be our response?’ The physics was accepted; the changing concentration of greenhouse gases (GHGs) was leading to warming of the atmosphere. We know levels of carbon dioxide are higher than ever before and that global emissions are rising. 36 gigatonnes of carbon dioxide were emitted in 2013. The latest report of the Intergovernmental Panel on Climate Change (IPCC) discusses the decline in Arctic ice extent and thickness, the rise in sea levels and indications that there is an increasing likelihood of extreme weather patterns and temperatures, such as intense rainstorms and periods of excess temperatures.

We can respond to climate change through mitigation, adaptation or enduring suffering. In all probability we will need all three. We can mitigate by reducing GHG emissions and through physical works; we can adapt, though there are limits to the resources available, security issues, and human will; and we can change lifestyles. We cannot accurately predict regional effects of global warming, but are sure that most effects will be negative. Limiting the rise in atmospheric temperature is vital: if the temperature increase were in the range of 2°C to 5°C it could, at the upper end of the range, lead to the extinction of many species. Above 2°C it was possible that “tipping points”, such as the melting of the Greenland ice sheet, could occur over a very long period. So we must try to limit global GHG emissions to keep temperature rises below 2°C. Many countries are legislating in an effort to do this, but international agreement is important. As a contribution to meeting the global 2°C target the UK has set a target of reducing GHG emissions by 80% by 2050 compared to 1990 levels.

We need an urgent debate between scientists and politicians about how to do this at affordable cost, while maintaining sustainability and security. There is no single magic bullet - we need greater energy efficiency, reduction of emissions from all carbon fuels wherever used - in transport, industry or domestically - and development of low carbon supply options, and increased research and innovation in mitigation and adaptation to climate change.

We cannot wait and see; this generation must choose what to do now to safeguard the planet for future generations.

DAVID DAVIES said that he knew no one who denied the fact that climate was changing, because of the presence of carbon dioxide in the atmosphere. The activities of mankind and society lead to carbon dioxide emissions but it does not follow that the observed increase in atmospheric temperatures in the last 150 years comes from human activity.

There is great variability in global temperature arising from natural causes, as the effect of ice ages throughout history makes clear. Even within historical memory we know that there were warmer and colder periods (the little ice age of the 17th century) and it may be that we are moving from a colder period to a warmer one simply through natural variation. So how can we be sure that the observed 0.8°C global temperature rise over the last 150 years comes from anthropogenic sources?

There is no clear correlation between temperature rises and carbon dioxide emissions. There was no correlation in the early 20th century, and since 1997 there has been no rising trend in global temperature.

There are many other causes which can effect temperature changes, such as volcanic emissions. We need to be able to distinguish increases in temperature due to human activity from changes from natural causes. This we cannot do; so to base policies on the need to reduce emission from human activities is unsound.

The precautionary principle is often invoked - we must do something in case disaster might otherwise happen. But this ignores the possibility that disasters can happen in other areas – pandemic disease or financial meltdown for example. What response should be made to these or other possible disasters? By pursuing policies which raise energy costs, the government is driving manufacturing abroad, where manufacturing facilities will continue to emit just as much carbon dioxide.

The UK is being expected to pay the equivalent of an insurance premium for risks which other countries are also responsible for. He did not accept that the increase in emissions from developing countries will be disastrous for them because these countries will become much wealthier and will be able to spend their increased wealth on coping with climate change.

He welcomed the debate because he doubted whether scientists were as open as they should be about the data they held and their models. Environmental groups should be challenged for pursuing contradictory agendas - wanting to limit carbon emissions, yet opposing nuclear new build and the development of shale gas. Gas could displace coal in power generation reducing carbon emissions.

PROFESSOR SKEA said he sat on Working Group III of the Intergovernmental Panel on Climate Change (IPCC). The principal concern of Working Group III was to address the options to mitigate climate change. A key concern was how to respond to the upward trend in global temperature over the 20th century.

More than 190 countries have signed up to the UN goal of keeping global temperature increases below 2°C. According to the IPCC report, this meant reducing global emissions by 40% to 70% by 2050 compared to 2010 levels. This could only be done by a massive increase in low carbon energy production through developing nuclear power, renewables or deploying cost effective carbon capture and storage (CCS) systems, and promoting energy efficiency, particularly in transport.

This meant a change in investment priorities, away from fossil fuels towards other energy options. We do not have sufficient information about costs to judge between expenditure on mitigation and adaptation, but overall, if the 2°C target is to be reached, we will need to forego 1% to 4% of consumption by 2030. But these estimates do not take into account the reduced impacts and benefits from better air quality and greater energy security.

Climate change is a global problem; dealing with it is a common responsibility. The UK is not alone - consider the actions taken in the US and China. Of course economic development is good - but it brings unwelcome side effects which need government action. The policy response should be based on scientific evidence. He cited the early resistance to the passing of the Public Health Acts after the cholera epidemics in the 19th century and the Clean Air Act of the 1950s which eventually gained wide acceptance. Climate change is one of the biggest global challenges. The UK is right in its response.

PETER LILLEY said that he did not doubt the science of climate change, but he was concerned about the refusal of those committed to the environmental cause to engage in debate about the economic consequences of proposals. He was particularly concerned about the effects premature decarbonisation would have on the poor and in developing countries. He had voted against the Climate Change Act because he had read the cost benefit analysis provided when the Bill was debated in the House. The analysis showed that the potential cost was twice the benefit from global warming. No one wanted to discuss the cost; they simply wished to demonstrate moral superiority.

He particularly doubted the way that models had been used to forecast the future path of global average temperatures. He showed a chart of 50 model plots of global temperature versus time. Only two models in his diagram correlated with the historical data. But all 50 were cited as evidence. In short, we do not know the path of future long-term temperature trends. Asked if the current pause is temporary or long-term, a scientist replied that we would only know in 50 years what the long-term trends were.

The poor in developing countries were vulnerable because they were poor, not because they suffered from the weather. If their energy costs rise - because of renewables - they will consume less energy and remain poorer than they would otherwise be. They would be less healthy as a result. Lord Stern, in his report to HM Treasury, advocated spending now so that our descendants would have to spend less in the future. But this meant in practice sacrificing the poor - the great multitude - in Africa and Asia.

We do not know what the effects of a 4°C rise will be - whether it will mean the extinction of the human race, or great inconvenience. Society can adapt to a great deal of change; and knowledge of how to respond increases continually. Global warming has benefits; reducing the temperature contrast between the poles and the tropics, for example, might be one of them. Our policies should focus on promoting energy efficiency, innovative energy storage and developing shale gas, and should drop expensive, uncertain technologies such as biofuels, wind and solar generation. Above all we should link any increases in carbon tax to actual increases in global temperatures.

More HERE



GREENIE ROUNDUP FROM AUSTRALIA

'The unaffordable energy capital of the world': Tony Abbott blames green companies for increasing power prices in Australia

Three current articles below

Tony Abbott has hit out at the green energy sector claiming the renewable energy target (RET) is the cause of rising energy prices in Australia.

The Prime Minister said the country is well on its way to being 'the unaffordable energy capital of the world' and that's the reason for the government's review of the RET, reports The Financial Review.

'We should be the affordable energy capital of the world, not the unaffordable energy capital of the world and that’s why the carbon tax must go and that’s why we’re reviewing the RET,' he told the publication.

Clean energy companies have responded to these claims saying Mr Abbott completely exaggerated the impact that the target would have, and in the long run the nation would be better off financially and environmentally from the scheme.

The RET currently states that by 2020, 20 percent of energy should come from renewable sources, however this could be subject to change under the government's upcoming review.

In the Senate next week the government will try to abolish the carbon tax, but opposition leader Bill Shorten has vowed to continue the crusade for action against climate change.

Clive Palmer is set to block the government from lowering or abandoning the RET until after the election in 2016.

Infigen, Pacific Hydro, Senvion and the Clean Energy Council are all among the companies who have disagreed with the Prime Minister's comments, and a spokesperson for Senvion said if the RET is kept in place the price of power bills will drop off by 2020.

Clean Energy Council director Russell March agreed, claiming the only other alternative to the target is a switch to gas-fired power, but the price of that resource is on the up.

The consensus in the renewable energy industry is that power prices will drop as more forms of renewable energy are being utilised, with some companies citing the decrease in power bills around the $50 mark.

This week saw the Crawford Australian Leadership Forum take place in Canberra, and economists from around the world including Nobel Prize recipient Joseph Stiglitz and former Reserve Bank of Australia board member Warwick McKibbin were among the experts calling for Australia to have a price on carbon, according to AFR.

Professor Stiglitz described putting a price on carbon as a 'no-brainer' and said it is more practical than taxing labour or capital, plus it would set Australia up for the future.

By pricing carbon now Australia would be taking a step forward to combating climate change he said, and the world would soon follow.

Aluminium refineries, which are currently said to be 90 percent exempt from paying for renewable energy, are also a big player in the RET debate.

The government is expected to make a move from the backbench to completely clear the refineries from paying for any form of green energy.

SOURCE

Australia: Power price hikes bite in Queensland

QUEENSLANDERS face a dramatic hike in power bills with the start of the new financial year, and households with solar panels are also likely to take a hit to the hip pocket.

The average power bill is expected to rise by $191, or 13.6 per cent, pushed up by green policies and the increasing cost of poles, wires, and electricity generation.

However, prices will only go up by about 5.1 per cent if the federal government's carbon tax is repealed.

Queensland's Energy Minister Mark McArdle has blamed much of the hike on the former Labor government's over-investment in the power distribution network.

"Every power bill that is issued, 54 per cent of that bill relates to the cost of poles and wires - the gold-plated legacy of Labor that we're now having to unravel," Mr McArdle told ABC radio.

Pensioners and seniors will be able to apply for an electricity rebate of $320 after the government upped concessions to $165 million for this financial year.

"The Queensland government promised to lower the cost of living wherever we could and we're making sure that pensioners and other vulnerable Queenslanders get some relief on household costs," Mr McArdle said.

Consumers are forking out 50 per cent more for electricity than they did three years ago, and shadow treasurer Curtis Pitt says price hikes under the Newman government total $560.

"Campbell Newman arrogantly promised to lower Queenslanders' electricity bills, yet ever since he's become premier they've just gone up and up and up," he said.

This financial year, about 50,000 homeowners who have solar panels will no longer be guaranteed a feed-in tariff of eight cents.

Government-owned distributors will no longer be responsible for paying the tariff and households will have to negotiate directly with electricity retailers for the price they are paid for the solar power they generate.

The 44 cent tariff, paid to some 284,000 people who were first to sign up to the scheme, will remain unchanged.

Australian Solar Council chief executive John Grimes says consumers need to shop around, or join forces to negotiate as a block with electricity retailers.

"As an independent customer, with an average-size system on your roof, you really have little leverage when talking to a utility," Mr Grimes told ABC radio.

SOURCE

Motorized climate change??

ADVOCATES for action against climate change do themselves few favours when they turn legitimate concerns into outright political propaganda.

Current editions of the official NSW government handbook for learner drivers carry a bizarre warning about the future risks of climate change, claiming that a changed climate could cause "unpredictable weather events” due to "greenhouse gas emissions”.

The excuse for including this information is that drivers should beware of taking to the roads in extreme conditions brought about by climate change.

This is more than a little absurd. If this approach was taken to logical extremes, we could see government climate change warnings attached to almost every conceivable human activity.

Impressively, state Coalition government roads minister Duncan Gay recognises the warning for the political sloganeering that it is and has vowed to cut the lines in future editions of the handbook.

The public might be more inclined to listen to climate activists if the activists’ messages were more realistic and less evangelical.

Which brings us to Scott Ferguson of Haberfield, who brought this to The Daily Telegraph’s attention after a copy of the handbook was given to his young daughter Riley.

"I haven’t been this annoyed since Riley’s old primary school made her sit in scripture class,” Mr Ferguson said. That’s a very good comparison. When climate change activists take their views to extremes, they sound more like religious zealots than like advocates for a better planet.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************

2 comments:

Olaf Koenders said...

Hi John. Here's an image you can put alongside the "CRU graph" that you use as the header at:

antigreen.blogspot.com.au

It shows a virtually straight line from 1880.

https://lh6.ggpht.com/OFp8pKk1YHUhYKg7pmM80YUMB_k8bkfa6W0sC-dWoqP6nqkJrFfwlMed1Dfw09CXq7Uj

Thanks mate.

Olaf Koenders
Albury

King of the Road said...

On the other hand, here's an analogy: http://hamiltonianfunction.blogspot.com/2014/07/its-all-in-presentation.html