Tuesday, August 01, 2017



Britain’s energy policy keeps picking losers

The public have paid the price for years of missteps: it’s time to scrap Hinkley Point C [nuclear] and support the shale revolution

Shortly before parliament broke up this month, there was a debate on a Lords select committee report on electricity policy that was remarkable for its hard-hitting conclusions. The speakers, and signatories of the report, included a former Labour chancellor, Tory energy secretary, Tory Scottish secretary, cabinet secretary, ambassador to the European Union and Treasury permanent secretary, as well as a bishop, an economics professor, a Labour media tycoon and a Lib Dem who was shortlisted for governor of the Bank of England.

Genuine heavyweights, in short. They were in general agreement: energy policy is a mess, decarbonisation has been pursued at the expense of affordability and, in particular, the nuclear plant at Hinkley Point C in Somerset is an expensive disaster. Their report came out before the devastating National Audit Office report on Hinkley, which said the government had “locked consumers into a risky and expensive project [and] did not consider sufficiently the risks and costs to the consumer”.

Hinkley is but the worst example of a nationalised energy policy of picking losers. The diesel fiasco is another. The wind industry, with its hefty subsidies paid from the poor to the rich to produce unreliable power, is a third. The biomass mess (high carbon, high cost and environmental damage) is a fourth.

The liberalised energy markets introduced by Nigel Lawson in 1982, embraced by the Blair government and emulated across Europe, delivered both affordability and reliability. But they were abandoned and, in the words of the Lords committee, “a succession of policy interventions has led to the creation of a complex system of subsidies and government contracts at the expense of competition. Nobody has built a power station without some form of government guarantee since 2012.”

All three parties share the blame. Labour’s Climate Change Act of 2008 made Britain the only country with mandatory decarbonisation targets, a crony-capitalist’s dream. The Lib Dems who ran the energy department for five years, Chris Huhne and Ed Davey, negotiated the disastrous Hinkley contract. The Tories reviewed the decision in 2016, by which time it was clear we had managed the unique feat of finding a technology that was untested yet already obsolete. They decided to go ahead anyway, missing the chance to blame the other parties for it. As the energy analyst Peter Atherton put it, the three parties “have managed to design possibly the most expensive programme for delivering nuclear power we could have come up with”.

The chief Lib Dem mistake was to ignore the shale gas and oil revolutions under way in America and assume that fossil fuel prices would rise from already high levels. By 2011, influenced by peak-oil nonsense and lobbied by professors of “sustainability”, the department of energy and climate change was projecting that the oil price would be between $97 and $126 per barrel in 2017. Today it is about $50 a barrel, roughly half the lowest of the 2011 projections. Gas prices were expected to be about 76p per therm by now, whereas they are actually about half that: 37p.

The shale revolution is gathering pace all the time. Britain has very promising shales and could prosper and cut emissions if it joins in, so let us hope the first wells about to be drilled in Lancashire by Cuadrilla, against the determined opposition of wealthy, middle-class protesters, prove successful. (No, I don’t have a commercial interest in shale.)

This forecasting mistake is behind much of the rising cost of Hinkley. In 2015 the whole-life cost of its power was expected to be £14 billion. Now it is £50 billion. Because consumers are on the hook to pay the difference between the wholesale price of electricity and the “strike price” for Hinkley, we must hope that the project is badly delayed, because that way our children will at least spend fewer years paying inflated electricity prices.
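
The "strike price" arrangement is a two-way contract for difference: when the wholesale price is below the agreed strike price, consumers top up the difference on every unit generated, and when it is above, the generator pays money back. A minimal Python sketch of that arithmetic is below; the £92.50/MWh strike price is the widely reported 2012-prices figure for Hinkley, while the wholesale price and annual output are purely illustrative assumptions.

    # Minimal sketch of the two-way contract-for-difference (CfD) top-up that
    # consumers pay on Hinkley's output. Figures are illustrative assumptions.

    def annual_topup(strike_gbp_per_mwh, wholesale_gbp_per_mwh, output_twh):
        """Yearly consumer top-up (GBP) under a two-way CfD.

        Below the strike price, consumers pay the gap on every MWh generated;
        above it, the generator pays money back (the top-up goes negative).
        """
        gap = strike_gbp_per_mwh - wholesale_gbp_per_mwh
        return gap * output_twh * 1_000_000  # 1 TWh = 1,000,000 MWh

    # Assumed inputs: £92.50/MWh strike price (2012 prices), £45/MWh wholesale
    # price, and roughly 25 TWh of output a year.
    print(f"£{annual_topup(92.50, 45.0, 25):,.0f} per year")  # about £1.2 billion

On those assumptions the top-up alone runs to more than £1 billion a year, which is why the gap between forecast and actual wholesale prices matters so much.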

These bad forecasts, widely criticised at the time, make all strike prices horribly expensive, for onshore and offshore wind and solar as well. Lib Dem ministers kept saying at the time that subsidies for renewables and Hinkley would protect the consumer against “volatile” gas prices. Yes, they have done so: by guaranteeing high prices. Oh for a little downward volatility!

Britain’s industrial and commercial users now have some of the highest electricity prices in the developed world, costs that find their way to households through the cost of living and through downward pressure on wages. American industry pays about half as much for its electricity as we do, and everyone there benefits. Energy prices are not just any consumer price: they determine the prosperity of the entire economy.

Well, no use crying over spilt future money. What are we to do? Here is where it could get interesting. Almost nobody wants Hinkley to go ahead, apart from the contractors who get to build it. EDF and Areva, the French owner and developer, are in trouble over the only two comparable reactors in Europe. The one at Flamanville is still to start working, many years behind schedule. The French unions want Hinkley cancelled. Lord Howell of Guildford, the former energy secretary, wisely pointed out in the Lords that the key player is China, a partner in the project. The government’s excuse for revisiting Hinkley last year was partly worries about security, rather than cost. This was a silly worry and bad diplomacy. However, it is not clear China wants to go ahead, and subtle negotiation could tease this out. The great prize for China was regulatory approval through Britain’s gold-standard “generic design assessment” process, which could unlock foreign markets and give a green light for a Chinese-built reactor at Bradwell in Essex.

But Lord Howell says the Chinese increasingly realise that the Hinkley design is a dead end, as costs escalate and delays grow. And they know that the future for nuclear power must lie in smaller, modular units, mass-manufactured like cars rather than assembled from scratch like Egyptian pyramids. Their “Nimble Dragon” design could slot into both the Hinkley and Bradwell sites, perhaps beside the larger Hualong design.

Cancellation would cost some £20 billion. But if the initiative comes from Beijing it is just possible that some new arrangement could be salvaged from the certain wreckage of the EDF scheme, without serious damage to livelihoods or to our relations with China.

SOURCE





Man Made Warming from Adjusting Data

Roger Andrews does a thorough job analyzing the effects of adjustments upon Surface Air Temperature (SAT) datasets. His article at Energy Matters is Adjusting Measurements to Match the Models – Part 1: Surface Air Temperatures. Excerpts of text and some images are below.  The whole essay is informative and supports his conclusion:

In previous posts and comments I had said that adjustments had added only about 0.2°C of spurious warming to the global SAT record over the last 100 years or so – not enough to make much difference. But after further review it now appears that they may have added as much as 0.4°C.

The current GISS series shows about 0.3°C more global warming than the old version, with about 0.2°C more warming in the Northern Hemisphere and about 0.5°C more in the Southern. The added warming trends are almost exactly linear except for the downturns after 2000, which I suspect (although can’t confirm) are a result of attempts to track the global warming “pause”. How did GISS generate all this extra straight-line warming? It did it by replacing the old unadjusted records with “homogeneity-adjusted” versions.

The homogenization operators used by others have had similar impacts, with Berkeley Earth Surface Temperature (BEST) being a case in point. Figure 3, which compares warming gradients measured at 86 South American stations before and after BEST’s homogeneity adjustments (from Reference 1), visually illustrates what a warming-biased operator does at larger scales. Before homogenization 58 of the 86 stations showed overall warming, 28 showed overall cooling and the average warming trend for all stations was 0.54°C/century. After homogenization all 86 stations show warming and the average warming trend increases to 1.09°C/century.
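
To make that before-and-after comparison concrete, here is a toy sketch with synthetic station data. It illustrates only the arithmetic of the comparison, not BEST's actual homogenisation algorithm: fit a least-squares trend to each record, apply a uniformly warming-biased adjustment, and compare the station-average trends and the warming/cooling split.

    # Toy illustration (not BEST's actual homogenisation method): per-station
    # linear trends before and after a deliberately warming-biased adjustment.
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2011)

    def trend_per_century(series):
        """Least-squares slope in degrees C per century."""
        return np.polyfit(years, series, 1)[0] * 100

    # Synthetic "raw" records: a mix of warming and cooling stations plus noise.
    raw = [0.005 * rng.uniform(-1, 1.5) * (years - 1900)
           + rng.normal(0, 0.3, years.size)
           for _ in range(86)]

    # A warming-biased "adjustment": tilt every record up by 0.5 C/century.
    adjusted = [r + 0.005 * (years - 1900) for r in raw]

    raw_trends = [trend_per_century(r) for r in raw]
    adj_trends = [trend_per_century(a) for a in adjusted]
    print(f"before: {np.mean(raw_trends):+.2f} C/century, "
          f"{sum(t > 0 for t in raw_trends)}/86 stations warming")
    print(f"after:  {np.mean(adj_trends):+.2f} C/century, "
          f"{sum(t > 0 for t in adj_trends)}/86 stations warming")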

The adjusted “current” GISS series match the global and Northern Hemisphere model trend line gradients almost exactly but overstate warming relative to the models in the Southern (although this has only a minor impact on the global mean because the Southern Hemisphere has a lot less land and therefore contributes less to the global mean than does the Northern). But the unadjusted “old” GISS series, which I independently verified with my own from-scratch reconstructions, consistently show much less warming than the models, confirming that the generally good model/observation match is entirely a result of the homogeneity adjustments applied to the raw SAT records.

Summary

In this post I have chosen to combine a large number of individual examples of “data being adjusted to match it to the theory” into one single example that blankets all of the surface air temperature records. The results indicate that warming-biased homogeneity adjustments have resulted in current published series overestimating the amount by which surface air temperatures over land have warmed since 1900 by about 0.4°C (Table 1), and that global surface air temperatures have increased by only about 0.7°C over this period, not by the ~1.1°C shown by the published SAT series.

Land, however, makes up only about 30% of the Earth’s surface. The subject of the next post will be sea surface temperatures in the oceans, which cover the remaining 70%. In it I will document more examples of measurement manipulation malfeasance, but with a twist. Stay tuned.

SOURCE




The 6 biggest reasons I’m a climate-change skeptic — and why you should be a skeptic too

For nearly 30 years, some scientists and many liberal activists have been alleging that the world is on the verge of collapse because of humans’ use of fossil fuels, which they say have been causing global warming.

For example, the San Jose Mercury News (Calif.) reported on June 30, 1989: “A senior environmental official at the United Nations, Noel Brown, says entire nations could be wiped off the face of the earth by rising sea levels if global warming is not reversed by the year 2000. Coastal flooding and crop failures would create an exodus of ‘eco-refugees,’ threatening political chaos, said Brown, director of the New York office of the U.N. Environment Program. He said governments have a 10-year window of opportunity to solve the greenhouse effect before it goes beyond human [control].”

But despite the constant cries from the left proclaiming the “science is settled” and that there’s a “scientific consensus,” there are many reasons to reject these assumptions. Here are six of the most important ones:

1. Climate alarmists’ temperature-predicting track record is abysmal.
Most people don’t know anything about climate science, and with all that’s going on in the world, who can blame them? Instead of studying the issue for themselves, people rely on the media and the scientists the media has promoted to provide them with scientific conclusions. In other words, to the extent the public believes in the theory that humans are responsible for global warming, it’s because they trust the scientists and media outlets they hear from most often on this issue. But should they? Based on climate-alarmist scientists’ track record, the answer is clearly “no.”

Over the past three decades, many climate scientists have repeatedly made a number of significant and alarming predictions about global warming, and the vast majority of the time, they’ve been wrong—really, really wrong. As Roy Spencer—who earned his Ph.D. in meteorology from the University of Wisconsin in 1981 and previously served as the senior scientist for climate studies at NASA’s Marshall Space Flight Center—wrote in 2014, greater than 95 percent of the climate models through 2013 “over-forecast the warming trend since 1979.”
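
What "over-forecast the warming trend since 1979" means operationally is simple to state: fit a linear trend to each model run over 1979–2013, fit the same trend to the observations, and count how many model trends come out steeper. The sketch below does exactly that, but with synthetic stand-in data, since the real model archive is not reproduced here.

    # A minimal sketch of the trend comparison behind the "over-forecast" claim.
    # The series here are synthetic stand-ins, not real model or observed data.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(1979, 2014)

    def trend(series):
        return np.polyfit(years, series, 1)[0]   # degrees C per year

    observed = 0.013 * (years - 1979) + rng.normal(0, 0.1, years.size)
    models = [0.025 * rng.uniform(0.5, 1.5) * (years - 1979)
              + rng.normal(0, 0.1, years.size)
              for _ in range(90)]

    obs_trend = trend(observed)
    over = sum(trend(m) > obs_trend for m in models)
    print(f"{over}/{len(models)} model runs show a steeper 1979-2013 trend "
          f"than the (synthetic) observations")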

2. Climate alarmists’ predictions about extreme weather and other crises have also failed.
It’s common for climate alarmists to argue that global warming has caused and will continue to cause a significant increase in extreme weather events, including hurricanes, and that sea levels will eventually rise to the point that massive cities will someday be flooded and uninhabitable, but the available data say otherwise.

H. Sterling Burnett, Ph.D., a research fellow specializing in environment and climate issues for The Heartland Institute, where I work as executive editor, wrote in January for Red State, “For instance, climate models predicted more intense hurricanes, but for nearly a decade, the United States has experienced far fewer hurricanes making landfall than the historic average, and those hurricanes that have made landfall have been no more powerful than previously experienced.”

“Additionally,” Burnett continued, “while scientists have claimed anthropogenic warming should cause sea levels to rise at increasing rates—because of melting ice caps in Greenland and Antarctica and the thermal expansion of water molecules under warmer conditions—sea-level rise has slowed. Sea levels have always risen between ice ages or during interglacial periods. Indeed, sea levels have risen more than 400 feet since the end of the last interglacial period. However, the rate of sea-level rise since 1961 (approximately one-eighth of an inch per year) is far lower than the historic average (since the end of the previous ice age), and sea-level rise has not increased appreciably over the past century compared to previous centuries.”

3. There are many unexplainable problems with the theory rising carbon-dioxide levels have caused global temperature to increase.
One of the most common misconceptions in the climate-change debate is that skeptics reject the claim that global temperatures have risen in recent decades. Virtually everyone agrees temperatures have increased; the primary issue is the reason or reasons for those increases. Climate-change alarmists say humans are to blame, and skeptics believe, to varying degrees, that humans’ responsibility is relatively minimal or nonexistent. One of the reasons, but not the only reason, many skeptics have rejected the assertion that carbon dioxide and temperature are linked is that there have been periods during the past two centuries in which global temperature has dropped or paused.

For instance, from the 1940s to the 1970s, Earth experienced a global cooling period, even while carbon-dioxide levels continuously rose. In the early 21st century, global temperature “paused” for 18 years, again during a period in which carbon-dioxide levels increased.

4. It’s not clear the most widely used climate data are accurate.
For many years, climate skeptics, concerned by numerous leaked documents showing climate data had been unscientifically altered to make it appear as though warming had been more significant than it actually was, have argued many of the climate datasets advanced by prominent organizations, including NASA, are not accurate. A new peer-reviewed study by prominent researchers James P. Wallace III, Joseph S. D’Aleo and Craig Idso seems to support that belief.

In their study, titled “On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data and the Validity of EPA’s CO2 Endangerment Finding,” the researchers “sought to validate the current estimates of GAST [global average surface temperature] using the best available relevant data,” the authors wrote. “This included the best documented and understood data sets from the U.S. and elsewhere as well as global data from satellites that provide far more extensive global coverage and are not contaminated by bad siting and urbanization impacts.”

They concluded—by comparing trusted raw climate data with the widely used altered datasets, which have been adjusted to account for numerous problems, such as contamination from heat in urban areas—the datasets used by NASA, the National Oceanic and Atmospheric Administration and the Met Office in the United Kingdom “are not a valid representation of reality.”

“In fact, the magnitude of their historical data adjustments, that removed their cyclical temperature patterns, are totally inconsistent with published and credible U.S. and other temperature data,” the researchers wrote. “Thus, it is impossible to conclude from the three published GAST data sets that recent years have been the warmest ever — despite current claims of record setting warming.”

5. Even if humans are creating a slightly warmer climate, it’s not necessarily a bad thing.
The assumption under which virtually all climate alarmists operate is that the warming Earth is experiencing now is harmful, destructive and dangerous, but there is much evidence to suggest that moderate warming benefits most plants, animals and humans. We know, for instance, that plants grow significantly better with higher carbon-dioxide concentrations, which is why many greenhouses pump additional CO2 into their buildings.

It’s also been confirmed by multiple studies that greening has increased in recent decades — and likely because of higher carbon-dioxide concentrations. According to a study by Martin Brandt et al., published in the journal Nature Ecology & Evolution in May, 36 percent of the continent of Africa became greener over the 20-year period from 1992 to 2011, while only 11 percent became “less green.” Interestingly, the researchers found the increased greening was likely “driven” by higher carbon-dioxide levels and precipitation, and the decreased greening was largely a result of humans cutting down vegetation.

A greener planet means there is more food for humans and animals to consume, but a cooler global climate has historically been associated with significant food shortages and, in extreme cases, starvation. An article in the influential journal The Lancet, published in 2015, examined health data from 13 countries, accounting for more than 74 million deaths. The authors concluded cold weather, directly or indirectly, kills 1,700 percent more people than hot weather.

6. There’s no reason to believe humans won’t develop cheap energy alternatives during the next century.
Let’s assume the climate is warming because of human action and will eventually become problematic. The most serious problems are still a century or more away, even under some of the most dire, scientifically unsupported models. That means the world has at least a half-century to come up with alternate energy sources and determine once and for all whether fossil-fuels are truly causing the problem.

A century ago, civilized nations were still fighting each other on horseback and traveling using steam engines. Fifty years ago, cellphones were the stuff of science fiction. Thirty years ago, the average American household didn’t have a computer. Today, people fly across the world in a few hours on planes equipped with Wi-Fi, allowing them to access a nearly endless supply of news, information, and entertainment using pocket-sized supercomputers. Does anyone really think energy technology won’t change over the next century as well?

Being a climate-change skeptic doesn’t mean you deny that Earth’s climate has warmed or that you reject scientific findings. It simply means that you let facts, not speculation and fear-mongering, guide how you view the debate. If that sounds reasonable, then you’re probably a climate-change skeptic too.

SOURCE




Study blows 'greenhouse theory out of the water'

'All observed climatic changes have natural causes completely outside of human control'

A new scientific paper contends the entire foundation of the man-made global-warming theory – the assumption that greenhouse gases warm the atmosphere by trapping heat – is wrong.

If confirmed, the study’s findings would crush the entire “climate change” movement to restrict CO2 emissions, the authors assert.

Some experts contacted by WND criticized the paper, while others advised caution.  Still others suggested that the claimed discovery represents a massive leap forward in human understanding – a “new paradigm.”

The paper argues that concentrations of CO2 and other supposed “greenhouse gases” in the atmosphere have virtually no effect on the earth’s temperature. They conclude the entire greenhouse gas theory is incorrect.

Instead, the earth’s “greenhouse” effect is a function of the sun and atmospheric pressure, which results from gravity and the mass of the atmosphere, rather than the amount of greenhouse gases such as CO2 and water vapor in the atmosphere.

The same is true for other planets and moons with a hard surface, the authors contend, pointing to the temperature and atmospheric data of various celestial bodies collected by NASA.

So precise is the formula, the authors of the paper told WND, that, by using it, they were able to correctly predict the temperature of other celestial bodies not included in their original analysis.

The paper

The paper, published recently in the journal “Environment Pollution and Climate Change,” was written by Ned Nikolov, a Ph.D. in physical science, and Karl Zeller, retired Ph.D. research meteorologist.

The prevailing theory on the earth’s temperature is that heat from the sun enters the atmosphere, and then greenhouse gases such as CO2, methane and water vapor trap part of that energy by preventing it from escaping back into space.

That theory, which underpins the anthropogenic global-warming hypothesis and the climate models used by the United Nations, was first proposed and developed in the 19th century.

However, the experiments on which it was based involved glass boxes that retain heat by preventing the mixing of air inside the box with air outside the box.


The experiment is not analogous to what occurs in the real atmosphere, which does not have walls or a lid, according to Nikolov and Zeller.

The new paper, headlined “New Insights on the Physical Nature of the Atmospheric Greenhouse Effect Deduced from an Empirical Planetary Temperature Model,” argues that greenhouse theory is incorrect.

“This was not a pre-conceived conclusion, but a result from an objective analysis of vetted NASA observations,” Nikolov told WND.

The real mechanisms that control the temperature of the planet, they say, are the sun’s energy and the air pressure of the atmosphere. The same applies to other celestial bodies, according to the scientists behind the paper.

To understand the phenomena, the authors used three planets – Venus, Earth and Mars – as well as three natural satellites: the Moon of Earth, Titan of Saturn and Triton of Neptune.

They chose the celestial bodies based on three criteria: having a solid surface, representation of a broad range of environments, and the existence of reliable data on temperature, atmospheric composition and air pressure.

“Our analysis revealed a poor relationship between global mean annual temperature and the amount of greenhouse gases in planetary atmospheres across a broad range of environments in the Solar System,” the paper explains.

“This is a surprising result from the standpoint of the current Greenhouse theory, which assumes that an atmosphere warms the surface of a planet (or moon) via trapping of radiant heat by certain gases controlling the atmospheric infrared optical depth,” the study continues.


The paper outlines four possible explanations for those observations, and concludes that the most plausible was that air pressure is responsible for the greenhouse effect on a celestial body.

In essence, what is commonly known as the atmospheric “greenhouse” effect is in fact a form of compression heating caused by total air pressure, the authors told WND in a series of e-mails and phone interviews, comparing the mechanics of it to the compression in a diesel engine that ignites the fuel.
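
The diesel-engine analogy can at least be put in standard thermodynamic terms. For a rapid (adiabatic, ideal-gas) compression, T2 = T1·r^(γ−1), where r is the compression ratio and γ ≈ 1.4 for air; the sketch below applies that textbook formula to a typical diesel engine. It illustrates the analogy only, not the planetary model proposed in the paper.

    # Textbook adiabatic compression heating, the mechanism the diesel analogy
    # refers to. This is standard ideal-gas thermodynamics, not the paper's model.

    def adiabatic_compression_temp(t1_kelvin, compression_ratio, gamma=1.4):
        """Final temperature after reversible adiabatic compression of an ideal gas."""
        return t1_kelvin * compression_ratio ** (gamma - 1)

    t2 = adiabatic_compression_temp(293.0, 18.0)   # intake air at 20 C, ratio ~18:1
    print(f"{t2:.0f} K (~{t2 - 273.15:.0f} C) -- hot enough to ignite diesel fuel")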

And that effect is completely independent of the so-called “greenhouse gases” and the chemical composition of the atmosphere, they added.

“Hence, there are no greenhouse gases in reality – as in, gases that can cause warming,” Nikolov said when asked to explain the paper in layman’s terms.

“Humans cannot in principle affect the global climate through industrial emissions of CO2, methane and other similar gases or via changes in land use,” he added. “All observed climatic changes have natural causes that are completely outside of human control.”

For the first time, Nikolov said, there is now empirical evidence from NASA data that the greenhouse effect of the atmosphere is not caused by the trapping of heat, but by the force of atmospheric pressure.

The pressure is the weight of the atmosphere, he added.

And the combination of gravity and the mass of the atmosphere explains why the Earth, for example, is warmer than the moon.

“The moon receives about the same amount of heat from the sun as Earth, yet it is 90 degrees [Celsius] colder than the Earth, because it has no atmosphere,” Nikolov explained.
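
For context, the textbook zero-atmosphere baseline in this argument is the radiative-equilibrium temperature from the Stefan-Boltzmann law, T_eff = (S(1 − albedo)/4σ)^¼. The sketch below computes it with standard textbook values for Earth and the Moon; the roughly 90-degree figure quoted above is the gap between the two observed means, which is larger than the gap the uniform-temperature formula gives.

    # Standard zero-atmosphere radiative-equilibrium temperature. Values are
    # textbook/NASA figures; this is context for the Moon comparison above,
    # not the empirical model in the Nikolov-Zeller paper.
    SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
    S = 1361.0           # solar constant at 1 AU, W m^-2 (same for Earth and Moon)

    def t_eff(albedo):
        return (S * (1 - albedo) / (4 * SIGMA)) ** 0.25

    print(f"Earth: T_eff ~ {t_eff(0.30):.0f} K vs observed mean ~288 K")
    print(f"Moon:  T_eff ~ {t_eff(0.11):.0f} K vs observed mean ~197 K")
    # The ~90 K figure quoted above is the difference between the two observed
    # means (288 K - 197 K); the uniform-disk T_eff values give a smaller gap.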

What it all means for science and the climate debate

This is not the first paper to reject the greenhouse-gas theory entirely.

In 2009, for example, Gerhard Gerlich and Ralf Tscheuschner published a paper titled “Falsification of the Atmospheric CO2 Greenhouse Effects Within The Frame Of Physics” in the International Journal of Modern Physics.

They wrote that the “atmospheric greenhouse effect” that “is still supported in global climatology” basically “describes a fictitious mechanism.” The second law of thermodynamics, they said, shows that such a mechanism “can never exist.”

However, their paper did not propose a mechanism to explain the higher temperature of Earth relative to the moon.

The new paper by Nikolov and Zeller does propose such a mechanism – atmospheric pressure.

If correct, the implications of the discovery would be enormous, multiple scientists told WND.

For one, it means the climate projections used to forecast warming doom and justify a wide range of policies are completely wrong.

That is because they were produced by computer models built around a “physically deeply flawed concept, the radiative greenhouse theory,” said Nikolov, who works as a federal scientist but did the new study completely on his own time.

“One major implication of our recently published study is that there is indeed a fundamental problem with the physics of current radiative greenhouse concept,” he told WND, highlighting the origin of the “inaccurate” theory in two 19th century papers.

“The foundation of the greenhouse theory was born of an assumption, it was never shown experimentally, and our results show this is completely wrong,” Nikolov said. “Our study blows the greenhouse theory completely out of the water. There is nothing left.”

“Hence, the public debate on climate needs now to shift focus to the fact that the basic science concept underlying current climate projections by the UN [Intergovernmental Panel on Climate Change] IPCC and other international bodies is physically flawed,” Nikolov added, saying the new findings require a “fundamental overhaul of climate science” and that Earth may be heading for a cooling period.

“This is what the data shows,” he said. “We didn’t start with a theory, we started with the data, which is the opposite of how the greenhouse theory came about.”

The greenhouse theory, Nikolov explained, is based on the assumption that a free convective atmosphere – an atmosphere with no “lid” on it – can trap heat.

“This was an assumption born out of a misinterpretation of experiments involving glass boxes in the early 19th century by Joseph Fourier, a French mathematician,” he said.

“Glass boxes get warmer inside when exposed to the sun not because they trap long-wave radiation, as thought by Fourier, but because they hamper the exchange of air between the inside of a box and the outside environment,” he added.

Next came Svante Arrhenius, a Swedish scientist, who assumed Fourier was correct and in 1896 created an equation to calculate the Earth’s temperature based on CO2 in the atmosphere.

“This equation is both mathematically and physically wrong,” argued Nikolov. “Yet, this paper is still cited as ‘evidence’ that the physics of the greenhouse effect have been well-known for over 100 years.”

More HERE





Australian Greens plan to curb property investment

There are actually some good points in the plan.  Reverting to inflation adjustment for assessing capital gains rather than giving a fixed 50% discount is much fairer, though more complex to administer.

The attack on negative gearing is very unrealistic, however. It would simply prevent a lot of property investment from occurring, so it would constitute no gain to the Treasury while reducing the supply of rental accommodation. But it is mostly the poor who rent, so the plan would hit the poor while trying to hit the rich. Maybe that scenario appeals to the elitist Greens.


The Australian Greens are preparing to unveil the most ambitious plan yet to get young people into homes, costed at an extraordinary $51 billion. The $51 billion figure is a net saving to the budget rather than a cost, calculated over 10 years by the Parliamentary Budget Office.

The three-point plan, Houses for Young People: Freeing up Investment Properties, would phase out the capital gains tax discount available to property investors over five years.

During the first year, the standard 50 per cent discount on capital gains tax would shrink to 40 per cent, falling to 10 per cent after four years and to zero after five.

Income from capital gains would then be taxed at almost the same rate as income from other sources, except that the inflation component would be tax exempt, as it was before 1999, when the Howard government replaced the exemption with a 50 per cent discount.
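
A hypothetical worked example makes the difference between the two treatments clear; all figures below (purchase price, sale price, cumulative inflation, marginal rate) are assumptions for illustration only.

    # Hypothetical comparison of the two capital-gains treatments described
    # above: the current 50 per cent discount versus pre-1999-style indexation
    # of the cost base. All figures are illustrative assumptions.

    purchase_price = 500_000      # AUD, bought some years ago
    sale_price     = 800_000      # AUD, sold today
    cpi_growth     = 0.20         # 20% cumulative inflation over the holding period
    marginal_rate  = 0.37         # investor's marginal income-tax rate

    nominal_gain = sale_price - purchase_price                 # 300,000

    # Current rule: half the nominal gain is added to taxable income.
    tax_discount = 0.5 * nominal_gain * marginal_rate          # 55,500

    # Indexation: the cost base is inflated by CPI, the full real gain is taxed.
    indexed_cost_base = purchase_price * (1 + cpi_growth)      # 600,000
    real_gain = sale_price - indexed_cost_base                 # 200,000
    tax_indexed = real_gain * marginal_rate                    # 74,000

    print(f"50% discount:  tax ≈ ${tax_discount:,.0f}")
    print(f"indexation:    tax ≈ ${tax_indexed:,.0f}")

With moderate inflation and a large nominal gain, indexation taxes more of the gain than the flat 50 per cent discount, which is where the extra revenue described in the next paragraph comes from.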

Reverting to the original means of compensating investors for inflation would bring in an extra $2.75 billion over four years and $16.1 billion over 10 years.

It would make property investment and speculation less attractive, winding back the competition faced by owner-occupiers at auctions.

The plan would also end negative gearing for all new property purchases. Businesses would continue to be able to negatively gear non-property investments.

Landlords would continue to be able to write off property investment costs against property investment income, but not against salaries and other income.

This part of the plan would bring in $2.4 billion over four years and $34.5 billion over 10 years.

The third leg of the plan would limit existing negative gearers to one property. Only 583,000 out of Australia's 1.5 million property investors invest in two or more investment properties.

The deductions available for second or more properties would shrink by one-fifth each year until reaching zero after the fifth year.

The limit would bring in an extra $100 million in tax revenue in the first four years and $1.3 billion over 10 years.

Launching the plan on Saturday, Greens leader Richard Di Natale will say it is "time to dismantle the rigged system that privileges investors and landlords over everybody else".

"Australia is facing a housing crisis. Everyone needs a home where they can feel secure, live comfortably and be part of the community," his speaking notes say. "But this is becoming increasingly difficult for millions of average Australians."

Greens Treasury spokesman Senator Peter Whish-Wilson will say the government has "rigged the tax system to favour wealthy people".

"Negative gearing and capital gains tax discounts have driven house prices sky high, making it easier for wealthy people to buy more homes and harder for first home buyers," he will say. "At the same time, stamp duty raises the price of homes and stops people from moving house, even when they're ready to downsize."

The Greens will also push the Commonwealth government to back state governments that replace stamp duty with land tax.

The plan goes further than the one Labor took to the election that retained negative gearing for all pre-existing investors, no matter how many properties they geared.

Labor proposed halving the capital gains tax discount from 50 per cent to 25 per cent rather than abolishing it and replacing it with indexation.

In the budget Treasurer Scott Morrison wound back some of the excesses of negative gearing by withdrawing deductions for things such as the cost of travel to inspect rented-out properties.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


