
ITER Forum Website update 10/13
B.J. Green (16/10/13)

1. Climate consensus 'skewing' science
GRAHAM LLOYD
THE AUSTRALIAN
SEPTEMBER 21, 2013 12:00AM

http://www.theaustralian.com.au/national-affairs/policy/climate-consensus-skewing-science/story-e6frg6xf-1226724080490

CONSENSUS decision-making on climate change has oversimplified the problem and how to solve it, and unduly politicised the process, a leading US climate scientist has said.

Writing in The Weekend Australian today, Judith Curry, professor and chairwoman of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology, said the consensus-building process itself could be a source of bias.

"A strongly held prior belief can skew the total evidence that is available subsequently in a direction that is favourable to itself," Professor Curry said.

"The consensus-building process has been found to generally act in the direction of understating the uncertainty associated with a given outcome."

Professor Curry has led debate in the science community about the process of reviewing climate change, including giving testimony before the US house subcommittee on environment this year, remarking on the many large uncertainties in forecasting future climate.

Australian climate scientist David Karoly, professor of atmospheric science at the University of Melbourne and a review editor of the Intergovernmental Panel on Climate Change's fifth assessment report, said he did not believe uncertainty was underplayed in the IPCC assessments. "There is a thorough and comprehensive consideration of uncertainty in the IPCC reports and in their summaries, including estimates of uncertainties through multiple different approaches," Professor Karoly said.

"It has been reported that uncertainty is referred to 42 times in 31 pages in the leaked draft of the summary of the IPCC report to be released at the end of next week.

"The comprehensive treatment of uncertainty in the IPCC assessment process is much better than in any single or group of peer-reviewed research papers."

Professor Curry said virtually all climate scientists agreed on the basic physics of what impact increased levels of carbon dioxide in the atmosphere would have on temperature.

"If all other things remain equal, more carbon dioxide in the atmosphere will have a warming effect on the planet," Professor Curry said.

"Further, virtually all agree that the planet has warmed over the past century, and that humans have had some impact on the climate.

"But understanding the causes of recent climate change and predicting future change is far from a straightforward endeavour."

Professor Curry has been involved in lively international debate in the lead-up to the release next week of the IPCC's fifth update report. Debate has centred on the inability of many climate models to predict accurately actual global temperature changes.

She said climate scientists did "not need to be consensual to be authoritative".

"Authority rests in the credibility of the arguments, which must include explicit reflection on uncertainties, ambiguities and areas of ignorance and more openness for dissent," she said.

"The role of scientists should not be to develop political will to act by hiding or simplifying the uncertainties, either explicitly or implicitly, behind a negotiated consensus."

Professor Curry has recommended that the scientific consensus-seeking process be abandoned in favour of a more traditional review that presents arguments for and against, and discusses the uncertainties.

Will Steffen, the executive chairman of the Australian National University's Climate Change Institute, who was a commissioner on the Climate Commission that was axed on Thursday, said he believed Professor Curry was mounting a "straw man" argument.

"The emphasis on consensus is overdone," Professor Steffen said. "In fact, the IPCC reports do an excellent job of representing uncertainties. The terminology that the IPCC uses -- such as 'virtually certain', 'very likely', 'likely' etcetera and the very careful definitions that are attached to them, are an excellent way of communicating the uncertainties around various aspects of climate change, and are a significant improvement over the traditional way of discussing uncertainties in the scientific literature.

"As for being 'authoritative', the IPCC certainly passes that test with flying colours."
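The calibrated likelihood vocabulary Professor Steffen refers to is defined in the IPCC's uncertainty guidance as explicit probability ranges. A minimal sketch of that mapping — the ranges follow the published guidance, while the `describe` helper is our own illustration, not IPCC tooling:

```python
# IPCC calibrated likelihood language (per the AR5 uncertainty guidance
# note). Each term maps to a probability range for the outcome assessed.
IPCC_LIKELIHOOD = {
    "virtually certain":      (0.99, 1.00),
    "extremely likely":       (0.95, 1.00),
    "very likely":            (0.90, 1.00),
    "likely":                 (0.66, 1.00),
    "about as likely as not": (0.33, 0.66),
    "unlikely":               (0.00, 0.33),
    "very unlikely":          (0.00, 0.10),
    "extremely unlikely":     (0.00, 0.05),
    "exceptionally unlikely": (0.00, 0.01),
}

def describe(p):
    """Return the strongest IPCC term whose range contains probability p.

    Relies on dict insertion order: terms are listed strongest-first.
    """
    for term, (lo, hi) in IPCC_LIKELIHOOD.items():
        if lo <= p <= hi:
            return term
    return "undefined"

print(describe(0.97))  # "extremely likely" — the band used in the AR5 attribution statement
```

The AR5 statement that humans caused more than half of the warming since 1951 is tagged "extremely likely", i.e. at least 95 per cent probability under this scheme.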

2. Consensus distorts the climate picture
JUDITH CURRY
THE AUSTRALIAN
SEPTEMBER 21, 2013 12:00AM

http://www.theaustralian.com.au/opinion/consensus-distorts-the-climate-picture/story-e6frg6zo-1226724019428#

IN February 2007, publication of the Intergovernmental Panel on Climate Change's Fourth Assessment Report (AR4) was received with international acclaim.

The vaunted IPCC process -- multitudes of experts from more than 100 countries examining thousands of refereed journal publications, with hundreds of expert reviewers, across a period of four years -- elevated the authority of the IPCC report to near biblical heights. Journalists jumped on board and even the oil and energy companies neared capitulation.

The veneration culminated with the Nobel Peace Prize, which the IPCC was awarded jointly with former US vice-president Al Gore. At the time, I joined the consensus in supporting this document as authoritative; I was convinced by the rigours of the process. Although I didn't agree with some statements in the document and had nagging concerns about the treatment of uncertainty, I bought into the meme of: "Don't trust what one scientist says; trust the consensus-building process of the IPCC experts."

Six-and-a-half years later, nominally a week before the release of the IPCC Fifth Assessment Report (AR5), substantial criticisms are being made of leaked versions of the report as well as of the IPCC process. IPCC insiders are bemoaning their loss of scientific and political influence. What happened to precipitate this change?

The IPCC was seriously tarnished by the unauthorised release of emails from the University of East Anglia in November 2009 that became known as the Climategate affair. These emails revealed the "sausage-making" involved in the IPCC's consensus-building process, including denial of data access to individuals who wanted to audit its data processing and scientific results, interference in the peer-review process to minimise the influence of sceptical criticisms and manipulation of the media.

Climategate was soon followed by the identification of an egregious error involving the melting of Himalayan glaciers. These revelations were made much worse by the response of the IPCC to these issues. Then came concerns about the behaviour of the IPCC's chairman Rajendra Pachauri and investigations of the infiltration of green advocacy groups into the IPCC. All of this was occurring against a background of explicit advocacy and activism by IPCC leaders related to carbon dioxide mitigation policies.

Although the scientists and institutions involved in Climategate were cleared of charges of scientific misconduct, the scientists and the IPCC did not seem to understand the cumulative impact of these events on the loss of trust in climate scientists and the IPCC process.

The IPCC's consensus-building process relies heavily on expert judgment; if the public and the policymakers no longer trust these particular experts, then we can expect a very different dynamic to be in play with regards to the reception of the AR5 relative to the release of the AR4 in 2007.

THERE is another, more vexing dilemma facing the IPCC, however. Since the publication of the AR4, nature has thrown the IPCC a curveball: there has been no significant increase in global average surface temperature for the past 15-plus years. This has been referred to as a pause or hiatus in global warming.

Almost all climate scientists agree on the physics of the infrared emission of the CO2 molecule and understand that if all other things remain equal, more CO2 in the atmosphere will have a warming effect on the planet. Further, almost all agree that the planet has warmed across the past century and that humans have had some impact on the climate.

But understanding the causes of recent climate change and predicting future change is far from a straightforward endeavour.

The heart of the debate surrounding the IPCC's AR5 is summarised by the graphic on this page that compares climate model projections of global average surface temperature anomalies against observations.

This diagram is Figure 1.4 from the first chapter of an AR5 draft. FAR denotes the First Assessment Report (1990), SAR the second (1995) and TAR the third (2001), which was followed by the AR4 (2007). It is seen that climate models have significantly over-predicted the warming effect of CO2 since 1990, a period during which CO2 concentrations increased from 335 parts per million to more than 400ppm.
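To get a feel for the numbers quoted above, the warming effect of a CO2 rise can be sketched with the standard simplified forcing expression dF = 5.35 ln(C/C0) W/m2. This is a back-of-envelope illustration, not the IPCC's model calculation, and the sensitivity parameter below is an assumed round figure:

```python
import math

# Back-of-envelope: the standard simplified expression for CO2 radiative
# forcing is dF = 5.35 * ln(C / C0) in W/m^2, and a climate sensitivity
# parameter lambda converts forcing to equilibrium warming, dT = lambda * dF.

def co2_forcing(c_ppm, c0_ppm):
    """Simplified CO2 radiative forcing in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

dF = co2_forcing(400.0, 335.0)  # the article's 1990 -> 2013 figures
lam = 0.8                       # assumed K per (W/m^2), roughly 3 K per CO2 doubling

print(f"forcing: {dF:.2f} W/m^2, equilibrium warming: {lam * dF:.2f} K")
```

Even this crude arithmetic shows why a 65 ppm rise, on its own, implies only fractions of a degree of equilibrium warming; the debate is over how models amplify or dampen that baseline response.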

The most recent climate model simulations used in the AR5 indicate that the warming stagnation since 1998 is no longer consistent with model projections even at the 2 per cent confidence level. Based on early drafts of the AR5, the IPCC seemed prepared to dismiss the pause as irrelevant noise associated with natural variability. Apparently the IPCC has been under pressure from reviewers and its policymaker constituency to address the pause specifically.

Here is the relevant text from the leaked final draft of the AR5 summary for policymakers: "Models do not generally reproduce the observed reduction in surface warming trend over the last 10-15 years.

"The observed reduction in warming trend over the period 1998-2012 as compared to the period 1951-2012 is due in roughly equal measure to a cooling contribution from internal variability and a reduced trend in radiative forcing (medium confidence).

"The reduced trend in radiative forcing is primarily due to volcanic eruptions and the downward phase of the current solar cycle. However, there is low confidence in quantifying the role of changes in radiative forcing in causing this reduced warming trend. There is medium confidence that this difference between models and observations is to a substantial degree caused by unpredictable climate variability, with possible contributions from inadequacies in the solar, volcanic and aerosol forcings used by the models and, in some models, from too strong a response to increasing greenhouse-gas forcing."
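The comparison in the quoted passage — a short recent trend set against a long baseline trend — is a simple least-squares computation. A sketch with synthetic data (the series below is invented purely to illustrate the arithmetic, not real observations):

```python
# Ordinary least-squares warming trends over 1951-2012 vs 1998-2012,
# computed on a synthetic anomaly series: a steady underlying trend that
# goes flat after 1998, mimicking the shape of the debate.

def ols_slope(xs, ys):
    """Least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1951, 2013))
# synthetic anomalies: 0.012 K/yr underlying trend, flat from 1998 onward
temps = [0.012 * (min(y, 1998) - 1951) for y in years]

long_trend = ols_slope(years, temps)
recent = [(y, t) for y, t in zip(years, temps) if y >= 1998]
recent_trend = ols_slope([y for y, _ in recent], [t for _, t in recent])

print(f"1951-2012 trend: {long_trend:.4f} K/yr")
print(f"1998-2012 trend: {recent_trend:.4f} K/yr")
```

The example makes the structural point: a flat 15-year segment barely dents a 60-year trend, which is why the SPM can report a reduced short-period trend while the long-period trend stays clearly positive.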

The IPCC acknowledges the pause and admits climate models do not reproduce the pause. I infer from these statements that the IPCC has failed to convincingly explain the pause in terms of external radiative forcing from greenhouse gases, aerosols, solar or volcanic forcing; this leaves natural internal variability as the predominant candidate to explain the pause.

Natural internal variability is associated with chaotic interactions between the atmosphere and ocean. The most familiar mode of natural internal variability is El Nino/La Nina. On longer multi-decadal time scales, there is a network of atmospheric and oceanic circulation regimes, including the Atlantic Multidecadal Oscillation and the Pacific Decadal Oscillation.

The IPCC refers to this as "unpredictable climate variability" in its statement above.

My chain of reasoning leads me to conclude that the IPCC's estimates of the sensitivity of climate to greenhouse gas forcing are too high, raising serious questions about the confidence we can place in the IPCC's attribution of warming in the last quarter of the 20th century primarily to greenhouse gases, and also its projections of future warming. If the IPCC attributes the pause to natural internal variability, then this prompts the question as to what extent the warming between 1975 and 2000 can also be explained by natural internal variability.

Nevertheless, the IPCC concludes in the final AR5 draft of the summary for policymakers: "There is very high confidence that climate models reproduce the observed large-scale patterns and multi-decadal trends in surface temperature, especially since the mid-20th century.

"It is extremely likely that human influence on climate caused more than half of the observed increase in global average surface temperature from 1951-2010.

"Continued emissions of greenhouse gases would cause further warming. Emissions at or above current rates would induce changes in all components in the climate system, some of which would very likely be unprecedented in hundreds to thousands of years."

WHY is my reasoning about the implications of the pause, in terms of attribution of the late 20th-century warming and implications for future warming, so different from the conclusions drawn by the IPCC? The disagreement arises from different assessments of the value and importance of particular classes of evidence as well as disagreement about the appropriate logical framework for linking and assessing the evidence. My reasoning is weighted heavily in favour of observational evidence and understanding of natural internal variability of the climate system, whereas the IPCC's reasoning is weighted heavily in favour of climate model simulations and external forcing of climate change.

I do not expect my interpretation and analysis to be given credence above the IPCC consensus. Rather, I am arguing that the complexity of the problem, acknowledged uncertainties and suspected areas of ignorance indicate several different plausible interpretations of the evidence. Hence ascribing a high confidence level to either of these interpretations is not justified by the available evidence and our present understanding.

How to reason about uncertainties in the complex climate system and its computer simulations is neither simple nor obvious. Biases can abound when reasoning and making judgments about such a complex system, through excessive reliance on a particular piece of evidence, the presence of cognitive biases in heuristics, failure to account for indeterminacy and ignorance, and logical fallacies and errors including circular reasoning.

The politicisation of climate science is another source of bias, including explicit policy advocacy by some IPCC scientists. Further, the consensus-building process can be a source of bias. A strongly held prior belief can skew the total evidence that is available subsequently in a direction that is favourable to itself. The consensus-building process has been found to act generally in the direction of understating the uncertainty associated with a given outcome. Group decisions can be dominated by a single confident member.

Once the IPCC's consensus claim was made, scientists involved in the IPCC process had reasons to consider the possible effect of their subsequent statements on their ability to defend the consensus claim, and the impact of their statements on policymaking.

The climate community has worked for more than 20 years to establish a scientific consensus on anthropogenic climate change. The IPCC consensus-building process played a useful role in the early synthesis of the scientific knowledge. However, the ongoing scientific consensus-seeking process has had the unintended consequence of oversimplifying the problem and its solution and hyper-politicising both, introducing biases into the science and related decision-making processes.

SCIENTISTS do not need to be consensual to be authoritative. Authority rests in the credibility of the arguments, which must include explicit reflection on uncertainties, ambiguities and areas of ignorance, and more openness for dissent. The role of scientists should not be to develop political will to act by hiding or simplifying the uncertainties, explicitly or implicitly, behind a negotiated consensus. I have recommended that the scientific consensus-seeking process be abandoned in favour of a more traditional review that presents arguments for and against, discusses the uncertainties, and speculates on the known and unknown unknowns. I think such a process would support scientific progress far better and be more useful for policymakers.

The growing implications of the messy wickedness of the climate-change problem are becoming increasingly apparent, highlighting the inadequacies of the "consensus to power" approach for decision-making on such complex issues.

Let's abandon the scientific consensus-seeking approach in favour of open debate and discussion of a broad range of policy options that stimulate local and regional solutions to the multifaceted and interrelated issues surrounding climate change.

Judith Curry is a professor and chair of the school of earth and atmospheric sciences at the Georgia Institute of Technology in the US, and president of Climate Forecast Applications Network. She is proprietor of the blog Climate Etc.

judithcurry.com

3. Hardly any experts doubt human-caused climate change
JOHN COOK
THE AUSTRALIAN
SEPTEMBER 21, 2013 12:00AM

http://www.theaustralian.com.au/opinion/hardly-any-experts-doubt-human-caused-climate-change/story-e6frg6zo-1226723829174

IN 2009, University of East Anglia servers were hacked, with years of private correspondence between climate scientists stolen. The hacker uploaded the emails to the internet, allowing bloggers to republish carefully selected quotes. During the next two years, nine investigations from university and government bodies on both sides of the Atlantic investigated the stolen emails. All unanimously found no evidence of data falsification. The sinister conspiracies conjured by the fevered imagination of the blogosphere failed to materialise.

The theft of the private emails, dubbed Climategate, demonstrates the fallacy of over-interpreting cherry-picked quotes from private conversations. A single quote cannot capture the full context of a conversation, let alone explain the nuances of the science. Climategate lends credence to the infamous saying by Cardinal Richelieu: "Give me six lines written by the most honest man in the world, and I will find enough in them to hang him."

In an article in Inquirer last week, Andrew Montford republished illegally obtained private correspondence, falling into the same fallacy of portraying an incomplete, misleading picture. Last year, my server was hacked and years of private conversations were stolen. Montford republished a quote in which I discussed reducing the public misperception about the scientific consensus on human-caused global warming. Montford argued from this quote that our research was a public relations exercise rather than a scientific investigation.

What is the bigger picture that Montford overlooks? To begin with, he fails to consider that we had already performed a great deal of scientific investigation, scanning more than 12,000 abstracts and determining that papers rejecting human-caused global warming had a vanishingly small presence in the peer-reviewed literature.

Preliminary analysis had already observed that the amount of research endorsing human-caused global warming was increasing at an accelerating rate.

Montford also fails to realise that a high-impact, peer-reviewed journal such as Environmental Research Letters publishes only research that makes a significant scholarly contribution. Our research analysed for the first time the evolution of scientific consensus across the past two decades. We found that scientific agreement strengthened as more evidence for human-caused global warming accumulated.

Another novel contribution of our research was inviting the authors of the papers to rate their own research. After all, who is more of an expert on a paper than its author? This independent approach produced a 97.2 per cent consensus on human-caused global warming, confirming the 97.1 per cent consensus we observed from the abstract text.

That 1200 scientists from across the world confirmed the overwhelming consensus is a fact studiously ignored by critics of our paper. The independent ratings by the paper's authors also expose the fallacy that we used an asymmetrical definition for consensus. We adopted several different definitions of consensus because scientists endorse human-caused global warming in different ways.

Some are explicit about how much humans have contributed. Others endorse the consensus without quantifying the human contribution. Others imply rather than explicitly endorse the consensus. The bottom line is no matter which definition you adopt, you always find an overwhelming consensus.

What if we only use symmetrical definitions of consensus; for example, humans are causing more than half of global warming versus less than half? Scientists who rated their own papers by these definitions show a consensus of 96.2 per cent.

Scientific agreement is so robust, you can look at it front-on, sideways or upside down and still find a consensus.
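The point about definitions can be made concrete. Below is a minimal sketch of how a consensus percentage is computed under a broad versus a symmetric definition; the category names echo the paper's rating scheme, but the counts are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical abstract-rating tallies, loosely modelled on the paper's
# category scheme (explicit endorsement with/without quantification,
# implicit endorsement, no position, and mirrored rejection categories).
ratings = {
    "explicit_endorse_quantified": 60,
    "explicit_endorse":            900,
    "implicit_endorse":            2900,
    "no_position":                 8000,
    "implicit_reject":             50,
    "explicit_reject":             15,
    "explicit_reject_quantified":  10,
}

def consensus(endorse_keys, reject_keys):
    """Share of position-taking papers that endorse, in per cent."""
    e = sum(ratings[k] for k in endorse_keys)
    r = sum(ratings[k] for k in reject_keys)
    return 100.0 * e / (e + r)

# Broad definition: any endorsement vs any rejection.
broad = consensus(
    ["explicit_endorse_quantified", "explicit_endorse", "implicit_endorse"],
    ["implicit_reject", "explicit_reject", "explicit_reject_quantified"],
)
# Symmetric definition: only papers quantifying the human contribution.
symmetric = consensus(["explicit_endorse_quantified"],
                      ["explicit_reject_quantified"])

print(f"broad: {broad:.1f}%, symmetric: {symmetric:.1f}%")
```

Whatever the exact counts, the computation shows why "no position" papers are excluded from the denominator and why different but defensible definitions yield similar headline percentages when rejections are rare.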

In fact, our research is not the only evidence for scientific consensus. A survey of earth scientists found 97 per cent consensus among actively publishing climate scientists. A compilation of public statements on climate change found 97 per cent consensus among published climate scientists. National academies of science from 80 countries endorse the consensus. Not a single academy of science disputes human-caused global warming.

Our research is the latest in a long line of statements and studies affirming that among the world's experts on climate change, it's considered a fundamental fact humans are causing global warming.

Despite this robust agreement, a "consensus gap" exists, with the public perceiving a 50:50 scientific debate. As well as provide scholarly contributions with our research, we set out to reduce this persistent public misperception. It is possible to do both at the same time.

Trying to reframe peer-reviewed research based on an out-of-context quote as a "PR exercise" is simply an attempt to avoid facing the facts.

John Cook is climate change communication research fellow at the Global Change Institute at the University of Queensland.

4. Don't sweat fickleness, it's sun's fault or something in the water
GRAHAM LLOYD
THE AUSTRALIAN
SEPTEMBER 21, 2013 12:00AM

http://www.theaustralian.com.au/opinion/dont-sweat-fickleness-its-suns-fault-or-something-in-the-water/story-e6frg6zo-1226723836623

TO explain the world's recent fickle temperature record, it's not surprising sceptics and mainstream climate scientists are searching in opposite directions.

While many scientists are looking to the oceans for answers, others are turning to the sky.

Many people claim a confirmed downturn in solar activity can help explain why surface temperatures on Earth have not risen as quickly as expected during the past decade or more.

Just as they believe a lack of solar activity during the Maunder Minimum in the 17th century explains the Little Ice Age, they argue weaker recent solar activity can explain the so-called pause.

But scientists argue there were many more factors than solar activity responsible for the mini ice age. And most are adamant that the power of the sun, or lack of it, has been fully accounted for in the modern climate stocktake.

As Judith Curry writes today in Inquirer, a draft of the upcoming fifth report by the Intergovernmental Panel on Climate Change says the observed reduction in the warming trend across the period 1998 to last year was due at least in part to the "downward phase of the current solar cycle".

Former chief climate commissioner Tim Flannery says: "The best information that we have is that the solar activity is factored into all the models."

Despite the denials, the instinctive belief many people have in the power of the sun has been hard to shake. It may be reflected in a waning mainstream interest in climate change in some parts of the world.

A British government-funded survey this week found the proportion of people who did not believe in climate change had more than quadrupled since 2005. Nineteen per cent of Britons now say they do not believe the world's climate is changing, up from only 4 per cent in 2005.

Flannery rejects any suggestion climate scientists have erred in not tackling head-on contentious issues such as the plateau in average global surface temperatures.

It is now widely acknowledged that average surface temperatures have remained largely steady - at a high level - for more than a decade. But according to Flannery, the hiatus is mostly a mirage.

"We have been able to explain the pause clearly, in that what we have seen, at least on the part of the atmosphere, is that it bounces around from year to year or decade to decade in terms of average temperature, but the oceans are much more steady," Flannery says.

"That is because about 90 per cent of the heat captured by the atmosphere ends up in the ocean and you only need to change that heat take-up (in the ocean) a little bit because the atmosphere is 500 times smaller than the ocean.

"With the atmosphere, we can see a bit of a pause; but when you ask is the Earth warming or not, you have to look at the atmosphere, the oceans and the rocks - and the answer is yes."

Eventually, however, scientists acknowledge the sun will get its way. According to research published in Britain's The Guardian newspaper, the planet eventually will warm so much it will become impossible for plants, animals and humans to survive. This will be as a consequence of the sun's natural life cycle and the orbit of the Earth.

Andrew Rushby, from the University of East Anglia, is quoted saying: "It will get progressively hotter and there's nothing we can do about it."

But Earth has some time left before the sun gets too hot. In work published in the journal Astrobiology, scientists calculate the big sweat will come in the next 1.75 billion to 3.25 billion years.

5. Fusion: the optimist's choice
18 September 2013 | By Stuart Nathan

http://www.theengineer.co.uk/opinion/comment/fusion-the-optimists-choice/1017135.article

An article in the Guardian newspaper a few weeks back asked the leading question: would we rather shave 35 minutes off the journey time between London and Birmingham, or have endless non-polluting electricity? It was referring to the projected costs of the High Speed 2 rail line and the estimated cost of developing nuclear fusion to the point where it could supply commercial power: both are around £50 billion.

Put like that, it's a simple choice, isn't it? Bugger the trains; let's have a fusion reactor with a Union Flag on it! But of course it isn't that simple. Look at it this way: £50 billion will definitely buy you a railway (or five-eighths of a railway, if you want the trains as well). But it won't definitely buy you fusion, because fusion isn't something you can buy. It's something we have to develop, and the only certainty in R&D is that nothing's certain. The physicists working on the ITER project to build a magnetic confinement fusion tokamak in southern France are certain that their system will work and say it's just a matter of engineering: but a matter of engineering which involves the biggest superconducting magnets ever built and a wall that can withstand neutron fluxes and temperatures hotter than the sun doesn't really deserve the epithet "just", as our recent Q&A shows.

It's a bit of a daft question anyway. You might as well ask why Real Madrid bought Gareth Bale when they could have developed fusion for the same money. It's not as though Cristiano Ronaldo is going to stop scoring goals. Maybe they could build a tokamak under the Bernabéu. It's not like there isn't a precedent: Enrico Fermi's first nuclear reactor was built under the football stadium at the University of Chicago. And it could bring a whole new meaning to 'midfield powerhouse'.

Joking aside, there is a pertinent question here: given the goal of limitless clean energy, why are we spending so little on fusion? The worldwide fusion budget is less than US$1 per person per year. In the UK, we spend 0.05 per cent of our energy budget on it — and we're one of the leading investors and technology centres, with the world's current largest tokamak, the Joint European Torus, housed at Culham in Oxfordshire. Anyone looking from outside might conclude that, as a species or a collection of economies, we aren't really that serious about it. When asked what he'd spend more money on if he were in charge of the country's research budget, Prof Brian Cox (who, good looks and camera presence aside, is a leading particle physicist, so he knows what he's talking about) says that we should throw it at fusion.

Mention fusion within the hearing of a bunch of technology-savvy cynics such as Engineer readers (sorry, realists) and you soon run into the old chestnut about fusion being 30 years away, and it's been 30 years away for the past half-century. And, indeed, the ITER timetable for fusion research does say commercial power is maybe 30 years away: the ITER reactor itself has to be built, and its experimental programme run for some 18-20 years, and if that is successful, then a further tokamak reactor has to be built which is configured to take the heat of the fusion reactor to generate steam and run turbines. But if we're being optimistic — and as I've said before, if engineers can't be optimistic they might as well just give up — then that 30 years looks a lot more like a timetable than a hopeful projection. If we can make the Large Hadron Collider work, then why not ITER?

Of course, ITER isn't the only egg in the fusion basket. The US National Ignition Facility is taking strides towards self-sustaining fusion with its laser-powered inertial confinement system (and if that works, then the next step in the programme will be built in the UK) and, intriguingly, Lockheed Martin's 'Skunk Works' announced earlier this year that it is working on a compact magnetic confinement system, believed to be a kind of twisted tokamak known as a spheromak or stellarator, which would fit in a standard shipping container and will be ready for scale-up to commercial use by 2020 (although the lead researcher, Charles Chase, says that a full-scale reactor wouldn't be ready before 2050. There's that 30 years again). But we're taking that one with a large pinch of salt. Prof Chase, if you're reading this, I'll believe it when I see it.

Fusion is surely humanity’s best hope for non-polluting, sustainable energy, alongside some more efficient form of photovoltaics. The laws of physics say that it should be possible; and translating the laws of physics into reality for the good of mankind is one good definition of engineering. It’s something which deserves to be taken much more seriously and needs investment. The goal of such investment wouldn’t necessarily be financial return, it would be the future of civilisation. That’s not something which often shows up on bankers’ balance sheets, but it probably should be; it’s certainly something which, as we can see from the ITER collaboration, is enough to motivate thousands of people from many different countries to work together. Surely it would be less controversial than a high-speed railway? Surely?

6. Tide turning on climate change
ANDREW HAMMOND
THE AUSTRALIAN
SEPTEMBER 27, 2013 12:00AM

http://www.theaustralian.com.au/national-affairs/opinion/tide-turning-on-climate-change/story-e6frgd0x-1226727882606#

THE Intergovernmental Panel on Climate Change today will release the most comprehensive study on global warming. The landmark study, prepared by more than 200 scientists across two years, reportedly will conclude that global temperatures could rise by up to 4.8C by the end of this century compared with pre-industrial levels, but potentially could still be held to 0.3C with deep, speedy cuts in emissions.

This is hugely significant because all countries have agreed that temperature rises should be restricted to no more than 2C, thus increasing prospects of preventing so-called dangerous or runaway climate change.

The study reportedly also will conclude that it is "extremely likely" (at least 95 per cent probability) that human activity, not climate variations, has caused most of the increase in global temperature in recent decades.

A key reason the IPCC report is attracting such enormous media scrutiny is that climate change sceptics, in particular, are looking to see how the report explains the fact the rate of increase in global average surface temperature has slowed for the past 1 1/2 decades. This is contrary to earlier IPCC predictions.

Despite the controversy it will cause with sceptics, the IPCC report will be welcomed by many across the world, and comes when it may seem hard not to be pessimistic about the global battle to manage the huge risks of climate change.

For instance, the last annual UN international climate change summit in Doha in December made only modest progress towards securing a comprehensive, global deal.

Moreover, climate change sceptics appear to be winning the battle for public opinion across much of the world. Earlier this month, for instance, one study showed that the percentage of British people who do not think the world's climate is changing had increased a staggering fourfold in less than a decade.

However, far from being the hopeless situation some suggest, there are signs that we may be reaching a point when the tide turns on tackling climate change. To be sure, much more needs to be done, but if one takes a step back and examines what is already happening at national and sub-national level across the world, a relatively encouraging picture is emerging.

That is, domestic laws and regulations to address climate change are being passed at an increasing rate, in stark contrast to the pace of progress in UN-driven international negotiations. Last year alone, as described in a report published by GLOBE International, 32 of 33 surveyed countries (which account for more than 85 per cent of global greenhouse gas emissions), including the US and China, have introduced or are progressing significant climate or related legislation and regulation.

This is nothing less than a game-changing development:

• China, after the publication of its 12th five-year plan in 2011, has proceeded with more detailed implementation guidelines including rules for its emissions trading pilots, progress with drafting its climate change law and publication of an energy white paper. Moreover, at the end of October last year, sub-national legislation was passed in Shenzhen to tackle climate change, the first such legislation in China.

• Mexico has passed a general law on climate change that is a comprehensive legislative framework packaged together with the first Redd+ readiness legislation to tackle deforestation.

• South Korea passed legislation to begin a nationwide emissions trading scheme by 2015.

• South Africa has proposed a carbon tax.

• There has also been progress in the developed world. For instance, the EU passed a new directive on energy efficiency, and Germany strengthened legislation relating to carbon capture and storage and energy efficiency.

As these examples underline, it is mainly developing countries, which will provide the motor of global economic growth in coming decades, that are leading this drive. Many are concluding it is in their national interest to reduce greenhouse gas emissions by embracing low-carbon growth and development, and to better prepare for the impact of climate change.

They see that expanding domestic sources of renewable energy not only reduces emissions but also increases energy security by reducing reliance on imported fossil fuels. Reducing energy demand through greater efficiency reduces costs and increases competitiveness. Improving resilience to the effects of climate change also makes sound economic sense.

Many governments and companies have recognised that a green race has started, and they are determined to compete.

They also acknowledge that, across time, those that produce in "dirty" ways will be increasingly likely to face border adjustment mechanisms that take account of the subsidy associated with their taking advantage of any unpriced pollution.

It follows, therefore, that advancing domestic legislation on climate change, and experiencing the co-benefits of reducing emissions, is a crucial building block to help create the political conditions to enable a comprehensive, global climate agreement to be reached. Domestic laws give clear signals about direction of policy, increasing confidence and reducing uncertainty, particularly for the private sector, which can drive low-carbon economic growth.

With negotiations on a post-2020 comprehensive global deal scheduled to conclude in 2015, it is unlikely that an agreement, with necessary ambition, will be reached unless more domestic frameworks are in place in key countries. Sound domestic actions enhance the prospects of international action and better international prospects enhance domestic actions.

Given this outlook, and as negotiators prepare for the next annual UN climate change summit in Poland in November, a potential danger is that some countries may lower their long-term ambition. This would be ill-timed. Indeed, now is the right moment for countries to invest more in tackling climate change to help expedite the creation of conditions on the ground that will enable a comprehensive global treaty to be reached.

Andrew Hammond was a special adviser in Tony Blair's government and a senior consultant at Oxford Analytica.

7. Profitable path to sustainability
DENNIS JENSEN
THE AUSTRALIAN
SEPTEMBER 27, 2013 12:00AM

http://www.theaustralian.com.au/national-affairs/opinion/profitable-path-to-sustainability/story-e6frgd0x-1226727881468

BJORN Lomborg has stated "if it is not economic, it is not sustainable". That single statement encapsulates all that is wrong with the climate change debate. It also points to a potential solution.

For those who know me, don't be confused. I have not changed my view that human activity is not a major driver of global warming.

Indeed, the more than decade-long lack of warming, as opposed to the warming predicted by the global circulation models referred to by the Intergovernmental Panel on Climate Change, simply reinforces my view.

The problem is the debate has become polarised. Perhaps what is needed is refocusing on how a position can be reached where there is benefit to people on all sides of the argument.

Looking at the past, punitive measures have been recommended and put in place.

First the carbon tax, followed by emissions trading the last government put in place. The latter is the worst of all worlds, as it ends up with the effective payment of "indulgences" to overseas carbon traders for shonky carbon credits while emissions in Australia continue to increase.

Direct action nobly tries to move towards a reward structure to reduce emissions within Australia, but even it is less than optimal, considering Lomborg's statement. Another scheme that lamentably fails the Lomborg test is that of the Renewable Energy Target, which is certainly worse than direct action and should be dumped.

Forcing the generators to use uneconomic methods of generating power is a sop to green carpetbaggers, costing the Australian community dearly.

For the sake of argument, let's assume the most catastrophic climate projections are correct. Even if Australia completely ceased emitting anthropogenic carbon dioxide tomorrow, the net "benefit" in terms of forestalling temperature increases is vanishingly close to zero.

The simple fact is, even under this scenario, the only way to help the situation is to come up with a global solution that conforms with the need to be economic to be sustainable.

At present the only methods of generating power that emit minimal levels of carbon dioxide conforming to this proposition are nuclear power and hydroelectricity, both of which the green and other left movements see as anathema. Other methods such as wind and solar are a long way from being able to generate baseload power economically.

So, what can be done? Instead of foisting uneconomic "solutions" on the market, we need to find ways of making alternatives economic (and for those who argue renewables are economically competitive, the reality check is the generators would jump on them if they were, no subsidies or RETs required). The show stopper for most of the alternatives is economically competitive energy storage.

We should address this at the cheap end of the innovation pipeline - research! Australia should commit to providing significant funding for energy storage research.

The government should stay away from cherry-picking the research proposals. Selection of the most worthy research proposals should be left to the Australian Research Council.

By putting money into energy research, many benefits will follow. For those concerned with global warming, it provides potential for a real energy solution globally that conforms to Lomborg's statement and would have global energy consequences.

For Australia, it provides a realistic prospect for large windfalls as a result of the intellectual property generated, giving a positive return on the investment put into the research, unlike the other methods of trying to solve the anthropogenic global warming problem, which are a financial burden to Australians. Last, but by no means least, it provides a means of reinvigorating our struggling science sector, giving realistic prospects of careers in scientific research and improving the quality of the intake of those aiming for a science-related profession.

Win, win, win - plus the prospect of coming up with a path on the climate change issue on which most, if not all, could agree.

Former CSIRO research scientist and defence analyst Dennis Jensen is the federal Liberal member for Tangney in Western Australia.

8. Science solid on global warming, IPCC declares
GRAHAM LLOYD
THE AUSTRALIAN
SEPTEMBER 28, 2013 12:00AM

http://www.theaustralian.com.au/national-affairs/policy/science-solid-on-global-warming-ipcc-declares/story-e6frg6xf-1226728912605

THE case for a global agreement to limit carbon-dioxide emissions has been bolstered after the world's top climate scientists increased their level of confidence that humans are changing the climate.

Despite predicting a range of possible temperatures over the century - an increase of 0.3C to 4.8C by 2100 - the Intergovernmental Panel on Climate Change's fifth report for policy-makers warns of serious consequences if no action is taken.

"We need to seize the opportunities of a low-carbon future," UN Secretary-General Ban Ki-moon said last night. "The heat is on, now we must act."

As expected, the fifth assessment report by the UN body said warming of the climate system was "unequivocal" and there was now a 95 per cent probability that humans were contributing to climate change, up from 90 per cent in the 2007 report.

The IPCC said that since the 1950s many of the observed changes were "unprecedented over decades to millennia".

"The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea level has risen, and the concentrations of greenhouse gases has increased," said the report, released last night. The report said each of the past three decades had been warmer at the Earth's surface than any preceding decade since 1850. The IPCC report conceded the so-called "pause" in average surface temperatures over the past 15 years, but said it was not significant.

"Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends," the report said.

It said "internal variability" - including volcanic eruptions, reduced solar activity and a possible redistribution of heat within the ocean - could explain the observed reduction in surface warming from 1998 to 2012 as compared with 1951-2012.

It conceded there may also be "an overestimate of the response to increasing greenhouse gas and other anthropogenic forcing".

The report said there were more hot days and fewer cold ones; heat waves had increased in large parts of Europe, Asia and Australia and rainfall had increased in some areas.

As atmospheric carbon dioxide continued to increase, there was rising concern about increasing acidification of the ocean.

The report said continued emissions of greenhouse gases would cause further warming and changes in all components of the climate system. "Limiting climate change will require substantial and sustained reductions of greenhouse gas emissions," it said.

Environment Minister Greg Hunt said the report's findings reinforced the government's "bipartisan support for the science and the targets set for emissions reductions".

Greens leader Christine Milne said the report confirmed urgent and deep emission cuts globally were needed. She said the government had "no option but to abandon" its direct-action approach and take "urgent and serious measures immediately".

The director of the Britain-based Global Warming Policy Foundation, Benny Peiser, was critical of the report's handling of the pause.

"It has not only decided to discount the global warming standstill since 1997 as irrelevant, but has also deleted from its draft document its original acknowledgement that climate models failed to 'reproduce the observed reduction in surface warming trend over the last 10-15 years'," Dr Peiser said.

IPCC working groups co-chairman Thomas Stocker said last night the rise in global surface temperatures by the end of the 21st century was likely to exceed 1.5C relative to 1850 to 1900 under all future carbon emissions scenarios.

The most optimistic of four scenarios for warming forecasts an average temperature rise of 1C by 2100 over 2000 levels, ranging from 0.3C to 1.7C.

The highest IPCC scenario has an average additional warming this century of 3.7C, ranging from 2.6C to 4.8C.

Unlike the previous report in 2007, which forecast a range of temperature increases from 0.3C to 6.4C by 2100, the fifth update did not nominate a most likely temperature rise figure.

The likely range for Equilibrium Climate Sensitivity (how much average global temperature is expected to rise after a doubling of atmospheric carbon-dioxide concentrations) was now deemed to be 1.5C to 4.5C, a revision from the Fourth Assessment Report, which provided a range of 2C to 4.5C.

By the end of the century, sea levels were projected to rise between 26cm and 55cm under the best-case scenario to 45cm to 82cm under the worst case. In 2007, the rise was projected in a range from 18cm to 59cm.

Andy Pitman, director of the ARC Centre of Excellence for Climate System Science at the University of NSW, said the report "finally puts to rest the role humans play in causing global warming".

"The good news is it highlights we can still avoid two degrees of warming if we deeply and rapidly cut emissions of greenhouse gases," Professor Pitman said.

CSIRO fellow and IPCC lead author Steve Rintoul said there was "even greater confidence that climate is changing, (that) humans are largely responsible for the warming observed over the last 50 years, and that substantial and sustained reductions in greenhouse gas emissions will be needed to avoid the worst consequences of climate change".

The IPCC report was approved at 5.30am after a marathon session at the Stockholm gathering of scientists and officials from more than 110 of the 195 eligible countries.

The IPCC document will play a key role in negotiations for a global agreement to cut global carbon-dioxide emissions which includes China, the US and India.

The UN has set a target to reach agreement in 2015 for a plan to take effect from 2020.

IPCC chairman Rajendra Pachauri said he believed a market mechanism was the key to reducing carbon-dioxide emissions. "We have to put a price on carbon," Dr Pachauri said. "In the ultimate analysis it is only through the market we might be able to get a large enough and rapid enough response," he said.

9. Fukushima Watch: Japan to Cooperate with IAEA on Public Communication
By Mari Iwata
JAPAN REAL TIME
SEP 26, 2013

http://stream.wsj.com/story/latest-headlines/SS-2-63399/SS-2-338086/

Japan's nuclear watchdog will cooperate with the International Atomic Energy Agency to create a new system to provide accurate information internationally about the situation at the Fukushima Daiichi nuclear power plant to mitigate fears following extensive media coverage of contaminated water leaks at the site.


“We’ll start working with IAEA about what kind of plan would work well” in providing information to the global community,  Shunichi Tanaka, chairman of Japan’s Nuclear Regulation Authority, told reporters Wednesday.

Yukiya Amano, director-general of the IAEA, recently proposed the idea to Mr. Tanaka.

At least four organizations, including the NRA, have regularly collected data about radiation around Fukushima. Mr. Tanaka said Japan can provide these figures through the proposed system with its own interpretation of the data.

Japanese officials have said current systems have proven ill-equipped to handle the many questions from around the world on the status of the plant and any potential health risks following news of contaminated water leaks.

Besides the NRA’s efforts, a team from the prime minister’s office is now in charge of handling foreign inquiries into what’s happening at the plant. It has already set up an English language website and is working on one in Chinese.

Some experts say that while the leaks are a headache for plant operator Tokyo Electric Power Co., news reports have exaggerated the potential health effects of the radiation. The experts note that the radiation in the 350,000 metric tons of water being stored at the plant is of a weaker variety and less harmful to human health. They also note that radiation levels in the waters just outside the plant’s oceanfront remain below permissible levels, and most of the seafood from around the area has shown no detectable levels of radioactive materials.

“The public has been traumatized by the accident itself and ongoing issues and there is a lot of concern out there,” said Lake Barrett, a former member of the U.S. Nuclear Regulation Commission, during a Tokyo visit earlier this month. “In my scientific view, much of that concern is overstated sometimes, but there are legitimate concerns. Not only does the water management plan need to be scientifically effective for the public, it needs to be perceived and understood by the public,” Mr. Barrett said.

Whether Japan can eliminate all “misunderstandings and inaccuracies” remains uncertain, given its poor public relations record, said Kenta Yamada, professor of journalism at Senshu University. But that’s not necessarily a bad thing, he said.

“A government very good at controlling information is dangerous,” he said. “Media reports are not always good in quality throughout, but they tend to include some sense of the truth.”

William Sposato contributed to this item.

10. Tepco Finds New Tank Leak at Fukushima Dai-Ichi Atomic Station
By Jacob Adelman & Chisaki Watanabe
Oct 3, 2013 8:56 AM GMT+0800

http://www.bloomberg.com/news/2013-10-03/tepco-finds-new-tank-leak-at-fukushima-dai-ichi-atomic-station.html

Tokyo Electric Power Co. (9501) found a fresh leak at a storage tank holding contaminated water at the Fukushima station about six weeks after an earlier outflow prompted the government to intervene in the plant’s cleanup.

Beta radiation levels of 200,000 becquerels per liter were found near the leak that was confirmed at 9:55 p.m. yesterday, the utility known as Tepco said in a statement early this morning. Beta radiation includes strontium-90, which safety rules require to be kept under 30 becquerels at atomic plants.

Tepco didn’t know when the leak started or how much water has been discharged, Yusuke Kunikage, a spokesman, said in an interview before a press conference planned for 10 a.m. The company can’t rule out the possibility that some of the water flowed to the sea, according to the statement.

The government last month announced plans to spend 47 billion yen ($483 million) to stop leaks of radioactive water, saying it would be involved more closely in the site’s cleanup. Tepco reported a leak of about 300 metric tons of water from a storage tank on Aug. 20.

The Fukushima site has hundreds of thousands of tons of water stored in more than 1,000 tanks, with additional water remaining untreated in reactor basements and service tunnels.

Levels of toxic water are rising at a rate of 400 tons a day as groundwater seeping into basements mixes with cooling water that has been in contact with highly radioactive melted reactor cores.

11. A climate of contention
GRAHAM LLOYD, ENVIRONMENT EDITOR
THE AUSTRALIAN
SEPTEMBER 30, 2013 12:00AM

http://www.theaustralian.com.au/news/features/a-climate-of-contention/story-e6frg6z6-1226729549310

HAVING strengthened its conviction to 95 per cent certainty that human activity is responsible for changing the Earth's climate, scientists have delivered politicians a "carbon budget" road map on what to do about it.

To limit global temperature growth to below 2C - the level considered the best-case scenario and safest outcome - by the second half of the century human activity must be carbon negative.

Rather than the 10 billion tonnes of carbon human activity is pumping into the Earth's atmosphere every year, and rising, humans will have to find ways to pull it out.

For some this means devising new methods of bio-engineering to suck carbon dioxide from the air. For others it means boosting the natural order. Protecting the lungs of the Earth - forests - and making them work harder.

Senior CSIRO research scientist Pep Canadell, a lead author on the latest IPCC report, sees the future in bio-energy.

"We ran 10 models and six of the models said that by the second half of the century you actually have to have negative emissions," Canadell says.

But simply growing trees to burn for energy will not be sufficient. Once the trees have been burned the carbon dioxide given off will have to be captured and pumped underground to be stored.

It is the same controversial process envisioned for fossil fuels.

"It is the only possible, immediate thing that we can have," Canadell says.

"It involves huge expansion of biomass production that has its own issues potentially, and using unproven carbon capture and storage, which is still very expensive as of now.

"But what we are talking about is 60 to 70 years from now and the economy will be a different economy then."

Whichever way you cut it, economic change is at the heart of the fifth assessment report of the Intergovernmental Panel on Climate Change.

It is a document that outlines the scientific basis for concern about increased human carbon dioxide emissions.

And it has clearly been drafted in a way to encourage politicians to continue down the path already set by the UN process to negotiate a global agreement to radically cut carbon dioxide emissions. The UN timetable is for agreement for a global deal that includes China, India and the US, to be finalised by 2015 to take effect from 2020.

In headline terms, the latest IPCC report says the warming of the climate system is "unequivocal" and "human influence on the climate system was clear".

World Meteorological Organisation Secretary General Michel Jarraud says the report confirms "with even more certainty than in the past that it is extremely likely that the changes in our climate system for the past half a century are due to human influence".

The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea levels have risen and the concentrations of greenhouse gases have increased.

Jarraud says multiple lines of evidence confirm that the extra heat being trapped by greenhouse gases is warming the planet to record levels, heating and acidifying the oceans, raising sea levels and melting ice caps and glaciers.

"We are also seeing a change in weather patterns and extreme events such as heat waves, droughts and floods," he says.

IPCC working group co-chairman Thomas Stocker says heat waves are very likely to occur more frequently and last longer. As the Earth warms, wet regions are expected to receive more rainfall and dry regions less, although there will be exceptions.

For Australia, the report says to expect more of the same. Temperature rise predictions mirror the global average. Sea levels will continue to rise faster than the global average in northern areas due in part to ocean currents.

Heatwaves are going to last longer and be hotter, big rain storms will become more frequent and more intense.

While northern Australia will get more rainfall, the south, southwest and Tasmania will continue to get less. A rising concern is the impact of increased acidification of the oceans, a byproduct of greater carbon dioxide absorption.

Overall, the report sets out four scenarios of what to expect.

The first scenario anticipates that global temperature rises to the end of the 21st century can be kept below 2C. To achieve it, Canadell says, no more than 300 billion tonnes of additional carbon can be put into the atmosphere.

As a measure, Canadell says humans have put 550 billion tonnes of carbon into the atmosphere since 1770 and are currently releasing just under 10 billion tonnes a year.

The IPCC carbon budget shows that to limit the end of century temperature rise to 3C, the second scenario, carbon emissions must be kept to a total additional 800 billion tonnes.

The third scenario would limit a future temperature rise to below 4C and require carbon emissions to be kept to 1100 billion tonnes.

Business as usual, which includes a continuing steady rate of energy efficiency gains, will see another 1800 billion tonnes of carbon in the atmosphere by 2100, Canadell says. Under this scenario, the future temperature range is 2.6C to as high as 4.8C but, according to Canadell, "as you get to the highest level the probability is lower but we think it is a real possibility".
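The arithmetic behind these scenarios is simple enough to check: dividing each remaining budget by the current emission rate of just under 10 billion tonnes of carbon a year gives the time left before that budget is exhausted. A minimal sketch, using only the figures quoted in the article (the division is an illustration, not a calculation from the IPCC report itself):

```python
# Back-of-envelope check of the carbon budgets quoted by Canadell.
# Assumes emissions hold at the article's figure of roughly 10 billion
# tonnes of carbon a year (in reality the rate is rising).
EMISSIONS_GT_PER_YEAR = 10

# Remaining cumulative budgets (billions of tonnes of carbon) per scenario.
BUDGETS_GT = {
    "below 2C": 300,
    "below 3C": 800,
    "below 4C": 1100,
    "business as usual (2.6-4.8C)": 1800,
}

# Years until each budget is used up at the current rate.
years_remaining = {
    scenario: budget / EMISSIONS_GT_PER_YEAR
    for scenario, budget in BUDGETS_GT.items()
}

for scenario, years in years_remaining.items():
    print(f"{scenario}: budget exhausted in ~{years:.0f} years")
```

At roughly 10 billion tonnes a year, the below-2C budget of 300 billion tonnes lasts only about three more decades, which is why the report's authors end up talking about negative emissions in the second half of the century.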

It is a similar story with sea level rises. Under the four scenarios, mean sea level rises are expected to be 40cm if the global temperature increase can be kept below 2C.

The two mid-range scenarios forecast mean sea level rises of 47cm and 48cm.

The business as usual scenario forecasts a mean sea level rise of 63cm and a worst-case scenario of 82cm.

The key factor in all of this, of course, is that the climate will react as scientists predict in response to future levels of carbon dioxide in the atmosphere.

The latest report does adjust slightly the likely range for Equilibrium Climate Sensitivity - how much average global temperatures are expected to rise after a doubling of atmospheric CO2 concentrations. The range is now deemed to be 1.5C to 4.5C, a revision from the fourth assessment report of 2.0C to 4.5C.

For some people, including controversial climate commentator Bjorn Lomborg, the revision is a concession that the world may have more time to act, a notion dismissed by mainstream climate scientists.

For others such as Professor Judith Curry, chair of Earth and Atmospheric Sciences at Georgia Institute of Technology, and Professor Richard Lindzen from Massachusetts Institute of Technology, the revision is a small concession in the face of much bigger doubts.

Lindzen accuses the IPCC panel of sinking "to a level of hilarious incoherence", of proclaiming increased confidence in its models as the discrepancies between models and observations increase.

And in a scathing assessment of the latest IPCC report, Curry says it has been undermined by "motivated reasoning, oversimplification, and consensus seeking; worsened and made permanent by a vicious positive feedback effect at the climate science-policy interface".

Like Lindzen, Curry's primary concern has been the IPCC's rising confidence despite a departure from model predictions and the physical evidence. "As temperatures have declined climate models have failed to predict this decline, the IPCC has gained confidence in catastrophic warming and dismisses the pause as unpredictable climate variability."

Curry contends the IPCC has reached this point because both the problem and solution were "vastly oversimplified" where the problem and solution were framed as "irreducibly global".

"This framing was locked in by a self-reinforcing consensus-seeking

approach to the science and a 'speaking consensus to power' approach for decision making that pointed to only one possible course of policy action - radical emissions reductions," she says.

Certainly this is the unambiguous objective of the fifth assessment report - to give a carbon emissions budget for policy-makers on which to base the hoped-for global agreement to be locked in in 2015.

Canadell does not buy the argument.

"The reality is this scientific process of synthesis is something we do all the time," he says. "This is our life.

"The only thing I can see in trying to come up with reports is they need to be backed up with documents and if anything you can become a little bit too conservative," he says.

"It is a purely scientific process, there is no interference from the United Nations or any government group.

"I have never seen anyone saying anything other than you guys have to be very comprehensive, very thorough.

"All the language we use has to be very qualified and consistent with language that has been agreed all across IPCC," Canadell says.

To the chagrin of some, the IPCC explicitly confirmed its confidence in the climate models in its latest report, saying "models reproduce observed continental-scale surface temperature patterns and trends over many decades, including the more rapid warming since the mid-20th century and the cooling immediately following large volcanic eruptions.

"The long-term climate model simulations show a trend in global-mean surface temperature from 1951 to 2012 that agrees with the observed trend.

"There are, however, differences between simulated and observed trends over periods as short as 10 to 15 years."

The IPCC report put the pause down to a combination of natural variability, including ocean heat uptake, and the impact of volcanic activity and weak solar activity.

On the contentious issue of the pause, Canadell says it is not surprising that climate models had difficulty predicting short-term climate variability.

"It is true that the models missed it but the reality is the models don't do very well on climate variability," Canadell says.

"Scientists cannot predict El Nino or La Nina weather patterns even one year in advance," he says.

"We can only do it six months in advance because we have buoys in the ocean telling us the temperature is going up.

"But while we don't understand short-term climate variability we do understand longer-term climate variability," he says.

"The fact the models missed the most recent pause is not particularly a failure."

Nonetheless, many still believe the recent hiatus in global surface temperature - albeit at a record high level - has more significance than it has been given in the latest IPCC report.

According to Curry, the IPCC has taken a big risk.

"The IPCC has thrown down the gauntlet," she says.

"Should the pause continue, they are toast."

12. The world is warming but there's no need to panic
BJORN LOMBORG
THE AUSTRALIAN
OCTOBER 02, 2013 12:00AM

http://www.theaustralian.com.au/opinion/columnists/the-world-is-warming-but-theres-no-need-to-panic/story-fni1hfs5-1226731120767

LAST week, the Intergovernmental Panel on Climate Change launched its fifth assessment report. It wasn't about panic and catastrophe, which unfortunately have dominated our climate debate, leading to expensive but ineffective policies.

The IPCC clearly tells us that at least half of the 0.7C temperature rise since 1950 has been caused by man. Global warming is real but in the next century the likely rise will be from 1C to 3.7C, not the 5C-plus scenario touted by alarmists.

Similarly the IPCC report calls into question the alarmist claim that sea levels will rise 1m to 2m. Its estimates are in the range of 40cm to 63cm by the end of the century.

Moreover, this moderate message is reinforced by the past 15-20 years of little or no temperature rise. Since 1980, the average of all the current climate models has overestimated the actual temperature rise by 71 to 159 per cent.

This does not mean that there is not some global warming, but it makes the worst scenarios ever more implausible.

Yet our climate conversation has been dominated by fear and end-of-the-world thinking. Less than a decade ago, Al Gore's film An Inconvenient Truth portrayed how a sea level rise of 6m would inundate Florida, along with Beijing and Bangladesh. Yes, it was terrifying. But it had no basis in reality. While panic is a great way to raise awareness and win votes, it does not produce smart policies.

A sea-level rise of 40cm to 63cm poses a more manageable challenge. For purposes of comparison, sea levels have risen about 30cm in the past 150 years. Even with fewer resources and technologies at their disposal, our forefathers handled this challenge quite deftly. There was no catastrophe.

We need to get realism back. Yes, global warming is happening. In the long run, it has an overall negative impact. Yet economic models generally find that moderate global warming is a net global benefit. Globally, cold is a far greater contributor to deaths than heat. With increasing temperatures, avoided cold deaths will vastly outweigh extra heat deaths. By mid-century, researchers estimate 400,000 more heat deaths but 1.8 million fewer cold deaths.

Likewise, global warming will boost crop production more in temperate countries (because CO2 fertilises crops) than it will slow crop increases in tropical countries. It will reduce heating costs more than it will increase cooling costs.

According to a new study in the forthcoming book A Scorecard from 1900 to 2050, global warming has been an increasing net benefit for humanity since 1900. Its annual benefits (about 1.5 per cent of gross domestic product) will peak in about 2025. Only towards the end of the century will global warming turn to a net loss, so while we cannot be complacent, the policies we enact must be cost-effective.

Yet the IPCC report, like the earlier ones, has spurred many activists and campaigners to argue that now the world must act and impose strong policies to restrict CO2 emissions. Except we've already tried this strategy for 20 years, in some cases failing miserably and in others spending huge amounts of money to achieve little.

A good example is the so-called EU 20/20 climate policy. The average of the top economic energy models shows that this policy costs $250 billion annually, mostly in reduced growth. Across the 21st century it will cost about $20 trillion. Yet, on a standard climate model, by the end of the century, it reduces the temperature rise by a trivial 0.05C. In other words, for every dollar spent, it avoids 3c of global warming damages.
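The claim above can be sanity-checked with simple arithmetic. This is a back-of-the-envelope sketch using only the figures quoted in the paragraph above (they are the article's numbers, not independent estimates):

```python
# Back-of-the-envelope check using the figures quoted in the article.
century_cost_usd = 20e12        # ~$20 trillion over the 21st century
warming_avoided_c = 0.05        # ~0.05C less warming by 2100

# Implied cost of each degree of avoided warming under this policy
cost_per_degree = century_cost_usd / warming_avoided_c
print(f"Implied cost per degree avoided: ${cost_per_degree:,.0f}")
# -> $400,000,000,000,000 (i.e. $400 trillion per 1C)

# "3c of avoided damages per dollar spent" implies total avoided
# damages of roughly:
avoided_damages = 0.03 * century_cost_usd
print(f"Implied avoided damages: ${avoided_damages:,.0f}")
# -> $600,000,000,000 (i.e. ~$0.6 trillion against $20 trillion spent)
```

On these figures, the policy spends about 33 dollars for every dollar of damage it avoids, which is the ratio Lomborg's "3c per dollar" expresses.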

Depressingly, I debated Bryony Worthington, who has helped implement this policy in Britain. She claimed she had worked in this area for decades and had never seen such high costs or such low impacts for the policy. I actually had to direct her to the peer-reviewed research, because the EU has never produced a cost-benefit or effectiveness analysis of its own policy.

Perhaps tellingly, the IPCC report attracted far greater attention in worried Europe and much less in other parts of the world. Understandably, much of the developing world has more important priorities.

The poorest worry about tackling immediate problems, such as the 3.5 million people who will die this year from lack of access to modern fuels or the 10 million who will die from easily curable diseases.

Nations such as China are busy lifting the next 100 million out of poverty, after the past 30 years of coal use has lifted 680 million out of poverty.

So while we do need to fix global warming, we need to find smarter strategies to do so.

In the Copenhagen Consensus for Climate, economists found that the smartest long-term solution is substantially to increase funding for green energy research and development. In other words, we shouldn't subsidise today's hugely inefficient green technologies but focus on innovation to reduce the cost of future versions of wind and solar energy and the many other amazing possibilities.

Making future green technology cheaper than fossil fuels would mean that everyone would switch, not just subsidised, well-meaning Westerners.

The global cost of this R&D approach would be less than half the cost of the EU's present climate policies alone. The Australian share would be about $3bn. This strategy would enable us to avoid $11 of climate damage for every dollar spent, a benefit about 500 times higher than present policies.

With its moderate tenor, the new IPCC report ought to make our debate more constructive. Instead of being scared silly and overreacting, we need to realise that global warming is just one of the 21st century's challenges, and one that we can address today with low-cost, realistic innovation.

Bjorn Lomborg, an adjunct professor at the Copenhagen Business School, directs the Copenhagen Consensus Centre.

13. We must harness the power of the sun

In tackling climate change, solar power must be at the forefront of research into non-carbon energy sources

David King and Richard Layard The Observer, Sunday 29 September 2013

http://www.theguardian.com/commentisfree/2013/sep/29/climate-change-energy-sources-solar-power

Last Friday's report from the United Nations confirms the huge danger from our continued dependence on fossil fuel. But one simple thing can break this dependence. It needs to be cheaper to produce non-carbon energy than it is by digging up coal, gas or oil. Once this happens, most of the coal, gas and oil will automatically be left undisturbed in the ground.

To make non-carbon energy become competitive is a major scientific challenge, not unlike the challenge of developing the atom bomb or sending a man to the moon. Science rose to those challenges because a clear goal and timetable were set and enough public money was provided for the research. These programmes had high political profile and public visibility. They attracted many of the best minds of the age.

The issue of climate change and energy is even more important and it needs the same treatment. In most countries, there is at present too little public spending on non-carbon energy research. Instead, we need a major international research effort, with a clear goal and a clear timetable.

What should it focus on? There will always be many sources of non-carbon energy – nuclear fission, hydropower, geothermal, wind, nuclear fusion (possibly) and solar. But nuclear fission and hydropower have been around for many years. Nuclear is essential but faces political obstacles and there are physical limits to hydropower. Nuclear fusion remains uncertain. And, while wind can play a big role in the UK, in many countries its application is limited. So there is no hope of completely replacing fossil fuel without a major contribution from the power of the sun.

Moreover, the sun sends energy to the Earth equal to about 5,000 times our total energy needs. It is inconceivable that we cannot collect enough of this energy for our needs, at a reasonable cost.
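The "about 5,000 times" figure is an order-of-magnitude claim that can be checked from first principles. A rough sketch follows; the solar constant, Earth radius, albedo and the ~18 TW global primary energy demand are my own assumed round numbers, not figures from the article:

```python
import math

SOLAR_CONSTANT = 1361.0   # W/m^2 at top of atmosphere (assumed)
EARTH_RADIUS = 6.371e6    # m (assumed)
ALBEDO = 0.3              # fraction of sunlight reflected to space (assumed)
HUMAN_DEMAND = 18e12      # W, rough global primary energy use (assumed)

# Sunlight is intercepted over the Earth's cross-sectional disc, pi*R^2.
intercepted = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
absorbed = intercepted * (1 - ALBEDO)

ratio = absorbed / HUMAN_DEMAND
print(f"Absorbed solar power is ~{ratio:,.0f}x human energy demand")
```

Depending on the albedo and demand figures chosen, the ratio lands anywhere in the several-thousands, the same order of magnitude as the article's "about 5,000 times".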

The price of photovoltaic energy is falling at 10% a year, and in Germany a serious amount of unsubsidised solar electricity is already being added to the grid. In California, forward contracts for solar energy are becoming competitive with other fuels and they will become more so as technology progresses.

But time is desperately short and there are two even bigger scientific challenges. The first is to make solar power available on a 24-hour basis, when the sun shines only part of the day and can be obscured by cloud. This requires a major breakthrough in the storage of electricity.

The second is to reduce the cost of transmitting electricity from areas of high luminosity and low land value to the major population centres of the world. Better storage requires major breakthroughs in the science of batteries; better transmission requires new materials that are much better at conducting electricity without loss of power. In all these cases, the solution requires new disruptive technologies.

So here is our proposal. There should be a world sunpower programme of research, development and demonstration. The goal would be by 2025 to deliver solar electricity at scale to the grid at a cost below the cost of fossil fuel. All countries would be invited to participate. Those who did would commit, in their own countries, to major new programmes of research, internationally co-ordinated, and to share their findings for the benefit of the world.

Each country would have the goal of demonstrating bulk supply of unsubsidised solar electricity at scale to the grid by 2025. At the world level, the target would be for solar electricity to be at least 10% of total energy supply by 2025 and 25% by 2030. Countries' contributions to this target would be closely watched.

The programme would be truly broad. It would cover non-grid solar as well as grid electricity. And it would be of value to wind electricity as well, through improving storage and transmission.

Unlike fossil fuel, solar produces no pollution and no miners get killed. Unlike nuclear fission, it produces no radioactive waste. It harnesses the power of the sun, which is the ultimate source of most energy on Earth. And it can strike the imagination of a people and therefore of their politicians.

A central role of governments is to promote new public knowledge. Surely the most important knowledge of all is how to preserve human life as we know it. In 2015, the nations of the world will meet to agree their commitments on climate change. Whatever else they agree, they should go for a major sunpower programme.

Sir David King will be the foreign secretary's special representative on climate change from 1 October. Lord Layard is the former founder-director of the Centre for Economic Performance at the LSE.

14. Last artificial star in tokamak MAST before major upgrade
27 September 2013
By Tereza Pultarova

Engineering & Technology Magazine

http://eandt.theiet.org/news/2013/sep/fusion-tokamak-must-upgrade.cfm

Scientists at the UK’s Culham Centre for Fusion Energy (CCFE) ran final experiments on the MAST tokamak on Friday 27 September before starting a major overhaul of the device that will pave the way for a prototype fusion plant.

“It’s a bittersweet moment for us because we are saying goodbye to the old machine but at the same time, we are already looking forward to the new one,” said CCFE spokesman Nick Holloway.

“At 4pm today, we will run the last plasmas and within minutes after that, engineers will move in to shut down the tokamak for the next 18 months. By Monday, the roof beams in the MAST machine area will have been taken off before the 25-tonne MAST vessel will be lifted on a big crane and moved to the assembly hall.”

The £30m upgrade is set to make MAST (the Mega Ampere Spherical Tokamak) a cutting-edge facility. It will increase the machine's power and enable the testing of technologies that will improve the knowledge base needed for the construction of ITER, as well as test systems for the DEMO prototype fusion power plant.

“To take fusion forward to ITER and through to commercial power, we need to keep improving our research facilities. In 2015, CCFE will have a machine that we and our collaborators from around the world can use to explore exciting new areas of plasma physics and test innovative concepts for fusion technology. We can’t wait,” said Dr Brian Lloyd, Head of CCFE Experiments Department.

One of the key technologies the upgraded MAST will be equipped with is the Super-X divertor, an innovative high-power exhaust system that will reduce the power loads from particles leaving the plasma.

The divertor is an exhaust system at the bottom of the fusion chamber, through which waste rejected from the plasma leaves the reactor. The particles being exhausted are extremely energetic, resulting in extreme power loads on this part of the reactor. The idea of Super-X is to steer the particles along a longer exhaust path, allowing them to cool down and spreading them over a larger area, so that the power loads on materials are significantly reduced.

“This technology could actually pave the way towards future fusion power stations. It will be the very first time anyone will be using this technology,” Holloway said.

Since 1999, MAST, an innovative spherical tokamak that succeeded the UK's earlier START experiment, has created over 24,000 man-made stars, providing a wealth of data. The knowledge gathered during the MAST experiments has helped advance understanding in many key areas, including plasma instabilities and start-up methods.

The spherical concept that MAST inherited from START has proven over the years to be more efficient than the conventional toroidal design adopted by JET and ITER.

MAST was originally commissioned by Euratom and the UK Atomic Energy Authority; the current upgrade, however, is funded by the Engineering and Physical Sciences Research Council.

Apart from the Super-X divertor, the tokamak will receive a new centre column, better divertor coils, and a cryopump and power supplies that will provide pulse lengths up to ten times that of the existing machine.

As Holloway said, the MAST engineers and physicists are definitely not going on an 18-month vacation. They will be busy analysing the data the tokamak has provided previously and will have to prepare new experiments for the improved machine.

“There will also be a lot of work developing the new systems and also a lot of theoretical work to do that will be put to practice later,” Holloway concluded.

15.

http://www.iter.org/newsline/282/1706

ITER site, 5:25 a.m. on Friday 20 September. The moon is high and, despite a sleepless night, so are the spirits. The huge trailer with its load of concrete blocks that replicates the size and weight of ITER's most exceptional components—800 tons in all—has rolled to a stop.

Among those present at that early hour there's a feeling of relief and a deep sense of accomplishment: the operation was a complete success.

Arriving on schedule and with only one minor incident the test-convoy has demonstrated the conformity of the ITER Itinerary with the rigorous technical specifications of ITER's most exceptional loads. The way is now open for the delivery of the actual components of the ITER Tokamak.

The journey had begun four nights earlier on the shores of the Étang de Berre, a small inland sea connected by a narrow channel to the Mediterranean. The self-propelled trailer, accompanied by a large escort of support personnel, vehicles, technical experts and gendarmerie motorcyclists got off to a start at around 9:45 p.m.

The setting in Berre was reminiscent of a village fair: complete with lights piercing the night, an exhibition tent (set up by Agence Iter France, the organizers), a large crowd of onlookers of all ages and the feeling of excitement that exceptional events generate.

Although local inhabitants are accustomed to the passage of exceptional convoys for the steelworks and refineries of the region, they had never seen such a monster: 46 metres long, 9 metres wide and 10 metres high, as heavy as two fully loaded Boeing 747s.

Combined with its escort of men and vehicles, the convoy formed a "sealed pocket" more than one hundred metres long, slowly progressing along public roads and some stretches of dedicated track and crossing the A7 and A51 motorways in four different locations. Measurements were made all along the 104-kilometre journey to verify that the stresses caused to the roads, bridges and roundabouts agreed with engineering calculations.

Parked in a secure area during the day, the convoy progressed night after night at speeds varying from 5-15 km/hour. It crossed three times over every bridge (backing up and going over them again) in order to take the required measurements. It is estimated that some 2,000 people came out along the different stages of the Itinerary to watch the passage of the ITER test convoy.

Although it will take several weeks to process and analyze all of the data collected over the four nights, it was already clear on Friday morning that reality was in near sync with the calculations—all measurements fell within a few per cent of what had been predicted.

The fourth and last leg of the journey was by far the most spectacular: the convoy had to negotiate the narrow main street of Peyrolles, cross the A51 motorway in order to bypass the tunnel of Mirabeau, weave along the road through Saint-Paul-lez-Durance and, last but not least, climb the steep incline of the heavy-duty track leading to the ITER platform (a tractor was attached to the trailer, adding 500 hp of pulling power).

At 4:45 a.m. on Friday 20 September, the convoy arrived at the last roundabout at the entrance of the ITER site. It took another 40 minutes to reach the parking area between the on-site concrete batching plant and the Cryostat Workshop worksite.

As soon as the ignition key was turned off, two large cranes took position at its side to prepare for dismantling operations. By the following Monday, all 360 concrete blocks had been unloaded and the disassembly of the 88-axle vehicle was well advanced.

The successful conclusion of the test convoy operations brings to a close some eight years of preparation, from the first Itinerary feasibility studies to the final adjustments last year.

16. Global warming believers are feeling the heat
By James Delingpole
http://blogs.telegraph.co.uk/news/jamesdelingpole/100238047/global-warming-believers-are-feeling-the-heat/

James Delingpole is a writer, journalist and broadcaster who is right about everything. He is the author of numerous fantastically entertaining books, including his most recent work Watermelons: How the Environmentalists are Killing the Planet, Destroying the Economy and Stealing Your Children's Future, also available in the US, and in Australia as Killing the Earth to Save It. His website is www.jamesdelingpole.com.

On Friday the Intergovernmental Panel on Climate Change delivers its latest verdict on the state of man-made global warming. Though the details are a secret, one thing is clear: the version of events you will see and hear in much of the media, especially from partis pris organisations like the BBC, will be the opposite of what the IPCC’s Fifth Assessment Report actually says.

Already we have had a taste of the nonsense to come: a pre-announcement to the effect that “climate scientists” are now “95 per cent certain” that humans are to blame for climate change; an evidence-free declaration by the economist who wrote the discredited Stern Report that the computer models cited by the IPCC “substantially underestimate” the scale of the problem; a statement by the panel’s chairman, Dr Rajendra Pachauri, that “the scientific evidence of… climate change has strengthened year after year”.

As an exercise in bravura spin, these claims are up there with Churchill’s attempts to reinvent the British Expeditionary Force’s humiliating retreat from Dunkirk as a victory. In truth, though, the new report offers scant consolation to those many alarmists whose careers depend on talking up the threat. It says not that they are winning the war to persuade the world of the case for catastrophic anthropogenic climate change – but that the battle is all but lost.

At the heart of the problem lie the computer models which, for 25 years, have formed the basis for the IPCC’s scaremongering: they predicted runaway global warming, when the real rise in temperatures has been much more modest. So modest, indeed, that it has fallen outside the lowest parameters of the IPCC’s prediction range. The computer models, in short, are bunk.

To a few distinguished scientists, this will hardly come as news. For years they have insisted that “sensitivity” – the degree to which the climate responds to increases in atmospheric CO₂ – is far lower than the computer models imagined. In the past, their voices have been suppressed by the bluster and skulduggery we saw exposed in the Climategate emails. From grant-hungry science institutions and environmentalist pressure groups to carbon traders, EU commissars, and big businesses with their snouts in the subsidies trough, many vested interests have much to lose should the global warming gravy train be derailed.

This is why the latest Assessment Report is proving such a headache to the IPCC. It’s the first in its history to admit what its critics have said for years: global warming did “pause” unexpectedly in 1998 and shows no sign of resuming. And, other than an ad hoc new theory about the missing heat having been absorbed by the deep ocean, it cannot come up with a convincing explanation why. Coming from a sceptical blog none of this would be surprising. But from the IPCC, it’s dynamite: the equivalent of the Soviet politburo announcing that command economies may not after all be the most efficient way of allocating resources.

Which leaves the IPCC in a dilemma: does it ’fess up and effectively put itself out of business? Or does it brazen it out for a few more years, in the hope that a compliant media and an eco-brainwashed populace will be too stupid to notice? So far, it looks as if it prefers the second option – a high-risk strategy. Gone are the days when all anybody read of its Assessment Reports were the sexed-up “Summary for Policymakers”.

Today, thanks to the internet, sceptical inquirers such as Donna Laframboise (who revealed that some 40 per cent of the IPCC’s papers came not from peer-reviewed journals but from Greenpeace and WWF propaganda) will be going through every chapter with a fine toothcomb.

Al Gore’s “consensus” is about to be holed below the water-line – and those still aboard the SS Global Warming are adjusting their positions. Some, such as scientist Judith Curry of Georgia Tech, have abandoned ship. She describes the IPCC’s stance as “incomprehensible”. Others, such as the EU’s Climate Commissioner, Connie Hedegaard, steam on oblivious. Interviewed last week by the Telegraph’s Bruno Waterfield, she said: “Let’s say that science, some decades from now, said: 'We were wrong, it was not about climate’, would it not in any case have been good to do many of the things you have to do in order to combat climate change?” If she means needlessly driving up energy prices, carpeting the countryside with wind turbines and terrifying children about a problem that turns out to have been imaginary, then most of us would probably answer “No”.

17. Nuclear fusion milestone passed at US lab
By Paul Rincon
Science Editor, BBC News website
http://www.bbc.co.uk/news/science-environment-24429621

Researchers at a US lab have passed a crucial milestone on the way to their ultimate goal of achieving self-sustaining nuclear fusion.

Harnessing fusion - the process that powers the Sun - could provide an unlimited and cheap source of energy.

But to be viable, fusion power plants would have to produce more energy than they consume, which has proven elusive.

Now, a breakthrough by scientists at the National Ignition Facility (NIF) could boost hopes of scaling up fusion.

NIF, based at Livermore in California, uses 192 beams from the world's most powerful laser to heat and compress a small pellet of hydrogen fuel to the point where nuclear fusion reactions take place.

The BBC understands that during an experiment in late September, the amount of energy released through the fusion reaction exceeded the amount of energy being absorbed by the fuel - the first time this had been achieved at any fusion facility in the world.

This is a step short of the lab's stated goal of "ignition", where nuclear fusion generates as much energy as the lasers supply. This is because known "inefficiencies" in different parts of the system mean not all the energy supplied through the laser is delivered to the fuel.

But the latest achievement has been described as the single most meaningful step for fusion in recent years, and demonstrates NIF is well on its way towards the coveted target of ignition and self-sustaining fusion.

For half a century, researchers have strived for controlled nuclear fusion and been disappointed. It was hoped that NIF would provide the breakthrough that fusion research needed.

In 2009, NIF officials announced an aim to demonstrate nuclear fusion producing net energy by 30 September 2012. But unexpected technical problems ensured the deadline came and went; the fusion output was less than had originally been predicted by mathematical models.

Soon after, the $3.5bn facility shifted focus, cutting the amount of time spent on fusion versus nuclear weapons research - which was part of the lab's original mission.

However, the latest experiments agree well with predictions of energy output, which will provide a welcome boost to ignition research at NIF, as well as encouragement to advocates of fusion energy in general.

Fusion is markedly different from current nuclear power, which operates by splitting atoms - fission - rather than squashing them together.

NIF, based at the Lawrence Livermore National Laboratory, is one of several projects around the world aimed at harnessing fusion. They include the multi-billion-euro ITER facility, currently under construction in Cadarache, France.

However, ITER will take a different approach to the laser-driven fusion at NIF; the Cadarache facility will use magnetic fields to contain the hot fusion fuel - a concept known as magnetic confinement.


18. Time for a rethink on energy
JENNIFER WESTACOTT
THE AUSTRALIAN
OCTOBER 12, 2013 12:00AM

http://www.theaustralian.com.au/business/time-for-a-rethink-on-energy/story-e6frg8zx-1226738611230

IT is easy to forget how essential reliable and competitively priced energy is to the Australian economy. But to do so puts at risk the nation's long-term economic wellbeing.

Our energy resources are a source of great wealth, with revenue from exports of coal, gas and uranium accounting for about a third of the value of Australia's total exports.

Yet our ability to manage one of our long-held competitive advantages, our energy, is slipping, with the cost of developing energy resource projects in Australia and the cost of energy -- electricity and gas -- to households and businesses rising.

Some of the reasons for these increases are outside the control of governments and businesses, including the increased cost of extracting harder to access and more remote resources, and the need to replace electricity network infrastructure at the end of its useful life. But there are many factors within our control and we should be acting on them.

It is time to rethink our approach to energy policy in four key ways.

First, we need to address the lack of integration in Australia's energy policy.

Higher energy costs, restructuring in the energy-intensive sectors of the economy, changing consumer behaviour, the uptake of renewable energy, increasing imports of liquid fuels and our burgeoning LNG export industry require a coherent and comprehensive energy policy.

We need to have in place the appropriate energy policy settings that respond to these changes, maintain our competitive advantages and ensure we have a stable investment environment for new sources of energy.

A coherent and comprehensive energy policy then provides the framework to drive emissions reduction policy. For too long, energy policy has taken a back seat to emissions reduction policy with little regard for Australia's economic competitiveness.

Second, we need to complete unfinished energy market reforms so that we can ensure reliable and competitively priced electricity.

What is required is for state governments to progress privatisation of electricity assets and remove controls on prices. The evidence is clear that there are major benefits for consumers. As the Australian Energy Market Commission has found in its recent report, NSW is well placed to follow Victoria and South Australia and move to market-based energy pricing because energy retail competition in the state is robust enough to put downward pressure on prices.

With respect to privatisation, the Productivity Commission found in its recent report on energy networks that the sale of state-owned energy assets facilitates more efficient service delivery, with benefits flowing to consumers. The other big plus is the government revenue it frees up to invest in critical infrastructure.

Reform is also needed in our gas markets, such as developing a comprehensive gas strategy and putting in place the steps to support a more transparent and liquid domestic gas market, so that we have a predictable long-term supply for domestic use and exports.

Third, we need to reduce the cost of delivering resources projects in Australia.

Australia is blessed with abundant energy resources -- but that is just one factor when it comes to where an international board will decide to invest billions of dollars in its next resources project.

Australia is a high-cost place to do business. Our research shows that Australian resources projects are 40 per cent more expensive to deliver than projects in the US Gulf Coast. The relative remoteness of Australian resources projects explains part of this, but many of the drivers of cost are within our control.

We must improve the efficiency of our project assessment and approval processes; invest more in skills and training; and ensure our workplace relations laws support productivity and competitiveness, including the timely commencement of greenfield developments.

These issues do not all rest with governments -- businesses also need to lift their performance in terms of project management.

But we no longer have the luxury of being able to put off tackling these outstanding issues if we are to have any chance of capturing a second pipeline of investment in our energy resources.

Fourth, if we want reliable and competitively priced electricity we need to increase supply.

The emergence of Australia's coal-seam gas to LNG export industry demonstrates the economic wealth that investment in our energy resources can bring to the nation as a whole through taxation revenue and ongoing employment.

But the sector faces mixed messages from governments that are hampering further investment in gas projects that could supply the Australian market.

State and federal governments should support a coal-seam gas industry that is meeting its environmental and social obligations by streamlining environmental approvals, while maintaining high standards.

The approach of the NSW government, to increase restrictions on gas exploration when the state only provides 5 per cent of its own gas needs, is a backward step that will only contribute to higher prices and risk supply shortages.

With NSW facing such supply constraints, the commonwealth and the state should be working together to facilitate approvals, such as those in the Pilliga and Gloucester regions, while ensuring that environmental standards are met.

The federal Senate must also step up to its national interest obligations and accept the mandate of the Coalition government to implement its one-stop shop for environmental approvals that will help reduce the uncertainty and delays associated with major project investments in Australia.

Now is the time for Australian governments to show leadership by committing to an overarching, integrated energy policy framework that provides for the reliable and competitively priced supply of energy to continue to underpin our economy and way of life.

Jennifer Westacott is the chief executive of the Business Council of Australia.

19. Forget scaremongering, uranium will be like iron ore on steroids
http://www.theaustralian.com.au/business/opinion/forget-scaremongering-uranium-will-be-like-iron-ore-on-steroids/story-e6frg9if-1226739255591

IT has been suggested that the new fracking boom will deliver such a bounty of cheap gas that nuclear power will be doomed, and that government support for renewables will deliver a coup de grace to nuclear's ambitions.

These suggestions are usually accompanied by scaremongering about safety, implying it would be a good idea if nuclear was to be swiftly dispatched by these alternatives.

Cheap gas means that new nuclear power supplies are unlikely to be competitive in the US market. However, it is simply wrong to draw the implication that nuclear power will therefore be phased out.

Reactors are characterised by large upfront capital costs followed by very low marginal operating costs. So while you wouldn't want to build any new ones in the face of cheap gas, you would continue to operate the old ones for as long as you could.

That is what is likely to happen.

Moreover, the economics are different in China, where cheap gas is not likely to be available and the capital costs of reactors are significantly lower, making nuclear the cheapest option.

Although renewables share a low carbon footprint with nuclear power, they are not only more expensive but they are not available on demand. On a cold, calm winter's night, they provide zero power.

Nuclear remains the only viable option for zero-emission baseload power.

Again, China provides the best example: there, wind power exhibits less than 20 per cent capacity utilisation, making the real cost very expensive.
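The capacity-utilisation point is straightforward arithmetic: the same capital cost spread over fewer operating hours raises the effective cost of each unit of energy. A minimal sketch with normalised figures -- the 90 per cent baseload factor is an assumption for comparison, not a figure from the article:

```python
HOURS_PER_YEAR = 8760

def effective_capital_cost_per_mwh(capex_per_mw: float, capacity_factor: float) -> float:
    """Capital cost spread over one year's actual output (no discounting)."""
    return capex_per_mw / (HOURS_PER_YEAR * capacity_factor)

capex = 1.0  # normalised capital cost per MW of nameplate capacity

wind = effective_capital_cost_per_mwh(capex, 0.20)      # <20% utilisation, as quoted
baseload = effective_capital_cost_per_mwh(capex, 0.90)  # assumed baseload factor

print(wind / baseload)  # ~4.5: low utilisation multiplies the real cost
```

The ratio depends only on the two capacity factors, so the normalisation of capital cost drops out.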

There is little recognition of the huge strides made in designing inherently safer reactors.

The three widely quoted disasters -- Three Mile Island, Chernobyl and Fukushima -- were all second-generation reactors.

All new reactors will be third-generation or higher. It is a bit like criticising Boeing's Dreamliner on the basis of the De Havilland Comet's safety record. Some perspective on associated fatalities is also required.

Direct deaths from radiation exposure at nuclear-reactor accidents globally probably only amount to about 65 people; no one died in the Three Mile Island accident; and no one has yet died as a result of exposure at Fukushima.

More people have died as a result of exposure to excess radiation in hospitals.

The huge numbers of deaths attributed to nuclear accidents simply assume that very low-level exposure spread across a large population results in a significant number of cancer cases -- a claim for which there is no evidence.

Uranium spruikers have been talking up its prospects for almost two years, all while the price has steadily declined.

However, the cynics who do their forecasting with a rear-view mirror and a ruler, and who see no recovery in sight, are likely to be proved just as wrong.

With the spot price languishing around US$35/lb, new supply simply won't be forthcoming, regardless of new "discoveries".

On the demand side, 70 new reactors are under construction. Every new reactor requires about three years of fuel to fill the core for the first time and more as stock for enrichment and fuel-rod fabrication.

In other words, there is a wave of demand coming, mostly from China, and the rearview mirror experts will miss it, just like they missed the beginning of the iron ore boom 10 years ago.

The impact of temporary and permanent reactor closures in Japan and Germany after Fukushima was the equivalent of cutting global demand for uranium by 14 per cent.

The megatons-to-megawatts program has already made its last shipment; its end is likely to reduce supply by a little less than 10 per cent, which on its own will not be enough to bring the market back into balance.

However, add into the mix Japanese reactor restarts, which will commence gradually next year, and the impact of stock building for start-ups, and the expectation is for aggregate undersupply next year.
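The supply-demand arithmetic behind this argument can be sketched with the article's percentages; the baseline of 100 units is an arbitrary normalisation, and the restart/stock-building figure is an illustrative assumption, not market data:

```python
# Back-of-envelope sketch of the uranium market shifts described above.
baseline = 100.0  # pre-Fukushima demand = supply (normalised units)

demand = baseline * (100 - 14) / 100  # Japan/Germany closures cut demand ~14%
supply = baseline * (100 - 10) / 100  # end of megatons-to-megawatts cuts supply ~10%

surplus = supply - demand
print(surplus)  # 4.0 units: the market stays oversupplied, as the article argues

# Add assumed demand from Japanese restarts and first-core stock building
# and the balance flips to undersupply.
extra_demand = 8.0  # illustrative assumption only
print(supply - (demand + extra_demand))  # -4.0: aggregate undersupply
```

The point of the sketch is only that a roughly 14 per cent demand cut outweighs a roughly 10 per cent supply cut until restarts and stockpiling add demand back.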

The key unknown factor is the extent of existing physical uranium stocks and the behaviour of their holders when the market tightens.

Distressed stockholders -- such as Japanese utilities -- may decide to unwind excessive debt by selling stock as the market improves, but as soon as prices show sustained upward momentum, stocks held become a good investment.

Those who say that uranium prices could never double clearly don't remember what happened in iron ore; prices effectively doubled between 2004 and 2006 and then kept on rising, despite the pundits claiming that it couldn't last.

The same fundamental drivers characterise the uranium market, with inexorably rising demand from China coming up against excessively long lead times on the supply side.

The difference is that, unlike steel manufacturers, who were price sensitive, reactor operators are not because uranium is only a small fraction of their costs, and that is why the uranium market will be "iron ore on steroids".

Julian Tapp is chief executive of Energy and Minerals Australia, a uranium development company. He was formerly head of government relations and then director of strategy at Fortescue Metals Group, and before that was an economist with BAE Systems, BP and Ford.

20. World powers back hotter-than-the-sun reactor
04 October 2013
by Sebastian Moffett

http://horizon-magazine.eu/article/world-powers-back-hotter-sun-reactor_en.html

Ministers representing many of the world's main economic powers met on 6 September 2013 to show their support for one of the world's most ambitious scientific experiments – a nuclear fusion reactor that will operate at temperatures ten times hotter than the core of the sun.

Representatives from the seven regions that are backing ITER – the International Thermonuclear Experimental Reactor – met for only the second time at the site of the planned reactor in southern France in September to underline the importance of the project.

‘Not to invest in fusion would be a big mistake,’ said Günther Oettinger, the European Commissioner for Energy. ‘We have oil for the next 20, 30, or maybe 40 years; but nobody knows what will happen at the end of the century. We have to switch and we need to invest in new, innovative energy-generating technology for our children.’

ITER aims to produce energy through the same nuclear reaction that powers the sun. But, while the centre of the sun burns at 15 million degrees Celsius, the hydrogen inside the ITER reactor will be heated to some 150 million degrees Celsius.

At that temperature, electrons are ripped off individual atoms to form a plasma, in which nuclei float in a sea of electrons.

The high temperature means the plasma cannot be allowed to touch the sides of the reactor. So it will be suspended in a vacuum in a toroid – a doughnut shape – using some of the world’s most powerful magnets.

‘The magnetic field will put very high mechanical stress on the supporting structure,’ said fusion physicist Dr Osamu Motojima, ITER’s Director-General. ‘So we developed an engineering design almost close to the limit of the material.’

Vast rewards

The potential rewards of the project are vast: fusion-based power would solve much of the world's energy needs without the dangers of traditional nuclear reactors.

But the difficulty of the technology means it will take time. ITER is aiming to start the most important test reactions in 2027. Success then will mean simply that the principle has worked, so that plans can begin to construct commercial reactors to supply electricity grids. These might not come on line till 2040 or later.

Work began in 2010 on the 42-hectare site in Saint-Paul-lez-Durance, in the hills of Provence, France, where China, the EU, India, Japan, Russia, South Korea and the United States are collaborating on ITER.

So far, a five-storey headquarters building and an assembly building have been built, and the foundations have been prepared for the main reactor.

Less hazardous

The reactive materials used in fusion are less hazardous than those for traditional fission reactors. Fission occurs when a large nucleus splits, giving off energy, and it normally uses radioactive forms of uranium, which pose a threat if the reaction leaks. The fusion reactions being worked on consist of two small nuclei – of hydrogen – which collide to form helium, giving off energy in the process. Though some of the hydrogen used will be radioactive – the reaction needs heavier isotopes than the most common form of hydrogen – it will be easier to store and manage than uranium. Moreover, only tiny amounts will be needed because of the huge amounts of energy given off in the reactions.

However, mastery of nuclear fusion has proved elusive for half a century. Fusion reactions have been achieved in other test facilities, such as JET, the Joint European Torus, in the United Kingdom. But these runs have lasted just a few seconds.

While JET almost achieved ‘break-even’ – when a fusion reaction produces as much energy as was needed to set it off – it has not produced commercially viable amounts. ITER's goal is to sustain a fusion reaction for several minutes: for 50 MW of input power, it is aiming to produce 500 MW of output, enough to show that the technology is practical.

The public and scientific community is more supportive than in the past over the chances of success, says Motojima, whose career in plasma physics dates back to the 1970s. A Japanese fusion device he managed from 1998, the Large Helical Device (LHD), was greeted at the start with widespread scepticism, he said.

‘When we started to build the LHD in Japan, more than 50 % of people said, “It’s crazy, it’s not possible”,’ he said. ‘But now, nobody is saying it’s not possible here. That’s encouraging.’

If it is successful, the international participants will take the technology and try to put it to commercial use. South Korea – which, like Japan, has almost no fossil fuel resources – even has a fusion law that authorises an annual budget for research, currently about EUR 185 million.

Instead of the participants pooling funds and the project being carried out centrally, 90 % of the equipment is being contributed in-kind, with each country assigned to build certain pieces. Roads, bridges and roundabouts have been adapted to form a 104-kilometre route for components arriving by sea before they are assembled like a high-tech jigsaw puzzle.

In the building phase, each step has to wait for a previous one to be finished – but these steps are sometimes held up by the arrangements for contract awards in the participating countries. ‘Intensive effort and innovative methods will be required to meet ... the challenge of staying within a tight but realistic schedule while containing costs,’ said Oettinger.

The EU is providing 45 % of the funding for ITER. Though most other participants offered firm commitments, the US representative, Edmund Synakowski, Associate Director of Science for Fusion Energy Sciences at the US Department of Energy, emphasised that Congress would first have to approve continued US funding. A decision is expected next year.

During the meeting, the ministerial representatives reaffirmed the significance of the ITER experiment as an important step towards fusion energy, and underlined the fact that the project is also defining a new model for international scientific collaboration.

Europe needs to invest in its long-term energy – Günther Oettinger

Where do ITER and fusion technology fit into Europe’s wider energy policy?

‘Ensuring a high level of security of supply is a big challenge for all continents, but especially for Europe because we do not have many natural sources of fossil fuel and spend billions of euros on fuel imports each year. Investing in research, innovation, and the development of new technologies is a must for Europe and I think nuclear fusion is a realistic option. Not to invest in fusion would be a big mistake. It makes sense for Europe to invest together with competent partners from highly industrialised countries such as the United States, Russia, Japan, South Korea, India, and China. Locating the investment and the infrastructure within Europe, at the ITER site in France, represents an ideal partnership and a good option for Europe.’

What about the time required for fusion to become a viable energy source for Europe’s citizens?

‘Investment in research, namely in energy research, is a long-term investment – not just for the next few years but for the coming decades. If we are now experiencing the oil peak, then we have oil for the next 20, 30, or maybe 40 years; but nobody knows what will happen at the end of the century. We have to switch and we need to invest in new, innovative energy-generating technology for our children, for 2030, 2040, 2050, and beyond. The priority is not to know whether it will be in 2027 or 2031 (that we produce this power), but it’s going to be soon and we are doing it to offer our children a broader energy mix than today. I think that’s a good enough reason to take all of these steps.’

Do you think fusion is an expensive form of energy generation in comparison with other energy sources currently available?

‘Energy investments are expensive and in Europe today we spend 10 % of our Gross Domestic Product (GDP) on heating, cooling, power, and transport, for example, and this figure is set to increase to 15 %. I think it’s wise to optimise energy investment and to develop technologies with no greenhouse gas emissions, which will consequently avoid damage to nature and help to prevent climate change.’

How confident are you that this technology will be effective in producing significant amounts of power?

‘I am not an engineer; I studied law, but I have many contacts who are energy specialists, experts, or engineers and they are realistic about the situation, but also confident and positive. These are not just German, Dutch, or French engineers; they are engineers from many European Member States and from our Chinese partners. If engineers from seven high-tech countries are convinced by the technology, then I as a lawyer and politician am convinced as well.’

Do you think nuclear fusion technology will be accepted by those who are currently opposed to nuclear fission?

‘We have to decide what our energy future is, but EU countries do not exist in isolation; each country has to accept decisions from its neighbouring countries. At present, we have 14 Member States in the EU with nuclear energy systems, and 14 without nuclear. However, nuclear fusion is not the same as nuclear fission; it’s totally different. We have to make it clear to people what the technical and technological process that occurs inside a fusion plant is. I’m sure that whether or not countries currently choose to accept nuclear fission, they can and should accept and use nuclear fusion.’

21. ITER keeps eye on prize
Construction delays force rethink of research programme, but fusion target still on track.

Declan Butler
15 October 2013

http://www.nature.com/news/iter-keeps-eye-on-prize-1.13957

Delays in the installation of key parts of ITER, a multibillion-euro international nuclear-fusion experiment, are forcing scientists to change ITER’s research programme to focus exclusively on the key goal of generating power by 2028. As a result, much research considered non-essential to the target, including some basic physics and studies of plasmas aimed at better understanding industrial-scale fusion, will be postponed.

Nature has learned that the plans form the main thrust of recommendations by a 21-strong expert panel of international plasma scientists and ITER staff, convened to reassess the project’s research plan in the light of the construction delays. The plans were discussed this week at a meeting of ITER’s Science and Technology Advisory Committee (STAC).

The meeting is the start of a year-long review by ITER to try to keep the experiment on track to generate 500 MW of power from an input of 50 MW by 2028, and so hit its target of attaining the so-called Q ≥ 10, where power output is ten times input or more.
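The gain figure Q described here is just the ratio of fusion power out to external heating power in; a one-line sketch using the target figures quoted in the article (the function name is an illustration, not ITER terminology):

```python
def fusion_gain(p_out_mw: float, p_in_mw: float) -> float:
    """Fusion gain Q: fusion power out divided by heating power in."""
    return p_out_mw / p_in_mw

# ITER target figures quoted in the article: 500 MW out for 50 MW in.
q_iter = fusion_gain(500.0, 50.0)
print(q_iter)  # 10.0, i.e. the Q >= 10 milestone

# For comparison, 'break-even' (as nearly reached by JET) is Q = 1:
# output power equal to the heating power needed to sustain the reaction.
```

Note that Q counts only the heating power delivered to the plasma, not the total electrical power a plant would consume, which is why Q = 10 does not by itself mean a commercially viable power station.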

ITER, which will be the world’s largest tokamak thermonuclear reactor, is being built in St-Paul-lez-Durance in southern France by the European Union, China, India, Japan, South Korea, Russia and the United States at a cost of €15 billion (US$20.3 billion). Q ≥ 10 is seen as its raison d’être, and achieving it would be likely to revitalize public and political interest in fusion. Crucial to that is getting to the point, scheduled for 2027, when the first nuclear fuel would be injected into the reactor. The fuel will be a plasma of two heavy hydrogen isotopes, deuterium and tritium (DT).

The original 2010 research plan foresaw the entire reactor being built by 2020, when ITER was also scheduled to produce its first plasma, using hydrogen as a test fuel. But cost-cutting and cash-flow problems in member states mean that while the reactor is likely to be operating by then, the delivery of some parts is being deferred until several years later. These include some diagnostics devices for analysing the physics of plasmas at the very large scales of ITER, and elements of the heating system that will eventually take the plasmas to 150,000,000 °C.

“The plan was that everything would be procured and installed before first plasma, and then we would go straight into operation with a full set of systems,” says David Campbell, head of ITER’s plasma directorate. Instead, researchers will start with an initial set of instruments and systems, with others added later as upgrades. One of the main aims of the STAC meeting was for ITER to learn what elements of the research programme were essential to keeping it on track to reach DT phase and Q ≥ 10 on schedule. A local plant that will produce tritium, for example, is one key element.

The outcome of the review is also expected to influence ITER member states’ deferral plans, which will be modified to meet the key scientific priorities identified in the review. By fixing a timetable, Campbell says, STAC “will match up delivery schedules to the research plan, so that the research plan is not waiting for stuff to be delivered”.

The likely consequence of capping costs is that some parts of the research plan will be postponed until after 2028. ITER initially aims to produce a Q ≥ 10 for a few seconds, and then for pulses of 300–500 seconds, and work up over the following decade to output ratios of 30 times more power out than in, with pulses lasting almost an hour. Eventually the aim is to develop steady-state plasmas, which will yield information relevant to industrial-scale fusion-power generation. It is experiments relating to the understanding of longer-pulse and steady-state ITER plasmas that are most likely to be delayed beyond 2028.

Research into better plasma performance, and with it greater energy output, may also be held back, along with experiments investigating how to control turbulence, which can damage the reactor wall, and the stability and energy characteristics of plasmas.

Olivier Sauter at the Swiss Federal Institute of Technology in Lausanne, Switzerland, one of the reviewers of ITER’s research plan, says that months or more might be cut from the time needed to reach DT. But ITER’s decision to take shortcuts also carries risks, he adds. To help mitigate these, ITER is working closely with researchers at other tokamaks around the world, such as the Joint European Torus in Oxfordshire, UK, to address some of the uncertainties likely to be encountered in plasma energies and stability.

“It is somewhat unfortunate that the compression of the ITER schedule will limit interesting research opportunities during the early stages of ITER operation, but the mission of ITER is clear,” says Mickey Wade, director of the US national DIII-D fusion programme at General Atomics in San Diego, and a member of the review panel advising STAC. “The ITER physics team has done an admirable job of maintaining a single-minded focus on obtaining Q ≥ 10 operation as early as possible.”

Nature 502, 282–283 (17 October 2013)