Western Australia abandons shark cull


Western Australia Premier Colin Barnett
{credit}Government of Western Australia{/credit}

The state of Western Australia is abandoning a controversial shark-culling programme, but has also gained the right to deploy deadly baited lines for animals that pose an “imminent threat”.

The programme, run by the state government off several Western Australian beaches, had been heavily criticized by scientists when it was announced in 2013. It was due to run until 2017, and had caught at least 170 sharks using hooks suspended from drums moored to the sea floor.

In September the state’s own Environmental Protection Authority halted it. State Premier Colin Barnett then applied to the national government for permission to resume it, but today he announced that his government had ended that effort. “We have withdrawn the application after reaching agreement with the Commonwealth which enables us to take immediate action when there is an imminent threat,” said Barnett.

Under the agreement with the national government, Western Australia will in future be able to kill a shark that has attacked, or one that it deems an imminent threat. Protocols for how this would happen are now in development.

This apparent concession from the national government has drawn some concern from those celebrating the end of the cull.

“I remain concerned that drum lines could be used in some instances as part of emergency measures and particularly that this could occur without Federal approval,” said Rachel Siewert, the marine spokeswoman for the Australian Greens, in a statement.

The Western Australia cull is also drawing renewed attention to the longstanding cull in Queensland, which continues unabated.

Outbreak of great quakes underscores Cascadia risk

Posted on behalf of Alexandra Witze.

The 18 great earthquakes that have struck Earth in the past decade hold ominous lessons for western North America, a top seismologist has warned. Many of these large quakes — including the 2004 Sumatra quake that spawned the Indian Ocean tsunami, and the 2011 Tohoku disaster in Japan — were surprisingly different from one another despite their similar geologic settings.

That variety implies that almost any scenario is possible in another part of the Pacific Rim where quake risk is thought to be high — along the Cascadia subduction zone offshore of Washington, Oregon, and other parts of the western United States and Canada.

“We do not fully understand the limits of what can happen,” says Thorne Lay, a seismologist at the University of California, Santa Cruz. “We have to be broadly prepared to respond.”

Lay spoke on 21 October at the Geological Society of America meeting in Vancouver, Canada, a city on the front lines of Cascadia earthquake risk.

The last great quake in the region happened in 1700. Conventional wisdom holds that the next one, perhaps as large as magnitude 9, could strike at any time in the next several hundred years. Geologically speaking, Cascadia is a classic subduction zone, where one plate of Earth’s crust plunges beneath another, building up stress and occasionally relieving it in large earthquakes.

The recent spate of great subduction-zone quakes, of magnitude 8 or larger, began with the 2004 Sumatra earthquake. On average, each year since then has brought 1.8 great quakes, more than twice the rate of the previous century.

In large part, they happened where and when seismologists expected them. “The quakes are basically filling in a deficiency of activity,” Lay says. But their details have been surprising.

The 2004 Sumatra quake, for instance, ruptured unexpected portions of a subduction zone off Indonesia, where the fault zone bends as opposed to running straight. That implies that areas in Cascadia with unusual geometry might also be at risk, Lay says.

In 2007, in Peru, a major earthquake began, essentially stopped for 60 seconds, then picked up again and eventually generated a large tsunami. That start-stop-start pattern raises challenges for Cascadia, where seismologists are trying to develop an accurate earthquake early-warning system.

And in April 2014, a Chilean quake ruptured a far shorter portion of a subduction zone than scientists had expected. That suggests that researchers can’t be complacent about thinking they know which parts of Cascadia might break, Lay says. (The worst-case scenario for Cascadia involves a rupture of approximately 1,000 kilometres.)

That’s not to say scientists aren’t preparing. The recently launched M9 project, coordinated out of the University of Washington in Seattle, aims to help officials cope with the risk of a great Cascadia quake. At the Vancouver meeting, Arthur Frankel of the US Geological Survey in Seattle showed early results of calculations of where the ground might shake the most. Enclosed sedimentary basins, such as the one beneath Seattle, amplify the shaking, he reported.

Geologists face off over Yukon frontier

Posted on behalf of Alexandra Witze. 

The walls of the Geological Survey of Canada’s Vancouver office are, not surprisingly, plastered with maps. There’s one of the country of Canada, one of the province of British Columbia, and even a circumpolar Arctic map centered on the North Pole.


The Klondike schist of Canada (shown in green) stops at the border with the United States. {credit}Alexandra Witze{/credit}

All display that distinctive rainbow mélange so typical of professional geologic maps. Each major rock formation is represented by its own colour, so that pinks and purples and yellows swirl in great stretches representing mountain ranges, coastal plains, and every conceivable landscape in between.

But lying on the table of the survey’s main conference room is a much more problematic map. It shows part of the far northern boundary between the United States and Canada, along a stretch between Alaska and the Yukon territory. And the two sides, on either side of the international border, do not match.

It’s not a question of Canada using one set of colours for its map and the United States using another. The geology simply does not line up. To the east, Canadian mappers have sketched a formation called the Klondike schist, which is associated with the gold-rich rocks that fueled the Klondike gold rush in the late 1890s. To the west, US maps show nothing like it.

“We don’t know why,” says Jamey Jones, a geologist with the US Geological Survey (USGS) in Anchorage, Alaska. “We have got to figure out why these aren’t matching.”

He and two dozen scientists from both sides of the border — but clad equally in plaid shirts and hiking boots — met in Vancouver on 20 October to try to hammer out the discrepancies. For two hours they compared mapping strategies, laid out who needed to explore what next, and swapped tips about the best ways to get helicopters in the region.

The last frontier

At one level, the differing maps are a relatively minor academic point to sort out. Such glitches are fairly common whenever geologists have to match a ‘quadrangle’ mapped in one era, or with one technique, against another from a different time. And it’s not unusual for geology to not quite line up across international borders.

But American and Canadian geologists have reconciled their maps along nearly the entire northern stretch where Alaska and the Yukon meet, says Frederic “Ric” Wilson, a geologist with the USGS in Anchorage. This last bit is the only one that does not match — and it may well be because the Canadian maps are four years old, while the American ones are four decades old.

The US maps stretch back to the days of legendary geologist Helen Foster, who mapped large parts of Alaska after making her name as a post-war military geologist in former Japanese territories. “With her, you walked every single ridge,” recalls Wilson. “Every single ridge.”

All that walking produced maps of huge stretches of the remote Alaskan landscape. They include the 1970 quadrangle map now in question, which abuts a much newer Canadian quadrangle to the east. Together the maps span part of a massive geological feature known as the Yukon-Tanana Terrane, a collection of rocks caught up in the mighty smearing crush where the Pacific crustal plate collides against North America.

The Canadian side of the map is in good shape. Prompted in part by intense mining interest, geologists there have mapped the Klondike in modern detail. “I’m willing to integrate any piece of data that comes in,” says Mo Colpron, a geologist with the Yukon Geological Survey. “If you guys come up with things that affect how our side of the border works, then we can sit down and talk and try to mesh it.”

That leaves the burden of work on the US side, to update the Foster maps. “The reconciliation project is what it’s called,” says Rick Saltus, a geologist with the USGS in Denver, Colorado, who served as meeting emcee. “We’re taking a three-year look at cross-border tectonic connections, because things look a little different from one side to the other.”

This summer, Jones and his colleagues hired a helicopter to take them everywhere the Foster maps ran up against the Klondike formation. “We’ve seen a lot of rocks we didn’t anticipate seeing,” he says. That data will go into the new and improved US maps.

There is, however, only so much scientists can do. Citing border regulations, Jones says, the helicopter pilot was unwilling to take them just a tiny bit over into Canada so they could see the geology on the Yukon side.

European Commission: Tar sands no dirtier than other fuels

The European Commission has backed down from plans to label fuels derived from tar sands as more polluting than other fuels.

The move, which EU member states have yet to approve, could ease the importation of oil extracted from Canadian tar sands. But environmental groups say it is a blow to Europe’s climate protection targets.

Thanks to Alberta’s extensive tar sands, Canada holds the world’s third-largest oil reserves, after Venezuela and Saudi Arabia. But extracting oil from tar sands uses considerably more energy and water than conventional oil production.

The EU’s Fuel Quality Directive requires fuel suppliers to reduce greenhouse gas emissions from vehicle fuels by 6% by 2020.

In 2011, Brussels had proposed to restrict the use of fuel derived from tar sands by revising the directive to classify tar sands as 20% more carbon intensive – in terms of carbon dioxide emissions per unit energy – than other fuel sources. But the following year the European Commission’s proposal was voted down by member states concerned over Canada’s threat to take the issue to the World Trade Organization.

The Commission’s new proposal, released on 7 October, requires fuel suppliers to report an average carbon intensity of different fuel types over their lifecycle.

“At this time, the proposed methodology should not require the differentiation of the greenhouse gas intensity of fuel on the basis of the source of the raw material as this would affect current investments in certain refineries in the Union,” the proposed text reads.

“It is no secret that our initial proposal could not go through due to resistance faced in some Member States,” Connie Hedegaard, the EU’s Climate Commissioner, said in a statement.

“However, the Commission is today giving this another push, to try and ensure that in the future, there will be a methodology and thus an incentive to choose less polluting fuels over more polluting ones like for example oil sands.”

Canada has only just begun to ship tar-sands oil to European refineries, but Canadian oil producers hope to increase their market share in the region as EU countries seek to become less dependent on supplies from Russia and the Middle East.

World’s first ‘clean coal’ commercial power plant opens in Canada


The Boundary Dam Power Station.
{credit}SaskPower{/credit}

The world’s first commercial coal-fired power plant that can capture its carbon dioxide emissions officially launched today in Canada — marking a milestone for ‘clean coal’ technology.

The Boundary Dam project, in Saskatchewan, aims to capture and sell around 1 million tonnes of carbon dioxide a year — up to 90% of the emissions of one of its refitted power units — to oil company Cenovus Energy, which will pipe the compressed gas deep underground to flush out stubborn oil reserves. Unsold gas will be hived off to the Aquistore research project.  (If you’re curious to know what a carbon capture facility looks like, SaskPower provides a virtual tour of the power station.)

As noted in a Nature article about the scheme in April, carbon-dioxide capture and storage (CCS) technology doesn’t come cheap. The Boundary Dam refit will cost Can$1.3 billion (US$1.2 billion), has depended on $240 million in government subsidies, and SaskPower — the sole electricity supplier in the province — hopes that regulators will grant it a 15.5% increase in electricity prices over the next three years. But the hope is that engineers can learn from the experience how to install the technology at lower cost.

The Canadian project is just the first of what will need to be thousands of clean coal plants by 2050 to put a significant dent in emissions. (Coal-burning alone produced 15 billion tonnes of CO2 worldwide in 2012, 43% of the world’s total). On current timetables, the world is nowhere close to achieving this: the technology is just too expensive, and so far there’s been no political will to tax fossil fuels on the basis of their emissions, which would be an incentive for clean coal.

In 2009, the International Energy Agency published a road map calling for 100 large CCS projects by 2020, but in July 2013, with projects failing to materialize, it downgraded that to just 30. And even that is ambitious.

Still, one has to start somewhere. Around a dozen projects are already storing carbon dioxide at the million-tonne scale, mostly extracted from natural-gas processing plants, and the Saskatchewan ribbon-cutting today marks the first time that a commercial, grid-connected coal plant has adopted the technology. A newly built advanced coal plant in Kemper County, Mississippi, designed to store 3.5 million tonnes of carbon dioxide annually, was to open this year but has been delayed to 2015.

Animal populations ‘have halved since 1970’

Earth’s wild vertebrate populations have dropped to one-half the size they were in the 1970s, according to an analysis of more than 3,000 species.

Researchers from the WWF wildlife NGO, headquartered in Woking, UK, and the Zoological Society of London (ZSL) aggregated data on 10,380 populations from 3,038 species into an index of the health of the five main groups of vertebrates — mammals, birds, reptiles, fishes and amphibians. Set at 1 in 1970, this index has since declined to 0.48, a drop of 52%, according to their latest report.
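The headline arithmetic is easy to illustrate. The sketch below assumes a plain geometric-mean aggregation of per-population abundance trends (the published Living Planet Index uses a more elaborate weighted version of this idea, and the population figures here are purely hypothetical):

```python
from math import prod

def living_planet_index(ratios):
    """Geometric mean of per-population abundance ratios,
    where each ratio = current abundance / 1970 abundance."""
    return prod(ratios) ** (1.0 / len(ratios))

# Hypothetical example: four monitored populations since 1970.
ratios = [0.30, 0.55, 0.40, 0.75]
index = living_planet_index(ratios)
decline_pct = (1.0 - index) * 100
print(f"index = {index:.2f}, decline = {decline_pct:.0f}%")
```

An index of 0.48 therefore reads directly as a 52% average decline relative to the 1970 baseline.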


This analysis is the tenth ‘Living Planet Index’ from WWF and ZSL, but this year’s has a crucial difference from previous editions in that it is weighted to take account of the make-up of biodiversity in different areas. Previous versions treated every species on which data were available equally, whereas the new edition attempts to correct for the size of each taxonomic group in a region, for example by giving more weight to fish than mammals in the Palearctic.

The last index – published in 2012 – showed a 28% decrease between 1970 and 2008. The bleaker picture painted by the 2014 edition comes both from real declines in newer data, and from the new weighting.

“The scale of biodiversity loss and damage to the very ecosystems that are essential to our existence is alarming,” said Ken Norris, the director of science at ZSL, in a statement. “Although the report shows the situation is critical, there is still hope. Protecting nature needs focused conservation action, political will and support from businesses.”

There have been some successes, especially in protected areas. The study mentions the example of Nepal’s tiger (Panthera tigris), whose population increased by 63% between 2009 and 2013. But most vertebrate populations are in decline, and some drastically — such as rhinos and elephants threatened by poaching in Africa and sharks impacted by overfishing.


{credit}Living Planet Report 2014{/credit}

Obama vastly expands Pacific reserve


Coral at Jarvis Island National Wildlife Refuge
{credit}US Fish and Wildlife Service{/credit}

US President Barack Obama has vastly increased marine protection in the Pacific by declaring 1 million square kilometres of ocean part of a giant marine reserve.

Obama’s declaration on 25 September increased from 210,000 square kilometres to 1.3 million km2 the size of the protected area around a group of small islands in the central Pacific, stretching from Wake Atoll to Jarvis Island. This makes the Pacific Remote Islands Marine National Monument (PRIMNM), originally created by former president George W. Bush, one of the largest marine protected areas in the world. Thousands of sea birds, turtles, sharks and other marine life will now be fully protected from commercial, if not from recreational, fishing over this extended area.

The expansion was not as large as some researchers and conservationists were hoping. It had been suggested that the reserve could have been expanded by 1.8 million km2, and the scale-down seems to be a concession to the tuna fishing industry, which is active in the region.

Still, the expansion means Obama has put more of the planet under protection than has any other world leader, says Elliott Norse, chief scientist at the Marine Conservation Institute, a non-governmental organization in Seattle, Washington, that has played a central part in the creation and expansion of PRIMNM.

“We have been working to make this happen for nearly a decade,” says Norse. “We’re thrilled that President Obama has done this.”

Norse says the expansion should act as a trigger to nations that have been less progressive in protecting their own waters, such as China, Russia and France: “We would love to see these and other nations take their cue from this action.”

Sea-ice trends are poles apart

Arctic sea ice

{credit}National Snow and Ice Data Center{/credit}

Sea-ice cover in the Arctic Ocean last week dropped to an annual minimum of 5.02 million square kilometres, according to satellite observations by the National Snow and Ice Data Center in Boulder, Colorado.

This year’s minimum is the sixth lowest since satellite measurements began in 1979. Arctic sea-ice cover on 17 September was 1.61 million square kilometres higher than the record low extent observed in September 2012, but still 1.20 million square kilometres below the 1981–2010 average.
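Those figures pin down the reference values they are measured against; a quick back-calculation (all values in millions of square kilometres, taken from the paragraph above):

```python
minimum_2014 = 5.02          # this year's minimum extent
above_2012_record = 1.61     # margin above the 2012 record low
below_1981_2010_avg = 1.20   # shortfall against the long-term average

record_low_2012 = minimum_2014 - above_2012_record   # ≈ 3.41
avg_1981_2010 = minimum_2014 + below_1981_2010_avg   # ≈ 6.22
print(round(record_low_2012, 2), round(avg_1981_2010, 2))
```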

This year, a sliver of open water from the Laptev Sea off the coast of Siberia extended the farthest north that open ocean has reached since the late 1970s.

Meanwhile, sea ice around Antarctica has exceeded the record maximum extent set last year. For the first time in the 35-year satellite era, Antarctic sea ice now covers more than 20 million square kilometres — an area almost the size of North America — and may still be growing.

Ahead of UN summit, chances dwindle to keep warming at bay


{credit}Martin Muránsky/Shutterstock.com{/credit}

Despite a slowdown in recent years in the rate of global warming, the world remains on a path to substantial and potentially disruptive climate change.

Global carbon dioxide emissions from the burning of fossil fuels and the production of cement reached a record high of 36.1 billion tonnes in 2013, and are now more than 60% above the level of 1990, when the Intergovernmental Panel on Climate Change (IPCC) released its first report. Compared to 2012, emissions grew by 2.3% last year and are likely to increase by a further 2.5% in 2014.
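The growth figures quoted above also fix the neighbouring years; a rough back-calculation (all values in billions of tonnes of CO2):

```python
emissions_2013 = 36.1

# 2013 was 2.3% above 2012, and 2014 is projected to be 2.5% above 2013.
emissions_2012 = emissions_2013 / 1.023   # ≈ 35.3
emissions_2014 = emissions_2013 * 1.025   # ≈ 37.0

# 2013 is "more than 60% above" the 1990 level, so 1990 was at most:
emissions_1990_at_most = emissions_2013 / 1.60   # ≈ 22.6
print(round(emissions_2012, 1), round(emissions_2014, 1),
      round(emissions_1990_at_most, 1))
```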

The new figures were released on 21 September by the Global Carbon Project, a group that regularly analyses changes in carbon sources and sinks in its annual Global Carbon Budget.

CO2 emissions continue to track the high-end scenarios used by the IPCC in its latest report to project the magnitude of global warming. Without sustained mitigation measures — including capturing and storing the carbon produced by power stations — the world is likely to warm by 3.2–5.4 °C above pre-industrial levels by the end of the century.

“It is getting increasingly unlikely that global warming can be kept below 2 °C,” says Glen Peters, a climate scientist at the Center for International Climate and Environmental Research in Oslo. “In any case, the challenge is getting bigger every year and might be unachievable without our betting on negative emissions.”

The dire outlook — detailed in a package of research articles and commentaries in Nature Geoscience and Nature Climate Change — comes on the eve of a climate summit convened by the United Nations on 23 September in New York. At the meeting, world leaders aim to prepare the ground for an international greenhouse-gas reduction agreement to be signed next year.

“Governments say they agree with the 2 °C target but the urgency of action hasn’t really sunk in,” says Corinne Le Quéré, a climate scientist at the University of East Anglia in Norwich, UK, and a co-author of the studies. “We have already used up two-thirds of the fossil fuels we can afford to burn if we want to have a reasonable chance to stay below 2 °C warming. At the rate at which CO2 currently accumulates in the atmosphere, the remaining emissions budget will be exhausted in 30 years.”

When the latest set of IPCC emissions scenarios was developed about ten years ago, many experts expected the ‘carbon intensity’ of the world economy (the amount of fossil fuel used to produce a unit of global gross domestic product) to decrease by 2% to 4.5% per year. But that has not happened: mainly owing to China’s continued reliance on coal as the main energy source for its growing economy, the actual decline was merely about 1% per year. Given current projections of global economic growth, emissions are unlikely to peak and reverse any time soon in the absence of more stringent energy policies.

Despite its increased efforts to reduce pollution, China surpassed the United States as the world’s largest emitter of CO2 in 2007 and is now emitting more than the US and the European Union combined. China’s per capita emissions are still not as high as those in the US, but in 2013 they were higher than the EU’s. Together, the three regions account for more than half of worldwide emissions.

 

Prime numbers, black carbon and nanomaterials win 2014 MacArthur ‘genius grants’

Yitang Zhang, a mathematician who recently emerged from obscurity when he partly solved a long-standing puzzle in number theory, is one of the 2014 fellows of the John D. & Catherine T. MacArthur Foundation.

The awards, commonly known as ‘genius grants’, were announced on 17 September. Each comes with a no-strings-attached US$625,000 stipend paid out over five years.

Zhang, a mathematician at the University of New Hampshire in Durham, was honored for his work on prime numbers, whole numbers that are divisible only by 1 or themselves. In April 2013 he published a partial solution to a 2,300-year-old question: how many ‘twin primes’ — or pairs of prime numbers separated by two, such as 41 and 43 — exist.

The twin-prime conjecture, often attributed to the Greek mathematician Euclid of Alexandria, posits that there is an infinite number of such pairs. But mathematicians have not been able to prove that the conjecture is true.

Zhang’s work has narrowed the problem, however. In his 2013 proof, Zhang showed that there are infinitely many prime pairs that are less than 70 million units apart.
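Zhang’s result concerns gaps between consecutive primes. His 70-million bound is far beyond any brute-force check, but a short sketch can at least enumerate the twin primes at the small end of the number line:

```python
def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

primes = primes_up_to(100)
# Consecutive primes exactly two apart, e.g. (41, 43).
twins = [(p, q) for p, q in zip(primes, primes[1:]) if q - p == 2]
print(twins)
```

The twin-prime conjecture says this list never runs dry; Zhang proved the weaker statement that consecutive primes keep landing within 70 million of each other infinitely often.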

Other science and maths-related winners of this year’s fellowships are listed below.

Danielle Bassett, a physicist at the University of Pennsylvania in Philadelphia, studies the organizational principles at work in the brain, and how connections within the organ change over time and under stress. Her research, which draws on network science, has revealed that people with more ‘flexible’ brains — those that can easily make new connections — are better at learning new information.

Tami Bond, an environmental engineer at the University of Illinois, Urbana-Champaign, studies the effects of sooty ‘black carbon’ on climate and human health. Bond, who led the most comprehensive study to date of black carbon’s environmental effects, has found that the pollutant is second only to carbon dioxide in terms of its warming impact.

Jennifer Eberhardt, a social psychologist at Stanford University in California, studies the effects of racial bias on the criminal-justice system in the United States. Her analyses have shown, for example, that black defendants with stereotypical ‘black’ features are more likely to receive the death penalty in cases where victims are white.

Craig Gentry, a computer scientist at the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, has shown that encrypted data can be manipulated without being decrypted, and that programs themselves can be encrypted and still function.
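Gentry’s fully homomorphic schemes are far more sophisticated than anything that fits in a blog post, but the core idea, that ciphertexts can be combined meaningfully without decrypting them, shows up even in textbook RSA: unpadded RSA is multiplicatively homomorphic. A toy sketch with tiny, insecure parameters, purely for illustration:

```python
# Toy unpadded RSA: Enc(a) * Enc(b) mod n decrypts to a * b mod n.
p, q = 61, 53
n = p * q                  # 3233
phi = (p - 1) * (q - 1)    # 3120
e = 17
d = pow(e, -1, phi)        # modular inverse of e (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 7, 11
combined = (enc(a) * enc(b)) % n   # multiply ciphertexts only
print(dec(combined))               # 77, i.e. a * b
```

Gentry’s breakthrough was showing how to support both addition and multiplication on ciphertexts without limit, which makes arbitrary computation on encrypted data possible.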

Mark Hersam, a materials scientist at Northwestern University in Evanston, Illinois, is developing nanomaterials for a range of uses, such as solar cells and batteries, information technology and biotechnology.

Pamela Long, an historian of science based in Washington DC, has examined intersections between the arts and sciences and issues of authorship and intellectual property. She is now at work on a book tracing the development of engineering in 16th-century Rome.

Jacob Lurie, a mathematician at Harvard University in Cambridge, Massachusetts, studies derived algebraic geometry. “With an entire generation of young theorists currently being trained on Lurie’s new foundations, his greatest impact is yet to come,” the MacArthur Foundation said in its award announcement. In June, Lurie was named a winner of the inaugural $3-million Breakthrough Prize in Mathematics.