German Researchers Withdraw From Canadian Oil Sands Project

TORONTO, CANADA—German scientists have pulled out of an international research project with Canada that was attempting to find ways to minimize the environmental damage caused by exploiting Alberta’s oil sands. The move comes after political pressure forced Germany’s largest scientific organization, the Helmholtz Association of German Research Centres, to rethink its connections with an industry that many consider to be environmentally destructive.

The scientists who are part of the Helmholtz-Alberta Initiative (HAI) will no longer be involved in developing technologies that improve Alberta’s crude oil or treat the toxic effluent from the oil sands projects. Instead, the scientists will focus their efforts on the initiative’s remaining research avenues, such as carbon capture and storage and mine site reclamation.

“It is a change in focus,” Stefan Scherer, the managing director of the HAI, tells ScienceInsider. HAI, founded in 2011, is a partnership between the Helmholtz Association and the University of Alberta “designed to find solutions to the pressing environmental issues facing energy projects such as Alberta’s oil sands in Canada and coal production in Germany,” according to the project’s Web site. “I don’t anticipate laying off scientists,” nor will money be withdrawn from the project; the initiative is not collapsing, Scherer adds. That sentiment was echoed by a spokesperson for Alberta’s Environment Minister Diana McQueen, whose department contributed CAD 25 million to the project 2 years ago.

Of the four Helmholtz institutes involved in the partnership, only one, the Centre for Environmental Research (UFZ) in Leipzig, has suspended its work in Canada. The institute’s supervisory board voted in December to impose a moratorium on UFZ’s involvement in the project. This decision is a “small hiccup”, explained Lorne Babiuk, the vice president of research at the University of Alberta and co-chair of the initiative. He added that the initiative’s focus can easily be redirected because much of the technology being developed for use in the oil sands is relevant to other carbon industries. “We will reorient the initiative,” agrees the other co-chair, Reinhard Hüttl, scientific executive director of the Helmholtz Centre Potsdam. “We won’t have projects directly related to oil sands.”

The German move was in part triggered by ongoing debate over a possible amendment to the European Union’s fuel quality directive that would restrict the use of “high-polluting” oil within Europe. Germany, the largest market for fuels in Europe and the fourth largest in the world, has so far blocked the move along with the United Kingdom, but public opposition to importing Albertan oil remains high. The Canadian government has been lobbying German politicians at both the national and the European level to continue blocking the ban. That lobbying, along with Canada’s withdrawal from the Kyoto Protocol, prompted several German politicians to ask the Helmholtz Association pointed questions about the Alberta project.

“It was seen as a risk for our reputation,” Frank Messner, head of staff at the Helmholtz Centre for Environmental Research (UFZ), told a European news Web site. “As an environmental research centre we have an independent role as an honest broker, and doing research in this constellation could have had reputational problems for us, especially after Canada’s withdrawal from the Kyoto Protocol,” he said.

An independent assessment of the Helmholtz-Alberta Initiative’s environmental credentials will report its findings in June.

:: Read original here ::

Fukushima fallout fears over Japan farms

New research has found that radioactive material in parts of north-eastern Japan exceeds levels considered safe for farming.

The findings provide the first comprehensive estimates of contamination across Japan following the nuclear accident in 2011.

Food production is likely to be affected, the researchers say.

The results are reported in the journal Proceedings of the National Academy of Sciences (PNAS).

In the wake of the accident at Japan’s Fukushima nuclear power plant, radioactive isotopes were blown over Japan and its coastal waters.

Fears that agricultural land would be contaminated prompted research into whether Japanese vegetables and meat were safe to eat.

An early study suggested that harvests contained levels of radiation well under the safety limit for human consumption.

Contaminated crops

Now, an international team of researchers suggests this result deserves a second look.

To estimate contamination levels, Teppei Yasunari, from the Universities Space Research Association in the US state of Maryland, and his colleagues took measurements of the radioactive element caesium-137 in soil and grass from all but one of Japan’s 47 prefectures and combined these results with simulations based on weather patterns following the meltdown.

Caesium-137 lingers in the environment for decades, and so is more of a concern than other radioactive elements released in the cloud of steam when the reactors’ cooling systems failed, leading to explosions.

The team found that eastern Fukushima had levels of the radioactive element that exceeded official government limits for arable land. Under the Japanese Food Sanitation Law, 5,000 becquerels per kg (Bq/kg) of caesium is considered the safe limit in soil (caesium-137 makes up about half of the total radioactive caesium, so its effective safe limit is 2,500 Bq/kg).

The researchers estimate that caesium-137 levels close to the nuclear plant were eight times the safety limit, while neighbouring regions were just under this cut-off; the rest of Japan was well below the safety limit, averaging about 25 Bq/kg.
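These figures allow a rough back-of-the-envelope check. The Python sketch below (illustrative only, not from the study) estimates how long soil at eight times the caesium-137 limit would take to fall below 2,500 Bq/kg through radioactive decay alone; the 30.17-year half-life is the standard physical value, and real soils also lose caesium to weathering and run-off.

```python
import math

# Decay-only sketch: the 2,500 Bq/kg limit and the "eight times" level
# come from the article; the half-life is the standard value for Cs-137.
HALF_LIFE_YEARS = 30.17
LIMIT_BQ_PER_KG = 2_500
start = 8 * LIMIT_BQ_PER_KG          # soil close to the plant

def activity(a0: float, years: float) -> float:
    """Remaining activity after `years` of exponential decay."""
    return a0 * 0.5 ** (years / HALF_LIFE_YEARS)

print(f"after 30 years: {activity(start, 30):,.0f} Bq/kg")  # still ~4x limit

# Solve a0 * 0.5^(t/T) = limit  =>  t = T * log2(a0 / limit)
years_needed = HALF_LIFE_YEARS * math.log2(start / LIMIT_BQ_PER_KG)
print(f"below the limit after ~{years_needed:.0f} years")   # ~91 years
```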

Relatively low contamination levels in western Japan could be explained by mountain ranges sheltering those regions from the dispersal of radioactive material, the authors said.

Food production in the most contaminated regions, the researchers write, is likely to be “severely impaired”, and Fukushima’s neighbouring prefectures, such as Iwate, Miyagi, Yamagata, Niigata, Tochigi, Ibaraki and Chiba, are likely to be affected as well.

“Some neighbouring prefectures… are partially close to the limit under our upper bound estimate and, therefore, local-scale exceedance is likely given the strong spatial variability of [caesium-137] deposition,” the researchers explained in PNAS.

They urge the Japanese government to carry out a more thorough assessment of radioactive contamination across Japan before considering future decontamination plans.

A second study, published in the same edition of PNAS, collected over a hundred soil samples from within 70km of the Fukushima plant, and found similarly high caesium-137 levels across the Fukushima prefecture and its neighbouring regions.

Radioecologist Nick Beresford, from the Centre for Ecology and Hydrology in Lancaster, explained that once in soil, caesium becomes bound to mineral components, which limits its uptake into plants.

However, this process depends on the soil type. “Caesium stays mobile for longer in organic soils, hence why England and Wales still have some post-Chernobyl restrictions in upland areas,” he told BBC News.

Ploughing and some fertilisers can help farmers reduce plants’ uptake of the dangerous elements, and binding agents can be added to animal feed to reduce uptake from the gut, he added.

Local recordings

New figures on background radiation levels recorded 60km northwest of the Daiichi power plant have also been released this week by Japanese physicist Tsuneo Konayashi from Fukushima Medical University.

Dr Konayashi saw an initial spike reaching over nine times the usual levels hours after the explosions at the plant; five months later, levels had dropped to one and a half times those expected.

He continues to monitor radiation levels and distribute his data around campus.

Becquerels and Sieverts

  • A becquerel (Bq), named after French physicist Henri Becquerel, is a measure of radioactivity
  • A quantity of radioactive material has an activity of 1Bq if one nucleus decays per second – and 1kBq if 1,000 nuclei decay per second
  • A sievert (Sv) is a measure of radiation absorbed by a person, named after Swedish medical physicist Rolf Sievert
  • A millisievert (mSv) is one thousandth of a sievert
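A toy snippet making the prefix arithmetic in the box above explicit (the 2.4 mSv example is the commonly cited worldwide average annual background dose, not a figure from the article):

```python
# Unit helpers for the definitions above: becquerels count decays per
# second; sieverts measure absorbed dose. Only the prefixes do any work.
def bq_to_kbq(bq: float) -> float:
    return bq / 1_000        # 1 kBq = 1,000 decays per second

def msv_to_sv(msv: float) -> float:
    return msv / 1_000       # 1 mSv = one thousandth of a sievert

print(bq_to_kbq(25_000))     # a 25,000 Bq sample is 25.0 kBq
print(msv_to_sv(2.4))        # 2.4 mSv (typical annual background) = 0.0024 Sv
```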

:: Read original here ::

The EcoIsland

The UK’s energy prices are soaring. As gas and oil reserves run dry, the cost of energy will continue to climb. But what if we could wean ourselves off fossil fuels and make the jump to clean, renewable energy?

This is exactly what a small island off the coast of Africa plans to do. With a population of 11,000 people, El Hierro is building a solution to its mounting energy costs. As the most remote of the Canary Islands, it struggles to meet the high price of shipping oil from the mainland. But what the island lacks in fossil fuels it makes up for in wind: over 3,000 hours a year of gusts blowing fast enough to drive turbines and generate electricity.

And on the rare windless day, El Hierro hopes to bridge the gaps in its electricity supply with the ultimate energy cache: a 500,000 m³ reservoir some 700m up inside the island’s dormant volcano. When the power supply dwindles, the reservoir can be drained downhill through turbines to generate electricity.
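The article’s two figures are enough for an order-of-magnitude estimate of how much energy the reservoir can hold, using the standard potential-energy formula E = ρVgh. A minimal sketch; the 85% turbine efficiency is an assumed round number, not from the article:

```python
# Energy stored in El Hierro's upper reservoir: E = rho * V * g * h.
# Volume and head are from the article; the efficiency is assumed.
RHO_WATER = 1_000.0    # kg per cubic metre
G = 9.81               # gravitational acceleration, m/s^2
VOLUME_M3 = 500_000    # reservoir volume
HEAD_M = 700           # drop from reservoir to turbines
EFFICIENCY = 0.85      # assumed water-to-wire generation efficiency

energy_j = RHO_WATER * VOLUME_M3 * G * HEAD_M * EFFICIENCY
print(f"~{energy_j / 3.6e9:.0f} MWh of stored electricity")  # ~811 MWh
```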

:: Read more here ::

Species flee warming faster than previously thought

Animals and plants are shifting their natural home ranges towards the cooler poles three times faster than scientists previously thought.

In the largest study of its kind to date, researchers looked at the effects of temperature on over 2,000 species.

They report in the journal Science that species experiencing the greatest warming have moved furthest.

The results helped to “cement” the link between climate change and shifts in species’ global ranges, said the team.

Scientists have consistently told us that as the climate warms we should expect animals to head polewards in search of cooler temperatures.

The British comma butterfly, for example, has moved 220km northwards from central England to southern Scotland in the last two decades.

An uphill struggle

There is also evidence that more species seem to be moving up mountains than down, explained conservation biologist Chris Thomas from the University of York, UK, who led the study.

But studies had stopped short of showing that rising temperatures are responsible for these shifts in range, he added.

Now he and his team have made this link.

Analysing the range shifts of more than 2,000 species – ranging from butterflies to birds, algae to mammals – across Europe, North and South America and Malaysia over the last four decades, they show that organisms that experience the greatest change in temperatures move the fastest.

The team found that on average organisms are shifting their home ranges away from the equator at a rate of 17km per decade, three times the speed previously thought.

Organisms also moved uphill by about 1m a year.

“Seeing that species are able to keep up with the warming is a very positive finding,” said biologist Terry Root from Stanford University in California, US.

She added that it seemed that species were able to seek out cooler habitats as long as there was not an obstacle in their way, like a highway.

Out of range

But what about the animals that already live at the poles, or at the top of mountains?

“They die,” said Dr Thomas. Take the polar bear: it does most of its hunting from the sea ice, and that ice is melting. This July saw the lowest Arctic ice cover ever recorded, and the bear has nowhere to go.

However, the loss of this one bear species, emblematic as it is, seems insignificant compared with the number of species threatened at the tops of tropical mountains.

On Mount Kinabalu in Borneo, Dr Thomas’ graduate student, I-Ching Chen, has been following the movement of Geometrid moths uphill as temperatures increase. Their natural ranges have shifted by 59m in 42 years.

These moths “don’t have options; they are being forced up, and at some point they will run out of land,” reflected Dr Thomas.

The British scientist said that it was really too early to start generalising about the characteristics of the species that had shifted their distribution to stay within their optimal temperature range.

“But we know that the species which have expanded the most and fastest are the species that are not particularly fussy about where they live,” he told BBC News.

:: Read original here ::

Swedish wolves threatened by under-reported poaching

Poaching accounts for over half of all deaths of Swedish wolves, a new study suggests.

Basing their estimates on long-term wolf counts, the researchers reveal that two-thirds of poaching goes undetected.

The study suggests that without the past decade of persecution Swedish wolves would be four times more abundant than they are today.

The study’s findings are reported in Proceedings of the Royal Society B.

“Many have speculated that poaching levels are high for many threatened species of carnivores,” said Chris Carbone from the Zoological Society of London.

“This study presents an important step in trying to quantify this hidden threat,” he added.

The new study predicts the size of the Swedish wolf population each year based on counts from the previous year.

These counts are based on radio-tracked wolves and the more traditional ‘footprint count’, used in Sweden for over 10 years to estimate wolf numbers.

Counting canines

The researchers’ estimates took account of confirmed cases of wolf mortality – such as when a wolf is killed on the road, dies from disease or is found dead.

However, when the team, based at Grimso Wildlife Research Station in Sweden, compared the numbers produced by their models with the actual number of wolves in the wild, they found they were overestimating the size of the population.

Conservation biologist Guillaume Chapron, a member of the team, suspects that ‘cryptic poaching’, poaching that goes undetected, accounts for this difference.

The poaching we see is the “tip of the iceberg,” he said.

The researchers predict that without the last decade of poaching, wolves would have numbered around a thousand by 2009, four times the number reported that year.
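The study’s actual model is more sophisticated, but its logic can be sketched simply: project each winter’s count forward with a growth rate, subtract confirmed deaths, and treat any persistent shortfall in the next observed count as cryptic mortality. A minimal illustration; every number below is invented, not taken from the study:

```python
# Invented numbers throughout: project the population forward from each
# winter count, remove confirmed deaths, and read the gap between the
# projection and the next count as cryptic (undetected) losses.
growth_rate = 1.30                   # assumed annual growth without losses
counts = [100, 115, 130, 150]        # hypothetical winter wolf counts
confirmed_deaths = [10, 12, 14]      # hypothetical verified deaths per year

for year, deaths in enumerate(confirmed_deaths):
    projected = counts[year] * growth_rate - deaths
    observed = counts[year + 1]
    print(f"year {year}: projected {projected:.0f}, observed {observed}, "
          f"cryptic losses ~{projected - observed:.0f}")
```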

Wolves are known to kill the dogs that many Swedes use to hunt moose, and despite a prison sentence of up to four years if caught poaching, a few people do not hesitate to take a shot at a wolf.

Founding Finns

Poaching is not the only threat to the Swedish wolf.

These large carnivores went extinct in Sweden in the 1970s; the population has since re-established itself after a handful of wolves migrating from Finland took over the empty territories.

Today, all 250 or so Swedish wolves are descended from these few founding individuals.

And so the population is highly inbred and suffers from skeletal abnormalities and problems reproducing.

Further reducing the number of wolves by poaching leaves this population very vulnerable to further inbreeding, explained Dr Chapron.

:: Read original here ::

Over-fished tuna in ‘hot water’, study finds

Two more species of tuna have been added to the Red List of Threatened Species.

They join the Southern bluefin tuna – listed as critically endangered.

The report, published in this week’s Science, is the first global assessment of this highly prized family of fish, which are at risk of being over-fished.

The International Union for Conservation of Nature (IUCN) says there is a lack of resolve to protect against overexploitation driven by high prices.

Until this latest study, attempts to assess the health of scombrid and billfish populations, families of fish that include tuna and swordfish, have been carried out at a regional scale.

This study, which relies on the IUCN Red List criteria to judge the stocks’ health, took a more global approach.

Of the 61 species of fish assessed, seven were classified as either vulnerable, endangered or critically endangered. All suffer from over-fishing, habitat loss and pollution.

Along with the two species of tuna, two mackerel and two marlin joined the Red List.

The ‘sapphires of seafood’

Per kilo, bluefins are among the most expensive seafood in the world.

“All three bluefin tuna species are susceptible to collapse under continued excessive fishing pressure. The Southern bluefin has already essentially crashed, with little hope of recovery,” said one of the study’s authors, Kent Carpenter, the IUCN’s marine biodiversity manager.

Southern bluefin numbers have reached levels that are one twentieth of those recorded before industrial fishing began.

Atlantic bluefins have probably gone the same way, add the authors, while bigeye tuna is labelled “vulnerable”.

“Tunas are highly migratory fish, swimming across ocean basins and between the waters of various countries during their lifetimes. Conserving them requires regional and global co-operation,” commented Susan Lieberman, director of international policy with the Pew Environment Group in a statement.

What is more, tuna’s restricted spawning grounds make them exceptionally susceptible to collapse if over-fishing continues, reports the international team of scientists.

And tuna’s long lifespan means it would take their populations several years to recover even if fishing stopped altogether.
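A crude way to see why recovery is slow: run a logistic growth model from one twentieth of the unfished level (the article’s figure for Southern bluefin) with the low growth rate typical of long-lived fish. The 10% annual rate and the 50% ‘recovered’ threshold below are assumptions for illustration, not values from the study; under them, recovery takes roughly three decades even with no fishing at all:

```python
# Logistic recovery sketch. The 1/20 starting point is from the article;
# the growth rate and recovery threshold are illustrative assumptions.
K = 1.0       # unfished stock size (normalised)
n = K / 20    # current level: one twentieth of the unfished stock
r = 0.10      # assumed intrinsic annual growth rate

years = 0
while n < 0.5 * K:                # call 50% of the unfished level 'recovered'
    n += r * n * (1 - n / K)      # logistic growth with zero fishing
    years += 1
print(f"~{years} years to recover with no fishing")  # roughly 30 years
```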

Pew’s Dr Lieberman adds: “The IUCN Red List assessment reinforces that it is time for governments to live up to their responsibilities.”

The report comes days before the tuna regional fisheries management organizations (RFMOs) assemble in La Jolla, California, for the Kobe III meeting.

:: Read original here ::

And a little extra comment on the story.

Can States Sue on Greenhouse Gas as a ‘Nuisance’? High Court Asks

As the U.S. Environmental Protection Agency (EPA) is busy girding itself for a fight over new greenhouse gas emissions rules, the U.S. Supreme Court heard arguments today in a case on whether lawsuits over climate ought to be permitted.

At stake is whether greenhouse gas pollution may be considered a “nuisance” under U.S. law. The case stems from two 2004 federal lawsuits brought by seven states and several land-trust groups alleging that emissions from five major power companies could cause harm by contributing to global warming. Rising sea levels, loss of water in the Great Lakes, and reduced hydropower were among the injuries alleged by the plaintiffs; the lawsuits have since been combined, and two states have dropped out since the original suit was filed. The district court subsequently said in its decision that the case brought up a “political” question that the other branches of government, not the judicial branch, should consider, but an appeals court reversed that ruling. When the power companies appealed, the Supreme Court took the case.

In other pollution cases, the Supreme Court has supported suits claiming that pollution caused harm as a “nuisance” under common law, most often interpreted to prohibit noise and light pollution. The 80 minutes of occasionally spirited argument at the high court this morning focused on the two main issues in the greenhouse gas litigation: For the case to go forward, the plaintiffs must prove that the case has legal standing (they must show that the court is the right venue for resolving this dispute), and that the common law definition of nuisance can support suits over greenhouse gases. On the issue of standing, the court could rule that Congress or EPA is a more appropriate body to deal with pollution control.

The Obama Administration opposed the suing states in this case largely on grounds that they lack standing, marking a rare instance in which the Administration finds itself at odds with environmentalists on a major legal issue. (Environmentalists urged the states to try this legal strategy.) Acting Solicitor General Neal Katyal told the justices that the complexity of the issue suggests that the executive branch, namely EPA, is a better venue than the courts for controlling such an expansive type of pollution. “In the 222 years that this court has been sitting, it has never heard a case with so many potential perpetrators and so many potential victims,” he said in his opening remarks. “There are billions of emitters of greenhouse gases on the planet and billions of potential victims as well.”

The attorneys for the power companies and the Obama Administration argued that the greenhouse gases case is fundamentally different from previous nuisance cases in which pollutants have played a central role. A landmark ruling by the Supreme Court in 1907, for example, found that a judge could stop a Tennessee copper company from polluting the Georgia environment under the nuisance doctrine. Such cases, Katyal said, “are essentially: A pollutes a river or something and hurts B.” But in the case of global warming pollution, he said, “A here is the world and B is the world, and that is such a difference in scale and scope to pose enormously difficult questions” about whether such suits should go forward.

If this case is allowed to proceed, the justices asked, should subsequent cases be limited to big polluters like the five targeted in this suit? “Your briefs talk a lot about how these are the five largest [U.S.] emissions producers, but I saw nothing in your theory to limit it to those five,” Justice Elena Kagan said to New York state attorney Barbara Underwood, who spoke on behalf of the six states in the suit. “Is there something that you think limits it to large emissions producers rather than anybody in the world?”

The states have argued that the larger the greenhouse gas emitter the stronger the connection linking pollution and potential harm. “These defendants,” Underwood said, speaking of the five polluters, “produce 650 million tons a year or 10% of U.S. emissions, and individually they produce amounts ranging from 1 to 3.5% of U.S. emissions.”

Those who challenged the states also suggested that courts would be ill-equipped to make the complex judgments that big regulatory agencies staffed with scientists and other experts make on a routine basis. Judges lack “the resources, the expertise” to be a “kind of super-EPA,” said Justice Ruth Bader Ginsburg.

But Underwood said courts could make such judgments—which could include determining how “substantial” an emitter must be to be found culpable—by relying on standards set by the agencies. She pointed to a cutoff set by EPA that limits regulated greenhouse gas polluters to those that emit 100,000 tons or more per year. “According to EPA’s own technical data, there would be at most a few thousand potential defendants.”

Because Justice Sonia Sotomayor recused herself—she sat on the panel that reviewed the issue in the appeals court—only eight justices heard the arguments. A 4-4 tie would mean litigation against the polluters could go forward, because that would leave in effect the earlier decision by the appeals court. While the tone of the questioning was largely skeptical toward the idea that such suits ought to go forward, divining a final ruling from the rough-and-tumble of oral argument can be difficult, especially because justices often ask tough questions of those they’re inclined to agree with—just to test their counterarguments. Eyes were squarely focused today on Justice Anthony Kennedy, often the swing vote in 5-4 decisions. Kennedy raised a concern that federal law, and EPA’s efforts to use that law, would necessarily “preempt” the common law. The court’s three liberal members, Ginsburg, Stephen Breyer, and Kagan, seemed skeptical on this issue, too.

At least one knowledgeable observer said a 4-4 tie was unlikely. “In short, this particular lawsuit seemed doomed, with the court’s biggest task figuring out how to say so without shutting the courthouse door entirely to such claims,” said longtime Supreme Court reporter Lyle Denniston.

:: Read original here ::

An Early Warning Sign for Ecosystem Collapse?

The surprising strength and location of last month’s Fukushima earthquake highlighted how poorly seismologists can predict when the ground is about to shake cataclysmically. Unfortunately, ecologists can’t do much better at forecasting when an ecosystem is about to collapse or change dramatically. But now a team of ecologists has shown that it is possible to detect early distress signals in a lake that foretell a major disruption to its ecology. If researchers could identify similar signals in other ecosystems, they might one day predict, and perhaps even prevent, ecological meltdowns.

The collapse of the Atlantic cod fishery in the early 1990s saw the most abundant fish in the North Atlantic disappear due to overfishing. Such events are becoming increasingly common as humans overfish, overgraze, and alter the climate. Connections between predators and prey—often described as a food web—become destabilized. This leaves ecosystems vulnerable to dramatic changes, such as when a single species, like certain algae, grows out of control and forms toxic blooms, like the red tides common off the coast of Florida and Mexico. In theory, learning to detect the precursors of environmental distress could help raise the alarm before any damage is irreversible. But while that’s a nice idea on paper, no one has shown that it is possible in real ecosystems.

Now, in the first study of its kind, researchers have pinpointed early warning signs for the disruption of a food web in a lake. By gradually introducing a large fish species—the largemouth bass—into a Wisconsin lake dominated by smaller algae-eating fish, a team of ecologists pushed the aquatic ecosystem to a critical limit where the largemouth bass came to dominate the food web. The researchers had carefully monitored the lake throughout the whole experiment, using a buoy that measures chemical and physical vital signs of the lake every 5 minutes.

Combining these measurements with estimates of the populations of algae, zooplankton, and fish taken from regular net catches, the researchers report that they detected unusual oscillations in the amount of algae in the lake more than a year before the lake’s food web shifted. They say these oscillations are likely due to changes in the feeding behavior of the smaller fish that result from the presence of the introduced predators.

“All of a sudden the places that were once safe [for the zooplankton-eating fish] are dangerous,” says study co-author and ecologist Stephen Carpenter of the University of Wisconsin, Madison. Because the smaller fish shifted to shallow waters where bass threaten them less, he explains, the algae that inhabit the more open waters of the lake were free of their predators, and their populations fluctuated more. Carpenter and his colleagues report online today in Science that these fluctuations were a warning that the lake’s food web was changing.
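The paper’s exact statistics are not given here, but a standard early-warning analysis tracks the rolling variance and lag-1 autocorrelation of a measurement series such as algal chlorophyll; both tend to creep upward as a system nears a tipping point. A minimal sketch along those lines, with arbitrary window size and fake data:

```python
import statistics

def rolling_indicators(series, window=48):
    """Yield (variance, lag-1 autocorrelation) over a sliding window."""
    for i in range(len(series) - window + 1):
        w = series[i:i + window]
        mean = statistics.fmean(w)
        var = statistics.pvariance(w, mu=mean)
        if var == 0:
            yield 0.0, 0.0
            continue
        # Lag-1 autocorrelation: how strongly each reading predicts the next.
        ac1 = sum((w[j] - mean) * (w[j + 1] - mean)
                  for j in range(window - 1)) / (window * var)
        yield var, ac1

# Fake stand-in for chlorophyll readings taken every 5 minutes.
readings = [1.0 + 0.002 * i + (0.05 if i % 9 == 0 else 0.0) for i in range(300)]
for var, ac1 in list(rolling_indicators(readings))[-3:]:
    print(f"variance={var:.5f}  lag-1 autocorrelation={ac1:.3f}")
```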

They believe that the fluctuations in species abundance may herald an overall transformation of the lake ecosystem. “The reason that there is so much excitement about these early warning signals is that they are universal,” says lake ecologist Marten Scheffer of Wageningen University in the Netherlands. Many systems have tipping points, he explains, even the climate system. He adds that isolating these signals is useful not only for predicting environmental catastrophes but also for determining which habitats are most likely to respond to conservation, allowing ecologists to direct their efforts.

The major challenge now for ecologists is isolating the appropriate signals in other ecosystems. Ecologist Robert Holt of the University of Florida in Gainesville explains that Carpenter and his team have worked on this lake system for many years, and thus they understand it intimately. Ecologists don’t understand most other ecosystems nearly as well, he says, and so may find it harder to pin down the appropriate early warning signals in other systems.

:: Read original here ::