Alien rats take on prey’s role

Invasive rats are compensating for the loss of native pollinators in New Zealand, scientists report.

The rats, which are responsible for devastating the native pollinator populations, are attracted to the flowers for their nectar.

The results could mean that the decline of pollinating animals worldwide does not spell the end for all native plants.

The results are published in a Royal Society journal.

Almost 90% of the world’s flowering plants are pollinated by animals.

Insect pollination alone is estimated to be worth £141bn ($224bn) each year, and according to a report from the UN Food and Agriculture Organization (FAO) bees pollinate over two-thirds of the world’s crops.

So the decline of the world’s pollinating animals has unsurprisingly sparked concerns among farmers and governments about lower yields and serious long-term food shortages.

Conservationists also predict the loss of many animal-pollinated plants.

“New Zealand offers a really interesting and rare opportunity to look at what the consequences of species extinction [are] for… pollination,” explained conservation biologist David Wilcove from Princeton University, US.

“We have this situation where almost all of the native vertebrates in New Zealand – birds, bats and reptiles – have disappeared from the North Island… largely due to predation by rats,” he added.

But a small patch of pristine New Zealand woodland still exists, affording the researchers the opportunity to investigate the impact of losing key pollinating species on endemic plant species.

Dr Wilcove, and his then Princeton colleague David Pattemore, set out to study three plants: the red-flowered Metrosideros and Knightia, and the purple-flowered Veronica.

What the duo didn’t expect to see was that on the mainland, where the plants were no longer visited by traditional pollinating species, rats, and a recently colonising bird, were doing the job instead.

And for two of the three plant species, the invasive species were doing a comparable job to the native pollinators.

“I was quite startled by it,” said Dr Wilcove.

He explains that, in general, this type of compensation is more likely to happen for flowering plants that are pollinated by many different animals.

But for plants that rely on a very specialist pollinator, the loss of its sole pollinating animal still spells doom for the species.

Most crops, Dr Wilcove suspects, are pollinated by multiple species, and so there might be room for one pollinator to be replaced by another.

So for crop species, these findings should be encouraging, he suggested.

“I think this result should at least force people to think more carefully about what possible beneficial role the non-native [species] are having… and perhaps develop a slightly different control strategy,” he added.

Jawbones are ‘shaped by diet’, a study finds

Diet has shaped human jaw bones, a study suggests – a result that could help explain why many people suffer from overcrowded teeth.

The study has shown that jaws grew shorter and broader as humans took on a more pastoral lifestyle.

Before this, developing mandibles were probably strengthened to give hunter-gatherers greater bite force.

The results were published in the Proceedings of the National Academy of Sciences.

“This is a fascinating study which challenges the common perception that there has been little recent change in the morphology of humans,” said anthropologist Jay Stock from the University of Cambridge.

Many scientists have suggested that the range of skull shapes that exist within our species is the result of exposure to different climates, while others have argued that chance played more of a role in creating the diversity we see in people’s profiles.

The new data, collected from over 300 skulls, across 11 populations, shows that jaws shortened and widened as humans moved from hunting and gathering to a more sedentary way of life.

The link between jaw morphology and diet held true irrespective of where people came from in the world, explained anthropologist Noreen von Cramon-Taubadel from the University of Kent.

Concurrently crooked

It would be tempting to conclude that this is evidence for convergent evolutionary change – where jaw bones evolved to be shorter and broader multiple, independent times – she told BBC News.

But the sole author of the paper suggested that the changes in human skulls are more likely driven by the decreasing bite forces required to chew the processed foods eaten once humans switched to growing different types of cereals, and to milking and herding animals, about 10,000 years ago.

“As you are growing up… the amount that you are chewing, and the pressure that your chewing muscles and bone [are] under, will affect the way that the lower jaw is growing,” explained Dr von Cramon-Taubadel.

She thinks that the shorter jaws of farmers mean that they have less space for their teeth relative to hunter-gatherers, whose jaws are longer.

Teeth-pulling tale

“I have had four of my pre-molars pulled and that is the only reason that my teeth fit in my mouth,” said Dr von Cramon-Taubadel.

Ever since that time, she has wondered why so many people suffer with teeth-crowding.

“I think that’s the reason why this result resonates with people,” she said.

Dr Stock added: “[The finding] is particularly important in that it demonstrates that variation that we find in the modern human skeletal system is not solely driven by population history and genetics.”

These results fit with previous evidence of both a reduction in tooth and body size as humans moved to a more pastoral way of life.

It also helps explain why studies of captive primates show that these animals tend to have more problems with teeth misalignment than wild individuals.

Further evidence comes from experimental studies that show that hyraxes – rotund, short-tailed rabbit-like creatures – have smaller jaws when fed on soft food compared to those fed on their normal diet.

CO2 climate impacts reassessed

Global temperatures could be less sensitive to changing atmospheric carbon dioxide (CO2) levels than previously thought, a study suggests.

The researchers said people should still expect to see “drastic changes” in climate worldwide, but that the risk was a little less imminent.

The results are published in Science.

The study is the latest to derive a value for “climate sensitivity” – the temperature rise for a doubling of CO2 concentrations – from palaeontology.

Previous studies have produced a mean value around 3C; but the new analysis concludes it is somewhat lower, around 2.3C.

More accurate analyses of climate sensitivity from historical data should allow for development of more accurate climate models that forecast the future.

The new analysis uses palaeoclimate data going back to the latter stages of the most recent Ice Age, 21,000 years ago, and a computer model.

Lead author Andreas Schmittner from Oregon State University, US, explained that by looking at how surface temperatures changed during a period when humans were having no impact on global temperatures, his team showed that it had not been as cold as previous estimates had suggested.

“This implies that the effect of CO2 on climate is less than previously thought,” he explained.

The researchers suggest that this finding can reduce uncertainty in future climate projections, though they do still give a range rather than a precise value.

The new models predict that given a doubling of CO2 from pre-industrial levels, the Earth’s surface temperatures will rise by 1.7C to 2.6C (3.1F to 4.7F), with a mean value of 2.3C.

That is a much tighter range than the one produced by the Intergovernmental Panel on Climate Change’s (IPCC) 2007 report, which suggested a rise of 2.0-4.5C, with a mean of about 3C.
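The “temperature rise for a doubling of CO2” framing implies warming that scales roughly logarithmically with concentration. As a rough illustration (a minimal sketch: the logarithmic approximation, and the ~280 ppm pre-industrial and ~390 ppm contemporary concentrations, are standard outside figures, not from the study):

```python
import math

def equilibrium_warming(concentration_ratio, sensitivity_c=2.3):
    """Equilibrium warming in deg C for a given CO2 concentration ratio,
    assuming warming scales with the logarithm of concentration and
    using the study's mean sensitivity of 2.3C per doubling."""
    return sensitivity_c * math.log2(concentration_ratio)

print(equilibrium_warming(2.0))                  # 2.3 – a doubling, by definition
print(round(equilibrium_warming(390 / 280), 2))  # 1.1 – ~390 ppm vs ~280 ppm pre-industrial
```

Under this approximation, each further doubling adds the same increment, which is why the single “sensitivity” number summarises the whole response.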

More time?

The authors stress the results do not mean that the threat from human-induced climate change should be treated any less seriously, explained palaeoclimatologist Antoni Rosell-Mele from the Autonomous University of Barcelona, a member of the team that came up with the new estimates.

But it does mean that to induce large-scale warming of the planet, leading to widespread catastrophic consequences, we would have to increase CO2 more than we are going to do in the near future, he said.

“But we don’t want that to happen at any time, right?” he remarked.

“At least, given that no one is doing very much around the planet [about] mitigating CO2 emissions, we have a bit more time.”

Whether these results mean that the global temperatures would be less responsive to falling CO2, if emissions do fall, is unclear.

“I don’t think we know that, to be honest,” remarked Dr Rosell-Mele.

Gabriele Hegerl from the University of Edinburgh is cautious about the result in her perspective piece published in the same issue of Science.

She says that this is just one particular climate model, and “future work with a range of models would serve to strengthen the result”.

Climatologist Andrey Ganopolski from the Potsdam Institute for Climate Impact Research, Germany, went further, saying he would not make such a strong conclusion based on this data.

“The results of this paper are the result of the analysis of [a] cold climate during the glacial maximum (the most recent ice age),” he told BBC News.

“There is evidence that the relationship between CO2 and surface temperatures is likely to be different [during] very cold periods than warmer [periods].”

Scientists, he said, would therefore prefer to analyse periods of the Earth’s history that are much warmer than now when making projections about future temperatures.

However, although good data exists for the last million years, temperatures during this time have been either similar to present, or colder.

“One should be very careful about using cold climates to [construct] the future,” he added.

Jupiter moon Europa ‘has shallow lakes’

Scientists have found the best evidence yet for water just beneath the surface of Jupiter’s icy moon, Europa.

Analysis of the moon’s surface suggests plumes of warmer water well up beneath its icy shell, melting and fracturing the outer layers.

The results, published in the journal Nature, predict that small lakes exist only 3km below the crust.

Any liquid water could represent a potential habitat for life.

From models of magnetic forces, and images of its surface, scientists have long suspected that a giant ocean, roughly 160km (100 miles) deep, lies 10-30km beneath the ice crust.

Many astrobiologists have dreamed of following in the footsteps of Arthur C Clarke’s fictional character David Bowman, who, in the novel Odyssey Two, discovers aquatic life-forms in the deep Europan sea.

But punching holes through the moon’s thick, icy outer layers has always seemed untenable.

The discovery of shallow liquid water by an American team makes a space mission to recover water from the moon much more plausible.

Shallow seas

The presence of shallow lakes also means that surface waters are probably vigorously mixing with deeper water. The icy eddies could transfer nutrients between the surface water and the ocean’s depths.

“That could make Europa and its ocean more habitable,” said lead author Britney Schmidt from the University of Texas at Austin, US, who analysed images collected by the Galileo spacecraft launched in 1989.

Glaciologists have been studying the surface of Europa for many years, trying to work out what formed its scarred, fractured surface.

“By looking at Antarctica, where we see similar [features] – glaciers, ice shelves – we can infer something about the processes that are happening on Europa,” said glaciologist Martin Siegert from the University of Edinburgh.

He explained that the new study tells us how upwelling of warmer water causes melting of surface ice, forming cracks.

“You get freezing [water] between the cracks… so you end up with the existing ice cemented in with new ice.”

“The underside then freezes again, which causes the uplifting; it’s pretty neat,” Dr Siegert told BBC News.

The US and Europe are working on missions to Europa, and Jupiter’s other moons, which they hope to launch either late this decade or early in the 2020s.

Europa

  • Europa was discovered – together with three other satellites of Jupiter – by Italian astronomer Galileo Galilei in January 1610.
  • The icy moon is 350 million miles from Earth, and is one of 64 Jovian satellites.
  • In the 1990s, Nasa’s Galileo probe sent pictures back of its surface.
  • Europa has a small metal core, surrounded by a large layer of rock.
  • The surface is thought to consist of an ocean of liquid water covered by a thick layer of ice.

Fukushima fallout fears over Japan farms

New research has found that radioactive material in parts of north-eastern Japan exceeds levels considered safe for farming.

The findings provide the first comprehensive estimates of contamination across Japan following the nuclear accident in 2011.

Food production is likely to be affected, the researchers say.

The results are reported in the Proceedings of the National Academy of Sciences (PNAS) journal.

In the wake of the accident at Japan’s Fukushima nuclear power plant, radioactive isotopes were blown over Japan and its coastal waters.

Fears that agricultural land would be contaminated prompted research into whether Japanese vegetables and meat were safe to eat.

An early study suggested that harvests contained levels of radiation well under the safety limit for human consumption.

Contaminated crops

Now, an international team of researchers suggests this result deserves a second look.

To estimate contamination levels, Teppei Yasunari, from the Universities Space Research Association in the US state of Maryland, and his colleagues, took measurements of the radioactive element caesium-137 in soil and grass from all but one of Japan’s 47 regions and combined these results with simulations based on weather patterns following the meltdown.

Caesium-137 lingers in the environment for decades, and so is more of a concern than other radioactive elements released in the cloud of steam when the reactors’ cooling systems failed, leading to explosions.

The team found that the area of eastern Fukushima had levels of the radioactive element that exceeded official government limits for arable land. Under the Japanese Food Sanitation Law, 5,000 becquerels per kg (Bq/kg) of caesium is considered the safe limit in soil (caesium-137 makes up about half of total radioactive caesium, and therefore its safe limit is 2,500 Bq/kg).

The researchers estimate that caesium-137 levels close to the nuclear plant were eight times the safety limit, while neighbouring regions were just under this cut-off; the rest of Japan was well below the safety limit, averaging about 25 Bq/kg.
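The limit arithmetic can be laid out explicitly (a minimal sketch using only the figures quoted in the article; the site readings are illustrative, not measured values):

```python
# Japanese Food Sanitation Law limit for total radioactive caesium in soil,
# and the caesium-137 share of that total, both as quoted in the article.
TOTAL_CAESIUM_LIMIT = 5000   # Bq/kg
CS137_FRACTION = 0.5         # caesium-137 is about half of total radiocaesium

cs137_limit = TOTAL_CAESIUM_LIMIT * CS137_FRACTION   # 2500.0 Bq/kg

# Hypothetical caesium-137 soil readings (Bq/kg), for illustration only:
readings = {"close to the plant": 8 * cs137_limit, "national average": 25}
for site, level in readings.items():
    verdict = "exceeds" if level > cs137_limit else "is under"
    print(f"{site}: {level:.0f} Bq/kg {verdict} the {cs137_limit:.0f} Bq/kg limit")
```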

Relatively low contamination levels in western Japan could be explained by mountain ranges sheltering those regions from the dispersal of radioactive material, the authors said.

Food production in the most contaminated regions, the researchers write, is likely to be “severely impaired”, and Fukushima’s neighbouring prefectures, such as Iwate, Miyagi, Yamagata, Niigata, Tochigi, Ibaraki and Chiba, are likely to be affected as well.

“Some neighbouring prefectures… are partially close to the limit under our upper bound estimate and, therefore, local-scale exceedance is likely given the strong spatial variability of [caesium-137] deposition,” the researchers explained in PNAS.

They urge the Japanese government to carry out a more thorough assessment of radioactive contamination across Japan before considering future decontamination plans.

A second study, published in the same edition of PNAS, collected over a hundred soil samples from within 70km of the Fukushima plant, and found similarly high caesium-137 levels across the Fukushima prefecture and its neighbouring regions.

Radioecologist Nick Beresford from the Centre for Ecology and Hydrology in Lancaster explained that once in the soil, caesium becomes bound to mineral components, which limits its uptake into plants.

However, this process depends on the soil type. “Caesium stays mobile for longer in organic soils, hence why England and Wales still have some post-Chernobyl restrictions in upland areas,” he told BBC News.

Ploughing and some fertilisers can help farmers reduce plants’ uptake of the dangerous elements, and binding agents can be added to animal feed to reduce uptake from the gut, he added.

Local recordings

New figures on background radiation levels recorded 60km northwest of the Daiichi power plant have also been released this week by Japanese physicist Tsuneo Konayashi from Fukushima Medical University.

Dr Konayashi saw an initial spike reaching over nine times the usual levels hours after the explosions at the plant; five months later, levels had dropped to one and a half times those expected.

He continues to monitor radiation levels and distribute his data around campus.

Becquerels and Sieverts

  • A becquerel (Bq), named after French physicist Henri Becquerel, is a measure of radioactivity
  • A quantity of radioactive material has an activity of 1Bq if one nucleus decays per second – and 1kBq if 1,000 nuclei decay per second
  • A sievert (Sv) is a measure of radiation absorbed by a person, named after Swedish medical physicist Rolf Sievert
  • A milli-sievert (mSv) is one thousandth of a sievert
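The unit definitions in the box translate directly into arithmetic (a minimal sketch; the example figures are illustrative):

```python
def activity_bq(decays, seconds):
    """Activity in becquerels: nuclear decays per second, averaged
    over an observation window."""
    return decays / seconds

print(activity_bq(1000, 1))   # 1000.0 Bq, i.e. 1 kBq

def msv_to_sv(msv):
    """A milli-sievert is one thousandth of a sievert."""
    return msv / 1000.0

print(msv_to_sv(1.0))         # 0.001 Sv
```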

Primates leapt to social living

Scientists may be a step closer to understanding the origins of human social behaviour.

An analysis of over 200 primate species by a University of Oxford team suggests that our ancestors gave up their solitary existence when they shifted from being nocturnal creatures to those that are active during the day.

It is likely communal living was adopted to protect against daytime predators, the researchers say.

The results are published in Nature.

From work on social insects and birds, some biologists have suggested that social groups begin to form when young do not leave their natal ground, but instead hang around and help raise their siblings.

Now, the latest evidence from primates suggests that this might not have been the case for our ancestors.

Leaping to sociality

By looking at whether closely related species share similar social structures, the Oxford team of evolutionary biologists shows that a common history is important in shaping the way animals behave in a group.

The team pinpointed the shift from non-social to social living to about 52 million years ago; a switch that appears to have happened in one step, and coincided with a move into daylight.

It did not coincide with a change in family dynamics or female bonding, which emerged much later, at about 16 million years ago.

“If you are a small animal active at night then your best strategy to avoid predation is to be difficult to detect,” explained Oxford’s Suzanne Shultz, who led the study.

“Once you switch to being active during the day, that strategy isn’t very effective, so an alternative strategy to reduce the risk of being eaten is to live in social groups,” she told BBC News.

Dr Shultz thinks that the move to day-time living in ancient primates allowed animals to find food more quickly, communicate better, and travel faster through the forest.

The link between sociality and a switch to daytime living might have been missed until now, she suspects, because biologists interested in this question have tended to work with Old World monkeys, like baboons, whose female-bonded groups are not characteristic of many primate species.

Flexibly social

Human societies likely descended from similar large, loosely aggregated creatures, Dr Shultz explained, but the key difference, she pointed out, is that our closest cousins’ societies do not vary within a species, while humans’ do.

“In human societies we have polygyny… we have monogamy, and in some places we have females leaving the group they were born in, and in others males leave,” she said.

Why this difference exists is still unclear.

Ancient horses’ spotted history reflected in cave art

Scientists have found evidence that leopard-spotted horses roamed Europe 25,000 years ago alongside humans.

Until now, studies had only recovered the DNA of black and brown coloured coats from fossil specimens.

New genetic evidence suggests “dappled” horses depicted in European cave art were inspired by real life, and are less symbolic than previously thought.

The findings are published in the Proceedings of the National Academy of Sciences.

Horses, which were the most abundant large mammal roaming Eurasia 25,000 years ago, were a key component of early European diets.

So it is not surprising that the cave art of this time had a certain equestrian flair – horses make up 30% of the animals depicted in European cave paintings from this era.

Biologists interested in the diversity of European animals before the last Ice Age want to know how accurately these early artistic impressions portrayed the colouring of the horses that lived alongside the ancient humans.

“It was critical to ensure that the horse depictions from the cave paintings were based on real-life experiences rather than products of the imagination,” explained lead author Arne Ludwig from The Leibniz Institute for Zoo and Wildlife Research in Berlin.

In previous work, Dr Ludwig, and his colleagues, recovered only the DNA of black and brown coat colours from the prehistoric horse bones.

But the dappled coats of the 25,000-year-old horses depicted at the Pech Merle cave complex in France convinced the team to take a second look.

Fur coats

By revisiting the fossil DNA of 31 horse specimens collected from across Europe, from Siberia to the Iberian Peninsula, the researchers found that six of the animals carried a mutation that causes modern horses to have white and black spots.

Of the remaining 25 specimens, 18 were brown coloured and six were black.

Dr Ludwig explained that all three of the horse colours – black, brown and spotted – depicted in the cave paintings have now been found to exist as real coat-colours in the ancient horse populations.

The researchers say that these three colours probably provided enough variation for humans to create the diversity of coat colours and patterns seen in modern horses.

The domestication of horses, which produced modern breeds, is thought to have begun about 4,600 years ago in the steppe between modern Ukraine and Kazakhstan.

Pioneers boast high fertility, say scientists

Scientists have shown that women who were first to settle in a new land had more children and grandchildren than those who followed.

Researchers analysed the family trees of French settlers who colonised Canada in the 17th and 18th centuries.

Their results could help to explain why some rare genetic diseases are common in communities established by migrations.

The findings have been published in the journal Science.

The team of researchers from Canada and Europe relied on data collected by the parish councils of Charlevoix and Saguenay Lac Saint-Jean, a region 170km north of Quebec, Canada.

The towns not only boast dairy farms, charming villages and sandy beaches, but also some of the best-kept marriage records ever – comprising more than a million people.

By building a picture of marriages and how many children the pairings produced, the researchers showed that women who arrived as part of the first wave of immigration had 15% more children than those who arrived a generation later.

The pioneering women married younger and benefited from scooping up the best local resources, they added.

But the study also found that the pioneering women’s children also had more children.

Lead author Laurent Excoffier, from the University of Bern in Switzerland, explained that the children of women at the front of the wave inherited their mother’s higher rate of fertility.

Yet, the researchers added, there was no such correlation in generations that arrived 30 years behind the first wave.

Dr Excoffier drew parallels with cane toads. Scientists have observed that the toads at the edge of their range have bigger front legs and stronger back legs; all the better to invade new areas.

And when toads at the frontiers breed, their offspring inherit these longer, stronger limbs.

Such an effect is not unexpected, but until now no one has seen this phenomenon in humans.

“This was a rare chance to study a relatively recent human migration,” said co-author Damian Labuda, a geneticist from the University of Montreal, Canada.

Population geneticist Montgomery Slatkin from the University of California, who was not involved in the work, called the study one of the “most interesting, detailed studies” he had seen.

“I think what happened [here] could easily have happened in other populations,” he added.

The findings suggest that families at the front of the wave of migration contributed more to the contemporary gene pool than those that were slower to arrive, explained Dr Labuda.

This could help explain why some rare genetic diseases are more common than expected in the Charlevoix and Saguenay Lac Saint-Jean regions.

That is because any disease-causing mutations carried by people at the frontier would be passed on to their descendants, who make up a large proportion of subsequent generations in the population.

Icelandic rocks could have steered Vikings

Vikings used rocks from Iceland to navigate the high seas, suggests a new study.

In Norse legends, sunstones are said to have guided seafarers to North America.

Now an international team of scientists report in the journal the Proceedings of the Royal Society A that the Icelandic spars behave like mythical sunstones and polarise light.

By holding the stones aloft, voyaging Vikings could have used them to find the sun in the sky.

The Vikings were skilled navigators and travelled thousands of kilometres between Northern Europe and North America.

But without a magnetic compass, which was not invented until the 13th Century, they must have relied on other navigational aids.

Without the stars, which would have been out of sight during the constant daylight of the summer months, the sun would have been their best bet to set their course by.

But on cloudy or foggy days the seafarers would have been left with only the direction of the wind and swell to guide their way.

Through the fog

Norse legends tell of seafarers lifting stones to the sky to spy the direction of the sun when it was hidden by cloud cover.

Earlier this year, a study in the Philosophical Transactions of the Royal Society B reviewed the evidence that naturally forming crystals can selectively block light of one polarisation – the restriction of light waves to certain directions of oscillation.

The new result shows that Icelandic spars, which are formed from crystallised calcium carbonate, are good polarisers and could have been the raw material of the mythical sunstones.
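How a polarising crystal could locate the sun can be sketched with Malus’s law, a standard optics relation (an outside textbook formula, not something the study states): the light transmitted by a polariser falls off as cos² of the angle between the crystal’s axis and the light’s polarisation, so rotating the stone for maximum brightness reveals the sky’s polarisation direction, from which the sun’s bearing can be inferred.

```python
import math

def transmitted_intensity(incident, angle_deg):
    """Malus's law: intensity passed by a polariser at angle_deg (degrees)
    between the polariser axis and the light's polarisation direction."""
    return incident * math.cos(math.radians(angle_deg)) ** 2

# Rotating the crystal in 5-degree steps and picking the brightest
# orientation recovers the polarisation direction (here, 0 degrees):
best_angle = max(range(0, 180, 5), key=lambda a: transmitted_intensity(1.0, a))
print(best_angle)   # 0
```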

The spars can be easily cleaved and crafted into the rhombus shape required for the polarising effect, and the discovery of one on the wreck of an Elizabethan ship that sank in 1592 “looks very promising”, the authors report.
