
For weather information from across the nation, please check out our home site National Weather Outlook. Thanks!


Friday, February 28, 2014

Terrestrial ecosystems at risk of major shifts as temperatures increase

Over 80% of the world's ice-free land is at risk of profound ecosystem transformation by 2100, a new study reveals. "Essentially, we would be leaving the world as we know it," says Sebastian Ostberg of the Potsdam Institute for Climate Impact Research, Germany. Ostberg and collaborators studied the critical impacts of climate change on landscapes and have now published their results in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

The researchers state in the article that "nearly no area of the world is free" from the risk of climate change transforming landscapes substantially, unless mitigation limits warming to around 2 degrees Celsius above preindustrial levels.

Ecosystem changes could include boreal forests being transformed into temperate savannas, trees growing in the freezing Arctic tundra or even a dieback of some of the world's rainforests. Such profound transformations of land ecosystems have the potential to affect food and water security, and hence impact human well-being just like sea level rise and direct damage from extreme weather events.

The new Earth System Dynamics study indicates that up to 86% of the remaining natural land ecosystems worldwide could be at risk of major change in a business-as-usual scenario (see note). This assumes that the global mean temperature will be 4 to 5 degrees warmer at the end of this century than in pre-industrial times -- given many countries' reluctance to commit to binding emissions cuts, such warming is not out of the question by 2100.

"The research shows there is a large difference in the risk of major ecosystem change depending on whether humankind continues with business as usual or if we opt for effective climate change mitigation," Ostberg points out.

But even if the warming is limited to 2 degrees, some 20% of land ecosystems -- particularly those at high altitudes and high latitudes -- are at risk of moderate or major transformation, the team reveals.

The researchers studied over 150 climate scenarios, looking at ecosystem changes in nearly 20 different climate models for various degrees of global warming. "Our study is the most comprehensive and internally consistent analysis of the risk of major ecosystem change from climate change at the global scale," says Wolfgang Lucht, also an author of the study and co-chair of the research domain Earth System Analysis at the Potsdam Institute for Climate Impact Research.

Few previous studies have looked into the global impact of rising temperatures on ecosystems because of how complex and interlinked these systems are. "Comprehensive theories and computer models of such complex systems and their dynamics up to the global scale do not exist."

To get around this problem, the team measured simultaneous changes in the biogeochemistry of terrestrial vegetation and the relative abundance of different vegetation species. "Any significant change in the underlying biogeochemistry presents an ecological adaptation challenge, fundamentally destabilising our natural systems," explains Ostberg.

The researchers defined a parameter to measure how far apart a future ecosystem under climate change would be from the present state. The parameter encompasses changes in variables such as the vegetation structure (from trees to grass, for example), the carbon stored in the soils and vegetation, and freshwater availability. "Our indicator of ecosystem change is able to measure the combined effect of changes in many ecosystem processes, instead of looking only at a single process," says Ostberg.
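One way to picture such a combined indicator is a normalized distance between the present and future state of an ecosystem across several variables at once. The sketch below is illustrative only: the variable names, values, and scales are hypothetical, and the formula is a generic root-mean-square distance, not the study's published metric.

```python
import math

def ecosystem_change(present, future, scales):
    """Root-mean-square of per-variable changes, each normalized by a
    characteristic scale so that unlike quantities can be combined."""
    terms = [((future[k] - present[k]) / scales[k]) ** 2 for k in present]
    return math.sqrt(sum(terms) / len(terms))

# Hypothetical grid cell: tree cover fraction, soil carbon (kg C/m^2),
# annual runoff (mm) -- scales chosen as typical present-day magnitudes.
present = {"tree_cover": 0.6, "soil_carbon": 12.0, "runoff": 300.0}
future  = {"tree_cover": 0.3, "soil_carbon": 9.0,  "runoff": 360.0}
scales  = {"tree_cover": 1.0, "soil_carbon": 12.0, "runoff": 300.0}

change = ecosystem_change(present, future, scales)
```

Because each term is normalized before being combined, a 30-point drop in tree cover and a 20% rise in runoff contribute on a comparable footing, which is the point of measuring "many ecosystem processes instead of a single one."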

He hopes the new results can help inform the ongoing negotiations on climate mitigation targets, "as well as planning adaptation to unavoidable change."

Note

Even though 86% of land ecosystems are at risk if global temperature increases by 5 degrees Celsius by 2100, it is unlikely that all of these areas will actually be affected, since that would require the worst-case scenario from each climate model to come true.


View the original article here

Thursday, February 27, 2014

Uncovering the tricks of nature's ice-seeding bacteria

Like the Marvel Comics superhero Iceman, some bacteria have harnessed frozen water as a weapon. Species such as Pseudomonas syringae have special proteins embedded in their outer membranes that help ice crystals form, and they use them to trigger frost formation at warmer than normal temperatures on plants, later invading through the damaged tissue. When the bacteria die, many of the proteins are wafted up into the atmosphere, where they can alter the weather by seeding clouds and precipitation.

Now scientists from Germany have observed for the first time the step-by-step, microscopic-level action of P. syringae's ice-nucleating proteins locking water molecules in place to form ice. The team will present their findings at the AVS 60th International Symposium and Exhibition, held Oct. 27 -- Nov. 1 in Long Beach, Calif.

"Ice nucleating proteins are the most effective ice nucleators known," said Tobias Weidner, leader of the surface protein group at the Max Planck Institute for Polymer Research. The proteins jump-start the process of ice crystal formation so well that dried ice-nucleating bacteria are often used as additives in snowmakers.

Although scientists discovered ice-nucleating proteins decades ago, little is known about how they actually work. Weidner and his team tackled the mystery with a powerful tool called spectroscopy that can decipher patterns in the interaction between light and matter to visualize the freezing process in layers of materials only a few molecules thick.

The researchers prepared a sample of fragments of P. syringae bacteria, which they spread over water to form a surface film. As the temperature was lowered from room temperature to near-freezing levels, the scientists probed the interface between the bacterial proteins and the water with two laser beams. The beams combined within the sample and a single beam was emitted back, carrying with it information about how the protein and water molecules move and interact.

By analyzing the returning light beam's frequency components, Weidner and his colleagues found a surprisingly dramatic result: as the temperature approached zero degrees Celsius, the water molecules at the ice-nucleating protein surface suddenly became more ordered and their motions became sluggish. They also found that thermal energy was very efficiently removed from the surrounding water. The results indicate that ice-nucleating proteins might have a specific mechanism for removing heat and ordering water that is activated at low temperatures, Weidner said.

"We were very surprised by these results," Weidner added. "When we first saw the dramatic increase of water order with lower temperatures we believed it was an artifact." The movements of the water molecules near the ice-nucleating protein was very different than the way water had interacted with the many other proteins, lipids, carbohydrates, and other biomolecules the team had studied.

Recent studies have shown that large numbers of bacterial ice-nucleating proteins become airborne over areas like the Amazon rainforest and can spread around the globe. The proteins are among the most effective promoters of ice particle formation in the atmosphere, and have the potential to significantly influence weather patterns. Learning how P. syringae triggers frost could help teach researchers how ice particle formation occurs in the upper atmosphere.

"Understanding at the microscopic level -- down to the interaction of specific protein sites with water molecules -- the mechanism of protein-induced atmospheric ice formation will help us understand biogenic impacts on atmospheric processes and the climate," Weidner said. For a more detailed picture of protein-water interactions it will also be important to combine their spectroscopic results with computer models, he said.

Cite This Page:

American Institute of Physics. (2013, October 23). Uncovering the tricks of nature's ice-seeding bacteria. ScienceDaily. Retrieved February 1, 2014 from www.sciencedaily.com/releases/2013/10/131023141119.htm


Wednesday, February 26, 2014

Near-future heat and precipitation extremes predicted

Long-term and average changes are the focus of the discussion on climate change: globally, as the various scientific climate models all predict, it will be warmer on Earth at the end of the century. For decision-makers and people affected by climate change, however, information on the frequency and intensity of extreme events such as heat and cold extremes, heavy rainfall or dry spells is at least as important as average values. Moreover, for them projections for the next ten, twenty, thirty or forty years are usually more relevant than the long-term view to the end of the century. The problem: for the short and medium term, the models yield widely differing results.

Does that mean that the models are not working? No, says Erich Fischer, a senior scientist at the Institute for Atmospheric and Climate Science at ETH Zurich, who has been investigating the causes of the major discrepancies in the short and medium-term projections. In a study just published in the journal "Nature Climate Change," he concludes that they are mostly caused by natural, chaotic and thus unpredictable fluctuations in the climate system. There is certainly potential for improving climate models, Fischer says. "However, even if we had a perfect model for the medium-term, there would still be uncertainties."

Butterfly effect simulated

The researchers obtained their results from a simulation of the well-known butterfly effect, which states that slightly different starting conditions can vastly influence a development in the longer term ("Does the flap of a butterfly's wings in Brazil set off a tornado in Texas?"): the scientists calculated the future climate twenty-one times using one of the leading climate models, deliberately changing the temperatures on Day 1 of the calculation ever so slightly for every point on Earth -- by a maximum of one hundred billionths of a degree Celsius.

This revealed that the differences in the maximum and minimum annual temperatures and in intense precipitation between 2016 and 2035 were almost as great across the realisations of this one model as the known differences between the various models. From these results the researchers concluded that the majority of the differences are due to the starting conditions, and thus to chaos, not to the uncertainties of the models.
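The ensemble idea can be sketched with the classic Lorenz-63 system, the textbook illustration of the butterfly effect (a toy model, not the climate model the study used): two runs that differ by a billionth in one initial coordinate are indistinguishable at first and then diverge completely.

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def trajectory(state, steps):
    traj = [state]
    for _ in range(steps):
        state = lorenz_step(state)
        traj.append(state)
    return traj

a = trajectory((1.0, 1.0, 1.0), 10000)
b = trajectory((1.0 + 1e-9, 1.0, 1.0), 10000)  # perturbed by a billionth

early = abs(a[200][0] - b[200][0])  # shortly after the start: still tiny
# maximum x-difference over the final stretch: the runs have decorrelated
late = max(abs(pa[0] - pb[0]) for pa, pb in zip(a[-2000:], b[-2000:]))
```

The tiny `early` difference versus the order-of-the-attractor `late` difference mirrors the study's finding: the perturbed realisations of one model eventually spread almost as widely as different models do.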

What can be predicted and what can't

"Our study reveals that we have to live with uncertainties in local, medium-term projections," says Fischer. A Swiss farmer, for instance, cannot expect any accurate predictions on the changes in climate extremes on the Swiss Central Plateau in the next thirty to forty years, even if it is clear that the heat extremes and periods of heavy rainfall in the long-term trend will be more intense by the end of the century.

However, this does not mean that no scientific projections about the coming decades are possible. The ETH-Zurich scientists have found ways to make such projections -- by considering large regions or the entire world. This enabled them to demonstrate that the intensity of heat extremes and periods of heavy rainfall will not increase equally everywhere on Earth: while heat extremes will become significantly more intense on two thirds of the land surface within three decades, there will be no significant changes in the remaining third. And as far as heavy rainfall is concerned, it will increase by ten per cent or more in a quarter of the area and by less than ten per cent in the remaining three quarters.

Risks predictable

The ETH-Zurich researchers make similar projections for large individual regions such as Europe, the USA, China or Australia. In all these regions, the climate models predict an increase in the intensity of heat waves in the next thirty years and of heavy rainfall in the next fifty years. For institutions with a global focus, such as reinsurance companies or food multinationals, such predictions are extremely useful, even if it is unclear where exactly the extreme events will occur. "The different models agree that changes in extreme weather events will occur and how strong they will be, but not where they will be the strongest. This is largely determined by chaos," says Fischer. In physics, it is common that a single event cannot be predicted but the average behaviour can. Fischer compares it with road traffic: if speed limits are increased, we can predict that there will be more traffic accidents. Where exactly the next accident will take place, however, we cannot tell.

Story Source:

The above story is based on materials provided by ETH Zurich. Note: Materials may be edited for content and length.



Tuesday, February 25, 2014

Satellite trio to explore the Earth's magnetic field

In dense fog, a Russian Rockot rocket cleared the launchpad of the Baikonur Cosmodrome on schedule at 13:02:15 CET on 22 November 2013. In the tip of the rocket: three identical satellites to measure the Earth's magnetic field. A good hour and a half later, at 14:37:48 CET, came the report of success: all three satellites had separated seamlessly from the carrier rocket, and the ground stations in Kiruna (Sweden) and Longyearbyen/Svalbard (Norway) had established radio contact with them. GFZ scientists and invited guests followed the launch of the European Space Agency's SWARM mission via remote transmission from Darmstadt.

Professor Johanna Wanka, Federal Minister of Education and Research, said on the occasion of the perfect start of the mission: "We are very pleased that this European mission has started so smoothly. The magnetic field of the Earth is our shield against cosmic particle radiation. But it is subject to natural fluctuations, from the Earth's interior or eruptions on the Sun. Exploring its workings more thoroughly and recording space weather data more accurately allow us to draw conclusions for life on our planet."

Professor Reinhard Huettl, Chairman of the Board of the GFZ German Research Centre for Geosciences, pointed to a Potsdam success story: "The three satellites are direct developments from the GFZ's CHAMP mission, which was launched in 2000. CHAMP, with its successors GRACE and SWARM, has proven to be the founding father of a whole generation of satellites and space-based measurement methods."

A trio for the magnetic field

SWARM is an ESA mission within its "Living Planet" program. "The satellite swarm -- hence the name -- is to measure the Earth's magnetic field from space with unprecedented precision for at least four years," elaborated Professor Huettl. For this, the three satellites fly in an optimized formation: two satellites (SWARM-A, SWARM-B) fly at an altitude of 450 kilometers, 150 kilometers apart and alongside one another, while the third (SWARM-C) ascends into a higher orbit at 530 kilometers altitude. The reason for this complex formation flight lies in the magnetic field itself: it is generated by the flow of electrically conducting liquid iron in the outer core of the Earth, 2900 kilometers beneath our feet. It is influenced by the conductivity and the dynamics of the overlying mantle (reaching up to 40 kilometers below the Earth's surface). Finally, the magnetized rocks of the Earth's crust contribute to the Earth's magnetic field. In addition, the Sun and currents in near-Earth space influence the Earth's magnetic field from the outside. In order to study these individual contributions, the total signal measured by the satellites needs to be separated into its components. "With its separation of 150 kilometers, the lower-flying SWARM pair can look at the magnetic field of the Earth's crust with a stereo view," explains Professor Hermann Lühr, one of the three Principal Investigators of the mission, member of the SWARM Mission Advisory Group and Head of the German SWARM Project Office at the GFZ. "We can therefore analyze this component with very high accuracy." The third, upper SWARM satellite can in turn precisely determine how the strength of the magnetic field decreases with increasing altitude. Also, over time this satellite flies at a progressively increasing angle to the path of the lower pair. Together, the measurements will give a picture of the Earth's magnetic field with a precision never achieved before.
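The altitude-dependent weakening that the upper satellite samples can be illustrated with the simplest possible model: treating the main field as a dipole, whose magnitude falls off roughly as the cube of the distance from the Earth's center. This is a back-of-the-envelope sketch, not the mission's analysis (the real field is far more structured):

```python
R_EARTH_KM = 6371.0  # mean Earth radius

def dipole_field_ratio(alt1_km, alt2_km):
    """Ratio of dipole field magnitude at altitude alt2 relative to alt1.
    A dipole field falls off as 1/r^3 with distance r from the center."""
    r1 = R_EARTH_KM + alt1_km
    r2 = R_EARTH_KM + alt2_km
    return (r1 / r2) ** 3

# Field at the upper Swarm orbit (530 km) relative to the lower pair (450 km):
ratio = dipole_field_ratio(450.0, 530.0)
print(f"{(1 - ratio) * 100:.1f}% weaker")  # prints "3.4% weaker"
```

Even this crude estimate shows why the vertical separation of the constellation is informative: the few-percent difference between the orbits directly constrains how the field decays with height.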

Almost as a side effect, the possibility arises to observe space weather more accurately. This refers to flares on our Sun and the magnetic storms they trigger, which can interfere with or even paralyze our technical civilization. For example, a strong solar storm in 1989 caused a breakdown of the electricity supply in Canada.

About the satellites

The three SWARM satellites together cost about 220 million euros; each weighs 500 kg. Inside the carrier rocket, a four-meter-long measuring arm is folded against the back of the five-meter-long satellite body. This boom is unfolded several hours after the deployment of the satellite, once the on-board operating system has been initiated. The reason: the surface of the satellite is covered with solar cells for the power supply, and the magnetic field generated by their current would interfere with the measurement. The magnetic field instruments are therefore mounted on the measuring arm.

At the tip of the boom sits the particularly sensitive instrument for measuring the magnetic field strength; the sensors for determining the direction of the magnetic field are at its center. In the same position, three star sensors allow the satellite to determine and correct its orientation.

To begin with, the three satellites fly parallel on a north-south path at about 88° inclination. Swarm-C is then slowly deflected by 30° per year and thus continues to fly at an increasing angle to the orbit of Swarm-A and -B.

Cite This Page:

Helmholtz Centre Potsdam - GFZ German Research Centre for Geosciences. (2013, November 22). Satellite trio to explore the Earth's magnetic field. ScienceDaily. Retrieved February 1, 2014 from www.sciencedaily.com/releases/2013/11/131122103703.htm


Monday, February 24, 2014

Picture of how our climate is affected by greenhouse gases is a 'cloudy' one

The warming effect of human-induced greenhouse gases is a given, but to what extent can we predict its future influence? That is an issue on which science is making progress, but the answers are still far from exact, say researchers from the Hebrew University of Jerusalem, the US and Australia who have studied the issue and whose work has just appeared in the journal Science.

Indeed, one could say that the picture is a "cloudy" one, since the determination of the greenhouse gas effect involves multifaceted interactions with cloud cover.

To some extent, aerosols -- particles such as dust or pollution suspended in the air -- counteract part of the harmful effects of climate warming by increasing the amount of sunlight reflected from clouds back into space. However, the ways in which these aerosols affect climate through their interaction with clouds are complex and incompletely captured by climate models, say the researchers. As a result, the radiative forcing (that is, the disturbance to Earth's "energy budget" from the sun) caused by human activities is highly uncertain, making it difficult to predict the extent of global warming.

And while advances have led to a more detailed understanding of aerosol-cloud interactions and their effects on climate, further progress is hampered by limited observational capabilities and coarse climate models, says Prof. Daniel Rosenfeld of the Fredy and Nadine Herrmann Institute of Earth Sciences at the Hebrew University of Jerusalem, author of the article in Science. Rosenfeld wrote this article in cooperation with Dr. Steven Sherwood of the University of New South Wales, Sydney, Dr. Robert Wood of the University of Washington, Seattle, and Dr. Leo Donner of the US National Oceanic and Atmospheric Administration.

Their recent studies have revealed a much more complicated picture of aerosol-cloud interactions than considered previously. Depending on the meteorological circumstances, aerosols can have dramatic effects of either increasing or decreasing the cloud sun-deflecting effect, the researchers say. Furthermore, little is known about the unperturbed aerosol level that existed in the preindustrial era. This reference level is very important for estimating the radiative forcing from aerosols.

Also needing further clarification is the response of the cloud cover and organization to the loss of water by rainfall. Understanding of the formation of ice and its interactions with liquid droplets is even more limited, mainly due to poor ability to measure the ice-nucleating activity of aerosols and the subsequent ice-forming processes in clouds.

Explicit computer simulations of these processes even at the scale of a whole cloud or multi-cloud system, let alone that of the planet, require hundreds of hours on the most powerful computers available. Therefore, a sufficiently accurate simulation of these processes at a global scale is still impractical.

Recently, however, researchers have been able to create groundbreaking simulations in which models were formulated with simplified schemes of cloud-aerosol interactions. This approach offers the potential for model runs that resolve clouds on a global scale for time scales of up to several years, but climate simulations on the scale of a century are still not feasible. The models are also too coarse to resolve many of the fundamental aerosol-cloud processes at the scales on which they actually occur. Improved observational tests are essential for validating the results of simulations and ensuring that modeling developments are on the right track, say the researchers.

While it is unfortunate that further progress on understanding aerosol-cloud interactions and their effects on climate is limited by inadequate observational tools and models, achieving the required improvement in observations and simulations is within technological reach, the researchers emphasize, provided that the financial resources are invested. The level of effort, they say, should match the socioeconomic importance of what the results could provide: lower uncertainty in measuring human-made climate forcing and better understanding and predictions of future impacts of aerosols on our weather and climate.



Sunday, February 23, 2014

Southeast U.S. should prepare for wild weather from climate change, expert says

People who live in the southeastern United States should begin to prepare for more drastically changing weather conditions -- everything from heat waves to poorer air quality -- caused by climate change, according to a new book, edited by a University of Florida researcher.

The book, which UF's Keith Ingram helped write, is titled "Climate Change of the Southeast United States: Variability, Change, Impacts and Vulnerability." Ingram was the book's lead editor.

Principal authors and editors, including Ingram, unveiled the book Tuesday. Ingram is director of the Southeast Climate Consortium and an associate research scientist with UF's Institute of Food and Agricultural Sciences.

"The Southeast already experiences extreme weather events including floods, droughts, heat waves, cold outbreaks, winter storms, severe thunderstorms, tornadoes and tropical cyclones. In the future, these events are likely to become more frequent or more severe, causing damage to most of our region's agriculture, stressing our region's water resources and threatening human health," he said. "The sooner we make preparations, the better off we'll be."

As defined in the book, the Southeast includes Florida, Georgia, South Carolina, North Carolina, Virginia, Tennessee, Kentucky, Arkansas, Louisiana, Mississippi, Alabama, the Virgin Islands and Puerto Rico.

Specific findings include:

• Average annual temperatures are projected to increase through the 21st century, with the region's interior projected to warm by as much as 9 degrees Fahrenheit;

• Cold days will become less frequent and the freeze-free season will lengthen by up to a month;

• Temperatures exceeding 95 degrees are expected to increase across the Southeast, and heat waves are expected to become longer by between 97 percent and 234 percent through the end of the century;

• Sea levels will likely rise by an average of 3 feet by the end of this century. Of particular concern is that storm surges will compound impacts of rising sea levels, Ingram said. People will have to raise existing structures and build new structures on filled soil, he said. Many cities and counties will have to build or refit water and sewer plants so they can survive rising waters caused by floods, Ingram said. Many builders, residents and governments are already doing these things, he said.

• While the number of tropical storms is projected to decrease slightly, the number of category 3 to category 5 hurricanes is expected to increase;

• High temperature stresses in summer will become more frequent and damaging to agriculture, and will possibly drive dairy and livestock production farther north. Warm weather during winter months reduces yields of blueberry, peach and other crops that need cool temperatures for flower buds to break, he said.

• Air quality is projected to decline and pollen counts will go up, damaging human health.

Residents of the Southeast should begin to prepare for the likelihood of more frequent extreme weather events, Ingram said.

With 26 percent of the U.S. population living in the Southeast, the region produces 25 percent of the country's carbon dioxide emissions, which are partly responsible for the climate change problem, Ingram said.

"We are a significant contributor, but we can help with the solution," he said.

The Southeast Climate Consortium works with extension agents and farmers to bring them valuable research.

"We work on how to adapt to or mitigate climate change," Ingram said.

Some local governments have agreed to reduce carbon emissions, the authors said Tuesday.

Cite This Page:

University of Florida Institute of Food and Agricultural Sciences. (2013, November 14). Southeast U.S. should prepare for wild weather from climate change, expert says. ScienceDaily. Retrieved February 1, 2014 from www.sciencedaily.com/releases/2013/11/131114113623.htm


Saturday, February 22, 2014

Walden Pond trees leafing out far earlier than in Thoreau's time

Climate-change studies by Boston University biologists show leaf-out times of trees and shrubs at Walden Pond are an average of 18 days earlier than when Henry David Thoreau made his observations there in the 1850s. However, not all plants respond in the same way, the result of which is that native species eventually may be threatened and lose competitive advantage to more resilient invasive shrubs such as Japanese barberry, according to a study published in the new edition of New Phytologist.

"By comparing historical observations with current experiments, we see that climate change is creating a whole new risk for the native plants in Concord," said BU Prof. Richard Primack. "Weather in New England is unpredictable, and if plants leaf out early in warm years, they risk having their leaves damaged by a surprise frost. But if plants wait to leaf out until after all chance of frost is lost, they may lose their competitive advantage."

The study began when Caroline Polgar, a graduate student with Primack, examined Thoreau's unpublished observations of leaf-out times for common trees and shrubs in Concord in the 1850s, then repeated his observations over the past five springs.

"We started to wonder if all trees and shrubs in Concord are equally responsive to warming temperatures in the spring," Polgar said. What she found was surprising. "All species -- no exceptions -- are leafing out earlier now than they did in Thoreau's time," she said. "On average, woody plants in Concord leaf out 18 days earlier now."

In New England, plants have to be cautious about leafing out in the early spring. If they leaf out too early, their young leaves could suffer from subsequent late frost. Since leafing-out requirements are thought to be species-specific, the group designed a lab experiment to test the responsiveness of 50 tree and shrub species in Concord to warming temperatures in the late winter and early spring.

For the past two winters, the researchers traveled to Concord, collected leafless dormant twigs from each species, and placed them in cups of water in their lab. Over the following weeks, they observed how quickly each species was able to produce its leaves in these unseasonably warm lab conditions.

"We found compelling evidence that invasive shrubs, such as Japanese barberry, are ready to leaf out quickly once they are exposed to warm temperatures in the lab even in the middle of winter, whereas native shrubs, like highbush bluberry, and native trees, like red maple, need to go through a longer winter chilling period before they can leaf out -- and even then their response is slow," says Amanda Gallinat, a second-year graduate student and third author of the paper.

The strength of this study, Gallinat said, is the pairing of observations and experiments.

"Our current observations show that plants in Concord today are leafing out earlier than in Thoreau's time in response to warm temperatures," she said. "However, the experiments show that as spring weather continues to warm, it will be the invasive shrubs that will be best able to take advantage of the changing conditions."

The spring growing season is of increasing interest to biologists studying the effects of a warming climate, and in coming decades non-native invasive shrubs are positioned to win the gamble on warming temperatures, Primack said. The BU group is adding these findings to a growing list of advancing spring phenomena in Concord and elsewhere in Massachusetts, including flowering dates, butterfly flight times, and migratory bird arrivals.

Founded in 1839, Boston University is an internationally recognized institution of higher education and research. With more than 33,000 students, it is the fourth-largest independent university in the United States. BU consists of 16 schools and colleges, along with a number of multi-disciplinary centers and institutes integral to the University's research and teaching mission. In 2012, BU joined the Association of American Universities (AAU), a consortium of 62 leading research universities in the United States and Canada.


View the original article here

Friday, February 21, 2014

New phone alerts for extreme weather may prevent casualties in India

When Cyclone Phailin hit India in late 2013 it became the largest storm to batter the subcontinent in over a decade. The storm, officially classified as a Category 5 tropical cyclone, affected more than 12 million people in India and neighboring countries, and required mass evacuations.

These evacuations revealed an urgent need for an effective alert system that could forewarn the majority of the population. A new paper published in Atmospheric Science Letters details how computer science undergraduates have created image-based mobile phone alerts, connected to the Weather Research and Forecasting system.

India has a mobile phone subscriber base exceeding 929 million, which is expected to reach 1.15 billion by the end of 2014. An alert system developed for mobile phones could reach an estimated 97% of the population.

The paper details how during the 2013 storm the computer scientists were able to track its genesis, progression and landfall. By converting this information into images suitable for phones, they created a forecasting and warning system accessible to ordinary citizens.

"Cyclone alerts can save lives and property, but must be easily accessible," said Dr. Sat Ghosh. "The global perception of India's emerging IT prowess is lopsided. It is thought of as merely a manufacturing hub; however, our article puts the country's numerical literacy to practical use. The easy-to-use Weather Research and Forecasting model remains confined to an elite group of users, such as atmospheric scientists and weather forecasters. Our research explores how the WRF forecast can be interfaced with mobile telephony which has a deep penetration even in rural pockets of India."

Story Source:

The above story is based on materials provided by Wiley. Note: Materials may be edited for content and length.


View the original article here

Thursday, February 20, 2014

Scientists eye longer-term forecasts of U.S. heat waves

Scientists have fingerprinted a distinctive atmospheric wave pattern high above the Northern Hemisphere that can foreshadow the emergence of summertime heat waves in the United States more than two weeks in advance.

The new research, led by scientists at the National Center for Atmospheric Research (NCAR), could potentially enable forecasts of the likelihood of U.S. heat waves 15-20 days out, giving society more time to prepare for these often-deadly events.

The research team discerned the pattern by analyzing a 12,000-year simulation of the atmosphere over the Northern Hemisphere. During those times when a distinctive "wavenumber-5" pattern emerged, a major summertime heat wave became more likely to subsequently build over the United States.

"It may be useful to monitor the atmosphere, looking for this pattern, if we find that it precedes heat waves in a predictable way," says NCAR scientist Haiyan Teng, the lead author. "This gives us a potential source to predict heat waves beyond the typical range of weather forecasts."

The wavenumber-5 pattern refers to a sequence of alternating high- and low-pressure systems (five of each) that form a ring circling the northern midlatitudes, several miles above the surface. This pattern can lend itself to slow-moving weather features, raising the odds for stagnant conditions often associated with prolonged heat spells.
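
As a toy illustration of what "wavenumber 5" means in practice (this is not the authors' analysis code), the dominant zonal wavenumber of an anomaly field sampled around a full latitude circle is simply the peak of its Fourier spectrum:

```python
import numpy as np

def dominant_wavenumber(anomaly):
    """Return the zonal wavenumber with the largest Fourier amplitude.

    anomaly: 1-D array of pressure (or geopotential height) anomalies
    sampled at evenly spaced longitudes around a latitude circle.
    """
    spectrum = np.abs(np.fft.rfft(anomaly))
    spectrum[0] = 0.0           # ignore the zonal-mean component
    return int(np.argmax(spectrum))

# Synthetic ring: five alternating highs and lows around the globe,
# plus a weaker wavenumber-2 component as background
lon = np.linspace(0, 2 * np.pi, 360, endpoint=False)
ring = np.cos(5 * lon) + 0.1 * np.cos(2 * lon)

print(dominant_wavenumber(ring))  # -> 5
```

In the study, of course, the diagnostic is applied to model and reanalysis fields rather than a clean sinusoid, and amplitude (not just the peak wavenumber) is what signals a heightened heat-wave risk.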

The study is being published next week in Nature Geoscience. It was funded by the U.S. Department of Energy, NASA, and the National Science Foundation (NSF), which is NCAR's sponsor. NASA scientists helped guide the project and are involved in broader research in this area.

Predicting a lethal event

Heat waves are among the most deadly weather phenomena on Earth. A 2006 heat wave across much of the United States and Canada was blamed for more than 600 deaths in California alone, and a prolonged heat wave in Europe in 2003 may have killed more than 50,000 people.

To see if heat waves can be triggered by certain large-scale atmospheric circulation patterns, the scientists looked at data from relatively modern records dating back to 1948. They focused on summertime events in the United States in which daily temperatures reached the top 2.5 percent of weather readings for that date across roughly 10 percent or more of the contiguous United States. However, since such extremes are rare by definition, the researchers could identify only 17 events that met such criteria -- not enough to tease out a reliable signal amid the noise of other atmospheric behavior.
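
A simplified version of that selection criterion can be sketched in code. This sketch computes each grid cell's 97.5th percentile over the whole record (the study uses the percentile for each calendar date) and uses synthetic data in place of the 1948-onward observations:

```python
import numpy as np

def heat_wave_days(temps, area_frac=0.10, pct=97.5):
    """Flag days meeting a simplified version of the study's criterion:
    temperatures above the 97.5th percentile in at least `area_frac`
    of the grid cells.

    temps: array of shape (days, cells) -- one temperature per day per cell.
    """
    thresh = np.percentile(temps, pct, axis=0)   # per-cell threshold, shape (cells,)
    extreme = temps > thresh                     # (days, cells) boolean mask
    frac_extreme = extreme.mean(axis=1)          # fraction of cells extreme each day
    return np.where(frac_extreme >= area_frac)[0]

rng = np.random.default_rng(0)
temps = rng.normal(30.0, 3.0, size=(365, 100))   # synthetic one-year, 100-cell record
temps[200] += 10.0                               # inject a widespread hot day
print(heat_wave_days(temps))
```

The injected day shows up because it is extreme over most of the domain at once; isolated single-cell extremes do not qualify, which is exactly why only 17 observed events met the real criterion.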

The group then turned to an idealized simulation of the atmosphere spanning 12,000 years. The simulation had been created a couple of years before with a version of the NCAR-based Community Earth System Model, which is funded by NSF and the Department of Energy.

By analyzing more than 5,900 U.S. heat waves simulated in the computer model, they determined that the heat waves tended to be preceded by a wavenumber-5 pattern. This pattern is not caused by particular oceanic conditions or heating of Earth's surface, but instead arises from naturally varying conditions of the atmosphere. It was associated with an atmospheric phenomenon known as a Rossby wave train that encircles the Northern Hemisphere along the jet stream.

During the 20 days leading up to a heat wave in the model results, the five ridges and five troughs that make up a wavenumber-5 pattern tended to propagate very slowly westward around the globe, moving against the flow of the jet stream itself. Eventually, a high-pressure ridge moved from the North Atlantic into the United States, shutting down rainfall and setting the stage for a heat wave to emerge.

When wavenumber-5 patterns in the model were more amplified, U.S. heat waves became more likely to form 15 days later. In some cases, the probability of a heat wave was more than quadruple what would be expected by chance.

In follow-up work, the research team turned again to actual U.S. heat waves since 1948. They found that some historical heat waves were indeed preceded by a large-scale circulation pattern indicative of a wavenumber-5 event.

Extending forecasts beyond 10 days

The research finding suggests that scientists are making progress on a key meteorological goal: forecasting the likelihood of extreme events more than 10 days in advance. At present, there is very limited skill in such long-term forecasts.

Previous research on extending weather forecasts has focused on conditions in the tropics. For example, scientists have found that El Niño and La Niña, the periodic warming and cooling of surface waters in the central and eastern tropical Pacific Ocean, are correlated with a higher probability of wet or dry conditions in different regions around the globe. In contrast, the wavenumber-5 pattern does not rely on conditions in the tropics. However, the study does not exclude the possibility that tropical rainfall could act to stimulate or strengthen the pattern.

Now that the new study has connected a planetary wave pattern to a particular type of extreme weather event, Teng and her colleagues will continue searching for other circulation patterns that may presage extreme weather events.

"There may be sources of predictability that we are not yet aware of," she says. "This brings us hope that the likelihood of extreme weather events that are damaging to society can be predicted further in advance."

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this release are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


View the original article here

Wednesday, February 19, 2014

With few hard frosts, tropical mangroves push north

Cold-sensitive mangrove forests have expanded dramatically along Florida's Atlantic Coast as the frequency of killing frosts has declined, according to a new study based on 28 years of satellite data from the University of Maryland and the Smithsonian Environmental Research Center in Edgewater, Md.

Between 1984 and 2011, the Florida Atlantic coast from the Miami area northward gained more than 3,000 acres (1,240 hectares) of mangroves. All the increase occurred north of Palm Beach County. Between Cape Canaveral National Seashore and Saint Augustine, mangroves doubled in area. Meanwhile between the study's first five years and its last five years, nearby Daytona Beach recorded 1.4 fewer days per year when temperatures fell below 28.4 degrees Fahrenheit (-2 degrees Celsius). The number of killing frosts in southern Florida was unchanged.
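
The study's key predictor is a simple count of hard freezes, not an average. A minimal sketch of that count, using hypothetical station data (the values and threshold handling here are illustrative only):

```python
import numpy as np

def killing_frost_days(daily_min_f, threshold_f=28.4):
    """Count days whose minimum temperature falls below the
    hard-freeze threshold (28.4 F in the study)."""
    daily_min_f = np.asarray(daily_min_f)
    return int((daily_min_f < threshold_f).sum())

# Hypothetical winters: an earlier stretch with four hard freezes,
# a later one with just one
early = [30.1, 27.9, 26.5, 31.0, 28.3, 25.0]
late = [30.1, 29.9, 28.5, 31.0, 28.3, 32.0]
print(killing_frost_days(early), killing_frost_days(late))
```

Two winters can share the same mean temperature yet differ sharply in this count, which is why the frequency of extremes, rather than average warming, best matched the mangrove expansion.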

The mangroves' march up the coast as far north as St. Augustine, Fla., is a striking example of one way climate change's impacts show up in nature. Rising temperatures lead to new patterns of extreme weather, which in turn cause major changes in plant communities, say the study's authors.

Unlike many studies which focus on changes in average temperatures, this study, published online Dec. 30 in the peer-reviewed journal Proceedings of the National Academy of Sciences, shows that changes in the frequency of rare, severe events can determine whether landscapes hold their ground or are transformed by climate change.

The mangrove forests are edging out salt marshes, said University of Maryland Entomology Professor Daniel S. Gruner, a study co-author. "This is what we would expect to see happening with climate change, one ecosystem replacing another," said Gruner, who co-leads an interdisciplinary research project on mangrove ecosystems, along with Ilka C. Feller of the Smithsonian. "But at this point we don't have enough information to predict what the long term consequences will be."

One valuable ecosystem replaces another -- at what cost?

"Some people may say this is a good thing, because of the tremendous threats that mangroves face," said the study's lead author, Kyle Cavanaugh, a Smithsonian postdoctoral research fellow. "But this is not taking place in a vacuum. The mangroves are replacing salt marshes, which have important ecosystem functions and food webs of their own."

Mangrove forests grow in calm, shallow coastal waters throughout the tropics. Salt marshes fill that niche in temperate zones. Both provide crucial habitat for wildlife, including endangered species and commercially valuable fish and shellfish. Some animals use both types of habitat. Others, like marsh-nesting seaside sparrows or the honey bees that produce mangrove honey, rely on one or the other.

Both provide valuable ecosystem services, buffering floods, storing atmospheric carbon and building soils. Both are in decline nationally and globally. Mangrove forests are cut down for charcoal production, aquaculture and urbanization or lose habitat to drainage projects. Salt marshes are threatened by drainage, polluted runoff and rising sea levels.

Florida naturalists noticed that mangroves now grow in places that once were too chilly for the tropical trees. "We knew this was happening, but no one knew if it was a local or a regional phenomenon," Cavanaugh said.

Study used satellite photos, the "gold standard" in climate change

Cavanaugh, an expert in remote sensing, turned to photographs of Florida's Atlantic coast taken by NASA's Landsat 5, which launched in 1984 and tracked changes in Earth's land cover until 2011. "It very quickly became a gold standard to examine the effects of climate change, because it lets you look back in time," Cavanaugh said.

The satellite images revealed the mangroves' expansion into terrain formerly inhabited by salt marsh plants. While the study only looked at the Atlantic Coast, the same trend is taking place on Florida's Gulf Coast, Cavanaugh and Gruner said.

Mean winter temperatures have risen at seven of eight coastal weather stations in the study area. But if overall warming benefited mangroves, the mangrove cover should have increased all over Florida, not only in the north. Average winter temperature, rainfall, and urban or agricultural land use did not explain the mangroves' expansion. Only fewer freezing days at the northern end of their range matched the trend.

The researchers are studying effects on coastal insects and birds; whether the change will affect coastal ecosystems' ability to store carbon; and whether juvenile fish and commercially valuable shellfish will remain abundant in the changing plant communities.

Cavanaugh is looking at Landsat 5 imagery for Mexico, Peru, Brazil, Australia and New Zealand to see if mangroves are expanding elsewhere as they are in Florida.


View the original article here

Tuesday, February 18, 2014

Solar activity not a key cause of climate change, study shows

Climate change has not been strongly influenced by variations in heat from the sun, a new scientific study shows.

The findings overturn a widely held scientific view that lengthy periods of warm and cold weather in the past might have been caused by periodic fluctuations in solar activity.

Research examining the causes of climate change in the northern hemisphere over the past 1000 years has shown that until the year 1800, the key driver of periodic changes in climate was volcanic eruptions. These tend to prevent sunlight reaching Earth, causing cool, drier weather. Since 1900, greenhouse gases have been the primary cause of climate change.

The findings show that periods of low solar activity should not be expected to have a large impact on temperatures on Earth. They are also expected to improve scientists' understanding of the climate and help with climate forecasting.

Scientists at the University of Edinburgh carried out the study using records of past temperatures constructed with data from tree rings and other historical sources. They compared this data record with computer-based models of past climate, featuring both significant and minor changes in the sun.

They found that their model of weak changes in the sun gave the best correlation with temperature records, indicating that solar activity has had a minimal impact on temperature in the past millennium.

The study, published in Nature Geoscience, was supported by the Natural Environment Research Council.

Dr Andrew Schurer, of the University of Edinburgh's School of GeoSciences, said: "Until now, the influence of the sun on past climate has been poorly understood. We hope that our new discoveries will help improve our understanding of how temperatures have changed over the past few centuries, and improve predictions for how they might develop in future. Links between the sun and anomalously cold winters in the UK are still being explored."


View the original article here

Monday, February 17, 2014

Preparing for hell and high water: Researchers advocate for climate adaptation science

Changes are already happening to Earth's climate due to the burning of fossil fuels, deforestation and large-scale agriculture. As changes get more pronounced, people everywhere will have to adjust. In this week's issue of the journal Science, an international group of researchers urge the development of science needed to manage climate risks and capitalize on unexpected opportunities.

"Adapting to an evolving climate is going to be required in every sector of society, in every region of the globe. We need to get going, to provide integrated science if we are going to meet the challenge," said senior scientist Richard Moss of the Department of Energy's Pacific Northwest National Laboratory. "In this article, we describe the foundations for this research and suggest measures to establish it."

Climate preparedness research needs to integrate social and climate science, engineering, and other disciplines. It prepares for impacts by determining who and what are most vulnerable to changes and considering ways to adapt.

"Science for adaptation starts with understanding decision-making processes and information needs, determining where the vulnerabilities are, and then moves to climate modeling. A final step tracks whether adaptation is effective," said Moss, who is based at the Joint Global Change Research Institute, a collaboration between PNNL in Richland, Wash. and the University of Maryland.

The article grew out of a workshop held in August 2012 at the Aspen Global Change Institute in Aspen, Colo., on how to improve support for decision-making in the face of a changing climate. The authors arrived at this approach to guide preparedness research based on the need to reduce the risks that climate change presents.

"The need to adapt and adjust is going to be global," said Moss. "We need a flexible, integrated approach that merges theoretical and problem-oriented sciences around four general challenges."

The four challenges are:

Understanding what information is needed to make decisions about adapting to climate change
Identifying vulnerabilities in society, the economy and the environment
Improving forecasts and climate models in ways that can address specific problems
Providing technology, management, and policy options for adapting

As an example of how practical and basic research can work together, Moss described work in the U.S. involving water utilities, university scientists, and private firms to pilot use of climate models and water utility modeling to design resilient water systems.

"This research is motivated by a practical challenge, ensuring reliable water supplies. Among the scientific advances that will be required is better integration of weather and climate models to improve decadal climate information to help people plan," Moss said.

Bringing together diverse disciplines at the Aspen workshop allowed the international team to explore all facets of adaptation, including less examined ones such as how scientific information is (and isn't) used in making decisions.

"Traditionally we think that what society needs is better predictions. But at this workshop, all of us -- climate and social scientists alike -- recognized the need to consider how decisions get implemented and that climate is only one of many factors that will determine how people will adapt," he said.

The focus on problem-solving could open up new sources of funding as well, sources such as non-governmental organizations, industry -- any group with specific problems that adaptation science could solve.

"We will make a virtue of necessity," said Moss.


View the original article here

Sunday, February 16, 2014

The lingering clouds: Why pollution results in larger storm clouds, colder days, warmer nights

A new study reveals how pollution causes thunderstorms to leave behind larger, deeper, longer lasting clouds. Appearing in the Proceedings of the National Academy of Sciences November 26, the results solve a long-standing debate and reveal how pollution plays into climate warming. The work can also provide a gauge for the accuracy of weather and climate models.

Researchers had thought that pollution causes larger and longer-lasting storm clouds by making thunderheads draftier through a process known as convection. But atmospheric scientist Jiwen Fan and her colleagues show that pollution instead makes clouds linger by decreasing the size and increasing the lifespan of cloud and ice particles. The difference affects how scientists represent clouds in climate models.

"This study reconciles what we see in real life to what computer models show us," said Fan of the Department of Energy's Pacific Northwest National Laboratory. "Observations consistently show taller and bigger anvil-shaped clouds in storm systems with pollution, but the models don't always show stronger convection. Now we know why."

Also, pollution can decrease the daily temperature range via such clouds: High clouds left after a thunderstorm spread out across the sky and look like anvils. These clouds cool the Earth during the day with their shadows but trap heat like a blanket at night. Pollution can cause clouds from late afternoon thunderstorms to last long into the night rather than dissipate, causing warmer nights.

Secret Life of Clouds

Models that predict weather and climate don't reconstruct the lives of clouds well, especially storm clouds. Usually these models replace storm clouds with simple equations that fail to capture the whole picture.

Because of the poor reconstructions, researchers have been faced with a dilemma: Pollution causes the anvil-shaped clouds to linger longer than they would in clean skies -- but why?

Possible reasons revolve around tiny natural and human-made particles called aerosols that serve as seeds for cloud droplets to form around. A polluted sky has many more aerosols than a clean sky -- think haze and smog -- and that means less water for each seed. Pollution makes more cloud droplets, but each droplet is smaller.
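
The "less water per seed" effect can be made concrete with a back-of-the-envelope calculation. Assuming a fixed liquid water content shared equally among N droplets (a deliberate idealization; real clouds are far messier), droplet radius scales as N to the minus one-third:

```python
import math

def droplet_radius_um(lwc_g_m3, n_per_cm3):
    """Mean droplet radius (micrometres) if a fixed liquid water content
    is shared equally among N cloud droplets -- an idealized picture of
    the aerosol 'seed' effect described above."""
    rho_water = 1.0e6                                   # g/m^3
    n_per_m3 = n_per_cm3 * 1.0e6
    vol_per_drop = lwc_g_m3 / (rho_water * n_per_m3)    # m^3 of water per droplet
    radius_m = (3.0 * vol_per_drop / (4.0 * math.pi)) ** (1.0 / 3.0)
    return radius_m * 1.0e6

clean = droplet_radius_um(0.5, 100)      # clean air: ~100 droplets per cm^3
polluted = droplet_radius_um(0.5, 1000)  # polluted air: ~1000 droplets per cm^3
print(round(clean, 1), round(polluted, 1))  # roughly 10.6 and 4.9 micrometres
```

Tenfold more aerosol "seeds" shrinks the mean droplet radius by a factor of about 2.15 (the cube root of 10) for the same amount of water, which is the starting point for the smaller, lighter particles the study tracks.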

More and smaller droplets change things for the clouds. Researchers have long thought that smaller droplets start a chain reaction that leads to bigger, longer-lasting clouds: Instead of raining down, the lighter droplets carry their water higher, where they freeze. The freezing squeezes out the heat the droplets carry with them and causes the thunder cloud to become draftier. The stronger convection lifts more water droplets, building up the cloud.

But researchers don't always see stronger convection every time they see larger and longer-lasting clouds in polluted environments, indicating a piece of the puzzle was missing.

To solve this dilemma, Fan and colleagues decided to compare real-life summer storm clouds to a computer model that zooms deep into simulated clouds. The model included physical properties of the cloud particles as well as the ability to see convection, if it gets stronger or weaker. Most models run in days or weeks, but the simulations in this study took up to six months.

"Modeling the details of cloud microphysical properties is very computationally intensive, so models don't usually include them," said Fan.

Convection Vexation

The researchers started with cloud data from three locations that differ in how polluted, humid and windy they typically are: the tropics in the western Pacific, southeastern China and the Great Plains in Oklahoma. The data had been collected through DOE's ARM Climate Research Facility.

With support from DOE's Regional and Global Climate Model program, the researchers ran simulations on PNNL's hometown supercomputer Olympus. Their simulations of a month of storms ended up looking very similar to the actual observed clouds, validating that the models re-created the storm clouds well.

The team found that in all cases, pollution increased the size, thickness and duration of the anvil-shaped clouds. However, only two locations -- the tropics and China -- showed stronger convection. The opposite happened in Oklahoma -- pollution made for weaker convection.

This inconsistency suggested that stronger convection isn't the reason. Taking a closer look at the properties of water droplets and ice crystals within clouds, the team found that pollution resulted in smaller droplets and ice crystals, regardless of location.

In addition, the team found that in clean skies, the heavier ice particles fall faster out of the anvil-shaped clouds, causing the clouds to dissipate. However, the ice crystals in polluted skies were smaller and too light to fall out of the clouds, leading to the larger, longer-lasting clouds.

Lastly, the team estimated how much warming or cooling the storm clouds contributed. Overall, the polluted clouds cooled the day and warmed the night, decreasing the daily temperature range.

Most models don't simulate convection well, don't take into account the microphysical processes of storm clouds, and don't address how pollution interacts with those processes. Accounting for pollution effects on storm clouds in this way could affect the ultimate amount of warming predicted for Earth in the next few decades. Accurately representing clouds in climate models is key to improving the accuracy of predicted changes to the climate.

Journal Reference:

J. Fan, L. R. Leung, D. Rosenfeld, Q. Chen, Z. Li, J. Zhang, H. Yan. Microphysical effects determine macrophysical response for aerosol impacts on deep convective clouds. Proceedings of the National Academy of Sciences, 2013; DOI: 10.1073/pnas.1316830110

View the original article here

Saturday, February 15, 2014

Primary GOES-R instrument ready to be installed onto spacecraft

A key instrument that will fly on the Geostationary Operational Environmental Satellite -- R (GOES-R) spacecraft, NOAA's next-generation of geostationary satellites, is cleared for installation on the spacecraft.

The Advanced Baseline Imager, or ABI, is GOES-R's primary instrument for scanning Earth's weather, oceans, and environment and is a significant improvement over instruments on NOAA's current geostationary satellites. The ABI will offer faster imaging with much higher detail. It will also introduce new forecast products for severe weather, volcanic ash advisories, fire and smoke monitoring and other hazards.

"The United States is home to some of the most severe weather in the world including tornadoes, hurricanes, snowstorms, floods, and wildfires," said Mary Kicza, assistant administrator for NOAA's Satellite and Information Service. "The ABI offers breakthrough technology that will help NOAA develop faster and more accurate forecasts that will save lives and protect communities."

The first satellite in the GOES-R Series is currently scheduled for launch in early 2016. GOES-R's instruments will also feature improved lightning detection and solar weather monitoring tools, and will provide near real time data to forecasters during severe weather events.

The ABI has two scan modes. It will have the ability to continuously take an image of the entire planet, or a full disk image, every five minutes compared to every 30 minutes with the current GOES imager. It also has an alternative, or flex mode, which will concurrently take a full disk image every 15 minutes, an image of the continental U.S. every five minutes, and smaller, more detailed images of areas where storm activity is present, as often as every 30 seconds. This kind of flexibility and increased frequency of images is a boon for forecasters.

"Completing ABI is a major milestone for the program, the culmination of nine years of work to develop an instrument with extraordinary capabilities for weather observation," said Pam Sullivan, project manager of the GOES-R Flight Project at NASA Goddard. "With its increased resolution and faster scan times, ABI is comparable to a hi-definition upgrade for our geostationary weather satellites."

In early 2014 the ABI will be shipped from its developer, ITT Exelis, in Ft. Wayne, Ind. to the spacecraft developer, Lockheed Martin Space Systems Co. in Littleton, Colo., to be installed onto the first GOES-R spacecraft. Lockheed is building the spacecraft for the GOES-R series.

The remaining GOES-R instruments to be delivered are:

Geostationary Lightning Mapper, which will provide continuous surveillance for the first time of total lightning activity from geostationary orbit over the western hemisphere;

Space Environment In-Situ Suite, which consists of sensors that will monitor radiation hazards that can affect satellites, radio communications and navigation systems;

Solar Ultraviolet Imager, a high-powered telescope that observes the sun, monitoring for solar flares and other solar activity that could impact Earth by disrupting power utilities, communication and navigation systems and causing damage to orbiting satellites and the International Space Station; and

Magnetometer, which will provide measurements of the magnetic field surrounding Earth that protects the planet from charged particles released from the sun. These particles can be dangerous to spacecraft and human spaceflight. The geomagnetic field measurements will provide alerts and warnings to satellite operators and power utilities.

A sixth instrument, the Extreme Ultraviolet and X-ray Irradiance Sensors (EXIS), was completed in May 2013 and was the first of GOES-R's instruments to be ready for integration. EXIS will provide important early warnings of impending solar storms and give scientists a more accurate measure of the power of solar energy radiating toward Earth, which can severely disrupt telecommunications, air travel and the performance of power grids.

NOAA manages the GOES-R Series program through an integrated NOAA-NASA office, staffed with personnel from both agencies and located at NASA's Goddard Space Flight Center in Greenbelt, Md.

For more information about GOES-R and the current GOES satellite fleet, visit:


View the original article here

Friday, February 14, 2014

Motion of the ocean: Predicting the big swells

New research will help with your morning surf report. Research led by Vice-Chancellor Professor Ian Young will allow oceanographers and meteorologists to better predict the rate at which ocean swells decay, or lose energy, as they travel across the globe.

"Ocean cargo shipping, offshore oil and gas production, and even recreational activities such as surfing, are all dependent on wave action," says Professor Young.

"It is therefore critical that we are able to predict swell."

It is estimated that 75 per cent of waves across the world are not actually generated by local winds. Instead, they are generated by distant storms, and the waves travel outward from those storms as swell.

"Imagine you drop a rock in a pond. Waves radiate out from the rock. You don't need anything to push the waves. Once generated, they propagate by themselves.

"So, for most of the Indian, Pacific and South Atlantic oceans, it is actually the weather in the Southern Ocean thousands of kilometres away that dominates the wave conditions," explains Professor Young.

"The Southern Ocean is dominated by big low pressure systems that move across it year round. These systems generate waves that then grow and can travel tens of thousands of kilometres from where they were actually formed, to crash on a beach in Australia."

Professor Young, who is affiliated with the Research School of Earth Sciences, used orbiting satellites to track swell generated in the Southern Ocean and measure the rate of decay as it travelled north towards Australia.

The results showed that the decay of the swell depends on how steep the wave actually is.

"Steep waves decay very quickly. However, typical swell is not very steep and can travel across oceanic basins with only a relatively small loss of energy."
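
That steepness dependence can be illustrated with a toy decay model. The exponential form and the coefficient below are assumptions chosen for demonstration, not the fitted formula from the paper:

```python
import math

def swell_height(h0_m, wavelength_m, distance_km, c=1.0e-5):
    """Illustrative swell decay: exponential attenuation whose rate
    grows with wave steepness (height / wavelength).  `c` and the
    functional form are hypothetical, for demonstration only."""
    steepness = h0_m / wavelength_m
    rate_per_m = c * steepness                 # steeper swell decays faster
    return h0_m * math.exp(-rate_per_m * distance_km * 1000.0)

# A gentle 2 m swell with 300 m wavelength vs. a steep 4 m swell at 150 m,
# both tracked over the study's ~1400 km propagation distance
gentle = swell_height(2.0, 300.0, 1400.0)
steep = swell_height(4.0, 150.0, 1400.0)
print(round(gentle, 2), round(steep, 2))
```

With these illustrative numbers the gentle swell keeps over 90 per cent of its height across the full track while the steep swell loses roughly a third, matching the qualitative finding above.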

Over 200 individual cases were tracked, making this study the first to provide such comprehensive data on this decay.

"What we were able to do is track the swell from the satellite as it moved from the south to the north, some 1400 kilometres. We only chose cases where there was no wind so that we could be confident that all we were measuring was the swell decay.

"We can take these results and put them into a mathematical formula that can be put straight into computer models used by national weather bureaus.

"This will increase our ability to better predict wave action. As 70 per cent of the world's oceans are dominated by swell, it's extremely important to be able to predict them accurately."

Journal Reference:

I. R. Young, A. V. Babanin, S. Zieger. The Decay Rate of Ocean Swell Observed by Altimeter. Journal of Physical Oceanography, 2013; 43 (11): 2322 DOI: 10.1175/JPO-D-13-083.1

View the original article here

Thursday, February 13, 2014

System developed for assessing how effective species are at pollinating crops

From tomatoes to pumpkins, most fruit and vegetable crops rely on pollination by bees and other insect species -- and the future of many of those species is uncertain. Now researchers from North Carolina State University are proposing a set of guidelines for assessing the performance of pollinator species in order to determine which species are most important and should be prioritized for protection.

"Widespread concerns over the fate of honey bees and other pollinators have led to increased efforts to understand which species are the most effective pollinators, since this has huge ramifications for the agriculture industry," says Dr. Hannah Burrack, an associate professor of entomology at NC State and co-author of a paper on the new guidelines and related research. "However, various research efforts have taken a wide variety of approaches, making it difficult to compare results in a meaningful way.

"We've developed a set of metrics that we think offers a comprehensive overview of pollination efficiency, which would allow researchers to compare data from different crops and regions."

The new comprehensive approach looks at four specific metrics. First is single-visit efficiency, which measures the number of seeds produced when one bee visits one flower. Second is abundance, which measures the number of each type of bee observed in a study area. Third is inclement weather behavior, which tracks how active a bee species is during cool, cloudy and/or windy weather. Fourth is visitation rate, or the number of flowers that a bee visits while foraging, and the amount of time it spends at each flower.
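The four metrics can be pictured as one record per bee species. The sketch below is an illustrative addition: the field names and the multiplicative ranking are not from the paper, which reports the metrics separately rather than collapsing them into a single score.

```python
from dataclasses import dataclass

@dataclass
class PollinatorMetrics:
    """One record per bee species for the four proposed metrics.

    The field names and the composite ranking below are illustrative
    additions, not the study's own scoring scheme.
    """
    species: str
    single_visit_efficiency: float     # seeds set per single flower visit
    abundance: int                     # individuals observed in the study area
    inclement_weather_activity: float  # fraction of poor-weather periods active, 0-1
    visitation_rate: float             # flowers visited per minute of foraging

def composite_score(m: PollinatorMetrics) -> float:
    # Multiplying the metrics means a species weak on any one of them
    # scores low, echoing the point that "the perfect bee doesn't exist."
    return (m.single_visit_efficiency * m.abundance
            * m.inclement_weather_activity * m.visitation_rate)

# Invented numbers for two hypothetical species:
bees = [
    PollinatorMetrics("small native bee", 25.0, 12, 0.8, 4.0),
    PollinatorMetrics("honey bee", 10.0, 150, 0.2, 6.0),
]
ranked = sorted(bees, key=composite_score, reverse=True)
```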

"The perfect bee would produce a lot of seeds and visit a lot of flowers, even in poor weather -- and there would be a lot of them," Burrack says. "But as far as we know, the perfect bee doesn't exist."

The researchers conducted a pilot study using their comprehensive approach to assess the pollination performance of various bee species on economically important highbush blueberry crops in North Carolina. They found that small native bees had extremely high single-visit efficiency rates and were active during inclement weather. However, small native bees were not highly abundant, nor did they appear to have high visitation rates.

"This highlights the importance of incorporating multiple metrics," says Dr. David Tarpy, an associate professor of entomology at NC State and co-author of the paper. "Because researchers looking only at visitation rates or abundance may think the small native species are unimportant, when they actually appear to be important pollinators for blueberry growers."

The paper, "Multiple Criteria for Evaluating Pollinator Performance in Highbush Blueberry (Ericales: Ericaceae) Agroecosystems," was published online Nov. 25 in the journal Environmental Entomology.


View the original article here

Wednesday, February 12, 2014

Scientists nearing forecasts of long-lived wildfires

Scientists have developed a new computer modeling technique that offers the promise, for the first time, of producing continually updated daylong predictions of wildfire growth throughout the lifetime of long-lived blazes.

The technique, devised by scientists at the National Center for Atmospheric Research (NCAR) and the University of Maryland, combines cutting-edge simulations portraying the interaction of weather and fire behavior with newly available satellite observations of active wildfires. Updated with new observations every 12 hours, the computer model predicts critical details such as the extent of the blaze and changes in its behavior.

The breakthrough is described in a study appearing today in an online issue of Geophysical Research Letters, after first being posted online last month.

"With this technique, we believe it's possible to continually issue good forecasts throughout a fire's lifetime, even if it burns for weeks or months," said NCAR scientist Janice Coen, the lead author and model developer. "This model, which combines interactive weather prediction and wildfire behavior, could greatly improve forecasting -- particularly for large, intense wildfire events where the current prediction tools are weakest."

Firefighters currently use tools that can estimate the speed of the leading edge of a fire but are too simple to capture crucial effects caused by the interaction of fire and weather.

The researchers successfully tested the new technique by using it retrospectively on the 2012 Little Bear Fire in New Mexico, which burned for almost three weeks and destroyed more buildings than any other wildfire in the state's history.

The research was funded by NASA, the Federal Emergency Management Agency, and the National Science Foundation, which is NCAR's sponsor.

Sharpening the picture

In order to generate an accurate forecast of a wildfire, scientists need a computer model that can both incorporate current data about the fire and simulate what it will do in the near future.

Over the last decade, Coen has developed a tool, known as the Coupled Atmosphere-Wildland Fire Environment (CAWFE) computer model, that connects how weather drives fires and, in turn, how fires create their own weather. Using CAWFE, she successfully simulated the details of how large fires grew.

But without the most updated data about a fire's current state, CAWFE could not reliably produce a longer-term prediction of an ongoing fire. This is because the accuracy of all fine-scale weather simulations declines significantly after a day or two, thus affecting the simulation of the blaze. An accurate forecast would also have to include updates on the effects of firefighting and of such processes as spotting, in which embers from a fire are lofted in the fire plume and dropped ahead of a fire, igniting new flames.

Until now, the kind of real-time data that would be needed to regularly update the model has not been available. Satellite instruments offered only coarse observations of fires, providing images in which each pixel represented an area a little more than a half mile across (1 kilometer by 1 kilometer). These images might show several places burning, but they could not distinguish the boundaries between burning and non-burning areas, except for the largest wildfires.

To solve the problem, Coen's co-author, Wilfrid Schroeder of the University of Maryland, has produced higher-resolution fire detection data from a new satellite instrument, the Visible Infrared Imaging Radiometer Suite (VIIRS), which is jointly operated by NASA and the National Oceanic and Atmospheric Administration (NOAA). Launched in 2011, this new tool provides coverage of the entire globe at intervals of 12 hours or less, with pixels about 1,200 feet across (375 meters). The higher resolution enabled the two researchers to outline the active fire perimeter in much greater detail.

Coen and Schroeder then fed the VIIRS fire observations into the CAWFE model. By restarting the model every 12 hours with the latest observations of the fire extent -- a process known as cycling -- they could accurately predict the course of the Little Bear fire in 12- to 24-hour increments during five days of the historic blaze. By continuing this way, it would be possible to simulate the entire lifetime of even a very long-lived fire, from ignition to extinction.
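The cycling workflow described above can be sketched as a simple loop. This is a minimal illustration of the idea only; `fetch_viirs_perimeter` and `run_cawfe` are hypothetical stand-ins for the VIIRS fire-detection product and the CAWFE model.

```python
from datetime import datetime, timedelta

def fetch_viirs_perimeter(t):
    # Placeholder: in reality this would read the 375 m VIIRS
    # active-fire product valid at time t.
    return {"time": t, "perimeter": "observed fire extent"}

def run_cawfe(init_perimeter, start, hours):
    # Placeholder for the coupled atmosphere-fire simulation.
    return {"init": init_perimeter, "start": start, "valid_hours": hours}

def forecast_fire(start, end, cycle_hours=12):
    """Restart the model from fresh satellite observations every cycle."""
    forecasts = []
    t = start
    while t < end:
        perimeter = fetch_viirs_perimeter(t)          # latest observed extent
        result = run_cawfe(init_perimeter=perimeter,  # re-initialise the model
                           start=t,
                           hours=cycle_hours + 12)    # 12-24 h prediction window
        forecasts.append((t, result))
        t += timedelta(hours=cycle_hours)             # next 12-hour cycle
    return forecasts

# Five days of a long-lived fire yields ten 12-hourly cycles.
runs = forecast_fire(datetime(2012, 6, 4), datetime(2012, 6, 9))
```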

"The transformative event has been the arrival of this new satellite data," said Schroeder, a professor of geographical sciences who is also a visiting scientist with NOAA. "The enhanced capability of the VIIRS data favors detection of newly ignited fires before they erupt into major conflagrations. The satellite data has tremendous potential to supplement fire management and decision support systems, sharpening the local, regional, and continental monitoring of wildfires."

Keeping firefighters safe

The researchers said that forecasts using the new technique could be particularly useful in anticipating sudden blowups and shifts in the direction of the flames, such as what happened when 19 firefighters perished in Arizona last summer.

In addition, they could enable decision makers to look at several newly ignited fires and determine which pose the greatest threat.

"Lives and homes are at stake, depending on some of these decisions, and the interaction of fuels, terrain, and changing weather is so complicated that even seasoned managers can't always anticipate rapidly changing conditions," Coen said. "Many people have resigned themselves to believing that wildfires are unpredictable. We're showing that's not true."


View the original article here

Tuesday, February 11, 2014

MS study correlates negative effect of warmer weather on cognitive status

Kessler Foundation scientists correlated functional magnetic resonance imaging (fMRI) findings with the negative impact of outdoor temperature on cognitive functioning in multiple sclerosis (MS).

This study, "Warmer outdoor temperature is associated with task-related increased BOLD activation in patients with multiple sclerosis," published in Brain Imaging & Behavior, corroborates the group's previous finding that people with MS perform worse on processing speed and memory tasks during warmer outdoor temperatures than during cooler ones. "Increased MS disease activity during warmer months is a recent discovery. Now, this work is the first report of brain activation associated with outdoor temperature in MS. This finding is novel and important for persons with MS who are shown to have worse cognition during warmer weather," said Victoria M. Leavitt, Ph.D., research scientist at Kessler Foundation and principal investigator for the study, funded by a grant from the National MS Society.

Kessler Foundation researchers previously demonstrated that patients with multiple sclerosis (MS) show worse cognition on warmer days (Leavitt VM, Sumowski JF, Chiaravalloti N, DeLuca J. Warmer outdoor temperature is associated with worse cognitive status in multiple sclerosis. Neurology. 2012 Mar 27;78(13):964-8). The purpose of the current study was to identify the neurophysiological basis for this worse cognition. "Here, we examined the neurophysiology underlying this temperature-cognition relationship," said Dr. Leavitt. "The association between task-related BOLD fMRI activation and outdoor temperature was investigated in 28 MS patients who demonstrated worse cognitive function on warmer days. In MS patients, warmer outdoor temperature was associated with greater BOLD activation during performance of a simple sustained attention task. The brain areas that showed greater activation on warmer days were regions typically activated by MS patients during task performance: the frontal, dorsolateral prefrontal and parietal cortices. The relationship between outdoor temperature and cerebral activation was absent in healthy controls. Increased brain activation required by MS patients on warmer days to perform a simple task may signify neural inefficiency."

According to Dr. Sumowski, "The significant effect of warmer weather on cognition should be considered when designing and conducting clinical trials. This information might assist clinicians in choosing clinical treatment, and help researchers develop effective strategies for coping with the negative effects of weather on cognition that impact independence, education, employment and activities of daily living."

Story Source:

The above story is based on materials provided by Kessler Foundation. Note: Materials may be edited for content and length.


View the original article here

Monday, February 10, 2014

Rainfall to blame for decline in Arctic peregrines

Rain, crucial to sustaining life on Earth, is proving deadly for young peregrine falcons in Canada's Arctic.

A University of Alberta study recently published in Oecologia shows that an increase in the frequency of heavy rain brought on by warmer summer temperatures is posing a threat not seen in this species since before pesticides such as DDT were banned from use in Canada in 1970.

The study is among the first to directly link rainfall to survival of wild birds in Canada.

A nest-box experiment at the heart of the study, co-written by U of A researcher Alastair Franke and Alexandre Anctil of the Université du Québec, has provided "unequivocal evidence" that gradual changes in Arctic temperature and precipitation are responsible for a long-term decline in reproduction for the peregrine, a top predator in the Arctic.

The change in rainfall patterns in recent years has had a big influence on the overall decline in reproductive success over the last three decades, Franke said.

The researchers conducted a nest-box experiment from 2008 to 2010 in a dense population of peregrines breeding near Rankin Inlet, Nunavut, on the shores of Hudson Bay, and paired the results with historical weather data and measures of breeding success dating back to 1980. Falcon nests were monitored using motion-sensitive cameras, and images confirmed that more than one-third of the chick deaths recorded were indeed caused by rain, whether the chicks were raised in nest boxes or on natural ledges.

"The nestlings died from hypothermia and in some cases from drowning in their flooded nests. Without constant parental care, they are most vulnerable to cold and wet conditions in the first three weeks of life."

Over the past 30 years, scientists have been surprised to discover an ongoing decline, even when pesticide residues were known to be too low to cause reproductive failure.

"We knew DDT was no longer an issue and based on field observations, we wondered whether changes in climate were responsible for high mortality in recent years," Franke said.

Besides deaths attributed to rainfall, the study also revealed additional fallout for chicks: starvation.

"We were surprised to find that a considerable number of nestlings raised in nest boxes later died of starvation despite having been spared from the direct effects of rain."

Believing that storms may also be the culprit in reducing the abundance of prey for peregrines, Franke has launched a food supplementation study to explore the possible link.

Grim as the study's findings are, "they have improved our understanding of the direct effects of long-term changes in weather patterns and have identified the potential importance of indirect effects," Franke said.

The work also shows that wildlife can be sensitive to many different environmental pressures and that ongoing vigilance and monitoring is critical, he noted.

The study was funded by ArcticNet, the U of A's Canadian Circumpolar Institute, the Nunavut Wildlife Management Board and Department of Environment, the Natural Sciences and Engineering Research Council of Canada, the Fonds de recherche Nature et technologies Québec and a W. Garfield Weston Award.


View the original article here

Sunday, February 9, 2014

Unique model simulates electron environment in space at 36000 km above the Earth

A spacecraft in near-Earth orbit is continuously bombarded by charged particles. The Finnish Meteorological Institute has developed a unique model that simulates the electron environment in near-Earth space.

The Finnish Meteorological Institute's new model specifies the electron environment at any orbit where important satellites operate. FMI's new IMPTAM (Inner Magnetosphere Particle Transport and Acceleration Model) is a unique tool and the only one of its kind in Europe. "By specifying the electron flux at any satellite orbit, we will be able to provide satellite operators with critical information on the surface charging of satellite materials," says the main developer of the IMPTAM model, FMI researcher Dr. Natalia Ganushkina.

At present, there are about 1,000 operational satellites in different orbits in near-Earth space, and all of them pass through regions where the radiation environment can vary significantly with location. All this variability is driven by the activity of the Sun.

Electrons with energies of tens to hundreds of kiloelectronvolts constitute one of the most important parts of the radiation environment in near-Earth space. First, they are responsible for discharges on the surface of the outer spacecraft layers that can cause significant damage and spacecraft anomalies. Second, they are accelerated to much higher energies of megaelectronvolts and populate the Earth's radiation belts, which, from the radiation hazard viewpoint, are the two most critical regions around the Earth.

The model simulates the main drivers of the transport and acceleration of electrons with energies of 50 to 150 kiloelectronvolts (keV) to geostationary orbit (36,000 km above the Earth). Electrons reach geostationary orbit from regions located at about 10 Earth radii (1 Earth radius is equal to 6,400 km) from the Earth in the direction away from the Sun. They move towards the Earth not through empty space but through magnetic and electric fields whose presence has been established from observations. These magnetic and electric fields guide the electrons, and if the fields change, the electrons move differently. Since there are no satellites at every point in space to tell us the magnitudes of the fields, models of these fields must be used.

Very good agreement is achieved between the electron fluxes measured by several satellites at geostationary orbit and the model's electron fluxes at the same locations. The main factor was found to be the inclusion of small-scale electric fields related to the reconfiguration of electric and magnetic fields during substorms, phenomena that occur in near-Earth space almost every day and last one to three hours. Substorms are responsible for the spectacular aurora displays seen at high latitudes. The good agreement indicates that the model contains the necessary physical processes and can be used for orbits other than geostationary.

Journal Reference:

N. Y. Ganushkina, O. A. Amariutei, Y. Y. Shprits, M. W. Liemohn. Transport of the plasma sheet electrons to the geostationary distances. Journal of Geophysical Research: Space Physics, 2013; 118 (1): 82 DOI: 10.1029/2012JA017923

View the original article here

Saturday, February 8, 2014

Researchers target sea level rise to save years of archaeological evidence

Prehistoric shell mounds found on some of Florida's most pristine beaches are at risk of washing away as the sea level rises, wiping away thousands of years of archaeological evidence.

"The largest risk for these ancient treasure troves of information is sea level rise," said Shawn Smith, a senior research associate with the Center for Ocean-Atmospheric Prediction Studies at Florida State University.

But a joint project between Smith and the National Park Service is drawing attention to the problem to hopefully minimize the impact on the state's cultural sites.

Smith and Margo Schwadron, an archaeologist with the National Park Service, have embarked on a project to examine past and future changes in climate and how we can adapt to those changes to save areas of shoreline and thus preserve cultural and archeological evidence.

"We're kind of the pioneers in looking at the cultural focus of this issue," Smith said, noting that most weather and ocean experts are concerned about city infrastructure for coastal areas.

To complete the project, the National Park Service awarded Smith a $30,000 grant. With that money, Smith and former Florida State University undergraduate Marcus Johnson spent hours compiling modern, colonial and paleo weather data.

The focus of their initial research is the Canaveral National Seashore and Everglades National Park, which both have prehistoric shell mounds, about 50 feet to 70 feet high. Researchers believe these shell mounds served as foundations for structures and settlements and later served as navigational landmarks during European exploration of the region.

Modern temperature and storm system information was easily available to researchers. But going back hundreds and then thousands of years took a slightly different approach.

Log books from old Spanish forts as well as ships that crossed the Atlantic had to be examined to find the missing information.

The result was a comprehensive data set for the region, so detailed that modern era weather conditions are now available by the hour.

Smith and Schwadron are trying to secure more funding to continue their work, but for now, they are making their data set available to the general public and other researchers in hopes of raising awareness about the unexpected effects of sea level rise.

The National Park Service has also published a brochure on climate change and the impact that sea level rise could have on the shell mounds found at Cape Canaveral.


View the original article here

Friday, February 7, 2014

North Atlantic atmospheric oscillation affects quality of cava

The quality of cava depends on technical factors such as fermentation, aging and bottling processes, which usually remain stable for years. Researchers from the University of Malaga (Spain) have discovered that an atmospheric oscillation over the North Atlantic, which affects the European climate, also influences the attributes of this sparkling wine. In years when the Azores anticyclone is present, the quality of cava drops.

The researchers Raimundo Real and José Carlos Báez, from the University of Malaga, have analysed the possible effects of the North Atlantic oscillation, known in the scientific literature as the NAO, on the quality of Spanish cava in a study published in the International Journal of Biometeorology.

The NAO is a large-scale climate index that reflects the atmospheric pressure difference between the Azores and Iceland: it is positive when an anticyclone sits over the Azores and negative when low-pressure areas dominate that region. This pressure difference, which oscillates over time, has a direct effect on the weather conditions in the Iberian Peninsula.
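The pressure-difference idea behind the NAO can be sketched numerically. This is an illustrative simplification: real NAO products differ in the stations used and the normalisation details, and the sample pressures below are invented.

```python
import statistics

def nao_index(azores_slp, iceland_slp, azores_clim, iceland_clim):
    """Sketch of the NAO index as a normalised sea-level-pressure difference.

    Each station's pressure anomaly is standardised against its climatology
    (mean and standard deviation), and the index is the Azores anomaly minus
    the Iceland anomaly: positive when the Azores high and the Icelandic low
    are both strong. Illustrative only; real products differ in details.
    """
    az_z = (azores_slp - statistics.mean(azores_clim)) / statistics.stdev(azores_clim)
    ic_z = (iceland_slp - statistics.mean(iceland_clim)) / statistics.stdev(iceland_clim)
    return az_z - ic_z

# A strong Azores high (1025 hPa vs. a ~1020 hPa climatology) and a deep
# Icelandic low (1000 hPa vs. ~1008 hPa) give a positive index.
index = nao_index(1025.0, 1000.0,
                  azores_clim=[1018.0, 1020.0, 1022.0],
                  iceland_clim=[1005.0, 1008.0, 1011.0])
```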

"We discovered there was a connection between the NAO and the quality of cava between 1970 and 2008. The existence of positive NAO values during the months of March to August, when the grape is developing and maturing, reduced the capacity of obtaining top quality cava," Raimundo Real told SINC.

The North Atlantic oscillation plays a major role in weather fluctuations in the hemisphere. The phenomenon affects the climate in Europe and the Iberian Peninsula. It is related to temperature and rain variations in cava producing regions, which affects the physiological processes during the grape's period of maturity.

"The likelihood of obtaining a top quality cava is higher when the average value of the NAO is negative. This makes the average temperature in the cava region drop and the quality improves," the expert explained.

Inter-annual variations in the quality of cava are determined by the different aromas and the amount of sugar in the grape. These qualities, in turn, depend on the weather conditions, such as cloud cover, temperature and rainfall, to which the plant is subjected in a given production area, particularly while the grapes develop and mature (March to September).

Predicting the years of top-quality cava

The climate in the Atlantic Ocean, the Mediterranean basin and the surrounding continents shows considerable weather variability.

"During half of the years we analysed, the NAO values are intermediate and do not clearly affect the quality of the cava, but in the other half, the values are more extreme and lead to clearly favorable or unfavorable conditions for obtaining top-quality," says Real.

The information for 2012 pointed towards an 80% likelihood of obtaining a top-quality cava, while the likelihood for 2013 is around 45%, always according to the model obtained. The model correctly predicted 80% of the clearly favorable years for obtaining top-quality cava and 70% of the clearly unfavorable years.

The NAO value between March and August can be calculated at actual harvest time, while the quality of the cava can only be assessed two years later. "This is important for being able to predict years of top-quality cava production, as well as for exploring the possible effects and variations of climate change on the quality," he concluded.


View the original article here

Thursday, February 6, 2014

Precise remote sounding for better climate models

The water budget of the troposphere, the bottom layer of Earth's atmosphere, determines the weather and plays a central role in climate change. The isotope composition of water vapor, i.e. the ratio of light to heavy water molecules, provides insight into the underlying mechanisms. Climate researchers at Karlsruhe Institute of Technology (KIT) gather the required data by in-situ measurements as well as by remote sounding instruments, e.g. on board satellites. In a recent campaign, they combined both methods and proved the precision of remote sounding measurements for the first time.

"Water evaporation and condensation processes as well as the strong greenhouse effect of water vapor and clouds decisively influence the energy balance of the atmosphere and the entire planet," says Matthias Schneider from the KIT Institute of Meteorology and Climate Research (IMK). "The complex water cycle has to be known better to understand our climate and to reliably estimate its development." Global analysis requires reliable remote sounding measurements. The data measured will then help improve the reliability of climate models.
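The "ratio of light and heavy water molecules" mentioned above is conventionally reported in delta notation relative to the VSMOW standard. The formula below is the standard definition; the sample ratio is invented for illustration.

```python
# Isotope ratios of water vapor are conventionally reported in delta
# notation: the per-mil deviation of the sample's heavy/light ratio from
# Vienna Standard Mean Ocean Water (VSMOW). The formula is standard;
# the sample ratio below is invented for illustration.

VSMOW_D_H = 155.76e-6  # D/H ratio of the VSMOW standard

def delta_d_permil(sample_d_h_ratio: float) -> float:
    """delta-D = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (sample_d_h_ratio / VSMOW_D_H - 1.0) * 1000.0

# Tropospheric vapor is depleted in heavy isotopes, so delta-D is negative.
delta = delta_d_permil(130.0e-6)
```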

During a campaign over the sea near the Canary Island of Tenerife, the scientists conducted six measurement flights to determine the isotope composition of water vapor up to a height of 7 km. For this purpose, they used the ISOWAT diode laser spectrometer specially developed by IMK for use on aircraft. This spectrometer ensures high accuracy and temporal resolution under dry as well as humid conditions. In parallel, measurements were carried out at 2,370 and 3,550 m altitude on Tenerife in cooperation with the Spanish weather service (AEMET). Two commercial in-situ instruments and an infrared instrument of the worldwide measurement network NDACC (Network for the Detection of Atmospheric Composition Change) were applied for this purpose. In addition, data from the infrared instrument IASI (Infrared Atmospheric Sounding Interferometer), operated on board the European METOP weather satellite, were used. The flights of the research aircraft were coordinated with the ground-based and satellite measurements.

"For this campaign, the IMK measurement methods for ground- and satellite-based remote sounding were combined with IMK's aircraft-based in-situ measurement methods," Matthias Schneider says. "We found good agreement between the datasets. This means that the precision of the remote sounding instruments, that is the quality of the data supplied by them, was confirmed." For the first time, the researchers have proved that both the worldwide measurement network NDACC with its ground stations and modern weather satellites provide reliable global data for the isotope composition of tropospheric water vapor.

Better understanding of the mechanisms associated with the atmospheric water budget based on isotope composition is the objective of the MUSICA project coordinated by Schneider at IMK. MUSICA stands for "MUlti-platform remote Sensing of Isotopologues for investigating the Cycle of Atmospheric water." MUSICA is funded with EUR 1.5 million by the European Research Council (ERC). The project has a duration of five years.


View the original article here

Wednesday, February 5, 2014

Pacific Ocean temperature influences tornado activity in US

Meteorologists often use information about warm and cold fronts to determine whether a tornado will occur in a particular area. Now, a University of Missouri researcher has found that the temperature of the Pacific Ocean could help scientists predict the type and location of tornado activity in the U.S.

Laurel McCoy, an atmospheric science graduate student at the MU School of Natural Resources, and Tony Lupo, professor and chair of atmospheric science in the College of Agriculture, Food and Natural Resources, surveyed 56,457 tornado-like events from 1950 to 2011. They found that when surface sea temperatures were warmer than average, the U.S. experienced 20.3 percent more tornados that were rated EF-2 to EF-5 on the Enhanced Fujita (EF) scale. (The EF scale rates the strength of tornados based on the damage they cause. The scale has six category rankings from zero to five.)
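The comparison behind a figure like "20.3 percent more" amounts to counting strong (EF-2 to EF-5) events in each temperature regime. A minimal sketch, with event ratings invented for illustration:

```python
def count_strong(ef_ratings):
    """Number of events rated EF-2 or higher."""
    return sum(1 for r in ef_ratings if r >= 2)

def pct_more_strong(warm_ratings, avg_ratings):
    """Percent increase in strong-tornado counts for the warm-SST sample."""
    w, a = count_strong(warm_ratings), count_strong(avg_ratings)
    return (w - a) / a * 100.0

# Invented EF ratings for two equal-length samples of events.
warm = [0, 1, 2, 3, 2, 4, 1, 2]   # events under warm sea-surface temperatures
avg = [0, 2, 1, 3, 0, 1, 2, 1]    # events under average sea-surface temperatures
increase = pct_more_strong(warm, avg)
```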

McCoy and Lupo found that the tornados that occurred when surface sea temperatures were above average were usually located to the west and north of tornado alley, an area in the Midwestern part of the U.S. that experiences more tornados than any other area. McCoy also found that when sea surface temperatures were cooler, more tornadoes tracked from southern states, like Alabama, into Tennessee, Illinois and Indiana.

"Differences in sea temperatures influence the route of the jet stream as it passes over the Pacific and, eventually, to the United States," McCoy said. "Tornado-producing storms usually are triggered by, and will follow, the jet stream. This helps explain why we found a rise in the number of tornados and a change in their location when sea temperatures fluctuated."

In the study, McCoy and Lupo examined the relationship between tornadoes and a climate phenomenon called the Pacific Decadal Oscillation (PDO). PDO phases, which were discovered in the mid-1990s, are long-term temperature trends that can last up to 30 years. According to NASA scientists, the current PDO phase has just entered into a "cool" state.

"PDO cool phases are characterized by a cool wedge of lower than normal sea-surface ocean temperatures in the eastern Pacific and a warm horseshoe pattern of higher than normal sea-surface temperatures extending into the north, west and southern Pacific," McCoy said. "In the warm phase, which lasted from 1977 to 1999, the west Pacific Ocean became cool and the wedge in the east was warm."

In 2011, more than 550 deaths occurred as a result of tornadoes, resulting in more than $28 billion in property damage, according to the U.S. National Oceanic and Atmospheric Administration. McCoy says that with her findings, officials may be able to save lives in the future.

"Now that we know the effects of PDO cool and warm phases, weather forecasters have another tool to predict dangerous storms and inform the public of impending weather conditions," McCoy said.

The research will be presented at the National Weather Association Conference this fall.

Cite This Page:

University of Missouri-Columbia. "Pacific Ocean temperature influences tornado activity in US." ScienceDaily, 17 October 2013. www.sciencedaily.com/releases/2013/10/131017174043.htm

View the original article here

Tuesday, February 4, 2014

Use of media can save lives in bad storms

The number and intensity of storms and other extreme weather events are increasing all over the world. The latest study by the Medical University of Vienna, in cooperation with the US Centers for Disease Control and Prevention (CDC), uses the example of one of the largest American series of tornados of all time to show that the use of certain media can significantly reduce the risk of injury.

Several dozen tornados struck across the southeastern USA in April 2011, leaving an image of devastation. Thomas Niederkrotenthaler from the Centre for Public Health of the Medical University of Vienna used this third-largest series of tornados in the history of the USA as an opportunity to conduct a study, which has just appeared in the latest edition of the international journal PLOS ONE.

Television and social media offer particularly good protection

Together with his research team, Niederkrotenthaler investigated the behavioral factors that reduce or increase the risk of injury. The researchers concentrated in particular on the media use of those affected, which had never been scientifically investigated in this context before. The results of the study show that people who used media intensively for information during the series of tornados had a significantly lower risk of injury. Television and the Internet were especially protective, as were warnings via social media such as Twitter and Facebook.

"The media carried out excellent work. It accurately predicted the streets and the locations through which the tornados would pass, and continuously provided information about changes in the predictions. The corresponding media users could thus effectively protect themselves from the consequences of the storms," says Niederkrotenthaler. "The great protective effect of media has its cause in an important characteristic feature of tornados because unlike hurricanes, its exact course can only be predicted shortly before its arrival. The target forecast lead time of the US National Weather Service is just 15 minutes."

Adapting the US prevention guidelines on the basis of the Medical University of Vienna/CDC study

The media are also important for another reason: approximately 20 percent of injuries occur only after a tornado has passed, mainly during clean-up operations. Toppling trees and chainsaw accidents are especially dangerous and relatively frequent. This finding led to a revision of the American prevention guidelines. Niederkrotenthaler adds: "The tornado prevention guidelines were adapted as a result of our study. The media now inform citizens that they need to be particularly careful after tornadoes as well."

The international research team identified taking refuge in shelters and basements as another important protective factor. "On the whole, it is primary prevention that saves lives in such cases. In Alabama alone, the tornado outbreak caused 212 deaths, and most of the victims never made it to a hospital, which underscores the importance of primary prevention," says Niederkrotenthaler. Tornado sirens also contributed significantly to protecting the population. Although they sounded frequently because of false alarms, those affected surprisingly did not become desensitized; on the contrary: "People who had previously heard the sirens when a tornado actually struck protected themselves better than others during the outbreak we investigated," says Niederkrotenthaler.

Journal Reference:

Thomas Niederkrotenthaler, Erin M. Parker, Fernando Ovalle, Rebecca E. Noe, Jeneita Bell, Likang Xu, Melissa A. Morrison, Caitlin E. Mertzlufft, David E. Sugerman. "Injuries and Post-Traumatic Stress following Historic Tornados: Alabama, April 2011." PLoS ONE, 2013; 8 (12): e83038. DOI: 10.1371/journal.pone.0083038
