Tech

Human-derived mercury shown to pollute the world's deepest ocean trenches

image: The submersible "Deep Sea Warrior", used by Ruoyu Sun's team

Image: 
Ruoyu Sun and IDSSE-CAS

Scientists have found that man-made mercury pollution has reached the bottom of the deepest part of the ocean: the Mariana Trench. This has significant implications for how mercury affects the marine environment, and how it may be concentrated in the food chain. The findings, which come from two independent research groups, are presented at the Goldschmidt geochemistry conference.

Mercury is toxic to humans and other animals, and has been implicated in environmental disasters in the past, most famously at Minamata in Japan in the 1950s, where it caused birth defects and severe neurological symptoms. It tends to become concentrated in marine organisms: small amounts ingested by one species are passed on to the larger species that eat it, so harmful levels of mercury can build up in animals higher in natural food webs, a process known as bioaccumulation. This is why mercury concentrations in swordfish are roughly 40 times those in salmon. Mercury is poisonous at high levels and can be especially dangerous to the developing foetus.
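The step-by-step build-up described above can be sketched as a simple multiplicative model; the baseline concentration and per-step factor below are hypothetical illustration values, not figures from the studies:

```python
# Illustrative biomagnification model (hypothetical numbers):
# each trophic transfer multiplies the mercury concentration in tissue
# by a roughly constant factor.

def concentration_at_level(base_ng_per_g, factor_per_step, steps):
    """Mercury concentration after `steps` trophic transfers."""
    return base_ng_per_g * factor_per_step ** steps

plankton = 0.5  # ng/g, assumed baseline at the bottom of the food web
salmon = concentration_at_level(plankton, 8, 2)     # two steps up
swordfish = concentration_at_level(plankton, 8, 3)  # one step higher

print(f"salmon: {salmon:.0f} ng/g, swordfish: {swordfish:.0f} ng/g")
print(f"swordfish/salmon ratio: {swordfish / salmon:.0f}x")
```

The point of the sketch is that each extra trophic level multiplies, rather than adds to, the burden, which is why top predators carry far higher concentrations than their prey.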

Now two groups of scientists are independently reporting that methylmercury, a toxic form of mercury easily accumulated by animals, from both man-made and natural sources, has been found in fish and crustaceans in the 11,000 m deep Mariana Trench in the Pacific Ocean.

"This is a surprise," said researcher Ruoyu Sun. "Previous research had concluded that methylmercury was mostly produced in the top few hundred metres of the ocean. That would have limited mercury bioaccumulation by ensuring that fish foraging below those depths had little opportunity to ingest the methylmercury. With this work, we now believe that isn't true."

Dr Ruoyu Sun, leading a group of researchers from Tianjin University, China, said: "During 2016-2017, we deployed sophisticated deep-sea lander vehicles on the seafloor of the Mariana and Yap trenches, amongst the most remote and inaccessible locations on Earth, and captured the endemic fauna at 7,000-11,000 m and collected sediments at 5,500-9,200 m. We are able to present unequivocal mercury isotope evidence that the mercury in the trench fauna originates exclusively from methylmercury from the upper ocean. We can tell this because of the distinctive isotopic fingerprint which stamps it as coming from the upper ocean."

Independently, a group led by Dr Joel Blum (University of Michigan) sampled fish and crustaceans from two of the deepest Pacific trenches: the Kermadec Trench near New Zealand (which drops to 10,000 m) and the Mariana Trench off the Philippines. They used mercury isotopic signatures at both locations to show that the mercury found in trench species is largely derived from the atmosphere and enters the ocean in rainfall. Joel Blum said:

"We know that this mercury is deposited from the atmosphere to the surface ocean and is then transported to the deep ocean in the sinking carcasses of fish and marine mammals, as well as in small particles. We identified this by measuring the mercury isotopic composition, which showed that the ocean-floor mercury matched that from fish found at around 400-600 m depth in the Central Pacific. Some of this mercury is naturally produced, but it is likely that much of it comes from human activity.

"This work shows that human-released mercury has reached, and entered food webs in, even the most remote marine ecosystems on Earth. This better understanding of the origin of mercury in the deepest reaches of the ocean will aid in modelling the fate of mercury in the atmosphere and oceans."
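The isotope-matching argument both teams describe can be illustrated with a standard two-endmember mixing calculation. All delta values below are invented for illustration, not measurements from either study:

```python
# Two-endmember isotope mixing: estimate what fraction of the mercury in
# a sample comes from source A, given the isotope signatures of two
# candidate sources. Values (in permil) are hypothetical.

def source_fraction(delta_sample, delta_source_a, delta_source_b):
    """Fraction of mercury attributable to source A in a two-source mix."""
    return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

upper_ocean = 1.6    # assumed signature of upper-ocean methylmercury
deep_source = 0.2    # assumed signature of a deep, geogenic source
trench_fauna = 1.5   # assumed measured value in a trench organism

f = source_fraction(trench_fauna, upper_ocean, deep_source)
print(f"{f:.0%} of the mercury is attributable to the upper-ocean source")
```

When a sample's signature falls almost entirely on one endmember, as in this sketch, the mixing fraction approaches 1, which is the quantitative sense in which a "distinctive isotopic fingerprint" identifies the source.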

Ruoyu Sun commented: "Our findings reveal that very little methylmercury is produced in the deep oceans, and imply that anthropogenic mercury released at the Earth's surface is much more widespread across the deep oceans than was previously thought."

Commenting, Professor Ken Rubin of the Department of Earth Sciences, University of Hawaii, said:

"We know that mercury is introduced into the environment from a variety of natural sources such as volcanic eruptions and forest fires. However, human activities, such as coal and petroleum burning, mining, and manufacturing, are mainly responsible for mercury deposition to marine environments. We are now learning from these two studies that the effects of this deposition have spread throughout the ocean into the deep sea and the animals that live there, which is yet another indicator of the profound impact of modern human activities on the planet."

This is an independent comment; Professor Rubin was not involved in this work.

Credit: 
Goldschmidt Conference

Climate change and the rise of the Roman Empire and the fall of the Ptolemies

The assassination of Julius Caesar on the Ides of March in 44 B.C.E. triggered a 17-year power struggle that ultimately ended the Roman Republic, leading to the rise of the Roman Empire. To the south, Egypt, which Cleopatra was attempting to restore as a major power in the Eastern Mediterranean, was shaken by Nile flood failures, famine, and disease. These events are among the best-known and most important political transitions in the history of western civilization. A new study reveals the role climate change played in them.

An international team of researchers, including Yale's Joe Manning, used historical accounts and climate proxy records -- natural preservers of an environment's history (such as ice cores) -- to uncover evidence that the eruption of Alaska's Okmok volcano in 43 B.C.E. caused global climatic changes that sparked the period's political and social unrest and ultimately changed the course of ancient history. The research was published June 22 in the journal Proceedings of the National Academy of Sciences.

The interdisciplinary team analyzed volcanic fallout records in six Arctic ice cores and found that the largest volcanic eruption in the Northern Hemisphere of the past 2,500 years occurred in early 43 B.C.E. Geochemical analysis of the tephra -- rock fragments and particles ejected by a volcanic eruption -- showed that it originated from the Okmok volcano in Alaska. Climate proxy records show that 43 and 42 B.C.E. were among the coldest years of recent millennia in the Northern Hemisphere, at the start of one of the coldest decades. Further research suggested that this high-latitude eruption led to pronounced changes in hydroclimate, including colder seasonal temperatures in specific Mediterranean regions during the two-year period following the eruption.

The team synchronized these scientific findings with written and archaeological sources from the period, which described unusual climate, crop failures, famine, disease, and unrest in the Mediterranean immediately following the eruption -- suggesting, Manning said, that the otherwise sophisticated and powerful ancient states were significantly vulnerable to these climatic shocks from a volcanic eruption located on the opposite side of the earth.

The decade of the 40s B.C.E. was a period of food insecurity and famine in Egypt during the reign of Cleopatra, coinciding with years in which the Nile River failed to flood. While there is some rain in the region, it is not enough to sustain agriculture, and Egyptians relied heavily on the annual Nile flood to water their crops, said Manning, the William K. and Marilyn Milton Simpson Professor of Classics and a scholar of ancient Egyptian history. "We know that the Nile River did not flood in 43 B.C.E. and 42 B.C.E. -- and now we know why. This volcanic eruption greatly affected the Nile watershed."

One of the texts that corroborates these findings is dated to about 39 B.C.E. -- year 13 of Cleopatra's reign -- but refers to the large-scale famine and social distress of the previous decade. The inscription describes a local governor who saved the population from widespread famine by finding food when there had not been a Nile flood for several years. He is recognized as a savior by priesthoods, said Manning. "This inscription does not describe collapse or resilience," he said. "It is a more complicated story of trying to survive and to figure out how to distribute grain during a very chaotic time."

Today, Okmok Island, located in the mid-Aleutian Islands, has a population of about 40 people and 7,500 head of cattle. Manning finds irony in the fact that one of the most significant places in world history is in an extremely remote part of the world: "This large volcanic eruption that happened in the winter of 43 B.C.E. had cascading impacts on the climate system and on human societies in the Mediterranean during a vulnerable period of time."

Yet, he added, "Neither Roman scientists nor ancient priests had any notion of Okmok Island."

The new research "allows us to rethink ancient history, especially with regard to environment and climate, and to create a vision of a dynamic, three-dimensional society," Manning said.

In addition to Manning, the team involved in the paper included researchers from the Desert Research Institute, University of Cambridge, University of Bern, Queen's University Belfast, University of Oxford, Trinity College-Dublin, University of Alaska-Fairbanks, University of Göttingen, and University of Copenhagen.

Manning and Francis Ludlow's (Trinity College-Dublin) work on the PNAS paper was funded by an NSF grant for the Yale-led project "Volcanism, Hydrology and Social Conflict: Lessons from Hellenistic and Roman-Era Egypt and Mesopotamia."

Credit: 
Yale University

Microbubbles controlled by acoustical tweezers for highly localized drug release

image: Local release, assisted by an acoustic trap, of nanoparticles transported by microbubbles

Image: 
Diego Baresch, Institut de mécanique et d'ingénierie de Bordeaux (CNRS/Université de Bordeaux/Arts et Métiers Paristech/Bordeaux INP)

Microbubbles are used every day as contrast agents in medical sonography, and are the subject of intense research for the delivery of therapeutic agents. There are a number of options available to manipulate these microbubbles, including the use of light and sound, although the potential of the latter remains largely unexplored. In their research published on 22 June 2020 in PNAS, CNRS researcher Diego Baresch and Valeria Garbin, a researcher at the Delft University of Technology (The Netherlands), show that it is entirely possible to manipulate microbubbles using "acoustical tweezers," a tool developed in 2016 that uses an acoustic beam to trap an object without contact. By using these acoustical tweezers through layers of bio-mimicking, elastic materials, they surpassed a key limitation of optical tweezers, which cannot propagate through opaque media (such as in vivo tissue). As a result, the scientists have opened the way for a broader application of acoustical tweezers in biology and biomedicine, for instance for the highly localized, reproducible, and controlled delivery of medicine, or for in vitro tissue engineering using stem cells.

Credit: 
CNRS

Artificial night sky poses serious threat to coastal species

image: The sand hopper (Talitrus saltator) is a common feature of Europe's coasts

Image: 
John Spicer, University of Plymouth

The artificial lighting which lines the world's coastlines could be having a significant impact on species that rely on the moon and stars to find food, new research suggests.

Creatures such as the sand hopper (Talitrus saltator) orientate their nightly migrations based on the moon's position and the brightness of the natural night sky.

However, a study by Bangor University and the University of Plymouth shows the presence of artificial light originating from cities several kilometres away (also known as artificial skyglow) disrupts the lunar compass they use when covering long distances.

In some cases, this can lead to them travelling towards the sea and away from food, while in others it reduces the chance of them venturing out on forays for food at all.

Writing in Current Biology, researchers say this could pose a distinct threat not just to the health of sand hopper populations but also the wider ecosystem, since they play an important role in breaking down and recycling algae washed up on strandlines.

The study was conducted as part of the Artificial Light Impacts on Coastal Ecosystems (ALICE) project, funded by the Natural Environment Research Council.

Dr Thomas Davies, Lecturer in Marine Conservation at the University of Plymouth (UK), is the paper's senior author and principal investigator on the ALICE project. He said:

"Skyglow is the most geographically widespread form of light pollution. Surveys have shown it can currently be detected above 23% of the world's coasts nightly, and with coastal human populations set to at least double by 2060 its effects are only going to increase. Our results show it is already having demonstrable impacts on biological processes that are guided by celestial light cues.

"Through the ALICE project, we are finding increasing evidence that light pollution from coastal cities can influence marine species inhabiting nearby beaches, rocky shores and even the seafloor. These results highlight how pervasive city lighting could be in shaping the ecology of coastlines kilometres distant from their nearest urban centres. They also highlight the potential for artificial skyglow to impact other species that undergo migrations using the moon as a compass.

"While our understanding of the impacts of street lights on nature has improved dramatically, artificial skyglow has been largely overlooked. More work is urgently needed to fully understand the extent to which it is shaping the natural environment."

Stuart Jenkins, Professor of Marine Ecology at Bangor University and one of the study's co-authors, added:

"It is easy to forget the critical influence of the moon in guiding many organisms' movements. However, we are increasingly realising that by disrupting patterns of night time lighting, we are potentially reducing the ability of animals to navigate. This new research on the shores of North Wales shows clearly that very low levels of artificial light can have far-reaching effects on coastal marine species."

The sand hopper is a common feature of Europe's coasts and spends daytimes buried in the sand at depths of 10-30cm, emerging at night to feed on decaying seaweed and other detritus.

For this study, researchers monitored the sand hopper population on Cable Bay beach in North Wales (UK), a naturally dark location, over 19 nights between June and September 2019.

They observed the behaviour of almost 1,000 individuals under a range of moon phases and weather conditions, before introducing artificial light that replicated the intensity and colour of skyglow from towns and cities around the UK coastline.

Credit: 
University of Plymouth

Synthetic materials mimic living creatures

video: 'Hybrid bonding polymer' object crawls on a surface driven by alternating periods of light exposure and darkness.

Image: 
Northwestern University

EVANSTON, Ill. -- Northwestern University researchers have developed a family of soft materials that imitates living creatures.

When hit with light, the film-thin materials come alive -- bending, rotating and even crawling on surfaces.

Called "robotic soft matter" by the Northwestern team, the materials move without complex hardware, hydraulics or electricity. The researchers believe the lifelike materials could carry out many tasks, with potential applications in energy, environmental remediation and advanced medicine.

"We live in an era in which increasingly smarter devices are constantly being developed to help us manage our everyday lives," said Northwestern's Samuel I. Stupp, who led the experimental studies. "The next frontier is in the development of new science that will bring inert materials to life for our benefit -- by designing them to acquire capabilities of living creatures."

The research will be published on June 22 in the journal Nature Materials.

Stupp is the Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering at Northwestern and director of the Simpson Querrey Institute. He has appointments in the McCormick School of Engineering, Weinberg College of Arts and Sciences and Feinberg School of Medicine. George Schatz, the Charles E. and Emma H. Morrison Professor of Chemistry in Weinberg, led computer simulations of the materials' lifelike behaviors. Postdoctoral fellow Chuang Li and graduate student Aysenur Iscen, from the Stupp and Schatz laboratories, respectively, are co-first authors of the paper.

Although the moving material seems miraculous, sophisticated science is at play. Its structure comprises nanoscale peptide assemblies that drain water molecules out of the material. An expert in materials chemistry, Stupp linked the peptide arrays to polymer networks designed to be chemically responsive to blue light.

When light hits the material, the network chemically shifts from hydrophilic (attracts water) to hydrophobic (resists water). As the material expels the water through its peptide "pipes," it contracts -- and comes to life. When the light is turned off, water re-enters the material, which expands as it reverts to a hydrophilic structure.

This is reminiscent of the reversible contraction of muscles, which inspired Stupp and his team to design the new materials.

"From biological systems, we learned that the magic of muscles is based on the connection between assemblies of small proteins and giant protein polymers that expand and contract," Stupp said. "Muscles do this using a chemical fuel rather than light to generate mechanical energy."

For Northwestern's bio-inspired material, localized light can trigger directional motion. In other words, bending can occur in different directions, depending on where the light is located. And changing the direction of the light also can force the object to turn as it crawls on a surface.

Stupp and his team believe there are endless possible applications for this new family of materials. With the ability to be designed in different shapes, the materials could play a role in a variety of tasks, ranging from environmental clean-up to brain surgery.

"These materials could augment the function of soft robots needed to pick up fragile objects and then release them in a precise location," he said. "In medicine, for example, soft materials with 'living' characteristics could bend or change shape to retrieve blood clots in the brain after a stroke. They also could swim to clean water supplies and sea water or even undertake healing tasks to repair defects in batteries, membranes and chemical reactors."

Credit: 
Northwestern University

Diagnosing brain tumors with a blood test

image: Princess Margaret Senior Scientist Dr. Daniel De Carvalho and Krembil Brain Institute Medical Director Dr. Gelareh Zadeh collaborated to combine advanced technology with machine learning to develop a highly sensitive and accurate blood test to detect and classify brain cancers.

Image: 
UHN

TORONTO - A simple but highly sensitive blood test has been found to accurately diagnose and classify different types of brain tumours, promising more accurate diagnosis, less invasive methods and better treatment planning for patients in the future.

The finding, published in Nature Medicine on June 22, 2020, describes a non-invasive and easy way to classify brain tumours. The study is also being presented virtually today at the prestigious Opening Plenary Session of the American Association for Cancer Research Annual Meeting 2020: Turning Science into Lifesaving Care.

A major challenge in treating brain cancer is the accurate diagnosis of its different types, with tumours ranging from low grade, which can look almost normal under a microscope, to highly aggressive. Cancer grades are used to determine prognosis and assist in treatment planning.

Current methods to diagnose and establish the subtype of brain cancer based on molecular information rely upon invasive surgical techniques to obtain tissue samples, which is a high-risk procedure and anxiety-provoking for patients.

The ability to diagnose and classify the type of brain tumour without the need for a tissue sample is revolutionary and practice changing. In some cases, surgery may not even be necessary.

"If we had a better and more reliable way to diagnose and subtype tumours, we could transform patient care," says Dr. Gelareh Zadeh, Medical Director of the Krembil Brain Institute, Head of Surgical Oncology at the Princess Margaret Cancer Centre, Senior Scientist at the Princess Margaret Cancer Research Institute, Professor of Surgery, University of Toronto, and a co-senior author in the study.

"It would have a tremendous impact on how we treat these cancers, and in how we plan our treatments."

Dr. Zadeh worked with Senior Scientist Dr. Daniel De Carvalho at Princess Margaret Cancer Centre, who is a world leader in the field of cancer epigenetics applied to early detection, classification and novel therapeutic interventions.

Dr. De Carvalho's lab specializes in a type of epigenetic modification called DNA methylation, which plays an important role in the regulation of gene expression (turning genes on or off) in cells. In cancer cells, DNA methylation patterns are disrupted, leading to unregulated cancer growth.

Dr. De Carvalho has previously developed a DNA methylation-based liquid biopsy approach to profile hundreds of thousands of these epigenetic alterations in DNA molecules circulating in the blood. These fragments are called circulating tumour DNA or ctDNA. Combining this new technology with machine learning, his team was able to develop a highly sensitive and accurate test to detect and classify multiple solid tumours.

Working together, Drs. Zadeh and De Carvalho decided to apply this same approach to the challenging problem of intracranial brain tumour classification. The clinicians and scientists tracked the cancer's origin and type by comparing tumour tissue samples from brain cancer pathology with cell-free DNA circulating in the blood plasma of 221 patients.

Using this approach, they were able to match the circulating plasma ctDNA to the tumour DNA, confirming their ability to identify brain tumour DNA circulating in the blood of these patients. Then, using a machine learning approach, they developed a computer program to classify the brain tumour type based solely on the circulating tumour DNA.
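The classification step described above can be illustrated with a toy nearest-centroid sketch. The methylation profiles and tumour labels below are invented, and the authors' actual machine-learning pipeline is far more sophisticated; this only shows the general idea of matching a plasma ctDNA profile to the closest reference tumour type:

```python
# Toy methylation-profile classifier (invented data): each sample is a
# vector of methylation fractions at a few CpG sites; a new plasma ctDNA
# profile is assigned to the nearest tumour-type centroid.

from statistics import mean

def centroid(profiles):
    """Average methylation fraction at each site across samples."""
    return [mean(site) for site in zip(*profiles)]

def distance(a, b):
    """Euclidean distance between two methylation profiles."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(sample, centroids):
    """Label of the centroid closest to the sample."""
    return min(centroids, key=lambda label: distance(sample, centroids[label]))

# Reference profiles per tumour type (hypothetical)
training = {
    "glioma":     [[0.9, 0.1, 0.8], [0.8, 0.2, 0.9]],
    "meningioma": [[0.2, 0.9, 0.1], [0.3, 0.8, 0.2]],
}
centroids = {label: centroid(p) for label, p in training.items()}

plasma_sample = [0.85, 0.15, 0.8]  # ctDNA profile from blood (assumed)
print(classify(plasma_sample, centroids))  # → glioma
```

The real test profiles hundreds of thousands of methylation sites, which is what makes it sensitive enough to pick up the faint tumour-derived signal among abundant non-tumour DNA fragments.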

Prior to this, it was not thought possible to detect any brain cancers with a blood test because of the impermeable blood-brain barrier, says Dr. Zadeh. This barrier exists between the brain's blood vessels and its tissue, protecting the brain from any toxins in the blood.

"But because this test is so sensitive in picking up even small amounts of highly specific tumour-derived signals in the blood, we now have a new, non-invasive way of detecting and discriminating between common brain tumours - something which was long thought impossible. This really is a tour de force," explains Dr. Zadeh.

Dr. Daniel De Carvalho, a Canada Research Chair in Cancer Epigenetics and Associate Professor at University of Toronto, adds that the field of identifying tumour-specific alterations in ctDNA with new, more sensitive tests in various body fluids - such as blood and urine - is now at a turning point because advanced technologies can detect and analyze even the smallest traces of cancer-specific molecular signatures from the vast quantities of circulating non-tumour DNA fragments.

"The possibility to map epigenetic modifications genome-wide, combined with powerful computational approaches, has brought us to this tipping point," says Dr. De Carvalho.

"Molecular characterization of tumours by profiling epigenetic alterations in addition to genetic mutations gives us a more comprehensive understanding of the altered features of a tumour, and opens the possibilities for more specific, sensitive, and tumour agnostic tests."

In an accompanying paper, also published in Nature Medicine on June 22, 2020, Dr. De Carvalho and his collaborators from the Dana-Farber Cancer Institute at Harvard University show that the same blood test can accurately identify kidney cancer from circulating cell-free DNA obtained either from plasma or from urine.

Credit: 
University Health Network

Mysterious climate change

image: Drilling camp in the Horseshoe Valley with flat drill in the foreground.

Image: 
© Chris Turney

New research findings underline the crucial role that sea ice throughout the Southern Ocean played in atmospheric CO2 during times of rapid climate change in the past. An international team of scientists, with the participation of the University of Bonn, has shown that the seasonal growth and destruction of sea ice in a warming world increase the biological productivity of the seas around Antarctica by extracting carbon from the atmosphere and storing it in the deep ocean. This process helps to explain a long-standing question about an apparent 1,900-year pause in CO2 growth during a period known as the Antarctic Cold Reversal. The research results have now been published in "Nature Geoscience".

Surrounding the remote continent of Antarctica, the Southern Ocean is one of the most important yet poorly understood components of the global carbon cycle. Having captured half of all human-related carbon that has entered the ocean to date, the Southern Ocean is crucial to regulating human-induced CO2. Therefore, understanding the processes that determine its effectiveness as a carbon sink through time is essential to reducing uncertainty in climate projections.

After the Last Ice Age, around 18,000 years ago, the world transitioned naturally into the warm interglacial world we live in today. During this period, CO2 rose rapidly from around 190 ppm to 280 ppm over around 7,000 years. This rise was not steady, and was interrupted by rapid rises and intermittent plateaus, reflecting different processes within the global carbon cycle.

Antarctic Cold Reversal

One period stands out: a 1,900-year plateau of near-constant CO2 levels at 240 ppm starting some 14,600 years ago called the Antarctic Cold Reversal. The cause of this plateau remains unknown, but understanding the processes may be critical for improving projections surrounding climate-carbon feedbacks.

"We found that in sediment cores located in the sea-ice zone of the Southern Ocean biological productivity increased during this critical period, whereas it decreased farther north, outside of the sea-ice zone", says Michael Weber, co-author of the study from the Institute for Geosciences at the University of Bonn. "It was now important to find out how climate records on the Antarctic continent depict this critical time period."

To resolve this question researchers from Keele University, U.K., and the University of New South Wales (UNSW) in Sydney, Australia, travelled to the Patriot Hills Blue Ice Area to obtain new records of marine biomarkers captured in ice cores. Chris Fogwill, lead author of the study from Keele University, says "the cause of this long plateau in global atmospheric CO2 levels may be fundamental to understanding the potential of the Southern Ocean to moderate atmospheric CO2. Whilst recent reductions in emissions due to the Covid-19 pandemic have shown that we can reduce CO2, we need to understand the ways in which CO2 levels have been stabilised by natural processes, as they may be key to the responsible development of geoengineering approaches and remain fundamental to achieving our commitment to the Paris Agreement".

Horizontal ice core analysis

Blue ice areas are created by fierce, high-density katabatic winds that erode the top layer of snow effectively and expose the ice below. As a result, ice flows up to the surface, providing access to ancient ice below. While most Antarctic researchers drill down into the ice to extract samples with a conventional ice core, this team used a different method: horizontal ice core analysis. Chris Turney (UNSW, Sydney) says "Instead of drilling kilometres into the ice, we can simply walk across a blue ice area to travel back through time. This provides the opportunity to sample large volumes of ice necessary for studying new organic biomarkers and DNA that were blown from the Southern Ocean onto Antarctica and preserved in the blue ice."

The results demonstrated a marked increase in the number and diversity of marine organisms across the 1,900-year period of the CO2 plateau, an observation never made before. The team also conducted climate modelling, which revealed that this period coincided with the greatest seasonal changes in sea ice extent from summer to winter. Together with the marine cores, these findings provide the first evidence of increased biological productivity and suggest that processes in the Antarctic Zone of the Southern Ocean may have caused the CO2 plateau.

The team will use this work to underpin the development of climate models that seek to improve our understanding of future climate change. The inclusion of sea ice processes that control climate-carbon feedbacks in a new generation of models will be crucial for reducing uncertainties surrounding climate projections and help society adapt to future warming.

Credit: 
University of Bonn

Positive YouTube videos of wolves linked to greater tolerance

A new study from North Carolina State University suggests that people have more tolerance for wolves after seeing positive videos about them, which could make YouTube an important wolf conservation tool.

"One of the cool things about these results is that positive messaging was effective for changing people's views. People had more positive attitudes, greater willingness to accept wolves, and were more likely to take action to help their conservation - no matter their political identity or their age - after watching positive videos," said Nils Peterson, senior author of the study and a professor in NC State's Department of Forestry and Environmental Resources.

"A lot of wildlife species we care about only need tolerance to persist in a landscape," Peterson added. "They're not domestic animals that need a lot of help from us. They just need us not to kill them or destroy their habitat."

In the study, researchers evaluated how a group of 273 people rated their tolerance for wolves before and after watching either a playlist of five different negative videos, a playlist of five different positive videos, or a neutral video.

To measure their tolerance, researchers asked questions in three categories: they asked participants about their overall attitudes toward wolves, such as whether they thought wolves were "good" or "bad;" their level of acceptance of wolves in their state and near populated areas; and their intended behaviors, or whether they would be likely to act for or against wolves or their conservation.

Survey participants already held positive attitudes, acceptance and behavioural intentions towards wolves prior to receiving any treatment, but the researchers found that positive videos could still improve attitudes, acceptance and participants' willingness to act. They also saw those changes regardless of whether the viewer identified as conservative or liberal.
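The pre/post design behind these findings can be sketched as follows; the ratings, scale and group sizes below are made up for illustration and are not the study's data:

```python
# Sketch of a pre/post treatment comparison (hypothetical ratings):
# tolerance is measured before and after a video treatment, and the
# change is averaged within each treatment group.

from statistics import mean

responses = [
    # (treatment, pre_score, post_score) on an assumed 1-7 scale
    ("positive", 4.0, 5.5),
    ("positive", 5.0, 6.0),
    ("negative", 4.5, 4.0),
    ("negative", 5.0, 4.2),
    ("neutral",  4.8, 4.8),
]

def mean_change(treatment):
    """Average post-minus-pre shift for one treatment group."""
    deltas = [post - pre for t, pre, post in responses if t == treatment]
    return mean(deltas)

for group in ("positive", "negative", "neutral"):
    print(f"{group}: mean change {mean_change(group):+.2f}")
```

Comparing each group's mean change against the neutral-video control is what lets a study of this kind attribute the shift to the message itself rather than to simply taking the survey twice.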

"Everybody is on social media these days, including state wildlife agencies, federal agencies, nonprofits, and everybody is putting content out there," said the study's lead author Will Casola, a Ph.D. student at NC State. "This study shows that this material actually has the potential to influence people, and they're not just putting time and resources into something that goes in one ear and out the other."

However, people who identified as liberal were more likely than conservatives to show positive changes in favor of wolves in measures of attitudes, acceptance and intended behaviors regardless of the videos they watched.

"We didn't see anything that would suggest people reacted differently to each video treatment depending on their political affiliation," Casola said. "Instead, we saw that no matter which videos they watched, liberals were more likely to exhibit positive changes."

The largest changes in tolerance were linked to older age. People above the age of 40, regardless of political background, were more likely to have larger changes in their attitudes for or against wolves.

While negative videos also led to decreased tolerance for wolves, this change was less dramatic.

"There's a lot of literature out there that shows that positively framed messages are more powerful than negatively framed messages, and these findings reinforce that," Casola said.

Researchers saw improvements in respondents' overall willingness to act for wolf conservation, but apart from signing petitions to support wolf re-introduction, respondents were reluctant to take specific actions to aid wolf conservation.

"People in general said they weren't likely to participate in many of these behaviors, but they were also less likely to participate in behaviors that were directly opposed to wolf recovery and conservation," Casola said.

Researchers focused on wolves since they can be controversial. While researchers said wolves are essential for maintaining a diversity of species in a landscape and improving the health of populations they prey on, they can also compete with people for space and resources, and can pose a risk for livestock.

Researchers said one unanswered question in their work is about how effective the videos were at reaching people who may not already agree with the underlying message.

"People are already asking the question: How do we get media to cross ideological bubbles that people have created?" Peterson said.

Credit: 
North Carolina State University

Satellites have drastically changed how we forecast hurricanes

image: This video looks at advances in hurricane forecasting, with a focus on the contributions from weather satellites.

Image: 
NASA

The powerful hurricane that struck Galveston, Texas on September 8, 1900, killing an estimated 8,000 people and destroying more than 3,600 buildings, took the coastal city by surprise.

This video looks at advances in hurricane forecasting in the 120 years since, with a focus on the contributions from weather satellites. This satellite technology has allowed us to track hurricanes - their location, movement and intensity.

"One of the dramatic impacts is that satellite data keeps an eye on the target," especially over unpopulated areas such as oceans, said JPSS Director Greg Mandt. "We're sort of like your eyes in the sky to make sure that Mother Nature can never surprise you."

A fleet of Earth-observing satellites, including those from the Joint Polar Satellite System (JPSS) and the Geostationary Operational Environmental Satellite series (GOES-R), has enabled remarkable advances in hurricane forecasting. The JPSS polar-orbiting satellites measure the state of the atmosphere by taking precise measurements of sea surface temperatures and atmospheric temperature and moisture, which are critical to accurate storm forecasts several days in advance.

Improved sensors also give us a better understanding of the core of hurricanes and allow forecasters to predict where they're going to hit, without over-warning, Mandt said. "Then you can narrow and shrink that cone of uncertainty and give a better prediction."

The GOES satellites orbit at the same rate as the Earth spins, which allows them to stare at hurricanes as they evolve. That, combined with advances to the sensors, gives us a view of hurricanes in motion.

"We take a full-disc picture of the entire hemisphere in five minutes," said GOES-R Series System Program Director Pam Sullivan. "But we can also look at a smaller area and scan it every 30 seconds. You get to see the hurricane eye wall forming. You can see it actually forming in real time. The Earth looks alive. It looks like a living thing."

Credit: 
NASA/Goddard Space Flight Center

Recovery from airline delays works best with future disruptions in mind

image: Industrial and enterprise systems engineering professor Lavanya Marla and her colleagues developed a new approach to airline-disruption recovery that could save the industry millions of dollars.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Instead of responding to each flight delay as if it were an isolated event, airlines should consider the likelihood of potential disruptions ahead, researchers report in the journal Transportation Science. They developed a new approach that allows airlines to respond to flight delays and cancellations while also incorporating information about likely disruptions later the same day.

Their model suggests this approach could reduce airline recovery costs by 1%-2%, potentially resulting in millions of dollars of savings a year, the researchers say.

Flight disruptions waste precious resources and cost airlines tens of billions of dollars a year, said study lead Lavanya Marla, a professor of industrial and enterprise systems engineering at the University of Illinois at Urbana-Champaign. Because most airports in the U.S. schedule flights to arrive or depart every two minutes, delays at one or two major airports can propagate quickly through the system. While some timing buffers are built into the network to allow for minor delays, larger disruptions - for example, those stemming from a powerful weather system in one region of the country - tend to magnify problems across the network of airports as the day progresses.

Understanding these probabilities can help airlines respond to disruptions in a more realistic manner, Marla said.

"We are trying to introduce the idea that we should be reactive and proactive at the same time," she said.

For example, an airplane that is behind schedule could use more fuel to fly faster to make its destination on time, Marla said.

"But if I know that there is a high likelihood that the flight will experience a delay at the other end, I may decide not to waste a lot of money trying to speed it up," she said. "That way, I don't incur those unnecessary costs."

Airlines can respond to disruptions in a number of ways. They can hold flights so that delayed passengers and crew members can make their connections. They can cancel flights to minimize disruptions elsewhere in the system. They can swap aircraft. They can switch the crew pairings for particular flights. They also can reroute aircraft or change their speed, flight pattern or elevation.

Some options are more disruptive or expensive than others, Marla said. With her colleagues, Alexandre Jacquillat, of the Massachusetts Institute of Technology, and U. of I. civil and environmental engineering graduate student Jane Lee, Marla developed the Stochastic Reactive and Proactive Disruption Management model, which uses estimates of potential future disruptions to choose the least costly options available. It often deliberately introduces flight-departure holds, which are less costly than speeding up the aircraft, canceling flights or swapping aircraft.

"We are going to trade a lot of these very costly measures for a number of strategically placed low-impact approaches," Marla said. "That may result in more delayed flights, but that's because I'm holding these flights deliberately so that my network connectivity is preserved."

The model is designed to minimize an airline's recovery costs, Marla said.

"A solution that's good for the airline might not be good for individual passengers," she said. "But reducing delays on the whole is good for passengers."

Future studies should incorporate data that also prioritizes the needs of airline crews and passengers, she said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Are protected areas effective at maintaining large carnivore populations?

image: Brown bear in eastern Finland. Interestingly, the effects of protected areas on bear densities varied depending on the methodology used.

Image: 
Daniel Burgas-Riera

A recent study, led by the University of Helsinki, used a novel combination of statistical methods and an exceptional data set collected by hunters to assess the role of protected areas for carnivore conservation in Finland.

Overall, protected areas do not harbour higher densities of large carnivore species than unprotected lands. Wolverine densities even declined within protected areas while populations outside remained broadly stable over the 30-year study period. The study was published in the journal Nature Communications.

The international group of authors, led by Dr Julien Terraube from the Faculty of Biological and Environmental Sciences at the University of Helsinki, proposes that the results do not indicate that protected areas are unimportant for carnivore conservation, as they may support seasonal habitats and prey for these highly mobile species. However, the outcomes highlight complex socio-ecological pressures on carnivore populations that vary in both time and space and affect the conservation outcomes of protected areas. For example, the largest Finnish protected areas are located in Lapland, and due to their sizes these areas are most suitable for large carnivores. However, the areas seem unable to maintain stable wolverine populations, which may be linked to increased conflicts with herders in the reindeer husbandry area.

"Wolverines are only found in three Nordic countries within the European Union, and therefore Finland plays an important role for the conservation of this species", explains Dr Terraube. He adds: "The negative trend of wolverine populations inside northern protected areas is alarming and highlights that further research is needed to understand the dynamics of wolverine populations in Lapland, how this species is affected by illegal killing and what protected areas could do to improve this situation".

On a brighter note, the researchers also found lynx densities to be higher within protected areas located in eastern Finland than in those located in the western part of the country. The ecological factors that may influence this, such as prey abundance or connectivity to healthy Russian populations, remain unexplored.

The potential of citizen science for assessing the impact of protected areas

The results show that counterfactual approaches applied to long-term and large-scale data are powerful analytical tools for evaluating the effectiveness of protected areas in maintaining wildlife populations. A counterfactual approach means comparing protected and unprotected sites that have similar environmental characteristics or human-caused threats. The method has been increasingly used to assess the effectiveness of protected areas in halting deforestation. This allows researchers to isolate the effect of protection on land cover from other confounding factors such as elevation. Until now, these types of approaches focused on matching analyses have been restricted to studies investigating the effects of protected areas on land-use changes. Finding wildlife time series with enough temporal and spatial coverage to conduct such robust effectiveness assessments is often difficult.
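A matching analysis of the kind described above can be sketched as pairing each protected site with its most environmentally similar unprotected site and comparing outcomes. A toy illustration, with made-up sites and covariates rather than the study's actual data or statistical machinery:

```python
def nearest_match(site, candidates, covariates):
    """Match a protected site to the unprotected site with the most
    similar covariates (simple Euclidean distance)."""
    def dist(a, b):
        return sum((a[c] - b[c]) ** 2 for c in covariates) ** 0.5
    return min(candidates, key=lambda c: dist(site, c))

def matched_effect(protected, unprotected, covariates, outcome):
    """Average protected-minus-matched-control difference in the outcome."""
    diffs = []
    for site in protected:
        control = nearest_match(site, unprotected, covariates)
        diffs.append(site[outcome] - control[outcome])
    return sum(diffs) / len(diffs)

# Hypothetical sites: covariates are elevation and human density,
# outcome is a carnivore density index.
protected = [{"elev": 300, "humans": 2, "density": 1.1},
             {"elev": 150, "humans": 10, "density": 0.8}]
unprotected = [{"elev": 310, "humans": 3, "density": 1.0},
               {"elev": 140, "humans": 12, "density": 0.9},
               {"elev": 900, "humans": 0, "density": 2.0}]

print(round(matched_effect(protected, unprotected, ["elev", "humans"], "density"), 6))
# → 0.0: no detectable protection effect in this toy data
```

Comparing each protected site only against its matched twin - rather than against all unprotected land - is what isolates the effect of protection itself from confounders like elevation or remoteness.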

Dr. Terraube explains: "We were able to use data collected through the Finnish Wildlife Scheme to conduct this study. Hunters throughout the country have collected this data set since 1989, offering a fantastic opportunity to apply matching analyses to wildlife data for the first time and to assess large-scale and long-term patterns of protected area effectiveness. We chose to focus on large carnivores, as this species group is particularly prone to rising conflicts with local communities. Carnivore-human conflicts have increased in Finland following the recent recovery of most carnivore species. This has resulted in increasingly negative attitudes towards certain species, such as the wolf, and to increased levels of illegal killing".

Mainstreaming impact evaluation: towards better management of protected areas

The study highlights the need to design robust methodological tools to strengthen our understanding of conservation outcomes and opens new avenues for improving protected area impact assessments. This is of the utmost importance, as the international community is currently turning to the post-2020 targets drafted by the UN Convention on Biological Diversity aiming to upgrade protected areas in an attempt to halt global biodiversity loss.

"We argue that this study shows that, despite methodological challenges, robust assessments of protected area effectiveness for the conservation of wide-ranging species, such as large carnivores, are possible and greatly needed as a basis for further research. It also highlights the extraordinary value of long-term wildlife monitoring activities conducted by citizens across an entire country", concludes Dr Terraube.

Credit: 
University of Helsinki

Undergraduate student discovers 18 new species of aquatic beetle in South America

image: University of Kansas undergraduate Rachel Smith collects aquatic beetles along a river margin in Suriname.

Image: 
Andrew Smith

LAWRENCE -- It would be striking for a seasoned entomologist with decades of fieldwork to discover such a large number of species unknown to science. But for University of Kansas student Rachel Smith, an undergraduate majoring in ecology & evolutionary biology, the find is extraordinary: Smith recently published a description of 18 new species of aquatic beetle from the genus Chasmogenus in the peer-reviewed journal ZooKeys.

"The average size of these beetles, I would say, is about the size of a capital 'O' in a 12-point font," said Smith of the collection of new species. "They spend a lot of their life in forest streams and pools. They're aquatic, so they're all great swimmers -- and they can fly."

The research involved Smith traveling to Suriname to perform fieldwork as well as passing countless hours in the lab of Andrew Short, associate professor of ecology & evolutionary biology and associate curator with KU's Biodiversity Institute, who co-wrote the new paper.

Smith said many of the aquatic beetle species are virtually indistinguishable simply by looking at them, even under a microscope.

"Something unique and fascinating about this genus, particularly the ones I worked on, is that many look almost exactly the same," she said. "Even to my trained eye, it's hard to tell them apart just based on external morphology. Their uniqueness is in there but kind of hidden in this very uniform external morphology."

To identify the new species, Smith compared DNA evidence from the aquatic beetles with a few external morphological differences that could be observed. But this was not enough: Much of Smith's work also hinged on dissecting these tiny specimens collected in northeastern South America to spot key differences in their internal anatomy.

"Because it's difficult to tell them apart from external morphology, you kind of have to go inside," she said. "I ended up doing over 100 dissections of these beetles to extract the male genitalia and look at it under a microscope. That really was the true way to tell them apart. Ultimately, it came down to male genitalia and genetic divergence that I used to delimit many of these species."

The aquatic beetles described in the new paper were collected over multiple trips to Venezuela, Suriname and Guyana. Smith herself participated in one expedition to Suriname to collect specimens.

"In Suriname, almost every day involved a boat ride down a river or kayaking to a location," she said. "And there would have been either a short or a long hike. One day it was up an entire mountain, another day it was just a short little hike down a river trail. Well, not necessarily a trail because there aren't trails in the rainforest. We'd find an area that had some small, slow-moving or stagnant pools. The best ones are usually still and have dead leaves and mud and detritus -- that's where a lot of these beetles are found. You definitely have to get dirty to do this work, but it's very satisfying."

Indeed, one of the beetles Smith and her fellow researchers discovered in the Suriname rainforest ended up being unknown to science.

"I was part of a group that collected one of the beetles that was named in this paper," she said. "So, I was involved in the entire process of naming a species -- going to the rainforest, collecting it, bring it back to the lab, naming it and describing it. It was so nice to be a part of the whole process of discovering a new species."

Smith's co-author and faculty mentor Short said her paper reflects two years of work and is a remarkable accomplishment for any scientist, much less an undergraduate student.

"While new species for me are common, this is quite a lot for one paper and a huge amount for a student to describe," he said. "Rachel has done a great job. An undergraduate describing 18 species is extraordinary -- it's rare even for experienced scientists. I've described a lot of new species but never as many as 18 at once. This work highlights just how little we know about how many species there are in South America."

Smith said after graduation from KU in December, her aim is to develop a career in fieldwork and research, to uncover hidden biodiversity in hopes that it can empower conservation efforts in threatened areas.

"I've always had my sights set on a larger picture, and conservation really is my ultimate goal," she said. "You have to start from the bottom up, with taxonomy. You can't really know the efficacy of any kind of conservation effort without knowing what you're protecting or any idea of how many species are there. As I described in this paper, over half of these species are microendemic, meaning that they only occur in one specific locality. So, it begs the question -- is there something unique in that area that these beetles are specializing on, and what kind of kind of niches or roles do they play in that ecosystem? Hopefully it leads to a larger conversation about taking action to get certain areas protected."

Smith said destruction of such habitats could lead to an incalculable loss of biodiversity, but taxonomists could inform debates that pit species conservation against economic gains that come from extraction of natural resources.

"There's deforestation and logging and a lot of gold mining in this particular area where I was at in Suriname," she said. "But I think the take-home message from this paper really is that biodiversity is found in even in the smallest puddles in South America."

Credit: 
University of Kansas

Urine test reveals quality of your diet -- and whether it's the best fit for your body

Scientists have completed large-scale tests on a new type of five-minute urine test that measures the health of a person's diet, and produces an individual's unique urine 'fingerprint'.

Scientists at Imperial College London in collaboration with colleagues at Northwestern University, University of Illinois, and Murdoch University, analysed levels of 46 different so-called metabolites in the urine of 1,848 people in the U.S.

Metabolites, which are produced as different foods are digested by the body, are considered an objective indicator of diet quality, say the research team, who published their findings in the journal Nature Food.

The work was funded by the U.S. National Institutes of Health and Health Data Research UK.

Dr Joram Posma, author of the research from Imperial's Department of Metabolism, Digestion and Reproduction said: "Diet is a key contributor to human health and disease, though it is notoriously difficult to measure accurately because it relies on an individual's ability to recall what and how much they ate. For instance, asking people to track their diets through apps or diaries can often lead to inaccurate reports about what they really eat. This research reveals this technology can help provide in-depth information on the quality of a person's diet, and whether it is the right type of diet for their individual biological make-up."

The findings revealed an association between 46 metabolites in urine, and types of foods or nutrients in the diet. For instance, certain metabolites correlated with alcohol intake, while others were linked to intake of citrus fruit, fructose (fruit sugar), glucose and vitamin C. The team also found metabolites in urine associated with dietary intake of red meats, other meats such as chicken, and nutrients such as calcium. Certain metabolites were also linked with health conditions - for instance compounds found in urine such as formate and sodium (an indicator of salt intake) are linked with obesity and high blood pressure.

Professor Paul Elliott, study co-author and Chair in Epidemiology and Public Health Medicine at Imperial said: "Through careful measurement of people's diets and collection of their urine excreted over two 24-hour periods we were able to establish links between dietary inputs and urinary output of metabolites that may help improve understanding of how our diets affect health. Healthful diets have a different pattern of metabolites in the urine than those associated with worse health outcomes."

In a second study also published in Nature Food by the same Imperial team, in collaboration with Newcastle University, Aberystwyth University, and Murdoch University and funded by the National Institute for Health Research, the Medical Research Council and Health Data Research UK, the team used this technology to develop a five-minute test to reveal that the mix of metabolites in urine varies from person to person.

The team says the technology, which produces an individual's urine 'fingerprint', could enable people to receive healthy eating advice tailored to their individual biological make-up. This is known as "precision nutrition", and could provide health professionals with more specific information on the quality of a person's diet.

Dr Isabel Garcia-Perez, author of the research also from Imperial's Department of Metabolism, Digestion and Reproduction explained: "Our technology can provide crucial insights into how foods are processed by individuals in different ways - and can help health professionals such as dieticians provide dietary advice tailored to individual patients."

Dr Garcia-Perez added that the team now plan to use the diet analysis technology on people at risk of cardiovascular disease.

The researchers say this urine 'fingerprint' can be used to develop an individual's personal score - called the Dietary Metabotype Score, or DMS.

In their experiments, the team asked 19 people to follow four different diets - ranging from very healthy (following 100 per cent of World Health Organisation recommendations for a balanced diet), to unhealthy (following 25 per cent WHO diet recommendations).

The team found that people who strictly followed the same diet had varied DMS scores.

The team's work also revealed that the higher a person's DMS score, the healthier their diet. A higher DMS score was also found to be associated with lower blood sugar, and a higher amount of energy excreted from the body in urine.

The team found the difference between high-energy urine (i.e. a high DMS score) and low-energy urine (a low DMS score) was equivalent to someone with a high DMS score excreting an extra 4 calories a day, or roughly 1,500 calories a year. The team calculate this could translate to a difference of 215g of body fat per year.
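The arithmetic behind that estimate is easy to check. In the sketch below, the energy density of body-fat tissue (roughly 7 kcal per gram) is our own assumption for illustration, not a figure stated in the study:

```python
extra_kcal_per_day = 4
kcal_per_year = extra_kcal_per_day * 365      # 1,460, i.e. roughly "1,500 calories a year"
KCAL_PER_G_BODY_FAT = 7.0                     # assumed energy density of body-fat tissue
fat_grams_per_year = kcal_per_year / KCAL_PER_G_BODY_FAT
print(round(fat_grams_per_year))              # → 209 g, close to the reported 215 g
```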

The next step is to investigate how a person's urine metabolite fingerprint may link to a person's risk of conditions such as obesity, diabetes and high blood pressure. Professor Gary Frost, co-author of the research and Chair in Nutrition and Dietetics at Imperial said: "These findings bring a new and more in-depth understanding to how our bodies process and use food at the molecular level. The research brings into question whether we should re-write food tables to incorporate these new metabolites that have biological effects in the body."

Professor John Mathers, co-author of research and Director of the Human Nutrition Research Centre at Newcastle University said: "We show here how different people metabolise the same foods in highly individual ways. This has implications for understanding the development of nutrition-related diseases and for more personalised dietary advice to improve public health."

Credit: 
Imperial College London

Product recommendation systems can help in the search for antiviral drugs

image: Active compounds search in broad chemical space.

Image: 
Skoltech

Scientists from Skoltech and the Chumakov Federal Scientific Center for Research and Development of Immune-and-Biological Products of RAS tested whether the artificial intelligence that suggests products to buy can also recommend new antiviral compounds. The researchers found that advanced algorithms can effectively suggest not only music and movies to buy, but also compounds with antiviral activity.

Every internet user knows contextual advertising that suggests products to buy alongside those already purchased. Online retailers use recommender systems that analyze a user's preferences and purchase history to suggest a new product, a movie, or music. Can these algorithms "recommend" a new antiviral drug, or "recommend" a well-known approved drug for a new disease?

A multidisciplinary team from the Skoltech Center for Computational and Data-Intensive Science and Engineering (CDISE) (Ekaterina Sosnina, Sergey Sosnin, Ivan Nazarov, and Maxim Fedorov) and the Chumakov Federal Scientific Center for Research and Development of Immune-and-Biological Products of RAS (Anastasia Nikitina and Dmitry Osolodkin) has tested this idea. The researchers carried out computational experiments and compared the performance of different recommender algorithms for the selection of small molecules active against viruses. They showed that recommender systems could effectively pinpoint antiviral compounds and find promising drug candidates based on latent relationships in chemical and biological data. The key to success was Big Data: the team used the extensive ViralCHEMBL database, containing antiviral activity data for about 250,000 molecules against 158 viral species.
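The analogy can be made concrete with a minimal item-based collaborative-filtering sketch, treating compounds as "users" and viruses as "items". This is our own simplification for illustration, with an invented toy activity matrix, not the algorithms actually benchmarked in the paper:

```python
from math import sqrt

# Toy compound-virus activity matrix: 1 = known active, 0 = inactive/untested.
activity = {
    "cmpdA": {"virus1": 1, "virus2": 1, "virus3": 0},
    "cmpdB": {"virus1": 1, "virus2": 1, "virus3": 1},
    "cmpdC": {"virus1": 0, "virus2": 1, "virus3": 1},
}

def cosine(u, v):
    """Cosine similarity between two activity profiles."""
    keys = u.keys() & v.keys()
    dot = sum(u[k] * v[k] for k in keys)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target, matrix, virus):
    """Score a compound-virus pair from similar compounds' activities,
    exactly as a shop scores an unbought product from similar shoppers."""
    sims = {c: cosine(matrix[target], profile)
            for c, profile in matrix.items() if c != target}
    num = sum(s * matrix[c][virus] for c, s in sims.items())
    den = sum(abs(s) for s in sims.values())
    return num / den if den else 0.0

# cmpdA has no recorded activity against virus3, but both similar
# compounds are active there, so it scores highly.
print(round(recommend("cmpdA", activity, "virus3"), 2))  # → 1.0
```

Real systems operate on a far larger and sparser matrix (here, ~250,000 compounds by 158 viruses) and use more sophisticated models, but the principle of exploiting latent similarity structure is the same.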

"The success of this project is based on both significant progress in the mathematical algorithms and deep expertise in the subject area, such as medicinal chemistry, biology, and machine learning. We have launched this project long before the coronavirus outbreak and hope that our findings will help researchers to find new molecules with anti-SARS-CoV-2 activity" says Ekaterina Sosnina, a Ph.D. student at Skoltech and the first author of the paper.

The scientists believe that their study will help chemists to find new antiviral drug candidates and provide a way for the repurposing of the existing drugs to combat SARS-CoV-2 and other potential viral outbreaks.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Pioneering research reveals certain human genes relate to gut bacteria

image: Image of a bacterial culture taken from fecal sample.

Image: 
Chloe Russell, as featured in her book Up Your A-Z An Encyclopedia On Gut Bacteria

The role genetics and gut bacteria play in human health has long been a fruitful source of scientific enquiry, but new research marks a significant step forward in unraveling this complex relationship. Its findings could transform our understanding and treatment of all manner of common diseases, including obesity, irritable bowel syndrome, and Alzheimer's disease.

The international study, led by the University of Bristol and published today in Nature Microbiology, found specific changes in DNA - the chains of molecules comprising our genetic make-up - affected both the existence and amount of particular bacteria in the gut.

Lead author Dr David Hughes, Senior Research Associate in Applied Genetic Epidemiology, said: "Our findings represent a significant breakthrough in understanding how genetic variation affects gut bacteria. Moreover, it marks major progress in our ability to know whether changes in our gut bacteria actually cause, or are a consequence of, human disease."

The human body comprises various unique ecosystems, each of which is populated by a vast and diverse array of microorganisms. They include millions of bacteria in the gut, known as the microbiome, that help digest food and produce molecules essential for life, which we cannot produce ourselves. This has prompted researchers to question if gut bacteria may also directly influence human health and disease.

Previous research has identified numerous genetic changes apparently related to bacterial composition in the gut, but only one such association has been observed consistently. This example involves a well-known single mutation that determines whether someone can digest the sugar (lactose) in fresh milk. The same genetic variation also predicts the prevalence of Bifidobacterium, a bacterial genus that digests lactose as an energy source.

This study, the biggest of its kind, identified 13 DNA changes related to changes in the presence or quantity of gut bacteria. Researchers at Bristol worked with Katholieke Universiteit Leuven and Christian-Albrecht University of Kiel to analyse data from 3,890 individuals from three different population studies: one in Belgium (the Flemish Gut Flora Project) and two in Germany (Food Chain Plus and PopGen). In each individual, the researchers measured millions of known DNA changes and, by sampling their feces, also registered the presence and abundance of hundreds of gut bacteria.
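At its core, such an analysis asks, for each DNA variant, whether genotype tracks bacterial abundance across individuals. A toy sketch of one such test, echoing the lactase/Bifidobacterium example; all numbers are invented, and real analyses involve millions of variants, hundreds of taxa and far more careful statistics:

```python
def pearson(xs, ys):
    """Pearson correlation between genotype dosage and abundance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical cohort: copies of the lactase-persistence allele (0/1/2)
# vs. relative Bifidobacterium abundance in the same individual's sample.
dosage    = [0, 0, 1, 1, 2, 2, 2, 0]
abundance = [0.9, 0.8, 0.5, 0.6, 0.2, 0.3, 0.1, 0.7]
print(round(pearson(dosage, abundance), 2))  # → -0.96
```

In this invented cohort, carriers of more persistence-allele copies have less Bifidobacterium, which is the biologically expected direction: people who digest lactose themselves leave less of it for lactose-consuming gut bacteria.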

Dr Hughes said: "It was exciting to identify new and robust signals across the three study populations, which makes the correlation of genetic variation and gut bacteria much more striking and compelling. Now comes the great challenge of confirming our observations with other studies and dissecting how exactly these DNA changes might impact bacterial composition."

Such investigations could hold the key to unlocking the intricate biological mechanisms behind some of the biggest health challenges of our time.

Study co-author Dr Kaitlin Wade, Lecturer in Epidemiology at the University of Bristol, said: "A strength here is that these findings provide a groundwork for causal analyses to determine, for instance, whether the presence of specific bacteria increases the risk of a disease or is a manifestation of it."

"The implications for our understanding of human health and our approach to medicine are far-reaching and potentially game changing."

Credit: 
University of Bristol