
Discovered: A new property of light

Researchers have discovered that light can possess a new property: self-torque. This discovery could open up exciting possibilities in light-related applications, the researchers explain in a related video, including improving smartphones and hard drives.

The utility of light is tightly connected to our ability to control it. In addition to well-known properties such as intensity and wavelength, light can be twisted, possessing what's known as angular momentum, something researchers have known for several decades. Beams carrying highly structured angular momentum, known as orbital angular momentum (OAM), are called vortex beams. Their intensity profile, which has a donut-like shape, has applications in optical communications, microscopy, quantum optics and microparticle manipulation. Recently, there has been renewed interest in exploiting angular momentum to give structured light beams novel properties.

Speculating that beams carrying OAM could behave in a time-dependent manner, researchers led by Laura Rego discovered that light can possess a new property, self-torque. Light beams with self-torque carry an angular momentum that changes continuously in time. These beams can be generated naturally through the process of high-harmonic generation. They look like a croissant, containing over an octave of orbital angular momentum values along the light pulse. In several experiments, the authors studied the unique properties of beams with self-torque. "This is the first time that anyone has predicted or even observed this new property of light," said Rego in a related video. "For example, we think we can modulate the orbital angular momentum of light in the same way frequency is modulated in communications."
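As a rough illustration (the notation below is ours, not taken from the study), the time-varying angular momentum described above can be pictured as an OAM value that sweeps from one value to another over the duration of the pulse, with the self-torque being its rate of change:

\[
\ell(t) \approx \ell_1 + \frac{\ell_2 - \ell_1}{\tau}\, t,
\qquad
\xi \equiv \frac{d\ell}{dt} \approx \frac{\ell_2 - \ell_1}{\tau},
\]

where \(\ell_1\) and \(\ell_2\) stand for hypothetical initial and final OAM values and \(\tau\) for the pulse duration. A pulse spanning "over an octave" of OAM values, as described above, would correspond to \(\ell_2\) being at least twice \(\ell_1\).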

Credit: 
American Association for the Advancement of Science (AAAS)

A Trojan horse? Immune cells ferry deadly fungus from mouse lung into the blood

MADISON - A report today (June 27) in PLOS Pathogens shows how inhaled fungal spores exit the lung and trigger a fatal infection in mice.

The study solves a mystery of mycology: Why are spores of a certain fungal strain deadly while the yeast form of that same fungus is harmless?

Study leader Christina Hull, professor of biomolecular chemistry and medical microbiology and immunology at the University of Wisconsin-Madison, focuses on Cryptococcus, the most deadly inhaled fungus. The short answer, she says, is that lung macrophages abandon their posts as bodyguards and begin smuggling spores into the bloodstream.

Normally, macrophages chew up pathogens and spit out their inert fragments. But in Cryptococcus infections, the immune cells serve as Trojan horses, concealing a deadly cargo.

The study revealed that macrophages in mouse lungs were packed with live Cryptococcus spores. "These immune cells are acting as a vehicle to invade the rest of the body," says first author Naomi Walsh, "with the spores hidden inside, protected from other types of immune attack."
Cryptococcus mainly affects people with immune impairments due to cancer treatment, for example, or AIDS. The fungus causes severe brain infections, resulting in several hundred thousand deaths a year worldwide.

This particular fungus can take the form of spores, which are produced during sexual reproduction. Spores are durable particles that can survive for years before sprouting into yeast to complete the life cycle.

Fungal spores are also ubiquitous in the environment, so humans and other mammals commonly inhale them, highlighting the need for protection by macrophages and other immune cells in the lungs.

Fungi have great genetic diversity. In some strains, both yeast and fungal spores cause disease. In others, both are harmless. But why, Hull wondered, would an inhaled yeast of a particular strain cause no disease when the same strain was deadly as an inhaled spore?

In the PLOS Pathogens report, Hull's group probed this inconsistency.
The new study grows from more than a decade's worth of research by Hull's lab, searching for cracks in the armor of Cryptococcus. The launchpad was the 2009 invention by former graduate student Michael Botts of a practical method for concentrating large numbers of spores using a centrifuge. That work enabled mouse studies that needed significant quantities of pure spores.

In the mice, the spores produced by "non-virulent" yeast strains caused 100 percent mortality, with infections of the brain, kidney, lung, liver, and spleen, Walsh says. "Mice exposed to the same dose of yeast had fungus in the lung only ... and they looked healthy the whole time."

The Trojan horse did more than smuggle spores from the lung: the macrophages provided shelter for the spores while they germinated into yeast, showing how the spores started a fatal infection.

This failure of immune defenses matters, Hull says. Fungal infections are hard to treat because fungi are much closer to mammals on the evolutionary tree than bacteria are. Therefore, compounds that harm fungi tend to harm humans as well.

Rule breaking is the rule in the new results, Hull says. "First, we found that spores caused fatal disease, even though they were genetically identical to yeast that could not cause disease at all. That's unusual. Then we discovered that the disease only occurred if spores were engulfed by macrophages that would normally kill them. And finally, once inside the 'protection' of the macrophages, the spores germinated, hitched a ride out of the lung, and made the mice sick. Immune cells that were supposed to protect the mouse from the fungal pathogen actually helped the fungus cause disease."

The Trojan horse strategy has previously been seen in anthrax, a bacterial disease, where immune cells move spores in a similar fashion, but this is the first time it has been identified for an inhaled fungal pathogen.

By shedding light on this conversion of a host's defense into a fungal offense, Hull's findings could lead to ways to shield people who are susceptible to the fungus due to immune deficiency. "By understanding how spores move from the lungs to other tissues," Walsh says, "we can develop new strategies for preventing spore-mediated fungal diseases and learn how to treat patients more effectively."

Another crack in the armor may be the germination process that spores use for conversion to yeast, Walsh adds. "Humans do not have a process of germination, and so that process becomes an attractive target for prophylactic drugs that should not be toxic to mammals."

Already, Hull reports, automated screens for existing drugs or other compounds that can deter spore germination are producing promising results.

Credit: 
University of Wisconsin-Madison

Rich defects boosting the oxygen evolution reaction

image: Illustration of the solvothermal reaction of pristine CoFe LDHs by using ethylene glycol.

Image: 
©Science China Press

The oxygen evolution reaction (OER) is the bottleneck of water splitting, a process that holds promise for energy storage and conversion, because of its sluggish reaction kinetics and the large over-potential required during anodic polarization. It is therefore crucial to develop highly efficient OER catalysts that can effectively lower the over-potential and accelerate the reaction kinetics.
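For context (this is standard electrochemistry rather than a result of the study), the anodic half-reaction in alkaline water splitting and the over-potential \(\eta\) that a catalyst must minimise can be written as

\[
4\,\mathrm{OH}^- \longrightarrow \mathrm{O}_2 + 2\,\mathrm{H_2O} + 4e^-,
\qquad
\eta = E_{\mathrm{applied}} - E_{\mathrm{eq}}, \quad E_{\mathrm{eq}} \approx 1.23\ \mathrm{V\ vs.\ RHE},
\]

so a lower \(\eta\) at a given current density indicates a more active OER catalyst.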

At present, CoFe double-metal oxides and hydroxides have been shown by many studies to be efficient OER catalysts. However, the performance of the corresponding bulk catalysts is still unsatisfactory in practical applications. It is therefore of considerable significance to improve both the apparent activity and the intrinsic activity of CoFe-based catalysts simultaneously through nanostructure engineering and electronic-structure regulation.

Recently, Professor Shuangyin Wang's group at Hunan University, following a defect-engineering strategy, used a mild reducing agent, ethylene glycol, as the solvent in a solvothermal reduction of bulk CoFe LDHs to construct defects. This treatment created both anion and cation defects (O, Co and Fe), and the bulk CoFe LDHs were exfoliated in situ, forming a three-dimensional hierarchical structure owing to the intercalation of the large ethylene glycol molecules during the solvothermal process.

After further characterization of the morphology and electronic structure, the authors found that the defect-rich structure significantly increased the intrinsic activity of the material, while the resulting three-dimensional hierarchical structure promoted mass transfer during catalysis, ultimately yielding effective OER performance.

Moreover, compared with conventional methods for exfoliating two-dimensional materials or constructing defects, this approach overcomes the bottleneck of scaling up exfoliation: the two-dimensional catalyst is exfoliated and a three-dimensional structure develops in situ in a simple one-step solvothermal process. This provides a new direction for the large-scale preparation and application of OER catalysts.

Credit: 
Science China Press

Lightning bolt underwater

image: Katharina Grosse, Achim von Keudell and Julian Held (from the left) in the laboratory.

Image: 
RUB, Kramer

Electrochemical cells help recycle CO2. However, the catalytic surfaces get worn down in the process. Researchers at the Collaborative Research Centre 1316 "Transient atmospheric plasmas: from plasmas to liquids to solids" at Ruhr-Universität Bochum (RUB) are exploring how they might be regenerated at the push of a button using extreme plasmas in water. For the first time, they deployed optical spectroscopy and modelling to analyse in detail such underwater plasmas, which exist for only a few nanoseconds, and to describe theoretically the conditions during plasma ignition. They published their report in the journal Plasma Sources Science and Technology on 4 June 2019.

Plasmas are ionised gases: they form when a gas is energised so that it contains free electrons. In nature, plasmas occur inside stars or take the shape of polar lights on Earth. In engineering, plasmas are utilised, for example, to generate light in fluorescent lamps or to manufacture new materials in the field of microelectronics. "Typically, plasmas are generated in the gas phase, for example in the air or in noble gases," explains Katharina Grosse from the Institute for Experimental Physics II at RUB.

Ruptures in the water

In the current study, the researchers generated plasmas directly in a liquid. To this end, they applied a high voltage to a submerged hairline electrode for a few billionths of a second. Following plasma ignition, there is a large negative pressure difference at the tip of the electrode, which results in ruptures forming in the liquid. The plasma then spreads through those ruptures. "Plasma can be compared with a lightning bolt - only in this case it happens underwater," says Katharina Grosse.

Hotter than the sun

Using fast optical spectroscopy in combination with a fluid-dynamics model, the research team identified the variations of power, pressure and temperature in these plasmas. "In the process, we observed that the power consumption inside these plasmas briefly amounts to up to 100 kilowatts. This corresponds to the connected load of several single-family homes," points out Professor Achim von Keudell from the Institute for Experimental Physics II. In addition, pressures exceeding several thousand bar are generated - matching or even exceeding the pressure at the deepest point of the Pacific Ocean. Finally, there are short bursts of temperatures of several thousand degrees, roughly equalling and even surpassing the surface temperature of the sun.
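To put those figures in perspective (a back-of-the-envelope estimate of ours, assuming a round pulse duration of 10 nanoseconds consistent with the "few nanoseconds" quoted above), the total energy delivered per pulse is nevertheless tiny:

\[
E = P \cdot t \approx 100\ \mathrm{kW} \times 10\ \mathrm{ns} = 10^{5}\ \mathrm{W} \times 10^{-8}\ \mathrm{s} = 10^{-3}\ \mathrm{J} = 1\ \mathrm{mJ},
\]

which is why such extreme peak power, pressure and temperature can occur without noticeably heating the bulk of the surrounding water.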

Water is broken down into its components

Such extreme conditions last only for a very short time. "Studies to date have primarily focused on underwater plasmas in the microsecond range," explains Katharina Grosse. "In that space of time, water molecules have the chance to compensate for the pressure of the plasma." The extreme plasmas that have been the subject of the current study feature much faster processes. The water can't compensate for the pressure and the molecules are broken down into their components. "The oxygen that is thus released plays a vital role for catalytic surfaces in electrochemical cells," explains Katharina Grosse. "By re-oxidising such surfaces, it helps them regenerate and regain their full catalytic activity. Moreover, reagents dissolved in water can also be activated, thus facilitating catalysis processes."

Credit: 
Ruhr-University Bochum

Researchers discover more than 50 lakes beneath the Greenland Ice Sheet

image: This is surface meltwater in Greenland.

Image: 
Winnie Chu, Stanford University

Researchers have discovered 56 previously uncharted subglacial lakes beneath the Greenland Ice Sheet, bringing the total known number of lakes to 60.

Although these lakes are typically smaller than similar lakes in Antarctica, their discovery demonstrates that lakes beneath the Greenland Ice Sheet are much more common than previously thought.

The Greenland Ice Sheet covers an area approximately seven times the size of the UK, is in places more than three kilometres thick and currently plays an important role in rising global sea levels.
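As a quick sanity check on that comparison (our own arithmetic, taking the UK's area as roughly \(2.4 \times 10^{5}\ \mathrm{km^2}\)):

\[
7 \times 2.4 \times 10^{5}\ \mathrm{km^2} \approx 1.7 \times 10^{6}\ \mathrm{km^2},
\]

which matches the commonly quoted extent of the Greenland Ice Sheet of about 1.7 million square kilometres.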

Subglacial lakes are bodies of water that form beneath ice masses. Meltwater is derived from the pressure of the thick overlying ice, heat generated by the flow of the ice, geothermal heat retained in the Earth, or water on the surface of the ice that drains to the bed. This water can become trapped in depressions or held in place by variations in ice thickness.

Knowledge of these new lakes helps form a much fuller picture of where water occurs and how it drains under the ice sheet, which influences how the ice sheet will likely respond dynamically to rising temperatures.

Published in Nature Communications this week, their paper, "Distribution and dynamics of Greenland subglacial lakes", provides the first ice-sheet wide inventory of subglacial lakes beneath the Greenland Ice Sheet.

By analysing more than 500,000 km of airborne radio echo sounding data, which provide images of the bed of the Greenland Ice Sheet, researchers from the Universities of Lancaster, Sheffield and Stanford identified 54 subglacial lakes, as well as a further two using ice-surface elevation changes.

Lead author Jade Bowling of the Lancaster Environment Centre, Lancaster University, said:

"Researchers have a good understanding of Antarctic subglacial lakes, which can fill and drain and cause overlying ice to flow quicker. However, until now little was known about subglacial lake distribution and behaviour beneath the Greenland Ice Sheet.

"This study has for the first time allowed us to start to build up a picture of where lakes form under the Greenland Ice Sheet. This is important for determining their influence on the wider subglacial hydrological system and ice-flow dynamics, and improving our understanding of the ice sheet's basal thermal state."

The newly discovered lakes range from 0.2 to 5.9 km in length. Most were found beneath relatively slow-moving ice, away from the largely frozen bed of the ice sheet interior, and appear to be relatively stable.

However, in the future as the climate warms, surface meltwater will form lakes and streams at higher elevations on the ice sheet surface, and the drainage of this water to the bed could cause these subglacial lakes to drain and therefore become active. Closer to the margin where water already regularly gets to the bed, the researchers saw some evidence for lake activity, with two new subglacial lakes observed to drain and then refill.

Dr Stephen J. Livingstone, Senior Lecturer in Physical Geography, University of Sheffield, said:

"The lakes we have identified tend to cluster in eastern Greenland where the bed is rough and can therefore readily trap and store meltwater and in northern Greenland, where we suggest the lakes indicate a patchwork of frozen and thawed bed conditions.

"These lakes could provide important targets for direct exploration to look for evidence of extreme life and to sample the sediments deposited in the lake that preserve a record of environmental change."

Credit: 
Lancaster University

Corals can survive in acidified ocean conditions, but have lower density skeletons

image: Researchers transplanted coral fragments to sites with low-pH conditions similar to those expected with future ocean acidification, then monitored their survival and growth.

Image: 
Donald Potts

Coral reefs face many challenges to their survival, including the global acidification of seawater as a result of rising carbon dioxide levels in the atmosphere. A new study led by scientists at UC Santa Cruz shows that at least three Caribbean coral species can survive and grow under conditions of ocean acidification more severe than those expected to occur during this century, although the density of their skeletons was lower than normal.

The study took advantage of the unusual seawater chemistry found naturally at sites along the Caribbean coastline of Mexico's Yucatan Peninsula, where water discharging from submarine springs has lower pH than the surrounding seawater, with reduced availability of the carbonate ions corals need to build their calcium carbonate skeletons.

In a two-year field experiment, the international team of researchers transplanted genetically identical fragments of three species of corals to a site affected by the springs and to a nearby control site not influenced by the springs, and then monitored the survival, growth rates, and other physiological traits of the transplants. They reported their findings in a paper published June 26 in Proceedings of the Royal Society B.

"The good news is the corals can survive and deposit calcium carbonate, but the density of their skeletons is reduced, which means the framework of the reef would be less robust and might be more susceptible to storm damage and bioerosion," said Adina Paytan, a research professor at UCSC's Institute of Marine Sciences and corresponding author of the paper.

Of the three species tested, the one that performed best in the low-pH conditions was Siderastrea siderea, commonly known as massive starlet coral, a slow-growing species that forms large dome-shaped structures. Another slow-growing dome-shaped species, Porites astreoides (mustard hill coral), did almost as well, although its survival rate was 20 percent lower. Both of these species outperformed the fast-growing branching coral Porites porites (finger coral).

Coauthor Donald Potts, professor of ecology and evolutionary biology at UC Santa Cruz, said the transplanted species are all widespread throughout the Caribbean. "The slow-growing, dome-shaped corals tend to be more tolerant of extreme conditions, and they are important in building up the permanent structure of the reef," he said. "We found that they have the potential for persistence in acidified conditions."

Corals will have to cope with more than ocean acidification, however. The increasing carbon dioxide level in the atmosphere is also driving climate change, resulting in warmer ocean temperatures and rising sea levels. Unusually warm temperatures can disrupt the symbiosis between coral polyps and the algae that live in them, leading to coral bleaching. And rapidly rising sea levels could leave slow-growing corals at depths where they would die from insufficient sunlight.

Nevertheless, Potts noted that several species of Caribbean corals have long fossil records showing that they have persisted through major changes in Earth's history. "These are species with a history of survival and tolerance," he said.

He added that both S. siderea and P. astreoides had higher chlorophyll concentrations at the low-pH site, indicating that their algal symbionts were responding positively and potentially increasing the energy resources available to the corals for resisting stress.

Both of the slow-growing species that did well under acidified conditions have internal fertilization and brood their larvae, so that their offspring have the potential to settle immediately in the same area, Potts said. "This means there is potential for local genetic adaptation over successive generations to changing environmental conditions," he said.

The authors also noted that the differences among coral species in survival and calcification under acidified conditions could be useful information for reef restoration efforts and perhaps even for efforts to genetically modify corals to give them greater stress tolerance.

Paytan said she remains "cautiously optimistic," despite the many threats facing coral reefs worldwide.

"These corals are more robust than we thought," she said. "They have the potential to persist with ocean acidification, but it costs them energy to cope with it, so we have to do all we can to reduce other stressors, such as nutrient pollution and sedimentation."

Paytan and Potts said the collaboration with Mexican researchers was essential to the success of the project, enabling frequent monitoring of the transplanted corals throughout the two-year experiment.

Credit: 
University of California - Santa Cruz

New GSA bulletin study of the 2014 Oso landslide

image: The March 22, 2014 SR530 landslide near Oso, Washington, caused 43 fatalities, destroyed a neighborhood, blocked a state highway, and temporarily dammed the North Fork Stillaguamish River. This photo was taken the day after the catastrophic slide, before the river cut through the landslide deposit. Here, several geomorphological components of the landslide are visible, with a hummock field in the foreground transitioning upslope to larger slices of deposit separated by multiple scarps, which then transition to a fallen-tree covered, back-rotated block downdropped from the headscarp in the far field. Nearly the entire landslide deposit exhibits indications of extension. Collins and Reid attribute extensional hummock formation to widespread basal liquefaction of underlying alluvial sediments in the river valley. Photo by Stephen Slaughter (Washington Geological Survey, Washington Department of Natural Resources).

Image: 
Stephen Slaughter (Washington Geological Survey, Washington Department of Natural Resources)

Boulder, Colo., USA: As a compelling example of a large-mobility landslide, the 22 March 2014 landslide near Oso, Washington, USA, was particularly devastating, traveling across a 1-km-plus-wide river valley, killing 43 people, destroying dozens of homes, and temporarily closing a well-traveled highway.

To resolve causes for the landslide's behavior and mobility, Brian Collins and Mark Reid of the U.S. Geological Survey conducted detailed post-event field investigations and material testing of soils involved in the failure.

How far a landslide moves from the site where it began can, of course, vastly amplify the consequences of slope failure. Some landslides stop moving close to where they began, and others are very mobile and can travel long distances, affecting not only what is located at the base of the slope, but also farther away.

Collins and Reid mapped the geology and structure of the Oso landslide deposit by making multiple visits to the site over the course of three years. Some of the data they collected were highly ephemeral, being obscured by erosion and vegetation within one year of the landslide and highlighting the need to record many observations within a few months of the disaster.

Using "boots-on-the-ground" geologic mapping techniques, combined with high-resolution orthoimagery and airborne LiDAR data, they reconstructed the likely sequence of events that led to the landslide's large mobility. Their mapping and analyses show that the approximately nine-million-cubic-meter landslide underwent rapid extension or stretching in a closely timed sequence of events that led to the landslide overrunning the, at-the-time, saturated flood plain forming the valley floor.

The large and rapid failure of the landslide loaded the flood plain, composed of alluvial sands and gravels, generating excess pore pressure in the saturated sediments and causing them to liquefy. Liquefaction greatly reduced the strength along the base of the landslide and enabled it to travel over 1 km across the valley flats.

Collins and Reid found extensive evidence of high soil-water pore pressure during their field work by identifying and mapping hundreds of "sand boils" -- typically decimeter-sized cones of sand that indicated locations where liquefied alluvium tried to escape from a weakened base beneath the landslide. In their new GSA Bulletin article, Collins and Reid present their mapping and interpreted landslide sequence, as well as analyses that show how the basal liquefaction mechanism likely occurred at the site of the Oso landslide. They hypothesize that this mechanism might enhance the mobility of other landslides in similar settings.

Credit: 
Geological Society of America

Frontline heroes hailed in the war against devil cancers

Residents of Tasmania's D'Entrecasteaux Channel Peninsula, Kingborough and Huon Valley communities are being hailed as the frontline heroes in the war against two deadly transmissible cancers affecting Tasmanian devils - Devil Facial Tumour Disease (DFTD) and Devil Facial Tumour 2 (DFT2).

New research from the University of Tasmania, published in the leading journal Evolutionary Applications, suggests that DFT2, which was discovered only five years ago, is currently confined to the Channel region.

Since 2014, 40 DFT2 tumours - as well as 51 DFTD cases - have been confirmed on the Channel peninsula and its surroundings.

Dr Rodrigo Hamede from the University of Tasmania's School of Natural Sciences said the cooperation and commitment of the community had been vital in identifying the geographic footprint of DFT2.

"Dozens of residents have opened their doors to our students and allowed us to conduct surveys on their properties," Dr Hamede, who led the study, said.

"Likewise, many residents that have seen devil roadkills have called the Save the Tasmanian Devil Program and the University of Tasmania so that we can collect valuable information and samples.

"Without their support, we wouldn't have been able to undertake this study."

The new study, conducted in collaboration with institutions in Australia, USA, France and the UK, has now assessed the distribution, epidemiology and evolutionary interactions of both transmissible cancers.

DFT2 originated from a male devil, unlike DFTD, which originated from a female, and that might explain why DFT2 is affecting mostly males.

The current hypothesis suggests that females might be able to recognise Y-chromosome antigens and be less susceptible to DFT2 infection, but these patterns can change very quickly.

Both cancers are now competing for the same resource, and that resource is the devil, Dr Hamede said.

"This may explain why DFT2 tumours are found not just on the head as in DFTD, but also on the body, with non-facial tumours much more common in DFT2," he said.

An international collaboration involving staff and postgraduate students from the University of Tasmania's School of Natural Sciences, together with researchers at Griffith University in Queensland and at universities in Washington State and Idaho in the United States, has recently suggested that devils are adapting to the DFTD epidemic by becoming more tolerant of or resistant to infection.

"This is good news for the devil and may explain why people are reporting seeing more devils around in the landscape than they have in the last 20 years," said Associate Professor Menna Jones, a study co-author.

"Whether the resistance that devils are evolving to DFTD gives them immediate protection against this new cancer DFT2 is an important question to answer."

Dr Hamede said DFT2 could represent a 'game-changer' for the evolutionary dynamics and adaptations observed over the last 25 years with DFTD.

"There is so much more we need to learn now. We need to double our efforts in monitoring DFT2 and evaluating its interactions with DFTD," he said.

"While DFT2 seems to be currently confined to the Channel peninsula, it is very plausible it would spread north and affect populations outside the peninsula in the coming years."

Transmissible cancers are known only to occur in domestic dogs, marine bivalves and Tasmanian devils.

So the fact that devils have been struck twice is no coincidence, according to Dr Hamede.

"Devils might be particularly prone to transmissible cancers, and somehow this could be good news for devils," he said.

"If devils have had two transmissible cancers in just 25 years, then they may have been affected by these tumours several times during their evolutionary history.

"And if they are still here, fighting and learning how to live with cancer, then they may also know how to overcome these transmissible tumours."

That devils are still here in the landscape after 25 years of DFTD is no surprise, says Associate Professor Jones.

Credit: 
University of Tasmania

New study on gene editing in wildlife finds people are wary

image: Patrice Kohl led the team that looked at people's perceptions of using CRISPR technology to potentially manage wildlife for conservation.

Image: 
Karen Norum, UCF Office of Research

The applications of CRISPR-based genetic engineering tools range from changing colors in butterfly wings to developing gene therapies that might one day cure or prevent human diseases.

Some scientists are also setting their sights on new uses - saving endangered species and possibly eliminating invasive ones to manage wildlife populations for conservation.

However, a University of Central Florida researcher and her colleagues have found that people living in the U.S. are wary of using this technology to achieve wildlife conservation goals.

And this insight is important, as new technology and procedures launched without public input can face backlash later.

The findings were published this month in the journal Conservation Biology and made available ahead of publication in March 2019.

The research represents the first large-scale, systematic survey of U.S. public opinion toward using gene editing for conservation efforts.

"I think scientists learned a lot from what has occurred with genetic modification of crops," says Patrice Kohl, an assistant professor in UCF's Nicholson School of Communication and Media and lead author of the study.

"In the 1990s, biotechnology companies rolled out genetically modified crops without any public input, and there was fierce public pushback in response to that. So, I think there is a lot of interest among scientists in avoiding that happening again."

CRISPR, or clustered regularly interspaced short palindromic repeats, is a fast and inexpensive method of genetic engineering that can be used almost like "scissors" to edit DNA without introducing foreign genetic material.

It has also enabled scientists to develop gene drives, a technique that would allow a genetic edit to spread rapidly throughout a wildlife population.

Compared with other emerging fields of research, such as nanotechnology and using gene editing in humans, there has been little public opinion research on gene editing in wildlife, Kohl says.

And understanding public opinion on this emerging technology is not only important for scientists seeking public buy-in so their efforts aren't wasted or their technologies put on a shelf, but it also can inform government officials when deciding how to regulate them.

"It's everybody's planet, and there are huge implications for using this technology," Kohl says. "I think scientists are interested in making sure their technologies or practices are rolled out in ways that are socially acceptable."

Proposals to use gene editing as a wildlife conservation tool include applications that could help endangered species as well as applications that could reduce or eliminate invasive ones.

For example, conservation scientists have proposed using CRISPR to improve disease immunity in populations of the endangered black-footed ferret.

There has also been interest in employing CRISPR-based tools to reduce or eliminate small invasive mammals on islands, such as the Galapagos or New Zealand, where they have devastated native bird species.

"But what if you introduce a gene-edited rat to reduce their populations on an island and then that rat escapes the island and you drive that rat species extinct?" Kohl says. "That has consequences for everyone across the entire planet."

Although the public opinion study didn't specify any particular species that could be saved or eliminated, its findings do offer an overview of people's attitudes toward the risk and benefit of using gene editing for wildlife conservation in general, as well as the factors that could affect those attitudes.

The study analyzed data from a nationally representative survey of 1,600 U.S. adults from December 2016 to January 2017.

It found that, overall, respondents perceived the risks of gene editing in wildlife as significantly outweighing the benefits.

More than 80 percent of survey respondents thought it would be at least somewhat risky to nature and humans to use gene editing as a tool to manage wildlife, while 55 to 63 percent thought that it would be at least somewhat beneficial for nature and humans.

However, among individuals who strongly believe in the authority of scientific knowledge, gene editing in wildlife was perceived as more beneficial and less risky.

Respondents also viewed using gene editing tools to help species survive as more morally acceptable than using them to reduce or eliminate species.

And even though respondents were skeptical of the technology, Kohl says that doesn't necessarily mean they wouldn't support it.

"Just because you think something is risky doesn't necessarily mean you don't think something should be done," she says. "A lot of medical treatments are risky but sometimes you have to do something that's a little risky to take care of a problem. With cancer, for example, chemotherapy is risky, but that doesn't mean people think it should necessarily be off the table."

Credit: 
University of Central Florida

Use of evidence-based therapies for youth psychiatric treatment is slow to catch on

PHILADELPHIA--We all hope--and probably expect--that clinicians use only mental health therapies that are scientifically proven to improve symptoms. A new study from Penn Medicine and Philadelphia's Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) shows that, unfortunately, evidence-based therapies to treat youth with mental health problems are slow to catch on. Specifically, researchers found that over a five-year period in Philadelphia, use of evidence-based therapies--practices backed by scientific data showing that symptoms improve in response to treatment, such as cognitive behavioral therapy (CBT)--increased only modestly, despite the city and researchers' substantial efforts to showcase the value of these approaches and to provide training to community clinicians. The results were published this month in Implementation Science.

This finding is of critical importance because clinicians who use evidence-based practices (EBPs) as part of their routine care obtain much better outcomes for children with depression, anxiety, trauma, and disruptive behavior disorders compared with clinicians who do not.

"Evidenced-based therapies are effective for treating a wide range of psychiatric conditions, but there is still a gap in widespread use," said the study's lead author Rinad S. Beidas, PhD, an associate professor of Psychiatry and Medical Ethics and Health Policy in the Perelman School of Medicine at the University of Pennsylvania, and founding director of the Penn Implementation Science Center at the Leonard Davis Institute of Health Economics (PISCE@LDI). "While findings showed a modest increase in use, the data point to a clear need for finding better ways to support clinicians and organizations in using EBP therapies. This research-to-practice gap is a historically intractable problem, which exists not only in behavioral health but all across health care specialties."

Researchers identified two factors driving the observed increases of EBP implementation in publicly funded clinics that could inform future strategies to increase EBP use. First, the more city-sponsored EBP trainings clinicians attended, the more likely they were to apply evidence-based techniques in their practices. Second, use of EBP was more likely among clinicians who worked in a practice with a "proficient culture," meaning the organization expects clinicians to place the well-being of their clients first, to be competent, and have up-to-date knowledge.

Over the last decade, cities from Philadelphia to Los Angeles have placed an increased emphasis on implementing EBP into care, from building EBPs into contracts to initiating new policies that support their use in an effort to help improve outcomes for vulnerable youth. In 2007, Philadelphia's DBHIDS began large-scale efforts to increase EBP use. The department created the Evidence-based Practice and Innovation Center (EPIC) in 2013, a city-wide entity intended to provide a centralized infrastructure to support EBP administration. Despite a national focus on EBP use, very few EBP implementation efforts around the country have been systematically and rigorously evaluated, which ultimately limits the ability to understand the effects of said efforts.

The researchers surveyed clinicians from 20 different publicly funded Philadelphia clinics that treat youth, at three time points between 2013 and 2017. Sixty percent of the 340 clinicians contacted completed the survey. All of the clinics had the opportunity to receive system-level support provided by EPIC, but only half of the clinicians participated in city-funded EBP training initiatives. On average, use of CBT techniques increased by six percent from the first data collection to the last, compared to no change in psychodynamic techniques, a frequently used type of "talk therapy" that has less evidence of effectiveness in children. The researchers also found that each EBP training initiative predicted a three percent increase in CBT use, but no change in use of psychodynamic techniques. In organizations described as having a more "proficient" culture at the beginning of the survey, clinicians exhibited an eight percent increase in CBT use, compared with a two percent decrease in organizations with less proficient cultures.

"Philadelphia is a leader in making EBP available to its most vulnerable citizens with mental health and substance abuse problems. This study represents an opportunity to learn from an exemplar system encouraging EBP implementation," Beidas said. "To build upon Philadelphia's and other cities' deep commitment to increasing this implementation, we need further studies to test and evaluate strategies that increase use of EBP to guide our understanding of the best ways to use and how to implement them."

Credit: 
University of Pennsylvania School of Medicine

Methylmercury precipitates heart failure by increasing Drp1-mediated mitochondrial fission

image: A small amount of MeHg induces cardiac vulnerability to hemodynamic load in mice. Exposure of mice to low-dose MeHg has no apparent impact on behavior and cardiac function, but exacerbates pressure overload-induced heart failure through abnormal mitochondrial fission.

Image: 
National Institute for Physiological Sciences

Okazaki, Japan - Although the widespread environmental contaminant methylmercury is largely associated with neurotoxic effects, it is also associated with increased risk for cardiovascular disease. Nishimura et al. found that mice exposed to a dose of methylmercury that was too low to cause neurotoxicity were more vulnerable to heart failure in response to pressure overload. Methylmercury removed a polysulfide group from Drp1, thereby removing an inhibitory brake on this protein, which resulted in increased Drp1-mediated mitochondrial fission. Treating mice or human cardiomyocytes with a polysulfide group-releasing compound reversed fragility to mechanical overload induced by methylmercury. These results provide a molecular mechanism for the cardiotoxic effects of methylmercury and a possible strategy to avert these effects.

Chronic exposure to methylmercury (MeHg), an environmental electrophilic pollutant, reportedly increases the risk of human cardiac events. We report that exposure to a low, non-neurotoxic dose of MeHg precipitated heart failure induced by pressure overload in mice. Exposure to MeHg at 10 ppm did not induce weight loss typical of higher doses but caused mitochondrial hyperfission in myocardium through the activation of Drp1 by its guanine nucleotide exchange factor filamin-A. Treatment of neonatal rat cardiomyocytes (NRCMs) with cilnidipine, an inhibitor of the interaction between Drp1 and filamin-A, suppressed mitochondrial hyperfission caused by low-dose MeHg exposure. Modification of cysteine residues in proteins with polysulfides is important for redox signaling and mitochondrial homeostasis in mammalian cells. We found that MeHg targeted rat Drp1 at Cys624, a redox-sensitive residue whose SH side chain forms a bulky and nucleophilic polysulfide (Cys624-S(n)H). MeHg exposure induced the depolysulfidation of Cys624-S(n)H in Drp1, which led to filamin-dependent activation of Drp1 and mitochondrial hyperfission. Treatment with NaHS, which acts as a donor for reactive polysulfides, reversed MeHg-evoked Drp1 depolysulfidation and vulnerability to mechanical load in rodent and human cardiomyocytes and mouse hearts. These results suggest that depolysulfidation of Drp1 at Cys624-S(n)H by low-dose MeHg increases cardiac fragility to mechanical load through filamin-dependent mitochondrial hyperfission.

This work suggests a molecular mechanism for how low-dose methylmercury makes hearts more vulnerable to mechanical load.

Credit: 
National Institutes of Natural Sciences

Zero-calorie sweeteners on trial again

As a sugar substitute, zero-calorie sweeteners may reduce tooth decay and blood sugar spikes. Seven are approved worldwide and safe for humans - but does this mean they're healthy?

For the first time, scientists exposed pregnant and lactating mice to sucralose and acesulfame-K - a common combination in soda, sports supplements and other sweetened products - and found that their pups developed harmful changes in metabolism and gut bacteria.

Published in Frontiers in Microbiology, the study reinforces an emerging consensus: artificial sweeteners may be safe when used in moderation by adults, but they are not a "magic bullet" alternative to sugar.

The problem with sweeteners

"Non-nutritive sweeteners are generally believed to be safe when used in moderation," says Dr. John Hanover, a glycobiologist and senior author of the study at the U.S. National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), part of the National Institutes of Health. Hanover collaborated with Dr. Stephanie Olivier-Van Stichelen, formerly at NIDDK and now Assistant Professor at Medical College of Wisconsin, and Dr. Kristina Rother, Chief of the Section on Pediatric Diabetes and Metabolism at NIDDK.

"However, sweetness itself seems to some extent mimic the effects of sugar - triggering insulin secretion, inflammation and changes to the gut microbiome - which promote fat storage and type 2 diabetes," Hanover adds.

Since sweeteners are known to be passed on in small amounts via the placenta and breast milk, the researchers asked whether similar metabolic and microbiome changes occur in offspring following maternal sweetener intake. They fed mouse moms one of three sweetener solutions throughout pregnancy and lactation, and analyzed the effects on their pre-weaned pups. The solutions contained a mixture of sucralose and acesulfame-K at the 'acceptable daily intake' (ADI), double the ADI, or a control (water). The ADI is the maximum consumption deemed safe in humans based on toxicology studies.

"Sweeteners are often used in combination, partly because a blend can reduce the unpleasant bitter taste that some consumers experience," explains Rother. Olivier Van-Stichelen adds, "Combining sweeteners might also amplify the metabolic and microbiome effects - so we used the typical pairing of sucralose and ace-K to maximize the applicability of our results."

Maternal sweetener consumption affects pre-weaned offspring

Analysis of blood, feces and urine from a total of 226 pups confirmed that both sweeteners are transmitted prenatally - and as predicted, affect the metabolism and microbiome of the offspring.

While the pups' exposure was low, the researchers found significant metabolic changes in both the ADI and 2xADI groups versus the control group. Specifically, these changes indicated impaired liver functioning in clearing toxins from the blood, and a dramatic shift in bacterial metabolites in the gut. In both sweetener groups, for example, the researchers observed the loss of a major beneficial species of gut bacteria, Akkermansia muciniphila. Similar microbiome alterations in humans have been linked to type 2 diabetes and obesity.

Should pregnant or lactating mothers avoid sweeteners?

"Our results showed dose-dependent effects of sweetener exposure," the researchers report.

Of note, the degree of metabolic change was far greater in the 2xADI than the ADI group. What's more, further changes in sweetener-exposed pups - including lower weight and fasting blood glucose - only became prevalent in the 2xADI group. However, the microbiome changes were drastic even at the acceptable daily intake level.

Current recommendations for artificial sweetener use during pregnancy state that they may be used in moderation - except for saccharin, which should be avoided entirely. However, artificial sweeteners are now found in more products than ever - including mouthwash, toothpaste, and medicines, as well as food and drink - and since labels do not specify the amounts of added sweeteners, it is impossible to accurately track our intake.

"The results of the study highlight yet another potential health impact of zero-calorie sweeteners," says Olivier-Van Stichelen. "This is ongoing research that will be continued both in my recently started lab at the Medical College of Wisconsin as well as in Drs. Hanover's and Rother's labs at the National Institutes of Health".

Dr. Hanover concludes: "The perinatal period is a critical developmental stage for the microbiome and emerging detoxification systems in the rodent and human neonate alike, and our study defines potentially adverse consequences of early exposure to sweeteners. Therefore, based on our findings, zero-calorie sweeteners warrant further investigation in humans in this critical developmental window."

Credit: 
Frontiers

Cholesterol medication could invite diabetes, study suggests

COLUMBUS, Ohio - A study of thousands of patients' health records found that those who were prescribed cholesterol-lowering statins had at least double the risk of developing type 2 diabetes.

The detailed analysis of health records and other data from patients in a private insurance plan in the Midwest provides a real-world picture of how efforts to reduce heart disease may be contributing to another major medical concern, said Victoria Zigmont, who led the study as a graduate student in public health at The Ohio State University.

Statins are a class of drugs that can lower cholesterol and blood pressure, reducing the risk of heart attack and stroke. More than a quarter of middle-aged adults use a cholesterol-lowering drug, according to recent federal estimates.

Researchers found that statin users had more than double the risk of a diabetes diagnosis compared to those who didn't take the drugs. Those who took the cholesterol-lowering drugs for more than two years had more than three times the risk of diabetes.

"The fact that increased duration of statin use was associated with an increased risk of diabetes - something we call a dose-dependent relationship - makes us think that this is likely a causal relationship," Zigmont said.

"That said, statins are very effective in preventing heart attacks and strokes. I would never recommend that people stop taking the statin they've been prescribed based on this study, but it should open up further discussions about diabetes prevention and patient and provider awareness of the issue."

Researchers also found that statin users were 6.5 percent more likely to have a troublingly high HbA1c value - a routine blood test for diabetes that estimates average blood sugar over several months.

The study, published in the journal Diabetes Metabolism Research and Reviews, included 4,683 men and women who did not have diabetes, were candidates for statins based on heart disease risk and had not yet taken the drugs at the start of the study. About 16 percent of the group - 755 patients - were eventually prescribed statins during the study period, which ran from 2011 until 2014. Participants' average age was 46.

Randall Harris, a study co-author and professor of medicine and public health at Ohio State, said that the results suggest that individuals taking statins should be followed closely to detect changes in glucose metabolism and should receive special guidance on diet and exercise for prevention.

Although statins have clear benefits in appropriate patients, scientists and clinicians should further explore the impact of statins on human metabolism, in particular the interaction between lipid and carbohydrate metabolism, said co-author Steven Clinton, a professor of medicine and member of Ohio State's Comprehensive Cancer Center.

"In addition, researchers conducting large prospective cohort studies should be considering how statins impact human health overall. They should consider both risks and benefits, not just the disease that is being treated by the specific drug," Clinton said.

The study was done retrospectively, meaning that the researchers looked back at existing records from a group of patients to determine if there were any possible connections between statin prescriptions and diabetes. Previous research has suggested a connection, but this study design allowed for a glimpse at what is happening naturally in the clinical setting, rather than what happens in a prospective trial that randomly assigns some people to statins and some people to placebo, said Zigmont, who is now an assistant professor at Southern Connecticut State University.

The study was enriched by the availability of a variety of details on the study population, including data from biometric screenings and a health survey that asked about education, health behaviors and ethnicity, Zigmont said. She also had access to medical claims data and pharmacy claims data.

Zigmont was careful to take a wide variety of confounding factors into account in an effort to better determine if the statins were likely to have caused the diabetes, she said. Those included gender, age, ethnicity, education level, cholesterol and triglyceride readings, body mass index, waist circumference and the number of visits to the doctor.

Programs that help patients improve their fitness and diets could be considered and discussed when doctors are prescribing statins, so that patients can be proactive about diabetes prevention, she said.

It would also be helpful for future research to better determine which statins and which doses might lead to the greatest risk, Zigmont said. Her study didn't allow for an analysis based on different types of statins.

Limitations of the research include the fact that the majority of statin users were white, and that the research team had no way of knowing how closely patients adhered to their doctors' prescriptions. There also was no way of determining who was at elevated risk of diabetes at the study's onset, Zigmont said.

Credit: 
Ohio State University

European pregnancy rates from IVF and ICSI 'appear to have reached a peak'

Vienna 25 June 2019: The latest annual data collected by ESHRE from European national registries (for 2016) show another rise in the cumulative use of IVF in the treatment of infertility, although success rates after IVF or ICSI appear to have reached a peak, with pregnancy rates per started treatment calculated at 27.1% after IVF and 24.3% after ICSI. The figures, although indicative of a slight decline in pregnancy rate, continue a recent trend of conventional IVF cycles performing better than ICSI.

The one treatment in which cycle numbers and success rates continue to rise is in frozen embryo cycles in which embryos cryopreserved in storage are thawed for transfer in a later cycle. Pregnancy rate per started treatment in 2016 was 30.5%, an increase over 2015 of 1.3%. Around one half of the total European cycles analysed by ESHRE were transfers from frozen embryos, also an increase over 2015. Dr Christian de Geyter, chair of ESHRE's European IVF Monitoring Consortium, said the number of frozen embryo transfers was likely to increase as more and more clinics adopt a single embryo transfer policy (and thus store more embryos) or take up 'freeze-all' strategies to avoid embryo transfer in an initial stimulated cycle.

In Europe, Spain remains the most active country in assisted reproduction, with a record 140,909 treatment cycles performed. Spain continues to set the pace ahead of Russia (121,235 cycles), France (104,733) and Germany (96,226). The cycles monitored by ESHRE include treatments with IVF, ICSI, intrauterine insemination and egg donation.(2)

The report covers a total of more than 800,000 treatment cycles performed in 2016, with 165,000 babies born - and represents the largest and most accurate snapshot of assisted reproduction in Europe.(2) Dr de Geyter will present the results today in Vienna at the 35th Annual Meeting of ESHRE.

He estimates that around 84% of all European assisted reproduction fertility treatments are now included in the ESHRE monitoring programme - although this year (for 2016) without the inclusion of any data so far from the UK, which usually performs around 60,000 treatments a year.

Among other findings:

Clinics in Europe continue to favour ICSI over IVF by around two to one (359,858 ICSI cycles, 128,626 IVF cycles), a pattern now evident throughout the world. ICSI was developed in the early 1990s as a specific treatment for male infertility (low sperm counts, poor sperm quality) but is now clearly being used for fertilisation in cases without male-factor infertility.

Pregnancy rates are higher with five-day-old embryos (blastocysts) than with three-day-old embryos, making blastocyst transfer another widely favoured strategy.

Pregnancy rates from egg donation continue to rise (now at about 50%), making egg donation the most successful treatment available.

The rate of twin pregnancy continues to decline in Europe, falling to around 15% in 2016. Correspondingly, the rate of single embryo transfers continues to rise - from 11% in 1997 to above 40% in 2016.

'Success rates have stabilised,' said Dr de Geyter, 'although outcome in egg donation and with use of frozen embryos is still moving upwards. The biggest upwards movement, however, is from treatments with frozen eggs, which have been revolutionised by the widespread introduction of vitrification.' Twenty-two countries reported activity in egg donation treatment with frozen banked eggs (11,196 cycles), achieving a pregnancy rate of 43.7% and delivery rate of 29.9%.

Dr de Geyter also noted that the availability of assisted reproduction treatment remains very patchy in Europe, with Denmark and Belgium each offering more than 2500 treatment cycles per million population, while others (such as Austria and Italy) offer considerably fewer. A study calculated that the global need for advanced fertility treatments was around 1500 cycles per million population per year. 'Only a minority of European countries meet this need,' said De Geyter.

Credit: 
European Society of Human Reproduction and Embryology

Radioactive tadpoles reveal contamination clues

image: This is a captured bullfrog tadpole surrounded by the nutrients in the vegetation.

Image: 
Terry Spivey@bugwood.org

Aiken, S.C. - Tadpoles can be used to measure the amount of radiocesium, a radioactive material, in aquatic environments, according to new research from University of Georgia scientists.

Whether the contamination comes from nuclear accidents, global fallout from weapons testing or the production of nuclear energy, tadpoles could be used to determine its extent and severity.

James C. Leaphart, lead investigator on the 32-day study, evaluated the rate at which the environmental pollutant radiocesium, a byproduct of nuclear production, accumulated through time in bullfrog tadpoles.

Taken from an uncontaminated wetland, the tadpoles were placed in various locations in a canal on the U.S. Department of Energy's Savannah River Site, a former nuclear production facility. The canal received releases of radiocesium from a nearby reactor from 1954 to 1964.

"Due to the rapid accumulation of radiocesium in these tadpoles, how much they accumulated and their inability to leave aquatic systems before metamorphosis, these tadpoles are excellent indicators of the bioavailability and distribution of radiocesium in the system," said Leaphart, graduate student at the Savannah River Ecology Laboratory and Warnell School of Forestry and Natural Resources.

According to the study results, published in the Journal of Environmental Radioactivity, bullfrog tadpoles reached what the researchers describe as maximum threshold, or the point at which their uptake of the contaminant stopped, between 11 and 14 days.

This accumulation rate was significantly faster than rates recorded for waterfowl and fish, species previously studied for uptake of the contaminant, according to Leaphart. Rates in these species varied significantly, with a range of 17 to 175 days.
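A common way to describe this kind of saturating uptake (a generic first-order model given here purely for illustration, not necessarily the model fitted in the study) is

\[
C(t) = C_{\max}\left(1 - e^{-kt}\right),
\]

where \(C(t)\) is the radiocesium concentration in the animal at time \(t\), \(C_{\max}\) is the maximum (plateau) concentration and \(k\) is the uptake rate constant. Since roughly 95% of \(C_{\max}\) is reached by \(t \approx 3/k\), a plateau at 11 to 14 days implies a much larger \(k\) than in species that take 17 to 175 days.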

James Beasley, Leaphart's adviser and associate professor at SREL and Warnell, said how quickly a species reaches the threshold level of accumulation is vital in determining its use as a biomonitor of the contaminant.

"If it takes a long time to achieve the threshold level, factors like animal movement and changes in diet can play a role in influencing the results," he said.

Tadpoles are more likely to reflect local contamination levels, according to Beasley. That's because factors like movement and changes in food availability will not have as much of an impact on an individual's exposure compared to species that may take several weeks or months to achieve maximum levels.

"Isolation is key," Leaphart said. "Tadpoles spend the first portion of their lives in aquatic systems--canals, wetlands and ponds--foraging on plants, algae, insect larvae and sediments where radiocesium has a tendency to bind."

Understanding radiocesium accumulation patterns in amphibians is important, the researchers said, because they have the potential to transfer contaminants within food webs as well as disperse aquatic contaminants into terrestrial ecosystems following metamorphosis.

Credit: 
University of Georgia