Earth

Reconnecting with nature key for sustainability

People who live in more built-up areas and spend less free time in nature are also less likely to take actions that benefit the environment, such as recycling, buying eco-friendly products, and environmental volunteering.

The findings of a new study led by the University of Exeter indicate that policies to preserve and develop urban green spaces, and to help urban populations reconnect with nearby nature, could help meet sustainability targets and reduce carbon emissions.

The study, published in Environment International and funded by the NIHR Health Protection Research Unit in Environmental Change and Health, analysed survey responses from more than 24,000 people in England. The team looked at people's exposure to nature in their local area, their recreational visits to natural environments (parks, woodlands, beaches etc.), and the extent to which they valued the natural world.

The team, including collaborators from the University of Plymouth and Public Health England, found that many green choices were more common in people who lived in greener neighbourhoods or at the coast, and among those who regularly visited natural spaces regardless of where they lived. The relationships were the same for men and women, young and old, and for rich and poor.

Lead author Dr Ian Alcock, of the University of Exeter Medical School, said: "Over 80% of the English population now live in urban areas and are increasingly detached from the natural world. Greening our cities is often proposed to help us adapt to climate change - for example, city parks and trees can reduce urban heat spots. But our results suggest urban greening could help reduce the damaging behaviours which cause environmental problems in the first place by reconnecting people to the natural world."

Co-researcher Dr Mat White, of the University of Exeter Medical School, said: "The results are correlational so there is always the issue of untangling cause and effect, but our results based on a very large representative sample are consistent with experimental work which shows that people become more pro-environmental after time spent in natural vs. urban settings."

The paper is entitled 'Associations between pro-environmental behaviour and neighbourhood nature, nature visit frequency and nature appreciation: Evidence from a nationally representative survey in England'. Authors are Ian Alcock, Mathew White, Sabine Pahl, Raquel Duarte-Davidson and Lora Fleming.

Credit: 
University of Exeter

Involving family in bipolar care helps children and teens stay healthier, longer

image: This is David Miklowitz, PhD, director of the Max Gray Child and Adolescent Mood Disorders Program of the Jane and Terry Semel Institute for Neuroscience and Human Behavior at UCLA.

Image: 
UCLA Health

In a UCLA-led study, children and adolescents with a high risk for developing bipolar disorder stayed healthier for longer periods when their family members participated in their psychotherapy sessions.

The research team, which also included authors from the University of Colorado and Stanford University, studied 127 children and teens between the ages of 9 and 17. Its goal was to determine which of two types of treatment was more effective at delaying new and recurring bipolar symptoms: 12 sessions of a family-focused therapy that teaches patients and families better communication skills or six sessions of a traditional form of psychoeducation focused on developing a plan to help patients better understand and cope with their mood symptoms.

The study was published in JAMA Psychiatry.

Bipolar disorder, sometimes referred to as manic depression, is a brain disorder marked by sudden shifts in mood and energy levels. Treatment with medication and/or psychotherapy may help alleviate the most severe symptoms, but there is no cure. Recurring symptoms may include bouts of mania and depression, characterized by high energy, feelings of grandiosity and elation that alternate with deep feelings of sadness, lethargy and suicidal thoughts and actions.

The children and adolescents in the study had an elevated risk for developing the disorder because they had a genetic or family history of the condition, said David Miklowitz, the study's lead author and a distinguished professor of psychiatry at the David Geffen School of Medicine at UCLA. All of the participants also demonstrated early warning signs of bipolar disorder -- such as depression and brief periods of mania -- at the start of the study.

Miklowitz said many families don't know that their kids have a high risk for developing bipolar disorder.

"They think they're just moody kids," he said. "But there are telltale signs, such as sudden changes in mood and energy levels, irritability, feeling increasingly anxious and depressed such that they can't get out of bed. Often, these are kids for whom bipolar disorder runs in the family."

For the study, half of the participants underwent 12 sessions of family-focused therapy over a four-month period. The other half received a traditional, less intensive educational treatment over six sessions that was designed to be similar to the current standard treatment for children and adolescents. That arm received three individual and three family sessions, also over a four-month period. The treatments took place at UCLA, two Colorado campuses and Stanford.

About 60% of participants also chose to receive medications for symptoms of depression, mood instability, ADHD or anxiety. Medication regimens were equivalent in the two therapy treatments.

After the four-month treatments had ended, participants were followed for up to four years; follow-up was shorter in some cases because participants dropped out of the study or moved before the analysis was complete.

In the family-focused therapy sessions, children and their parents were taught to recognize and understand the early symptoms of bipolar disorder, and to practice communication and problem-solving skills such as active listening and conflict resolution. Participants in the traditional educational treatment were taught how to monitor their moods for new and recurring symptoms, and given individualized instruction for managing those symptoms.

A total of 77% of the adolescents in the family-focused treatment recovered from their initial symptoms during the study, and the average time before new symptoms of depression returned was 87 weeks. By comparison, 65% of the adolescents in the educational group recovered, but the average time before their symptoms returned was shorter, just 63 weeks.

"If kids who are at risk for bipolar disorder are living in chaotic households with no boundaries or highly critical parents, they will do worse over time," Miklowitz said. "Involving the parents in the child's therapy teaches family members how to create a more protective environment so that kids can stay well for longer."

Treating children and adolescents before severe symptoms of bipolar disorder start is considered an early intervention rather than prevention, said Miklowitz, who is the director of the Max Gray Child and Adolescent Mood Disorders Program of the Jane and Terry Semel Institute for Neuroscience and Human Behavior.

"We don't know yet whether bipolar disorder can be prevented," he said. "What we're trying to do is to catch the illness early so that kids and teens, with the help of their families, can learn ways to reduce the severity of their symptoms and the frequency with which symptoms recur."

The researchers also are studying whether family-focused therapy changes brain activity in ways that may help to alleviate mood symptoms, as well as whether it helps reduce suicidal thoughts and behaviors in youth with a high risk for bipolar disorder.

"This study is an important first step in trying to decrease the severity of bipolar disorder early on for children," said Dr. Christopher Schneck, an associate professor of psychiatry at the University of Colorado Anschutz Medical Campus and a co-author of the study. "Efforts at home and in health care settings, like providing skill training for families, can make a big difference in a child's suffering."

Credit: 
University of California - Los Angeles Health Sciences

Sub-national 'climate clubs' could offer key to combating climate change

'Climate clubs' offering membership for sub-national states, in addition to just countries, could speed up progress towards a globally harmonised climate change policy, which in turn offers a way to achieve stronger climate policies in all countries.

This is the key finding of a new study by researchers from the Institute of Environmental Science and Technology of the Autonomous University of Barcelona (UAB), recently published in the open-access journal Environmental Research Letters.

ICTA-UAB researcher and first author Nick Martin explained that the United Nations Framework Convention on Climate Change (UNFCCC) is the default facilitator of global negotiations on climate issues. However, due to the logistical limitations of large groups and the essentially voluntary nature of involvement, progress has been slow. Its two most ambitious initiatives - the defunct Kyoto Protocol and the current Paris Agreement - both relied on voluntary actions and were not legally binding. As a result, such climate policies lack global harmonisation and therefore are bound to remain weak.

The researchers therefore consider it important to think about alternatives. A 'climate club' of countries has been suggested as a way to move towards a global agreement that enforces national climate policies through harmonisation.

"We take this idea a step further in our study. Extending a club to comprise sub-national states or provinces that want to implement their own, more ambitious climate policies could allow the inclusion of considerable contributions from important emitters like the US. Given the US intention to withdraw from the Paris Agreement, this could have a significant impact on overall US emissions by allowing more motivated states to take part."

The 'climate club' model is based on a uniform policy - most likely in the form of carbon pricing. The club would then offer exclusive trade benefits or club goods to members. It could also attract further membership by imposing penalties on imports from non-members, to limit competition from unregulated sources.

The study used four measures to predict the likelihood of involvement for governments at multiple levels. These included the level of carbon independence, public opinion regarding climate change, current government policy, and level of membership in existing climate-related coalitions.

Dr Jeroen van den Bergh, ICREA Research Professor at ICTA-UAB and second author, explained that, taken together, these measures provide a good indication of a government's level of ambition regarding climate policy, and therefore its potential willingness to join an international 'climate club'.

"We initially identified a group of nine countries likely to be most receptive to club membership. Although the US and China were not among this group, our findings suggest that the EU (taken as a single country for these purposes) is the preferred initiator of the club, given its high emissions, high GDP and long history of leadership on climate change mitigation". What's more, they believe that China could well be convinced to join a club given its recent sharp rise in concern about local and global emissions.

Considering the current climate policies of the Trump administration, the US would seem highly unlikely to take part in initiatives of this kind for the foreseeable future. However, as US states have considerable decision-making power at the local level and many control significant economies in their own right, the researchers also evaluated the likelihood that individual states would consider joining a 'climate club'.

They found that 10 of the 50 states were 'very likely' to consider club membership, with a further 13 'moderately likely' to do so. Jointly, these 23 states represent 36 per cent of the US's national emissions and 56.3 per cent of its GDP.

Less motivated US states could still be persuaded to join through strong export dependencies with four key partners - the EU, Canada, Mexico and China. In fact, 10 of the remaining 'not likely' states were found to have strong trade ties to these countries. So, a club containing these four trading partners could boost membership significantly via trade influences. In all, the analysis suggests that US states representing a total of 69.9 per cent of emissions and 77.7 per cent of total GDP may be amenable to club membership via either of these mechanisms.

Dr van den Bergh concluded: "We recognise the political and legal hurdles climate clubs could face, but considering the limitations of the Paris Agreement and the urgency of implementing effective climate action, we believe the time is ripe for debating daring solutions."

Credit: 
Universitat Autonoma de Barcelona

NASA, NOAA analyses reveal 2019 second warmest year on record

image: This plot shows yearly temperature anomalies from 1880 to 2019, with respect to the 1951-1980 mean, as recorded by NASA, NOAA, the Berkeley Earth research group, the Met Office Hadley Centre (UK), and the Cowtan and Way analysis. Though there are minor variations from year to year, all five temperature records show peaks and valleys in sync with each other. All show rapid warming in the past few decades, and all show the past decade has been the warmest.

Image: 
NASA GISS/Gavin Schmidt

According to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA), Earth's global surface temperatures in 2019 were the second warmest since modern recordkeeping began in 1880.

Globally, 2019 temperatures were second only to those of 2016 and continued the planet's long-term warming trend: the past five years have been the warmest of the last 140 years.

This past year, global temperatures were 1.8 degrees Fahrenheit (0.98 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's Goddard Institute for Space Studies (GISS) in New York.

"The decade that just ended is clearly the warmest decade on record," said GISS Director Gavin Schmidt. "Every decade since the 1960s clearly has been warmer than the one before."

Since the 1880s, the average global surface temperature has risen, and it is now more than 2 degrees Fahrenheit (a bit more than 1 degree Celsius) above that of the late 19th century. For reference, the last Ice Age was about 10 degrees Fahrenheit colder than pre-industrial temperatures.

Using climate models and statistical analysis of global temperature data, scientists have concluded that this increase mostly has been driven by increased emissions into the atmosphere of carbon dioxide and other greenhouse gases produced by human activities.

"We crossed over into more than 2 degrees Fahrenheit warming territory in 2015 and we are unlikely to go back. This shows that what's happening is persistent, not a fluke due to some weather phenomenon: we know that the long-term trends are being driven by the increasing levels of greenhouse gases in the atmosphere," Schmidt said.

Because weather station locations and measurement practices change over time, the interpretation of specific year-to-year global mean temperature differences has some uncertainties. Taking this into account, NASA estimates that 2019's global mean change is accurate to within 0.1 degrees Fahrenheit, with a 95% certainty level.

Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2019 annual mean temperature for the contiguous 48 United States was the 34th warmest on record, giving it a "warmer than average" classification. The Arctic region has warmed slightly more than three times faster than the rest of the world since 1970.

Rising temperatures in the atmosphere and ocean are contributing to the continued mass loss from Greenland and Antarctica and to increases in some extreme events, such as heat waves, wildfires and intense precipitation.

NASA's temperature analyses incorporate surface temperature measurements from more than 20,000 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.

These in situ measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heat island effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
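The core arithmetic behind such an analysis can be illustrated with a small sketch. This is not NASA's actual GISTEMP code; it is a simplified, hypothetical example showing two ideas the passage describes: each station's reading is expressed as an anomaly relative to its own 1951-1980 baseline mean, and stations are weighted (here by cosine of latitude, a common equal-area approximation) so that unevenly spaced stations do not skew the global average.

```python
import math

def station_anomaly(yearly_temps, year, base_years=range(1951, 1981)):
    """Anomaly of one station's temperature in `year` relative to the
    mean of that same station over the 1951-1980 baseline period."""
    baseline = [yearly_temps[y] for y in base_years if y in yearly_temps]
    return yearly_temps[year] - sum(baseline) / len(baseline)

def global_anomaly(stations, year):
    """Latitude-weighted average of station anomalies.

    `stations` is a list of (latitude_degrees, {year: temp_C}) pairs.
    Cosine-of-latitude weights approximate equal-area averaging, so the
    densely instrumented mid-latitudes don't dominate the global mean.
    """
    num = den = 0.0
    for lat, temps in stations:
        w = math.cos(math.radians(lat))
        num += w * station_anomaly(temps, year)
        den += w
    return num / den
```

Using anomalies rather than absolute temperatures is what allows records from very different climates (a tropical station and an Antarctic one) to be averaged meaningfully: each station is compared only against its own history.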

NOAA scientists used much of the same raw temperature data, but with a different interpolation into the Earth's polar and other data-poor regions. NOAA's analysis found 2019 global temperatures were 1.7 degrees Fahrenheit (0.95 degrees Celsius) above the 20th century average.

NASA's full 2019 surface temperature data set and the complete methodology used for the temperature calculation and its uncertainties are available at:

https://data.giss.nasa.gov/gistemp

GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.

NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based measurements, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information about NASA's Earth science activities, visit:

https://www.nasa.gov/earth

The slides for the Jan. 15 news conference are available at:

https://www.ncdc.noaa.gov/sotc/briefings/20200115.pdf

NOAA's Global Report is available at:

https://www.ncdc.noaa.gov/sotc/global/201913

Credit: 
NASA/Goddard Space Flight Center

Blue light can help heal mild traumatic brain injury

image: Research Technician Cami Barnes tests a blue light device.

Image: 
William

Early morning blue light exposure therapy can aid the healing process of people impacted by mild traumatic brain injury, according to new research from the University of Arizona.

"Daily exposure to blue wavelength light each morning helps to re-entrain the circadian rhythm so that people get better, more regular sleep. This is likely true for everybody, but we recently demonstrated it in people recovering from mild traumatic brain injury, or mTBI. That improvement in sleep was translated into improvements in cognitive function, reduced daytime sleepiness and actual brain repair," said William D. "Scott" Killgore, psychiatry professor in the College of Medicine - Tucson and lead author on a new study published in the journal Neurobiology of Disease.

Mild traumatic brain injuries, or concussions, are often the result of falls, fights, car accidents and sports participation. Among other threats, military personnel can also experience mTBI from exposure to explosive blasts: Shockwaves strike the soft tissue of the gut and push a burst of pressure into the brain, causing microscopic damage to blood vessels and brain tissue, Killgore said.

"Your brain is about the consistency of thick Jell-O," he said. "Imagine a bowl of Jell-O getting hit from a punch or slamming against the steering wheel in a car accident. What's it doing? It's absorbing that shock and bouncing around. During that impact, microscopic brain cells thinner than a strand of hair can easily stretch and tear and rip from the force."

Those with a concussion or mTBI can momentarily see stars, become disoriented, or even briefly lose consciousness following the injury; however, loss of consciousness doesn't always happen, and many people who sustain a concussion are able to walk it off without realizing they have a mild brain injury, according to Killgore. Headaches, attention problems and mental fogginess are commonly reported after head injuries and can persist for weeks or months for some people.

Few, if any, effective treatments for mTBI exist. The U.S. Army Medical Research and Development Command funded the research to find alternatives to medicinal methods of mTBI recovery.

"About 50% of people with mTBI also complain that they have sleep problems after an injury," Killgore said.

Recent research has shown that the brain repairs itself during sleep, so Killgore and his co-authors - John Vanuk, Bradley Shane, Mareen Weber and Sahil Bajaj, all from the Department of Psychiatry - sought to determine if improved sleep led to a faster recovery.

In a randomized clinical trial, adults with mTBI used a cube-like device that shines bright blue light (with a peak wavelength of 469 nm) at participants from their desk or tables for 30 minutes early each morning for six weeks. Control groups were exposed to bright amber light.

"Blue light suppresses brain production of a chemical called melatonin," Killgore said. "You don't want melatonin in the morning because it makes you drowsy and prepares the brain to sleep. When you are exposed to blue light in the morning, it shifts your brain's biological clock so that in the evening, your melatonin will kick in earlier and help you to fall asleep and stay asleep."

People get the most restorative sleep when it aligns with their natural circadian rhythm of melatonin - the body's sleep-wake cycle associated with night and day.

"The circadian rhythm is one of the most powerful influences on human behavior," Killgore said. "Humans evolved on a planet for millions of years with a 24-hour light/dark cycle, and that's deeply engrained in all our cells. If we can get you sleeping regularly, at the same time each day, that's much better because the body and the brain can more effectively coordinate all these repair processes."

As a result of the blue light treatment, participants fell asleep and woke an average of one hour earlier than before the trial and were less sleepy during the daytime. Participants improved their speed and efficiency in brain processing and showed an increase in volume in the pulvinar nucleus, an area of the brain responsible for visual attention. Neural connections and communication flow between the pulvinar nucleus and other parts of the brain that drive alertness and cognition were also strengthened.

"We think we're facilitating brain healing by promoting better sleep and circadian alignment, and as these systems heal, these brain areas are communicating with each other more effectively. That could be what's translating into improvements in cognition and less daytime sleepiness," Killgore said.

Blue light from computers, smartphones and TV screens has often given it a bad rap. But according to Killgore, "when it comes to light, timing is critical. Light is not necessarily good or bad in-and-of-itself. Like caffeine, it all comes down to when you use it. It can be terrible for your sleep if you're consuming coffee at 10 o'clock at night, but it may be great for your alertness if you have it in the morning."

He and his team plan to continue their research to see if blue light improves sleep quality and how light therapy might affect emotional and psychiatric disorders. Killgore believes that most people, whether injured or healthy, could benefit from correctly timed morning blue light exposure, a theory he hopes to confirm in future studies.

Credit: 
University of Arizona

Molecular understanding of drug interactions suggests pathway to better malaria treatments

image: University of Houston Engineers Peter Vekilov, right, Wenchuan Ma and Jeffrey Rimer have for the first time demonstrated what happens at the molecular level when two compounds known to inhibit crystal growth were combined.

Image: 
University of Houston

The process of crystallization is central to drug development, petrochemical processing and other industrial operations, but scientists say they still are learning about the complex interactions involved in the building and dissolution of crystals.

Researchers from the University of Houston and the Université libre de Bruxelles reported in the journal Nature that they have for the first time demonstrated at the molecular level what happens when two compounds known to inhibit crystal growth - in this case, antimalarial drugs - were combined. The results were unexpected.

"You would expect using two drugs that attacked crystallization in two different ways would be synergistic, or at the very least additive," said Jeffrey Rimer, Abraham E. Dukler Professor of Chemical and Biomolecular Engineering at UH and a co-author of the paper. "Instead, we found that they can work against each other."

Working against each other, known as antagonistic cooperation, meant that the drugs were actually less effective in tandem than individually. Peter Vekilov, John and Rebecca Moores Professor of Chemical and Biomolecular Engineering and Chemistry at UH and another co-author, said the work will allow the design of more effective treatments for malaria, a mosquito-borne disease that killed 435,000 people in 2017, most of them children in Africa.

But more broadly, it suggests a new way to screen molecules for their potential in drug development, allowing new treatments to be developed more quickly.

"When you are using modifiers, a small change in the molecule's structure can dramatically alter its performance," Rimer said.

Malaria is caused by a parasite, which consumes hemoglobin and leaves behind a compound known as hematin, which the parasite sequesters inside a crystal. Antimalarial treatments work by inhibiting the crystal growth, freeing hematin to attack the parasite.

For this work, the researchers studied the growth of hematin crystals in the presence of four antimalarial drugs - chloroquine, quinine, mefloquine and amodiaquine - which work in one of two distinct ways.

Both computationally and experimentally, including through the use of atomic force microscopy, the researchers demonstrated how compounds which attack crystallization by two different mechanisms behave when combined. The resulting molecular-level understanding of that behavior suggests a new mechanism for materials science, Vekilov said.

"This mechanism may provide guidance in the search for suitable inhibitor combinations to control crystallization of pathological, biomimetic, and synthetic materials," the researchers wrote. "In a broader context, our results highlight modifier interactions mediated by the dynamics and structures on the crystal interface as a prime element of the regulation of the shapes and patterns of crystalline structures in nature and industry."

Credit: 
University of Houston

New mechanism for estrogen in promoting breast cancer in cells

image: D. Joseph Jerry is professor of veterinary and animal sciences at the University of Massachusetts Amherst and also serves as science director of the Pioneer Valley Life Sciences Institute and co-director of the Rays of Hope Center for Breast Cancer Research in a partnership between UMass Amherst and Baystate Medical Center.

Image: 
UMass Amherst

A new approach to studying the effects of two common chemicals used in cosmetics and sunscreens found they can cause DNA damage in breast cells at surprisingly low concentrations, while the same dose did not harm cells without estrogen receptors.

The research, published Jan. 15 in Environmental Health Perspectives, identifies a new mechanism by which estrogens and xenoestrogens - environmental chemicals that act like estrogens - may promote breast cancer, says breast cancer researcher D. Joseph Jerry, professor of veterinary and animal sciences at the University of Massachusetts Amherst. Jerry also serves as science director of the Pioneer Valley Life Sciences Institute and co-director of the Rays of Hope Center for Breast Cancer Research in a partnership between UMass Amherst and Baystate Medical Center.

"The new research offers more sensitive tools to screen for the potential deleterious effects of environmental chemicals, which would be overlooked by methods currently used," Jerry explains. He notes that federal agencies, such as the Food and Drug Administration (FDA), typically screen for toxicity of these chemicals in cell lines that don't have estrogen receptors.

The two compounds - examined in cells grown in the lab and in the mammary glands of mice - were the ultraviolet filter benzophenone-3 (BP-3), also known as oxybenzone, and propylparaben (PP), an antimicrobial preservative found in cosmetics and other personal care products.

Jerry emphasizes that more research is needed to determine what this discovery may mean in terms of consumer guidelines. "Benzophenone-3 is a sunscreen that works. If you use it, you can prevent skin cancer. Am I arguing you shouldn't use sunscreen? I am not. But there may be a subset of people for whom it may present a significant hazard," says Jerry, such as women at high risk for breast cancer or those with a history of estrogen receptor-positive breast cancer.

Previous research on the impact of BP-3 and PP focused on the exposure necessary to activate specific genes in cancer cells or accelerate their growth. "Those effects required concentrations that exceed the levels that most women are normally exposed to," Jerry says.

But the new research shows that DNA damage in breast cells with estrogen receptors occurred at concentrations that are 1/10th to 1/30th of that required to stimulate proliferation or gene expression. "There may be a risk at lower levels than we would have previously understood," Jerry says.

Jerry and colleagues at UMass Amherst, UMass Medical Center-Baystate and Pioneer Valley Life Sciences Institute decided to look at whether PP and BP-3 have estrogenic effects at concentrations relevant to population exposures because "we know that estrogen can promote breast cancer," Jerry says.

"It's not toxic unless the cells have estrogen receptors," he says. "So it's acting through the estrogen receptor to create this damage. There is no consequence if you test it in other cells."

Credit: 
University of Massachusetts Amherst

Analyzing DNA in soil could be an effective way of tracking animals

image: A night vision camera trap captured this image of mountain lions drinking from a stream at Stanford's Jasper Ridge Biological Preserve.

Image: 
Jasper Ridge Biological Preserve

It's hard to protect something you can't find. A new Stanford study reveals that sampling soil for animals' left-behind DNA can provide valuable information for conservation efforts, with significantly less cost and time than currently used methods, such as camera traps.

The process, outlined Jan. 14 in Proceedings of the Royal Society B, also proved effective at distinguishing genetic differences between animals that otherwise look identical, an arduous task with traditional tracking approaches, and may have even revealed previously unknown species diversity, according to the researchers. Although the technique still needs refinement, the authors are optimistic it could one day revolutionize the study of species in the wild.

"We need a quantum leap in the way we identify and track animals," said study lead author Kevin Leempoel, a postdoctoral research fellow in biology at Stanford. "This may be it."

A hopeful solution

The specter of extinction hangs over more than a quarter of all animal species, according to the best estimate of the International Union for Conservation of Nature, which maintains a list of threatened and extinct species. Conservationists have documented extreme declines in animal populations in every region of Earth.

One of the most promising tools for monitoring biodiversity - key to large-scale conservation efforts - is the study of environmental DNA, or eDNA, in discarded animal materials, such as hair, feces, skin and saliva. After extracting DNA, scientists sequence it and compare it to online DNA sequence databases to identify the species. It's a relatively fast, low-maintenance process for studying species diversity, distribution and abundance, compared to traditional approaches such as live-trapping, animal tracking and camera trapping. The researchers spent about $4,500 for all the study's supplies, other than lab equipment. A similar study with camera traps could cost more than twice as much.

Despite the obvious advantages, questions about the efficacy of eDNA analyses have remained. That's in part because most research so far has been done only in ocean and freshwater environments. Among the few studies done on land, most have been in enclosed areas, such as zoos, or limited to a small number of species.

Seeing the unseen

Working at Stanford's 1,193-acre Jasper Ridge Biological Preserve, Leempoel and his colleagues studied eDNA in soil. Not only did they identify almost every animal that nearby camera traps had spotted in the previous four years, they also found genetic evidence of a number of small mammals, including bats and voles, rarely if ever seen by the cameras. These creatures had likely escaped the cameras' gaze because they are too small to trigger them. Overall, there was an 80 percent chance of finding an animal's eDNA in an area within 30 days of the animal's presence there.
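A quick back-of-the-envelope calculation (our own simplification, not a model from the paper) shows what that 80 percent figure implies if each day after the animal's visit is treated as an independent chance of a soil sample capturing the signal:

```python
# The study reports an ~80% chance of recovering an animal's eDNA within
# 30 days of its presence. Assuming (purely for illustration) that each
# day offers an independent probability p of detection, then
#   1 - (1 - p)**30 = 0.80,
# which we can invert for the implied per-day probability p.
p_daily = 1 - (1 - 0.80) ** (1 / 30)
print(f"implied per-day detection probability: {p_daily:.3f}")  # about 0.052
```

Real detection odds almost certainly decay as the DNA degrades, so this constant-rate reading is only a rough intuition pump.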

Another advantage of eDNA is the possibility of distinguishing species that look similar. For example, the researchers found the DNA of the Norway rat in soil samples, confirming the presence of this species in the area for the first time. Previous camera surveys could not tell the difference between Norway and black rats.

Compared with camera records and other observations, eDNA identifications appeared to be closely correlated with how frequently and recently animals had been in the area. The analysis turned up no hint of badgers (unrecorded on cameras for the previous four years), domestic cats or weasels (caught on camera only a few times in the previous two years).

"By corroborating photographs of animals with their genetic remains in the environment, this study reveals both hidden biodiversity in a terrestrial ecosystem and how well these eDNA techniques will work in other places," said study senior author Elizabeth Hadly, the Paul S. and Billie Achilles Professor in Environmental Biology in Stanford's School of Humanities and Sciences.

Toward a new paradigm

Despite these positive results, questions remain about the potential of eDNA analysis. Scientists do not know how frequently an animal must pass by a given area to be detectable in an eDNA sample, or how recent that passage must be. If an animal's size affects the amount of DNA it leaves behind, as the researchers speculate, some animals would only rarely be sampled while others would be overrepresented. No one knows the precise volume and number of samples that should be collected for maximum accuracy, which environmental source - soil or something else - is the most versatile, or whether all species are even detectable via eDNA analysis.

The study results appeared to overrepresent some species, such as mountain lions and bobcats, possibly due to the felines' habit of frequently marking their territory with urine and feces, and because they frequently use trails such as those where the researchers took soil samples. In general, it's impossible to know whether pieces of skin, fur or dried scat were transported by wind or by other species that had consumed the animal as prey.

Perhaps most importantly, incomplete DNA databases and limitations of the study's design made it difficult to detect all species present in the area, and caused at least two inconsistent results among the genetic sequencing approaches the researchers used. Analyzing eDNA remains relatively time-consuming because proven protocols have yet to be established. Still, the researchers are optimistic about the approach's promise.

"Its overall accuracy, combined with decreasing costs of genetic sequencing and new portable sequencers, makes eDNA a likely candidate to become the standard for biodiversity surveys in the next decade," Leempoel said.

Credit: 
Stanford University

Examining changes to FDA approval, regulation of pharmaceuticals over 4 decades

Bottom Line: Publicly available and Food and Drug Administration (FDA) data were used in this observational study to describe the number and types of prescription drugs approved from 1983 to 2018 and how the approval process and regulation of drugs changed during this period. Approvals of new generic drugs and biologics increased over this time: the median annual number of generic drugs approved was 284 from 1985 to 2012 and 588 from 2013 to 2018. The authors report that the average annual number of new drug approvals, including biologics, was 34 from 1990-1999, decreasing to 25 from 2000-2009, and increasing to 41 from 2010-2018. There has been an expansion in the number of expedited development and approval programs since 1983, while the amount of evidence used for approvals has decreased. The proportion of new approvals supported by at least two pivotal trials declined from 81% in 1995-1997 to 53% in 2015-2017. The amount of industry-paid user fees collected, which are used to accelerate review times, has increased to an annual average of $820 million in 2013-2017. FDA drug review times declined from more than three years in 1983 to less than one year in 2017. A limitation of the study is the difficulty of comparing the size of the clinical benefit of drugs across different indications and populations.
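The period figures quoted above are simple summaries of annual counts. A sketch of how such summaries are derived (the yearly numbers below are invented placeholders, not the study's data, which are in the JAMA paper):

```python
from statistics import mean

# Hypothetical annual new-drug approval counts, for illustration only.
approvals = {
    1990: 30, 1991: 33, 1992: 36, 1993: 34,
    2010: 38, 2011: 41, 2012: 44, 2013: 40,
}

def period_mean(data, start, end):
    """Average annual approvals over an inclusive range of years."""
    return mean(v for year, v in data.items() if start <= year <= end)

early = period_mean(approvals, 1990, 1999)  # cf. the reported 34 per year
late = period_mean(approvals, 2010, 2018)   # cf. the reported 41 per year
```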

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Jonathan J. Darrow, S.J.D., J.D., M.B.A., of Brigham and Women's Hospital, Harvard Medical School, Boston, and coauthors.

(doi:10.1001/jama.2019.20288)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

A simple twist of cell fate

How do a couple of universally expressed proteins in stem cells and developing embryos influence an individual cell's ultimate fate -- whether it becomes, for example, a retinal cell, a heart muscle cell, or a stomach lining cell?

That's the question that Rajesh C. Rao, M.D., and his colleagues at the University of Michigan set out to answer.

Their findings, published in Cell Reports, could lead to new tools for researchers studying different types of specialized cells -- for which U-M was awarded provisional patents.

The research may also help illuminate an emerging class of anti-cancer drugs, WDR5 inhibitors -- as the two proteins at the heart of the study, WDR5 and p53, have long been studied in relation to cancer, and the study sheds light on how WDR5 regulates p53, both directly and indirectly.

"Most scientists have worked to understand stem cell differentiation by looking at transcription factors and how they jump into the action at a very specific time and place," says Rao, an assistant professor of ophthalmology and pathology at Michigan Medicine, and the study's senior author.

"Our main interest is in epigenetics and how changes in chromatin -- the packaging of DNA with proteins known as histones -- affect the fate of stem cells," adds Rao, who is also the Leonard G. Miller Professor of Ophthalmology & Visual Sciences. "How does a ubiquitous transcription factor like p53 integrate time-sensitive inputs from WDR5, which is involved in chromatin modifications, to guide differentiation?"

And while the detailed answer to that question is primarily of interest to other scientists, what the team observed in the lab is striking.

Using mouse embryonic stem cells -- which the researchers work with because of their rapid propensity to differentiate into specific tissues within structures called "organoids" -- the researchers found that by inducing a short delay in when WDR5 was expressed in precursor cells that emerge from embryonic stem cells, they could radically change their fate.

Normally these cells would develop into proto-nervous system tissue -- called the neuroectoderm -- which eventually gives rise to brain, spinal cord and retinal cells. (Rao studies the epigenetic regulation of retinal development and disease.) But by changing the timing of WDR5 gene expression, the scientists steered the cells toward the mesoderm, which gives rise to blood cells, heart muscle cells and skeletal muscle cells.

"When we looked through the microscope, we could see the cells beating. It was a total surprise," says Rao, who is also a member of the U-M Rogel Cancer Center and director of retina service at the VA Ann Arbor Healthcare System. "We cut them open and stained for proteins that are expressed in heart muscle cells, and found that 30-40% of the cells within the organoid had them. And that was enough for them to beat."

Typically, researchers have to use more complex and expensive methods to generate heart-like cells from embryonic stem cells -- which led the team to seek patents on generating these "organoids," Rao explains.

More broadly, while p53 has been extensively studied in the context of cancer, the research fills in new details about the transcription factor's role in the fate of non-cancerous embryonic cells, he adds.

"The method we used is not the way these different cell types are actually created in nature since WDR5 is not 'naturally' turned off temporarily in the embryo," Rao says. "But we were trying to find out the effects of this important epigenetic player, WDR5, on differentiation that couldn't be studied before -- because if you turn it off entirely, the cells just die. So we had to devise a method to turn it off temporarily."

Meanwhile, approaches to inhibiting WDR5 to treat cancer are the focus of $1 billion in pharmaceutical industry investment, Rao and his colleagues note, and the research could inform investigations into off-target effects of these new potential anti-cancer drugs.

Credit: 
Michigan Medicine - University of Michigan

Surprising beauty found in bacterial cultures

video: Time lapse video of flower-like patterns that emerge when E. coli and A. baylyi grow over a 24-hr. period.

Image: 
BioCircuits Institute/UC San Diego

Microbial communities inhabit every ecosystem on Earth, from soil to rivers to the human gut. While single-strain cultures are common in the lab, in the real world many different microbial species inhabit the same space. Researchers at the University of California San Diego have discovered that when certain microbes pair up, stunning floral patterns emerge.

In a paper published in a recent issue of eLife, a team of researchers at UC San Diego's BioCircuits Institute (BCI) and Department of Physics, led by Research Scientist and BCI Associate Director Lev Tsimring, reports that when non-motile E. coli (Escherichia coli) are placed on an agar surface together with motile A. baylyi (Acinetobacter baylyi), the E. coli "catch a wave" at the front of the expanding A. baylyi colony.

The agar provided food for the bacteria and also a surface on which E. coli couldn't easily move (making it non-motile). A. baylyi, on the other hand, can crawl readily across the agar using microscopic legs called pili. Thus, a droplet of pure E. coli would barely spread over a 24-hour period, while a droplet of pure A. baylyi would cover the entire area of the petri dish.

Yet when the E. coli and A. baylyi were mixed together in the initial droplet, both strains flourished and spread across the whole area as the non-motile E. coli hitched a ride on the highly mobile A. baylyi. However, what most surprised researchers were intricate flower-like patterns that emerged in the growing colony over a 24-hour period.

"We were actually mixing these two bacterial species for another project, but one morning I found a mysterious flower-like pattern in a petri dish where a day earlier I placed a droplet of the mixture. The beauty of the pattern struck me, and I began to wonder how bacterial cells could interact with each other to become artists," said Liyang Xiong, Ph.D. '19, who was a graduate student in the Physics Department and is the lead author of the study.

To uncover how the flower patterns were formed, Xiong et al. developed mathematical models that took into account the different physical properties of the two strains, primarily the differences in their growth rate, motility, and effective friction against the agar surface. The theoretical and computational analysis showed that the pattern formation originates at the expanding boundary of the colony, which becomes unstable due to drag exerted by the E. coli that accumulate there.

In areas where there is less E. coli accumulation, there is also less friction, allowing the boundaries to push out faster. In the areas where there is more E. coli accumulation and more friction, the boundaries stagnate. This is what creates the "petals" of the flower.
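The friction mechanism can be caricatured in a few lines of code. This toy model (all parameters invented, and far simpler than the paper's actual equations) tracks one radius value per angular sector of the colony boundary; sectors slowed by accumulated E. coli drag trap still more of it, so fast and slow sectors diverge into "petals":

```python
import random

random.seed(0)
n = 360                      # angular sectors around the colony boundary
r = [1.0] * n                # boundary radius per sector (arbitrary units)
friction = [0.05 * random.random() for _ in range(n)]  # initial E. coli drag

v0, dt = 1.0, 0.1
for _ in range(200):
    speeds = [v0 / (1.0 + f) for f in friction]  # more drag -> slower front
    fastest = max(speeds)
    for i in range(n):
        r[i] += speeds[i] * dt
        # slow sectors accumulate more E. coli, reinforcing the drag
        friction[i] += 0.01 * dt * (fastest - speeds[i])

roughness = max(r) - min(r)  # uneven front: the "petal" amplitude
```

After the run, the boundary is no longer circular: initially faster sectors have pulled well ahead of slower ones, which is the qualitative instability the researchers describe.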

Further analysis suggests this type of pattern is expected to form when motile bacteria are mixed with a non-motile strain that has a sufficiently higher growth rate and/or effective surface friction, which could have important implications in studying growing biofilms.

Biofilms are communities of microorganisms--including bacteria and fungi--that adhere to each other and to surfaces, creating strong matrices that are difficult to break down. Common examples include dental plaque and pond scum. They also grow in medical devices such as pacemakers and catheters. Learning how non-motile bacteria can "stick" to motile bacteria may provide insight into how biofilms are formed and how they can be eliminated.

"Bacterial pattern formation has been an active area of research in the last few decades," said Lev Tsimring, "However, the majority of laboratory studies and theoretical models were focused on the dynamics of single-strain colonies. Most bacteria in natural habitats live in multi-strain communities, and researchers are finally beginning to look for mechanisms controlling their co-habitation. While a number of biochemical mechanisms of inter-species communication and cooperation have been identified, we found that surprising complexity may result from purely physical interaction mechanisms."

Credit: 
University of California - San Diego

New small molecule to treat Alzheimer's disease and Dravet syndrome

image: Gladstone researchers Jorge Palop and Keran Ma are collaborating with Jesse Hanson from Genentech to develop therapies for Alzheimer's disease and Dravet syndrome.

Image: 
Lauren Bayless, Gladstone Institutes

SAN FRANCISCO, CA January 14, 2020--Gladstone researchers, in collaboration with Genentech, a member of the Roche group, have shown therapeutic efficacy of a new experimental drug in mouse models of Alzheimer's disease and a rare genetic form of epilepsy known as Dravet syndrome. The small molecule increases the activity of a subset of neurotransmitter (NMDA) receptors that are found at synapses, the connection points between neurons. These receptors are known to support cognition and memory by enhancing communication between neurons. The new research shows that enhancing the activity of synaptic NMDA receptors helps restore the brain's rhythms to normal patterns, and improves memory.

"Before now, we haven't had ideal tools to enhance synaptic NMDA receptors," said Gladstone Associate Investigator Jorge Palop, PhD, senior author of the study, which was published in the journal Cell Reports. "Now, the ability to specifically target these receptors opens up a lot of new possibilities for treating cognitive disorders."

"This is the first time we've explored what this type of experimental drug does in animal models," said Jesse Hanson, a scientist at Genentech and lead author of the new paper. "It was very gratifying to see an effect on both the brain's electrical activity and the animals' behavior."

Abnormal activity of NMDA receptors has been long implicated in neuropsychiatric, epileptic, and neurodegenerative disorders. But previous compounds for altering NMDA receptor function worked by binding to all subtypes of NMDA receptors, and either completely blocked the receptors or put them in a permanently active state. Researchers have theorized that modulating the receptors only at active synapses may help diverse cognitive diseases by potentiating synaptic function and increasing neuronal communication.

In 2016, Genentech researchers first reported the development of a new class of experimental drugs that selectively bound to one subtype of NMDA receptors--those found only at the synapses. The new drug was also unique, because rather than directly activating the receptors, it amplified the receptors' signals primarily when engaged by neurotransmitters, the chemicals neurons use to communicate with each other.

"These compounds enhance naturally occurring activity at the synapses, rather than turning the receptors on all the time," said Keran Ma, a staff scientist at Gladstone and a co-first author of the paper. "Thus, active synapses are potentiated in a more physiologically relevant way."

Gladstone and Genentech researchers teamed up to test the effect of one of the new experimental drugs, GNE-0723, on mouse models of Alzheimer's disease and Dravet syndrome. In the new paper, they report that GNE-0723 reduced a type of brain activity called low-frequency oscillations. These oscillations occur naturally even in healthy people, but are more prominent in Alzheimer's disease and Dravet syndrome, and can be associated with epileptic brain activity, which can contribute to impaired cognition and memory loss. When the researchers treated mice simulating Alzheimer's disease or Dravet syndrome with GNE-0723, low-frequency oscillations returned to levels seen in healthy control mice, and epileptic activity ceased.

"What we saw after the treatment were brain-wide changes in neural activity that shift the brain to a more active state that facilitates learning and memory," said Palop, who is also an associate professor of neurology at UC San Francisco.

Indeed, after diseased mice were treated with the experimental drug for several weeks, they performed better in learning and memory tests than untreated animals--they both learned faster and retained memories longer.

Two different types of brain cells--interneurons and excitatory cells--have NMDA receptors, and future studies will address which cell type is responsible for the beneficial effects of GNE-0723.

At Genentech, Hanson also explained that more research is needed to understand how this class of experimental drugs affects brain function. "For now, we're focused on using GNE-0723 as a research tool to learn what happens when you enhance NMDA receptors," Hanson said. "This is a powerful tool to understand both basic biology and disease mechanisms."

Credit: 
Gladstone Institutes

Watching complex molecules at work

image: Rhodopsin before (left) and after activation by light (right): The activation causes changes in functional groups inside the molecule (magnifying glass), which affect the entire molecule.

Image: 
E. Ritter/HZB

Time-resolved infrared spectroscopy in the sub-millisecond range is an important method for studying the relationship between function and structure in biological molecules. However, the method only works if the reaction can be repeated many thousands of times. This is not the case for a large number of biological processes, which are often based on very rapid and irreversible reactions - in vision, for example. Individual light quanta entering the rods of the retina activate the rhodopsin protein molecules, which then decay after fulfilling their phototransduction function.

Now a team headed by Dr. Ulrich Schade (HZB) and Dr. Eglof Ritter (Humboldt-Universität zu Berlin) at the IRIS beamline of BESSY II has developed a new instrument that can detect these kinds of very fast and/or irreversible reactions with a single measurement. The time resolution is a few microseconds. The instrument, a Féry spectrometer, uses a highly sensitive detector known as a focal-plane detector array and special optics to make optimal use of the brilliant infrared radiation of the BESSY II synchrotron source. The team used this device to observe activation of rhodopsin under near-in vivo conditions for the first time.

"We used rhodopsin because it irreversibly decays after being excited by light and is therefore a real acid test for the system", explains Ritter, first author of the study. Rhodopsin is a protein molecule that acts as a receptor and is the vision pigment found in the rods of the eye's retina. Even single photons can activate rhodopsin - enabling the eye to perceive extremely low levels of light. Moreover, rhodopsin is the common element in a class of receptors with hundreds of members that are responsible for olfaction, taste, pressure sensation, hormone reception, etc. - all of which function in a similar manner.

The team also studied another exciting protein in the infrared range for the first time: actinorhodopsin. This molecule is able to convert light energy into an electric current - a property that some bacteria use to generate electrochemical energy for their metabolisms.

"The new method enables us to investigate the molecular reaction mechanisms of all irreversible processes (or slow cyclic processes), such as those in the field of energy conversion and storage, for example", emphasised Schade, who heads the IRIS team.

Credit: 
Helmholtz-Zentrum Berlin für Materialien und Energie

Sand mining is threatening lives along the Mekong River

image: Mekong River bank collapse.

Image: 
Steve Darby, University of Southampton

It's a resource used in global construction and mined from rivers and coasts across the world.

Now new research, undertaken as part of a project led by the University of Southampton, has shown that sand mining is causing river beds to lower, leading to riverbank instability and increasing the likelihood of dangerous river bank collapses that damage infrastructure and housing and put lives at risk.

The new research has been published in the journal Nature Sustainability.

Researchers focused on the Mekong River - one of the world's major sand-bedded rivers - in Cambodia.

Dr Chris Hackney at the University of Hull, who led the research, said: "With the world currently undergoing rapid population growth and urbanisation, concrete production has grown massively, fuelling unprecedented demand for sand, so much so that sand is now the most consumed resource on the planet, after water."

The research was undertaken as part of a NERC-funded project led by Professor Stephen Darby at the University of Southampton, which is studying the impact of climate change on the flux of sediment through the Mekong.

Professor Darby added, "Much of the sand used in the production of concrete comes from the world's big sand-bedded rivers, like the Mekong. There has long been a concern that sand mining from the Mekong is causing serious problems, but our work is the first to provide a comprehensive, rigorous, estimate not only of the rate at which sand is being removed from the system but how this compares to the natural replenishment of sand by river processes, as well as the adverse impacts unsustainable sand mining has on river bank erosion."

In the study, the team, which also included researchers from the Universities of Exeter and Illinois, used sonar surveys to measure how much sand is transported through the Mekong, either in the water column, or on the river bed. The sonar surveys also revealed how much sand is being taken by sand miners; the sonar images show giant holes 42 metres in length and eight metres deep on the river bed as a result of sand being removed from the Mekong. By comparing the natural sand transport rates with the estimates of sand extraction, the team estimated that sand is being removed from the Mekong at a rate that is between five and nine times more than the rate at which sand is replenished by the river's natural sand transport processes.
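The arithmetic behind that imbalance is a simple budget: whatever is extracted beyond the river's natural resupply must come out of the bed itself. A sketch (the replenishment figure is an invented placeholder; only the five-to-ninefold ratio comes from the study):

```python
replenishment = 6.0  # Mt/year of sand resupplied by the river (assumed figure)

losses = {}
for ratio in (5, 9):                            # reported extraction range
    extraction = ratio * replenishment
    losses[ratio] = extraction - replenishment  # net sand mined from the bed
    print(f"{ratio}x extraction -> {losses[ratio]:.0f} Mt/yr of bed loss")
```

Whatever the true replenishment rate, extraction at five to nine times that rate means the deficit, and hence the bed lowering, scales directly with it.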

Using measurements of the shape of the river banks made by a Terrestrial Laser Scanner, the team were then able to analyse the extent to which the lowering of the river bed increases the risk of dangerous river bank collapses.

Dr Julian Leyland of the University of Southampton, who performed the TLS surveys, said: "Our research showed that it only takes two metres of lowering of the river bed to cause many of the river banks along the Mekong to collapse, but we've seen that dredging pits can often exceed eight metres in depth. It's clear that excessive sand mining is responsible for the increased rates of bank erosion that local communities have been reporting in recent years."

Dr Hackney warns that without proper regulation, excessive sand mining on the Mekong and other major rivers worldwide could have increasing environmental and social consequences.

He said: "We are seeing the profound effects that excessive sand mining is having on rivers, coasts and seas. We need much stronger regulation of unfettered sand mining to avoid the dangers that river side communities are facing."

Credit: 
University of Southampton

Researchers discover new strategy in the fight against antibiotic resistance

image: The antibacterial substance (top left) prevents the bacteria from cooperating, causing the biofilm to disappear.

Image: 
KU Leuven - MICA Lab

Bioscience engineers from KU Leuven in Belgium have developed a new antibacterial strategy that weakens bacteria by preventing them from cooperating. Unlike with antibiotics, resistance to this strategy does not spread, because non-resistant bacteria come to outnumber resistant ones. The findings are published in Nature Communications.

Traditional antibiotics kill or reduce the activity of individual bacteria. Some bacteria become resistant to these antibiotics, allowing them to grow further and take over from non-resistant ones. The use of antibiotics therefore causes more and more bacteria to become resistant to antibiotics.

Bacteria, however, also exhibit group behaviour: for example, they can make a protective slime layer or biofilm that envelops their entire bacterial community. Dental plaque is an example of such a biofilm. Biofilms are often the source of bacterial infections. The social behaviour of bacteria is an interesting new target for antibacterial therapy.

Stronger together, weaker alone

The researchers showed that blocking slime production of salmonella bacteria weakens the bacterial community, making it easier to remove. They used a chemical, antibacterial substance that was previously developed at KU Leuven. "Without their protective slime layer, the bacteria can be washed away by mechanical forces and killed more easily by antibiotics, disinfectants or the immune system," says Professor Steenackers of the MICA Lab, lead author of the study.

The scientists then compared the development of bacterial resistance to the new substance with that of classical antibiotics in a so-called evolution experiment. Evolution experiments are used to see how microorganisms adapt to a certain situation. "We saw that the bacteria, as a group, did not become resistant to our antibacterial substance, while this did happen with antibiotics, and quickly so," Steenackers explains. "Moreover, we showed that those bacteria that were resistant to the new antibacterial substance became outnumbered by non-resistant ones."

A resistant bacterium will still be able to produce slime and share it with the non-resistant bacteria in the group. However, this costs energy, while the non-resistant bacteria benefit from the protection free of charge. As a result, non-resistant bacteria can grow faster than the resistant ones, so their share of the population increases. "In contrast to traditional antibiotics, this substance therefore does not cause selection for resistance, but against it. Antimicrobial treatments that stop bacteria from working together can therefore be a viable solution to the current problem of antibiotic resistance."
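This free-rider logic can be sketched as a toy growth model (all parameters invented for illustration): slime producers pay a growth cost that non-producers avoid while still sharing the protection, so the resistant fraction shrinks generation by generation:

```python
growth = 1.0   # baseline per-generation growth rate (assumed)
cost = 0.1     # metabolic cost of producing the shared slime (assumed)

resistant, sensitive = 0.5, 0.5      # start with a 50/50 population
for _ in range(50):
    resistant *= 1 + growth - cost   # producers pay the slime cost...
    sensitive *= 1 + growth          # ...free-riders do not
    total = resistant + sensitive
    resistant, sensitive = resistant / total, sensitive / total

# the resistant (slime-producing) fraction declines toward zero
```

Even a modest cost compounds: here the resistant share drops from half the population to under a tenth within fifty generations, which is the selection-against-resistance effect the researchers describe.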

Pill or coating

"Our aim is to introduce these new antimicrobials into clinical practice," explains Steenackers. "They can be used as a preventive medicine in the form of a pill, or as a coating on implants to reduce the risk of infections." The substance could also be used together with antibiotics.

Furthermore, several applications are possible in agriculture, industry, and even our households. To this end, the researchers collaborate with experts in various application areas, and with producers of animal feed, cleaning products and disinfectants. The researchers are also investigating whether they can reproduce the phenomenon in other forms of microbial collaboration besides biofilms, and with other bacteria. "In the long term, this concept can also be used to develop alternatives to antibiotics," concludes Steenackers.

Credit: 
KU Leuven