Culture

Benefits of integrating cover crop with broiler litter in no-till dryland cotton systems

Although most cotton in the Mississippi Delta region is grown in floodplain soils, a large amount is also grown under no-till systems on upland soils that are vulnerable to erosion and low in organic matter. These systems produce much less cotton residue, which limits the ability of the no-till approach to improve soil health.

Repeated broiler litter applications in these systems expose the litter and its nutrients to risk of loss, reduce its effectiveness as a nutrient source, and can lower yields. In contrast, integrating a cover crop with broiler litter in no-till dryland cotton offers many benefits, including improved soil health indicators and increased plant residue, cotton yield, infiltration, and water storage.

In the webcast "Manure and Cover Crop Management Practices on Dryland No-Till Cotton System in Mississippi," USDA-ARS research soil scientist Ardeshir Adeli provides a basis for farmers and producers who want to adopt cover crop management practices to maintain the fertilizer value of broiler litter and reduce the use of purchased inorganic fertilizers. In doing so, growers can maximize their net returns and protect the environment. This presentation also serves as a guideline for broiler producers and helps agricultural consultants and the Natural Resources Conservation Service develop plans for nutrient management.

This 23-minute presentation is available through the "Focus on Cotton" resource on the Plant Management Network. This resource contains more than 75 webcasts, along with presentations from six conferences, on a broad range of aspects of cotton crop management: agronomic practices, diseases, harvest and ginning, insects, irrigation, nematodes, precision agriculture, soil health and crop fertility, and weeds. These webcasts are open access (no subscription required).

Credit: 
American Phytopathological Society

Genes controlling mycorrhizal colonization discovered in soybean

image: A University of Illinois/USDA Agricultural Research Service study has identified genes related to mycorrhizal fungus colonization in soybeans.

Image: 
Michelle Pawlowski, University of Illinois

URBANA, Ill. - Like most plants, soybeans pair up with soil fungi in a symbiotic mycorrhizal relationship. In exchange for a bit of sugar, the fungus acts as an extension of the root system to pull in more phosphorus, nitrogen, micronutrients, and water than the plant could on its own.

Mycorrhizal fungi occur naturally in soil and are commercially available as soil inoculants, but new research from the University of Illinois suggests not all soybean genotypes respond the same way to their mycorrhizal relationships.

"In our study, root colonization by one mycorrhizal species differed significantly among genotypes and ranged from 11 to 70%," says Michelle Pawlowski, postdoctoral fellow in the Department of Crop Sciences at Illinois and co-author on a new study in Theoretical and Applied Genetics.

To arrive at that finding, Pawlowski grew 350 diverse soybean genotypes in pots filled with spores of a common mycorrhizal fungus. After six weeks, she looked at the roots under a microscope to evaluate the level of colonization.

"It was a little bit of a gamble because we didn't know much about soybean's relationship with mycorrhizae and did not know if differences in colonization among the soybean genotypes would occur. So when we screened the soybean genotypes and found differences, it was a big relief," Pawlowski says. "That meant there was a potential to find genetic differences, too."

The process of root colonization starts before fungal spores even germinate in the soil. Roots exude chemicals, triggering spores to germinate and grow toward the root. Once the fungus makes contact, there's a complex cascade of reactions in the plant that prevents the usual defensive attack against invading pathogens. Instead, the plant allows the fungus to enter and set up shop inside the root, where it creates tiny tree-like structures known as arbuscules; these are where the fungus and plant trade sugar and nutrients.

The study suggests there is a genetic component to root colonization rates in soybean. To find it, Pawlowski compared the genomes of the 350 genotypes and homed in on six genomic regions associated with differing levels of colonization.

"We were able to use all the information we have on the soybean genome and gene expression to find possible causal genes within these six regions," she says.

According to the study, the genes control chemical signals and pathways that call fungus toward roots, allow the plant to recognize mycorrhizal fungus as a "good guy," help build arbuscules, and more. "For almost every step in the colonization process, we were finding related genes within those regions," Pawlowski says.

Knowing which genes control root colonization could lead breeders to develop soybean cultivars with a higher affinity for mycorrhizal fungus, which could mean improved nutrient uptake, drought tolerance, and disease resistance.

"This environmentally friendly approach to improving soybean production may also help reduce the overuse of fertilizers and pesticides and promote more holistic crop production systems," says Glen Hartman, plant pathologist in the Department of Crop Sciences and crop pathologist for USDA-ARS.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Story tips: Weather days, grid balance and scaling reactors

image: The microgrid for the Smart Neighborhood in Hoover, Alabama, consists of solar panels and a battery pack and allows homes to disconnect from the main power grid.

Image: 
Southern Company

Energy - Whatever the weather

To better determine the potential energy cost savings among connected homes, researchers at Oak Ridge National Laboratory developed a computer simulation to more accurately compare energy use on similar weather days.

"Since no two weather days are alike, we created a simulated weather identification model that keeps environmental impacts such as temperature changes and sunlight consistent," said ORNL's Supriya Chinthavali. "This will help address the challenge of quantifying energy cost savings, which utility companies and homeowners are most interested in."
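The details of ORNL's weather identification model are not given here, but the core idea of comparing energy use on similar weather days can be sketched in a few lines. The snippet below is a minimal illustration under stated assumptions, not the actual model: each day is summarized by two hypothetical features (mean temperature and solar irradiance), and the closest historical day is chosen so that usage is compared across matched weather rather than identical calendar dates.

```python
from math import sqrt

def most_similar_day(target, candidates):
    """Return the candidate day whose weather profile is closest to target.

    Each day is a dict with hypothetical features: mean temperature (deg C)
    and daily solar irradiance (kWh/m^2). Distance here is plain Euclidean;
    a real model would normalize and weight the features.
    """
    def dist(a, b):
        return sqrt((a["temp_c"] - b["temp_c"]) ** 2 +
                    (a["solar_kwh_m2"] - b["solar_kwh_m2"]) ** 2)
    return min(candidates, key=lambda day: dist(target, day))

# Compare a connected home's usage on a given day against baseline usage
# on the most similar weather day, not the same calendar day.
target = {"date": "2020-01-06", "temp_c": 4.0, "solar_kwh_m2": 2.1}
baseline_days = [
    {"date": "2019-12-30", "temp_c": 3.8, "solar_kwh_m2": 2.0},
    {"date": "2019-12-31", "temp_c": 12.5, "solar_kwh_m2": 4.7},
]
match = most_similar_day(target, baseline_days)
```

With these toy numbers, the cool, overcast December 30 is selected as the comparison day, while the mild, sunny December 31 is rejected despite being more recent.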

The team is analyzing energy use data from Smart Neighborhood®, a neighborhood-level research platform comprising 62 homes powered by both traditional electric grid and microgrid sources.

The goal is to co-optimize energy cost, comfort, environment and reliability by controlling the connected homes' devices - particularly the HVAC and water heater, a home's largest energy consumers.

Future analysis by ORNL, Southern Company and university partners will include potential energy cost savings details.

Media Contact: Sara Shoemaker, 865.576.9219; shoemakerms@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-01/alabama%20power%20smart%20neighborhood%20microgrid.jpg

Caption: The microgrid for the Smart Neighborhood in Hoover, Alabama, consists of solar panels and a battery pack and allows homes to disconnect from the main power grid. Credit: Southern Company

Image: https://www.ornl.gov/sites/default/files/2020-01/04.09.TD-SMartHome.jpg

Caption: The Smart Neighborhood in Hoover, Alabama, a 62-home development, is connected to a microgrid operated by ORNL's open-source controller. The research is sponsored by the DOE Building Technologies Office and supports BTO's Grid-Interactive Efficient Buildings strategy. Credit: Southern Company

Grid - Below-ground balancing

Oak Ridge National Laboratory researchers created a geothermal energy storage system that could reduce peak electricity demand in homes by up to 37% while helping balance grid operations.

The system is installed underground and stores excess electricity from renewable resources like solar power as thermal energy through a heat pump. The system comprises underground tanks containing water and phase change materials that absorb and release energy when transitioning between liquid and solid states.

ORNL's design relies on inexpensive materials and is installed at shallow depths to minimize drilling costs. The stored energy can provide hours of heating in the winter or cooling in the summer, shaving peak demand and helping homeowners avoid buying electricity at peak rates.

"Shifting demand during peak times can help utilities better manage their loads while saving consumers money and encouraging greater use of renewable energy," said ORNL's Xiaobing Liu.

The team published results of the system's performance from a simulation.

Media Contact: Stephanie Seay, 865.576.9894; seaysg@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-01/Geothermal_graphic.jpg

Caption: ORNL researchers have developed a system that stores electricity as thermal energy in underground tanks, allowing homeowners to reduce their electricity purchases during peak periods while helping balance the power grid. Credit: Andy Sproles/Oak Ridge National Laboratory, U.S. Dept. of Energy

Reactors - Quality codes to scale

Nuclear scientists at Oak Ridge National Laboratory have established a Nuclear Quality Assurance-1 program for a software product designed to simulate today's commercial nuclear reactors - removing a significant barrier for industry adoption of the technology.

The suite of tools, called VERA (the Virtual Environment for Reactor Applications) and developed by the Consortium for Advanced Simulation of Light Water Reactors, or CASL, can be used to solve various challenges in nuclear reactor operations. It consists of several physics codes covering neutron transport, thermal hydraulics, fuel performance, and coolant chemistry.

The goals of the continued work in improving the simulation software are to help industry by boosting the power output from existing reactors and to improve designs and confidence in current and future reactors.

ORNL's Shane Stimpson co-leads MPACT, the component of VERA responsible for modeling power distribution throughout the reactor core.

"When developing these codes, we're listening to industry's needs to provide reactor simulations with broader appeal and value," he said.

Media Contact: Sara Shoemaker, 865.576.9219; shoemakerms@ornl.gov

Image: https://www.ornl.gov/sites/default/files/2020-01/VERA-NQA1.png

Caption: Oak Ridge National Laboratory has established a quality assurance program as part of an effort to highlight software simulation product quality and make the codes more useful to industry. Credit: Benjamin Collins/Oak Ridge National Laboratory, U.S. Dept. of Energy

Credit: 
DOE/Oak Ridge National Laboratory

Progesterone from an unexpected source may affect miscarriage risk

About twenty percent of confirmed pregnancies end in miscarriage, most often in the first trimester, for reasons ranging from infection to chromosomal abnormality. But some women have recurrent miscarriages, a painful pattern that points to underlying issues. Clinical studies have been uneven, but some evidence shows that for women with a history of recurrent miscarriage, taking progesterone early in pregnancy might modestly improve their chances of carrying the pregnancy to term.

A recent study in the Journal of Lipid Research sheds some light on a new facet of progesterone signaling between maternal and embryonic tissue, and hints at a preliminary link between disruptions to this signaling and recurrent miscarriage.

Progesterone plays an important role in embedding the placenta into the endometrium, the lining of the uterus. The hormone is key for thickening the endometrium, reorganizing blood flow to supply the uterus with oxygen and nutrients, and suppressing the maternal immune system.

Progesterone is made in the ovary as a normal part of the menstrual cycle, and at first, this continues after fertilization. About six weeks into pregnancy, the placenta takes over making progesterone, a critical handoff. (The placenta also makes other hormones, including human chorionic gonadotropin, which is detected in a pregnancy test.) Placental progesterone comes mostly from surface tissue organized into fingerlike projections that integrate into the endometrium and absorb nutrients. Some cells leave those projections and migrate into the endometrium, where they help to direct the reorganization of arteries.

Using cells from terminated pregnancies, Austrian researchers led by Sigrid Vondra and supervised by Jürgen Pollheimer and Clemens Röhrl compared the cells that stay on the placenta's surface with those that migrate into the endometrium. They discovered that the enzymes responsible for progesterone production differ between the two cell types early in pregnancy.

As a steroid hormone, progesterone is derived from cholesterol. Although overall progesterone production appears to be about the same in migratory and surface cells, migratory cells accumulate more cholesterol and express more of a key enzyme for converting cholesterol to progesterone. Among women who have had recurrent miscarriages, levels of that enzyme are lower in migratory placental cells than in the corresponding cells from women with healthy pregnancies. In contrast, enzyme levels do not differ between healthy and miscarried pregnancies in cells from the surface of the placenta.

The team's findings suggest that production of progesterone by the migratory cells may have a specific and necessary role in early pregnancy and that disruption to that process could be linked to miscarriage.

Credit: 
American Society for Biochemistry and Molecular Biology

Ghost worms mostly unchanged since the age of dinosaurs

image: Upper specimen: Stygocapitella josemariobrancoi from a beach close to Plymouth, UK. Lower specimen: Stygocapitella furcata from the 4th of July beach on San Juan Island, WA, USA

Image: 
José Cerca, Christian Meyer, Günter Purschke, Torsten H. Struck

That the size, shape and structure of organisms can evolve at different speeds is well known, ranging from fast-evolving adaptive radiations such as cichlids to living fossils such as coelacanths.

A team led by biologists at the Natural History Museum (University of Oslo) has uncovered a group of species in which change in appearance seems to have come to a complete halt.

The tiny annelid worms belonging to the genus Stygocapitella live in sandy beaches around the world. Over their 275-million-year history, the worms have evolved into ten distinct species.

What makes the group stand out, however, is that it presents only four distinct appearances, or morphotypes. Such absence of morphological change has lately proven to be a common feature of many so-called cryptic species complexes, for example in mammals, snails, crustaceans and jellyfishes.

- Cryptic species are species that have been distinct for a substantial amount of time but have accumulated very little or no morphological difference.

- Such species can help us understand how evolution proceeds in the absence of morphological evolution, and which factors might be important in these cases, explains professor Torsten Struck at the Natural History Museum (University of Oslo).

Two of the Stygocapitella species investigated split apart at the time when Stegosaurus and Brachiosaurus roamed the Earth.

But despite 140 million years of evolution, these ghost worms today look almost exactly the same. However, looks may be deceiving. Molecular investigations reveal that they are highly genetically distinct, and considered reproductively isolated species.

In comparison to other cryptic-species complexes that are separated by a maximum of a couple million years, the time span in this complex is ten times longer, which makes the lack of change in ghost worms extreme.

- These species can also be studied to understand how species respond to extreme ecological changes in the long run. Some of these morphotypes have experienced the much warmer conditions of the Cretaceous as well as the changing intervals of the ice ages, says Struck.

What makes the case of Stygocapitella particularly puzzling is that closely related taxa seem to be evolving morphotypes significantly faster. The findings therefore highlight that evolutionary change in appearance should be viewed as a continuum, ranging from accelerated to decelerated, and where the investigated worms stand out as one of the more extreme cases of the latter. The study also points out that species formation is not necessarily accompanied by morphological changes.

The researchers suggest that lack of morphological change may be linked to the worms having adapted to an environment that has changed little over time.

- Beaches have always been around and had the same composition then as now. We suspect these worms have remained in the same environment for millions and millions of years, and they are well adapted to it. We suspect they have become good at moving around without changing much, explains the study's first author, PhD fellow José Cerca.

- Alternatively, it has been suggested that populations regularly crash to only a few surviving individuals, and newly evolved characters get eliminated in the course of these events. Finally, besides or instead of the environment, their development may constrain their evolution.

However, the reasons for the slow rate of change remain inconclusive in the current study and are to be tested by the group in the future.

Credit: 
Natural History Museum, University of Oslo

New study unravels the complexity of childhood obesity

image: Nitesh Chawla, the Frank M. Freimann Professor of Computer Science and Engineering at Notre Dame, director of the Center for Network and Data Science and a lead author of the study.

Image: 
University of Notre Dame

The World Health Organization has estimated more than 340 million children and adolescents ages 5-19 are overweight or obese, and the epidemic has been linked to more deaths worldwide than those caused by being underweight.

The Centers for Disease Control and Prevention recently reported that an estimated 1 in 5 children in the United States, ages 12-18, are living with prediabetes -- increasing their risk of developing type 2 diabetes as well as chronic kidney disease, heart disease and stroke.

Efforts to stem the crisis have led clinicians and health professionals to examine both the nutritional and psychological factors of childhood obesity. In a new study led by the University of Notre Dame, researchers examined how various psychological characteristics of children struggling with their weight, such as loneliness, anxiety and shyness, combined with similar characteristics of their parents or guardians and family dynamics affect outcomes of nutritional intervention.

What they found was a "network effect," suggesting a personalized, comprehensive approach to treatment could improve results of nutritional interventions.

"Psychological characteristics clearly have interactional effects," said Nitesh Chawla, the Frank M. Freimann Professor of Computer Science and Engineering at Notre Dame, director of the Center for Network and Data Science and a lead author of the study. "We can no longer simply view them as individualized risk factors to be assessed. We need to account for the specific characteristics for each child, viewing them as a holistic set for which to plan treatment."

The Notre Dame team collaborated with the Centre for Nutritional Recovery and Education (CREN), a not-for-profit, nongovernmental nutritional clinic in São Paulo, Brazil, where patients participate in a two-year interdisciplinary treatment program including family counseling, nutritional workshops and various physical activities. Researchers analyzed the medical records and psychological assessments of 1,541 children who participated in the program.

The study's key takeaway points to the significant impact parents and guardians have on their child's health when it comes to nutrition. Strong family dynamics, such as concern for behavior and treatment and a sense of protectiveness for the child, led to improved outcomes of nutritional interventions. A lack of authority, however, led to minimal changes in results.

"This is quantitative evidence of the success and failure of interactions as they relate to the characteristics and interactions between the child and the parent or guardian," Chawla said.

The study also highlights the need for clinics to expand their views on patient populations. For example, while treatment programs that incorporate the development of interpersonal relationships -- familial and otherwise -- may improve outcomes of nutritional interventions, the same treatment plan may not have the same result for children experiencing loneliness coupled with anxiety.

"For the group without anxiety, this makes sense when you consider a treatment plan focused on strengthening a child's social circle and addressing issues stemming from loneliness, such as a poor social network, bullying or self-imposed isolation," said Gisela M.B. Solymos, co-author of the study, former general manager of CREN and former guest scholar at the Kellogg Institute for International Studies at Notre Dame and at the Center for Network and Data Science. "But patients feeling loneliness and anxiety actually showed minimal changes to nutritional interventions, and may be more likely to benefit from additional services at clinics like CREN."

Co-authors of the study include Keith Feldman, also at Notre Dame, and Maria Paula Albuquerque at CREN.

The National Science Foundation partially funded the study.

Credit: 
University of Notre Dame

Ooh là là! Music evokes 13 key emotions. Scientists have mapped them

image: Scientists mapped music samples according to the 13 key emotions triggered in more than 2,500 people in the United States and China when they listened to the audio clips.

Image: 
Graphic by Alan Cowen

The "Star-Spangled Banner" stirs pride. Ed Sheeran's "Shape of You" sparks joy. And "ooh là là!" best sums up the seductive power of George Michael's "Careless Whisper."

Scientists at the University of California, Berkeley, have surveyed more than 2,500 people in the United States and China about their emotional responses to these and thousands of other songs from genres including rock, folk, jazz, classical, marching band, experimental and heavy metal.

The upshot? The subjective experience of music across cultures can be mapped within at least 13 overarching feelings: amusement, joy, eroticism, beauty, relaxation, sadness, dreaminess, triumph, anxiety, scariness, annoyance, defiance, and feeling pumped up.

"Imagine organizing a massively eclectic music library by emotion and capturing the combination of feelings associated with each track. That's essentially what our study has done," said study lead author Alan Cowen, a UC Berkeley doctoral student in neuroscience.

The findings are set to appear this week in the journal Proceedings of the National Academy of Sciences.

"We have rigorously documented the largest array of emotions that are universally felt through the language of music," said study senior author Dacher Keltner, a UC Berkeley professor of psychology.

Cowen translated the data into an interactive audio map, where visitors can move their cursors to listen to any of thousands of music snippets to find out, among other things, if their emotional reactions match how people from different cultures respond to the music.

Potential applications for these research findings range from informing psychological and psychiatric therapies designed to evoke certain feelings to helping music streaming services like Spotify adjust their algorithms to satisfy their customers' audio cravings or set the mood.

While both U.S. and Chinese study participants identified similar emotions -- such as feeling fear when hearing the "Jaws" movie score -- they differed on whether those emotions made them feel good or bad.

"People from different cultures can agree that a song is angry, but can differ on whether that feeling is positive or negative," said Cowen, noting that positive and negative values, known in psychology parlance as "valence," are more culture-specific.

Furthermore, across cultures, study participants mostly agreed on general emotional characterizations of musical sounds, such as angry, joyful and annoying. But their opinions varied on the level of "arousal," which refers in the study to the degree of calmness or stimulation evoked by a piece of music.

For the study, more than 2,500 people in the United States and China were recruited via Amazon Mechanical Turk's crowdsourcing platform.

First, volunteers scanned thousands of videos on YouTube for music evoking a variety of emotions. From those, the researchers built a collection of audio clips to use in their experiments.

Next, nearly 2,000 study participants in the United States and China each rated some 40 music samples based on 28 different categories of emotion, as well as on a scale of positivity and negativity, and for levels of arousal.

Using statistical analyses, the researchers arrived at 13 overall categories of experience that were preserved across cultures and found to correspond to specific feelings, such as being "depressing" or "dreamy."
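The paper's statistical modeling is far richer than can be shown here, but the core idea of collapsing 28 rated categories into fewer shared dimensions can be illustrated with a toy sketch: categories whose rating profiles across clips correlate strongly get grouped together. Everything below — the category names, ratings, and threshold — is hypothetical and purely illustrative, not the study's actual method or data.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length rating vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlated_groups(ratings, threshold=0.7):
    """Greedily merge categories whose rating profiles correlate strongly.

    `ratings` maps each emotion category to its ratings across clips.
    Categories that track each other (r >= threshold with a group's
    first member) collapse into one group, mimicking how 28 rated
    categories can reduce to a smaller set of shared dimensions.
    """
    groups = []
    for name, profile in ratings.items():
        for group in groups:
            if pearson(ratings[group[0]], profile) >= threshold:
                group.append(name)
                break
        else:
            groups.append([name])
    return groups

# Toy data: rows are emotion categories, columns are five music clips.
ratings = {
    "joy":       [0.9, 0.8, 0.1, 0.2, 0.85],
    "amusement": [0.8, 0.9, 0.2, 0.1, 0.80],
    "sadness":   [0.1, 0.2, 0.9, 0.8, 0.15],
}
groups = correlated_groups(ratings)
```

On this toy data, "joy" and "amusement" collapse into one group because listeners rate them nearly in lockstep, while "sadness" stays separate — a miniature version of how redundant categories fold into broader dimensions.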

To ensure the accuracy of these findings in a second experiment, nearly 1,000 people from the United States and China rated over 300 additional Western and traditional Chinese music samples that were specifically intended to evoke variations in valence and arousal. Their responses validated the 13 categories.

Vivaldi's "Four Seasons" made people feel energized. The Clash's "Rock the Casbah" pumped them up. Al Green's "Let's Stay Together" evoked sensuality, and Israel Kamakawiwoʻole's "Somewhere Over the Rainbow" elicited joy.

Meanwhile, heavy metal was widely viewed as defiant and, just as its composer intended, the shower scene score from the movie "Psycho" triggered fear.

Researchers acknowledge that some of these associations may be based on the context in which the study participants had previously heard a certain piece of music, such as in a movie or YouTube video. But this is less likely the case with traditional Chinese music, with which the findings were validated.

Cowen and Keltner previously conducted a study in which they identified 27 emotions in response to evocative YouTube video clips. For Cowen, who comes from a family of musicians, studying the emotional effects of music seemed like the next logical step.

"Music is a universal language, but we don't always pay enough attention to what it's saying and how it's being understood," Cowen said. "We wanted to take an important first step toward solving the mystery of how music can evoke so many nuanced emotions."

Credit: 
University of California - Berkeley

Biomarker predicts which patients with heart failure have a higher risk of dying

image: A UCLA-led study revealed a new way to predict which patients with 'stable' heart failure -- those who have heart injury but do not require hospitalization -- have a higher risk of dying within one to three years.

Image: 
DEAN ISHIDA

A UCLA-led study revealed a new way to predict which patients with "stable" heart failure -- those who have heart injury but do not require hospitalization -- have a higher risk of dying within one to three years.

Although people with stable heart failure have similar characteristics, some have rapid disease progression while others remain stable. The research shows that patients with higher levels of neuropeptide Y, a molecule released by the nervous system, are 10 times more likely to die within one to three years than those with lower levels of the neuropeptide.

About half of people who develop heart failure die within five years of their diagnosis, according to an American Heart Association report, but it hasn't been understood why some live longer than others despite receiving the same medications and medical device therapy.

The researchers set out to determine whether a biomarker of the nervous system could help explain the difference.

To date, no other biomarker has been identified that can so specifically predict the risk of death for people with stable heart failure.

The researchers analyzed blood from 105 patients with stable heart failure, searching for a distinct biomarker in the blood that could predict how likely a person would be to die within a few years. They found that neuropeptide Y levels were the clearest and most significant predictor.

The scientists also compared nerve tissue samples from patients with samples from healthy donors and determined that the neurons in the people most at risk of dying from heart failure were likely releasing higher levels of neuropeptides.

The results could give scientists a way to distinguish very-high-risk patients with stable heart failure from others with the same condition, which could inform which patients might require more aggressive and targeted therapies. The study also highlights the need for heart failure therapies that target the nervous system.

Further studies could help determine whether a patient's risk for death can be ascertained through less invasive measures, such as a simple blood draw, and whether early aggressive intervention in these people could reduce their risk of death.

Credit: 
University of California - Los Angeles Health Sciences

Some genetic sequencing fails to analyze large segments of DNA

image: A re-analysis of clinical tests from three major U.S. laboratories showed whole exome sequencing routinely failed to adequately analyze large segments of DNA. UT Southwestern experts who conducted the review say the findings are indicative of a widespread issue for clinical laboratories.

Image: 
UTSW

Highlights:

Reanalysis of patient samples from three U.S. labs shows most tests left more than a quarter of genes inadequately analyzed.

Chance of detecting a disorder varied widely depending on which genes the lab completely analyzed in a given sample.

DALLAS - Jan. 6, 2020 - Children who undergo expansive genetic sequencing may not be getting the thorough DNA analysis their parents were expecting, say experts at UT Southwestern Medical Center.

A review of clinical tests from three major U.S. laboratories shows whole exome sequencing routinely fails to adequately analyze large segments of DNA, a potentially critical deficiency that can prevent doctors from accurately diagnosing potential genetic disorders, from epilepsy to cancer.

The reanalysis by UT Southwestern shows each lab on average adequately examined less than three-quarters of the genes (34, 66, and 69 percent coverage, respectively) and had startlingly wide gaps in its ability to detect specific disorders.

Researchers say they conducted the study because they believe vast differences in testing quality are endemic in clinical genetic sequencing but have not been well documented or shared with clinicians.

"Many of the physicians who order these tests don't know this is happening," says Jason Park, M.D., Ph.D., associate professor of pathology at UT Southwestern. "Many of their patients are young kids with neurological disorders, and they want to get the most complete diagnostic test. But they don't realize whole exome sequencing may miss something that a more targeted genetic test would find."

Whole exome sequencing, a technique for analyzing protein-producing genes, is increasingly used in health care to identify genetic mutations that cause disease - mostly in children but also in adults with rare or undiagnosed diseases. However, Park says the process of fully analyzing the approximately 18,000 genes in an exome is inherently difficult and prone to oversights. About half the tests do not pinpoint a mutation.

The new study published in Clinical Chemistry gives insight into why some analyses may be coming back negative.

Researchers reanalyzed 36 patients' exome tests conducted between 2012 and 2016 - 12 from each of the three national clinical laboratories - and found starkly contrasting results and inconsistency in which genes were completely analyzed. A gene was considered completely analyzed only if the lab met an industry-accepted threshold for adequate analysis of all DNA that encodes protein, defined as sequencing each such segment at least 20 times per test.
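The coverage criterion described above can be illustrated with a minimal sketch. The gene names and per-base read depths below are hypothetical; the only detail taken from the study is the 20x threshold, under which a gene counts as "completely analyzed" only if every protein-coding base reached that depth.

```python
MIN_DEPTH = 20  # industry-accepted threshold cited in the study

def completely_analyzed(depths):
    """True if every coding-region position reached the minimum read depth."""
    return all(d >= MIN_DEPTH for d in depths)

# Toy per-base read depths for three hypothetical genes
genes = {
    "GENE_A": [25, 30, 22, 21],   # every base >= 20x -> adequately covered
    "GENE_B": [25, 19, 40, 35],   # one base at 19x -> fails the criterion
    "GENE_C": [20, 20, 20, 20],   # exactly at threshold -> passes
}

covered = [name for name, depths in genes.items() if completely_analyzed(depths)]
pct = 100 * len(covered) / len(genes)
print(f"{len(covered)}/{len(genes)} genes completely analyzed ({pct:.0f}%)")
```

Note how a single under-sequenced base disqualifies a gene, which is why per-lab coverage percentages can vary so widely from sample to sample.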

Notably, less than 1.5 percent of the genes were completely analyzed in all 36 samples. A review of one lab's tests showed 28 percent of the genes were never adequately examined and only 5 percent were always covered. Another lab consistently covered 27 percent of the genes.

"And things really start to fall apart when you start thinking about using these tests to rule out a disease," Park says. "A negative exome result is meaningless when so many of the genes are not thoroughly analyzed."

For example, the chances of detecting an epileptic disorder from any of the 36 tests varied widely depending on which genes were analyzed. One lab conducted several patient tests that fully examined more than three quarters of the genes associated with epilepsy, but the same lab had three other patient samples in which less than 40 percent were completely analyzed.

Three tests from another lab came in at under 20 percent.

"When we saw this data we made it a regular practice to ask the labs about coverage of specific genes," says Garrett Gotway, M.D., Ph.D., a clinical geneticist at UT Southwestern who is the corresponding author of the study. "I don't think you can expect complete coverage of 18,000 genes every time, but it's fair to expect 90 percent or more."

The findings build upon previous research that showed similar gaps and disparities in whole genome sequencing, a technique that examines all types of genes, regardless of whether they produce proteins.

Gotway says he hopes the findings will prompt more physicians to ask labs about which genes were covered and push for improved consistency in testing quality. He also encourages physicians - even before ordering the test - to consider whether whole exome sequencing is the best approach for the patient.

"Clinical exomes can be helpful in complex cases, but you probably don't need one if a kid has epilepsy and doesn't have other complicating clinical problems," Gotway says. "There's a decent chance the exome test will come back negative and the parents are still left wondering about the genetic basis for their child's disease."

In those cases, Gotway suggests ordering a smaller genetic test that completely analyzes a panel of genes associated with that disease. He says they're less expensive and just as likely to help physicians find answers.

Credit: 
UT Southwestern Medical Center

Fast action and the right resources are key to treating fulminant myocarditis

DALLAS, Jan. 6, 2020 -- The resources needed to treat fulminant myocarditis - severe inflammation of the heart that develops rapidly - are outlined in a new Scientific Statement (Statement) from the American Heart Association on how best to reduce fatalities from this rare condition. The Statement is published today in the Association's premier cardiovascular journal, Circulation.

Fulminant myocarditis, often caused by a viral infection, comes on suddenly and often with significant severity, resulting in an exceptionally high risk of death from cardiogenic shock (the heart's inability to pump enough blood), fatal arrhythmias (abnormal heartbeats) and multiorgan failure.

With many of today's technological advances, numerous devices can fully support a patient's circulation and oxygenation/ventilation when necessary. The early recognition of fulminant myocarditis, institution of circulatory support and maintenance of end-organ function (especially avoiding prolonged neurologic hypoxemia) can result in favorable outcomes for this previously almost universally fatal condition.

The new Statement details increasing awareness and education of fulminant myocarditis among health care providers to speed evaluation, diagnosis and treatment. Treatment options for optimal outcomes include supporting patients through the use of extracorporeal life support (heart lung machine), percutaneous and durable ventricular assist devices (devices to help the heart pump) and heart transplantation.

"It is fortunate that fulminant myocarditis is rare and that it usually presents in typically younger and healthier patients, rather than critically ill patients seen in the office or emergency department," said Leslie T. Cooper, M.D., FAHA, vice chair of the Statement Writing Group. "This is where there are the greatest opportunities: early diagnosis, rapid treatment and the ability of frontline clinicians to detect the subtle signs and symptoms of this serious condition."

The Statement has been endorsed by the Heart Failure Society of America and the Myocarditis Foundation.

Credit: 
American Heart Association

Scientists develop new method to detect oxygen on exoplanets

image: Conceptual image of water-bearing (left) and dry (right) exoplanets with oxygen-rich atmospheres. Crescents are other planets in the system, and the red sphere is the M-dwarf star around which the exoplanets orbit. The dry exoplanet is closer to the star, so the star appears larger.

Image: 
(NASA/GSFC/Friedlander-Griswold)

Scientists have developed a new method for detecting oxygen in exoplanet atmospheres that may accelerate the search for life.

One possible indication of life, or biosignature, is the presence of oxygen in an exoplanet's atmosphere. Oxygen is generated by life on Earth when organisms such as plants, algae, and cyanobacteria use photosynthesis to convert sunlight into chemical energy.

UC Riverside helped develop the new technique, which will use NASA's James Webb Space Telescope to detect a strong signal that oxygen molecules produce when they collide. This signal could help scientists distinguish between living and nonliving planets.

Since exoplanets, which orbit stars other than our sun, are so far away, scientists cannot look for signs of life by visiting these distant worlds. Instead, they must use a cutting-edge telescope like Webb to see what's inside the atmospheres of exoplanets.

"Before our work, oxygen at similar levels as on Earth was thought to be undetectable with Webb," said Thomas Fauchez of NASA's Goddard Space Flight Center and lead author of the study. "This oxygen signal has been known since the early 1980s from Earth's atmospheric studies but had never been studied for exoplanet research."

UC Riverside astrobiologist Edward Schwieterman originally proposed a similar way of detecting high concentrations of oxygen from nonliving processes and was a member of the team that developed this technique. Their work was published today in the journal Nature Astronomy.

"Oxygen is one of the most exciting molecules to detect because of its link with life, but we don't know if life is the only cause of oxygen in an atmosphere," Schwieterman said. "This technique will allow us to find oxygen in planets both living and dead."

When oxygen molecules collide with each other, they block parts of the infrared light spectrum from being seen by a telescope. By examining patterns in that light, scientists can determine the composition of the planet's atmosphere.

Schwieterman helped the NASA team calculate how much light would be blocked by these oxygen collisions.

Intriguingly, some researchers propose oxygen can also make an exoplanet appear to host life when it does not, because it can accumulate in a planet's atmosphere without any life activity at all.

If an exoplanet is too close to its host star or receives too much star light, the atmosphere becomes very warm and saturated with water vapor from evaporating oceans. This water could then be broken down by strong ultraviolet radiation into atomic hydrogen and oxygen. Hydrogen, which is a light atom, escapes to space very easily, leaving the oxygen behind.

Over time, this process may cause entire oceans to be lost while building up a thick oxygen atmosphere -- even more than could be made by life. So abundant oxygen in an exoplanet's atmosphere may not necessarily mean abundant life, but may instead indicate a history of water loss.

Schwieterman cautions that astronomers are not yet sure how widespread this process may be on exoplanets.

"It is important to know whether and how much dead planets generate atmospheric oxygen, so that we can better recognize when a planet is alive or not," he said.

Schwieterman is a visiting postdoctoral fellow at UCR who will soon start as assistant professor of astrobiology in the Department of Earth and Planetary Sciences.

The research received funding from Goddard's Sellers Exoplanet Environments Collaboration, which is funded in part by the NASA Planetary Science Division's Internal Scientist Funding Model. This project has also received funding from the European Union's Horizon 2020 research and innovation program under the Marie Sklodowska-Curie Grant, the NASA Astrobiology Institute Alternative Earths team, and the NExSS Virtual Planetary Laboratory.

Webb will be the world's premier space science observatory when it launches in 2021. It will allow scientists to solve mysteries in our solar system, look to distant worlds around other stars, and probe the mysterious structures and origins of our universe and our place in it.

Credit: 
University of California - Riverside

Finding a new way to fight late-stage sepsis

COLUMBUS, Ohio - Researchers have developed a way to prop up a struggling immune system to enable its fight against sepsis, a deadly condition resulting from the body's extreme reaction to infection.

The scientists used nanotechnology to transform donated healthy immune cells into a drug with enhanced power to kill bacteria.

In experiments treating mice with sepsis, the engineered immune cells eliminated bacteria in blood and major organs, dramatically improving survival rates.

This work focuses on a treatment for late-stage sepsis, when the immune system is compromised and unable to clear invading bacteria. The scientists are collaborating with clinicians specializing in sepsis treatment to accelerate the drug-development process.

"Sepsis remains the leading cause of death in hospitals. There hasn't been an effective treatment for late-stage sepsis for a long time. We're thinking this cell therapy can help patients who get to the late stage of sepsis," said Yizhou Dong, senior author and associate professor of pharmaceutics and pharmacology at The Ohio State University. "For translation in the clinic, we believe this could be used in combination with current intensive-care treatment for sepsis patients."

The study is published today (Jan. 6, 2020) in Nature Nanotechnology.

Sepsis itself is not an infection - it's a life-threatening systemic response to infection that can lead to tissue damage, organ failure and death, according to the Centers for Disease Control and Prevention. The CDC estimates that 1.7 million adults in the United States develop sepsis each year, and that one in three patients who die in a hospital had sepsis.

This work combined two primary types of technology: using vitamins as the main component in making lipid nanoparticles, and using those nanoparticles to capitalize on natural cell processes in the creation of a new antibacterial drug.

Cells called macrophages are among the first responders in the immune system, with the job of "eating" invading pathogens. However, in patients with sepsis, the numbers of macrophages and other immune cells are lower than normal, and the cells don't function as they should.

In this study, Dong and colleagues collected monocytes from the bone marrow of healthy mice and cultured them in conditions that transformed them into macrophages. (Monocytes are white blood cells that are able to differentiate into other types of immune cells.)

The lab also developed vitamin-based nanoparticles that were especially good at delivering messenger RNA, molecules that translate genetic information into functional proteins.

The scientists, who specialize in messenger RNA for therapeutic purposes, constructed a messenger RNA encoding an antimicrobial peptide and a signal protein. The signal protein enabled the specific accumulation of the antimicrobial peptide in internal macrophage structures called lysosomes, the key location for bacteria-killing activities.

From here, researchers delivered the nanoparticles loaded with that messenger RNA into the macrophages they had produced with donor monocytes, and let the cells take it from there to "manufacture" a new therapy.

"Macrophages have antibacterial activity naturally. So if we add the additional antibacterial peptide into the cell, those antibacterial peptides can further enhance the antibacterial activity and help the whole macrophage clear bacteria," Dong said.

After seeing promising results in cell tests, the researchers administered the cell therapy to mice. The mouse models of sepsis in this study were infected with multidrug-resistant Staphylococcus aureus and E. coli and their immune systems were suppressed.

Each treatment consisted of about 4 million engineered macrophages. Controls for comparison included ordinary macrophages and a placebo. Compared to controls, the treatment resulted in a significant reduction in bacteria in the blood after 24 hours - and for those with lingering bacteria in the blood, a second treatment cleared them away.

Dong considers the lipid nanoparticle delivery of messenger RNA into certain kinds of immune cells applicable to other diseases, and his lab is currently working on development of cancer immunotherapy using this technology.

Credit: 
Ohio State University

New imaging system and artificial intelligence algorithm accurately identify brain tumors

image: Stimulated Raman histologic images of diffuse astrocytoma (left) and meningioma (right).

Image: 
Daniel Orringer, NYU Langone Health

A novel method of combining advanced optical imaging with an artificial intelligence algorithm produces accurate, real-time intraoperative diagnosis of brain tumors, a new study finds.

Published in Nature Medicine on January 6, the study examined the diagnostic accuracy of brain tumor image classification through machine learning, compared with the accuracy of pathologist interpretation of conventional histologic images. The results for both methods were comparable: the AI-based diagnosis was 94.6% accurate, compared with 93.9% for the pathologist-based interpretation.

The imaging technique, stimulated Raman histology (SRH), reveals tumor infiltration in human tissue by collecting scattered laser light, illuminating essential features not typically seen in standard histologic images.

The microscopic images are then processed and analyzed with artificial intelligence, and in under two and a half minutes, surgeons are able to see a predicted brain tumor diagnosis. Using the same technology, after the resection, they are able to accurately detect and remove otherwise undetectable tumor.

"As surgeons, we're limited to acting on what we can see; this technology allows us to see what would otherwise be invisible, to improve speed and accuracy in the OR, and reduce the risk of misdiagnosis," says senior author Daniel A. Orringer, MD, associate professor of Neurosurgery at NYU Grossman School of Medicine, who helped develop SRH and co-led the study with colleagues at the University of Michigan. "With this imaging technology, cancer operations are safer and more effective than ever before."

How the Study Was Conducted

To build the artificial intelligence tool used in the study, researchers trained a deep convolutional neural network (CNN) with more than 2.5 million samples from 415 patients to classify tissue into 13 histologic categories that represent the most common brain tumors, including malignant glioma, lymphoma, metastatic tumors, and meningioma.

In order to validate the CNN, researchers enrolled 278 patients undergoing brain tumor resection or epilepsy surgery at three university medical centers in the prospective clinical trial. Brain tumor specimens were biopsied from patients, split intraoperatively into sister specimens, and randomly assigned to the control or experimental arm.

Specimens routed through the control arm--the current standard practice--were transported to a pathology laboratory and went through specimen processing, slide preparation by technicians, and interpretation by pathologists, a process which takes 20-30 minutes. The experimental arm was performed intraoperatively, from image acquisition and processing to diagnostic prediction via CNN.

Notably, the diagnostic errors in the experimental group were unique from the errors in the control group, suggesting that a pathologist using the novel technique could achieve close to 100% accuracy. The system's precise diagnostic capacity could also be beneficial to centers that lack access to expert neuropathologists.

"SRH will revolutionize the field of neuropathology by improving decision-making during surgery and providing expert-level assessment in the hospitals where trained neuropathologists are not available," says Matija Snuderl, MD, associate professor in the Department of Pathology at NYU Grossman School of Medicine and a co-author of the study.

NYU Langone's Brain and Spine Tumor Center Offers Cutting-Edge Treatment

Dr. Orringer joined NYU Langone in August 2019, bringing with him the SRH technology he helped to develop. NYU Langone's Brain and Spine Tumor Center is the first to offer this technique, using Invenio's NIO Laser Imaging System, in the Northeast.

The newest addition to the center's comprehensive suite of neurosurgical imaging technologies, SRH works in concert with intraoperative MRI and fluorescence-guided surgery to provide high-resolution precision guidance for NYU Langone's world-class neurosurgeons.

"NYU Langone's Department of Neurosurgery has long been a leader in bringing the most advanced treatment options to our patients," says John G. Golfinos, MD, Joseph P. Ransohoff Professor of neurology and chair of the Department of Neurosurgery. "With the addition of Dr. Orringer's expertise and this game-changing technology, we're now even better equipped to provide safe surgeries and quality outcomes for the most complex brain tumor cases."

The implementation of this new system is the most recent of NYU Langone's efforts to integrate artificial intelligence into clinical practice to improve cancer diagnostics. Researchers and clinicians at NYU Langone's Perlmutter Cancer Center have made recent strides in lung cancer, breast cancer, and brain tumors.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Over-hunting walruses contributed to the collapse of Norse Greenland, study suggests

image: Church ruins from Norse Greenland's Eastern Settlement.

Image: 
James H. Barrett

The mysterious disappearance of Greenland's Norse colonies sometime in the 15th century may have been down to the overexploitation of walrus populations for their tusks, according to a study of medieval artefacts from across Europe.

Founded by Erik the Red around 985 AD after his exile from Iceland (or so the Sagas tell us), Norse communities in Greenland thrived for centuries - even gaining a bishop - before vanishing in the 1400s, leaving only ruins.

Latest research from the universities of Cambridge, Oslo and Trondheim has found that, for hundreds of years, almost all ivory traded across Europe came from walruses hunted in seas only accessible via Norse settlements in south-western Greenland.

Walrus ivory was a valuable medieval commodity, used to carve luxury items such as ornate crucifixes or pieces for games like chess and Viking favourite hnefatafl. The famous Lewis chessmen are made of walrus tusk.

However, the study also indicates that, as time wore on, the ivory came from smaller animals, often female, with genetic and archaeological evidence suggesting they were sourced from ever farther north - meaning longer and more treacherous hunting voyages for less reward.

Increasingly globalised trade saw elephant ivory flood European markets in the 13th century, and fashions changed. There is little evidence of walrus ivory imports to mainland Europe after 1400.

Dr James H. Barrett, from the University of Cambridge's Department of Archaeology, argues that the Norse abandonment of Greenland may have been precipitated by a "perfect storm" of depleted resources and volatile prices, exacerbated by climate change.

"Norse Greenlanders needed to trade with Europe for iron and timber, and had mainly walrus products to export in exchange," said Barrett, lead author of the study published in Quaternary Science Reviews.

"We suspect that decreasing values of walrus ivory in Europe meant more and more tusks were harvested to keep the Greenland colonies economically viable."

"Mass hunting can end the use of traditional haul-out sites by walruses. Our findings suggest that Norse hunters were forced to venture deeper into the Arctic Circle for increasingly meagre ivory harvests. This would have exacerbated the decline of walrus populations, and consequently those sustained by the walrus trade."

Other theories for collapse of the colonies have included climate change - the "Little Ice Age", a sustained period of lower temperatures, began in the 14th century - as well as unsustainable farming methods and even the Black Death.

"An overreliance on walrus ivory was not the only factor in Norse Greenland's demise. However, if both the population and price of walrus started to tumble, it must have badly undermined the resilience of the settlements," says co-author Bastiaan Star of the University of Oslo. "Our study suggests the writing was on the wall."

Analysis using carved artefacts would risk damage, so researchers examined pieces of "rostrum": the walrus skull and snout to which tusks remained attached during shipment, creating a protective "package" that got broken up in the ivory workshops of medieval trading centres such as Dublin, Trondheim and Bergen.

In total, the team studied 67 rostra taken from sites across Europe, dating between the 11th and 15th century. Ancient DNA (25 samples) and stable isotopes (31 samples) extracted from samples of bone, as well as tusk socket size, provided clues to the animals' sex and origins.

The stable isotope analysis was conducted by Cambridge's Dorothy Garrod Laboratory for Isotopic Analysis, and the DNA analysis by Oslo's Department of Biosciences.

The researchers also studied traces of "manufacturing techniques" - changing styles of butchery and skull preparation - to help place the walrus remains in history.

While it is impossible to determine exact provenance, the researchers detected a shift in European walrus finds around the 13th century toward walruses from an evolutionary branch most prevalent in the waters around Baffin Bay.

These animals must have been hunted by sailing northwest up the Greenland coast, and more recent specimens were smaller and often female. "If the original hunting grounds of the Greenland Norse, around Disko Bay, were overexploited, they may have journeyed as far north as Smith Sound to find sufficient herds of walrus," said Barrett.

Norse artefacts have previously been found among the remains of 13th and 14th century Inuit settlements in this most northern of regions. One former Inuit camp on an islet off Ellesmere Island contained the rivets of a Norse boat - quite possibly a hunting trip that never returned.

"Ancestors of the Inuit occupied northern Greenland during the time of the Norse colonies. They probably encountered and traded with the Norse," said Barrett. "That pieces of a Norse boat were found so far north hints of the risks these hunters might have ended up taking in their quest for ivory."

Barrett points out that the Inuit of the region favoured female walruses when hunting, so the prevalence of females in Greenland's later exports could imply a growing Norse reliance on Inuit supply.

He says that hunting season for the Norse would have been short, as seas were choked with ice for much of the year. "The brief window of summer would have barely been sufficient for rowing the many hundreds of miles north and back."

The legend of Erik the Red itself may mask what Barrett calls "ecological globalisation": the chasing of natural resources as supply dwindles. Recent research revealed that Greenland might have been settled only after Icelandic walruses were hunted to exhaustion.

Ultimately, after being highly prized for centuries, the marbled appearance of walrus ivory fell out of favour as West African trade routes opened up and the homogenous finish of elephant ivory became de rigueur in the 13th century.

One account suggests that in the 1120s, Norse Greenlanders used walrus ivory to secure their own bishopric from the King of Norway. By 1282, however, the Pope requested that his Greenland tithes be converted from walrus tusk into silver or gold.

"Despite a significant drop in value, the rostra evidence implies that exploitation of walruses may have even increased during the thirteenth and fourteenth centuries," said Barrett.

"As the Greenlanders chased depleted walrus populations ever northwards for less and less return in trade, there must have come a point where it was unsustainable. We believe this 'resource curse' undermined the resilience of the Greenland colonies."

Credit: 
University of Cambridge

2017 San Diego wildfire increased pediatric ER visits for breathing problems

image: Following a 2017 San Diego wildfire, an increased number of children visited the ER with respiratory problems, including asthma.

Image: 
ATS

Jan. 6, 2020--A small wildfire in San Diego County in 2017 resulted in a big uptick in children visiting the emergency room for breathing problems, according to new research published online in the Annals of the American Thoracic Society.

In "Increase in Pediatric Respiratory Visits Associated with Santa Ana Wind-driven Wildfire Smoke and PM2.5 levels in San Diego County," Sydney Leibel, MD, MPH, and co-authors report that the Lilac Fire, which burned from Dec. 7-16, resulted in 16 more visits each day to the ER by children under the age of 19 for breathing complaints. The complaints included difficulty breathing, respiratory distress, wheezing and asthma.

Before it was extinguished, the Lilac Fire burned 4,100 acres. In 2017, wildfires burned more than 1.5 million acres across California, according to the state's Department of Forestry & Fire Protection.

"We conducted this study because wildfires are becoming increasingly common in California," said Dr. Leibel, a pediatric allergist/immunologist at Rady Children's Hospital in San Diego and an assistant professor of pediatrics at UC San Diego School of Medicine. "While there is significant data on the respiratory effects of these wildfires in adults, we wanted to investigate the health effects of wildfire smoke in the vulnerable pediatric population."

In collaboration with the Scripps Institution of Oceanography, the authors also demonstrated how the regional phenomenon known as the Santa Ana Winds has increased the health impacts of these fires in the county.

The researchers also found that children under the age of 12 were more likely to develop breathing problems leading to an ER visit than older children. The authors report that they found a similar pattern of increased visits for respiratory complaints to the county's urgent care centers during the wildfire, especially by younger children.

To account for seasonal changes in ER and urgent care visits, the researchers analyzed electronic medical records from 2011-17. They also analyzed levels of fine particle pollution, known scientifically as PM2.5, over the same time period. The researchers estimated that there was a five-fold increase in these tiny particles during the wildfire.
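The fold-increase estimate above boils down to comparing the mean PM2.5 concentration during the fire against a baseline mean. The sketch below illustrates the arithmetic only; the daily readings are hypothetical values, not data from the study.

```python
# Hypothetical daily PM2.5 readings in micrograms per cubic meter
baseline_pm25 = [8.2, 10.1, 9.5, 7.8, 9.0]      # typical non-fire December days
wildfire_pm25 = [41.0, 55.3, 48.7, 39.9, 46.1]  # days during the fire

baseline_mean = sum(baseline_pm25) / len(baseline_pm25)
wildfire_mean = sum(wildfire_pm25) / len(wildfire_pm25)

# Fold increase = mean concentration during the fire / baseline mean
fold_increase = wildfire_mean / baseline_mean
print(f"Estimated fold increase in PM2.5: {fold_increase:.1f}x")
```

With these illustrative numbers the ratio comes out near five-fold, matching the scale of the increase the researchers report.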

The authors said that the five zip codes with the largest changes in ER and urgent care visits for pediatric respiratory problems were located downwind of the wildfire, which was driven by the Santa Ana Winds blowing from the northeast towards the county's more populated coastal communities.

Given predicted changes in climate and population growth, the authors write that the impact of wildfires in the county is likely to grow in the coming decades.

"Our findings suggest that public health efforts focused on protecting young children with early warning systems and mitigation efforts downwind of Santa Ana Wind-driven wildfires may decrease the impact of these destructive wildfires in the future," Dr. Leibel said.

Credit: 
American Thoracic Society