Culture

LGBTQ military service members at higher risk of sexual harassment, assault, stalking

CORVALLIS, Ore. -- A recent study found that LGBTQ service members face a higher risk of sexual victimization, including harassment, assault and stalking, while in the military than their non-LGBTQ counterparts.

The study, one of the first funded by the Department of Defense to look specifically at LGBTQ victimization in the military, aims to inform future policies that will identify vulnerable populations and appropriate interventions to help prevent such experiences going forward.

Previous research has found that experiencing sexual harassment and assault during military service can lead to negative health outcomes including PTSD, depression, substance use and suicidal behavior, all of which are often reported at higher rates among LGBTQ veterans than in the straight cisgender population.

"We're really trying to understand the experiences and well-being of LGBTQ service members and help the military learn how they can improve those experiences," said lead author Ashley Schuyler, a Ph.D. student in OSU'S College of Public Health and Human Sciences. "Our findings suggest that LGBTQ service members do experience an elevated risk of sexual and stalking victimization, even in this post-'don't ask, don't tell' era."

Published in the Journal of Traumatic Stress last month, the study surveyed 544 active-duty service members, ages 18-54, including about 41% who identified as LGBTQ and roughly 10% who identified as trans or gender-nonconforming.

"Don't ask, don't tell," the law that barred openly gay, lesbian and bisexual people from serving in the military, was repealed in 2011, but "it seems like some of those effects could linger, including sexual prejudice and discrimination, which may elevate victimization risk," Schuyler said.

The researchers considered that the culture of the military, with a high value placed on "masculine" ideals such as dominance, aggression and self-sufficiency, may compel some individuals to act out toward people they see as weaker to prove their masculinity to others.

That environment may explain a disparity between men and women in the study: Female service members were more likely to experience sexual harassment than male service members, but the risk of harassment did not increase among women who identified as lesbian or bisexual. Among male service members, however, gay and bisexual men were significantly more likely to experience sexual harassment than straight men.

"Our conclusion was that female service members have such an elevated risk of sexual harassment in general, that being bi or lesbian doesn't increase that risk," Schuyler said.

Among all service members in the sample, those identifying as gay, lesbian or bisexual had an increased risk of sexual harassment, stalking and sexual assault compared to heterosexual service members.

More research is needed on how stalking manifests in the military, Schuyler said. It may look different on board a ship with service members confined in close quarters for months at a time, for example.

"Something the military has started to acknowledge is this idea of a continuum of harm, where if you experience sexual harassment or gender discrimination behaviors, you're at higher risk of more severe encounters down the road, like assault," she said. "We're trying to understand where stalking fits into that spectrum of experiences, so we can intervene to help people who we know experience harassment or stalking and prevent potential assault in the future."

The researchers recommend further investigation into victimization in the military, especially as the policies governing LGBTQ service continue to change. Such research was not possible during the "don't ask, don't tell" era.

Schuyler said they'd like to see military leaders and health care providers be more educated about identifying victimization experiences and providing supports that are inclusive of LGBTQ people who have experienced sexual harassment, assault or stalking. With an increased understanding of those experiences, leaders can pinpoint targets for intervention to help stop sexual violence before it happens.

Credit: 
Oregon State University

COVID-19 news from Annals of Internal Medicine

Below please find a summary and link(s) of new coronavirus-related content published today in Annals of Internal Medicine. The summary below is not intended to substitute for the full article as a source of information. A collection of coronavirus-related content is free to the public at http://go.annals.org/coronavirus.

Preparing for Battle: How Hospitalists Can Manage the Stress of COVID-19

During the COVID-19 pandemic, hospitalists are on the front lines. As with other pandemics, COVID-19 presents challenges for the well-being of citizens around the globe, resulting from fear of illness, social distancing measures, isolation and quarantines, and protracted uncertainty. The author of a commentary from the Center for the Study of Traumatic Stress, School of Medicine, Uniformed Services University, Bethesda, Maryland, explains how hospitalists and other health care workers face additional unique challenges related to this pandemic. Read the full text: http://annals.org/aim/article/doi/10.7326/M20-1897.

Credit: 
American College of Physicians

New study shows sharp decrease of intimate partner violence in Nicaragua

image: A researcher conducting interviews with women in Leon, Nicaragua

Image: 
Mary Ellsberg/GWI

WASHINGTON (April 21, 2020)--The percentage of women and girls in Nicaragua's second-largest city who reported experiencing physical violence by their partners during their lifetimes decreased from 55% in 1995 to 28% in 2016, according to a new study published in the journal BMJ Global Health. Researchers at the George Washington University's Global Women's Institute (GWI), in partnership with the Autonomous National University of Nicaragua at León and InterCambios, a Nicaraguan nongovernmental organization, recorded the decline in a follow-up study conducted on intimate partner violence (IPV) in the city of León 20 years after the initial prevalence study.

Led by GWI Director Mary Ellsberg, the research team also found that the share of women and girls reporting physical violence by partners in the 12 months preceding their study interviews decreased from 28% to 8%. The team recorded similar decreases in emotional violence over respondents' lifetimes (from 71% to 42%) and over the 12 months preceding the study (from 43% to 23%). No significant difference was found in the prevalence of lifetime sexual violence between the two time periods.

"The only other country to our knowledge with a documented reduction in IPV prevalence is the U.S., where the Justice Department reported a similar decrease in IPV victimization between 1994 to 2012," Dr. Ellsberg said. "That Nicaragua, the second-poorest country in the Western Hemisphere, shows a comparable reduction in IPV to the U.S. is a stunning achievement."

Intimate partner violence is defined as physical violence, sexual violence, stalking or psychological harm by a current or former partner or spouse. Thirty-five percent of women globally experience sexual or physical IPV or nonpartner sexual violence at some point in their lives, according to the World Health Organization. Though efforts to address violence against women and girls around the world have increased over the last 25 years, few studies have covered a long enough period to adequately measure large-scale, sustained reductions in IPV or to identify the strategies that have proved effective at reducing violence.

GWI led the first study to measure population-level change in IPV prevalence over a 20-year period. It conducted the study in León by comparing prevalence of physical, emotional and sexual violence against women and girls between 1995 and 2016. During that time period, the influence of the Nicaraguan women's movement over social policies and the movement's efforts to increase women's knowledge of their rights spurred multiple sectors of Nicaraguan society to address violence against women and girls, resulting in legislative and judicial reforms as well as collaboration among the police, government ministries, civil society organizations and others to protect and support victims.

The study included additional data findings:

Physical IPV

Lifetime (happened at least once during their lifetime) decreased from 55% to 28%.
63% decrease in violence when controlling for possible factors that could influence the results, such as age and education

12-month prevalence (happened in the 12 months preceding study interviews) decreased from 28% to 8%.
71% decrease in violence when controlling for possible factors that could influence the results, such as age and education

Emotional IPV

Lifetime decreased from 71% to 43%.
66% decrease in violence when controlling for possible factors that could influence the results, such as age and education

12-month prevalence decreased from 43% to 23%.
51% decrease in violence when controlling for possible factors that could influence the results, such as age and education

Sexual IPV

Lifetime decreased from 20% to 15%, but that change was not statistically significant.

The researchers noted the reduction in violence was not primarily because of demographic shifts, such as increased education or age, but reflected a true decrease in the prevalence of IPV. They concluded that violence against women and girls is preventable through large-scale, structural interventions undertaken by advocacy groups, civil society organizations, national governments, international donors and other sectors.
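The distinction between the raw prevalence changes and the adjusted percentage decreases listed above can be made concrete with a little arithmetic. The sketch below, which is not from the study, computes the crude relative declines implied by the reported prevalence figures; the adjusted decreases in the findings list come from the study's models controlling for factors such as age and education, which is why the two sets of numbers differ.

```python
# Crude relative declines implied by the prevalence figures reported above.
# The "decrease when controlling for possible factors" percentages in the
# study are adjusted estimates, so they differ from these raw calculations.

prevalence = {
    "physical IPV, lifetime":  (55, 28),
    "physical IPV, 12-month":  (28, 8),
    "emotional IPV, lifetime": (71, 43),
    "emotional IPV, 12-month": (43, 23),
    "sexual IPV, lifetime":    (20, 15),  # change not statistically significant
}

for outcome, (pct_1995, pct_2016) in prevalence.items():
    crude_decline = 100 * (pct_1995 - pct_2016) / pct_1995
    print(f"{outcome}: {pct_1995}% -> {pct_2016}% "
          f"(crude relative decline ~{crude_decline:.0f}%)")
```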

Now, however, the ongoing violence in Nicaragua and the COVID-19 crisis may threaten those gains.

"Defenders of women's rights have been prominent in the movement calling for election reforms and justice for those who were killed or arbitrarily detained," Dr. Ellsberg said. "As for COVID-19, we can assume that many more women will experience domestic violence, so the need to reestablish services and support for women and girls suffering from violence is even more critical."

Credit: 
George Washington University

SCAI, ACC, and ACEP release consensus on managing AMI patients during COVID-19

WASHINGTON - The Society for Cardiovascular Angiography and Interventions (SCAI), the American College of Cardiology (ACC) and the American College of Emergency Physicians (ACEP) have released a consensus statement that provides recommendations for a systematic approach to the care of patients with an acute myocardial infarction (AMI) during the coronavirus disease-2019 (COVID-19) pandemic. The document is jointly published in Catheterization and Cardiovascular Interventions, the official journal of SCAI, and the Journal of the American College of Cardiology.

According to recent studies, cardiovascular disease patients who develop COVID-19 have a higher risk of mortality. However, many patients in need of care for the management of various heart diseases may not be infected with this coronavirus. The document identifies several challenges in providing recommendations for AMI care during the COVID-19 epidemic: cardiovascular manifestations in the COVID-19 patient are complex and variable, the prevalence of COVID-19 in U.S. populations remains unknown, and personal protection equipment (PPE) is not uniformly available.

"During the COVID-19 pandemic we wanted to ensure that patients continue to benefit from the tremendous advances made in the care of patients with cardiovascular disease over the past three decades," said Ehtisham Mahmud, MD, FSCAI, SCAI president and lead author of the writing group. "Primary percutaneous coronary intervention (PCI) is the standard of care for STEMI patients, and in this document, we outline an approach to providing that therapy at PCI-capable hospitals while also ensuring health care worker safety with appropriate PPE."

The writing group recommends informing the public that exposure to the virus can be minimized and that patients should continue to call the Emergency Medical System when experiencing acute ischemic heart disease symptoms, with the intention of primary PCI when indicated. Fibrinolysis at referral hospitals (non-PCI capable) is appropriate with a plan of care for rescue or pharmacoinvasive PCI. The document also provides strategies for maximizing the safety of medical personnel with appropriate use of personal protection equipment and masking of patients.

Credit: 
Society for Cardiovascular Angiography and Interventions

Pulse oximetry monitoring overused in infants with bronchiolitis

Philadelphia, April 21, 2020--Monitoring blood oxygen levels with continuous pulse oximetry is being overused in infants with bronchiolitis who do not require supplemental oxygen, according to a study by researchers at Children's Hospital of Philadelphia (CHOP). The researchers found the use of continuous pulse oximetry occurred frequently and varied widely among hospitals in their sample, despite national recommendations advising against the practice.

The findings were published today in JAMA.

"We all have a tendency to believe that continuous monitoring is something that is always going to provide benefit and safety, and unfortunately that isn't the case," said Christopher P. Bonafide, MD, MSCE, an attending physician at CHOP and first author of the study. "When you monitor patients unnecessarily, it creates risk not only for that patient, in terms of longer hospital stays and increased costs, but also for the entire unit due to the potential for alarm fatigue. Our prior work shows that when alarms go off for both patients who need immediate, life-saving care and those who do not, it diminishes trust in the accuracy of the alarms for signaling true emergencies."

Acute viral bronchiolitis is the leading cause of infant hospitalization and is usually treated with supportive care, including fluids, suctioning, and supplemental oxygen when necessary. The Society of Hospital Medicine Choosing Wisely initiative discourages physicians from using continuous pulse oximetry monitoring in infants with bronchiolitis unless they are on supplemental oxygen, and the American Academy of Pediatrics also recommends against the practice.

To examine the extent to which hospitals were using continuous pulse oximetry in infants with bronchiolitis, the research team conducted an observational study in 56 U.S. and Canadian hospitals in the Pediatric Research in Inpatient Settings Network (PRIS), an independent, hospital-based network. The hospitals in the study included freestanding children's hospitals, children's hospitals within hospitals, and community hospitals. Researchers gathered data throughout one bronchiolitis season, from December 1, 2018, until March 31, 2019, and included 3,612 patients between the ages of 8 weeks and 23 months.

Of the patients in the study who did not receive any supplemental oxygen, 46% were monitored via continuous pulse oximetry. After standardizing the results to account for differences in variables across hospitals that could have influenced monitoring, researchers found the percentage of patients being unnecessarily monitored ranged from 6% to 82%.
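As a rough illustration of the tabulation behind these figures, here is a minimal sketch, with entirely hypothetical data and column names, of computing observed hospital-level monitoring rates among infants not on supplemental oxygen. The published 6% to 82% range was additionally standardized for patient-level differences across hospitals.

```python
# A minimal sketch (not the authors' analysis code) of tabulating hospital-level
# monitoring rates from patient-level observations. Data and column names are
# hypothetical placeholders.
import pandas as pd

obs = pd.DataFrame({
    "hospital": ["A", "A", "A", "B", "B", "C", "C", "C"],
    "on_supplemental_o2": [False] * 8,
    "continuously_monitored": [True, False, True, False, False, True, True, True],
})

# Restrict to infants not receiving supplemental oxygen, then compute the
# observed proportion monitored at each hospital.
eligible = obs[~obs["on_supplemental_o2"]]
rates = eligible.groupby("hospital")["continuously_monitored"].mean()
print(rates)  # the published 6%-82% range was further risk-standardized
```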

"We were surprised by the huge amount of variation we saw across the hospitals in this study, which shows many institutions are using monitoring unnecessarily as a safety net," Bonafide said. "This study represents an essential first step in phasing out an overused, low-value care practice that does not improve outcomes, raises healthcare costs, and leads to alarm fatigue among healthcare workers."

The CHOP-led study was a collaborative, multi-institutional effort that included researchers from CHOP, University of Pennsylvania, Boston Children's Hospital, and Cincinnati Children's Hospital Medical Center. The research was supported by a cooperative agreement awarded by the National Institutes of Health/National Heart, Lung, and Blood Institute (award number U01HL143475).

Credit: 
Children's Hospital of Philadelphia

Humble bug holds key to relieving millions of allergy sufferers in Europe

image: Leaf beetle Ophraella communa.

Image: 
Professor Heinz Müller-Schärer

CABI has led a team of scientists on new research which reveals that a humble bug can help relieve more than 2 million sufferers of allergies in Europe while also saving more than Euro 1 billion in health costs.

Dr Urs Schaffner, lead author of the study published in Nature Communications, says the leaf beetle Ophraella communa can significantly reduce pollen – which causes a range of symptoms from sneezing to itchy eyes and aggravates conditions such as asthma and eczema – from common ragweed (Ambrosia artemisiifolia).

The interdisciplinary study – the first to quantify the economic benefits of biological control in Europe – also argues that the costs inflicted by invasive species in Europe are ‘most probably seriously underestimated’.

The team of scientists from institutions including the University of Fribourg and ETH Zurich, Switzerland, the University of Worcester, UK, and Leiden University, NL, suggest that countries in the Balkan Peninsula – such as Bulgaria, Romania and Serbia – will benefit most from the leaf beetle as a biological control.

Prior to the accidental arrival of the leaf beetle in 2013, some 13.5 million people suffered from ragweed-induced allergies in Europe, causing economic costs of approximately Euro 7.4 billion annually.

In Europe, common ragweed is considered invasive in more than 30 countries, and its spread and impact, the scientists say, are likely to increase with rising temperatures caused by climate change.

Field studies in Italy have proved that the leaf beetle can reduce ragweed pollen by 82 percent. In the Milan area, where the beetle was first detected, up to 100 percent of ragweed plants were attacked and the damage caused was enough to prevent flowering that causes pollen to be released.

Dr Schaffner said, “Our study provides evidence that the impacts of common ragweed on human health and the economy are so far highly underestimated, but that biological control by Ophraella communa might mitigate these impacts in parts of Europe.

“We propose that future assessments of the economic impacts of Invasive Alien Species (IAS) should more thoroughly consider costs related to human health.”

The scientists drew upon information from the European Pollen Monitoring Programme before mapping seasonal total ragweed pollen integrals in Europe between 2004 and 2012 – prior to the introduction of the leaf beetle. They then interpolated data from 296 pollen monitoring sites across Europe.

To validate the estimated number of patients suffering from ragweed pollen allergy, the researchers compared their European-wide assessment with detailed healthcare data from the Rhône-Alpes region in southeastern France.

They then weighted the treatment and lost work time costs at the country level using purchasing power parity-adjusted health expenditures per capita for 2015 to determine the overall economic costs of healthcare to treat the symptoms and other effects of ragweed pollen.

Professor Heinz Müller-Schärer, of the University of Fribourg, said, “We were not sure at first whether the leaf beetle was useful or harmful. Laboratory tests had shown that it was possible that it was harmful to sunflowers. However, field tests in China and Europe could not confirm this finding.”

Dr Schaffner, Professor Müller-Schärer and the other authors conclude that accurately informing policy and management about the impact of IAS on human health, and about the potential savings from implementing mitigation measures, is essential to ensure that reasonable resources are invested and actions are coordinated in IAS management.

Credit: 
CABI

Ultrasound-assisted molecule delivery looks to preserve blood for years

image: Illustration of ultrasound-induced microbubble (MB) rupture causing temporary pores in cell membranes, which enables entry of soluble molecules such as trehalose (not to scale).

Image: 
Jonathan A Kopechek

WASHINGTON, April 21, 2020 -- Ensuring adequate preservation of the millions of units of blood that are donated every year presents a challenge for blood banks, as blood can typically be stored for only six weeks after donation. A potential solution to the problem attempts to dry blood by using a sugar-based preservative that organisms living in some of Earth's most extreme environments produce to weather long periods of dryness. New work in ultrasound technology looks to provide a path to inserting these sugars into human red blood cells, in an effort to help them last for years.

Researchers at the University of Louisville have demonstrated a new way to use ultrasound to create pores in blood cells, which allows the molecule trehalose to enter the cells and prevent their degradation when dried for preservation. By oscillating microscopic bubbles of inert gas with ultrasound, the microfluidic system can increase the number of viable cells that survive rehydration. The researchers discuss their work in this week's Biomicrofluidics, from AIP Publishing.

The approach could lead to ways of increasing the shelf life of blood donations from weeks to the order of years. Such advances would provide a boon to those needing blood in areas where access to donations is difficult, like on the battlefield or in space.

"What's unique about this is there are not many other studies that look at using acoustofluidics to place a molecule like this inside red blood cells," said author Jonathan Kopechek. "It's also interesting, because it allows us to store blood without keeping it cool."

When it's not helping these extremophiles survive, trehalose is a relatively cheap sugar that is so safe that it is used as a preservative for food items, like donut glaze.

The group constructed a spiral-shaped channel that exposed blood cells to trehalose while surrounded by microbubbles. They tuned ultrasonic vibrations using several parameters until the bubbles shook nanosized holes in the membranes of the blood cells, just large enough for trehalose or a closely related fluorescent molecule called fluorescein to enter and just briefly enough to maintain the integrity of the blood.

After confirming that fluorescein could enter cells on test samples, they added trehalose to a new batch of samples, dried the blood, rehydrated it, and performed tests to count how many of the blood cells were still viable after the process.

The ultrasound technique preserved a significantly higher proportion of cells when trehalose was added than when it was left out.

The group looks to improve on the yield for their technique with the hopes of verifying the effectiveness of dry-preserved blood in patients.

Credit: 
American Institute of Physics

NEI researchers link age-related DNA modifications to susceptibility to eye disease

image: DNA methylation is an epigenetic mechanism essential for normal cell development and differentiation, and is also associated with aging and the formation of cancers. When present, DNA methylation generally represses gene expression.

Image: 
NIH Common Fund, adapted from

National Eye Institute (NEI) researchers profiling epigenomic changes in light-sensing mouse photoreceptors have a clearer picture of how age-related eye diseases may be linked to age-related changes in the regulation of gene expression. The findings, published online April 21 in Cell Reports, suggest that the epigenome could be targeted as a therapeutic strategy to prevent leading causes of vision loss, such as age-related macular degeneration (AMD). NEI is part of the National Institutes of Health.

"Our study elucidates the molecular changes and biological pathways linked with aging of rod photoreceptors, light-sensing cells of the retina. Future investigations can now move forward to study how we can prevent or delay vision loss in aging and hopefully reduce the risk of associated neurodegeneration" said the study's lead investigator, Anand Swaroop, Ph.D., senior investigator and chief of the NEI Neurobiology, Neurodegeneration, and Repair Laboratory.

Each organism is born with a genome, a library of genes that control all the body's cellular and tissue functions. Expression of those genes - when information stored in DNA is converted into instructions for making proteins or other molecules - is modulated and maintained by the organism's epigenome. The epigenome tags the DNA code to modify gene expression in ways that can be favorable and unfavorable for survival.

As it turns out, that interplay between the genome and the epigenome evolves as the organism ages. Scientists therefore study epigenomic DNA modifications for clues about why certain diseases develop with advancing age.

To explore how such DNA modifications might influence visual function as we age, Swaroop's team performed whole genome sequencing of DNA methylation changes in mouse rod photoreceptors at four separate stages over the animal's lifetime. DNA methylation is an epigenetic mechanism essential for normal cell development and differentiation, and is also associated with aging and the formation of cancers. When present, DNA methylation generally represses gene expression.

The sequencing was performed at ages three months (young), 12 months (middle-aged), and 18 and 24 months (older). The average lifespan of a mouse is about two years.

Rod photoreceptors are the predominant type of cell in the retina, the light-sensing tissue at the back of the eye. Rod photoreceptors enable dim-light vision, and are critical for the survival of cone photoreceptors that enable daylight and color vision. Rod dysfunction is common in older human adults and can be an early warning sign of AMD and other retinal degenerative diseases.

The researchers identified 2,054 differentially methylated regions across the four mouse age groups, that is, genomic regions with differences in DNA methylation.

"We know that DNA methylation changes are strongly associated with biological age, but prior to this study we had limited understanding of how these alterations correlated with cellular function," Swaroop said. This is the first study to look at DNA methylation changes as animals age. Very few studies have looked at DNA methylation changes in people with AMD, a leading cause of vision loss in people age 50 and older, which can progress even when vision loss is undetectable.

The researchers then analyzed the differentially methylated regions with RNA sequencing data to look more closely at how the mouse genes were transcribing proteins differently in the retina as the animals aged.
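The integration step described above can be pictured with a small, purely illustrative sketch: pair each differentially methylated region's gene with its age-related expression change and ask whether gained methylation tends to accompany reduced expression, as the repressive role of DNA methylation would predict. The gene names and numbers below are placeholders, not results from the study.

```python
# A hypothetical sketch of pairing differentially methylated regions (DMRs)
# with age-related expression changes. Values are illustrative only.
import pandas as pd

dmrs = pd.DataFrame({
    "gene": ["Ndufa4", "Pde6b", "Acot7"],
    "methylation_change": [+0.25, -0.10, +0.15],   # old minus young (fraction methylated)
})
expression = pd.DataFrame({
    "gene": ["Ndufa4", "Pde6b", "Acot7"],
    "log2_fold_change_old_vs_young": [-0.8, +0.3, -0.5],
})

merged = dmrs.merge(expression, on="gene")
# Methylation generally represses expression, so gains in methylation are
# expected to pair with negative fold changes more often than not.
merged["consistent_with_repression"] = (
    merged["methylation_change"] * merged["log2_fold_change_old_vs_young"] < 0
)
print(merged)
```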

Those analyses uncovered distinct shifts in how the genes produced proteins relevant to energy metabolism, mitochondria function, and the longevity of rod photoreceptors, indicating their contribution to age-related disease susceptibility.

Rod photoreceptors require vast amounts of energy to sustain vision and are thus vulnerable to metabolic stresses that accompany aging. Energy deprivation of photoreceptors is believed to be a key driver of neurodegeneration of the retina.

"Neurons, specifically photoreceptors, prefer glucose as a source of energy, but in aging, we surprisingly observed utilization of fatty acids as well. These studies suggest how changes in aging rod functions can make them vulnerable to genetic susceptibility variations and environmental factors, which together cause common blinding aging-associated diseases," Swaroop said.

"Our work provides pivotal connections between aging, the epigenome, dysfunction of the cell's mitochondria, and diseases such as AMD. The findings have broad implications for how we understand age-associated neurodegeneration, not only in the eye, but elsewhere in the body," he said.

"Future studies will assess whether DNA methylation contributes to alterations in the expression of metabolic genes and thus introduce epigenomic editing as a therapeutic possibility for age-related retinal disease," said the study's first author, Ximena Corso-Díaz, Ph.D., a postdoctoral fellow in the Neurobiology, Neurodegeneration, and Repair Laboratory.

Credit: 
NIH/National Eye Institute

HudsonAlpha plant genomics researchers surprised by cotton genome

image: Jane Grimwood, Ph.D., is a faculty investigator and co-director of the HudsonAlpha Genome Sequencing Center.

Image: 
HudsonAlpha Institute for Biotechnology

April 20, 2020 (Huntsville, Ala.) - Plant genomics researchers at HudsonAlpha Institute for Biotechnology announce the surprising results of a cotton sequencing study led by Jane Grimwood, PhD, and Jeremy Schmutz, who co-direct the HudsonAlpha Genome Sequencing Center (HGSC). The goal of the project was to identify differences among wild and domesticated cotton that could be used to reintroduce agriculturally beneficial traits like disease or drought resistance. The results, however, surprised the researchers and led them to unexpected conclusions, as described in their paper in Nature Genetics.

"The importance of this study is that it helps us understand more about cotton fiber development," said Grimwood, who is a faculty investigator at HudsonAlpha. "But perhaps more importantly, it reinforces the surprising concept that wild and domesticated cotton is remarkably similar, leading us to the conclusion that we will need to work on other approaches to generate diversity for cotton species."

For the study, the group sequenced and assembled reference-grade genomes for five different species of allotetraploid cotton and compared them with two diploid cotton genomes. Their genomic analysis showed that two ancestral diploid cotton genomes came together to form what is basically the modern tetraploid cotton between 1 and 1.6 million years ago.

Then, about 8,000 years ago, humans began to domesticate cotton for agriculture. They selected wild plants for cultivation that had desirable traits like stronger fiber or more cotton on each plant. Humans then continued to choose plants for breeding that would improve yield and the quality of the harvested crop.

"When we compared the wild cotton plants to domesticated cotton, we expected to see a genetic bottleneck where many wild traits had been discarded," said Schmutz, a faculty investigator at HudsonAlpha. "What you typically see with these crops is that all the selection has gone into improving production, potentially at the cost of losing beneficial genetic material from the wild species."

What they found, however, surprised them.

The wild and domesticated genomes, it turns out, were incredibly similar.

"In all of the cotton tetraploids, there's less diversity between what are supposed to be different species of cotton than between two humans or even within different cells in a single human body," Schmutz said.

This lack of diversity means that researchers won't be able to reach as easily into the wild cotton gene pool to reintroduce lost traits like disease resistance into cultivated cotton plants.

"We can't only rely on the gene pool to make changes to the plant architecture because those wild genes don't exist. Exploring the route of mining natural diversity won't work. The only real way forward to improve cotton as a crop will be genome editing," Schmutz said.

The result of this surprising discovery has been the launch of new projects for Schmutz, Grimwood and their team at HudsonAlpha. To begin with, they now have complete comparison genomes for multiple species of the same crop plant. This knowledge allows them to "walk" from one species to the other to introduce desirable traits to cultivated cotton.

"We are the first group to do these sorts of large projects where we are sequencing multiple high-quality references from the germplasm with the goal of getting direct comparisons across multiple species," Grimwood said.

"We just sequenced a cotton genome in two days," Schmutz added. "The promise of that speed is that we can start to move from inferred variation where you look at a single reference and infer the variation to looking at multiple references and directly comparing the differences."

One ongoing HGSC project, in collaboration with Clemson University, is trying to accelerate targeted gene editing in cotton. The group is looking at genetic mechanisms that enable fast transformation in cotton so they can bump up the rate of making targeted modifications.

In addition, while the group was surprised to find so much similarity among the cotton genomes, they did find some useful variation. Wild cotton, for instance, does have some more disease resistance triggers than cultivated cotton varieties, which tend to be more vulnerable.

"This is the basis from which we can start to compare what else we can do with existing cotton diversity," Grimwood said. "Breeders have selected for 'improved' strains of cotton based on phenotype changes but they don't necessarily have a full understanding of the changes they are making on the genetic side. With this new information, they can look at what their selections are doing on a genetic level."

Even though the project results were unexpected, the entire team is confident that the newly assembled cotton genomes will lead to direct benefits for cotton producers and the cotton industry.

Don Jones, the Director of Agricultural Research at the nonprofit Cotton Incorporated, said these reference-grade assemblies are significant advancements for improving the sustainability of cotton production.

"The results described in this Nature Genetics publication will facilitate deeper understanding of cotton biology and lead to higher yield and improved fiber while reducing input costs. Growers, the textile industry, and consumers will derive benefit from this high impact science for years to come," Jones said.

Credit: 
HudsonAlpha Institute for Biotechnology

North Pole will be ice-free in summer

image: People standing on Arctic sea-ice.

Image: 
Dirk Notz

Summer Arctic sea-ice is predicted to disappear before 2050, with devastating consequences for the Arctic ecosystem; how often and for how long the ice vanishes will depend on the efficacy of climate-protection measures. These are the results of a new study involving 21 research institutes from around the world, including McGill University.

The North Pole is presently covered by sea-ice all year. Each summer, the area of sea-ice coverage decreases and grows again in winter. However, as a result of global warming, the overall area of the Arctic Ocean covered by sea-ice has reduced rapidly over the past few decades. According to the researchers, this substantially affects the Arctic ecosystem and climate. The sea-ice cover is a hunting ground and habitat for polar bears and seals and keeps the Arctic cool by reflecting sunlight.

"While the Arctic sea-ice extent is decreasing during this transition to an ice-free Arctic, the year-to-year variability in extent greatly increases, making life more difficult for local populations and ice-dependent species," says co-author Bruno Tremblay, Associate Professor in the Department of Atmospheric and Oceanic Sciences at McGill University.

The study published in Geophysical Research Letters analyzed recent results from 40 different climate models. Using these models, the researchers assessed the evolution of Arctic sea-ice cover in a scenario with high CO2 emissions and little climate protection. As expected, summer Arctic sea-ice disappeared quickly in these simulations. Surprisingly, they also found that ice disappeared in some simulations where CO2 emissions were rapidly reduced.

How often the Arctic will lose its sea-ice cover in the future critically depends on future CO2 emissions, the study shows. If emissions are reduced rapidly, ice-free years will occur only occasionally. With higher emissions, the Arctic Ocean will become ice-free in most years. "This tells us that humans still determine how often the Arctic Ocean will be ice-free in the summer, depending on our future level of emissions," says Tremblay.
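To make "ice-free years" concrete, here is a minimal sketch of how such years are typically counted from climate model output, using the conventional threshold of September sea-ice area below 1 million square kilometers. The threshold and the time series below are illustrative assumptions, not figures taken from this article.

```python
# Counting ice-free Septembers in one (synthetic) model run. A year counts as
# ice-free when September Arctic sea-ice area falls below the commonly used
# 1 million km^2 threshold (a convention, not a number quoted in the article).
import numpy as np

ICE_FREE_THRESHOLD_KM2 = 1.0e6

years = np.arange(2020, 2060)
rng = np.random.default_rng(0)
# Hypothetical September sea-ice areas (million km^2), declining with noise.
september_area_mkm2 = np.linspace(4.5, 0.5, years.size) + rng.normal(0, 0.4, years.size)

ice_free = september_area_mkm2 * 1e6 < ICE_FREE_THRESHOLD_KM2
print(f"Ice-free Septembers before 2050: {int(ice_free[years < 2050].sum())}")
first = years[ice_free][0] if ice_free.any() else None
print(f"First ice-free September in this run: {first}")
```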

Credit: 
McGill University

International team develops new model to improve accuracy of storm surge analysis

image: Thomas Wahl, a University of Central Florida assistant professor of engineering and a co-author of the study, says it is important to have good models that can help planners understand and prepare for the negative consequences of a warming climate.

Image: 
University of Central Florida

Accurately predicting how many people are at risk due to sea level rise and storm surges has always challenged scientists, but a new method is improving models that account for the impact of these natural occurrences.

A new international study, published today in the journal Nature Communications, applied a novel statistical method that -- for the first time -- captures the important interactions between tides and storm surges. Storm surges are driven by meteorological effects, such as strong winds and low atmospheric pressure, and their impacts have often been difficult to understand because of the complexity of Mother Nature.
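The study's method is not spelled out in this release, but the role of tide-surge interaction can be illustrated with a simple synthetic experiment: extreme total water levels come out higher if surges are paired with tides at random than if the tendency of the largest surges to avoid the highest tides, reported for some shallow coastal seas, is preserved. Everything in the sketch below is synthetic and only meant to show the direction of the effect.

```python
# An illustrative sketch (not the study's method) of why tide-surge dependence
# matters for extreme water levels. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
tide = 1.5 * np.sin(rng.uniform(0, 2 * np.pi, n))          # tidal level (m)
# Make large surges slightly less likely at the highest tides, a form of
# interaction reported for some shallow coastal seas.
surge = rng.gamma(shape=2.0, scale=0.25, size=n) * (1.0 - 0.2 * (tide > 1.0))

dependent_total = tide + surge                     # keeps the tide-surge pairing
independent_total = tide + rng.permutation(surge)  # destroys the pairing

for label, total in [("with interaction", dependent_total),
                     ("assuming independence", independent_total)]:
    print(f"99.9th percentile water level {label}: {np.percentile(total, 99.9):.2f} m")
```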

"It's important to have good models that can help planners understand and prepare for the negative consequences of a warming climate," says Thomas Wahl, a University of Central Florida assistant professor of engineering and a co-author of the study.

Using the new method, the team found the number of people at risk from coastal flooding and the associated costs to coastal communities may have been previously over-estimated.

For example, the new method in the study found the number of people at risk from coastal flooding along the UK North Sea coast is 5 percent lower and the direct costs associated with it are 7 percent lower.

"Global studies often include uncertainties, because the interplay between the natural processes are ignored, although they largely determine how high the water really piles up along the coast during extreme events," says Arne Arns, an assistant professor of agricultural and environmental Sciences at the University of Rostock in Germany who led the study.

For the U.S. coasts, the reduction is more pronounced, leading to 17 percent fewer people affected and 13 percent lower costs.

"From our results, we cannot really conclude the consequences of future sea level rise will be less serious than we currently anticipate, but it highlights where uncertainties in our methodologies exist and where future research efforts should be directed to better capture all relevant processes," Arns says.

These new estimates can be useful at the global level and at the local level to aid in improving coastal protection, especially when there is a lack of data or limited access to more complex computer models for smaller communities, the authors say.

In the past when scientists conducted similar studies, they used information derived from computer models that approximate nature's physical processes, which cause tides and storm surge. Such models are vital for global assessments, because measurements from all coastal locations do not exist. However, the new method could be more accurate.

"We have now a generalized approach with which we can re-evaluate the outcomes of previous studies and derive more robust conclusions in future assessments," says Wahl, who joined UCF in 2017. He is a member of UCF's National Center for Integrated Coastal Research which conducts research to support coastal communities. The center aims to link the ecological security of coastal ecosystems with the economic security of coastal communities, ensuring the sustainability of coastlines and economy.

Wahl also has a doctorate in civil engineering from the University of Siegen in Germany and has been studying the sea-level rise and its impacts for years, publishing papers and making presentations around the world. Other areas of his research include sustainability of human-natural systems in coastal zones, coastal engineering design concepts, and climate adaptation and resilience.

Credit: 
University of Central Florida

Human pregnancy is weird -- new research adds to the mystery

BUFFALO, N.Y. -- From an evolutionary perspective, human pregnancy is quite strange, says University at Buffalo biologist Vincent Lynch.

"For example, we don't know why human women go into labor," Lynch says. "Human pregnancy tends to last longer than pregnancy in other mammals if you adjust for factors like body size. The actual process of labor tends to last longer than in other animals. And human pregnancy and labor are also much more dangerous."

With these oddities in mind, Lynch and colleague Mirna Marinic set out to investigate the evolution of a gene that helps women stay pregnant: the progesterone receptor gene.

But the results of the study only add to the mystery, says Lynch, PhD, an assistant professor of biological sciences in the UB College of Arts and Sciences.

Unexpected findings about a gene that's critical to pregnancy

Past research has shown that the progesterone receptor gene underwent rapid evolution in humans, and some scientists have suggested that these swift changes occurred because they improved the function of the gene. This is called positive selection.

But Lynch and Marinic's study -- published online on April 17 in the journal PLOS Genetics -- draws a different conclusion.

Their research finds that while the progesterone receptor gene evolved rapidly in humans, there's no evidence to support the idea that this happened because those changes were advantageous. In fact, the evolutionary force of selection was so weak that the gene accumulated many harmful mutations as it evolved in humans, Lynch says.
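Tests for positive selection generally compare changes that alter the protein (nonsynonymous) with changes that do not (synonymous). The toy sketch below, which is not the study's analysis and uses made-up sequences, shows the counting idea; real analyses fit codon models across many species with tools such as PAML or HyPhy and normalize by the number of possible synonymous and nonsynonymous sites.

```python
# Toy illustration of counting synonymous vs. nonsynonymous differences between
# two codon sequences. Minimal genetic-code lookup covering only the codons
# used in this made-up example.
CODON_TO_AA = {
    "ATG": "Met", "GCT": "Ala", "GCC": "Ala",
    "GAA": "Glu", "CGT": "Arg", "CAT": "His",
}

def classify_substitutions(seq_anc: str, seq_der: str):
    """Count synonymous vs nonsynonymous codon differences (one change per codon assumed)."""
    syn = nonsyn = 0
    for i in range(0, len(seq_anc), 3):
        a, d = seq_anc[i:i + 3], seq_der[i:i + 3]
        if a == d:
            continue
        if CODON_TO_AA[a] == CODON_TO_AA[d]:
            syn += 1
        else:
            nonsyn += 1
    return syn, nonsyn

ancestral = "ATGGCTGAACGT"   # Met-Ala-Glu-Arg (hypothetical ancestral allele)
derived   = "ATGGCCGAACAT"   # Met-Ala-Glu-His (hypothetical derived allele)
syn, nonsyn = classify_substitutions(ancestral, derived)
print(f"synonymous: {syn}, nonsynonymous: {nonsyn}")
# An excess of nonsynonymous change relative to the neutral expectation is the
# signature of positive selection; the study reported no such evidence for the
# human progesterone receptor gene.
```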

The results come from an analysis of the DNA of 115 mammalian species. These included a variety of primates, ranging from modern humans and extinct Neanderthals to monkeys, lemurs and lorises, along with non-primate mammalian species such as elephants, pandas, leopards, hippos, aardvarks, manatees and walruses.

The findings were a surprise, Lynch says.

"We expected something very different. It opens up this mystery that we didn't anticipate," he says. "I thought that the progesterone receptor gene would have evolved to respond better to progesterone, to be better at suppressing inflammation or contractions to keep us pregnant for longer. It looks like it's the reverse: In human pregnancy, there's just an incredible amount of progesterone around, and yet the gene is less good at doing its job. I wonder if this might predispose us to things like preterm birth, which is not that common in other animals."

"Pregnancy is such an everyday event -- none of us would be here without it -- and yet, so many aspects of this process remain puzzling," says Marinic, PhD, a postdoctoral researcher in the University of Chicago Department of Organismal Biology and Anatomy. "This study focused on an essential ingredient, progesterone signaling via progesterone receptors, and our results add another step toward deeper understanding of specificities of human pregnancy."

The progesterone receptor gene is crucial to pregnancy because it provides cells with instructions for how to create tiny structures called progesterone receptors.

During human pregnancy, these receptors detect the presence of progesterone, an anti-inflammatory hormone that pregnant women and the placenta produce at various points in time. When progesterone is present, the receptors jump into action, triggering processes that help keep women pregnant in part by preventing the uterus from contracting, reducing uterine inflammation, and suppressing the maternal immune response to the fetus, Lynch says.

Evolution changed the function of progesterone receptors in humans

In addition to exploring the evolutionary history of the progesterone receptor gene, Lynch and Marinic conducted experiments to test whether mutations in the human version of the gene altered its function. The answer is yes.

As the scientists wrote in their paper, "We resurrected ancestral forms of the progesterone receptor and tested their ability to regulate a target gene. We found that the human progesterone receptor forms have changed in function, suggesting the actions regulated by progesterone may also be different in humans. Our results suggest caution in attempting to apply findings from animal models to progesterone biology of humans."

Credit: 
University at Buffalo

Got seasonal allergies? Beetles could help

It's time once again for the misery familiar to millions of people around the world: seasonal allergy season. But there may be hope, courtesy of a tiny critter with a big appetite: a new study published in Nature Communications suggests that a species of beetle could help control an invasive and highly allergenic weed at the root of many people's suffering.

Allergies caused by the common ragweed, Ambrosia artemisiifolia, impact millions, and in Europe alone, around 13.5 million people suffer with symptoms, resulting in 7.4 billion Euros worth of health costs per year, according to the research. The study suggests the leaf beetle, Ophraella communa, could reduce the number of people affected by the pollen and the associated economic impacts, since the beetle - itself a recent arrival in Europe - loves to munch on the invasive plant.

Invasive alien species, such as common ragweed, have a significant effect when introduced into new ecosystems - from crowding out ecologically important native plant species to altering and damaging the ecological services a landscape can provide, invasive plants can lead to substantial economic costs. However, the researchers note very little research has been done on the human health impacts of these species.

Using data from the European Pollen Monitoring Programme, a team of researchers including co-first authors Sandro Steinbach of UConn's Agricultural and Resource Economics Department and Urs Schaffner of the Centre for Agriculture and Bioscience International, mapped seasonal total ragweed pollen in Europe from 2004-2012. They then determined ragweed sensitization rates in the European population to estimate the number of allergy sufferers.

"Assessing the human health impacts of IAS is a difficult task; it requires collaboration among scientists from different disciplines, including plant and insect ecology, aerobiology, medicine, and economics," says Steinbach.

They estimate that 13.5 million people were affected by seasonal ragweed pollen allergies, with economic costs of approximately 7.4 billion Euros per year, including factors such as medical costs and work absences. These numbers are prior to the unintended arrival of O. communa in Europe in 2013.

By modelling the number of generations of the beetle across its suitable habitat range in Europe, the authors project that biological control of common ragweed could reduce the number of people suffering from the ragweed allergy to approximately 11.2 million, and bring the health costs down to 6.4 billion Euros per year.

"Our conservative estimates indicate that biological control of A. artemisiifolia by O. communa will reduce the number of patients by approximately 2.3 million and the health costs by Euro 1.1 billion per year," says Steinback. "Future costs of this management approach will be basically zero since the beetle has established permanently and is propagating by itself."

Though this research is specific for Europe, this method of biological control is already happening in China where the beetle is reared and distributed for the control of ragweed. Fortunately, the authors note that previous studies suggest the beetle would have no negative impacts on native or ornamental plants in Europe, so this form of biological control may have no unintended consequences on the local landscape.

Schaffner says, "We were not sure at first whether the leaf beetle was useful or harmful. Laboratory tests had shown that O. communa might be detrimental to sunflowers. However, field tests in China and Europe could not confirm this finding."

This research also underscores the need for more work to be done on the human health impact of invasive alien species, since the benefits of management strategies are likely greatly undervalued, as shown by the authors' estimated public health costs being higher than previously reported.

"Mainstreaming invasive species management into policy and decision-making is dependent on the availability of robust data regarding their ecological and economic impacts. Our study provides evidence that the health costs incurred by a single species, A. artemisiifolia, is in a similar range as the currently discussed overall economic costs of all invasive species in Europe, suggesting that the overall costs of invasive species in Europe are grossly underestimated," says Steinbach.

Schaffner adds, "Because O. communa was accidentally introduced into Europe and did not go through a thorough risk assessment typical for deliberate releases of biological control agents, we started investigating whether this beetle can damage native European plant species. The good news is that there is only one European plant species, which is closely related to A. artemisiifolia. The risk assessment is still in progress, but so far, we have not found evidence for significant damage by this leaf beetle on native European plants."

Schaffner says another aspect the team is currently looking into is how climate change will affect the distribution of the weed and the beetle and whether the beetle's impact on pollen production by A. artemisiifolia will increase or decrease in the future.

Credit: 
University of Connecticut

Penn Engineering's new scavenger technology allows robots to 'eat' metal for energy

image: Like a traditional battery, the researchers' MAS starts with a cathode that's wired to the device it's powering. Underneath the cathode is a slab of hydrogel, a spongy network of polymer chains that conducts electrons between the metal surface and the cathode via the water molecules it carries. With the hydrogel acting as an electrolyte, any metal surface it touches functions as the anode of a battery, allowing electrons to flow to the cathode and power the connected device.

Image: 
Pikul Research Group, Penn Engineering

When electronics need their own power sources, there are two basic options: batteries and harvesters. Batteries store energy internally, but are therefore heavy and have a limited supply. Harvesters, such as solar panels, collect energy from their environments. This gets around some of the downsides of batteries but introduces new ones, in that they can only operate in certain conditions and can't turn that energy into useful power very quickly.

New research from the University of Pennsylvania's School of Engineering and Applied Science is bridging the gap between these two fundamental technologies for the first time in the form of a "metal-air scavenger" that gets the best of both worlds.

This metal-air scavenger works like a battery, in that it provides power by repeatedly breaking and forming a series of chemical bonds. But it also works like a harvester, in that power is supplied by energy in its environment: specifically, the chemical bonds in metal and air surrounding the metal-air scavenger.

The result is a power source that has 10 times more power density than the best energy harvesters and 13 times more energy density than lithium-ion batteries.

In the long term, this type of energy source could be the basis for a new paradigm in robotics, where machines keep themselves powered by seeking out and "eating" metal, breaking down its chemical bonds for energy like humans do with food.

In the near term, this technology is already powering a pair of spin-off companies. The winners of Penn's annual Y-Prize Competition are planning to use metal-air scavengers to power low-cost lights for off-grid homes in the developing world and long-lasting sensors for shipping containers that could alert to theft, damage or even human trafficking.

The researchers, James Pikul, assistant professor in the Department of Mechanical Engineering and Applied Mechanics, along with Min Wang and Unnati Joshi, members of his lab, published a study demonstrating their scavenger's capabilities in the journal ACS Energy Letters.

The motivation for developing their metal-air scavenger, or MAS, stemmed from the fact that the technologies that make up robots' brains and the technologies that power them are fundamentally mismatched when it comes to miniaturization.

As the size of individual transistors shrinks, chips provide more computing power in smaller and lighter packages. But batteries don't benefit the same way when getting smaller; the density of chemical bonds in a material is fixed, so smaller batteries necessarily mean fewer bonds to break.

"This inverted relationship between computing performance and energy storage makes it very difficult for small-scale devices and robots to operate for long periods of time," Pikul says. "There are robots the size of insects, but they can only operate for a minute before their battery runs out of energy."

Worse still, adding a bigger battery won't allow a robot to last longer; the added mass takes more energy to move, negating the extra energy provided by the bigger battery. The only way to break this frustrating inverted relationship is to forage for chemical bonds, rather than to pack them along.

"Harvesters, like those that collect solar, thermal or vibrational energy, are getting better," Pikul says. "They're often used to power sensors and electronics that are off the grid and where you might not have anyone around to swap out batteries. The problem is that they have low power density, meaning they can't take energy out of the environment as fast as a battery can deliver it."

"Our MAS has a power density that's ten times better than the best harvesters, to the point that we can compete against batteries," he says, "It's using battery chemistry, but doesn't have the associated weight, because it's taking those chemicals from the environment."

Like a traditional battery, the researchers' MAS starts with a cathode that's wired to the device it's powering. Underneath the cathode is a slab of hydrogel, a spongy network of polymer chains that conducts electrons between the metal surface and the cathode via the water molecules it carries. With the hydrogel acting as an electrolyte, any metal surface it touches functions as the anode of a battery, allowing electrons to flow to the cathode and power the connected device.

For the purposes of their study, the researchers connected a small motorized vehicle to the MAS. Dragging the hydrogel behind it, the MAS vehicle oxidized metallic surfaces it traveled over, leaving a microscopic layer of rust in its wake.

To demonstrate the efficiency of this approach, the researchers had their MAS vehicle drive in circles on an aluminum surface. The vehicle was outfitted with a small reservoir that continuously wicked water into the hydrogel to prevent it from drying out.

"Energy density is the ratio of available energy to the weight that has to be carried," Pikul says. "Even factoring in the weight of the extra water, the MAS had 13 times the energy density of a lithium ion battery because the vehicle only has to carry the hydrogel and cathode, and not the metal or oxygen which provide the energy."

The researchers also tested the MAS vehicles on zinc and stainless steel. Different metals give the MAS different energy densities, depending on their potential for oxidation.

This oxidation reaction takes place only within 100 microns of the surface, so while the MAS may use up all the readily available bonds with repeated trips, there's little risk of it doing significant structural damage to the metal it's scavenging.

With so many possible uses, the researchers' MAS system was a natural fit for Penn's annual Y-Prize, a business plan competition that challenges teams to build companies around nascent technologies developed at Penn Engineering. This year's first-place team, Metal Light, earned $10,000 for their proposal to use MAS technology in low-cost lighting for off-grid homes in the developing world. M-Squared, which earned $4,000 in second place, intends to use MAS-powered sensors in shipping containers.

"In the near term, we see our MAS powering internet-of-things technologies, like what Metal Light and M-Squared propose," Pikul says. "But what was really compelling to us, and the motivation behind this work, is how it changes the way we think about designing robots."

Much of Pikul's other research involves improving technology by taking cues from the natural world. For example, his lab's high-strength, low-density "metallic wood" was inspired by the cellular structure of trees, and his work on a robotic lionfish involved giving it a liquid battery circulatory system that also pneumatically actuated its fins.

The researchers see their MAS as drawing on an even more fundamental biological concept: food.

"As we get robots that are more intelligent and more capable, we no longer have to restrict ourselves to plugging them into a wall. They can now find energy sources for themselves, just like humans do," Pikul says. "One day, a robot that needs to recharge its batteries will just need to find some aluminum to 'eat' with a MAS, which would give it enough power to for it work until its next meal."

Credit: 
University of Pennsylvania

AI may help brain cancer patients avoid biopsy

video: Patients may not need biopsy.

Image: 
UTSW

DALLAS – April 21, 2020 – Brain cancer patients in the coming years may not need to go under the knife to help doctors determine the best treatment for their tumors.

A new study by UT Southwestern shows artificial intelligence can identify a specific genetic mutation in a glioma tumor simply by examining 3D images of the brain – with more than 97 percent accuracy. Such technology could potentially eliminate the common practice of pretreatment surgeries in which glioma samples are taken and analyzed to choose an appropriate therapy.

Scientists across the country have been testing other imaging techniques in recent years, but the latest research describes perhaps one of the most accurate and clinically viable methods in the widespread effort to alter the paradigm of assessing brain cancer.

"Knowing a particular mutation status in gliomas is important in determining prognosis and treatment strategies," says Joseph Maldjian, M.D., chief of neuroradiology at UT Southwestern’s O’Donnell Brain Institute. “The ability to determine this status using just conventional imaging and AI is a great leap forward.”

Mutated Enzymes

The study used a deep-learning network and standard magnetic resonance imaging (MRI) to detect the status of a gene called isocitrate dehydrogenase (IDH), which produces an enzyme that in mutated form may trigger tumor growth in the brain.

Doctors preparing to treat gliomas will often have patients undergo surgery to obtain tumor tissue that is then analyzed to determine the IDH mutation status. The prognosis and treatment strategy will vary based on whether a patient has an IDH-mutated glioma.

However, because obtaining an adequate sample can sometimes be time consuming and risky – particularly if tumors are difficult to access – researchers have been studying non-surgical strategies to identify IDH mutation status.

The study, published this spring in Neuro-Oncology, differentiates itself from previous research in three ways:

The method is highly accurate. Previous techniques have often failed to eclipse 90 percent accuracy.
Mutation status was determined by analyzing only a single series of MR images, as opposed to multiple image types.
A single algorithm was required to assess the IDH mutation status in the tumors. Other techniques have required either hand-drawn regions of interest or additional deep-learning models to first identify the boundaries of the tumor and then detect potential mutations.

“The beauty of this new deep-learning model is its simplicity and high degree of accuracy,” says Maldjian, adding that similar methods may be used to identify other important molecular markers for various cancers. “We’ve removed additional pre-processing steps and created an ideal scenario for easily transitioning this into clinical care by using images that are routinely acquired.”

Tumor imaging

Gliomas account for the large majority of malignant tumors found in the brain and can often spread quickly through surrounding tissue. The five-year survival rate for high-grade gliomas is 15%, though tumors with mutated IDH enzymes generally have a better prognosis.

The IDH mutation status also helps doctors decide on a combination of treatments most suitable for the patient, from chemotherapy and radiation therapy to surgery to remove the tumor.

To improve the process of detecting enzyme mutations and deciding on appropriate therapy, Maldjian’s team developed two deep-learning networks that analyzed imaging data from a publicly available database of more than 200 brain cancer patients from across the U.S.

One network used only one series from the MRI (T2-weighted images), while the other used multiple image types from the MRI. The two networks achieved nearly the same accuracy, suggesting that the process of detecting IDH mutations could be significantly streamlined by using only the T2-weighted images.
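For readers curious what a single-series classifier might look like in code, here is a minimal, hypothetical sketch in PyTorch of a 3D convolutional network that maps one preprocessed T2-weighted volume to an IDH-mutation probability. It is not the architecture used in the Neuro-Oncology paper, which this release does not fully specify.

```python
# A hypothetical 3D CNN sketch for classifying IDH mutation status from a
# single T2-weighted MRI volume (not the authors' published architecture).
import torch
import torch.nn as nn

class T2IDHClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):            # x: (batch, 1, depth, height, width)
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(h))   # probability of IDH mutation

model = T2IDHClassifier()
dummy_volume = torch.randn(1, 1, 64, 128, 128)     # one preprocessed T2 volume
print(model(dummy_volume).shape)                   # torch.Size([1, 1])
```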

‘Big picture’

Maldjian’s team will next test his deep-learning model on larger datasets for additional validation before deciding whether to incorporate the technique into clinical care.

Meanwhile, researchers are hoping to develop medications to inhibit IDH through ongoing national clinical trials. If effective, these inhibitors could combine with AI-imaging techniques to overhaul how some brain cancers are assessed and treated.

“In the big picture, we may be able to treat some gliomas without operating on a patient,” Maldjian says. “We would use AI to detect an IDH-mutated glioma, then use IDH inhibitors to slow down or reverse the tumor growth. The field of radio-genomics is exploding with possibilities.”

Credit: 
UT Southwestern Medical Center