Culture

Cell-autonomous immunity shaped human evolution

image: Anthropology professor Jessica Brinkworth studies the evolution of human immune function and how that affects susceptibility to severe infection.

Image: 
Photo by Fred Zwicky

CHAMPAIGN, Ill. -- Every human cell harbors its own defenses against microbial invaders, relying on strategies that date back to some of the earliest events in the history of life, researchers report. Because this "cell-autonomous immunity" is so ancient and persistent, understanding it is essential to understanding human evolution and human medicine, the researchers said.

Like amoebae, most human cells can transform themselves to engulf and degrade foreign agents in a process known as phagocytosis, said Jessica Brinkworth, a professor of anthropology at the University of Illinois at Urbana-Champaign, who wrote the new report with former undergraduate student Alexander Alvarado. And the methods that human cells use to detect, pierce or hack up invading microbes are inherited from - and shared by - bacteria and viruses, she said.

"Every cell has these things and they have this deep evolutionary history," Brinkworth said. "This means that if you're going to study humans, you need to accept that immunity is always going to be part of what you're looking at. And you're going to have to go deep into evolutionary time."

The authors reject the notion that the immune system is distinct from other bodily systems.

"Immunity is literally everywhere," Brinkworth said. "The whole of the organism, from the skin down to the level of the last enzyme floating anywhere in the body, almost all of it is engaged in protection in one form or another."

For that reason, she suggests that medical approaches to fighting infection that try to tamp down evolutionarily conserved immune responses such as pro-inflammatory pathways are misguided. While it can be useful or necessary to use immune-suppressing drugs against autoimmune conditions or in the case of organ transplants, such drugs do not appear to work against severe microbial infections.

"In the context of severe infections, there have been many attempts to come up with ways of reducing the immune response by throwing a bunch of steroids at it or blocking the body's ability to detect the pathogen," Brinkworth said. "But targeting these immune mechanisms that have been around for millions of years is potentially counterproductive."

In the case of sepsis, which Brinkworth studies, this approach has not been fruitful.

"More than 100 trials of immunomodulatory approaches to sepsis have failed," she said. "And the one drug that made it to market then failed. Most of these drugs tried to block highly evolutionarily conserved defenses, like mechanisms of cell-autonomous immunity."

Many immunomodulatory drugs now being tested against the new coronavirus are failed sepsis drugs, she said.

Similarly, anthropologists often fail to consider how millions of years of battle against infections at the cellular level have shaped human genetics, physiology and even behavior, Brinkworth said.

"If you're talking about human evolution, if you're in any physiological system, you're going to have to address at some point how pathogens have shaped it," she said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Metabolite signature of COVID-19 reveals multi-organ effects

The manuscript on which this press release is based has an associated correction, which can be found here: https://pubs.acs.org/doi/10.1021/acs.jproteome.1c00273

SARS-CoV-2, the virus responsible for COVID-19, can cause a wide range of symptoms, from none at all to severe respiratory stress, multi-organ failure and death. The virus notably targets the lungs, but many patients also experience non-respiratory symptoms. Now, researchers reporting in ACS' Journal of Proteome Research compared lipoproteins and metabolites in the blood of COVID-19 patients and healthy subjects, revealing signs of multi-organ damage in patients that could someday help diagnose and treat COVID-19.

Current diagnostic tests for COVID-19 rely on the detection of viral RNA or antibodies against the virus. Both types of tests are prone to false-negative results, as well as having other limitations. Another possible way of detecting SARS-CoV-2 infection could involve analyzing metabolic changes the virus causes in an infected person. Jeremy Nicholson, Elaine Holmes and colleagues wanted to analyze the systemic effects of the disease and determine whether there is a general metabolic signature of COVID-19.

The researchers collected blood samples from 17 patients who tested positive for COVID-19 with current assays and from 25 healthy age-, sex- and body mass index-matched controls who were proven negative for current or prior SARS-CoV-2 infection with an antibody test. Then, the team analyzed lipoprotein, metabolite and amino acid levels in blood plasma with nuclear magnetic resonance spectroscopy and liquid chromatography-mass spectrometry. Through multivariate statistical analyses that detected differences between patients and controls, the researchers revealed a metabolic signature of SARS-CoV-2 infection involving signs of acute inflammation, liver dysfunction, diabetes and cardiovascular disease risk. The team is now validating the data in a much larger group of patients. In addition to possibly being used to develop a metabolite-based diagnostic test, these results suggest that recovered COVID-19 patients should be evaluated for increased risks for other conditions, the researchers say.
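The statistical idea behind the comparison can be sketched in a few lines. The code below is an illustrative stand-in (not the authors' pipeline): it projects a synthetic metabolite matrix onto its first principal component, the kind of unsupervised step that typically precedes the supervised multivariate models used in metabolomics. The group sizes mirror the study (17 patients, 25 controls), but the data, feature count and effect size are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_feat = 40  # hypothetical number of measured metabolites/lipoproteins

controls = rng.normal(0.0, 1.0, size=(25, n_feat))
patients = rng.normal(0.0, 1.0, size=(17, n_feat))
patients[:, :10] += 2.5  # simulated infection-associated shift in 10 metabolites

X = np.vstack([controls, patients])
Xc = X - X.mean(axis=0)                       # mean-center each metabolite
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                              # scores on the first principal component

separation = abs(pc1[:25].mean() - pc1[25:].mean())
print(separation > 1.0)  # the two groups separate along PC1
```

In a real metabolomics workflow this exploratory projection would be followed by supervised models and validation on held-out samples, as the team is now doing with a larger cohort.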

Credit: 
American Chemical Society

Allergic immune responses help fight bacterial infections

image: Artistic 3D rendering of a mast cell (in the center of the picture) with IgE antibodies (in blue) bound to the receptor FcεRI (in pink) on the cell surface, and Staphylococcus aureus bacteria (in gold). The IgE antibodies were induced during an earlier S. aureus infection and recognize bacterial toxins (in green). Upon re-infection with S. aureus (as shown), the IgE antibodies increase the mast cell response to S. aureus toxins (in green), leading to enhanced release of mast cell granules (in red) and antibacterial activity.

Image: 
Bobby R. Malhotra / CeMM

Allergy is one of the most common diseases in Europe: it is estimated that more than 150 million Europeans suffer from recurring allergies, and by 2025 this could increase to half of the entire European population. Allergic patients initially undergo a process of "sensitization", meaning that their immune system develops a specific class of antibodies, so-called Immunoglobulin E antibodies (IgE), which can recognize external proteins, referred to as allergens. IgEs bind and interact with cells that express a specific receptor called FcεRI. Only a few cell types in the body express the FcεRI receptor, and probably the most important are mast cells, a type of immune cell found in most tissues throughout the body.

When re-exposed to the allergen, mast cells (with IgE bound to their FcεRI receptors) immediately react by rapidly releasing different mediators (e.g. histamine, proteases or cytokines) that cause the classic allergic symptoms. These symptoms depend on the tissue where the contact with the allergen happens and can range from sneezing/wheezing (respiratory tract) to diarrhea and abdominal pain (gastrointestinal tract) or itching (skin). Systemic exposure to allergens can activate a large number of mast cells from different organs at the same time, causing anaphylaxis, a serious and life-threatening allergic reaction.

Despite decades of research and detailed knowledge of the critical role of IgEs and mast cells in allergies, the physiological, beneficial function of this "allergy module" is still not completely understood. In 2006, Stephen J. Galli, senior co-author of this study, and his laboratory at Stanford University revealed the importance of mast cells for innate resistance against venoms of certain snakes and the honeybee (Science. 2006 Jul 28;313(5786):526-30. DOI: 10.1126/science.1128877). Subsequent work from the Galli laboratory showed the critical role of the "allergy module" in acquired host defense against high doses of venom (Immunity. 2013 Nov 14;39(5):963-75. doi: 10.1016/j.immuni.2013.10.005): this finding (to which Philipp Starkl, first author of the current study, contributed importantly) represented the first clear experimental evidence supporting the "Toxin Hypothesis" postulated by Margie Profet in 1991. This hypothesis proposed a beneficial function for allergic reactions against noxious substances (Q Rev Biol. 1991 Mar;66(1):23-62. doi: 10.1086/417049).

Following up on this discovery, Philipp Starkl, Senior Postdoctoral fellow at the Medical University of Vienna and CeMM, together with Sylvia Knapp, Professor at the Medical University of Vienna and CeMM PI, and Stephen J. Galli, Professor at Stanford University School of Medicine, and colleagues, set out to investigate whether this phenomenon could be relevant in defense against other toxin-producing organisms, in particular, pathogenic bacteria. The authors selected the bacterium Staphylococcus aureus as a pathogen model due to its enormous clinical relevance and broad repertoire of toxins. This bacterium is a prototypic antibiotic-resistant pathogen and is also associated with the development of allergic immune responses in diseases such as asthma and atopic dermatitis. For their research, they used different experimental S. aureus infection models in combination with genetic approaches and in vitro mast cell models to reveal the functions of selected components of IgE effector mechanisms.

The scientists found that mice with a mild S. aureus skin infection develop an adaptive immune response and specific IgE antibodies against bacterial components. This immune response grants these mice an increased resistance when they are confronted with a severe secondary lung or skin and soft tissue infection. However, mice lacking functional IgE effector mechanisms or mast cells are unable to build such protection. These findings indicate that the "allergic" immune response against bacteria is not pathological, but instead protective. Hence, defense against toxin-producing pathogenic bacteria might be an important biological function of the "allergy module".

This study is an important collaboration initiated by Philipp Starkl at the laboratory of Stephen J. Galli at Stanford University together with other colleagues and then continued at the laboratory of Sylvia Knapp at CeMM and the Medical University of Vienna. This exciting discovery not only advances the general understanding of the immune system and most notably allergic immune responses, but it could also explain why the body has maintained the "allergy module" throughout evolution. Despite their dangerous contributions to allergic diseases, IgEs and mast cells can exert beneficial functions that the immune system can capitalize on to protect the body against venoms and infections with toxin-producing bacteria, such as S. aureus.

Credit: 
CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences

Gut microbiota not involved in the incidence of gestational diabetes mellitus

Maternal overweight and obesity increase the risk of gestational diabetes mellitus. Gut microbiota composition has recently been associated with both overweight and a range of metabolic diseases. However, it has thus far been unclear whether gut microbiota is involved in the incidence of gestational diabetes.

A clinical study investigating the impact of two food supplements, fish oil and probiotics (containing Lactobacillus rhamnosus HN001 and Bifidobacterium animalis ssp. lactis 420), on maternal and child health was conducted at the University of Turku and Turku University Hospital in Finland. The microbiota was analysed from fecal samples of 270 overweight and obese women using state-of-the-art analytical and bioinformatics methods based on deep sequencing metagenomics analysis.

"Metagenomics is a next-generation sequencing tool that provides species level resolution of the gut microbiota composition. Metagenomics also provides information on the bacterial genes and gives clues about the possible function of the gut microbiota," says Senior Researcher Kati Mokkala from the Institute of Biomedicine of the University of Turku, Finland.

"Our study shows that gut microbiota composition and function is not involved in the onset of gestational diabetes in overweight and obese women. Also, no differences were found in women with gestational diabetes when compared to women remaining free from the condition," explains Associate Professor Kirsi Laitinen from the Early Nutrition and Health research group.

Probiotics have been shown to influence gut microbiota composition, but the impact of the combination of probiotics and fish oil is less well characterized. The women were randomised into four groups to consume two food supplements either as a combination or separately: fish oil + placebo, probiotics + placebo, fish oil + probiotics, or placebo + placebo. The women consumed the supplements from early pregnancy onwards until after the pregnancy.
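As a rough illustration of the allocation scheme described above, the sketch below performs a balanced (block-style) randomisation of 270 participants into the four supplement arms. It is a minimal example, not the trial's actual randomisation procedure; the seed and helper function are hypothetical.

```python
import random

supplements = ["fish oil + placebo", "probiotics + placebo",
               "fish oil + probiotics", "placebo + placebo"]

def randomise(n_participants, seed=42):
    """Assign participants to the four arms in (nearly) equal numbers,
    then shuffle the assignment order."""
    rng = random.Random(seed)
    arms = supplements * (n_participants // len(supplements) + 1)
    arms = arms[:n_participants]   # 270 -> two arms of 68, two of 67
    rng.shuffle(arms)
    return arms

allocation = randomise(270)
print({arm: allocation.count(arm) for arm in supplements})
```

Real trials typically stratify randomisation by baseline factors (e.g. BMI category), which this sketch omits.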

"Interestingly, our study revealed that the combination of fish oil and probiotics modulated the composition of gut microbiota particularly in women who did not develop gestational diabetes," Mokkala explains.

Whether the gut microbiota of women with gestational diabetes is less amenable for modification by food supplements needs to be confirmed in further studies.

Credit: 
University of Turku

Bat tick found for the first time in New Jersey

image: Live larval bat ticks (Carios kelleyi) removed in 2019 from big brown bats in Mercer County, New Jersey.

Image: 
J. Occi/Rutgers Center for Vector Biology

A tick species associated with bats has been reported for the first time in New Jersey and could pose health risks to people, pets and livestock, according to a Rutgers-led study in the Journal of Medical Entomology. This species (Carios kelleyi) is a “soft” tick. Deer ticks, which carry Lyme disease, are an example of “hard” ticks.

“All ticks feed on blood and may transmit pathogens (disease-causing microbes) during feeding,” said lead author James L. Occi, a doctoral student in the Rutgers Center for Vector Biology at Rutgers University–New Brunswick. “We need to be aware that if you remove bats from your belfry, attic or elsewhere indoors, ticks that fed on those bats may stay behind and come looking for a new source of blood. There are records of C. kelleyi biting humans.”

This soft tick species, a parasite of bats, is known to occur in 29 of the 48 contiguous U.S. states, and was confirmed in New Jersey as larvae collected from big brown bats (Eptesicus fuscus) in Mercer and Sussex counties. It is a new addition to the list of New Jersey ticks.

While the public health risk remains unknown, “finding them on New Jersey bats was an unusual event that prompted bat specialists to contact us. Maybe these ticks are becoming more common,” said senior author Dina M. Fonseca, a professor and director of the Center for Vector Biology in the Department of Entomology in the School of Environmental and Biological Sciences. In other states, C. kelleyi has been found infected with microbes that are harmful to people, pets and livestock. There have been reports of this soft tick feeding on humans, and the bats that host them regularly roost in structures such as attics and barns, underscoring the need to learn more about them, the study says.

“This tick belongs to the family Argasidae, known as 'soft ticks' because their body looks leathery and soft,” Fonseca said. That is in contrast to the "hard ticks" (family Ixodidae) that New Jerseyans are more familiar with. Scientists in the Endangered and Nongame Species Program of the Division of Fish and Wildlife in the New Jersey Department of Environmental Protection found the tick larvae on bats last year. Technically, this is not the first time a soft tick has been reported in New Jersey: in 2001, a related tick species – Carios jerseyi – was found in amber in Middlesex County. That specimen was 90 million to 94 million years old.

“The next steps are to collect more soft tick specimens and test them for disease-causing microbes,” Occi said.
Rutgers coauthors include Andrea M. Egizi, a visiting professor in the Department of Entomology and a research scientist with the Monmouth County Tick-borne Diseases Laboratory hosted by the Rutgers Center for Vector Biology. Scientists at the New Jersey Division of Fish and Wildlife, Smithsonian Institution and Walter Reed Army Institute of Research contributed to the study.

Journal

Journal of Medical Entomology

DOI

10.1093/jme/tjaa189

Credit: 
Rutgers University

Analysis: 'Near-zero incidence' of patients acquiring COVID-19 at Brigham and Women's

Boston, MA -- As COVID-19 began to surge in the Boston area earlier this year, new infection control measures were put in place at Brigham and Women's Hospital to protect patients and staff. Over the ensuing weeks, infection control policies continued to evolve, eventually encompassing:

Universal masking of all patients, staff and visitors

Dedicated COVID-19 units with airborne infection isolation rooms

Personal protective equipment in accordance with CDC recommendations

A restricted visitor policy

Daily symptom screening for employees and patients

Testing of all patients being admitted to the hospital

A new study addresses a critical question: Were these infection control measures successful in preventing transmission of COVID-19 to patients in the hospital? In a paper published in JAMA Network Open, a team of investigators from the Brigham report on an analysis of all cases in which a patient tested positive for COVID-19 three days or later after coming to the hospital and up to 14 days after discharge during the first 12 weeks of the surge in Massachusetts. They found that although the Brigham cared for over 9,000 inpatients during this timeframe -- including nearly 700 with COVID-19 -- only two patients likely acquired the disease within the hospital, including one who likely acquired it from his visiting spouse prior to universal masking and restriction of visitors, and one with no clear exposures within or outside the hospital.

"Our data show that in a hospital with robust, rigorous infection control measures, it is very much possible to prevent the spread of COVID-19 to patients," said corresponding author Chanu Rhee, MD, MPH, an infectious disease and critical care physician and associate hospital epidemiologist at the Brigham. "This is an important finding as we know that many patients are avoiding essential care due to fear of contracting COVID-19 in health care settings. Our study shows that the hospital is in fact very safe, and if people need to go to the hospital for care, they should go."

Rhee and colleagues conducted their study on data from all patients seen at the Brigham from March 7 (when the first patient with COVID-19 was admitted) through May 30, 2020. During that 12-week period, 9,149 patients were admitted to the hospital. More than 7,300 diagnostic COVID-19 tests were performed, with 697 people testing positive. Twenty-three patients were diagnosed with COVID-19 after the third day of hospitalization or within two weeks after discharge. All cases were reviewed in detail by Rhee and hospital epidemiologist and co-author Michael Klompas, MD, MPH, to assess the most likely source of each patient's infection. Of these 23 patients, 14 had symptoms on admission and were deemed to have been infected prior to admission, while seven were diagnosed following high-risk, post-discharge exposures. Of the remaining two patients, one likely acquired his infection from a visiting spouse who was found to have COVID-19, before visitor restrictions and universal masking were in place; only one other patient, with no clear exposure, may have been infected in the hospital.
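For scale, the headline numbers translate into a per-admission rate as follows (a back-of-the-envelope sketch; the paper's exact denominators and case definitions may differ):

```python
# At most two of 9,149 admitted patients likely acquired COVID-19 in-hospital.
inpatients = 9149
hospital_acquired = 2

rate = hospital_acquired / inpatients
print(f"{rate:.4%} of admitted patients")        # roughly 0.02%
print(f"{rate * 10000:.1f} per 10,000 admissions")
```

That order of magnitude, about two cases per ten thousand admissions, is what the authors summarize as "near-zero incidence."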

Rhee characterized the team's findings as "an exceedingly low rate of infection" and a "near-zero incidence" of COVID-19 acquisition among patients seeking care at the hospital during the surge.

The authors note that their study cannot determine which infection control measures in place at the hospital were most critical. In addition, while the researchers comprehensively analyzed and reviewed each case, they could not definitively determine the source of infection in every case. Results were also limited to the Brigham and may not be applicable to hospitals that have adopted other infection control measures. The study did not examine infection among health care workers, and the authors believe that this important topic warrants a separate, detailed analysis.

"Overall, our results should provide confidence to clinicians and patients around the country that currently recommended infection-control measures -- if carefully implemented and followed -- can prevent the spread of COVID-19 within the hospital," said Rhee.

Credit: 
Brigham and Women's Hospital

Researchers solve decades-old mitochondrial mystery that could lead to new disease treatments

PHILADELPHIA -- Penn Medicine researchers have solved a decades-old mystery around a key molecule that fuels the power plant of cells, a finding that could be exploited to develop new treatments for diseases ranging from neurodegenerative disorders to cancer.

Reporting in a new study published today in Nature, researchers from the Department of Physiology in the Perelman School of Medicine at the University of Pennsylvania and other institutions found that the SLC25A51 gene dictates the transport of nicotinamide adenine dinucleotide (NAD+), a fundamental coenzyme in cellular metabolism, to the mitochondria, where energy from nutrients is converted into chemical energy for the cell. A low level of NAD+ is a hallmark of aging and has been associated with diseases including muscular dystrophy and heart failure.

"We have long known that NAD+ plays a critical role in the mitochondria, but the question of how it gets there had been left unanswered," said co-senior author Joseph A. Baur, PhD, an associate professor of Physiology and member of Penn's Institute for Diabetes, Obesity, and Metabolism. "This discovery opens up a whole new area of research where we can actually manipulate--selectively deplete or add--NAD+ at a subcellular level, now that we know how it's transported."

Xiaolu Ang Cambronne, PhD, an assistant professor in the Department of Molecular Biosciences at The University of Texas at Austin, served as co-senior author.

The finding closes out a longstanding unknown around how NAD+ finds its way into the mitochondrial matrix. Several hypotheses had been circulating, including the idea that mammalian mitochondria were incapable of NAD+ transport, instead relying entirely on synthesis of NAD+ within the organelle, but in 2018, Baur's lab put that idea to rest when it reported in an eLife study that a transporter was in fact responsible.

From there, the team began its search for the genetic identity of the mammalian mitochondrial NAD+ transporter, homing in on several genes, including SLC25A51, that were predicted to be transporters, but for which the function remained unknown. SLC25A family members encode mitochondrially-localized proteins that carry materials across mitochondrial membranes.

"In our approach, we focused on genes that were determined to be essential for cellular viability. NAD+ is a fundamental molecule required for maintaining mitochondrial energy production. We predicted that loss of mitochondrial NAD+ transport would disrupt oxidative phosphorylation and possibly reduce cell survival," said lead author Timothy S. Luongo, PhD, a postdoctoral fellow in the Baur lab.

In laboratory experiments, the researchers isolated mitochondria from human cells and measured NAD+ levels after knocking out SLC25A51 or overexpressing it. Using mitochondrially targeted NAD+ "biosensors," they showed that the gene's expression level specifically controls mitochondrial NAD+ levels.

"We observed that loss of SLC25A51 expression dramatically altered the mitochondria's ability to consume oxygen and generate ATP as well as transport NAD+ into the matrix. Also, in collaboration with the Cambronne lab, we were able to demonstrate that expression of SLC25A51 in yeast lacking their endogenous mitochondrial NAD+ transporters restored NAD+ mitochondrial transport," said Luongo.

NAD+ levels can be targeted in various disease treatments; however, this has been more of a catch-all approach, in which levels are increased or reduced in all parts of the cell, running the risk of unintended alterations of gene expression or other types of metabolism. This study is the first published case in which researchers identified a specific target and reduced NAD+ levels solely in the mitochondria, leaving the rest of the cell untouched.

Controlling the levels of NAD+ and thus metabolic processes in the mitochondria could have major implications for the study and development of new treatments for diseases. Activating the transport mechanism could potentially make cells favor a state of respiration to make energy, instead of glycolysis. Different cancer types, for example, rely heavily on glycolysis, so creating an unfavorable environment without that metabolism could be one strategy. Or, conversely, it might be possible to deny highly respiratory cancer cells mitochondrial NAD+, so they're forced to rely on glycolysis. The heart requires abundant quantities of mitochondrial-produced energy to continually supply blood to peripheral tissue. A major contributor to heart failure is mitochondrial dysfunction, so targeting the mitochondria's capacity to transport NAD+ may improve cardiac function of the failing heart. With respect to exercise, shifting towards a more oxidative metabolism could boost endurance.

The work is in its early days, but a door has been opened for new investigations centered on mitochondrial NAD+ and this gene. Next, the researchers will study the physiological function of NAD+ transport and how this mechanism is regulated, as well as ways to turn the transport on and off outside of reducing or increasing gene expression.

"An approach to specifically alter the mitochondrial NAD+ pool is something many researchers have been looking for, so I would expect that we will see this gene targeted in a multitude of systems," Baur said. "I think this is going to be a really valuable tool to help us better understand the function of mitochondrial NAD+ and its therapeutic potential."

Credit: 
University of Pennsylvania School of Medicine

Dismantling structural racism in nursing

image: Antonia M. Villarruel, PhD, RN, FAAN, Professor and Margaret Bond Simon Dean of Nursing at the University of Pennsylvania School of Nursing (Penn Nursing)

Image: 
Penn Nursing

PHILADELPHIA (September 9, 2020) - Confronting the uncomfortable reality of systemic racism - the system that creates and maintains racial inequality in every facet of life for people of color - is having a national heyday. But calling out this injustice and doing something about it are two different things.

Throughout its history, nursing has been at the forefront of advocacy addressing public policies, institutional practices, and other norms that perpetuate racial group inequities. Yet structural racism remains in the teaching, research, scholarship, and practice of nursing.

In an editorial for the journal Nursing Outlook, two nurse leaders propose a framework to guide thinking and action to effectively address racial inequities and injustices throughout nursing.

"There remain too many examples of structural racism throughout nursing and we must be open to continuing to examine, identify, and change these within our own profession," writes Antonia M. Villarruel, PhD, RN, FAAN, Professor and Margaret Bond Simon Dean of Nursing at the University of Pennsylvania School of Nursing (Penn Nursing). Villarruel wrote the editorial titled "Beyond the naming: institutional racism in nursing," along with Marion E. Broome, PhD, RN, FAAN, Dean of the School of Nursing at Duke University.

The framework the authors outline identifies ways that nurses can lead in their organizations and change policies, practices, and traditions that disadvantage and diminish people of color in schools of nursing, nursing professional organizations, and health systems. The authors challenge nurses to use the framework to dismantle structural racism in practice.

"If it is to be different, it is time to act. Actions, if inclusive and well thought out, can be the medium to bring people together to make a real difference--especially the younger students and faculty who we so often 'protect' from that work," the authors add.

Credit: 
University of Pennsylvania School of Nursing

Prediction of protein disorder from amino acid sequence

image: Associate Professor Frans Mulder and coworkers at Aarhus University have developed ODiNPred (Prediction of Order and Disorder by evaluation of NMR data), a software tool developed for prediction of protein order and disorder.

Image: 
Frans Mulder

In the last century, Anfinsen showed beyond a doubt that a protein can find its way back to its 'native' three-dimensional structure after it has been placed under 'denaturing conditions' where the protein structure is unfolded. The profound conclusion of his experiments was that apparently the information that governs the search back to the native state is hidden in the amino acid sequence. Thermodynamic considerations then set forth a view where the folding process is like rolling energetically downhill to the lowest point - to the unique native structure. These findings have often been intertwined with the central dogma of molecular biology. Thus, a gene codes for an amino acid sequence, and the sequence codes for a specific structure. 

Enter intrinsically disordered proteins.

The next breakthrough came with the advent of cheap and fast genome sequencing in the wake of the Human Genome Project. Once thousands of genomes from various organisms had been sequenced, scientists made a staggering discovery: there were lots and lots of genes that coded for low-complexity proteins. In other words, these proteins did not contain the right amino acids to fold up, and experiments confirmed that they remained 'intrinsically disordered'. Moreover, more than a third of the genes in the human genome turned out to code for protein disorder!

How to detect protein disorder?

Since disordered proteins are very flexible, they are not amenable to crystallization and therefore no information can be obtained from X-ray diffraction on protein crystals - the approach that has been so pivotal for folded proteins. Instead, these proteins must be studied in solution, and for this purpose NMR (Nuclear Magnetic Resonance) spectroscopy is the most suited tool. In this method, a quantum physical property called 'spin' is measured in a strong magnetic field for each atom in the molecule. The exact precession frequencies of the spins are a function of their environment, and it is exactly this frequency that allows researchers to quantitatively measure to which extent each amino acid is ordered or disordered in the protein.

In their new paper, published on 8 Sept 2020, Dr. Rupashree Dass together with Associate Professor Frans Mulder and Assistant Professor Jakob Toudahl Nielsen have used machine learning together with experimental NMR data for hundreds of proteins to build a new bioinformatics tool that they have called ODiNPred. This bioinformatics program can help other researchers make the best possible predictions of which regions of their proteins are rigid and which are likely to be flexible. This information is useful for structural studies, as well as for understanding the biological role and regulation of intrinsically disordered proteins.
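To make the idea of sequence-based disorder prediction concrete, here is a deliberately simplified toy predictor, not ODiNPred's trained machine-learning model: it assigns each residue a crude, hypothetical disorder propensity and smooths the scores over a sliding window, the general shape that many disorder predictors share.

```python
# Toy disorder predictor (illustrative only; the residue sets and window
# size are hypothetical, not the scales or model used by ODiNPred).
DISORDER_PRONE = set("PESQKRG")   # polar/charged residues, illustrative
ORDER_PRONE = set("WFYILVMC")     # hydrophobic residues, illustrative

def disorder_profile(seq, window=9):
    """Per-residue disorder score: +1 disorder-prone, -1 order-prone,
    averaged over a sliding window centered on each position."""
    propensity = [1.0 if aa in DISORDER_PRONE else
                  -1.0 if aa in ORDER_PRONE else 0.0
                  for aa in seq]
    half = window // 2
    profile = []
    for i in range(len(seq)):
        region = propensity[max(0, i - half): i + half + 1]
        profile.append(sum(region) / len(region))
    return profile  # > 0 suggests disorder, < 0 suggests order

# A made-up sequence: hydrophobic N-terminus, polar low-complexity tail.
profile = disorder_profile("MKVLILACLVALALA" + "PEESKQSEEQGSSSEE")
print(profile[0] < 0 < profile[-1])  # prints True
```

Real predictors such as ODiNPred replace the hand-made scale with a model trained against per-residue order/disorder measured by NMR, but the input (sequence) and output (a per-residue disorder profile) have the same form.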


Read more about the results in Scientific Reports: ODiNPred: comprehensive prediction of protein order and disorder by Rupashree Dass, Frans A. A. Mulder & Jakob Toudahl Nielsen

The research was carried out by researchers from Interdisciplinary Nanoscience Center (iNANO) and the Department of Chemistry at Aarhus University. The work was financially supported by VILLUM Fonden.

For further information, please contact

Associate Professor Frans A. A. Mulder
Interdisciplinary Nanoscience Center and Department of Chemistry
Aarhus University
Email: fmulder@chem.au.dk

Journal

Scientific Reports

DOI

10.1038/s41598-020-71716-1

Credit: 
Aarhus University

More chemicals can be assessed for endocrine disrupting effects

A European guidance document aimed at identifying endocrine disrupting pesticides can--with some modifications--be used to assess other chemicals' endocrine disrupting effects. This is the finding of a new study conducted by the National Food Institute, Technical University of Denmark, and Copenhagen University Hospital.

According to EU regulation, all pesticides must be thoroughly assessed for potential endocrine disrupting effects before they can be approved for use. However, the same rules do not necessarily apply to chemicals that are used for other purposes.

Researchers at the National Food Institute are pointing out that the approval process for chemicals, which are used e.g. as additives in cosmetics or food, can thus overlook chemicals that are harmful to the human endocrine system.

In a new study, the researchers along with colleagues at Copenhagen University Hospital have evaluated whether an existing European guidance document could be used to identify potential endocrine disrupting effects of chemicals that are used for purposes other than plant protection.

The European Food Safety Authority, EFSA, and the European Chemicals Agency, ECHA, developed the guidance document for the purpose of identifying endocrine disrupting pesticides.

"A person who is exposed to a chemical from food or the environment won't care what the chemical is used for if it is harmful to human health," Head of Research Group Terje Svingen at the National Food Institute says.

Butylparaben--a case study

The focus of the study was butylparaben, which is used, for example, to extend the shelf life of cosmetic creams. The available data for butylparaben is very different from the available data for biocides and pesticides, as the authorities require far fewer studies of additives.

Nonetheless, by using the EU guidance document the researchers were able to conclude that butylparaben has endocrine disrupting effects on humans--in part because university researchers globally have conducted many smaller studies of butylparaben's endocrine disrupting properties.

Guidance document can be applied to a broader spectrum of chemicals

According to the researchers from the National Food Institute and Copenhagen University Hospital, the butylparaben study provides sufficient evidence to propose that the existing guidelines--with some modifications--can be used to identify the endocrine disrupting effects of a much broader spectrum of chemicals than just pesticides. The modifications are intended to facilitate the use of data other than those generated for pesticides.

"Such use of the guidance document could be a first step towards a more harmonized assessment of the endocrine disrupting effects of chemicals, which is independent of the chemical's intended use. In theory, the guidance could be applied to all chemical substances to which humans are exposed," Senior Researcher Julie Boberg at the National Food Institute says.

Credit: 
Technical University of Denmark

During the pandemic, online lecture series helps fill gaps in training for urology residents

September 9, 2020 - The ongoing coronavirus pandemic has affected all aspects of healthcare - including sharp drops in educational opportunities for resident physicians in training. In response, urology training programs across the United States joined forces to develop a multi-institutional online video lecture collaboration, according to a special article in Urology Practice, an Official Journal of the American Urological Association (AUA). The journal is published in the Lippincott portfolio by Wolters Kluwer.

Called "Urology Collaborative Online Video Didactics" - Urology COViD for short - the online lecture series has been a runaway success in the urology world, with thousands of views and overwhelmingly positive reviews from trainees and educators. Lindsay A. Hampson, Assistant Professor of Urology at the University of California, San Francisco (UCSF), and colleagues share their experience with the development and initial evaluation of the groundbreaking lecture series.

Urology Programs Team Up to Replace Educational Opportunities Lost to Coronavirus

An overlooked effect of drastic declines in routine clinical care has been the loss of invaluable training opportunities for resident physicians. This may be especially true in surgical specialties such as urology, as many hospitals have only recently started to resume operations other than emergency or urgent surgery.

Spurred by an example from another surgical subspecialty (otolaryngology), the urology residents at UCSF brought the idea to Dr. Hampson. The following day Dr. Hampson conceptualized Urology COViD and reached out to program directors at eight academic training programs: UCSF, University of Washington, University of California-Davis, Stanford University, University of Minnesota, University of Michigan, Northwestern University, and University of Virginia. There was immediate buy-in: programs across the country were facing the same challenge of how to train residents amid decreased clinical volume and changing educational contexts.

Within a week the UCSF team had created a new website (https://urologycovid.ucsf.edu/) and launched Urology COViD as a "urology-specific collaborative didactic series." Almost immediately, there was a significant influx of collaborating programs from across the country; by the end of the first week of lectures, volunteer faculty had filled all 84 available lecture slots. The following week, a month-long waiting list was filled.

Consisting of a 45-minute lecture followed by a 15-minute question-and-answer session, lectures are delivered live over the Zoom platform in webinar format, including interactive features. All lectures are subsequently posted on the website for viewing via YouTube.

By any measure, Urology COViD has been a smashing success, with lectures delivered by faculty from 35 institutions. The twice-daily webinars have been seen by an average of more than 470 viewers live on Zoom. Within the first two weeks, there were more than 7,000 views of the lecture recordings on YouTube.

More than 90 percent of users leaving feedback on the lecture series and videos have left above average or excellent ratings. More than 80 percent said the series provided a sense of "community connectedness" during a time of social isolation. "All (100 percent) of the viewers surveyed in this study indicated that they would like to see the series continue into the future," the researchers write.

Urology COViD is resuming in September and is expected to provide continued educational opportunities even after the pandemic ends. "There will be a time in the future when we are back in the operating rooms, clinics and lecture halls," Dr. Hampson and colleagues conclude. "We hope that this series can evolve and persist so that these new collaborative educational efforts can outlast the pandemic and continue to provide a source of shared knowledge, resident teaching, and community building for our diverse field."

Credit: 
Wolters Kluwer Health

An evolutionary roll of the dice explains why we're not perfect

If evolution selects for the fittest organisms, why do we still have imperfections? Scientists at the Milner Centre for Evolution at the University of Bath investigating this question have found that in species with small populations, chance events take precedence over natural selection, allowing imperfections to creep in.

Recent work by Alex Ho and Laurence Hurst from the Milner Centre for Evolution at the University of Bath analysed the genomes of a wide range of organisms, from mammals to single-celled algae. They compared the genetic instructions used by cells to make proteins - specifically the code at the end of the gene that tells the cell to stop reading, called stop codons.

When making proteins, our DNA is read out in strings, with a stop codon at the end of each string telling the cell to stop reading. In any given gene, most organisms can use one of three very similar stop codons; however, one of them (TAA) is much better than the others (TGA and TAG) at making the cellular machinery stop.
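Tallying which stop codon ends each gene is the kind of comparison underlying the study; a minimal sketch, using made-up toy sequences rather than real genes:

```python
# Count stop-codon usage across a set of coding sequences (toy data).
from collections import Counter

STOP_CODONS = {"TAA", "TAG", "TGA"}

def stop_codon(cds: str) -> str:
    """Return the final codon of a coding sequence (expected to be a stop)."""
    last = cds[-3:].upper()
    if last not in STOP_CODONS:
        raise ValueError(f"{last} is not a stop codon")
    return last

# Invented coding sequences, each ending in one of the three stop codons.
genes = ["ATGAAATAA", "ATGCCCTAA", "ATGGGGTGA", "ATGTTTTAG"]
counts = Counter(stop_codon(g) for g in genes)
print(counts["TAA"], counts["TGA"], counts["TAG"])  # 2 1 1
```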

The researchers, publishing in Molecular Biology and Evolution, looked at why some genes use the less efficient stop codons, when evolution by natural selection should cause most genes to use the more efficient TAA codon.

They found that in species such as humans and other mammals, where populations are relatively small and reproduction is slow, selection favoured TAA in the most highly expressed genes. However, mutations creating the less effective stop codons could increase in frequency because of chance events, the roll of the dice being more influential when populations are small. This results in a less efficient stop codon being found more often than would be expected, mostly in the less commonly used genes.

In contrast, in species with large, fast-replicating populations, such as yeast or bacteria, chance is less important, and so natural selection tended to "weed out" any less favourable mutations, resulting in TAA being very common.
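The interplay the researchers describe, selection versus chance, can be illustrated with a toy Wright-Fisher simulation: a slightly deleterious variant (think of a weaker stop codon) reaches fixation far more often in a small population than in a large one. The population sizes and selection coefficient below are illustrative, not taken from the study.

```python
# Toy Wright-Fisher model: how often does one new, slightly deleterious
# mutant take over the whole population ("fix") despite selection against it?
import random

def fixation_rate(pop_size: int, s: float = 0.01, trials: int = 400) -> float:
    """Fraction of trials in which a single new deleterious mutant fixes."""
    fixed = 0
    for _ in range(trials):
        freq = 1.0 / pop_size  # one new mutant copy
        while 0.0 < freq < 1.0:
            # Selection slightly reduces the mutant's effective frequency...
            p = freq * (1 - s) / (freq * (1 - s) + (1 - freq))
            # ...then binomial sampling adds the randomness of reproduction.
            freq = sum(random.random() < p for _ in range(pop_size)) / pop_size
        fixed += freq == 1.0
    return fixed / trials

random.seed(1)
small = fixation_rate(pop_size=20)
large = fixation_rate(pop_size=1000)
print(small > large)  # the "bad" variant fixes far more often when N is small
```

In the small population the mutant fixes in roughly one trial in twenty, close to the neutral expectation of 1/N, while in the large population selection almost always eliminates it.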

The findings could help the design of new gene therapies for genetic diseases.

Professor Laurence Hurst, Director of the Milner Centre for Evolution, said: "Our total set of DNA seems very much more complicated than that of something like yeast. Humans have lots of enigmatic DNA between our genes and each of our genes can typically make many different products, whereas yeast genes tend to make just one.

"Our work shows that natural selection in humans is not very efficient and so our DNA ends up similar to an ancient rusting motor car - just able to function, with all sorts of bad repairs and accretions built up over time. Yeast instead is more like an organism straight out of the showroom: the perfect machine."

Their results indicate that organisms, such as humans and other mammals, with relatively small population sizes, cannot sustain a perfect state over evolutionary time. It also supports the view that human DNA is error prone and poor quality, not as part of some complex machine for a complex organism, but instead because selection is too weak a force to stop our DNA from deteriorating.

Professor Hurst said: "These results matter because they help us understand that just because something is common, it doesn't mean it is the best. This helps both the understanding of, and therapeutics for, genetic diseases.

"For example, it suggests when making new genes for gene therapy, we should do what yeast do and use the best stop codon: TAA."

Credit: 
University of Bath

As collegiate esports become more professional, women are being left out

A new study from North Carolina State University reports that the rapidly growing field of collegiate esports is effectively becoming a two-tiered system, with club-level programs that are often supportive of gender diversity being clearly distinct from well-funded varsity programs that are dominated by men.

"Five years ago, we thought collegiate esports might be an opportunity to create a welcoming, diverse competitive arena, which was a big deal given how male-dominated the professional esports scene was," says Nick Taylor, co-author of the study and an associate professor of communication at NC State. "Rapid growth of collegiate esports over the past five years has led to it becoming more professional, with many universities having paid esports positions, recruiting players, and so on. We wanted to see how that professionalization has affected collegiate esports and what that means for gender diversity. The findings did not give us reason to be optimistic."

For this qualitative study, the researchers conducted in-depth interviews with 21 collegiate esports leaders from the U.S. and Canada. Eight of the study participants were involved in varsity-level esports, such as coaches or administrators, while the remaining 13 participants were presidents of collegiate esports clubs. Six of the participants identified as women; 15 identified as men.

"Essentially, we found that women are effectively pushed out of esports at many colleges when they start investing financial resources in esports programs," says Bryce Stout, co-author of the study and a Ph.D. student at NC State. "We thought collegiate esports might help to address the disenfranchisement of women in esports and in gaming more generally; instead, it seems to simply be an extension of that disenfranchisement."

"Higher education has been spending increasing amounts of time, money and effort on professionalizing esports programs," Taylor says. "With some key exceptions, these institutions are clearly not putting as much effort into encouraging diversity in these programs. That effectively cuts out women and minorities.

"Some leaders stress that they will welcome any player onto their team, as long as the player has a certain skill level," Taylor says. "But this ignores the systemic problems that effectively drive most women out of gaming - such as harassment. There needs to be a focus on cultivating skill and developing players, rather than focusing exclusively on recruitment."

Credit: 
North Carolina State University

Consequences of the 2018 summer drought

image: The Bioclimatology Group of the Faculty of Forest Sciences and Forest Ecology at the University of Göttingen is part of the ICOS European Research Infrastructure network with a meteorological station in the Hainich National Park.

Image: 
Alexander Knohl

The drought that hit central and northern Europe in summer 2018 had serious effects on crops, forests and grasslands. Researchers from the European Research Infrastructure Integrated Carbon Observation System (ICOS), including researchers from the University of Göttingen, are showing what effects this had and what lessons can be learned. The results of 16 studies have been published as a special issue of the journal Philosophical Transactions.

The interdisciplinary teams shed light on different aspects of this research. Among many findings, they found that the plants initially benefited from the warm and sunny conditions in spring, but had too little water available for their roots when the summer heatwave started. As a result, grasslands began to dry up and numerous arable areas recorded the lowest yields for decades. The forests protected themselves by greatly reducing their evaporation for several weeks, but this then led to a sharp drop in carbon dioxide uptake. Such effects were observed simultaneously - all the way from Switzerland to the Netherlands and Germany, and from the Czech Republic to Sweden and Finland.

The Bioclimatology Group of the Faculty of Forest Sciences and Forest Ecology at the University of Göttingen contributes to ICOS with a meteorological station in the Hainich National Park. For the last 20 years, the station has measured the exchange of carbon dioxide (CO2) and water vapour between forest and atmosphere every 30 minutes. Comparing the data across Europe shows that the area under investigation is one of those most affected by the 2018 drought. "In 2018, the CO2 uptake calculated over the whole year was about 30 percent lower than the average of the past 20 years," says Head of the Group Professor Alexander Knohl. "On some days in the summer of 2018, the forest actually emitted carbon dioxide instead of absorbing it," adds Dr Lukas Siebicke. "In the past 20 years, this has never happened before."

The measurements from the meteorological station in the Hainich National Park are of great international scientific importance for two reasons: it is one of the world's longest time series for such continuous measurements, and it is one of the oldest unmanaged forests in which such measurements of carbon dioxide and water vapour exchange take place.

ICOS is a European research infrastructure for measuring carbon dioxide fluxes between land, ocean and atmosphere. Across Europe, 140 measuring stations in twelve countries are involved. ICOS stations are subject to a rigorous quality assurance process and provide standardised data that is made freely available for research, teaching and other applications. ICOS provides essential data for the reports of the Intergovernmental Panel on Climate Change (IPCC) and for the decision-making processes within the UN Framework Convention on Climate Change.

Credit: 
University of Göttingen

A window into adolescence

image: Graduate student Grace McIlvain (right) began working with Prof. Curtis Johnson in 2016 as an undergraduate summer researcher. She is now a third-year doctoral student shedding new light on the biological roots for adolescent risk-taking.

Image: 
Photo by Kathy F. Atkinson

As any parent will tell you, no two children behave in exactly the same way. It is part of what makes each individual unique.

So, why do some adolescents take more risks than others?

University of Delaware Biomedical Engineer Curtis Johnson and graduate student Grace McIlvain think they may have an idea.

The part of the brain that makes adolescents want to take risks is called the socioemotional system. The brain's cognitive control center, meanwhile, is what helps prevent adolescents from acting on these impulses.

In a recently published paper in NeuroImage, Johnson and McIlvain suggest that these two centers in the brain physically mature at different rates and that adolescents with large differences in the rate of development between these two brain regions are more likely to be risk-takers. Further, the research team theorizes that it is the brain's fundamental structure that drives these risk-taking and control tendencies.

What makes this study unique is that the UD researchers and their collaborators used a technique called magnetic resonance elastography (MRE) to safely measure the mechanical properties of the brain tissue as a measure of brain development, rather than activation of those two regions.

Elastography is a method of imaging mechanical properties of tissues using a magnetic resonance imaging (MRI) scanner. Simply put, the researchers take snapshots of how the brain deforms -- or bends -- as it is vibrated under low frequencies, and then put those images through a specific algorithm to reverse engineer what is happening. Johnson explained that MRE vibration is safe for all ages and provides less movement than naturally occurs in the brain. It also offers less vibration than other devices designed for children, such as vibrating rockers.
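The underlying physics can be sketched with the basic shear-wave relation used in elastography: stiffness scales with the square of the wave speed, which is frequency times wavelength. Actual MRE inversion algorithms are far more sophisticated (three-dimensional and noise-robust), and the numbers below are merely representative of soft tissue, not measurements from the study.

```python
# Simplified elastography relation: shear stiffness mu = rho * (f * lambda)^2,
# where f is the vibration frequency and lambda the imaged shear wavelength.

def shear_stiffness_kpa(frequency_hz: float, wavelength_m: float,
                        density_kg_m3: float = 1000.0) -> float:
    """Shear stiffness in kPa, from wave speed = frequency * wavelength."""
    wave_speed = frequency_hz * wavelength_m          # metres per second
    return density_kg_m3 * wave_speed ** 2 / 1000.0   # pascals -> kilopascals

# A 50 Hz vibration with a roughly 3.5 cm wavelength implies a stiffness of
# a few kilopascals, the order of magnitude typical of brain tissue.
print(round(shear_stiffness_kpa(50.0, 0.035), 2))  # 3.06
```

Softer tissue carries shorter, slower shear waves, so a map of local wavelength becomes a map of local stiffness, which is what the stiffness images described below encode.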

Johnson likened the process to any other material testing and said the research team's knowledge of how tissue deforms helps them interpret what is happening under different vibrations. In adults, MRE techniques have become popular for studying diseases, such as Alzheimer's, with research showing relationships between memory and cognitive performance.

"MRE techniques do not replace other aspects of studying brain development, but they may provide a more sensitive, objective way to look at the brain's wiring," said Johnson, an assistant professor in the Department of Biomedical Engineering.

Mapping adolescent brain development

This is not the first time that researchers have looked at how two brain regions interact to form a certain output. But most of this work has been done using functional MRI (fMRI), where study participants are placed in the scanner and given a real-time task, and the researchers watch which areas of the brain light up to determine what areas of the brain relate to that task.

Johnson's research group was an early pioneer in using MRE techniques to make high-resolution three-dimensional maps that enable scientists to look at specific regions of the brain. The intensity of every 3D pixel in an image has meaning: for example, bright colors indicate high stiffness, which in this study serves as a measure of developmental maturity.

Looking at these features of the brain in their work, the researchers found that it wasn't the socioemotional or the cognitive control center alone, but the combination of the two centers of the brain working together at a specific age or point in time that was the definitive factor in risk taking.

"So, there is this period during adolescence where the part of the brain that makes you want to take risks is more mature than the part of the brain that suppresses those impulses," said McIlvain, who began working on the project as an undergraduate summer researcher in 2016 and is now a third-year doctoral student in biomedical engineering.

"If we can identify individuals who are more likely to take risks, based on the biological composition of their brain, or maybe groups of individuals, it might inform strategies for prevention."

Prior to this project, little MRE research had measured brain stiffness in children. Earlier work in 2018 by McIlvain showed the outside of the brain appears softer in adolescents than adults, whereas the inside of the brain appears stiffer in adolescents than adults. According to Johnson, this aligns with the known developmental trajectory where the inside of the brain develops first and the outside, the cortex, develops later.

The work grew out of Johnson's previous collaborative research with Eva Telzer, a psychology professor at University of North Carolina and co-author of the paper, and leverages the advanced MRI capabilities at UD's Center for Biomedical and Brain Imaging. Today, researchers in the Johnson lab develop all aspects of this MRE technique, from how to safely vibrate the head in the scanner to how to write the software to acquire the data to methods for turning the data into images that are translated into mechanical properties.

While the research team's previous work has shown differences in the brain function of typically developing children and those with conditions, such as cerebral palsy, this is the first time the researchers have shown a relationship with function in healthy children. But there are still more questions than answers.

For example, Johnson said there are currently no good measures for determining when the brain is mature, or even for defining brain health. And while the research team has connected the relative stiffness of the adolescent brain's socioemotional system and cognitive control center to risk taking, there is much they don't know, such as how these regions of the brain are affected by socioeconomic status, early life trauma or early education.

A big focus of the work is making the MRE scan faster. The scan currently takes over six minutes, which can be difficult for children with disabilities or those who are very young.

"We'd like to complete the scan in under a minute -- less time than half a song from a Disney movie -- before a child loses interest and thinks about moving," said Johnson.

Next steps in the research include scanning kids as young as age 5, including those with autism. The hope is to create a robust data set to explore how brain mechanical properties change from age 5 to age 30, generally considered to be the end of adolescence. Among other things, they hope to use this data to better understand how children with disabilities fit into that developmental curve.

"Right now, there is no standard way to diagnose autism, no targeted treatment plan or metrics for measuring whether intervention is helping," said McIlvain, who recently was awarded a National Institutes of Health fellowship to study brain stiffness in children with autism. "If we can understand how the mechanical properties of the brain are affected in someone with autism, we can start to answer some of those questions."

Credit: 
University of Delaware