Culture

Dartmouth researchers pilot FLASH radiotherapy beam development for treatment of cancer

image: The exceptionally high dose rate of the FLASH Beam is 3,000 times higher than normal therapy treatment (300 Gray per second vs. 0.1 Gray per second, Gray being a standard unit measuring absorbed radiation). Instead of treatment over 20 seconds, an entire treatment is completed in 6 milliseconds, giving the therapy its nickname, "FLASH."

Image: 
Brian Pogue, PhD

LEBANON, NH - A joint team of researchers from Radiation Oncology at Dartmouth's and Dartmouth-Hitchcock's Norris Cotton Cancer Center (NCCC), Dartmouth Engineering, and Dartmouth-Hitchcock's Department of Surgery has developed a method to convert a standard linear accelerator (LINAC), used to deliver radiation therapy for cancer, into a FLASH ultra-high-dose-rate radiation therapy beam. The work, entitled "Electron FLASH Delivery at Treatment Room Isocenter for Efficient Reversible Conversion of a Clinical LINAC," is newly published online in the International Journal of Radiation Oncology, Biology & Physics.

The exceptionally high dose rate is 3,000 times higher than normal therapy treatment (300 Gray per second vs. 0.1 Gray per second, Gray being a standard unit measuring absorbed radiation). Instead of treatment over 20 seconds, an entire treatment is completed in 6 milliseconds, giving the therapy its nickname, "FLASH." "These high dose rates have been shown to protect normal tissues from excess damage while still having the same treatment effect on tumor tissues, and may be critically important for limiting radiation damage in patients receiving radiation therapy," says Brian Pogue, PhD, Co-Director of NCCC's Translational Engineering in Cancer Research Program and co-author on the project.
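As a rough check of those figures, the sketch below reproduces the quoted ratio and delivery times. It assumes a roughly 2 Gray fraction, which is implied by the "20 seconds" figure at 0.1 Gray per second but is not stated in the release.

```python
# Back-of-the-envelope check of the quoted FLASH figures.
# ASSUMPTION: a ~2 Gy fraction, inferred from "20 seconds" at 0.1 Gy/s;
# the release does not state the delivered dose.

conventional_rate = 0.1   # Gy per second (conventional therapy)
flash_rate = 300.0        # Gy per second (FLASH)
fraction_dose = 2.0       # Gy, assumed (see note above)

print(f"dose-rate ratio:   {flash_rate / conventional_rate:.0f}x")       # 3000x
print(f"conventional time: {fraction_dose / conventional_rate:.0f} s")   # 20 s
print(f"FLASH time:        {fraction_dose / flash_rate * 1e3:.1f} ms")   # ~6.7 ms
```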

While the team awaits news of potential funding from the National Institutes of Health (NIH), early pilot funding from NCCC and Dartmouth's Thayer School of Engineering allowed for prototyping of the converted LINAC. Pre-clinical testing of the beam began in August and has already provided key data on its potential for different tumor plans. "This is the first such beam in New England and on the east coast, and we believe it is the first reversible FLASH beam on a clinically used LINAC where the beam can be used in the conventional geometry with patients on the treatment couch," says Pogue.

The FLASH beam is currently being used in preclinical studies on both experimental animal tumors and clinical veterinary treatments to study the normal tissue-sparing effect and how to maximize its value. The research group has expanded to involve physicians in clinical radiation oncology and dermatology, who are designing what they hope will be the first human safety trial of FLASH radiotherapy at Dartmouth-Hitchcock, treating patients with advanced skin lesions that cannot be removed surgically.

Credit: 
Dartmouth Health

Loss of smell is the best sign of COVID-19

Two international studies confirm that for the majority of patients with respiratory infections who lose the sense of smell, this is due to COVID-19. The disease also often results in both loss of taste and the other senses in the mouth. A researcher from Aarhus University has contributed to the new results.

If you have had COVID-19, then forget about enjoying the smell of freshly made coffee. In any case, two major international studies document that loss of smell is frequent in COVID-19 and that it often lasts for a long time.

Alexander Wieck Fjaeldstad, associate professor of olfaction and gustation at Aarhus University, is behind the Danish part of the study.

The study shows that the average loss of the sense of smell was 79.7 on a scale from 0 to 100, which indicates a large to complete sensory loss, says the researcher. In addition, the studies show that loss of smell is very probably the best predictor of COVID-19 among patients with symptoms of respiratory disease.

"This emphasises how important it is to be aware of this symptom, as it may be the only symptom of the disease," says Alexander Wieck Fjaeldstad, who also stresses only around half of patients with a loss of smell have gotten their sense of smell back after forty days.

"This differs from the picture we see with other viral infections and causes long-term discomfort for patients, both in relation to food and social contact, while at the same time causing them worry."

In addition to the loss of the sense of smell, the sense of taste was also significantly reduced, to 69.0 on a scale from 0-100, just as the remaining sense of feeling in the mouth was also reduced, this time to 37.3 on a scale from 0-100.

"While the loss of smell in itself removes the ability to sense the aroma of food, the simultaneous loss of the other senses make it difficult to register what you're eating. Putting food in your mouth can therefore become a decidedly unpleasant experience," explains Alexander Wieck Fjaeldstad.

A total of 23 nationalities and over 4,500 COVID-19 patients from all over the world have responded to the researchers' questionnaire.

"The study is of interest both to patients suffering sensory loss as well as clinicians and researchers who work with diagnostics and following-up on COVID-19. It shows that the loss of smell is specific to COVID-19, which is both relevant in relation to recognising the infection, and because it indicates that the sense of smell is closely linked to how SARS-CoV-2 infects the body."

Previously, researchers based the link between COVID-19 and the loss of the chemical senses on smaller studies, whereas these studies collected large amounts of data from countries all over the world.

"The collaboration on the projects also entails a dialogue between researchers from all over the world, which makes it possible to share knowledge and ideas in order to promote the research field," says Alexander Wieck Fjaeldstad and continues: "The results are in line with our own national studies and pave the way for future studies on risk factors for permanent sensory loss, along with a better understanding of the consequences of these sensory losses for the patients. Among the aspects being studied are which factors are associated with a milder or briefer loss of the sense of smell and how this loss is associated with the rest of the course of the disease. The collection of data is continuing and will result in additional publications with even more participants."

Credit: 
Aarhus University

Specific genes increase the risk of bedwetting

In a large-scale study of Danish children and young people, researchers from Aarhus University have for the first time found genetic variants that increase the risk of nocturnal enuresis - commonly known as bedwetting or nighttime incontinence. The findings provide completely new insights into the processes in the body causing this widespread phenomenon.

Researchers have long known that nighttime incontinence is a highly heritable condition. Children who wet the bed at night often have siblings or parents who either suffer from or have suffered from the same condition. But until now, science has been unable to pinpoint the genes concerned.

In collaboration with the Danish research project iPSYCH and a team of international colleagues, researchers from Aarhus University have for the first time identified genetic variants that increase the risk of bedwetting. The results have just been published in the scientific journal The Lancet Child & Adolescent Health.

"As many as sixteen per cent of all seven-year-olds suffer from nocturnal enuresis and although many of them grow out of it, one to two per cent of all young adults still have this problem. It is a serious condition, which can negatively affect children's self-esteem and well-being. For example, the children may be afraid of being bullied, and often opt out of events that involve overnight stays," says Jane Hvarregaard Christensen, who is one of the researchers behind the study.

Regulates urine production

In the study, the researchers studied the genes of 3,900 Danish children and young people, who had either been diagnosed with nocturnal enuresis or had taken medication for it. This group was subsequently compared to 31,000 children and young people who did not suffer from the problem.

"We identified two locations in the genome where specific genetic variants increase the risk of bedwetting. The potential causal genes which we point to play roles in relation to ensuring that our brain develops the ability to keep urine production down at night, that the bladder's activity is regulated and registered, and that we sleep in an appropriate way, among other things," explains first author of the study, Cecilie Siggaard Jørgensen.

The study also shows that commonly occurring genetic variants can explain up to one-third of the genetic risk of bedwetting. This means that genetic variants which all of us have may lead to involuntary nocturnal enuresis, when they occur in a certain combination.

"But you can still also have all the variants without wetting the bed at night, because there are other risk factors in play that we haven't mapped yet - both genetic and environmental. So it's clear that this is very complex and that it's not possible to talk about a single gene that causes nocturnal enuresis," says Jane Hvarregaard Christensen.

Particularly vulnerable

The study also shows that children with many genetic variants that increase the risk of ADHD are particularly vulnerable to developing bedwetting.

"Our findings don't mean that ADHD causes bedwetting in a child, or vice versa, but just that the two conditions have common genetic causes. More research in this area will be able to clarify the details in the biological differences and similarities between the two disorders," she emphasises.

As the study is a first-time study, the researchers also examined more than 5,500 people from Iceland, where they found that the same genetic variants also appear to increase the risk of nocturnal enuresis.

"This means that we can be more certain that our findings are not coincidental. In the future, we wish to find out whether the same genetic variants increase the risk of bedwetting in children in other parts of the world. Bedwetting is not just an issue in northern European but affects millions of children all over the world," she says.

The researchers hope to be able to further clarify the causes of nocturnal enuresis. It is very likely that it will be possible to identify even more genes and thereby gain a deeper understanding of what is required for a child to become dry at night.

"At present we still can't use a child's genetic profile to predict, for example, whether the child will grow out of its condition or whether a particular treatment works. Perhaps this will be possible in the future when more detailed studies have been conducted," says Jane Hvarregaard Christensen.

Behind the results

The study is a so-called genome-wide association study (GWAS). By examining thousands of genetic variants spread out in the entire genome, a GWAS makes it possible to point to statistically significant correlations between specific genetic variants and nighttime incontinence in the persons who are examined.
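To make the idea concrete, the sketch below runs a per-variant allele-count chi-square test on simulated genotypes sized like the Danish cohort (3,900 cases, 31,000 controls). It is a simplified, hypothetical example, not the study's pipeline; real analyses use more elaborate models and adjust for covariates such as ancestry.

```python
# Minimal sketch of a per-variant GWAS association test on SIMULATED genotypes.
# Cohort sizes follow the article (3,900 cases, 31,000 controls); everything
# else (allele frequencies, number of variants) is made up.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n_cases, n_controls, n_variants = 3_900, 31_000, 1_000

p_values = []
for _ in range(n_variants):
    freq = rng.uniform(0.05, 0.5)                 # population allele frequency
    cases = rng.binomial(2, freq, n_cases)        # genotypes coded 0/1/2
    controls = rng.binomial(2, freq, n_controls)
    # 2 x 2 table of allele counts: rows = case/control, columns = risk/other allele
    table = [[cases.sum(), 2 * n_cases - cases.sum()],
             [controls.sum(), 2 * n_controls - controls.sum()]]
    p_values.append(chi2_contingency(table)[1])   # p-value of the chi-square test

# The conventional genome-wide significance threshold is p < 5e-8; under this
# null simulation (no true effect) nothing should pass.
print("variants passing 5e-8:", sum(p < 5e-8 for p in p_values))
```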

Credit: 
Aarhus University

Spike proteins of SARS-CoV-2 relatives can evolve against immune responses

Scientists have shown that two species of seasonal human coronavirus related to SARS-CoV-2 can evolve in certain proteins to escape recognition by the immune system, according to a study published today in eLife.

The findings suggest that, if SARS-CoV-2 evolves in the same way, current vaccines against the virus may become outdated, requiring new ones to be made to match future strains.

When a person is infected by a virus or vaccinated against it, immune cells in their body will produce antibodies that can recognise and bind to unique proteins on the virus' surface known as antigens. The immune system relies on being able to 'remember' the antigens that relate to a specific virus in order to provide immunity against it. However, in some viruses, such as the seasonal flu, those antigens are likely to change and evolve in a process called antigenic drift, meaning the immune system may no longer respond to reinfection.

"Some coronaviruses are known to reinfect humans but it is not clear to what extent this is due to our immune memory fading or antigenic drift," says first author Kathryn Kistler, a PhD student at the Vaccine and Infectious Disease Division, Fred Hutchinson Cancer Research Center, Seattle, US. "We wanted to investigate whether there is any evidence of coronaviruses related to SARS-CoV-2 evolving to evade our immune responses."

The research team looked at the four seasonal human coronaviruses (HCoVs) which are related to SARS-CoV-2 but typically cause milder symptoms, such as the common cold. HCoVs have been circulating in the human population for 20-60 years, meaning their antigens would likely have faced pressure to evolve against our immune system.

The team used a variety of computational methods to compare the genetic sequences of many different strains of the viruses, enabling them to see how they have evolved over the years. They were especially interested in any changes that might have occurred in viral proteins that could contain antigens, such as spike proteins - the ray-like projections on the surface of coronaviruses that are particularly exposed to our immune system.

The researchers found a high rate of evolution in the spike proteins of two of the four viruses, OC43 and 229E. Nearly all of the beneficial mutations appeared in a specific region of the spike proteins called S1, which helps the virus infect human cells. This suggests that reinfection by these two viruses can occur as a result of antigenic drift as they evolve to escape recognition by the immune system.

The team also estimated that beneficial mutations in the spike proteins of OC43 and 229E appear roughly once every two to three years, about half to one-third of the rate seen in the flu virus strain, H3N2.
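A quick piece of arithmetic shows how those two figures relate; the exact numbers below are illustrative, not taken from the paper.

```python
# Illustrative arithmetic only; these exact figures are not from the paper.
# "One adaptive substitution every 2-3 years" at one-half to one-third of the
# H3N2 rate implies H3N2 gains a beneficial spike mutation roughly once a year.

for interval_years, relative_rate in [(2, 1 / 2), (3, 1 / 3)]:
    hcov_rate = 1 / interval_years            # adaptive substitutions per year (OC43/229E)
    h3n2_rate = hcov_rate / relative_rate     # implied H3N2 rate
    print(f"HCoV ~{hcov_rate:.2f}/yr -> implied H3N2 ~{h3n2_rate:.2f}/yr "
          f"(about one every {1 / h3n2_rate:.1f} years)")
```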

"Due to the high complexity and diversity of HCoVs, it is not entirely clear if this means that other coronaviruses, such as SARS-CoV-2, will evolve in the same way," explains senior author Trevor Bedford, Principal Investigator and Associate Member at the Vaccine and Infectious Disease Division, Fred Hutchinson Cancer Research Center. "The current vaccines against COVID-19, while highly effective, may need to be reformulated to match new strains, making it vital to continually monitor the evolution of the virus' antigens."

Credit: 
eLife

State responses, not federal, influenced rise in unemployment claims early in the pandemic

ATLANTA--Early in the U.S. COVID-19 pandemic, unemployment claims were largely driven by state shutdown orders and the nature of a state's economy, not by the virus itself, according to a new article by Georgia State University economists.

David Sjoquist and Laura Wheeler found no evidence the Paycheck Protection Program (PPP) affected the number of initial claims during the first six weeks of the pandemic.

Their research explores state differences in the magnitude of weekly unemployment insurance claims for the weeks ending March 14 through April 25 by focusing on three factors: the impact of COVID-19 itself; the effects of state economic structures and state orders closing non-essential businesses; and the impact of the Coronavirus Aid, Relief, and Economic Security (CARES) Act.

During the first week studied, unemployment claims appeared to be driven by consumer reactions to the coronavirus as they adjusted their behavior prior to government shutdown orders. States with greater employment in the industries most affected by the virus and those with a larger share of workers making less than weekly unemployment benefits saw higher shares of new unemployment insurance claims.

By March 21, 31 states had issued orders prohibiting in-restaurant dining. Those that closed nonessential businesses experienced larger numbers of unemployment insurance claims per covered worker. Those that had larger numbers of employees able to work from home did not have a lower increase in new claims. This finding is contrary to what other research has suggested, the co-authors said.
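The kind of cross-state comparison described here is essentially a regression of claims per covered worker on state characteristics. The sketch below shows that setup on simulated data; the variable names and coefficients are placeholders, not the authors' actual specification or results.

```python
# Sketch of the cross-state regression idea on SIMULATED data. Variable names
# (closure_order, vulnerable_share, telework_share) and coefficients are
# placeholders, not the authors' specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_states = 50
df = pd.DataFrame({
    "closure_order": rng.integers(0, 2, n_states),        # 1 = closed non-essential businesses
    "vulnerable_share": rng.uniform(0.1, 0.4, n_states),  # employment share in hard-hit industries
    "telework_share": rng.uniform(0.2, 0.5, n_states),    # share of jobs that can be done remotely
})
# Simulated outcome: initial claims per covered worker
df["claims_rate"] = (0.02 + 0.03 * df["closure_order"]
                     + 0.10 * df["vulnerable_share"]
                     + rng.normal(0, 0.01, n_states))

model = smf.ols("claims_rate ~ closure_order + vulnerable_share + telework_share",
                data=df).fit()
print(model.summary().tables[1])   # coefficient estimates on the state-level factors
```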

"Earlier studies exploring the effects of COVID cases and school closures on state job markets suggest the reduction in employment was mainly a nationwide response to COVID, and that specific state policies to the disease had a comparatively moderate effect," Sjoquist said. "By considering various state responses, including stay-at-home orders and those closing schools and non-essential businesses, our research provides insight into the effect of a state's industry and employment mix on its unemployment claims during a pandemic."

Credit: 
Georgia State University

With a little help from their friends, older birds breed successfully

image: This is a Seychelles warbler with colour-coded rings. These rings make it possible to study individual animals on the island of Cousin.

Image: 
Martijn Hammers, University of Groningen

The offspring of older animals often have a lower chance of survival because the parents are unable to take care of their young as well as they should. The Seychelles warbler is a cooperatively breeding bird species, meaning that parents often receive help from other birds when raising their offspring. A study led by biologists from the University of Groningen shows that the offspring of older females have better prospects when they are surrounded by helpers. This impact of social behaviour on reproductive success is described in a paper that was published on 19 January in the journal Evolution Letters.

The Seychelles warbler lives on a tiny island called Cousin Island, which is part of the Republic of Seychelles, an island country in the Indian Ocean. On Cousin Island, there are just over 300 birds, nearly all of them ringed with colour rings so that each individual bird can be recognized. The population has been studied for several decades. Martijn Hammers, a biologist at the University of Groningen, has frequently visited the island and is studying the interaction between social behaviour and ageing among these birds.

Natural laboratory

Cousin Island resembles a natural laboratory, Hammers explains. 'It is isolated, so there is no influx of new birds and birds rarely migrate to other islands. Furthermore, from 1985 onwards, we have ringed almost all warblers on the island, which allows us to observe their behaviour, reproduction and survival.' The Seychelles warbler is a cooperative breeder: each territory is held by a dominant couple and they sometimes allow helpers to stay there. 'These helpers are usually their own young from earlier breeding attempts and they can assist in breeding and in feeding the chicks. In return, these helpers can use the resources that are available in the territory and female helpers are occasionally allowed to lay an egg in the dominant birds' nest.'

As the parent birds age, their ability to feed their young diminishes. In this new study, Hammers and his colleagues from the University of Groningen (The Netherlands) and the University of East Anglia (UK) wanted to find out whether social behaviour, in particular the care for offspring provided by helpers, affects the breeding success of older birds. To this end, Hammers analysed over 20 years of data on these birds. He specifically looked at the frequency of chick feeding and the survival of young birds.

Humans

'Our prediction was that having helpers would be beneficial for the survival of chicks from older birds. And this turned out to be true, but only for older females,' says Hammers. The males of this species contribute less to feeding the chicks than females and their behaviour may therefore be less important for offspring survival than the behaviour of the females. 'That may be because the males are not always sure that they are feeding their own offspring since 40 per cent of the young are not their own,' explains Hammers.

The data show that while helpers compensate for age-related declines in female reproductive performance, individual helpers do not work harder when the dominant female is older. 'It appears to be a more passive process, in that older birds recruit more helpers who collectively help more.' The implications of his findings are that it is beneficial for older female birds to display social behaviour - allowing helpers to live in their territory - since it increases their reproductive success. 'It would be interesting to see if this is a general principle that also applies to other animal species, or even to humans.'

In a previous study, Hammers showed that getting help with the kids also slows down the ageing of the parents. 'This effect was also most pronounced in older females. This new study provides additional evidence for an interplay between age and cooperative breeding.'

Credit: 
University of Groningen

Study identifies a nonhuman primate model that mimics severe COVID-19 similar to humans

image: Acute respiratory distress syndrome (ARDS) in SARS-CoV-2 infected aged, African green monkey. A. Radiographic changes noted following a rapid clinical decline within a 24-hour period. B. Microscopic findings showing diffuse alveolar damage with hyaline membranes (arrows) and type II pneumocyte hyperplasia (arrowheads) consistent with ARDS. C. Immunohistochemistry identifying SARS-CoV-2 infected cells (green, arrows) within the lung. Bar = 100 µm.

Image: 
The American Journal of Pathology

Philadelphia, January 19, 2021 - Aged, wild-caught African green monkeys exposed to the SARS-CoV-2 virus developed acute respiratory distress syndrome (ARDS) with clinical symptoms similar to those observed in the most serious human cases of COVID-19, report researchers in The American Journal of Pathology, published by Elsevier. This is the first study to show that African green monkeys can develop severe clinical disease after SARS-CoV-2 infection, suggesting that they may be useful models for the study of COVID-19 in humans.

"Animal models greatly enhance our understanding of diseases. The lack of an animal model for severe manifestations of COVID-19 has hampered our understanding of this form of the disease," explained lead investigator Robert V. Blair, DVM, PhD, Dip ACVP, Tulane National Primate Research Center, Covington, LA, USA. "If aged green monkeys prove to be a consistent model of severe COVID-19, studying the disease pathobiology in them would improve our understanding of the disease and allow testing treatment options."

The researchers exposed four aged rhesus macaques and four aged African green monkeys to SARS-CoV-2. Older animals (13-16 years of age) were specifically chosen to see if they would develop the severe form of the disease that is observed more frequently in elderly individuals. All of the monkeys developed a spectrum of disease from mild to severe COVID-19. A day after routine screening found no remarkable symptoms, two of the African green monkeys developed rapid breathing that quickly progressed to severe respiratory distress. Radiographic studies found the two African green monkeys had widespread opacities in the lungs, in stark contrast to images taken the day before, highlighting the rapid development of the disease. Such opacities are a hallmark of ARDS in humans.

The African green monkeys that progressed to severe disease had notable increases in plasma cytokines that are compatible with cytokine storm, which is thought to underlie the development of ARDS in some patients. All four African green monkeys had elevated levels of interferon gamma; the two that had progressed to ARDS had the highest plasma concentration. Plasma cytokines were not increased in the rhesus macaques. Dr. Blair suggested that elevated interferon gamma could be explored as a potential predictive biomarker for advanced disease in patients and a possible therapeutic target.

Dr. Blair said, "Our data suggest that both rhesus monkeys and African green monkeys are capable of modeling mild manifestations of SARS-CoV-2 infection, and aged African green monkeys may additionally be capable of modeling severe disease manifestations, including ARDS."

Credit: 
Elsevier

Study finds COVID-19 attack on brain, not lungs, triggers severe disease in mice

ATLANTA--Georgia State University biology researchers have found that infecting the nasal passages of mice with the virus that causes COVID-19 led to a rapid, escalating attack on the brain that triggered severe illness, even after the lungs were successfully clearing themselves of the virus.

Assistant professor Mukesh Kumar, the study's lead researcher, said the findings have implications for understanding the wide range in symptoms and severity of illness among humans who are infected by SARS-CoV-2, the virus that causes COVID-19.

"Our thinking that it's more of a respiratory disease is not necessarily true," Kumar said. "Once it infects the brain it can affect anything because the brain is controlling your lungs, the heart, everything. The brain is a very sensitive organ. It's the central processor for everything."

The study, published by the journal "Viruses," assessed virus levels in multiple organs of the infected mice. A control group of mice received a dose of sterile saline solution in their nasal passages.

Kumar said that early in the pandemic, studies involving mice focused on the animals' lungs and did not assess whether the virus had invaded the brain. Kumar's team found that virus levels in the lungs of infected mice peaked three days after infection, then began to decline. However, very high levels of infectious virus were found in the brains of all the affected mice on the fifth and sixth days, which is when symptoms of severe disease became obvious, including labored breathing, disorientation and weakness.

The study found virus levels in the brain were about 1,000 times higher than in other parts of the body.

Kumar said the findings could help explain why some COVID-19 patients seem to be on the road to recovery, with improved lung function, only to rapidly relapse and die. His research and other studies suggest the severity of illness and the types of symptoms that different people experience could depend not only on how much virus a person was exposed to, but how it entered their body.

The nasal passages, he said, provide a more direct path to the brain than the mouth. And while the lungs of mice and humans are designed to fend off infections, the brain is ill equipped to do so, Kumar said. Once viral infections reach the brain, they trigger an inflammatory response that can persist indefinitely, causing ongoing damage.

"The brain is one of the regions where virus likes to hide," he said, because it cannot mount the kind of immune response that can clear viruses from other parts of the body.

"That's why we're seeing severe disease and all these multiple symptoms like heart disease, stroke and all these long-haulers with loss of smell, loss of taste," Kumar said. "All of this has to do with the brain rather than with the lungs."

Kumar said that COVID-19 survivors whose infections reached their brain are also at increased risk of future health problems, including auto-immune diseases, Parkinson's, multiple sclerosis and general cognitive decline.

"It's scary," Kumar said. "A lot of people think they got COVID and they recovered and now they're out of the woods. Now I feel like that's never going to be true. You may never be out of the woods."

Credit: 
Georgia State University

New research finds connection: Inflammation, metabolism and scleroderma scarring

Scleroderma, a chronic and currently incurable orphan disease where tissue injury causes potentially lethal skin and lung scarring, remains poorly understood.

However, the defining characteristic of systemic sclerosis, the most serious form of scleroderma, is irreversible and progressive scarring that affects the skin and internal organs.

In a study published in iScience, Michigan Medicine's Scleroderma Program and its rheumatology and dermatology departments partnered with the Northwestern Scleroderma Program in Chicago and the Mayo Clinic to investigate the causes of disabling scarring, using human patient samples, preclinical mouse models and explanted human skin.

“We found that scleroderma inflammation dramatically increases CD38, an enzyme that normally breaks down a metabolic nutrient, NAD+. When NAD+ levels decrease, tissue injury becomes chronic and progresses to scar formation rather than to healthy repair,” says study author John Varga, M.D., division chief of rheumatology at Michigan Medicine.

According to the study, treatments that prevented NAD+ reduction in the mice, whether by boosting the levels of the nutrient genetically or pharmacologically, prevented scarring in the skin, lungs and abdominal wall.

“This is one of the first studies to find a relationship between CD38 and scleroderma, as well as a link between inflammation, metabolism and skin and organ scarring,” says study author Johann Gudjonsson, M.D., Ph.D., a dermatologist at Michigan Medicine.

Credit: 
Michigan Medicine - University of Michigan

Scientists reveal structure of plants' energy generators

image: Mung bean sprouts grown in the dark that provide the raw materials to determine the structure of plant respiratory complexes

Image: 
Kaitlyn Abe and Maria Guadalupe Zaragoza (CC BY 4.0)

Researchers have revealed the first atomic structures of the respiratory apparatus that plants use to generate energy, according to a study published today in eLife.

The 3D structures of these large protein assemblies - the first described for any plant species - are a step towards being able to develop improved herbicides that target plant respiration. They could also aid the development of more effective pesticides, which target the pest's metabolism while avoiding harm to crops.

Most organisms use respiration to harvest energy from food. Plants use photosynthesis to convert sunlight into sugars, and then respiration to break down the sugars into energy. This involves tiny cell components called mitochondria and a set of five protein assemblies that arrange themselves in an 'electron transport train'.

"Knowing how plants convert energy through respiration is a crucial part of understanding how plants grow, how they adapt to changes in the environment and what strategies we can use to improve crop yields," explains first author Maria Maldonado, a postdoctoral fellow at the Department of Molecular and Cellular Biology, University of California, Davis (UC Davis), US. "Yet although the 3D structures of respiration components are well understood in mammals, fungi and bacteria, the technical challenges of gathering pure samples of mitochondrial complexes in plants mean these structures remain largely unknown."

The team set out to obtain 3D structures of three components in the electron transport chain - complex III, complex IV and supercomplex III-IV. They extracted mitochondrial complexes from mung bean sprouts using a gentle detergent, stabilised them, and then used cryo-electron microscopy to generate high-resolution structures. Based on these structures, the team then built atomic models showing how the complexes interact with other molecules, such as other proteins, ions and lipids. For each of the three complexes, they were able to determine the number and structure of subunits, which molecules likely bind to them, and how flexible the structures are.

Their models showed that several aspects of the complexes are shared between plants, mammals, fungi and bacteria, including several components that were originally thought to exist only in plants. However, the team also found several features of the complexes that are unique to plants, including the way the supercomplex III-IV assembles. This is important, because many agricultural herbicides and pesticides are designed to interfere with the respiratory complexes, and this finding could help to make them more selective for the pests they are intended to kill.

"Our work provides high-resolution structures of plant respiratory complexes that reveal plant-specific features, allowing for the development of more selective inhibitors as herbicides and pesticides," concludes senior author James Letts, Assistant Professor at the Department of Molecular and Cellular Biology, UC Davis, US. "Further comparative analyses of these structures with the growing number of respiratory complexes will allow us to understand the fundamental principles of respiration across the tree of life."

Credit: 
eLife

Study in twins identifies fecal microbiome differences in food allergies

image: Analysis of the fecal microbiome and metabolome in healthy and food allergic twins has identified key bacterial changes that may play a role in the condition.

Image: 
Bao et. al.

A new study out of the University of Chicago and Stanford University on pairs of twins with and without food allergies has identified potential microbial players in this condition. The results were published on Jan. 19 in the Journal of Clinical Investigation.

The study grew out of prior research in the Nagler laboratory at UChicago on the fecal microbiota in infants. By transplanting fecal microbes from healthy and food-allergic infants to germ-free mice (which do not possess a microbiome), investigators found that the healthy infant microbiota was protective against the development of food allergies.

"In this study, we looked at a more diverse population across a large range of ages," said Cathryn Nagler, PhD, the Bunning Family Professor in the Pritzker School of Molecular Engineering, the Department of Pathology and the College at UChicago. "By studying twin pairs, we had the benefit of examining genetically identical individuals who grew up in the same environment, which allowed us to begin to parse out the influence of genetic and environmental factors."

After a discussion at a research conference, Nagler and her colleague at Stanford, Kari Nadeau, MD, PhD, decided to collaborate on the project. Nadeau, the Director of the Sean N. Parker Center for Allergy and Asthma Research, had been conducting a study on the epigenetics of food allergies and had already collected fecal samples from study participants. Nagler's lab conducted the sequencing on the samples collected from 13 pairs of twins with and without food allergies, as well as an additional five pairs of twins where both twins had at least one food allergy.

The research team looked at which microbes were present in the fecal samples as well as metabolic products (called metabolites), derived not only from the microbes, but also from host and dietary sources.

"We desperately need biomarkers to understand the immunoregulatory function of intestinal bacteria," said Nagler. "Metabolites give us clues as to what bacteria are doing mechanistically to regulate the immune response."

This approach identified 64 distinct sets of bacterial species and metabolites that set apart the healthy and allergic twin groups. Most of these differentially abundant bacteria were members of the Clostridia class, shown to protect against food allergies in several earlier reports from the Nagler lab. Enrichment of the allergy-protective bacteria in the healthy twins, presumably established in early life, persisted into adulthood despite separation and lifestyle changes. In addition, healthy twins showed enrichment for the diacylglycerol metabolic pathway and two specific bacteria: Phascolarctobacterium faecium and Ruminococcus bromii.

"To narrow down from thousands of bacteria to specific species as candidates for future therapeutic interventions, one dimension of data is not enough - bringing together data from multiple dimensions is the key," said first author Riyue Bao, PhD, now a Research Associate Professor of Medicine at the University of Pittsburgh. "In our study, we harnessed the benefits of both high-throughput microbiome sequencing and metabolic profiling techniques, and were able to nominate two specific species, each involved in distinct metabolite pathways, that can be prioritized as potential targets for future research and therapeutic interventions in food allergies."

"Tons of people will go to Google and they want to know: 'Should I eat yogurt? Should I not eat yogurt? Does my microbiome play a role in my disease?'" said Nadeau. "This research is important as one of key 'bricks' in knowledge of the human microbiome that needs to be laid down to answer these questions. We can't say this is a cause and effect relationship yet, but we can say that there is an association with disease and health. So now we can start to ask, what does this mean?"

While the study only included a small group of participants, researchers are excited by the results and how they can be applied to future projects.

Future research will investigate the specific roles of these bacteria in food allergies; for example, R. bromii is a keystone species in the degradation of resistant starch -- dietary starch that normally escapes digestion. Nagler plans to investigate how dietary supplementation with resistant starch can affect R. bromii's presence in the fecal microbiome, and in turn whether or not it can boost the response to oral immunotherapy, the only currently available treatment for food allergies.

The study, "Fecal microbiome and metabolome differ in healthy and food-allergic twins," was supported by the Sunshine Charitable Foundation, the Moss Family Foundation, NIAID (R56AI134923, R01AI140134) and NHLBI (R01HL118162). Additional authors include Riyue Bao of the University of Chicago (now at the University of Pittsburgh Medical Center), Lauren A. Hesser of UChicago, and Ziyuan He and Xiaoying Zhou of Stanford University.

Credit: 
University of Chicago Medical Center

Disease threatens to decimate western bats

image: A researcher about to handle a hibernating Townsend's big-eared bat in an abandoned mine in Nevada.

Image: 
Kim Raff

BOZEMAN, Montana (January 19, 2021) - A four-year study recently published in Ecology and Evolution concludes that the fungal disease, white-nose syndrome, poses a severe threat to many western North American bats.

Since it was first detected in 2006, white-nose syndrome has killed millions of bats in eastern and central North America. The spread of the fungal pathogen that causes white-nose syndrome in hibernating bats has reached several western U.S. states, most likely through bat-to-bat spread, and is presently threatening western species.

Bats with white-nose syndrome have fungus growing on their nose and wings, as the name implies, but the fungal infection also triggers a higher frequency of arousals from hibernation. Each arousal involves an increase in body temperature from as low as near freezing (when bats use torpor) to an active mammalian body temperature (~98°F or 38°C), which uses a significant amount of energy. Bats have limited fat stored for the winter, and if this is used up before the end of winter, death by starvation occurs.

The researchers' aim was to provide managers with information on which western bat species may suffer high mortality and extinction risk if infected with the disease. To do so they combined an unprecedented field data collection effort with a mechanistic model that explains how energy is consumed during hibernation and how the causal fungus impacts this energy consumption. By comparing their new knowledge of how long bats infected with white-nose syndrome could hibernate against the duration of winter that they would need to hibernate with the disease, the authors predicted survival outcomes for each species. If a bat did not have sufficient energy to live beyond the duration of winter the simulation recorded a mortality.
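The core survival logic can be sketched as a simple energy-budget loop: burn a small amount of fat each day of torpor, pay a large cost at every arousal, and record a mortality if the fat runs out before winter ends. All parameter values below are invented for illustration; the published model is far more detailed and ties arousal frequency to how the fungus grows under the roost's temperature and humidity.

```python
# Stripped-down energy-budget loop; ALL parameter values are invented for
# illustration and do not come from the paper.

def days_survivable(fat_g, torpor_cost_g_per_day, arousal_cost_g,
                    days_between_arousals):
    """Days of hibernation supported by a given fat store."""
    fat, day = fat_g, 0
    while fat > 0:
        fat -= torpor_cost_g_per_day          # cheap: cold torpor
        day += 1
        if day % days_between_arousals == 0:
            fat -= arousal_cost_g             # expensive: rewarming to ~38 °C
    return day

winter_length = 180   # days the bat must bridge (assumed)
scenarios = {"healthy (arousal every 15 d)": 15,
             "infected (arousal every 5 d)": 5}
for label, interval in scenarios.items():
    days = days_survivable(fat_g=2.0, torpor_cost_g_per_day=0.004,
                           arousal_cost_g=0.10, days_between_arousals=interval)
    outcome = "survives" if days >= winter_length else "starves"
    print(f"{label}: fat lasts {days} days -> {outcome} a {winter_length}-day winter")
```

With these made-up numbers the infected bat exhausts its fat well before 180 days, which is exactly the comparison of hibernation capacity against winter duration that the authors describe.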

Three years of intensive fieldwork resulted in 946 bat captures (all released after measuring). Bat energetic measurements paired with hibernaculum environmental data were gathered for nine species that were sampled at eight sites scattered throughout the West (see Figure A). The researchers then assessed how the arrival of white-nose syndrome might affect hibernation energy use, and subsequently each species' ability to survive hibernation with the disease. Combining data on the host, the environment they select for hibernation, and how the pathogen grows at different temperature and humidity conditions the authors simulated how many days infected populations could hibernate under field conditions.

The study revealed there are white-nose syndrome threats to all the small Myotis species examined, including M. ciliolabrum (western small-footed bat), M. evotis (long-eared bat), M. lucifugus (little brown bat), M. thysanodes (fringed myotis), and M. volans (long-legged bat), as well as Perimyotis subflavus (tricolored bat). In comparison, larger species like M. velifer (cave bat), Corynorhinus townsendii (Townsend's big-eared bat) and Eptesicus fuscus (big brown bat) are predicted to be less impacted. Further analysis showed body mass (and relatedly body-fat as these attributes are correlated) as well as hibernaculum water vapor deficit (i.e. relative humidity) explained over half the variation observed in bat survival.

Dr. Catherine Haase, now Assistant Professor of Biology at Austin Peay State University and the study's lead author said: "Our results indicate the need to take a holistic view on conservation, as it is not just one thing that determines survival from white-nose syndrome, but rather the combination of bat, environment, and disease variables."

All of the western bat species studied were insectivores, meaning they prey on insects, including those that are pests to agricultural crops. In addition to providing valuable ecosystem services, they are incredibly fascinating species, from their ability to echolocate to their unique immune system.

Dr. Sarah Olson, Wildlife Conservation Society Health Program co-author and project Principal Investigator said: "This study demonstrates the value of collecting baseline data to pre-emptively understand a threat posed by a wildlife disease, like white-nose syndrome, to western bats, so that more proactive conservation measures can be taken to protect these species. Here, an all hands on deck approach is needed. Western states can take steps now to put protections in place before anticipated severe declines are observed, like reducing habitat loss and restricting access to hibernacula, as well as investing in research and surveillance."

Credit: 
Wildlife Conservation Society

Where do our minds wander? Brain waves can point the way

Anyone who has tried and failed to meditate knows that our minds are rarely still. But where do they roam? New research led by UC Berkeley has come up with a way to track the flow of our internal thought processes and signal whether our minds are focused, fixated or wandering.

Using an electroencephalogram (EEG) to measure brain activity while people performed mundane attention tasks, researchers identified brain signals that reveal when the mind is not focused on the task at hand or aimlessly wandering, especially after concentrating on an assignment.

Specifically, increased alpha brain waves were detected in the prefrontal cortex of more than two dozen study participants when their thoughts jumped from one topic to another, providing an electrophysiological signature for unconstrained, spontaneous thought. Alpha waves are slow brain rhythms whose frequency ranges from 9 to 14 cycles per second.

Meanwhile, weaker brain signals known as P3 were observed in the parietal cortex, further offering a neural marker for when people are not paying attention to the task at hand.
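For readers curious what an alpha-band marker looks like computationally, the snippet below estimates 9-14 Hz power in a single synthetic EEG channel with a standard Welch spectrum. It is a minimal sketch under assumed parameters (sampling rate, signal), not the study's analysis pipeline.

```python
# Estimating alpha-band (9-14 Hz) power in one SYNTHETIC EEG channel.
# The sampling rate and signal are assumptions; the published analysis used
# multi-channel recordings and compared thought categories statistically.
import numpy as np
from scipy.signal import welch

fs = 256                          # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)      # 10 seconds of data
rng = np.random.default_rng(2)
eeg = rng.normal(0, 1.0, t.size) + 2.0 * np.sin(2 * np.pi * 11 * t)  # noise + 11 Hz alpha

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)        # power spectral density
band = (freqs >= 9) & (freqs <= 14)
alpha_power = np.trapz(psd[band], freqs[band])
total_power = np.trapz(psd, freqs)
print(f"alpha power: {alpha_power:.2f} (relative: {alpha_power / total_power:.1%})")
```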

"For the first time, we have neurophysiological evidence that distinguishes different patterns of internal thought, allowing us to understand the varieties of thought central to human cognition and to compare between healthy and disordered thinking," said study senior author Robert Knight, a UC Berkeley professor of psychology and neuroscience.

The findings, published this week in the Proceedings of the National Academy of Sciences journal, suggest that tuning out our external environment and allowing our internal thoughts to move freely and creatively are a necessary function of the brain and can promote relaxation and exploration.

Moreover, EEG markers of how our thoughts flow when our brains are at rest can help researchers and clinicians detect certain patterns of thinking, even before patients are aware of where their minds are wandering.

"This could help detect thought patterns linked to a spectrum of psychiatric and attention disorders and may help diagnose them," said study lead author Julia Kam, an assistant professor of psychology at the University of Calgary. She launched the study as a postdoctoral researcher in Knight's cognitive neuroscience lab at UC Berkeley.

Another co-author on the paper is Zachary Irving, an assistant professor of philosophy at the University of Virginia who explored the psychological and philosophical underpinnings of mind-wandering as a postdoctoral scholar at UC Berkeley.

"If you focus all the time on your goals, you can miss important information. And so, having a free-association thought process that randomly generates memories and imaginative experiences can lead you to new ideas and insights," said Irving, whose philosophical theory of mind-wandering shaped the study's methodology.

Irving worked with Alison Gopnik, a UC Berkeley developmental psychologist and philosophy scholar who is also a co-author of the study.

"Babies and young children's minds seem to wander constantly, and so we wondered what functions that might serve," Gopnik said. "Our paper suggests mind-wandering is as much a positive feature of cognition as a quirk and explains something we all experience."

To prepare for the study, 39 adults were taught the difference between four different categories of thinking: task-related, freely moving, deliberately constrained and automatically constrained.

Next, while wearing electrodes on their heads that measured their brain activity, they sat at a computer screen and tapped left or right arrow keys to correspond with left and right arrows appearing in random sequences on the screen.

When they finished a sequence, they were asked to rate, on a scale of one to seven, whether their thoughts during the task had been related to the task, freely moving, deliberately constrained or automatically constrained.

One example of thoughts unrelated to the task and freely moving would be if a student, instead of studying for an upcoming exam, found herself thinking about whether she had received a good grade on an assignment, then realized she had not yet prepared dinner, and then wondered if she should exercise more, and ended up reminiscing about her last vacation, Kam said.

The responses to the questions about thought processes were then divided into the four groups and matched against the recorded brain activity.

When study participants reported having thoughts that moved freely from topic to topic, they showed increased alpha wave activity in the brain's frontal cortex, a pattern linked to the generation of creative ideas. Researchers also found evidence of lesser P3 brain signals during off-task thoughts.

"The ability to detect our thought patterns through brain activity is an important step toward generating potential strategies for regulating how our thoughts unfold over time, a strategy useful for healthy and disordered minds alike," Kam said.

Credit: 
University of California - Berkeley

Counting elephants from space

image: Elephants in woodland as seen from space. Green rectangles show elephants detected by the algorithm, red rectangles show elephants verified by humans.

Image: 
Satellite image (c) 2020 Maxar Technologies

For the first time, scientists have successfully used satellite cameras coupled with deep learning to count animals in complex geographical landscapes, taking conservationists an important step forward in monitoring populations of endangered species.

For this research, the satellites WorldView-3 and WorldView-4 captured high-resolution imagery of African elephants moving through forests and grasslands. The automated system detected animals with the same accuracy that human observers achieve.

The algorithm that enabled the detection process was created by Dr Olga Isupova, a computer scientist at the University of Bath. The project was a collaboration with the University of Oxford and the University of Twente in the Netherlands.

Dr Isupova said the new surveying technique allows vast areas of land to be scanned in a matter of minutes, offering a much-needed alternative to human observers counting individual animals from low-flying airplanes. As it sweeps across the land, a satellite can collect over 5,000 km² of imagery in a matter of minutes, eliminating the risk of double counting. Where necessary (for instance, when there is cloud coverage), the process can be repeated the next day, on the satellite’s next visit.
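Comparing algorithm output against human-verified labels, as in the image caption above, comes down to matching detections and counting hits. The toy example below does this with made-up coordinates and a greedy nearest-neighbour match; it illustrates the idea only and is not the evaluation used in the study.

```python
# Toy scoring of detector output against human-verified labels using a greedy
# nearest-neighbour match. Coordinates and the distance threshold are made up.
import numpy as np

def match_detections(pred, truth, max_dist=5.0):
    """Count one-to-one matches between predicted and true positions (metres)."""
    remaining = list(range(len(truth)))
    true_positives = 0
    for p in pred:
        if not remaining:
            break
        dists = [np.linalg.norm(np.subtract(p, truth[i])) for i in remaining]
        best = int(np.argmin(dists))
        if dists[best] <= max_dist:
            true_positives += 1
            remaining.pop(best)               # each true elephant can match once
    return true_positives

pred = [(10, 12), (40, 41), (80, 75), (200, 10)]    # algorithm detections (green boxes)
truth = [(11, 12), (39, 42), (81, 76), (150, 150)]  # human-verified elephants (red boxes)

tp = match_detections(pred, truth)
print(f"precision={tp / len(pred):.2f}  recall={tp / len(truth):.2f}")   # 0.75 / 0.75
```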

The population of African elephants has nose-dived over the past century, mainly due to poaching and habitat fragmentation. With approximately 415,000 African savannah elephants left in the wild, the species is classified as endangered.

“Accurate monitoring is essential if we’re to save the species,” said Dr Isupova. “We need to know where the animals are and how many there are.”

Satellite monitoring eliminates the risk of disturbing animals during data collection and ensures humans are not hurt in the counting process. It also makes it simpler to count animals moving from country to country, as satellites can orbit the planet without regard for border controls or conflict.

This study was not the first to use satellite imagery and algorithms to monitor species, but it was the first to reliably monitor animals moving through a heterogeneous landscape – that is, a backdrop that includes areas of open grassland, woodland and partial coverage.

“This type of work has been done before with whales, but of course the ocean is all blue, so counting is a lot less challenging,” said Dr Isupova. “As you can imagine, a heterogeneous landscape makes it much harder to identify animals.”

The researchers believe their work demonstrates the potential of technology to support conservationists in their efforts to protect biodiversity and to slow the progress of the sixth mass extinction – the ongoing extinction event triggered by human activity.

“We need to find new state-of-the-art systems to help researchers gather the data they need to save species under threat,” said Dr Isupova.

African elephants were chosen for this study for good reason – they are the largest land animal and therefore the easiest to spot. However, Dr Isupova is hopeful that it will soon be possible to detect far smaller species from space.

“Satellite imagery resolution increases every couple of years, and with every increase we will be able to see smaller things in greater detail,” she said, adding: “Other researchers have managed to detect black albatross nests against snow. No doubt the contrast of black and white made it easier, but that doesn’t change the fact that an albatross nest is one-eleventh the size of an elephant.”

Credit: 
University of Bath

5G doesn't cause COVID-19, but the rumor it does spread like a virus

image: Researchers found that COVID-19 misinformation spread exponentially across the countries, much like the coronavirus itself.

Image: 
Photo by Frederik Lipfert on Unsplash

People's fear of 5G technology is rational. Such technology does emit radiation, even if it's at low levels. But 5G isn't all that different from 4G, and it certainly doesn't cause COVID-19 despite such rumors having spread rapidly across the globe.

Researchers need to better understand how misinformation like this spreads in order to hone their intervention efforts and prevent misinformed perspectives from taking root. In society's virtual world, preventing technological misinformation, in particular, is important now more than ever.

A research team led by Elaine Nsoesie, a Hariri Institute Faculty Fellow, investigated how COVID-19 misinformation proliferated using the same epidemiological techniques for modeling disease transmission. Nsoesie, along with Nina Cesare, a postdoctoral associate at the BU School of Public Health, and other scientists from Harvard Medical School and École Polytechnique Fédérale recently published their findings in the Journal of Medical Internet Research.

The team examined the spread of COVID-19 misinformation across eight English-speaking countries, including the United States, using Google Trends. The researchers focused on myths that the World Health Organization (WHO) "busted" on its website including the relationships between COVID-19 and alcohol, ginger root, the sun, 5G, and hydroxychloroquine.

What Nsoesie and colleagues found was that some COVID-19 misinformation spread exponentially across the countries, much like the coronavirus itself.
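In practice, "spread exponentially" can be quantified by fitting a growth rate to a search-interest time series of the kind Google Trends returns (values scaled 0-100). The sketch below does this on fabricated weekly values, purely to illustrate the epidemiological framing.

```python
# Fitting an exponential growth rate to a weekly search-interest series.
# The values below are FABRICATED, not the study's data.
import numpy as np

weeks = np.arange(8)
interest = np.array([2, 3, 5, 9, 16, 30, 55, 100], dtype=float)   # made-up data

# log(interest) = a + r * t, so the slope r is the weekly exponential growth rate
r, a = np.polyfit(weeks, np.log(interest), 1)
print(f"growth rate: {r:.2f}/week, doubling time: {np.log(2) / r:.1f} weeks")
```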

This rapid proliferation isn't surprising. Most people were scrambling for any sort of information on the mysterious virus in the early months of 2020. "There was such a rapid proliferation of any information at the onset of the pandemic that misinformation had a golden opportunity to enter the public conscience," said Cesare.

Thankfully, debunking myths online seems effective in stopping their spread. As soon as public health officials at WHO responded to COVID-19 misinformation on the WHO website, the number of Google searches for that misinformation dropped significantly.

But, the team was surprised that there seems to be a consistent, global misunderstanding of 5G technology. The myth of "COVID-19 and 5G" spread faster than any of the other rumors they investigated. "I didn't expect 5G to stand out among the misinformation as much as it did," said Nsoesie.

What makes this even more surprising is that 5G technology isn't brand new. Rather, it's a continued development, based on international standards, of the communication technologies preceding it, like 4G.

"5G is the new standard for communication technology. It allows for faster communication by using different frequencies and multiple antennas," said David Starobinski, a professor in Boston University's Department of Electrical and Computer Engineering. "It is an evolution of communication technology rather than a revolution," he said.

Even though 5G technology isn't entirely new, there are a few reasons why people might have believed it causes COVID-19.

For one, there is very little transparency from researchers in communication technologies, which fosters institutional distrust. "I think the belief has something to do with a certain distrust in government and the ability to tie this narrative about 5G technology into conversations around government surveillance," said Cesare. This distrust is a concern even now, as myths around microchips being put into vaccines explode on Facebook.

Another explanation for why folks might associate 5G with COVID-19 is that such technology emits invisible electromagnetic waves that people fear could impact their health. "People are much more worried about things [like radiation] that they cannot see," said Starobinski.

While exposure to high-power radiation can be harmful to health, Starobinski says that safety guidelines govern the radiation from communication technologies and that 5G should be safe to use. "People have been using smartphones for years and we don't see evidence that this radiation has caused noticeable increase in diseases or hospitalizations due to usage," he said. He also noted that "regulators have set limits on the radiation power of 5G devices, though additional safety studies may still be warranted."

And, such radiation can't cause COVID-19. COVID-19 is a viral disease that comes from the coronavirus known as SARS-CoV-2.

To stop the spread of similar myths in the future, experts need to consistently and clearly correct common misconceptions. And better transparency from both government bodies and researchers could prevent misinformation from ever taking root.

"We [researchers] need to humanize the conversations around misinformation and continue to share true information so that misinformation becomes less prevalent in the media," said Nsoesie.

Credit: 
Boston University