Culture

Low-calorie diet and mild exercise improve survival for young people with leukemia

In some cancers, including leukemia in children and adolescents, obesity can negatively affect survival outcomes. Obese young people with leukemia are 50% more likely to relapse after treatment than their lean counterparts.

Now, a study led by researchers at UCLA and Children's Hospital Los Angeles has shown that a combination of modest dietary changes and exercise can dramatically improve survival outcomes for those with acute lymphoblastic leukemia, the most common childhood cancer.

The researchers found that patients who reduced their calorie intake by 10% or more and adopted a moderate exercise program immediately after their diagnosis had, on average, 70% less chance of having lingering leukemia cells after a month of chemotherapy than those not on the diet-and-exercise regimen.

Lingering cancer cells in the bone marrow, which are more likely to be found in overweight individuals, are associated with worse survival and a higher risk of relapse, often leading to more intense treatments like bone marrow transplants and immunotherapy.

"We tested a very mild diet because this was our first time trying it, and the first month of treatment is already so difficult for patients and families," said senior author Dr. Steven Mittelman, chief of pediatric endocrinology at UCLA Mattel Children's Hospital and a member of the UCLA Jonsson Comprehensive Cancer Center. "But even with these mild changes in diet and exercise, the intervention was extremely effective in reducing the chance of having detectable leukemia in the bone marrow."

As part of the clinical trial, which took place at Children's Hospital Los Angeles, researchers worked with registered dieticians and physical therapists to create personalized 28-day interventions for 40 young people between the ages of 10 and 21 who were newly diagnosed with acute lymphoblastic leukemia.

The intervention was designed to cut participants' calorie intake by a minimum of 10% in order to reduce both fat gain and lean muscle loss. The physical activity component included a target level of 200 minutes per week of moderate exercise.

The results, published in the American Society of Hematology's journal Blood Advances, showed a decrease in fat gain among those who were overweight and obese, as well as improved insulin sensitivity and an increase in the beneficial hormone adiponectin, which is involved in regulating glucose and breaking down fatty acids. Most importantly, the researchers found a 70% decrease in the chance of having lingering leukemia cells in the bone marrow -- known medically as minimal residual disease -- when compared with a historical control group.

"We hoped the intervention would improve outcomes, but we had no idea it would be so effective," Mittelman said. "We can't really add more toxic chemotherapies to the intense initial treatment phase, but this is an intervention that likely has no negative side effects. In fact, we hope it may even reduce toxicities caused by chemotherapy."

"This is the first trial to test a diet-and-exercise intervention to improve treatment outcomes from a childhood cancer," said principal investigator and lead author Dr. Etan Orgel, Director of the Medical Supportive Care Service in the Cancer and Blood Disease Institute at Children's Hospital Los Angeles. "This is an exciting proof-of-concept, which may have great implications for other cancers as well."

Next steps include further testing of the approach in a multicenter randomized trial, which will be launched later this year, the researchers said.

Credit: 
University of California - Los Angeles Health Sciences

Possible trigger for Crohn's disease identified

image: The team's findings were led by postdoctoral fellow Wael Elhenawy, McMaster University

Image: 
McMaster University

Hamilton, ON (April 1, 2021) - People living with the often-debilitating effects of Crohn's disease may finally gain some relief, thanks to ground-breaking research led by McMaster University.

McMaster investigator Brian Coombes said his team identified a strain of adherent-invasive E. coli (AIEC) that is strongly implicated in the condition and is often found in the intestines of people with Crohn's disease.

"If you examine the gut lining of patients with Crohn's disease, you will find that around 70 to 80 per cent of them test positive for AIEC bacteria, but one of the things we don't understand is why," said Coombes, professor and chair of the Department of Biochemistry and Biomedical Sciences, and the Canada Research Chair in Infectious Disease Pathogenesis.

"We believe that AIEC is a potential trigger of Crohn's disease."

By mutating every gene in a particular strain of AIEC and testing how those mutants grow in mice, the researchers were able to pinpoint which genes allowed the bacteria to freely colonize the gut linings of people with Crohn's disease.

AIEC bacteria grow in a biofilm that coats cells lining the intestinal wall, protecting them from both the immune system and antibiotics. In this research, the team identified a critical protein structure on the surface of the bacteria that allows them to grow in biofilms.

Coombes said Crohn's disease is caused by the immune system's inability to "switch off" its inflammatory response to gut bacteria. Symptoms include severe diarrhoea, fatigue, weight loss and malnutrition.

Current treatments focus on easing the inflammation, but do not address the root cause of the condition.

"New therapies are on the way - we are one step closer to figuring out how this Crohn's disease-associated bacteria lives in the gut and when we do that, we can develop new treatments," said Coombes.

The team's findings, led by postdoctoral fellow Wael Elhenawy, were published in Nature Communications.

Credit: 
McMaster University

How brain cells repair their DNA reveals "hot spots" of aging and disease

image: In this image of a neuron nucleus, bright spots show areas of focused genetic repair.

Image: 
Salk Institute/Waitt Advanced Biophotonics Center

LA JOLLA--(April 1, 2021) Neurons lack the ability to replicate their DNA, so they're constantly working to repair damage to their genome. Now, a new study by Salk scientists finds that these repairs are not random, but instead focus on protecting certain genetic "hot spots" that appear to play a critical role in neural identity and function.

The findings, published in the April 2, 2021, issue of Science, give novel insights into the genetic structures involved in aging and neurodegeneration, and could point to potential new therapies for diseases such as Alzheimer's, Parkinson's and other age-related dementias.

"This research shows for the first time that there are sections of genome that neurons prioritize when it comes to repair," says Professor and Salk President Rusty Gage, the paper's co-corresponding author. "We're excited about the potential of these findings to change the way we view many age-related diseases of the nervous system and potentially explore DNA repair as a therapeutic approach."

Unlike other cells, neurons generally don't replace themselves over time, making them among the longest-living cells in the human body. Their longevity makes it even more important that they repair lesions in their DNA as they age, in order to maintain their function over the decades of a human life span. As they get older, neurons' ability to make these genetic repairs declines, which could explain why people develop age-related neurodegenerative diseases like Alzheimer's and Parkinson's.

To investigate how neurons maintain genome health, the study authors developed a new technique they term Repair-seq. The team produced neurons from stem cells and fed them synthetic nucleosides--molecules that serve as building blocks for DNA. These artificial nucleosides could be found via DNA sequencing and imaged, showing where the neurons used them to make repairs to DNA that was damaged by normal cellular processes. While the scientists expected to see some prioritization, they were surprised by just how focused the neurons were on protecting certain sections of the genome.
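The article does not spell out how repair "hot spots" are identified computationally, but conceptually it amounts to finding genomic windows where the repair-tagged sequencing signal rises far above background. The sketch below is only an illustration of that idea, not the authors' pipeline; the window size, Poisson background model and threshold are all assumptions.

```python
# Illustration only (not the authors' pipeline): call "repair hot spots" as
# fixed-size genomic windows whose repair-tagged read counts rise far above a
# Poisson background. Window size, background model and threshold are assumed.
import numpy as np
from scipy.stats import poisson

def call_hotspots(coverage, window=1000, alpha=1e-6):
    """coverage: per-base counts of repair-tagged reads along one chromosome."""
    n_windows = len(coverage) // window
    windowed = coverage[: n_windows * window].reshape(n_windows, window).sum(axis=1)
    background = np.median(windowed)                        # typical signal outside hot spots
    pvals = poisson.sf(windowed - 1, mu=max(background, 1))  # P(count >= observed)
    return [int(i) * window for i in np.flatnonzero(pvals < alpha)]

# Toy data: flat background plus one enriched region starting at position 50,000.
rng = np.random.default_rng(0)
cov = rng.poisson(0.05, size=200_000)
cov[50_000:51_000] += rng.poisson(2.0, size=1_000)
print(call_hotspots(cov))   # reports the window covering the enriched region
```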

"What we saw was incredibly sharp, well-defined regions of repair; very focused areas that were substantially higher than background levels," says co-first and co-corresponding author Dylan Reid, a former Salk postdoctoral scholar and now a fellow at Vertex Pharmaceuticals. "The proteins that sit on these 'hot spots' are implicated in neurodegenerative disease, and the sites are also linked to aging."

The authors found approximately 65,000 hot spots that covered around 2 percent of the neuronal genome. They then used proteomics approaches to detect what proteins were found at these hot spots, implicating many splicing-related proteins. (These are involved in the eventual production of other proteins.) Many of these sites appeared to be quite stable when the cells were treated with DNA-damaging agents, and the most stable DNA repair hot spots were found to be strongly associated with sites where chemical tags attach ("methylation") that are best at predicting neuronal age.

Previous research has focused on identifying the sections of DNA that suffer genetic damage, but this is the first time researchers have looked for where the genome is being heavily repaired.

"We flipped the paradigm from looking for damage to looking for repair, and that's why we were able to find these hot spots," Reid says. "This is really new biology that might eventually change how we understand neurons in the nervous system, and the more we understand that, the more we can look to develop therapies addressing age-related diseases."

Gage, who holds the Vi and John Adler Chair for Research on Age-Related Neurodegenerative Disease, adds, "Understanding which areas within the genome are vulnerable to damage is a very exciting topic for our lab. We think Repair-seq will be a powerful tool for research, and we continue to explore additional new methods to study genome integrity, particularly in relation to aging and disease."

Credit: 
Salk Institute

A gender gap in negotiation emerges between boys and girls as early as age eight

image: A gender gap in negotiation emerges in girls as young as eight, according to the latest research from Boston College's Katherine McAuliffe, director of the Cooperation Lab.

Image: 
Boston College

Chestnut Hill, MA (4/1/2021) - A gender gap in negotiation emerges as early as age eight, a finding that sheds new light on the wage gap women face in the workforce, according to new research from Boston College's Cooperation Lab, led by Associate Professor of Psychology and Neuroscience Katherine McAuliffe.

The study of 240 boys and girls between ages four and nine, published recently in the journal Psychological Science, found the gap appears when girls who participated in the study were asked to negotiate with a male evaluator, a finding that mirrors the dynamics of the negotiation gap that persists between men and women in the workforce.

The researchers say this study is the first to identify a gender gap in negotiation among children.

"We found that--consistent with adult work--girls asked for less than boys when negotiating with a man," said McAuliffe. "We did not see this gender gap when children were negotiating with a woman. There is still much more work to be done, but one thing this tells us is that we should be teaching young girls to advocate for themselves in the context of negotiation from as early as elementary school."

The results point to a disparity that takes root in childhood and may help explain the well-documented gender gap in pay between women and men, said McAuliffe. She co-authored the study with New York University graduate student Sophie Arnold, who spent two summers working in McAuliffe's Cooperation Lab, which studies how attributes like fairness develop in children.

McAuliffe and Arnold say the findings reveal a need to determine if cultural signals sent to girls are to blame and whether there should be interventions in childhood to ensure that both girls and boys feel comfortable advocating for themselves, regardless of the gender of the person they are talking with.

"Boys and girls performed equally between the ages of four and seven," said Arnold, whose work on the project became her undergraduate honors thesis at the University of Chicago. "But by age eight, we see girls are requesting less than boys when they are negotiating with a man. It's not that girls are negotiating less overall - it's only when children are negotiating with a man that we see these gender differences emerge."

Prior studies of adults have found gender differences in negotiation, but could not explain their origins, said the authors. The researchers tested 240 children between the ages of 4 and 9 to determine whether girls in the age range would negotiate differently than boys and whether girls, like women, would ask for less from a man than a woman.

"Although adult research supports the existence of gender differences in negotiation, it cannot reveal their origins. Learning when these gender differences emerge is imperative for understanding what individual and societal factors lead to these gender differences in adulthood," write Arnold and McAuliffe.

The experiments were modeled on real-world scenarios, the authors said. In phase one, children completed a task and then were allowed to ask for as many stickers as they liked as a reward. Unbeknownst to them, the evaluator would always accept requests for two stickers or less, and always reject requests above that threshold. If a request was rejected, negotiations would progress to the next phase.

In the second phase, the evaluator explained the rules of negotiation to the children and spelled out how their next sticker requests could be handled in a way that highlighted the risk and reward inherent to negotiations.
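The acceptance rule itself is simple enough to state precisely: requests of two stickers or fewer are always granted, and anything larger is refused and moves the child into the negotiation phase. The snippet below encodes that rule as described in the article; the handling of an unsuccessful second request and the example requests are illustrative assumptions, since the article does not specify how failed renegotiations were resolved.

```python
# Minimal sketch of the evaluator rule described above. The two-sticker
# threshold and the two-phase structure come from the article; the outcome of
# a failed second request and the example values are made-up assumptions.
ACCEPT_THRESHOLD = 2   # evaluator accepts requests of two stickers or fewer

def run_trial(initial_request, renegotiated_request=None):
    """Return (stickers awarded, whether the negotiation phase was reached)."""
    if initial_request <= ACCEPT_THRESHOLD:
        return initial_request, False          # accepted outright
    # Rejected: the rules of negotiation are explained and the child may re-ask.
    if renegotiated_request is not None and renegotiated_request <= ACCEPT_THRESHOLD:
        return renegotiated_request, True
    return 0, True                             # assumed: final request still rejected

print(run_trial(2))      # (2, False) -> modest ask, accepted immediately
print(run_trial(5, 2))   # (2, True)  -> bold ask, then successful negotiation
print(run_trial(5, 4))   # (0, True)  -> bold ask, negotiation fails
```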

The results revealed a significant interaction between age, participant gender, and evaluator gender. For girls, the number of stickers initially requested decreased with age when the evaluator was a male. When the evaluator was female, girls requested more stickers with age.

Boys, on the other hand, made requests that did not change with age or the gender of the evaluator.

In measuring persistence, a proxy for negotiation length, the study found that girls' persistence decreased with age when the evaluator was male, but remained the same with age when the evaluator was female. For boys, persistence did not change with age or depending on the gender of the evaluator.

"Our study is the first to show that gender differences in negotiation emerge in childhood," the co-authors write. "Older girls in our paradigm differed from boys in how much they were willing to ask for from a male evaluator, showing a striking resemblance to findings in adults. Girls are not asking for less from both evaluators but instead are negotiating less only with the male evaluator."

"Our findings add negotiation to the list of early-developing gendered behaviors that may lead to and magnify the wage gap in adulthood," the researchers report.

McAuliffe said the next steps in this area of study include trying to determine why these differences emerge.

She and Arnold highlighted two potential avenues for future research: status and gender stereotypes. The gender differences observed in this study may not be unique to gender but may reflect a more general dynamic driven by the perceived status of social groups: individuals perceived to belong to a lower-status group (in the US, with respect to gender, girls and women) request less from individuals perceived to belong to a higher-status group (men).

With respect to gender stereotypes, it could be the case that girls are adopting gender stereotypes about communality or modesty that in turn shape their behavior with men. More research is needed to understand how these avenues contribute to the development of gender differences in negotiation.

The challenge for the next steps in this research is to clarify exactly what explains the development of this behavior and to develop interventions that could help remedy the imbalance, said McAuliffe.

Credit: 
Boston College

Stem cell transplants prevent relapses of most common childhood cancer

image: Daniel "Trey" Lee, MD, is conducting pioneering clinical research in the battle against childhood cancer. Lee is a pediatric oncologist and director of Pediatric Stem Cell Transplant and Immunotherapy at UVA Children's and the UVA Cancer Center.

Image: 
Dan Addison | UVA Communications

Children and young adults who receive CAR T-cell therapy for the most common childhood cancer - acute lymphoblastic leukemia - suffer remarkably fewer relapses and are far more likely to survive when the treatment is paired with a subsequent stem cell transplant, a new study finds.

The research, with an average follow up of nearly five years, suggests that stem cell transplants offer long-term benefits for young patients who receive the cutting-edge immunotherapy. CAR T-cell therapy results in complete remission in 60%-100% of patients initially, but the relapse rate is high. However, among those who received a stem cell transplant after CARs, the relapse rate was less than 10% two years later.

"More than 50% of kids in other studies with a different CAR relapse, with the majority of them losing the target the CAR goes after," said researcher Daniel "Trey" Lee, MD, a pediatric oncologist and Director of Pediatric Stem Cell Transplant and Immunotherapy at UVA Children's and the UVA Cancer Center. "Most of these kids have a single shot at this life-saving and paradigm-changing therapy called CAR T-cells. We should do all we can to maximize the chance for a cure, and right now that means a transplant after CAR therapy for most."

About CAR T-Cell Therapy

Chimeric antigen receptor (CAR) T-cell therapy takes a person's own immune cells and genetically modifies them to make them more effective cancer killers.

The approach has shown high remission rates - up to 100% - 28 days after it is given to children and young adults with B-cell acute lymphoblastic leukemia. But a significant number relapse, the limited data available suggests. One study found that more than 40% had relapsed 13.1 months later.

To determine if stem cell transplants could help, the new study looked at outcomes in 50 children and young adults, ages 4 to 30. The median age was 13.5 years.

Among the 21 who received an allogeneic stem cell transplant after CARs, only 9.5% had relapsed 24 months later. In comparison, all of those who did not receive a stem cell transplant had relapsed.

"Even as impactful as CAR T-cell therapy is for children with relapsed leukemia, we now know that the best outcomes happen when the child undergoes a stem cell transplant afterwards," Lee said. "Many parents turn to CAR T-cells to possibly avoid a stem cell transplant, and that is entirely understandable. But there is a window of opportunity after CARs to cure more of these incurable kids with a transplant; our study demonstrates this." His hope for the future is for the field to be able to distinguish up front those who need a transplant and those who do not.

Credit: 
University of Virginia Health System

HKBU-led research reveals hyocholic acids are promising agents for diabetes prediction and treatment

image: Professor Jia Wei, Chair Professor of the School of Chinese Medicine at HKBU, has revealed that hyocholic acid and its derivatives are a promising risk indicator of type 2 diabetes.

Image: 
Hong Kong Baptist University

A series of studies led by researchers from Hong Kong Baptist University (HKBU) have revealed that hyocholic acid and its derivatives (collectively known as HCAs), a component of bile acids that facilitate fat digestion, are a promising risk indicator of type 2 diabetes. The strong efficacy of HCAs in regulating blood glucose levels and protecting against diabetes has also been uncovered. The findings open a window for the development of HCA-based predictive markers as well as anti-diabetic drugs.

The research results have been published in the international scientific journals Cell Metabolism and Nature Communications.

High concentration of HCAs protects pigs from diabetes

Inspired by the traditional Chinese medical book Compendium of Materia Medica, which recorded the use of pig bile to treat excessive thirst, a condition known today as diabetes, Professor Jia Wei, Chair Professor of the School of Chinese Medicine at HKBU, led research teams to conduct a series of studies on the role of HCAs in glucose homeostasis and diabetes prevention.

Diabetes is characterised by high blood glucose levels. Through a series of tests conducted on 55 humans, 32 mice and 12 pigs, Professor Jia's team confirmed that fasting blood glucose levels in pigs are significantly lower than those of humans and mice. As HCAs constitute nearly 80% of bile acids in pigs, while the proportions in humans and mice are only about 2% and 3% respectively, a negative correlation between HCAs and blood glucose levels was observed.

The result indicates the potential role of HCAs in the maintenance of stable glucose levels. This may explain why pigs, unlike humans, seldom suffer from diabetes despite their low physical activity levels and consumption of a calorie-rich diet.

HCAs correlate with diabetes and metabolic health

To analyse the correlation between the levels of HCAs and the occurrence of diabetes in humans, data was collected from two large-scale cohort studies, namely the Shanghai Obesity Study and the Shanghai Diabetes Study. The researchers examined the serum bile acid profiles of 1,107 participants of the Shanghai Obesity Study, which was published in 2013. The participants were divided into three groups: healthy lean, healthy obese and obese with type 2 diabetes. It was discovered that the levels of serum HCAs were significantly lower in the healthy obese and obese with type 2 diabetes groups.

In another study, the serum bile acids of 132 participants of the Shanghai Diabetes Study were investigated. They were all healthy (at baseline) when they were enrolled in the study between 1998 and 2001. Ten years later, 86 of them had become metabolically unhealthy, while 46 remained healthy. Analysis showed that, compared with those who remained healthy ten years later, those who had become metabolically unhealthy had significantly lower baseline levels of serum HCAs, illustrating that levels of HCAs are a strong predictor of metabolic syndromes such as diabetes.

HCAs regulate blood glucose levels in animal models

Through a series of laboratory experiments, the researchers looked further into the mechanisms that underpin the key role that HCAs play in regulating blood glucose levels. In an animal model experiment, the researchers suppressed the synthesis of HCAs in the livers of a group of pigs by around 30%, and they found that their blood glucose levels increased by 30% when compared with the control group. HCAs were then given to the pigs, after which their blood glucose levels eased off.

Another experiment conducted by the researchers focused on the effect of HCAs on glucagon-like peptide-1 (GLP-1). GLP-1 is a hormone produced by L-cells, a type of enteroendocrine cell that enhances insulin secretion and decreases blood glucose. In a laboratory setting, different kinds of bile acids, including HCAs, were applied to L-cells, at varying levels of concentration. Results showed that at a high concentration of 50 micromolar, HCAs were the most effective at stimulating GLP-1 secretion when compared with other types of bile acids. The findings also revealed that HCAs regulate blood glucose levels by stimulating the secretion of GLP-1 and thus insulin production.

Potential for diabetes prediction and treatment

"The results of our studies provide evidence of how HCAs help to regulate blood glucose levels, and they have revealed the mechanism of how it is achieved at a cellular level. HCAs demonstrate promising potential, and they could be developed into an agent for the prediction and treatment of type 2 diabetes," said Professor Jia.

"As gut microbiota can regulate the metabolism of HCAs, targeting the intestines instead of the pancreas could be a prospective novel strategy for treating diabetes. We will further investigate how to increase the secretion levels of HCAs in diabetic patients by regulating the intestinal bacteria," he added.

Credit: 
Hong Kong Baptist University

Poor judgment of autistic adults

Autistic adults can be wrongly perceived as deceptive and lacking credibility, Flinders University researchers say, a misperception that works against many caught up in the legal system.

Ahead of World Autism Awareness Day (2 April 2021), a new paper in the Journal of Autism and Developmental Disorders describes a study in which 1,410 members of the public responded to video recordings of 30 adults with Autism Spectrum Disorder (ASD) and 29 non-ASD individuals, in order to examine whether stereotypical behaviors associated with autism influenced people's perceptions of the individuals.

Common behaviors include gaze aversion, repetitive body movements, literal interpretations of figurative language and poor reciprocity.

Co-author Flinders Professor Robyn Young, author of Crime and Autism Spectrum Disorder: Myths and Mechanisms (2015) with Emeritus Professor Neil Brewer, says "it's unfortunate that many of the behaviors that are believed to be portrayed by people who are being deceptive, often erroneously, are also commonly seen among people on the autism spectrum".

These behaviors can therefore disadvantage a person who has autism when they interact with the criminal judicial system, the Flinders University professors argue.

Professor Young regularly consults with the legal system to educate judges and juries about ASD so that an autistic person's presentation will not be misinterpreted by people who do not understand their condition. "We have now extracted recent statistics suggesting that sentences of autistic people are on average higher than their non-autistic peers who have committed similar offences," she says.

"If you ask most people how they determine if someone is not telling the truth, they will often refer to lack of eye contact or fidgety behavior," says lead author Dr Alliyza Lim.

"Even though our study actually showed that none of these behaviors individually were directly linked to a person being considered as less credible and more deceptive, overall autistic people are considered less reliable than their non-ASD peers."

"Autistic individuals may be erroneously perceived as deceptive and lacking credibility" (2021), by A. Lim, R.L. Young and N. Brewer, has been published in the Journal of Autism and Developmental Disorders (Springer Nature), DOI: 10.1007/s10803-021-04963-4.

Autism, or ASD, refers to a broad range of conditions characterised by challenges with social skills, repetitive behaviours, speech and non-verbal communication. Autism affects an estimated 1 in 160 children and the WHO is calling for more intervention services and broader actions for making physical, social and attitudinal environments more accessible, inclusive and supportive for people with ASD.

An Autism Awareness Australia report (2020) revealed that 60.5% of surveyed adults with ASD had communication issues and 48% had behaviour issues that led them to seek help.

Credit: 
Flinders University

Large study identifies new genetic link to male infertility

image: The prevalence of the Y-chromosomal haplogroup R1a1-M458 carrying a fixed r2/r3 inversion.
(A) Geographical distribution of haplogroup R1a1-M458 and its sub-lineages in Europe. Pie charts indicate populations, with the black sector showing the proportion of R1a1-M458 according to Underhill et al., 2015. (B) The estimated proportion of subjects among idiopathic cases with severe spermatogenic failure (sperm counts 0-10 × 10⁶/ejaculate) carrying R1a1-M458 Y lineage (and its sub-lineages) chromosomes that have undergone a subsequent partial AZFc deletion. The prevalence was estimated using reported population frequencies of R1a1-M458, including Estonians (Underhill et al., 2015) and data available in the current study for Estonian men with spermatogenic failure. Estonians are shown in bold and with a striped filling.

Image: 
University of Tartu

The findings, published in eLife, show that men carrying a newly described, unstable subtype of the Y chromosome have a significantly increased risk of genomic rearrangements. These rearrangements disrupt sperm production (spermatogenesis), and consequently these men can be up to nine times more likely to have fertility issues. Molecular diagnostics of this genetic variant could help identify those at higher risk in their early adulthood, giving them the chance to make decisions about future family planning early on. Currently, the exact cause of infertility remains unknown in more than half of men with spermatogenic impairment.

In the large-scale study led by geneticist Pille Hallast and conducted in cooperation between the Human Genetics Research Group of the University of Tartu Institute of Biomedicine and Translational Medicine, the Andrology Centre of Tartu University Hospital and the Wellcome Sanger Institute, the Y chromosomes of more than 2,300 Estonian men were analysed. The men were involved in the study at the Andrology Centre of Tartu University Hospital by the team of Professor Margus Punab.

The Y-chromosomal region studied in this paper has been associated with male infertility before, but this is the first time the genetic variation of such a large clinical sample has been investigated. According to Hallast, this is the most sophisticated in-depth analysis of the genetic variation of the Y chromosome in patients with spermatogenic impairment.

One of the senior co-authors of the study, Professor of Human Genetics of the University of Tartu Maris Laan said that the previously undescribed subtype of Y chromosome discovered in cooperation with colleagues from the UK causes a high risk of severe spermatogenic impairment. "At one point during evolution, the 'ancestor' of these Y chromosomes experienced the inversion of a long DNA segment, causing instability in this Y chromosome and predisposing to the deletion of surrounding DNA segments," explained Laan.

This inverted Y chromosome variant is relatively common and does not always lead to partial deletion of DNA or fertility issues. Therefore, it can be passed down in families and remain unnoticed until an offspring has infertility issues. According to estimates, in this Y lineage, the deletion of genetic material that harms spermatogenesis occurs in about one man in ten or even less often. This subtype is carried by a significant number of men of European descent, but it is not found as widely in other continents. In Estonia, it can be found in 5-6% of men, but in Poland and Czechia, the figure is nearly 20%.

Dr. Chris Tyler-Smith, a leading population geneticist and collaborator at the Wellcome Sanger Institute, underlined the need for more synergistic research between teams in population and reproductive genetics to map unfavourable Y-chromosomal variants linked to impaired spermatogenesis and to understand how these variants have evolved and survived demographically.

Professor Maris Laan emphasised the potential practical value of this research for molecular diagnostics, as early identification of the genetic cause of male spermatogenic impairment could allow more focused clinical management of infertile couples. "While the deletions in the studied Y chromosome region were previously known to interfere with sperm production, we now know that the deletion of DNA sequences leads to the genetic predisposition to infertility depending on the Y-chromosomal lineage," she added.

Credit: 
Estonian Research Council

Gut microbiota in cesarean-born babies catches up

image: Fredrik Bäckhed and Lisa Olsson, Sahlgrenska Academy, University of Gothenburg

Image: 
Photo by Johan Wingborg and Malin Arnesson

Infants born by cesarean section have a relatively meager array of bacteria in the gut. But by the age of three to five years they are broadly in line with their peers. This is shown by a study that also shows that it takes a remarkably long time for the mature intestinal microbiota to get established.

Fredrik Bäckhed, Professor of Molecular Medicine at Sahlgrenska Academy, University of Gothenburg, has been heading this research. The study, conducted in collaboration with Halland County Hospital in Halmstad, is now published in the journal Cell Host & Microbe.

Professor Bäckhed and his group have previously demonstrated that the composition of children's intestinal microbiota is affected by their mode of delivery and diet. In the current study, the researchers examined in detail how the composition of intestinal bacteria in 471 children born at the hospital in Halmstad had developed.

The first fecal sample was collected when each child was a newborn infant. Thereafter, sampling took place at 4 months, 12 months, 3 years and 5 years. The scientists were thus able to follow the successive incorporation of various bacteria into the children's gut microbiota.

At birth, the infant's intestine has already been colonized by bacteria and other microorganisms. During the first few years of life, the richness of species steadily increases. What is now emerging is a considerably more detailed picture of this developmental trajectory.

One key conclusion is that the intestinal microbiota forms an ecosystem that takes a long time to mature. Even at 5 years of age, the system is incomplete. The maturation process can look very different from one child to another, and take varying lengths of time.

At the age of 4 months, the gut microbiota of the cesarean-born infants was less diverse than that of vaginally born infants. However, by the ages of 3 and 5 years, the diversity and composition of their microbiota had caught up and were largely normalized.
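The article does not name the diversity measure used, but a common choice in microbiome studies is the Shannon index, which is higher when many taxa are present at similar abundances. The toy example below, with made-up counts, only illustrates how such an index separates a newborn-like community dominated by a few taxa from a more even, five-year-old-like one.

```python
# Illustration only: Shannon alpha diversity, one common way to quantify the
# gut microbiota "richness" compared across ages and delivery modes. The
# counts below are invented for the example.
import math

def shannon_diversity(counts):
    """counts: read counts (or relative abundances) per bacterial taxon."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

newborn   = [900, 80, 15, 5]                             # a few dominant taxa
five_year = [120, 100, 95, 90, 85, 80, 70, 60, 55, 45]   # more taxa, more even
print(round(shannon_diversity(newborn), 2))    # ~0.39 (low diversity)
print(round(shannon_diversity(five_year), 2))  # ~2.27 (higher diversity)
```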

"Our findings show that the gut microbiota is a dynamic organ, and future studies will have to show whether the early differences can affect the cesarean children later in life," Bäckhed says.

"It's striking that even at the age of 5 years, several of the bacteria that are important components of the intestinal microbiota in adults are missing in the children," he continues.

This indicates that the intestine is a complex and dynamic environment where bacteria create conditions for one another's colonization.

According to the researchers, the current study has broadened our understanding of how humans interact with the trillions of bacteria contained in our bodies, and of how these bacteria become established.

Lisa Olsson, a researcher at the University of Gothenburg and one of the first authors, adds:

"Children learn skills like walking and talking at different rates, and it turns out that the same applies to the maturity of the gut microbiota."

Fredrik Bäckhed again:

"By investigating and understanding how the intestinal microbiota develops in healthy children, we may get a reference point to explore if the microbiota may contribute to disease in future studies," he concludes.

Credit: 
University of Gothenburg

Keeping it fresh: New AI-based strategy can assess the freshness of beef samples

image: Consuming spoiled beef is dangerous, but there are currently no simple and efficient methods to assess beef freshness.

Image: 
Unsplash

Although beef is one of the most consumed foods around the world, eating it when it's past its prime is not only unsavory, but also poses some serious health risks. Unfortunately, available methods to check for beef freshness have various disadvantages that keep them from being useful to the public. For example, chemical analysis or microbial population evaluations take too much time and require the skills of a professional. On the other hand, non-destructive approaches based on near-infrared spectroscopy require expensive and sophisticated equipment. Could artificial intelligence be the key to a more cost-effective way to assess the freshness of beef?

At Gwangju Institute of Science and Technology (GIST), Korea, a team of scientists led by Associate Professors Kyoobin Lee and Jae Gwan Kim has developed a new strategy that combines deep learning with diffuse reflectance spectroscopy (DRS), a relatively inexpensive optical technique. "Unlike other types of spectroscopy, DRS does not require complex calibration; instead, it can be used to quantify part of the molecular composition of a sample using just an affordable and easily configurable spectrometer," explains Lee. The findings of their study are now published in Food Chemistry.

To determine the freshness of beef samples, they relied on DRS measurements to estimate the proportions of different forms of myoglobin in the meat. Myoglobin and its derivatives are the proteins mainly responsible for the color of meat and its changes during the decomposition process. However, manually converting DRS measurements into myoglobin concentrations to finally decide upon the freshness of a sample is not a very accurate strategy--and this is where deep learning comes into play.

Convolutional neural networks (CNN) are widely used artificial intelligence algorithms that can learn from a pre-classified dataset, referred to as 'training set,' and find hidden patterns in the data to classify new inputs. To train the CNN, the researchers gathered data on 78 beef samples during their spoilage process by regularly measuring their pH (acidity) alongside their DRS profiles. After manually classifying the DRS data based on the pH values as 'fresh,' 'normal,' or 'spoiled,' they fed the algorithm the labelled DRS dataset and also fused this information with myoglobin estimations. "By providing both myoglobin and spectral information, our trained deep learning algorithm could correctly classify the freshness of beef samples in a matter of seconds in about 92% of cases," highlights Kim.
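The article does not give the published network architecture. As a rough sketch of the kind of model described above (a 1-D convolutional network over a DRS spectrum, with myoglobin estimates fused in before a three-class decision), here is a minimal PyTorch example; the input length, layer sizes and number of myoglobin features are assumptions, not the authors' model.

```python
# Rough sketch only: a small 1-D CNN over a DRS spectrum, with estimated
# myoglobin fractions concatenated before the classifier. Architecture and
# sizes are illustrative assumptions, not the published model.
import torch
import torch.nn as nn

class FreshnessNet(nn.Module):
    def __init__(self, n_wavelengths=200, n_myoglobin_features=3, n_classes=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # -> (batch, 32, 1)
        )
        self.classifier = nn.Sequential(
            nn.Linear(32 + n_myoglobin_features, 64), nn.ReLU(),
            nn.Linear(64, n_classes),           # 'fresh' / 'normal' / 'spoiled'
        )

    def forward(self, spectrum, myoglobin):
        # spectrum: (batch, n_wavelengths); myoglobin: (batch, n_myoglobin_features)
        x = self.conv(spectrum.unsqueeze(1)).squeeze(-1)   # (batch, 32)
        x = torch.cat([x, myoglobin], dim=1)               # fuse the two inputs
        return self.classifier(x)                          # class logits

model = FreshnessNet()
logits = model(torch.randn(4, 200), torch.rand(4, 3))
print(logits.shape)   # torch.Size([4, 3])
```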

Besides its accuracy, the strengths of this novel strategy lie in its speed, low cost, and non-destructive nature. The team believes it may be possible to develop small, portable spectroscopic devices so that everyone can easily assess the freshness of their beef, even at home. Moreover, similar spectroscopy and CNN-based techniques could also be extended to other products, such as fish or pork. In the future, with any luck, it will be easier and more accessible to identify and avoid questionable meat.

Credit: 
GIST (Gwangju Institute of Science and Technology)

BrainGate: First human use of high-bandwidth wireless brain-computer interface

image: A participant in the BrainGate clinical trial uses wireless transmitters that replace the cables normally used to transmit signals from sensors inside the brain.

Image: 
BrainGate.org

PROVIDENCE, R.I. [Brown University and Providence Veterans Affairs Medical Center] -- Brain-computer interfaces (BCIs) are an emerging assistive technology, enabling people with paralysis to type on computer screens or manipulate robotic prostheses just by thinking about moving their own bodies. For years, investigational BCIs used in clinical trials have required cables to connect the sensing array in the brain to computers that decode the signals and use them to drive external devices.

Now, for the first time, BrainGate clinical trial participants with tetraplegia have demonstrated use of an intracortical wireless BCI with an external wireless transmitter. The system is capable of transmitting brain signals at single-neuron resolution and in full broadband fidelity without physically tethering the user to a decoding system. The traditional cables are replaced by a small transmitter about 2 inches in its largest dimension and weighing a little over 1.5 ounces. The unit sits on top of a user's head and connects to an electrode array within the brain's motor cortex using the same port used by wired systems.

For a study published in IEEE Transactions on Biomedical Engineering, two clinical trial participants with paralysis used the BrainGate system with a wireless transmitter to point, click and type on a standard tablet computer. The study showed that the wireless system transmitted signals with virtually the same fidelity as wired systems, and participants achieved similar point-and-click accuracy and typing speeds.

"We've demonstrated that this wireless system is functionally equivalent to the wired systems that have been the gold standard in BCI performance for years," said John Simeral, an assistant professor of engineering (research) at Brown University, a member of the BrainGate research consortium and the study's lead author. "The signals are recorded and transmitted with appropriately similar fidelity, which means we can use the same decoding algorithms we used with wired equipment. The only difference is that people no longer need to be physically tethered to our equipment, which opens up new possibilities in terms of how the system can be used."

The researchers say the study represents an early but important step toward a major objective in BCI research: a fully implantable intracortical system that aids in restoring independence for people who have lost the ability to move. While wireless devices with lower bandwidth have been reported previously, this is the first device to transmit the full spectrum of signals recorded by an intracortical sensor. That high-bandwidth wireless signal enables clinical research and basic human neuroscience that is much more difficult to perform with wired BCIs.

The new study demonstrated some of those new possibilities. The trial participants -- a 35-year-old man and a 63-year-old man, both paralyzed by spinal cord injuries -- were able to use the system in their homes, as opposed to the lab setting where most BCI research takes place. Unencumbered by cables, the participants were able to use the BCI continuously for up to 24 hours, giving the researchers long-duration data including while participants slept.

"We want to understand how neural signals evolve over time," said Leigh Hochberg, an engineering professor at Brown, a researcher at Brown's Carney Institute for Brain Science and leader of the BrainGate clinical trial. "With this system, we're able to look at brain activity, at home, over long periods in a way that was nearly impossible before. This will help us to design decoding algorithms that provide for the seamless, intuitive, reliable restoration of communication and mobility for people with paralysis."

The device used in the study was first developed at Brown in the lab of Arto Nurmikko, a professor in Brown's School of Engineering. Dubbed the Brown Wireless Device (BWD), it was designed to transmit high-fidelity signals while drawing minimal power. In the current study, two devices used together recorded neural signals at 48 megabits per second from 200 electrodes with a battery life of over 36 hours.
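For a sense of scale, the reported figures work out to a per-electrode data rate as follows. The total rate and electrode count come from the article; the per-sample breakdown at the end is only one plausible way such a broadband rate could be composed, not a published specification.

```python
# Back-of-the-envelope check of the reported figures (values from the article;
# the per-sample breakdown below is an illustrative assumption, not a spec).
total_rate_bps = 48e6      # 48 megabits per second across both devices
electrodes = 200

per_electrode_bps = total_rate_bps / electrodes
print(per_electrode_bps)   # 240000.0 bits per second per electrode

# For example, 20,000 samples per second at 12 bits per sample would give the
# same 240 kbit/s per electrode.
print(20_000 * 12)         # 240000
```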

While the BWD has been used successfully for several years in basic neuroscience research, additional testing and regulatory permission were required prior to using the system in the BrainGate trial. Nurmikko says the step to human use marks a key moment in the development of BCI technology.

"I am privileged to be part of a team pushing the frontiers of brain-machine interfaces for human use," Nurmikko said. "Importantly, the wireless technology described in our paper has helped us to gain crucial insight for the road ahead in pursuit of next generation of neurotechnologies, such as fully implanted high-density wireless electronic interfaces for the brain."

The new study marks another significant advance by researchers with the BrainGate consortium, an interdisciplinary group of researchers from Brown, Stanford and Case Western Reserve universities, as well as the Providence Veterans Affairs Medical Center and Massachusetts General Hospital. In 2012, the team published landmark research in which clinical trial participants were able, for the first time, to operate multidimensional robotic prosthetics using a BCI. That work has been followed by a steady stream of refinements to the system, as well as new clinical breakthroughs that have enabled people to type on computers, use tablet apps and even move their own paralyzed limbs.

"The evolution of intracortical BCIs from requiring a wire cable to instead using a miniature wireless transmitter is a major step toward functional use of fully implanted, high-performance neural interfaces," said study co-author Sharlene Flesher, who was a postdoctoral fellow at Stanford and is now a hardware engineer at Apple. "As the field heads toward reducing transmitted bandwidth while preserving the accuracy of assistive device control, this study may be one of few that captures the full breadth of cortical signals for extended periods of time, including during practical BCI use."

The new wireless technology is already paying dividends in unexpected ways, the researchers say. Because participants are able to use the wireless device in their homes without a technician on hand to maintain the wired connection, the BrainGate team has been able to continue their work during the COVID-19 pandemic.

"In March 2020, it became clear that we would not be able to visit our research participants' homes," said Hochberg, who is also a critical care neurologist at Massachusetts General Hospital and director of the V.A. Rehabilitation Research and Development Center for Neurorestoration and Neurotechnology. "But by training caregivers how to establish the wireless connection, a trial participant was able to use the BCI without members of our team physically being there. So not only were we able to continue our research, this technology allowed us to continue with the full bandwidth and fidelity that we had before."

Simeral noted that, "Multiple companies have wonderfully entered the BCI field, and some have already demonstrated human use of low-bandwidth wireless systems, including some that are fully implanted. In this report, we're excited to have used a high-bandwidth wireless system that advances the scientific and clinical capabilities for future systems."

Credit: 
Brown University

Spin-to-charge conversion achieves 95% overall qubit readout fidelity

image: a) Energy levels used to achieve SCC. b) A schematic diagram of SCC readout. c) The excitation spectrum of the nitrogen-vacancy (NV) center used here at a cryogenic temperature of 8 K. d) Spin-flip process induces the photoluminescence (PL) decay.

Image: 
ZHANG Qi et al.

The team led by Professor DU Jiangfeng and Professor WANG Ya from the Chinese Academy of Sciences (CAS) Key Laboratory of Microscale Magnetic Resonance of the University of Science and Technology of China put forward an innovative spin-to-charge conversion method to achieve high-fidelity readout of qubits, stepping closer towards fault-tolerant quantum computing.

Quantum supremacy over classical computers has already been demonstrated for some specific problems, yet the next milestone, fault-tolerant quantum computing, still requires both the accumulated logic-gate error and the spin readout fidelity to meet the fault-tolerant threshold. DU's team previously addressed the first requirement in the nitrogen-vacancy (NV) center system [Nat. Commun. 6, 8748 (2015)], and this work targeted high-fidelity readout of qubits.

A qubit state, such as a spin state, is fragile: in a conventional readout, even a few photons can flip the state between 0 and 1, producing a readout error. The fidelity of the traditional resonance fluorescence method is strictly limited by this fragility. Because the spin state is difficult to measure directly, the researchers instead mapped it onto a property that is easier to read out: the charge state.

They first compared the optical readout lifetimes of the charge state and the spin state, finding that the charge state is more stable than the spin state by five orders of magnitude. Experimental results showed that the average non-demolition charge readout fidelity reached 99.96%.

Then the team adopted near-infrared (NIR) light (1064 nm) to induce the ionization of the excited spin state, transforming the spin states 0 and 1 into the "electrically neutral" and "negatively charged" charge states, respectively. This process converted the spin readout into a charge readout.

The results indicated that the error of traditional resonance fluorescence method reached 20.1%, while the error of this new method can be suppressed to 4.6%.
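The headline figure of 95% overall readout fidelity follows directly from the reported error rate; the short calculation below simply restates that arithmetic using the percentages given in the article.

```python
# Relating the reported error rates to the headline figure. The percentages
# come from the article; this is only the arithmetic, not a model of the experiment.
error_resonance_fluorescence = 0.201   # traditional readout method
error_spin_to_charge = 0.046           # new SCC-based readout
charge_readout_fidelity = 0.9996       # average non-demolition charge readout

print(1 - error_spin_to_charge)                              # ~0.954, i.e. ~95% overall fidelity
print(error_resonance_fluorescence / error_spin_to_charge)   # ~4.4-fold error reduction
```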

The article was published in Nature Communications.

This new method is compatible with traditional methods and provides a spin readout fidelity exceeding the fault-tolerant threshold in real applications. Because NIR light causes less damage to biological tissues and other samples, the method should also improve the detection efficiency of quantum sensors.

Credit: 
University of Science and Technology of China

Pollen season in Switzerland earlier and more intense due to climate change

Pollen from trees, grasses and weeds causes seasonal allergies for approximately one fifth of the Swiss population every year. A study has now found that due to climate change, the pollen season has shifted substantially over the past 30 years in onset, duration and intensity. "For at least four allergenic species, the tree pollen season now starts earlier than 30 years ago - sometimes even before January," said Marloes Eeftens, Principal Investigator and Group Leader at Swiss TPH. "The duration and intensity of the pollen season have also increased for several species, meaning that allergic people not only suffer for a longer period of time but also react stronger to these higher concentrations."

The researchers analysed pollen data from 1990 to 2020 from all 14 pollen-monitoring stations in Switzerland, studying airborne pollen concentrations from 12 different plant species. "Previous studies have looked at single species or only a few locations, but this is the first time a study brings together comprehensive pollen data from across Switzerland," said Sarah Glick, first author of the study and scientific assistant at Swiss TPH.
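The article does not detail the statistical model, but estimating a shift in season onset over 30 years typically reduces to a trend estimate per station and species. The following sketch, using made-up onset dates, shows one simple way such a trend could be computed; it is not the authors' analysis.

```python
# Illustrative sketch only (not the published statistical model): estimate the
# shift in pollen season onset by regressing onset day-of-year on year, which
# would be repeated per station and per species. The onset values are made up.
import numpy as np

years = np.arange(1990, 2021)
rng = np.random.default_rng(1)
# Hypothetical onset days drifting about half a day earlier per year:
onset_doy = 75 - 0.5 * (years - 1990) + rng.normal(0, 4, size=years.size)

slope, intercept = np.polyfit(years - 1990, onset_doy, 1)
print(f"trend: {slope:.2f} days per year, "
      f"{slope * 30:.1f} days over 1990-2020")
```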

Health impact beyond sneezing

Pollen allergies are the most common chronic disease in many European and North American countries. Today, an estimated 20% of the Swiss population suffers from pollen allergies, a dramatic increase from 100 years ago when less than 1% was affected. This increase is most likely related to changes in our environment, such as improved personal hygiene and the shift from rural to urban living. Besides the typical itching eyes and nose and frequent sneezing, pollen allergies can also cause inflammation in the lungs, have a negative impact on the cardiovascular system, decrease quality of life, and reduce school and work performance.

"Similar to man-made air pollutants like particulate matter or nitrogen dioxide, pollen may actually have numerous negative effects on our body much beyond just nuisance," explained Eeftens. "Given the extent of the population affected and the increasing burden, it is key that we further study the health effects of pollen both acute and more long-term." Swiss TPH just launched the EPOCHAL study to deepen the understanding of the multiple effects of pollen on health including blood pressure, lung function, ability to concentrate, mood and sleep.

"There is little we can do to prevent pollen release from plants, but we hope that the results of the study can help allergic people manage their allergies better," said Eeftens. "A better understanding of allergenic species could also help guide urban planners in identifying suitable plants for parks. For example, we may think twice before planting highly allergenic trees like hazel and birch in densely populated areas."

Credit: 
Swiss Tropical and Public Health Institute

Low risk of researchers passing coronavirus to North American bats

image: USGS wildlife disease specialist Kimberli Miller collects field samples from a white-nose syndrome positive cave in Vermont.

Image: 
USGS

The risk is low that scientists could pass coronavirus to North American bats during winter research, according to a new study led by the U.S. Geological Survey. Scientists find the overall risk to be 1 in 1,000 if no protective measures are taken, and the risk falls lower, to 1 in 3,333 or less, with proper use of personal protective equipment or if scientists test negative for COVID-19 before beginning research.

The research specifically looked at the potential transmission of SARS-CoV-2, which is the type of coronavirus that causes COVID-19, from people to bats. Scientists did not examine potential transmission from bats to people.

"This is a small number, but the consequences of human-to-bat transmission of coronavirus are potentially large," said USGS scientist Evan Grant, an author of the new rapid risk assessment. "The virus has not been identified in North American bats, but if it is introduced, it could lead to illness and mortality, which may imperil long-term bat conservation. It could also represent a source for new exposure and infection in humans."

"These are hard risks for wildlife managers and other decision makers to weigh as they consider whether and how to allow researchers to study bats in their winter colonies," continued Grant.

Bats provide natural services that people value; for example, previous USGS studies found that bats save the U.S. agriculture industry more than $3 billion per year by eating pests that damage crops, reducing the need for pesticides. Yet they are often erroneously portrayed as menacing creatures at Halloween and in horror movies. They are also under duress from white-nose syndrome, a disease that has killed millions of bats in North America.

The origin of SARS-CoV-2 is not confirmed, but studies indicate the virus likely originated from similar viruses found in bats in the Eastern Hemisphere.

The rapid risk assessment conducted by the USGS and U.S. Fish and Wildlife Service focused on the winter season, when some wildlife scientists conduct field work that may require close contact with or direct handling of the animals. This includes research on white-nose syndrome and population studies that support Endangered Species Act decisions.

"If scientists wear protective equipment, particularly properly fitted masks with high filtration efficiency, or test negative for COVID-19 before conducting the research, they greatly reduce the risk of transmission to North American bats," said USGS scientist Michael Runge, another author on the new assessment.

"The current assessment represents the best available information and is useful for informing time-sensitive management decisions, but there are still many unknowns about how susceptible North American bats are to SARS-CoV-2 and how future virus variants may affect transmission," said Grant.

"The potential for SARS-CoV-2 to infect wildlife is a real concern for state and federal wildlife management agencies and reflects the important connections between human health and healthy environments," said Jeremy Coleman, National White-nose Syndrome Coordinator for the USFWS and an author of the paper. "Natural resource managers need information from these kinds of analyses to make science-based decisions that advance conservation efforts while also protecting the health of people, bats, and other wildlife."

Three bat species - free-tailed bats, little brown bats and big brown bats - were included in the analysis. They were chosen because they have physical and behavioral differences and are typical of the kinds of bats studied in winter. Scientists considered different ways the virus could be transmitted between humans and bats, with airborne transmission as the main pathway.

This study estimates transmission risk to at least one bat during a typical winter survey, which includes a team of five scientists spending one hour in a cave colonized by 1,000 bats.
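The underlying assessment uses a detailed transmission model, but the way a per-survey risk compounds across repeated surveys can be illustrated with elementary probability. The number of surveys in the example below is arbitrary; only the per-survey risk figures come from the article.

```python
# Illustrative arithmetic only; the USGS assessment itself uses a detailed
# transmission model. If one winter survey carries a given chance of infecting
# at least one bat, the chance of at least one such event across n independent
# surveys is 1 - (1 - p) ** n. The survey count below is arbitrary.
def risk_over_surveys(per_survey_risk, n_surveys):
    return 1 - (1 - per_survey_risk) ** n_surveys

print(round(risk_over_surveys(1 / 1000, 100), 3))  # ~0.095 with no precautions
print(round(risk_over_surveys(1 / 3333, 100), 3))  # ~0.03 with PPE or a negative test
```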

This research builds on a USGS-led study published last year that examined the likelihood of researchers transmitting SARS-CoV-2 to bats during summer research. Since that study, a substantial amount of new data and knowledge on the virus has been acquired and applied. Winter and summer research can involve different settings and activities.

Credit: 
U.S. Geological Survey

Towards a better understanding of natural hazard risk and economic losses in Europe

The report "Science for Disaster Risk Management 2020: acting today, protecting tomorrow", the second in its series, has been produced with the collaboration of more than 300 experts in disaster risk management. The contributors come from different disciplines and sectors and provide the reader with accurate and updated information on the consequences that disasters have for key assets of society (population, economic sectors, critical infrastructure, environment and cultural heritage) and on how these can be managed. Finally, the report offers a set of recommendations addressed to four groups that can actively work to reduce disaster risk: policymakers, practitioners, scientists and citizens.

The report aims to move from identifying problems to identifying solutions and approaches for action. The launch offers an opportunity to engage with different stakeholders, to take stock of what the community has learned so far and to explore together how the proposed recommendations can be put into practice.

Jaroslav Mysiak, director of the CMCC research division 'Risk assessment and adaptation strategies', contributed to the report as a coordinating lead author (CLA).

In his contribution (Chapter 3, 'Assets at risk and potential impacts', Section 3.3, 'Economic sectors'), he and his co-authors reviewed the state of the art in knowledge, methodologies and practice for assessing damage and losses to residential building stock, agriculture, and industrial and energy assets.

Natural hazards are a major threat to sustainable development, economic stability and growth, territorial cohesion, and community resilience. According to the estimates of the European Environment Agency, the economic damage due to natural hazards alone in the EU amounted to more than EUR 557 billion in 1980-2017, mostly triggered by extreme weather and climate-related events whose frequencies and/or intensities are expected to increase as a result of human-induced climate change.

"Over the past few decades, disaster risk assessment has improved thanks to the advancements in high performance computing, high-resolution topographic and other spatial data, a new generation of large-scale hazard and disaster loss/impact models, and high-resolution exposure datasets", Jaroslav Mysiak explains. "An accurate spatial representation of exposure features such as residential and industrial facilities and assets, infrastructure, population density and gross domestic product makes it possible to improve the estimates and spatial distribution of disaster impacts. Advanced quality and accessibility of Earth observation products, including from the EU's Copernicus programme, have led the way to coherent exposure and vulnerability data at continental and global scales."

The authors conclude the chapter by highlighting the lessons learnt from the COVID-19 pandemic. "What we have lived through during the lockdown, and still will", they write, "is a mild foretaste of the systemic shocks that climate and global environmental changes may and will cause in the future. Future improvements of risk assessment need to be focused on a better understanding of indirect and spillover economic losses generated by slow-onset hazards, compound risks and cascading risks, as well as losses caused by disruption of social networks, economic flows and ecosystem services.

"The EU Green Deal and the unprecedented post-COVID-19 recovery package will stimulate immense investments in green technologies and innovation, and lead the way to sustainable development and climate neutrality. Only with sound, evidence-based and multi-hazard risk assessments can we reconcile short-term 'building back better' recovery and medium- to long-term climate-resilient development."

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change