Study reveals neurons responsible for rapidly stopping behaviors, actions

[Image: Ueli Rutishauser, PhD. Credit: Cedars-Sinai]

LOS ANGELES (Feb. 3, 2021) -- For the first time in humans, investigators at Cedars-Sinai have identified the neurons responsible for canceling planned behaviors or actions--a highly adaptive skill that, when lost, can lead to unwanted movements.

Known as "stop signal neurons," these cells are critical in enabling someone to stop or abort an action they have already set in motion.

"We have all had the experience of sitting at a traffic stop and starting to press the gas pedal but then realizing that the light is still red and quickly pressing the brake again," said Ueli Rutishauser, PhD, professor of Neurosurgery, Neurology and Biomedical Sciences at Cedars-Sinai and senior author of the study published online in the peer-reviewed journal Neuron. "This first-in-human study of its kind identifies the underlying brain process for stopping actions, which remains poorly understood."

The findings, Rutishauser said, reveal that such neurons exist in an area of the brain called the subthalamic nucleus, which is a routine target for treating Parkinson's disease with deep brain stimulation.

Patients with Parkinson's disease, a motor system disorder affecting nearly 1 million people in the U.S., suffer simultaneously from both the inability to move and the inability to control excessive movements. This paradoxical mix of symptoms has long been attributed to disordered function in regions of the brain that regulate the initiation and halting of movements. How this process occurs and what regions of the brain are responsible have remained elusive to define despite years of intensive research.

Now, a clearer understanding has emerged.

Jim Gnadt, PhD, program director for the NIH Brain Research Through Advancing Innovative Neurotechnologies® (BRAIN) Initiative, which funded this project, explained that this study helps us understand how the human brain is wired to accomplish rapid movements.

"It is equally important for motor systems designed for quick, fast movements to have a 'stop control' available at a moment's notice--like a cognitive change in plan--and also to keep the body still as one begins to think about moving but has yet to do so."

To make their discovery, the Cedars-Sinai research team studied patients with Parkinson's disease who were undergoing brain surgery to implant a deep brain stimulator--a relatively common procedure to treat the condition. Electrodes were lowered into the basal ganglia, the part of the brain responsible for motor control, to precisely target the device while the patients were awake.

The researchers discovered that neurons in one part of the basal ganglia region--the subthalamic nucleus--indicated the need to "stop" an already initiated action. These neurons responded very quickly after the appearance of the stop signal.

"This discovery provides the ability to more accurately target deep brain stimulation electrodes, and in return, target motor function and avoid stop signal neurons," said Adam Mamelak, MD, professor of Neurosurgery and co-first author of the study.

Mamelak notes that many patients with Parkinson's disease have issues with impulsiveness and the inability to stop inappropriate actions. As a next step, Mamelak and the research team will build on this discovery to investigate whether these neurons also play a role in these more cognitive forms of stopping.

"There is strong reason to believe that they do, based on significant literature linking inability to stop to impulsiveness," said Mamelak. "This discovery will enable investigating whether the neurons we discovered are the common mechanisms that link the two phenomena."

Clayton Mosher, PhD, co-first author of the study and a project scientist in the Rutishauser lab, says that while it has long been hypothesized that such neurons exist in this brain area, they had never been observed "in action" in humans.

"Stop neurons responded very quickly following the onset of the stop cue on the screen, a key requirement to be able to suppress an impending action," said Mosher. "Our result is the first single-neuron demonstration in humans of signals that are likely carried by this particular pathway."

Credit: 
Cedars-Sinai Medical Center

First-in-human clinical trial confirms HIV vaccine approach by IAVI and Scripps Research

[Video: William Schief, Ph.D., discusses the trial results and the vaccine design. Credit: Scripps Research]

NEW YORK and LA JOLLA, CA--A phase 1 clinical trial testing a novel vaccine approach to prevent HIV has produced promising results, IAVI and Scripps Research announced today. The vaccine showed success in stimulating production of rare immune cells needed to start the process of generating antibodies against the fast-mutating virus; the targeted response was detected in 97 percent of participants who received the vaccine.

"This study demonstrates proof of principle for a new vaccine concept for HIV, a concept that could be applied to other pathogens, as well," says William Schief, Ph.D., a professor and immunologist at Scripps Research and executive director of vaccine design at IAVI's Neutralizing Antibody Center, whose laboratory developed the vaccine. "With our many collaborators on the study team, we showed that vaccines can be designed to stimulate rare immune cells with specific properties, and this targeted stimulation can be very efficient in humans. We believe this approach will be key to making an HIV vaccine and possibly important for making vaccines against other pathogens."

Schief presented the results on behalf of the study team at the International AIDS Society HIV Research for Prevention (HIVR4P) virtual conference today.

The study sets the stage for additional clinical trials that will seek to refine and extend the approach--with the long-term goal of creating a safe and effective HIV vaccine. As a next step, IAVI and Scripps Research are partnering with the biotechnology company Moderna to develop and test an mRNA-based vaccine that harnesses the approach to produce the same beneficial immune cells. Using mRNA technology could significantly accelerate the pace of HIV vaccine development.

HIV, which affects more than 38 million people globally, is known to be among the most difficult viruses to target with a vaccine, in large part because it constantly evolves into different strains to evade the immune system.

"These exciting findings emerge from remarkably creative, innovative science and are a testament to the research team's talent, dedication and collaborative spirit, and to the generosity of the trial participants," says Mark Feinberg, M.D., Ph.D., president and CEO of IAVI. "Given the urgent need for an HIV vaccine to rein in the global epidemic, we think these results will have broad implications for HIV vaccine researchers as they decide which scientific directions to pursue. The collaboration among individuals and institutions that made this important and exceptionally complex clinical trial so successful will be tremendously enabling to accelerate future HIV vaccine research."

One in a million

For decades now, HIV researchers have pursued the holy grail of stimulating the immune system to create rare but powerful antibodies that can neutralize diverse strains of HIV. Known as "broadly neutralizing antibodies," or bnAbs, these specialized blood proteins could attach to HIV spikes, proteins on the virion surface that allow the virus to enter human cells, and disable them via important yet difficult-to-access regions that don't vary much from strain to strain.

"We and others postulated many years ago that in order to induce bnAbs, you must start the process by triggering the right B cells--cells that have special properties giving them potential to develop into bnAb-secreting cells," Schief says. "In this trial, the targeted cells were only about one in a million of all naïve B cells. To get the right antibody response, we first need to prime the right B cells. The data from this trial affirms the ability of the vaccine immunogen to do this."

The priming step would be the first stage of a multi-step vaccine regimen aimed at eliciting many different types of bnAbs, he says.

Promise beyond HIV

The strategy of targeting naïve B cells with specific properties is called "germline-targeting," as these young B cells display antibodies encoded by unmutated, or "germline" genes. Researchers believe the approach could also be applied to vaccines for other challenging pathogens such as influenza, dengue, Zika, hepatitis C viruses and malaria.

"This is a tremendous achievement for vaccine science as a whole," says Dennis Burton, Ph.D., professor and chair of the Department of Immunology and Microbiology at Scripps Research, scientific director of the IAVI Neutralizing Antibody Center and director of the NIH Consortium for HIV/AIDS Vaccine Development. "This clinical trial has shown that we can drive immune responses in predictable ways to make new and better vaccines, and not just for HIV. We believe this type of vaccine engineering can be applied more broadly, bringing about a new day in vaccinology."

The clinical trial, IAVI G001, was sponsored by IAVI and took place at two sites: George Washington University (GWU) in Washington, D.C., and the Fred Hutchinson Cancer Research Center (Fred Hutch) in Seattle, enrolling 48 healthy adult volunteers. Participants received either a placebo or two doses of the vaccine compound, eOD-GT8 60mer, along with an adjuvant developed by the pharmaceutical company GSK. Julie McElrath, M.D., Ph.D., senior vice president and director of Fred Hutch's Vaccine and Infectious Disease Division, and David Diemert, M.D., professor of medicine at GWU School of Medicine and Health Sciences, were lead investigators at the trial sites.

"This is a landmark study in the HIV vaccine field, demonstrating success in the first step of a pathway to induce broad neutralizing antibodies against HIV-1," McElrath says. "The novel design of the immunogen, the clinical trial and the molecular B cell analyses provide a roadmap to accelerate further progress toward an HIV vaccine."

Wide network of collaborators

Funding from the Bill & Melinda Gates Foundation, through the Collaboration for AIDS Vaccine Discovery, supported a wide network of partners conducting complex analyses.

The critical assay used to judge the vaccine candidate, epitope-specific single B cell sorting and B cell receptor (BCR) sequencing, was developed and carried out by teams at the NIH Vaccine Research Center, led by Adrian McDermott, Ph.D. (chief of the Vaccine Immunology Program), Richard Koup, M.D. (deputy director and chief of the Immunology Laboratory and Immunology Section), and research scientist David Leggat, Ph.D.; and at Fred Hutch, led by McElrath and senior staff scientist Kristen Cohen, Ph.D. Study design and data analysis were led by staff scientists Allan deCamp, Ph.D., Greg Finak, Ph.D., and Jimmy Fulp at the Vaccine Immunology Statistical Center at Fred Hutch, with assistance from the Schief lab.

Credit: 
Scripps Research Institute

Study examines role of biomarkers to evaluate kidney injury in cancer patients

ROCHESTER, Minn. -- A study by Mayo Clinic researchers published in Kidney International Reports finds that immune checkpoint inhibitors may have negative consequences in some patients, including acute kidney inflammation, known as interstitial nephritis. Immune checkpoint inhibitors are used to treat cancer by stimulating the immune system to attack cancerous cells.

"Immune checkpoint inhibitors have improved the prognosis for patients with a wide range of malignancies including melanoma, non-small cell lung cancer and renal cancer," says Sandra Herrmann, M.D., a Mayo Clinic nephrologist and the study's senior author. "In some patients, this enhanced immune response may target kidney tissue, leading to acute kidney inflammation known as interstitial nephritis."

Dr. Herrmann says a kidney biopsy is the gold standard to diagnose this condition. However, a kidney biopsy is an invasive procedure that some patients may not be able to undergo because of the risk of bleeding.

"Our study provides important, first-time data for clinicians and patients on the use of biomarkers to routinely evaluate the cause of acute kidney injury in patients undergoing immune checkpoint inhibitor therapy for cancer," says Dr. Herrmann. "These biomarkers could assist with helping doctors discriminate treatment associated kidney injury from other causes and may also help aid clinical decision-making related to whether immune checkpoint inhibitor therapy should be continued if the injury found is not related to immunotherapy."

For this study, researchers followed patients who were seen at Mayo Clinic for acute kidney injury from 2014 to 2020. They found that blood markers of kidney function and inflammation (serum creatinine and C-reactive protein, respectively), as well as a urine marker (the urine retinol binding protein-to-urine creatinine ratio), were significantly higher in patients with acute kidney injury due to interstitial nephritis associated with immune checkpoint inhibitor therapy than in other patients treated with immunotherapy whose acute kidney injury was due to other causes, such as acute tubular necrosis associated with other cancer therapies.

"Being able to tell if acute kidney injury in a cancer patient is due to a certain type of cancer therapy without the need for an invasive test is extremely important," says Dr. Herrmann. "It simplifies the work-up for patients, makes the approach safer and quicker, and helps physicians better guide patients through their care."

Dr. Herrmann says that being able to attribute acute kidney failure to a cause other than immune checkpoint inhibitor therapy allows patients to continue with their cancer immunotherapy, which can be lifesaving. In addition, she says acute kidney failure has profound prognostic implications for patients and needs to be properly treated, so promptly identifying the cause is important.

Credit: 
Mayo Clinic

New clues to how muscle wasting occurs in people with cancer

UNIVERSITY PARK, Pa. -- Muscle wasting, or the loss of muscle tissue, is a common problem for people with cancer, but the precise mechanisms have long eluded doctors and scientists. Now, a new study led by Penn State researchers gives new clues to how muscle wasting happens on a cellular level.

Using a mouse model of ovarian cancer, the researchers found that cancer progression led to fewer skeletal muscle ribosomes -- particles in the cell that make proteins. Because muscle mass is mainly determined by protein synthesis, having fewer ribosomes likely explains why muscles waste away in cancer.

Gustavo Nader, associate professor of kinesiology and physiology at Penn State, said the findings suggest a mechanism for muscle wasting that could be relevant not just for people with cancer, but other conditions as well.

"Loss of muscle mass is also associated with the aging process, malnutrition, and people with COVID-19 and HIV-AIDS, among others," Nader said. "Not only is muscle wasting a common problem, but there's currently no cure or treatment, either. But now that we understand the mechanism better, we can move forward with trying to find ways to reverse that mechanism."

According to the researchers, significant muscle wasting -- or "cachexia" -- occurs in about 80% of people with cancer and is responsible for about 30% of cancer deaths. It's also associated with a reduced quality of life, problems tolerating chemotherapy and lower survival rates. According to Nader, "cachexia is often the killer, not the tumor."

Nader said that because there is no current cure or treatment for cachexia, it is vital for scientists to understand precisely how and why it happens. But while there has been a lot of research trying to understand and prevent the mechanism that causes muscles to waste, Nader and his team wanted to tackle the problem from a new angle.

"Most of the focus has been on protein degradation, where people have tried to block proteins from being chopped up, or degraded, in order to prevent the loss of muscle mass," Nader said. "But many of those efforts have failed, and one reason may be because people forgot about the protein synthesis aspect of it, which is the process of creating new proteins. That's what we tackled in this study."

For the study, the team used a pre-clinical mouse model of ovarian cancer with significant muscle loss. By using mice, the researchers were able to study the progression of cancer cachexia over time, which would be difficult to do with human patients.

After analyzing their results, the researchers found that mice with tumors experienced a rapid loss of muscle mass and a dramatic reduction in the ability to synthesize new proteins, which can be explained by a drop in the amount of ribosomes in their muscles.

"So we cracked the first layer of this problem, because we showed that there are less ribosomes and less protein synthesis," Nader said.

Then, the researchers set out to explain why the number of ribosomes was decreased. After examining the ribosomal genes, they found that once a tumor was present, expression of the ribosomal genes declined until it reached a level at which the muscles could no longer produce enough ribosomes to sustain the protein synthesis needed to prevent muscle loss.

Nader said that while more research is needed, he hopes the findings -- recently published in the Federation of American Societies for Experimental Biology Journal -- can eventually contribute to preventing people from losing muscle mass and function.

"If we can better understand how muscles make ribosomes, we will be able to find new treatments to both stimulate muscle growth and prevent muscle wasting," Nader said. "This is especially important considering that current approaches to block tumor progression target the ribosomal production machinery, and because these drugs are given systemically, they will likely affect all tissues in the body and will also impair muscle building."

Credit: 
Penn State

How does pain experienced in everyday life impact memory?

How do the normal pains of everyday life, such as headaches and backaches, influence our ability to think? Recent studies suggest that healthy individuals in pain also show deficits in working memory, or the cognitive process of holding and manipulating information over short periods of time. Prior research suggests that pain-related impairments in working memory depend on an individual's level of emotional distress. Yet the specific brain and psychological factors underlying the role of emotional distress in contributing to this relationship are not well understood.

A new study, titled "Modeling neural and self-reported factors of affective distress in the relationship between pain and working memory in healthy individuals," and published in the journal Neuropsychologia sought to address this gap in the literature. The study was authored by recent University of Miami psychology Ph.D. graduates Steven Anderson, Joanna Witkin, and Taylor Bolt and their advisors Elizabeth Losin, director of the Social and Cultural Neuroscience Laboratory at the University of Miami; Maria Llabre, professor and associate chair of the Department of Psychology; and Claire Ashton-James, senior lecturer at the University of Sydney.

The study used publicly available brain imaging and self-report data from the Human Connectome Project (HCP), a large-scale project sponsored by the National Institutes of Health (NIH) which aims to construct a map of the complete structural and functional connections in the healthy human brain. Brain imaging and self-report data from 416 HCP participants were analyzed using structural equation modeling (SEM), a statistical technique for modeling complex relationships between multiple variables. In the 228 participants who reported experiencing some level of pain in the 7 days prior to the study, the authors found that higher pain intensity was directly associated with worse performance on a commonly used test of working memory, the n-back task. In the n-back task, participants are shown a series of letters and asked whether the letter they are seeing appeared some number of screens previously. The more screens back in the sequence participants are asked to recall, the more working memory is required.
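To make the task concrete, here is a minimal sketch, in Python, of how n-back targets and per-screen yes/no responses can be scored; the letter stream and function names are illustrative assumptions, not the HCP task's actual stimuli or software.

```python
# Minimal n-back scoring sketch; the stream and names are illustrative,
# not the HCP protocol's actual stimuli or code.
def nback_targets(stream, n):
    """Flag each position whose letter matches the one n screens back."""
    return [i >= n and stream[i] == stream[i - n] for i in range(len(stream))]

def score(stream, n, responses):
    """Count hits and false alarms for per-screen yes/no responses."""
    targets = nback_targets(stream, n)
    hits = sum(1 for r, t in zip(responses, targets) if r and t)
    false_alarms = sum(1 for r, t in zip(responses, targets) if r and not t)
    return hits, false_alarms

stream = list("ABACBCBC")
print(nback_targets(stream, 2))
# -> [False, False, True, False, False, True, True, True]
```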

In addition, the authors found that higher pain intensity was indirectly associated with worse working memory performance through increased activity in a particular region in the center of the frontal cortex during the n-back task, the ventromedial prefrontal cortex (vmPFC). The vmPFC is a brain region involved in pain, affective distress, and cognition. Interestingly, the relationship between everyday pain and vmPFC brain activity in this study is similar to prior findings in patients with chronic pain.

"We found that healthy participants with even low levels of reported pain had different levels of activity in the vmPFC during the n-back task compared to healthy participants who didn't report pain. Surprisingly, this pattern of activity was more similar to patients with chronic pain than healthy patients who are exposed to pain manipulations in a laboratory," said Witkin.

In contrast, the authors found that certain aspects of emotional distress reported by participants, such as anger, fear, and perceived stress, were not associated with working memory performance.

"Studies looking at the relationship between pain and cognition have typically focused on patients with chronic pain or research participants given experimentally-induced pain," noted Anderson. "Even though pain is a common experience for many people, we know surprisingly little about how the everyday experience of pain impacts cognition."

Using the publicly available HCP dataset allowed the researchers to include data from a much larger group of participants than is typical in brain imaging studies, given the high cost of brain scans. This large sample enabled the authors to use structural equation modeling to capture the complex relationships between variables that may help explain how pain decreases working memory. The authors note that their findings have potential implications in both clinical and non-clinical settings.

"This study highlights the real impact that pain can have on our ability to think even in healthy people, and points how this may come about in the brain," said Losin.

Credit: 
University of Miami

Urban agriculture in Chicago does not allow consumers to rely solely on local food

Environmentally conscious consumers try to "buy local" when food shopping. Now, a study of food raised around Chicago has shown that buying local can't provide all necessary nutrients for area residents, though it could fulfill their needs if some nutrients were supplied as supplements. The researchers report in ACS' Environmental Science & Technology that urban agriculture made little difference in reducing overall land area, and thus distance, required to supply all nutritional needs.

As the U.S. population continues to flow to urban regions, consumers are moving farther from farms and croplands. This limits nutrient recycling and drives up emissions associated with transporting food. In addition, urban centers can develop "food deserts" where residents can't purchase nutritious food close to home. One potential solution is urban agriculture, which repurposes space within cities -- such as vacant lots and rooftops -- to grow crops. Christine Costello and colleagues wanted to know the impact of urban agriculture on enabling people living within a range of distances from Chicago's center to eat local food, yet meet their complete nutritional needs.

The team considered 28 nutrients, the amount of available land, a variety of crops and livestock, a range of crop yields and both conventional and urban agriculture in the analysis. They drew circles on a map around Chicago with increasing radii, up to 400 miles, the maximum distance the U.S. government deems "local." Within that perimeter around Chicago, no mix of locally raised crops and livestock could satisfy all nutritional needs of the population. However, if vitamins D and B12 could be provided as supplements, a radius as small as 65 miles was sufficient. Urban agriculture could provide an important nutritional benefit by increasing diet diversity and availability of fresh fruits and vegetables. But it would only slightly reduce the radius (and land area) needed for supplying nearly complete nutrition locally, the researchers say.
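The heart of such an analysis can be framed as a feasibility question: within a given radius, is there any allocation of the available land across foods whose combined output meets every nutrient requirement? The Python sketch below illustrates the idea with a linear program; the three foods, yields, and demands are invented placeholders, not the study's 28-nutrient data.

```python
# Toy land-allocation feasibility check, loosely in the spirit of the
# study's radius analysis; all numbers below are invented placeholders.
import numpy as np
from scipy.optimize import linprog

# rows: foods (grain, vegetables, livestock); cols: nutrients per acre
nutrients_per_acre = np.array([
    [50.0,  5.0, 0.0],
    [10.0, 40.0, 0.0],
    [30.0,  0.0, 8.0],
])
population_need = np.array([4000.0, 2000.0, 300.0])  # total demand per nutrient

def feasible(total_acres):
    """True if some land allocation meets all nutrient demands."""
    n_foods = nutrients_per_acre.shape[0]
    res = linprog(
        c=np.zeros(n_foods),                     # feasibility only, no objective
        A_ub=np.vstack([-nutrients_per_acre.T,   # nutrient output >= demand
                        np.ones((1, n_foods))]), # total land <= available
        b_ub=np.concatenate([-population_need, [total_acres]]),
        bounds=[(0, None)] * n_foods,
    )
    return res.success

for acres in (50, 100, 200):  # stand-in for land within growing radii
    print(acres, "acres:", feasible(acres))
```

Growing the radius grows the available land, so the analysis amounts to finding the smallest radius at which the check first succeeds.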

Credit: 
American Chemical Society

Standard water treatment technique removes and inactivates an enveloped virus

[Image: An enveloped virus, Φ6 (left), clumps together and becomes damaged by conventional iron coagulation (right). Scale bar, 100 nm. Credit: Adapted from Environmental Science & Technology 2021, DOI: 10.1021/acs.est.0c07697]

Enveloped viruses have been detected in raw sewage and sludge, but scientists still don't fully understand the fate and infectivity of these viruses during water purification at treatment plants. Now, researchers reporting in ACS' Environmental Science & Technology have discovered that a standard water treatment technique, called iron (III) coagulation, and its electrically driven counterpart, iron (0) electrocoagulation, can efficiently remove and inactivate a model enveloped virus.

Enveloped viruses have an outer coating of lipids and proteins that helps protect their genetic material. Typically, disrupting this coat inactivates the virus. Until now, most studies have investigated only disinfection by chlorine or ultraviolet light as a means to control enveloped viruses in wastewater. However, particles suspended in the water can sometimes shield viruses from disinfectants. Shankar Chellam and colleagues wondered whether a different method called coagulation with iron (III), which is already widely applied during water treatment, can remove and inactivate enveloped viruses. They also wanted to study a related technique, iron (0) electrocoagulation, that shows promise for small-scale water treatment. As a model enveloped virus, the researchers chose an RNA virus, called Φ6, that infects bacteria.

The researchers treated a solution containing Φ6 with either iron (III) or with iron (0) electrocoagulation, both of which formed iron precipitates. The hydrophobic viral envelopes stuck to the precipitates, allowing Φ6 to be easily removed as the solids settled. The conventional coagulation reduced the amount of active virus in the water by more than 100,000 times in 2.6 minutes, whereas electrocoagulation was slower but about 10 times more effective. The researchers observed that the structures of most of the viral particles in the iron precipitates were damaged, which rendered them unable to infect their host bacteria. Electrocoagulation also oxidized lipids in the viral envelope, further inactivating Φ6. These results suggest that water treatment plants are already well equipped to remove enveloped viruses from drinking water by iron (III) coagulation, and viral levels are likely even further reduced by the additional treatment steps of filtration and disinfection, the researchers say.
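For reference, water treatment engineers usually state such removal as a log reduction value (LRV); a quick back-of-the-envelope reading of the figures above, under that convention:

$$\mathrm{LRV} = \log_{10}\!\left(\frac{N_0}{N_t}\right), \qquad \log_{10}(100{,}000) = 5$$

So the more-than-100,000-fold drop corresponds to an LRV greater than 5 within 2.6 minutes, and, if "about 10 times more effective" is read as a further tenfold reduction, electrocoagulation approaches an LRV of 6.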

Credit: 
American Chemical Society

Study suggests environmental factors had a role in the evolution of human tolerance

Environmental pressures may have led humans to become more tolerant and friendly towards each other as the need to share food and raw materials became mutually beneficial, a new study suggests.

This behaviour was not an inevitable natural progression, but subject to ecological pressures, the University of York study concludes.

Humans have a remarkable capacity to care about people well outside their own kin or local group. Whilst most other animals tend to be defensive towards those in other groups, our natural tolerance allows us to collaborate today on a global scale, as seen with trade or international relief efforts to provide aid for natural disasters.

Using computer simulations of many thousands of individuals gathering resources for their group and interacting with individuals from other groups, the research team attempted to establish what key evolutionary pressures may have prompted human intergroup tolerance.
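For illustration, a toy version of such a simulation might look like the following Python sketch; the group counts, harvest distribution, sharing rule, and extinction probability are invented parameters, not the study's actual model.

```python
# Toy intergroup-sharing simulation, a sketch of the kind of model described
# above; every parameter here is an invented placeholder, not the study's.
import random

def simulate(n_groups=20, years=200, harshness=0.6, share=True, seed=1):
    """Return how many groups survive repeated years of variable harvests."""
    rng = random.Random(seed)
    alive = [True] * n_groups
    for _ in range(years):
        # each group's harvest varies; harsher worlds yield less on average
        harvest = [rng.uniform(0.0, 2.0 - harshness) if a else 0.0 for a in alive]
        if share:
            # groups with a surplus (above 1.0 unit of need) pool it for
            # groups in deficit across "borders"
            pool = sum(max(h - 1.0, 0.0) for h in harvest)
            for i, h in enumerate(harvest):
                if alive[i] and h < 1.0 and pool > 0.0:
                    give = min(1.0 - h, pool)
                    harvest[i] += give
                    pool -= give
        # a group that cannot meet its needs risks local extinction
        for i, h in enumerate(harvest):
            if alive[i] and h < 1.0 and rng.random() < 0.05:
                alive[i] = False
    return sum(alive)

print("with sharing:   ", simulate(share=True), "groups survive")
print("without sharing:", simulate(share=False), "groups survive")
```

Comparing survivor counts with sharing on and off, across different harshness settings, is the kind of contrast the study draws.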

The study suggests this may have begun around the time humans started to leave Africa, during a period of increasingly harsh and variable environments.

The study was concerned with the period 300,000 to 30,000 years ago, when archaeological evidence indicates greater mobility and more frequent interactions between different groups. In particular, this is a time in which raw materials moved over much longer distances and between groups.

The researchers found that populations that shared resources across borders were more likely to be successful and to survive harsh environments, where extinctions occur, than populations that did not share.

However, in resource-rich environments sharing was less advantageous, and in extremely harsh environments populations were too low for sharing to be feasible.

Penny Spikins, Professor in the Archaeology of Human Origins at the University of York, said: "That our study demonstrates the importance of tolerance to human success is perhaps surprising, especially when we often think of prehistory as a time of competition, however we have seen that in situations where people with surplus share across borders with those in need everyone benefits in the long term."

Dr Jennifer C. French, lecturer in Palaeolithic Archaeology at the University of Liverpool, added: "Our study's findings also have important implications for wider debates about the increases in examples of innovation and greater rates of cultural evolution that occurred during this period.

"They help to explain previously enigmatic changes in the archaeological record between 300,000 and 30,000 years ago."

Credit: 
University of York

Alcohol, calories, and obesity: Could labelling make a difference?

Mandatory calorie labelling of alcoholic drinks could possibly help address both alcohol consumption and obesity. An analysis published in Obesity Reviews summarises the results of studies that have examined consumer knowledge of the calorie content of alcoholic drinks, public support for labelling of calorie content on such drinks, and the effect of labelling on consumption.

In the analysis of 18 relevant studies, there was moderate evidence that people were unaware of the calorie content of alcoholic drinks and that they supported labelling. Studies found no evidence that labelling affected consumption levels, but most studies were of low quality and were not conducted in real-world settings.

"The UK government is considering whether calorie labelling of alcoholic drinks can help address obesity," said lead author Eric Robinson, PhD, of the University of Liverpool, in the UK. "Although it's unclear if calorie labels will have a meaningful impact on what people choose to drink, making sure drinks have to be clearly labelled is a step in the right direction and may also encourage the alcohol industry to cut calories in drinks."

Credit: 
Wiley

Researchers assess cognitive impairment in patients with breast cancer

A recent analysis of published studies estimates that one-quarter of adults with breast cancer have cognitive impairment before starting therapy. The analysis, which is published in Psycho-Oncology, also found that many patients' cognitive function declines after receiving chemotherapy, endocrine therapy, and/or hormone therapy for breast cancer.

"Our results suggest that cancer-related and personal factors may make a significant contribution to cognitive functioning," said lead author Aicha Dijkshoorn, of the University Medical Center Utrecht, in the Netherlands.

The authors noted that the findings from different studies were quite diverse, and some even reported cognitive improvements in patients after treatment. They stressed the importance of evaluating and addressing cognitive function, ideally over time, in patients with breast cancer.

World Cancer Day is February 4th.

Credit: 
Wiley

The pandemic lockdown's psychological impact on pregnant women

During the lockdown in the first wave of the COVID-19 pandemic in Spain, pregnant women experienced elevated symptoms of depression and anxiety. The finding comes from a study published in Acta Obstetricia et Gynecologica Scandinavica, which also revealed that women with higher body mass index and lower social support were most affected.

A total of 204 women agreed to participate in the study, which involved completing questionnaires related to depression, anxiety, and social support.

The study's results "highlight the need to improve mental health care during pregnancy, especially in exceptional circumstances such as the global pandemic situation or lockdown, as these can cause added stress and increased anxiety and depression symptoms, resulting in undesirable consequences on pregnancy in the future newborn," the authors wrote.

Credit: 
Wiley

Experiences of post-traumatic stress disorder (PTSD) linked to nutritional health

A study of factors associated with post-traumatic stress disorder (PTSD) has led to a number of novel findings linking nutrition to experiences of PTSD. Notable among them is the discovery that Canadians between the ages of 45 and 85 were less likely to exhibit PTSD if they consumed an average of two to three fiber sources daily.

"It is possible that optimal levels of dietary fiber have some type of mental health-related protective effect," says Karen Davison, Director of the Nutrition Informatics Research Group and Health Science Program Faculty Member at Kwantlen Polytechnic University. "This may be due to the communication network that connects the gut and brain via short chain fatty acids (SCFAs), which are metabolic byproducts of bacterial fermentation made by microbes in the human gut."

"Produced from fermenting fiber in the colon, SCFA molecules can communicate with cells and may affect brain function," Davison says.

Other diet-related factors found to be associated with an increased likelihood of PTSD included daily consumption of pastries, pulses and nuts, or chocolate.

"This finding that increased intakes of pulses and nuts were associated with increased odds of PTSD was unexpected," states Christina Hyland, a doctoral student at the University of Toronto's FIFSW. "However, the measures for these in the Canadian Longitudinal Study on Aging dataset included sources such as peanut butter, and may have also included less healthy variations such as salted or candied nuts."

The study team analyzed data from the baseline Canadian Longitudinal Study on Aging, which included 27,211 participants aged 45-85 years, of whom 1,323 had PTSD.

Other factors associated with PTSD

The study also found relationships between PTSD and factors such as poverty, gender, age, immigration history, ethnicity, marital status and physical health.

Poverty was strongly associated with PTSD, with one in every seven respondents whose household income was under $20,000 per year experiencing the disorder.

"Unfortunately, we do not know whether PTSD symptoms undermined an individual's ability to work, which resulted in poverty or whether the stress associated with poverty exacerbated PTSD symptoms in respondents," says senior author, Esme Fuller-Thomson, director of the Institute for Life Course & Aging and professor at the University of Toronto's Factor-Inwentash Faculty of Social Work (FIFSW) and Department of Family & Community Medicine.

Women had almost double the prevalence of PTSD in comparison to men (6.9% versus 3.9%). And those who were widowed or divorced were twice as likely to experience the disorder compared to those who were married or living common-law (8.8% versus 4.4%).

When it came to age, the prevalence of PTSD was highest for those 45 to 54 years old (6.4%) and lowest for those aged 75 and older (3.1%).

"This supports previous research, which found that PTSD tends to be most common among men in their early 40s and women in their early 50s," says co-author, Karen Kobayashi, Professor in the Department of Sociology and a Research Fellow at the Institute on Aging & Lifelong Health at the University of Victoria.

The prevalence of PTSD was higher among individuals who had at least two health conditions, who were experiencing chronic pain, or who had a history of smoking.

"This is consistent with results from other studies, which found increased risks of cardiovascular, metabolic, and musculoskeletal conditions among individuals with PTSD," states co-author Meghan West, a Master of Social Work student at the U of T's FIFSW. "These links may be due to alterations in the hypothalamic-pituitary-adrenal axis (HPA axis), sympathetic nervous system inflammation, or health behaviours that increase the risk of poor physical health."

The prevalence of PTSD among visible minority immigrants (7.5%) was more than double that of white immigrants (3.6%) and approximately one-third higher than that of whites born in Canada (5.6%).

"Our findings underline the importance of considering race and immigration status separately," states co-author Hongmei Tong, Assistant Professor of Social Work at MacEwan University in Edmonton.

"Visible minority immigrants in Canada are largely from South Asia, China, and the Middle East, where groups of individuals have experienced political conflict and/or disruption during the previous 60 years," he says. "Immigrants from these regions are also more likely to have experienced traumatic incidents such as natural disasters and armed conflict, and could be at greater risk of PTSD as a result. As such, there may be a greater need for mental health resources for visible minority immigrants."

Credit: 
University of Toronto

'Zoombombing' research shows legitimate meeting attendees cause most attacks

BINGHAMTON, NY -- Most zoombombing incidents are "inside jobs," according to a new study by researchers at Binghamton University, State University of New York.

As the COVID-19 virus spread worldwide in early 2020, much of our lives went virtual, including meetings, classes and social gatherings.

The videoconferencing app Zoom became an online home for many of these activities, but the migration also led to incidents of "zoombombing" -- disruptors joining online meetings to share racist or obscene content and cause chaos. Similar apps such as Google Meet and Skype also saw problems.

Cybersecurity experts expressed concerns about the apps' ability to thwart hackers. A new study from researchers at Binghamton University and Boston University, however, shows that most zoombombing incidents are "inside jobs."

Assistant Professor Jeremy Blackburn and PhD student Utkucan Balci from the Department of Computer Science at Binghamton's Thomas J. Watson College of Engineering and Applied Science teamed up with Boston University Assistant Professor Gianluca Stringhini and PhD student Chen Ling to analyze more than 200 calls from the first seven months of 2020.

They found that the vast majority of zoombombing incidents are not caused by attackers stumbling upon meeting invitations or "bruteforcing" their ID numbers, but rather by insiders who have legitimate access to these meetings, particularly students in high school and college classes. Authorized users share links, passwords and other information on sites such as Twitter and 4chan, along with a call to stir up trouble.

"Some of the measures that people would think stops zoombombing -- such as requiring a password to enter a class or meeting -- did not deter anybody," Blackburn said. "Posters just post the password online as well.

"Even the waiting rooms in Zoom aren't a deterrent if zoombombers name themselves after people who are actually in the class to confuse the teacher. These strategies that circumvent the technical measures in place are interesting. It's not like they're hacking anything -- they're taking advantage of the weaknesses of people that we can't do anything about."

Because almost all targeting of Zoom meetings happens in real time (93% on 4chan and 98% on Twitter), the attacks seem to happen in an opportunistic fashion. Zoombombing posts cannot be identified ahead of time, so hosts have little or no time to prepare.

"It's unlikely that there can be a purely technical solution that isn't so tightly locked up that it becomes unusable," Blackburn said. "Passwords don't work -- that's the three-word summary of our research. We need to think harder about mitigation strategies."

Because of the worldwide reach of the internet, the research team found that the problem is not restricted to just one country or time zone.

"We found zoombombing calls from Turkey, Chile, Bulgaria, Italy and the United States," Balci said. "It's a globalized problem now because of the circumstances of COVID."

Examining the dark corners of the internet has been Blackburn's main research for the past decade, but as anonymity breeds antisocial behavior and hate, there are -- sadly -- always new topics to consider.

"When we start turning over rocks, it's amazing what crawls out from under them," he said. "We're trying to look for one problem, but we'll also find five other problems under there that are somehow related, and we have to look at that, too."

One big drawback to this kind of study is having to do both quantitative and qualitative analyses on vile hate speech. It even has to be published with a warning so that readers can brace themselves for what's ahead.

Blackburn and Balci both said that the camaraderie and open conversations at Blackburn's lab keep everyone on an even keel.

"We do our best to make sure everybody is not taking it too personally," Blackburn said. "If you don't look at the content, you can't really do research about it, but if you look at the content too much or too deeply -- you stare into the abyss a bit too long -- you might fall into it. It's hard walking that line."

Balci added: "Sometimes I don't want to look at Twitter too much because the content is too overwhelming. It might depress me. However, from a research perspective, I'm curious about why these things happen. I just need to look at it in a more objective way."

The research, "A First Look at Zoombombing," was published by the IEEE Symposium on Security and Privacy (Oakland), 2021.

Credit: 
Binghamton University

Thoughts on plant genomes

There are more than 350,000 angiosperm species, which are key components of ecosystems. It is now commonly accepted that their existence is essential for preserving a healthy environment and for the production of food and raw materials. The growing world population and the challenges posed by climate change make the stewardship of these natural resources one of the most crucial issues facing humanity. In this regard, genome sequence information is of fundamental importance for understanding the natural diversity and evolution of living organisms, as well as for designing breeding strategies aimed at producing new varieties with suitable traits.

Although the first genome sequence of the model plant Arabidopsis thaliana was produced more than twenty years ago, the sequencing of horticultural species awaited the advent of the new generation (NextGen) of high-throughput sequencing technologies to overcome the high complexity and large size of their genomes. NextGen technologies have changed the landscape, enabling the sequencing of an ever-increasing number of species, including horticultural plants, with at least 91 fruit species of economic importance and over 200 vegetable plants. By the end of 2018, the genome sequences of 181 horticultural species had become available, including 175 angiosperms (reviewed by Chen et al., Horticulture Research 2019, 6:112).

A "10KP plan" was announced at the XIX International Botanical Congress (Shenzhen China, 2017) with the aim to sequence 10,000 genomes covering every major clade of plants and eukaryotic microbes. (Chen et al. 2018 Front. Plant Sci. 9:418. doi: 10.3389/ fpls.2018.00418) sequenced_plant genomes). However, the current databases gathering large-scale data lacks computing devices with sufficient capacity and appropriate bioinformatics tools, stressing the need to up-grade these databases to allow their optimal processing.

Notably, a relatively small number of plants contribute to the human diet or provide usable raw materials. Moreover, genome sequences of thousands of cultivars of the main cereal crops are now available, while only a small number of fruit and vegetable genomes have been sequenced, despite the important contribution a broad range of these species make to the human diet and healthy food.

Recently, some significant advances have been made to fill this gap. These include:
The persimmon (Diospyros kaki) genome, providing new insights into the inheritance of astringency and ancestral evolution (Zhu et al., Horticulture Research 2019, 6:138).

The snake gourd (Trichosanthes anguina) genome, which sheds light on evolutionary aspects of this Cucurbitaceae and on its fruit development and ripening (Ma et al., Horticulture Research 2020, 7:199).

The genome sequence of chayote (Sechium edule), another Cucurbitaceae species, allowing researchers to gain comparative insight into shared and divergent features among closely related species (Fu et al., Horticulture Research 2021, 8:35).

Overall, these advances attest to the tremendous acceleration of the sequencing effort and testify to the growing contribution of several Chinese groups in this field in recent years. These genomic resources will benefit the entire international scientific community and will provide important leads for future research projects that could not be envisaged without this genome sequence information.

Credit: 
Nanjing Agricultural University The Academy of Science

On the dot: Novel quantum sensor provides new approach to early diagnosis via imaging

[Image: Excessive levels of reactive oxygen species (ROS) cause oxidative stress. Credit: National Institutes for Quantum and Radiological Science and Technology]

Oxygen is essential for human life, but within the body, certain biological environmental conditions can transform oxygen into aggressively reactive molecules called reactive oxygen species (ROS), which can damage DNA, RNA, and proteins. Normally, the body relies on molecules called antioxidants to convert ROS into less dangerous chemical species through a process called reduction. But unhealthy lifestyles, various diseases, stress, and aging can all contribute to an imbalance between the production of ROS and the body's ability to reduce and eliminate them. The resulting excessive levels of ROS cause "oxidative stress," which can disrupt normal cellular functions and increase the risk of diseases like cancer, neurodegeneration, kidney dysfunction, and others, which are all accompanied by severe inflammation.

Since oxidative stress is associated with various serious diseases, its detection within living organs offers a route to early diagnosis and preventive treatment, and is, thus, a matter of considerable interest to scientists working in the field of biomedicine. Recent international collaboration between the Japanese National Institutes for Quantum and Radiological Science and Technology (QST), Bulgarian Academy of Sciences, and Sofia University St. Kliment Ohridski in Bulgaria led to a promising technology for this purpose: a novel quantum sensor. Their work is published in the scientific journal Analytical Chemistry, 2021.

According to lead scientist Dr. Rumiana Bakalova and her colleague Dr. Ichio Aoki of QST, "the new sensor is appropriate for the early diagnosis of pathologies accompanied by inflammation, such as infectious diseases, cancer, neurodegeneration, atherosclerosis, diabetes, and kidney dysfunction."

The sensor comprises a semiconductor quantum dot core coated with a ring-shaped sugar-like compound called α-cyclodextrin, which in turn is bonded to six redox-sensitive chemical groups called nitroxide derivatives. These components have the advantage of favorable safety profiles, with cyclodextrins being approved for use in food and nitroxide derivatives being considered generally harmless for living beings due to their antioxidant properties.

The nitroxide derivatives cause the sensor to give ON fluorescence signals when in a reduced state and ON magnetic signals when in an oxidized state. This allows for the detection of oxidative stress, or of the cell/tissue reducing capacity, using methods such as magnetic resonance imaging (MRI) and electron paramagnetic resonance (EPR) imaging, which can detect magnetic signals. The chemical sensor is also bonded to a compound called triphenylphosphonium, which helps the sensor enter living cells and proceed to the mitochondria, which are the cellular components most often responsible for generating ROS, particularly under pathologic conditions.

To test their novel chemical sensor, the scientists first performed experiments with cultures of normal (healthy) and cancerous colon cells in the lab. For this they used their sensor in the oxidized form. In healthy cells, EPR signals were quenched; but in cancer cells, they stayed strong. This indicates that the sensors were reduced in healthy cells by antioxidants but remained in their oxidized state in the cancer cells, which in turn suggests that the cancerous cells had a higher oxidative capacity.

To further test the sensor, the researchers conducted experiments with both healthy mice and those that had been raised on a high-cholesterol diet for 2 months, which caused them to develop early-stage kidney dysfunction due to persistent inflammation. Compared with the healthy mice, the mice with kidney dysfunction exhibited stronger MRI signals in their kidneys, suggesting that their kidneys were under greater oxidative stress.

This work is in its initial stages and much research is required before these sensors can be ready for medical use. But these findings reveal the potential of such technology. Dr. Bakalova notes: "Our sensor is suitable for analyzing even small redox imbalances associated with the overproduction of ROS, via MRI. And while MRI and CT by themselves have been able to diagnose advanced stage kidney damage, they have not yet been able to visualize early stages of dysfunction. The use of our probe could help clinicians identify patients in the early stage of renal damage before they need hemodialysis or kidney transplantation. With further research, our sensor could be the next generation of redox-sensitive contrast probes for early diagnosis of kidney dysfunction, and perhaps, a number of other diseases that are accompanied by inflammation."

Credit: 
The National Institutes for Quantum Science and Technology