
Extra choline may help pregnant women decrease negative effects of COVID-19 on their newborns

Pregnant women who take extra choline supplements may mitigate the negative impact that viral respiratory infections, including COVID-19, can have on their babies, according to a new study from researchers in the Departments of Psychiatry and Obstetrics and Gynecology at the University of Colorado Anschutz Medical Campus. Choline, an essential nutrient often grouped with the B vitamins, is found in various foods and dietary supplements and is critical to fetal brain development.

"The Centers for Disease Control and Prevention (CDC) predicts that COVID-19 will impact fetal brain development like other common corona respiratory viruses," said Robert Freedman, MD, professor of psychiatry at the CU Anschutz Medical Campus and lead researcher.

The new study, published in the Journal of Psychiatric Research, specifically looked at whether higher prenatal choline levels can help protect the fetus's developing brain even if the mother contracts a viral respiratory infection in early pregnancy. The results reveal that higher prenatal choline levels mitigate the fetal impact of viral infection.

"It's important for the healthcare community, and soon to be mothers, to be aware that a natural nutrient can be taken during pregnancy, just like folic acid and other prenatal vitamins, to protect fetuses and newborns from brain development issues. Later on in life, these development issues can lead to mental illness," Freedman adds.

In the study, researchers analyzed the effects on infant behavior when the mother had contracted a respiratory virus by measuring the infant's IBQ-R Regulation dimension - which looks at the development of infant attention and other self-regulatory behaviors. Lower IBQ-R Regulation at one year of age is associated with problems in attention and social behavior in later childhood, including decreased reading readiness at age four and problems with concentration and conscientiousness through seven years of age.

The results from the study:

- Infants of mothers who had viral infections and higher choline levels had significantly higher 3-month IBQ-R scores on the Regulation dimension, and specifically on its Attention scale, compared to infants of mothers who had viral infections and lower choline levels.

- Choline levels sufficient to protect the fetus often require dietary supplements.

- The increased maternal anxiety and depression in the virus-infected mothers were not associated with their infants' IBQ-R Regulation scores.

The study highlights that in conjunction with the CDC's current advice on COVID-19's effects in pregnancy, phosphatidylcholine or choline supplements along with other prenatal vitamins may help buffer the fetal brain from the possible detrimental impact of the current pandemic and decrease the risk of the children's future mental illness.

"Previous pandemics have resulted in significantly increased levels of mental illnesses including schizophrenia, autism spectrum disorder and attention deficit disorder in the offspring," said Camille Hoffman, MD, associate professor of obstetrics and gynecology and a maternal-fetal medicine specialist at the CU Anschutz Medical Campus. "However, since data from COVID-19 itself will not be available for years, we're hoping our study findings will provide valuable information for soon to be mothers on the importance of taking choline supplements daily during pregnancy."

Credit: 
University of Colorado Anschutz Medical Campus

Kirigami grips could help seniors keep their footing

video: This pop-up shoe grip, inspired by snake skin, increases friction between the shoe and the ground. When the material stretches, the cuts pop out into spikes that dig into the ground. When the foot flattens, the spikes fold back into the material, creating a smooth surface again.

Image: 
(Video courtesy of Ahmad Rafsanjani/Harvard SEAS)

You've never seen a pair of snake skin shoes like this before.

Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and MIT have developed pop-up shoe grips, inspired by snake skin, that can increase friction between the shoe and the ground. The assistive grips could be used, among other things, to reduce the risk of falling among older adults.

The research is published in Nature Biomedical Engineering.

"Falls are the leading cause of the death for older adults and the second leading cause of occupational-related deaths," said Giovanni Traverso, an assistant professor of Mechanical Engineering at MIT and co-corresponding author of the paper. "If we could control and increase the friction between us and the ground, we could reduce the risk of these types of falls, which not only cost lives but billions of dollars in medical bills every year."

The researchers used kirigami -- the Japanese art of paper cutting -- to mimic snake scales. The gripper is made from a thin, flexible steel sheet, with dozens of scale-like cuts. When the material stretches, the cuts pop out into spikes that dig into the ground and create friction. When the foot flattens, the spikes fold back into the material, creating a smooth surface again.

"As you walk, the curvature of your shoe changes," said Sahab Babaee, a research scientist at MIT and co-lead author of the paper. "We designed these assistive grippers to pop-out when weight shifts from the heel to the toe and the shoe bends and stretches along the soles."

The multidisciplinary research team studied gaits to find exactly the right foot curvature needed to trigger the spikes and tested the device on multiple surfaces, including ice.

"Through a combination of simulations and experiments, we carefully choose where to make the cuts so that the spikes are stiff and pop out at the best possible angle of attack to maximize the grip of the kirigami with the contacting surface," said Ahmad Rafsanjani, a former postdoctoral fellow at SEAS and co-lead author of the paper. Rafsanjani is currently a senior scientist at ETH Zurich.

In tests, the device not only weighed less and was easier to put on and take off than leading commercial crampons, but it also outperformed them in creating friction.

"These lightweight, kirigami metasurfaces could play an important role in public health to mitigate slipping and falling in a range of different environments," said Katia Bertoldi, the William and Ami Kuan Danoff professor of Applied Mechanics at SEAS and co-corresponding author of the paper. "They could also be used to improve the mobility of all-terrain robots that could one day travel across difficult environments for search and rescue missions."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Extended parenting helps young birds grow smarter

image: A wild Siberian jay parent (left) and its retained offspring (right) foraging together

Image: 
Michael Griesser

Humans are unusual, even among primates, in the length of our "extended childhood." Scientists think that this period of childhood and adolescence, which gives us lots of time to explore, create, and learn, is a key reason why we are smart enough to learn skills that take years to master. But humans are not the only species with an extended childhood. Elephants, some bats, whales, dolphins, and some birds - especially corvids - also have them. But does an extended childhood make other species smart too, and if so, what is the role of parenting?

A team of scientists from the Max Planck Institute for the Science of Human History, the University of Konstanz and the UK tackled these questions by combining the results of their own fieldwork on two corvid species - Siberian jays and New Caledonian crows - with published data from 127 corvid species and several thousand species in the passerine (songbird) order. The study, published in Philosophical Transactions of the Royal Society B, offers a groundbreaking new view on the evolution of intelligence, where parenting takes center stage.

Parenting pays the costs of extended childhoods

Researchers spent years observing two bird species in the wild to understand how young birds' learning is related to the parenting they receive during adolescence and to their survival in adulthood. At a study site in Sweden, researchers used field experiments to test the ability of young Siberian jays to learn crucial life skills: recognizing a dangerous predator and opening a puzzle box to access food. Across the northern Palearctic, Siberian jays live in family groups which can include not only the young of a breeding pair, but also young that were born in other groups. These young can stay with the family group for up to four years. Young birds that stayed with their parents longer benefitted: they learned faster by watching their parents and received more food from them. As a consequence, they were more likely to live longer and to start their own families.

At a study site in New Caledonia, researchers followed New Caledonian crows to track how juveniles learn a key survival skill: making tools for food retrieval. It takes about a year to learn this skill - a costly time investment for the parents who still have to feed the young. Surprisingly, these crows can stay with their parents for up to three years, allowing for a much longer "childhood" than most other crows. Parents and other adults are extremely tolerant of young crows. While adults are using a tool to get food, they feed the juveniles, let them watch closely, and even tolerate tool theft and physical contact by juveniles. As a result of this tolerant learning environment, New Caledonian crows have the largest brain size for their body size of all corvids.

Extended parenting affects intelligence

The authors argue that the key role of parenting on the evolution of cognition has been overlooked so far. Often thought of as merely an inevitable chore, parental care is the reason children can spend their childhood learning and making mistakes.

"Extended parenting has profound consequences for learning and intelligence," explains Michael Griesser of the University of Konstanz. "Learning opportunities arise from the interplay between extended childhood and extended parenting. The safe haven provided by extended parenting is critical for learning opportunities. It creates extended developmental periods that feed back into the extended childhood."

In addition to benefitting young learners, extended parenting helps pay for the costs of an extended childhood. Having to feed extra mouths is costly, but when there is enough food available in the environment, parents can afford to keep on feeding the young for longer. With a safe haven, young birds have the time to grow a larger brain, learn difficult skills, and access vital food resources. These acquired skills lead to better survival, and possibly also allow the species to expand into new environments.

Corvids are unusual birds, but are similar to humans

The researchers used phylogenetic comparative methods to analyze the differences between corvids and all other passerines. Corvids have much larger brains relative to their body size, like humans. They also have prolonged developmental periods, both in the nest and after they leave - another characteristic of humans.
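
The notion of a brain that is "large relative to body size" can be made concrete with a simple allometric baseline: regress log brain mass on log body mass across species and treat the residual as relative brain size. The sketch below illustrates only that idea, with invented numbers and a plain least-squares fit; the study itself used phylogenetic comparative methods, which additionally account for shared ancestry among species, and its data are not reproduced here.

```python
# Hedged sketch of the "relative brain size" idea: fit an allometric baseline
# (log brain mass vs. log body mass) and treat residuals as relative brain
# size. All numbers are invented, and this plain least-squares fit ignores
# phylogeny; the study used phylogenetic comparative methods instead.
import numpy as np

rng = np.random.default_rng(2)

# Synthetic baseline of "typical passerines": brain ~ a * body^b with scatter.
body_passerine = 10 ** rng.uniform(1.0, 2.5, size=200)                      # grams
brain_passerine = 0.06 * body_passerine ** 0.58 * 10 ** rng.normal(0, 0.05, 200)

slope, intercept = np.polyfit(np.log10(body_passerine), np.log10(brain_passerine), 1)

def relative_brain_size(body_g, brain_g):
    """log10 deviation from the passerine baseline; > 0 means larger than expected."""
    return np.log10(brain_g) - (intercept + slope * np.log10(body_g))

# Hypothetical species, placed above and on the baseline for illustration.
print(f"hypothetical corvid:    {relative_brain_size(575.0, 9.0):+.2f}")
print(f"hypothetical passerine: {relative_brain_size(30.0, 0.45):+.2f}")
```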

"Both humans and corvids spend their youth learning vital skills, surrounded by tolerant adults which support their long learning process," explains Natalie Uomini of the Max Planck Institute. "Moreover, corvids and humans have the ability for lifelong learning - a flexible kind of intelligence which allows individuals to adapt to changing environments throughout their lifetime."

In the light of this study, the importance of parenting comes into even greater focus. Parents have a vital role in helping young brains grow smarter. Children, like young birds, cannot learn skills in isolation. Instead they need a nurturing, supportive environment that allows the full potential of their large brains to develop.

Credit: 
Max Planck Institute of Geoanthropology

Impact of children's loneliness today could manifest in depression for years to come

Children and adolescents are likely to experience high rates of depression and anxiety long after current lockdown and social isolation ends and clinical services need to be prepared for a future spike in demand, according to the authors of a new rapid review into the long-term mental health effects of lockdown.

The research, which draws on over 60 pre-existing, peer-reviewed studies into topics spanning isolation, loneliness and mental health for young people aged 4 - 21, is published today (Monday 1 June 2020) in the Journal of the American Academy of Child and Adolescent Psychiatry.

According to the review, young people who are lonely might be as much as three times more likely to develop depression in the future, and the impact of loneliness on mental health could last for at least nine years.

The studies highlight an association between loneliness and an increased risk of mental health problems for young people. There is also evidence that duration of loneliness may be more important than the intensity of loneliness in increasing the risk of future depression among young people.

This, say the authors, should act as a warning to policymakers of the expected rise in demand for mental health services from young people and young adults in the years to come - both here in the UK and around the world.

Dr Maria Loades, clinical psychologist from the Department of Psychology at the University of Bath who led the work, explained: "From our analysis, it is clear there are strong associations between loneliness and depression in young people, both in the immediate and the longer-term. We know this effect can sometimes be lagged, meaning it can take up to 10 years to really understand the scale of the mental health impact the COVID-19 crisis has created."

For teachers and policymakers currently preparing for a phased re-start of schools in the UK, scheduled from today, Monday 1 June, Dr Loades suggests the research could have important implications for how this process is managed too.

She adds: "There is evidence that it's the duration of loneliness as opposed to the intensity which seems to have the biggest impact on depression rates in young people. This means that returning to some degree of normality as soon as possible is of course important. However, how this process is managed matters when it comes to shaping young people's feelings and experiences about this period.

"For our youngest and their return to school from this week, we need to prioritise the importance of play in helping them to reconnect with friends and adjust following this intense period of isolation."

Members of the review team were also involved in a recent open letter to UK Education Secretary, Gavin Williamson MP, focusing on support for children's social and emotional wellbeing during and after lockdown. In their letter they suggested that:

- The easing of lockdown restrictions should be done in a way that provides all children with the time and opportunity to play with peers, in and outside of school, and even while social distancing measures remain in place;

- Schools should be appropriately resourced and given clear guidance on how to support children's emotional wellbeing during the transition period as schools reopen and that play - rather than academic progress - should be the priority during this time;

- The social and emotional benefits of play and interaction with peers must be clearly communicated, alongside guidance on the objective risks to children.

Acknowledging the trade-offs that need to be struck in terms of restarting the economy and reducing educational disparities, their letter to the Education Secretary concludes: 'Poor emotional health in children leads to long term mental health problems, poorer educational attainment and has a considerable economic burden.'

Credit: 
University of Bath

Heightened interaction between neolithic migrants and hunter-gatherers in Western Europe

image: The burial of Pendimoun F2 (5480-5360 BCE), a woman carrying about 55% hunter-gatherer ancestry.

Image: 
Henri Duday

The Neolithic lifestyle, including farming, animal domestication and the development of new technologies, emerged in the Near East around 12,000 years ago and contributed profoundly to the modern way of life. The Neolithic spread rapidly across Europe, mainly along the Danube valley and the Mediterranean coastline, reaching the Atlantic coast around 5000-4500 BCE. The existing archaeogenetic data from prehistoric European farmers indicate that the spread of farming was driven by expanding populations of early farmers who mixed little, if at all, with indigenous hunter-gatherer groups. However, until now, no archaeogenetic data were available for France.

"France is where the two streams of the Neolithic expansion overlapped, so understanding how these groups interacted would fill in a big piece of the puzzle," says Wolfgang Haak, senior author of the study. "The data we're collecting suggests a more complex scenario than elsewhere in Europe, with more interaction between early farmers and hunter-gatherers."

These interactions seem to vary greatly from one region to another, attesting to a diverse cultural mosaic in early Neolithic Western Europe. In order to document the biological interactions during this transition period, researchers from the Max Planck Institute for the Science of Human History teamed up with colleagues from the PACEA laboratory (1*) in Bordeaux, the CEPAM laboratory (2*), the RGZM (3*), and other international partners (4*). The study, published in Science Advances, reports new genome-wide data for 101 prehistoric individuals from 12 archaeological sites in today's France and Germany, dating from 7000-3000 BCE.

High levels of hunter-gatherer ancestry in early farmers from France

The new results showed evidence for a higher level of admixture, or the combination of genetic information from genetically distant populations, between early migrant farmers and local hunter-gatherers in France. This degree of genetic mixing is not seen anywhere else in Europe during the early stages of the Neolithic expansion. The genetic contribution of hunter-gatherers is particularly high in the south of France, roughly 31% on average, compared with 3% in Central Europe and 13% in the Iberian Peninsula.

Intriguingly, in an individual from the Pendimoun site in Provence (5480-5360 BCE), the genetic contribution of local hunter-gatherers was as high as 55%. The team could show that the admixture in this individual occurred recently, about four generations before, shortly after the first Neolithic farmers settled on that part of the French coast. "These findings suggest continuous contacts between both groups for at least a century," says Maïté Rivollat, postdoc in the INTERACT project and lead author of the study.

Genetic evidence for the two routes of the Neolithic expansion

Leveraging the genetic substructure observed in European hunter-gatherers, the team was able to retrace the dynamics of admixture in various European regions. Neolithic farmers in central Europe carry a very small hunter-gatherer component, which had already been acquired in southeastern Europe and was carried along as the farmers spread. This accounts for the rapid spread of Neolithic groups with a negligible amount of interaction with local hunter-gatherers. By contrast, Neolithic farmers from west of the Rhine river (in France, Spain and Great Britain) carry a genetic component inherited from local Mesolithic groups, implying a process of late, local admixture.

The new data highlight the complexity and regional variability of biological and cultural interactions between farmer and hunter-gatherer communities during the Neolithic expansion. "This study shows that we can add a lot more detail with focused sampling and unravel the regional dynamics of the farmer-forager interactions," concludes Rivollat. "With the increasing amount of genetic data, we gain the much-needed resolution to investigate biological processes in the past and to understand their relations with observed cultural phenomena."

Credit: 
Max Planck Institute of Geoanthropology

Neuropathogenesis, neurologic manifestations of coronaviruses

What The Study Did: Potential tissue targets and routes of entry of SARS-CoV-2 into the central nervous system and reported neurological complications of COVID-19 are identified in this narrative review.

Authors: Serena Spudich, M.D., of the Yale University School of Medicine in New Haven, Connecticut, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamaneurol.2020.2065)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Growing evidence that minority ethnic groups in England may be at higher risk of COVID-19

Previous pandemics have often disproportionately impacted ethnic minorities and socioeconomically disadvantaged populations. While early evidence suggests that the same may be occurring in the current SARS-CoV-2 pandemic, research into the subject remains limited.

A team of researchers at the University of Glasgow and Public Health Scotland, UK, analysed data on 392,116 participants in the UK Biobank study, a large long-term study investigating the contribution of genes and the environment to the development of disease. UK Biobank data, which include information on social and demographic factors, such as ethnicity and socioeconomic position, health and behavioural risk factors, were linked to results of COVID-19 tests conducted in England between 16th March 2020 and 3rd May 2020. Out of the total number of participants whose data were analysed, 348,735 were White British, 7,323 were South Asian and 6,395 were from black ethnic backgrounds. In total, 2,658 participants had been tested for SARS-CoV-2 and 948 had at least one positive test. Of those, 726 received a positive test in a hospital setting, suggesting more severe illness.

The authors found that, compared to people from white British backgrounds, the risks of testing positive were largest in black and South Asian minority groups, who were 3.4 and 2.4 times more likely to test positive, respectively, with people of Pakistani ethnicity at highest risk within the South Asian group (3.2 times more likely to test positive). Ethnic minorities were also more likely to receive their diagnosis in a hospital setting, which suggests more severe illness. The observed ethnic differences in infection risk did not appear to be fully explained by differences in pre-existing health, behavioural risk factors, country of birth, or socioeconomic differences. The authors also found that living in a disadvantaged area was associated with a higher risk of testing positive, particularly for the most disadvantaged (2.2 times more likely to test positive compared to the least disadvantaged), as was having the lowest level of education (2.0 times more likely to test positive compared to the highest level of education).
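
As an illustration of how adjusted risk ratios of this kind are typically estimated, the sketch below fits a Poisson regression with robust standard errors (one common way to obtain risk ratios for a binary outcome) to simulated data. The variable names, group proportions and effect sizes are invented for the example; this is not the study's code or dataset.

```python
# Hedged sketch: estimating adjusted risk ratios for a positive test by
# ethnicity, deprivation and education on simulated data. Exponentiated
# coefficients from a Poisson model with robust errors approximate risk ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20_000
df = pd.DataFrame({
    "ethnicity": rng.choice(["white_british", "south_asian", "black"],
                            size=n, p=[0.96, 0.02, 0.02]),
    "deprived_area": rng.integers(0, 2, size=n),   # 1 = most disadvantaged area
    "low_education": rng.integers(0, 2, size=n),   # 1 = lowest qualification level
})

# Simulate a binary outcome whose risk varies by group (toy numbers only).
base_risk = 0.01
risk = base_risk * np.select(
    [df.ethnicity == "black", df.ethnicity == "south_asian"], [3.4, 2.4], 1.0
) * np.where(df.deprived_area == 1, 2.2, 1.0)
df["positive"] = (rng.random(n) < risk).astype(int)

model = smf.glm(
    "positive ~ C(ethnicity, Treatment('white_british')) + deprived_area + low_education",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")          # robust (sandwich) standard errors

print(np.exp(model.params))    # exponentiated coefficients ~ adjusted risk ratios
```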

The findings suggest that some ethnic minority groups, especially black and South Asian people, may be particularly vulnerable to the adverse consequences of COVID-19. An immediate policy response is needed to ensure that the health system is responsive to the needs of ethnic minority groups, according to the authors. This should include ensuring that health and care workers, who often are from minority ethnic populations, have access to the necessary personal protective equipment. Timely communication of guidelines to reduce the risk of being exposed to the virus, provided in a range of languages, should also be considered.

The authors caution that test result data was only available for England. Those who were more advantaged were more likely to participate in the UK Biobank study and ethnic minorities may be less well represented. Further research is needed to investigate whether these findings are reflective of the broader UK population, alongside analysis of other datasets examining how SARS-CoV-2 infection affects different ethnic and socioeconomic groups, including in representative samples across different countries.

Credit: 
BMC (BioMed Central)

Glucocorticoids are harmful in treating viral respiratory infections

Glucocorticoids are widely used in treating acute respiratory distress syndrome (ARDS) despite there being no indisputable scientific evidence of their effectiveness. The main reason seems to be that there is no effective treatment for ARDS patients on ventilators. The death rate among these patients ranges from 30 to 40 percent, depending on the data.

ARDS is often caused by a serious viral or bacterial infection. Type I interferons alpha and beta (IFNs) are signalling proteins produced by the human body and are needed to fight off viral infections. For this reason, IFNs have been used successfully in treating ARDS in early-phase clinical trials. However, in a recently published later-phase clinical trial (INTEREST), this effect was no longer observed. The trial was carried out as an international study covering 300 patients at different medical centres.

A closer analysis revealed that most patients who participated in the study had received glucocorticoids in addition to IFN beta, which proved to be exceedingly harmful. The death rate of patients who were treated only with interferons was 10.6 percent, but the addition of glucocorticoids increased the death rate to 39.7 percent.

Glucocorticoids inhibit interferon signalling and increase mortality

At the MediCity Research Laboratory of the University of Turku, Finland, the research groups of Academician, Professor Sirpa Jalkanen and Academy Research Fellow Maija Hollmén investigated what caused this abrupt increase in death rates.

The researchers demonstrated with cell and tissue cultures that glucocorticoids inhibit IFN signalling and prevent both the body's own and administered interferon from fighting against the disease.

"This is probably the most important observation for saving human lives that I have made in my career. After we overcame the immense disappointment caused by the results of the INTEREST trial, we became sure that there had to be an explanation - and now we have found it", says Professor Sirpa Jalkanen.

The findings are extremely important especially during the current pandemic, as the COVID-19 disease incapacitates the body's IFN production. WHO has already forbidden the use of glucocorticoids in treating COVID-19.

Credit: 
University of Turku

Better outcomes, lower cost in first-ever oncology hospital at home evaluation

image: Kathleen Mooney presents data on Huntsman Cancer Institute's Huntsman at Home evaluation.

Image: 
Huntsman Cancer Institute

SALT LAKE CITY - Researchers at Huntsman Cancer Institute (HCI) at the University of Utah (U of U) presented the first outcomes evaluation of an adult oncology hospital-at-home program today at the 2020 American Society of Clinical Oncology (ASCO) annual meeting. The study evaluated patients participating in HCI's Huntsman at Home™. The data demonstrate strong evidence for this care model, showing improved patient outcomes, including reduced hospitalizations and decreased visits to the emergency department.

Huntsman at Home was launched in 2018 as a way to bring HCI-quality care to cancer patients in their homes. The service combines HCI research and clinical expertise for in-person and remote patient and caregiver support and acute-level clinical treatment. A team of oncology professionals deliver care, following best-practice standards. Currently, Huntsman at Home is available to HCI patients living within a 20-mile radius of the flagship hospital in Salt Lake City.

The lead author of the study is Kathi Mooney, PhD, RN, interim senior director of population sciences at HCI and distinguished professor of nursing at the U of U. Mooney and her colleagues evaluated outcomes over 14 months for 367 cancer patients: 169 who participated in Huntsman at Home and 198 control patients who qualified for the program but lived outside the service area. Patients with several types of cancer and at various stages of cancer were evaluated. During the first 30 days of enrollment, Huntsman at Home patients were 58% less likely to be admitted for an unplanned hospital stay, and those who were admitted to the hospital had a shorter length of stay. Huntsman at Home patients had 48% fewer emergency department visits. They also had 48% lower cumulative charges for clinical services when compared to controls. Results over 90 days were similarly robust.
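
The headline comparisons reported here boil down to ratios of event rates between the two cohorts. The snippet below is a minimal illustration of that arithmetic using the published cohort sizes but hypothetical event counts and rates; it is not the study's analysis.

```python
# Minimal illustration of the arithmetic behind the reported comparisons:
# relative risk of an unplanned admission and relative difference in emergency
# department (ED) visit rates between the two cohorts. Cohort sizes come from
# the press release; the event counts and rates are hypothetical placeholders.
def relative_risk(events_a, n_a, events_b, n_b):
    """Risk of the event in cohort A divided by risk in cohort B."""
    return (events_a / n_a) / (events_b / n_b)

home_n, control_n = 169, 198            # Huntsman at Home vs. control cohort sizes
home_admits, control_admits = 20, 55    # hypothetical 30-day unplanned admissions

rr = relative_risk(home_admits, home_n, control_admits, control_n)
print(f"Relative risk of unplanned admission: {rr:.2f} "
      f"({(1 - rr) * 100:.0f}% lower in the at-home cohort)")

# The same comparison applies to ED visits or cumulative charges per patient.
home_ed_rate, control_ed_rate = 0.26, 0.50   # hypothetical visits per patient
print(f"ED visits: {(1 - home_ed_rate / control_ed_rate) * 100:.0f}% fewer")
```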

"These findings strongly support our hypothesis that Huntsman at Home's high-quality, acute-level cancer care using a hospital-at-home model improves outcomes while simultaneously improving value," said Mooney.

Huntsman at Home services range from symptom management to acute medical, post-surgical, and end-of-life care. The Huntsman at Home team is led by HCI nurse practitioners working in conjunction with HCI oncologists and is operated in partnership with Community Nursing Services, a home health and hospice agency that provides registered nurses for the team. Other cancer care specialists such as social workers and physical therapists contribute to patient care. Patients must receive a referral from their oncologist and live within a 20-mile radius of HCI.

Mooney and her colleagues plan to continue evaluating outcomes of patients participating in this program. They are also working to implement a geographic expansion of Huntsman at Home in late summer 2020, extending care to several Utah rural counties.

The 2020 ASCO Annual meeting was held virtually from May 29 to May 30. As one of the largest clinical cancer research meetings in the world, ASCO brings together more than 30,000 professionals world-wide for on-demand and scheduled broadcasts of the latest cutting-edge oncology research.

Huntsman at Home is funded by HCI and Huntsman Cancer Foundation. The evaluation of Huntsman at Home is supported by the Cambia Health Foundation. The rural expansion is supported by the Huntsman Foundation and the Rita & Alex Hillman Foundation.

Credit: 
Huntsman Cancer Institute

Study: Integrating satellite and socioeconomic data to improve climate change policy

image: Atul Jain led a study that used a combination of satellite and census data to identify deforestation and expanding saltwater farming as the key physical and socioeconomic drivers of climate change in Bangladesh.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Bangladesh is on track to lose all of its forestland in the next 35-40 years, leading to a rise in CO2 emissions and subsequent climate change, researchers said. However, that is just one of the significant land-use changes that the country is experiencing. A new study uses satellite and census data to quantify and unravel how physical and economic factors drive land-use changes. Understanding this relationship can inform climate policy at the national scale in Bangladesh and beyond.

The study, led by University of Illinois at Urbana-Champaign atmospheric sciences professor Atul Jain and postdoctoral researcher Xiaoming Xu, is published in the journal Regional Environmental Change.

"Land usage changes when biophysical factors like temperature and soil quality change, but also when the economic needs of people change," Xu said. The study identifies two key areas where land use and cover have shifted because of biophysical and socioeconomic activities in Bangladesh - and suggests policies to mitigate their influence on climate change.

First, the team found that approximately 11% of the forests in Bangladesh have shifted to shrub land, cropland and urban land from 2000-10.
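
For readers curious how a figure like this is typically computed, the sketch below derives a forest-conversion percentage and a full land-cover transition matrix from two classified maps. The rasters, class codes and change pattern are synthetic stand-ins; the study's actual satellite processing and classification are not reproduced here.

```python
# Hedged sketch of how a forest-conversion estimate can be derived from two
# classified land-cover maps (e.g., one for 2000 and one for 2010).
import numpy as np

FOREST, SHRUB, CROP, URBAN, WATER = 0, 1, 2, 3, 4
rng = np.random.default_rng(1)

# Synthetic "2000" map and a "2010" map in which ~11% of forest pixels change.
lc_2000 = rng.choice([FOREST, SHRUB, CROP, URBAN, WATER], size=(500, 500),
                     p=[0.15, 0.10, 0.55, 0.10, 0.10])
lc_2010 = lc_2000.copy()
forest_idx = np.flatnonzero(lc_2000 == FOREST)
changed = rng.choice(forest_idx, size=int(0.11 * forest_idx.size), replace=False)
lc_2010.flat[changed] = rng.choice([SHRUB, CROP, URBAN], size=changed.size)

# Fraction of year-2000 forest that is no longer forest in 2010.
forest_2000 = lc_2000 == FOREST
lost = forest_2000 & (lc_2010 != FOREST)
print(f"Forest converted to other uses: {lost.sum() / forest_2000.sum():.1%}")

# A full transition matrix (rows: 2000 class, columns: 2010 class) summarizes
# every land-use shift, including the growth of water bodies, in one table.
transition = np.zeros((5, 5), dtype=int)
np.add.at(transition, (lc_2000.ravel(), lc_2010.ravel()), 1)
print(transition)
```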

"Extreme climate events, such as drought and flood, changes in urban and rural population and economic conditions are driving the changes from forest to shrub land in the southeast region of Bangladesh," Jain said. "Here, the locals earn their livelihood by using land, lumber and fuel resources from forests. However, deforestation may be controlled by implementing simple policies such as road improvement, which can provide the people with a means to obtain alternative fuels and livelihoods that are not as dependent on forests."

The study also found that from 2000-10, the area of standing water bodies such as ponds, lakes and reservoirs increased by approximately 9%. This change occurred in the coastal southwest part of the country, where worsening floods during monsoon months have pushed aquaculture expansion at the cost of cropland in recent decades, the researchers said.

"The rapid conversion of traditional rice-farming land to saltwater shrimp ponds is now a well-established practice in the southwest coastal area of Bangladesh," Xu said. "Shrimp farming is 12 times more profitable than rice cultivation in this country."

The flooding and expansion of saltwater farming have led to increased soil salinity, spoiling the soil for farming purposes, the researchers said. "Policies need to be developed that encourage the development of saltwater aquaculture only in the regions with favorable conditions to prevent further soil degradation," Jain said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Older men worry less than others about COVID-19

ATLANTA--Older men may be at greater risk of contracting COVID-19 because they worry less about catching or dying from it than women their age or than younger people of both sexes, according to a new study by Sarah Barber, a gerontology and psychology researcher at Georgia State University.

This is a concern because older men are already more at risk of severe or fatal COVID-19 infections. Data from the CDC show the fatality rate of COVID-19 steadily rises with age, and that men are more at risk than women.

To test levels of worry and protective behaviors, Barber teamed with Hyunji Kim, a Georgia State doctoral student in psychology, and administered an online questionnaire assessing COVID-19 perceptions and behavior changes. The results were published by the Journals of Gerontology.

It is well established that worry is a key motivator of behavioral health changes, said Barber, including motivating people to engage in preventive health care activities such as healthy eating, exercise and timely screenings. In general, worry begins to ease with age, and is also lower among men than women.

"Not only do older adults exhibit less negative emotions in their daily lives," she said, "they also exhibit less worry and fewer PTSD symptoms following natural disasters and terrorist attacks."

She said that this may be because older adults have better coping strategies, perhaps gained through experience, and thus are able to regulate their emotional responses better.

Knowing that older adults tend to worry less, Barber conducted a study to see how this affected responses to the global pandemic.

"In normal circumstances," said Barber, "not worrying as much is a good thing. Everyday life is probably happier if we worry less. However, where COVID-19 is concerned, we expected that lower amounts of worry would translate into fewer protective COVID-19 behavior changes."

COVID-19 was declared a pandemic on March 11, and the questionnaire took place from March 23-31. Widespread behavioral changes were taking place, including the beginning of sheltering at home and social distancing.

All participants lived in the United States, and were primarily Caucasian with at least some college education. Participants were either aged 18-35 or aged 65-81, with 146 younger adults and 156 older adults studied.

The questionnaire assessed the perceived severity of COVID-19, such as whether respondents thought people were over-reacting to the threat of COVID-19 and whether it was similar in risk to flu. It also assessed worries about COVID-19, including how worried participants were about catching the virus themselves, dying as a result of it, a family member catching it, lifestyle disruptions, hospitals being overwhelmed, an economic recession, personal or family income declining and stores running out of food or medicine.

The questionnaire also assessed behavioral changes that can reduce infection risk, including washing hands more often, wearing a mask, avoiding socializing, avoiding public places, observing a complete quarantine, taking more care with a balanced diet, and purchasing extra food or medications.

Not surprisingly, said Barber, most participants were at least moderately concerned about COVID-19, and only one individual, an older male, had "absolutely no worry at all." Also as expected, worry translated to protective behavior: more than 80 percent of participants reported washing their hands more frequently, taking more care about cleanliness, no longer shaking hands and avoiding public places. More than 60 percent of participants also reported no longer socializing with others. The participants who were most worried about COVID-19 were also the most likely to have implemented these behavior changes.

The catch was older men: compared to all other participants, older men were less worried about COVID-19 and had adopted the fewest behavior changes. They were relatively less likely to have worn a mask, to report having stopped touching their faces or to have purchased extra food.

Barber does not think the answer is to try to incite worry in older men. She thinks a better answer is to help them understand their risk accurately.

"Our study showed that for older men, accurate perception of risk worked as well as worry to predict preventive behaviors," she said.

If older men can be better educated about the virus, they may adopt protective behaviors even if they don't feel worried. She also notes that the survey took place "right after the pandemic was declared, and we all hope that a more accurate perception of risk has evolved over the last two months."

Either way, said Barber, older men may need a little extra coaching and attention to risk assessment and protective behaviors, both from concerned family members as well as their healthcare practitioners.

Credit: 
Georgia State University

A rising tide of marine disease? How parasites respond to a warming world

image: Sea star wasting disease, pictured here, is likely caused by the sea star-associated densovirus.

Image: 
Oregon State Parks

Warming events are increasing in magnitude and severity, threatening many ecosystems worldwide. As global temperatures continue to climb, uncertainty also grows about the relationships, prevalence, and spread of parasites and disease.

A recent study from the University of Washington explores the ways parasitism will respond to climate change, providing researchers new insights into disease transmission. The paper was published May 18 in Trends in Ecology and Evolution.

The review adds nearly two decades of new evidence to previous research, building a framework that describes the parasite-host relationship under climate oscillations. Traditionally, climate-related research has been done over long timescales; this approach, however, examines how increasingly frequent "pulse warming" events alter parasite transmission.

"Much of what is known about how organisms and ecosystems can respond to climate change has focused on gradual warming," said lead author Danielle Claar, a postdoctoral researcher at the UW School of Aquatic and Fishery Sciences. "Climate change causes not only gradual warming over time, but also increases the frequency and magnitude of extreme events, like heat waves."

Claar explained that both gradual warming and pulse warming can and have influenced ecosystems, but do so in different ways. Organisms may be able to adapt and keep pace with the gradual warming, but an acute pulse event can have sudden and profound impacts.

The 2013-2015 "blob" is one such extreme heat pulse event which has been linked to a massive die-off of sea stars along the Pacific coast of the U.S. and Canada. Many species of sea stars, including the large sunflower sea star, were decimated by a sudden epidemic of wasting disease. Five years later, populations in the region are still struggling to recover. The abnormally warm waters associated with the blob are thought to have favored the spread of the sea star-associated densovirus, the suggested cause of the disease.

The authors compare the prevalence of these marine diseases to a rising tide, an ebbing tide, or a tsunami. Disease transmission can rise or ebb in concert with gradual warming or a series of pulse warming events. However, a severe pulse warming event could result in a tsunami, "initiating either a deluge or drought of disease," as was observed with sea stars along the Pacific Northwest.

However, not all pulse heat events will cause the same response. What may benefit a particular parasite or host in one system can be detrimental in another. Warming can alter a parasite's life cycle, limit the range of suitable host species, or even impair the host's immune response. Some flatworms which target wildlife and humans cannot survive as long in warmer waters, decreasing their window for infecting a host. Another recent UW study found that parasites commonly found in sushi are on the rise with their numbers increasing 283-fold in the past 40 years, though the relationship between heat pulse events and their abundance is not yet clear.

"The relationships between hosts, parasites, and their corresponding communities are complex and depend on many factors, making outcomes difficult to predict," said Claar, who recommends researchers make predictions on a case-by-case basis for their individual systems.

The authors conclude that rather than a straightforward tidal prediction, they would expect pulse warming to cause "choppy seas with the occasional rogue wave."

"It is important that we are able to understand and predict how parasitism and disease might respond to climate change, so we can prepare for, and mitigate, potential impacts to human and wildlife health," said Claar.

Credit: 
University of Washington

Nilotinib appears safe and affects biomarkers in Alzheimer's disease clinical trial

WASHINGTON - A Georgetown University Medical Center clinical trial investigating the cancer drug nilotinib in people with Alzheimer's disease finds that it is safe and well-tolerated, and researchers say the drug should be tested in a larger study to further determine its safety and efficacy as a potential disease-modifying strategy.

The results of the small, phase II, randomized, double-blind, placebo-controlled study evaluating the impact of low doses of nilotinib (Tasigna®) were published online May 29 in Annals of Neurology.

Nilotinib is approved by the U.S. Food and Drug Administration (FDA) for the treatment of chronic myeloid leukemia. The rationale for studying nilotinib in Alzheimer's disease is based on laboratory and clinical research conducted by the Georgetown Translational Neurotherapeutics Program (TNP) directed by Charbel Moussa, MBBS, PhD.

Nilotinib appears to aid in the clearance of accumulated beta-amyloid (Abeta) plaques and Tau tangles in neurons in the brain -- hallmarks of Alzheimer's disease. Nilotinib appears to penetrate the blood-brain barrier and turn on the "garbage disposal" machinery inside neurons (a process known as autophagy) to get rid of the Tau, Abeta and other toxic proteins.

R. Scott Turner, PhD, MD, director of Georgetown's Memory Disorders Program, served as principal investigator of the Alzheimer's disease study.

"The primary goal of this study was to determine its safety and tolerability in Alzheimer's patients," says Turner. "The study found that it is safe and well-tolerated, as we anticipated, and that it may have disease modifying benefits."

After careful screening, 37 people with mild dementia due to Alzheimer's were randomized to either the placebo or nilotinib group for the 12-month study. A 150 mg dose of nilotinib or matching placebo was taken orally once daily for 26 weeks, followed by a 300 mg dose of nilotinib or placebo for another 26 weeks. To prevent bias, the study was blinded: neither the study participants nor the investigators knew whether the active drug or placebo was being administered until the end of the study.

Nilotinib was safe and well-tolerated, although more adverse events, particularly mood swings (agitation and irritation), were noted at the 300 mg dose. Mood swings were significantly increased between 6 and 12 months after the dose was increased from 150 mg to 300 mg daily. Nilotinib carries an FDA "black-box warning" because of cardiovascular issues that may lead to sudden death in cancer patients (typically treated with 600 mg daily), but no such incidents occurred in this study (maximum dose of 300 mg daily).

The amyloid burden as measured by brain imaging was reduced in the nilotinib group compared to the placebo group. Two forms of amyloid in cerebrospinal fluid were also measured. Aβ40 was reduced at 6 months and Aβ42 was reduced at 12 months in the nilotinib group compared to placebo. Hippocampal volume loss (on MRI scans of the brain) was attenuated at 12 months and phospho-tau-181 in spinal fluid was reduced at 6 and 12 months in the nilotinib treated group.

"The current data are in agreement with previous preclinical and other clinical studies at Georgetown suggesting nilotinib is a potential disease-modifying drug that triggers autophagy of neurotoxic proteins including Aβ40/, Aβ42, and phospho tau-181," explains Moussa, an associate professor of neurology and senior author on the study.

"The increase in mood swings with 300 mg nilotinib is associated with dose-dependent increases of brain dopamine, suggesting that 150 mg nilotinib is the optimal dosage to investigate in a future Alzheimer study," adds Moussa.

Turner emphasizes that "this is the first oral treatment found to lower amyloid burden in the brain." While amyloid lowering has also been achieved with several anti-amyloid antibodies, those treatments cannot be given orally. Future Alzheimer's studies are now in the planning stage, Turner concludes.

"The results of this exploratory study repurposing nilotinib are encouraging," says Howard Fillit, MD, Founding Executive Director and Chief Science Officer of the Alzheimer's Drug Discovery Foundation (ADDF), a study funder. "We supported this research as part of a wider initiative to use the knowledge gained from cancer research to advance effective treatments for Alzheimer's."

Credit: 
Georgetown University Medical Center

New gut-brain link: How gut mucus could help treat brain disorders

Mucus is the first line of defence against bad bacteria in our gut. But could it also be part of our defence against diseases of the brain?

Bacterial imbalance in the gut is linked with Alzheimer's disease, autism and other brain disorders, yet the exact causes are unclear.

Now a new research review of 113 neurological, gut and microbiology studies led by RMIT University suggests a common thread - changes in gut mucus.

Senior author Associate Professor Elisa Hill-Yardin said these changes could be contributing to bacterial imbalance and exacerbating the core symptoms of neurological diseases.

"Mucus is a critical protective layer that helps balance good and bad bacteria in your gut but you need just the right amount - not too little and not too much," Hill-Yardin said.

"Researchers have previously shown that changes to intestinal mucus affect the balance of bacteria in the gut but until now, no-one has made the connection between gut mucus and the brain.

"Our review reveals that people with autism, Parkinson's disease, Alzheimer's and Multiple Sclerosis have different types of bacteria in their gut mucus compared with healthy people, and different amounts of good and bad bacteria.

"It's a new gut-brain connection that opens up fresh avenues for scientists to explore, as we search for ways to better treat disorders of the brain by targeting our 'second brain' - the gut."

Gut mucus is different depending on where it's found in the gastrointestinal tract - in the small intestine it's more porous so nutrients from food can be easily absorbed, while in the colon, the mucus is thick and should be impenetrable to bacteria.

The mucus is full of peptides that kill bacteria, especially in the small intestine, but it can also act as an energy source, feeding some of the bacteria that live inside it.

Gut neurons and brain disorders

Scientists are learning that brain disorders can affect neurons in the gut. For example, RMIT researchers have shown that neurons in both the brain and the gut nervous systems are affected in autism.

The new review suggests that reduced gut mucus protection may make patients with neurological diseases more susceptible to gastrointestinal problems.

Hill-Yardin said severe gut dysfunction could exacerbate the symptoms of brain disorders, significantly affecting quality of life for patients and their families.

"If we can understand the role that gut mucus plays in brain disease, we can try to develop treatments that harness this precise part of the gut-brain axis," she said.

"Our work shows that microbial engineering, and tweaking the gut mucus to boost good bacteria, have potential as therapeutic options for neurological disorders."

Credit: 
RMIT University

Argonne researchers create active material out of microscopic spinning particles

image: Self-assembled dynamic lattice of spinners. The Voronoi diagram is overlaid with the observed lattice. The spinners are blurred because of the long exposure time that enabled precise identification of the rotational axes for all spinners.

Image: 
Argonne National Laboratory

At the atomic level, a glass of water and a spoonful of crystalline salt couldn’t look more different. Water atoms move around freely and randomly, while salt crystals are locked in place in a lattice. But some new materials, recently investigated by researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, show an intriguing propensity to sometimes behave like water and sometimes like salt, giving them interesting transport properties and holding potential promise for applications like mixing and delivery in the pharmaceutical industry.

These so-called active materials contain small magnetic particles that self-organize into short chains of particles, or spinners, and form a lattice-like structure when a magnetic field is applied. “Active materials need an external energy source to maintain their structure,” said Argonne materials scientist Alexey Snezhko, an author of the study.

Unlike in previous experiments involving active materials, which looked at particles that demonstrated linear motion, these new spinners acquire a handedness — like right- or left-handedness — that causes them to rotate in a specific direction.

This twirling rotation of the suspended self-assembled nickel spinners creates a whirlpool-like effect, in which different particles can get sucked in to the vortices created by their neighbors. “The particles don’t move on their own, but they can be dragged around,” Snezhko said. “The interesting thing is that you can have these very quickly rotating structures that give the appearance of a yet larger system that is still, but it remains quite active.”

As the particles start to come together, the whirlpools created by the spinning motion — in conjunction with the magnetic interactions — pull them even closer, creating a fixed crystalline-like material, even as the spinners still rotate.

The Argonne researchers wanted to know how a non-spinner particle would be transported through the active lattice. According to Snezhko, the rapid whirling of the spinners allows these cargo particles to move through the lattice much more quickly than they would through a normal material. "In regular diffusion, the process of getting a particle from one side of the material to the other is temperature-dependent and takes a much longer period of time," he said.

The transport of a non-spinner particle is also dependent upon the spacing between the spinners. If the spinners are located sufficiently far apart, the non-spinner particle will travel chaotically between different spinners, like a raft traveling down a series of whitewater rapids. If the particles in the lattice come closer together, the non-spinner particle can become trapped in an individual cell of the lattice.
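
A heavily simplified way to build intuition for this spacing-dependent transport is to advect a passive tracer through the superposed flow of a square array of like-signed point vortices and vary the lattice spacing. The toy sketch below does exactly that (ideal 2D vortices, simple Euler time-stepping, no magnetic or hydrodynamic interactions between particles); it is not the researchers' model, and its parameters are arbitrary.

```python
# Toy model, not the researchers' simulation: a passive "cargo" tracer advected
# by the superposed flow of a 5 x 5 square array of identical 2D point vortices
# (regularized cores), integrated with a simple Euler scheme. Varying the
# lattice spacing changes how far the tracer wanders relative to one cell.
import numpy as np

def vortex_velocity(pos, centers, strength=1.0, core=0.05):
    """Velocity at `pos` from point vortices at `centers` (all same sign)."""
    d = pos - centers                          # (N, 2) separations vortex -> pos
    r2 = (d ** 2).sum(axis=1) + core ** 2      # regularized squared distances
    u = strength / (2 * np.pi) * (-d[:, 1] / r2)   # counterclockwise swirl
    v = strength / (2 * np.pi) * ( d[:, 0] / r2)
    return np.array([u.sum(), v.sum()])

def trace(spacing, steps=20_000, dt=1e-3):
    xs = np.arange(-2, 3) * spacing
    centers = np.array([(x, y) for x in xs for y in xs], dtype=float)
    pos = np.array([0.4 * spacing, 0.1 * spacing])   # start inside one cell
    path = np.empty((steps, 2))
    for i in range(steps):
        pos = pos + dt * vortex_velocity(pos, centers)
        path[i] = pos
    return path

for spacing in (0.5, 2.0):                 # "dense" vs. "sparse" lattice
    span = np.ptp(trace(spacing), axis=0)  # extent of the tracer's excursion
    print(f"spacing={spacing}: tracer ranged over {span[0]:.2f} x {span[1]:.2f}")
```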

“Once the particle comes within a cell through its own chaotic motion, we can modify the field so that the lattice slightly shrinks, making the probability of the particle to leave that location in the lattice very low,” Snezhko said.

The material also showed the ability to undergo self-repair, similar to biological tissue. When the researchers made a hole in the lattice, the lattice reformed.

By looking at systems with purely rotational motion, Snezhko and his colleagues believe that they can design systems with specific transport characteristics. “There are many different ways for getting an object in a material from point A to point B, and this type of self-assembly could be tailored for different dynamics,” he said.

Credit: 
DOE/Argonne National Laboratory