
Study highlights role of physical, mental health in cognitive impairment

A recent study suggests that preserving physical and mental health helps older adults experiencing cognitive impairment stave off declines in cognitive engagement.

"We found that declines in physical and mental health were associated with more pronounced cognitive disengagement," says Shevaun Neupert, corresponding author of the study and a professor of psychology at North Carolina State University. "The impact of declines in physical health was particularly pronounced for study participants who had more advanced cognitive impairment to begin with."

There's a lot of research showing that cognitive engagement can help older adults maintain cognitive health. However, the vast majority of that work has been done on healthy adults.

"There's very little work on cognitive engagement in people who are already cognitively impaired, such as people who have been diagnosed with dementia," Neupert says. "Are they still capable of sustained cognitive engagement? What factors contribute to that engagement?"

To begin addressing those questions, the researchers enlisted 28 study participants. All of the participants were over 60 and had documented cognitive impairment. Participants came to a testing site twice, six months apart. On each visit, researchers collected data on the physical and mental health of the study participants and performed a battery of tests designed to assess cognitive ability. Participants were also connected to a device that tracked blood pressure continuously and then asked to engage in a series of increasingly difficult cognitive tasks. This allowed researchers to track how cognitive engagement changed as the tasks became progressively harder.

Cognitive engagement means taking part in activities that are mentally challenging. Monitoring blood pressure allows the researchers to track how hard study participants are working to accomplish cognitive tasks. Specifically, blood pressure rises as more blood is pumped to the brain when participants work harder at these tasks.
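To make that measurement concrete, here is a minimal sketch of how an engagement index could be computed from such recordings. It is an illustration under assumed numbers, not the study's actual analysis pipeline; the difficulty levels and blood pressure values are hypothetical.

```python
import numpy as np

def engagement_slope(difficulty, systolic_bp):
    """Fit a least-squares line of systolic blood pressure against
    task difficulty; the slope serves as a crude engagement index.
    A clearly positive slope suggests the participant keeps working
    harder as tasks get harder; a near-zero or negative slope
    suggests disengagement."""
    slope, _intercept = np.polyfit(difficulty, systolic_bp, 1)
    return slope

# Hypothetical readings for one participant across five tasks
# of increasing difficulty (levels 1-5).
difficulty = np.array([1, 2, 3, 4, 5])
engaged = np.array([118, 121, 125, 128, 132])      # BP keeps rising
disengaged = np.array([118, 121, 122, 121, 120])   # BP plateaus, then falls

print(engagement_slope(difficulty, engaged))      # ~3.5 mmHg per level
print(engagement_slope(difficulty, disengaged))   # ~0.4 mmHg per level
```

In this toy example, the near-zero slope in the second series flags the kind of disengagement the researchers describe.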

Broadly speaking, the researchers found that if a participant's cognitive ability, physical health or mental health declined over the course of the six-month study period, that participant became less cognitively engaged as the tasks became harder.

"Normally, you'd expect more engagement as the tasks became harder, but we found that some people essentially stopped trying," says Claire Growney, co-author of the study and a postdoctoral researcher at Washington University in St. Louis.

"The findings highlight the fact that well-being is holistic; physical health, mental health and cognitive function can influence each other," says Xianghe Zhu, co-author of the paper and a recent Ph.D. graduate of NC State.

"In practical terms, it suggests that it may be particularly important for people to focus on mental and physical well-being during the early stages of cognitive decline," Growney says. "Or, at the very least, don't become so focused on addressing cognitive challenges that you ignore physical health, or create anxiety or emotional distress for yourself that leads to mental health problems."

"Future research will be needed to determine how beneficial it might be for people to take part in cognitively engaging activities once they've started experiencing cognitive decline," Neupert says. "But we already know that there is an element of 'use it or lose it' to cognitive function in healthy adults. And while it's understandable for people to want to avoid tasks that are difficult or challenging, it's really important to continue challenging ourselves to take part in difficult cognitive activities."

Credit: 
North Carolina State University

Was Cascadia's 1700 earthquake part of a sequence of earthquakes?

The famous 1700 Cascadia earthquake that altered the coastline of western North America and sent a tsunami across the Pacific Ocean to Japan may have been one of a sequence of earthquakes, according to new research presented at the Seismological Society of America (SSA)'s 2021 Annual Meeting.

Evidence from coastlines, tree rings and historical documents confirms that there was a massive earthquake in the U.S. Cascadia Subduction Zone on 26 January 1700. The prevailing hypothesis is that one megathrust earthquake, estimated at magnitude 8.7 to 9.2 and involving the entire tectonic plate boundary in the region, was responsible for the impacts recorded on both sides of the Pacific.

But after simulating more than 30,000 earthquake ruptures within that magnitude range using software that models the 3D tectonic geometry of the region, Diego Melgar, the Ann and Lew Williams Chair of Earth Sciences at the University of Oregon, concluded that those same impacts could have been produced by a series of earthquakes.
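The logic of that scenario search can be sketched in a few lines of code. The sketch below is purely illustrative: the magnitude ranges, the subsidence proxy, and the acceptance target are invented placeholders, not values from Melgar's 3D rupture models.

```python
import random

# Toy Monte Carlo over earthquake scenarios: every number here is a
# placeholder, not a value from the actual Cascadia simulations.

def scenario_subsidence(magnitudes):
    """Toy proxy: total coastal subsidence scales with the summed
    seismic moment of the scenario's ruptures (arbitrary units)."""
    return sum(10 ** (1.5 * (m - 8.0)) for m in magnitudes)

def makes_japan_tsunami(magnitudes):
    """Toy proxy: assume only an M >= 8.7 rupture produces a tsunami
    large enough to be recorded in Japan."""
    return any(m >= 8.7 for m in magnitudes)

random.seed(0)
target = 15.0  # toy "observed" total subsidence, arbitrary units
matches, trials = 0, 30000
for _ in range(trials):
    # One M 8.7+ partial rupture plus up to four smaller (M <= 8) shocks
    scenario = [random.uniform(8.7, 9.2)]
    scenario += [random.uniform(7.5, 8.0) for _ in range(random.randint(0, 4))]
    # Keep scenarios that reproduce both observations within tolerance
    subsidence = scenario_subsidence(scenario)
    if makes_japan_tsunami(scenario) and abs(subsidence - target) / target < 0.2:
        matches += 1

print(f"{matches}/{trials} toy scenarios fit both constraints")
```

The point of such a search is that many multi-earthquake scenarios, not just a single full-margin rupture, can satisfy the same observational constraints.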

Melgar's analysis suggests that a partial rupture of as little as 40% of the megathrust boundary in one magnitude 8.7 or larger earthquake could explain some of the North American coastal subsidence and the 26 January 1700 Japan tsunami. There could also have been as many as four more earthquakes, each magnitude 8 or smaller, that produced the rest of the subsidence without causing a tsunami large enough to be recorded in Japan.

His findings do not rule out the possibility that the 1700 Cascadia earthquake was a stand-alone event, but "the January 26, 1700 event, as part of a longer-lived sequence of earthquakes potentially spanning many decades, needs to be considered as a hypothesis that is at least equally likely," he said.

Knowing whether the 1700 earthquake is one in a sequence has implications for how earthquake hazard maps are created for the region. For instance, calculations for the U.S. Geological Survey hazard maps are based on the Cascadia fault zone fully rupturing about half the time and partially rupturing the other half of the time, Melgar noted.

"But are we really sure that that's real, or maybe it's time to revisit that issue?" said Melgar. "Whether there was a partial or full rupture fundamentally drives everything we put on the hazard maps, so we really need to work on that."

Since the first analyses of the 1700 earthquake, there have been more data from the field, repeated earthquake modeling of the Cascadia Subduction Zone and a better understanding of the physics of megathrust earthquakes--all of which allowed Melgar to revisit the possibilities behind the 1700 earthquake. Researchers also have been writing code for years now to simulate earthquakes and tsunamis in the region, in part to inform earthquake early warning systems like ShakeAlert.

If there was a sequence of earthquakes instead of one earthquake, this might help explain why there is little good geologic evidence of the 1700 event in places such as the Olympic Mountains in Washington State and in southern Oregon, Melgar said.

He noted, however, that these specific areas are difficult to work in, "and may not necessarily be good recorders of the geological signals that paleoseismologists look for."

Melgar's models show that even a smaller Cascadia earthquake could cause a tsunami energetic enough to reach Japan. These smaller earthquakes could still pose a significant tsunami risk to North America as well, he cautioned. "They might be less catastrophic, because they don't affect such a wide area because the rupture is more compact, but we'd still be talking a mega-tsunami."

He suggested that it could be valuable to revisit and re-do old paleoseismological analyses of the 1700 event, to gain an even clearer picture of how it fits into the overall earthquake history of the region.

"Cascadia actually records earthquake geology much better than many other parts of the world," Melgar said, "so I think that just going back with modern methods would probably yield a lot of new results."

Credit: 
Seismological Society of America

'Information theory' recruited to help scientists find cancer genes

image: An illustration of the research concept that the DNA methylation code can be analyzed using information theory, represented by strings of 0s and 1s. This analysis helps researchers understand the epigenetic landscape of cancer (pictured in blue) and identify genes that regulate this landscape (noted as strings and posts underneath the blue landscape).

Image: 
Kate Zvorykina (Ella Maru Studio), Design: Michael Koldobskiy and Andrew Feinberg, Johns Hopkins Medicine

Using a widely known field of mathematics designed mainly to study how digital and other forms of information are measured, stored and shared, scientists at Johns Hopkins Medicine and Johns Hopkins Kimmel Cancer Center say they have uncovered a likely key genetic culprit in the development of acute lymphoblastic leukemia (ALL).

ALL is the most common form of childhood leukemia, striking an estimated 3,000 children and teens each year in the United States alone.

Specifically, the Johns Hopkins team used "information theory," applying an analysis that relies on strings of zeros and ones -- the binary system of symbols common to computer languages and codes -- to identify variables or outcomes of a particular process. In the case of human cancer biology, the scientists focused on a chemical process in cells called DNA methylation, in which certain chemical groups attach to areas of genes that guide genes' on/off switches.

"This study demonstrates how a mathematical language of cancer can help us understand how cells are supposed to behave and how alterations in that behavior affect our health," says Andrew Feinberg, M.D., M.P.H., Bloomberg Distinguished Professor at the Johns Hopkins University School of Medicine, Whiting School of Engineering and Bloomberg School of Public Health. A founder of the field of cancer epigenetics, Feinberg discovered altered DNA methylation in cancer in the 1980s.

Feinberg and his team say that using information theory to find cancer driver genes may be applicable to a wide variety of cancers and other diseases.

Methylation is now recognized as one way DNA can be altered without changing a cell's genetic code. When methylation goes awry in such epigenetic phenomena, certain genes are abnormally turned on or off, triggering uncontrolled cell growth, or cancer.

"Most people are familiar with genetic changes to DNA, namely mutations that change the DNA sequence. Those mutations are like the words that make up a sentence, and methylation is like punctuation in a sentence, providing pauses and stops as we read," says Feinberg. In a search for a new and more efficient way to read and understand the epigenetic code altered by DNA methylation, he worked with John Goutsias, Ph.D., professor in the Department of Electrical and Computer Engineering at The Johns Hopkins University and Michael Koldobskiy, M.D., Ph.D., pediatric oncologist and assistant professor of oncology at the Johns Hopkins Kimmel Cancer Center.

"We wanted to use this information to identify genes that drive the development of cancer even though their genetic code isn't mutated," says Koldobskiy.

Results of the study, led by Feinberg, Koldobskiy and Goutsias, were published April 15 in Nature Biomedical Engineering.

Koldobskiy explains that methylation at a particular gene location is binary -- methylation or no methylation -- and a system of zeros and ones can represent these differences just as they are used to represent computer codes and instructions.

For the study, the Johns Hopkins team analyzed DNA extracted from bone marrow samples of 31 children newly diagnosed with ALL at The Johns Hopkins Hospital and Texas Children's Hospital. They sequenced the DNA to determine which genes, across the entire genome, were methylated and which were not.

Newly diagnosed leukemia patients have billions of leukemia cells in their body, says Koldobskiy.

By assigning zeros and ones to pieces of genetic code that were methylated or unmethylated and using concepts of information theory and computer programs to recognize patterns of methylation, the scientists were able to find regions of the genome that were consistently methylated in patients with leukemia and those without cancer.

They also saw genome regions in the leukemia cells that were more randomly methylated, compared with the normal genome, a signal to scientists that those spots may be specifically linked to leukemia cells compared with normal ones.
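A minimal sketch of this idea: treat each sequencing read over a region's CpG sites as a binary string and compute the Shannon entropy of the observed patterns. Consistently methylated regions score low; randomly methylated regions score high. The reads below are hypothetical, and the scoring is a simplification of the information-theoretic analysis the team actually used.

```python
import math
from collections import Counter

def shannon_entropy(reads):
    """Shannon entropy (bits) over observed methylation patterns.
    Each read is a string of 0s (unmethylated) and 1s (methylated)
    covering the same CpG sites in one genomic region."""
    counts = Counter(reads)
    total = len(reads)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical reads from two regions (4 CpG sites, 8 reads each).
consistent = ["1111"] * 7 + ["1101"]           # nearly uniform methylation
disordered = ["1010", "0110", "1100", "0011",
              "1001", "0101", "1110", "0000"]  # random-looking patterns

print(shannon_entropy(consistent))  # ~0.54 bits: ordered region
print(shannon_entropy(disordered))  # 3.0 bits: candidate cancer signal
```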

One gene, UHRF1, stood out among the gene regions in leukemia cells whose DNA methylation differed from the normal genome.

"It was a big surprise to find this gene, as its link to prostate and other cancer has been suggested but never identified as a driver of leukemia," says Feinberg.

In normal cells, the protein products of the UHRF1 gene create a biochemical bridge between DNA methylation and DNA packaging, but scientists have not deciphered precisely how alteration of the gene contributes to cancer.

Experiments by the Johns Hopkins team show that laboratory-grown leukemia cells lacking activity of the UHRF1 gene cannot self-renew and perpetuate additional leukemia cells.

"Leukemia cells aim to survive, and the best way to ensure survival is to vary the epigenetics in many genome regions so that no matter what tries to kill the cancer, at least some will survive," says Koldobskiy.

ALL is the most common pediatric cancer, and Koldobskiy says that decades of research on various treatments and the sequence of those treatments have helped clinicians cure most of these leukemias, but relapsed disease remains a leading cause of death from cancer in children.

"This new approach can lead to more rational ways of targeting the alterations that drive this and likely many other forms of cancer," says Koldobskiy.

The Johns Hopkins team plans to use information theory to analyze methylation patterns in other cancers. They also plan to determine whether epigenetic alterations in UHRF1 are linked to treatment resistance and disease progression in patients with childhood leukemia.

Credit: 
Johns Hopkins Medicine

Sexual receptivity and rejection may be orchestrated by the same brain region

image: Certain neurons in the brains of female mice change their structure across the reproductive cycle, gaining higher complexity during the receptive phase.

Image: 
Champalimaud Centre for the Unknown - Lima Lab (data); Diogo Matias (design).

In many species, including humans and mice, the fluctuating levels of the hormones progesterone and estrogen determine whether the female is fertile or not. And in the case of mice, whether she's sexually receptive or not.

The change in receptivity is striking. Female mice shift from accepting sexual partners to aggressively rejecting them across a cycle of six short days. How can the female reproductive hormones bring about such a radical behavioural change?

When searching for an explanation, the team of Susana Lima, a principal investigator at the Champalimaud Centre for the Unknown in Portugal, came across an intriguing discovery.

"Our experiments revealed that a brain area important for female receptivity, called the VMH (ventromedial hypothalamus), is actually made up of distinct compartments. And within each compartment, we found neurons whose activity - and even structure - fluctuate with the female reproductive cycle", says Lima.

These findings, published today (April 20th) in the journal eNeuro, uncover heterogeneity within the VMH that may serve to control the two extremes of female sexual behaviour - receptivity and rejection.

When One Becomes Three

The current study stems from previous results from the lab, which demonstrated that neurons in the VMH respond differently to the presence of males depending on the phase of the female's reproductive cycle.

"The fact that the neurons' activity was changing across the cycle led us to suspect that the neurons themselves were changing. So we set out to explore whether that was truly the case", recalls Inês Dias, a PhD student in the lab.

The team probed neurons throughout the VMH, particularly focusing on the front-to-back axis of this elongated structure. Their results revealed that this seemingly homogeneous brain area was actually made up of three separate compartments!

"In all three compartments, neurons expressing progesterone receptors exhibited profound changes across the reproductive cycle. But they were changing in entirely different ways depending on which compartment they belonged to", explains Nicolas Gutierrez Castellanos, a postdoctoral fellow in the lab.

The team noted several distinguishing features across compartments. For example, neurons in the front compartment became more excitable during the non-receptive phase of the cycle.

"We also witnessed extensive structural changes", points out Liliana Ferreira, a research technician in the lab. "Neurons in the back compartment gained higher complexity in the receptive phase."

Receptivity & Rejection Side by Side?

According to the authors, this newly-found heterogeneity within the VMH provides evidence in favour of its involvement in various aspects of female socio-sexual behaviour. In fact, the team is already investigating this hypothesis further, with the support of a Consolidator Grant that Lima received from the European Research Council a few years ago.

Other scientific reports also support this hypothesis. For one, female sexual receptivity is mainly attributed to the back region of the VMH. On the other hand, the front area is important for self-defence in males. This type of behaviour, in the view of the team, shares similar features with rejection, specifically when the male comes too close to the female.

Also, it may seem counterintuitive that opposite behavioural functions would be controlled by the same structure, but there are known examples where this is the case. It happens in feeding, as well as in the flee or freeze response to a threat.

Sex Hormones Matter

According to Lima, this study not only provides insight into the neural basis of female sexual behaviour, but it also highlights the importance of considering hormonal state in research.

"Our results further demonstrate that reproductive hormones have a major effect on brain activity and structure. Progesterone receptors exist in other brain regions. So many aspects of female behaviour could theoretically be affected by the reproductive cycle. The same goes for testosterone in males, which is known to influence male behaviour dramatically. It is therefore essential to take into account the potential effects of these reproductive hormones when conducting scientific research", she concludes.

Credit: 
Champalimaud Centre for the Unknown

'Undruggable' cancer protein becomes druggable, thanks to shrub

image: Mingji Dai, professor of chemistry and a scientist at the Purdue University Center for Cancer Research, studied the compound found naturally in the roots of a shrub and discovered a cost-effective and efficient way to synthesize it in the lab. The compound -- curcusone D -- has the potential to help combat a protein found in many cancers, including some forms of breast, brain, colorectal, prostate, lung and liver cancers, among others.

Image: 
Purdue University photo/Charles Jischke

A chemist from Purdue University has found a way to synthesize a compound that fights a previously "undruggable" cancer protein, with potential benefits across many cancer types.

Inspired by a rare compound found in a shrub native to North America, Mingji Dai, professor of chemistry and a scientist at the Purdue University Center for Cancer Research, studied the compound and discovered a cost-effective and efficient way to synthesize it in the lab. The compound -- curcusone D -- has the potential to help combat a protein found in many cancers, including some forms of breast, brain, colorectal, prostate, lung and liver cancers, among others. The protein, dubbed BRAT1, had previously been deemed "undruggable" because of its chemical properties. In collaboration with Alexander Adibekian's group at the Scripps Research Institute, Dai's team linked curcusone D to BRAT1 and validated curcusone D as the first BRAT1 inhibitor.

Curcusones are compounds that come from a shrub named Jatropha curcas, also called the purging nut. Native to the Americas, it has spread to other continents, including Africa and Asia. The plant has long been used for medicinal properties -- including the treatment of cancer -- as well as being a proposed inexpensive source of biodiesel.

Dai was interested in this family of compounds -- curcusones A, B, C and D.

"We were very interested by these compounds' novel structure," Dai said. "We were intrigued by their biological function; they showed quite potent anti-cancer activity and may lead to new mechanisms to combat cancer."

Researchers tested the compounds on breast cancer cells and found curcusone D to be extremely effective at shutting down cancer cells. The protein they were targeting, BRAT1, regulates DNA damage response and DNA repair in cancer cells. Cancer cells grow very fast and make a lot of DNA. If scientists can damage cancer cells' DNA and keep them from repairing it, they can stop cancer cells from growing.

"Our compound can not only kill these cancer cells, it can stop their migration," Dai said. "If we can keep the cancer from metastasizing, the patient can live longer."

Stopping cancer from spreading throughout the body -- metastasizing -- is key to preserving a cancer patient's life. Once cancer starts to migrate from its original organ into different body systems, new symptoms start to develop, often threatening the patient's life.

"For killing cancer cells and stopping migration, there are other compounds that do that," Dai said. "But as far as inhibiting the BRAT1 protein, there are no other compounds that can do that."

Dai and his team believe that as effective as curcusone D is by itself, it may be even more potent as part of a combination therapy. They tested it alongside a DNA damaging agent that has already been approved by the Food and Drug Administration and found that this combination therapy is much more effective.

One difficulty in studying curcusones as potential cancer treatments is that, while the shrub they come from is common and inexpensive, it takes massive amounts of the shrub to yield even a small amount of the compounds. Even then, it is difficult to separate the compounds they were interested in from the rest of the chemicals in the shrub's roots.

"In nature, the plant doesn't produce a lot of this compound," Dai said. "You would need maybe as much as 100 pounds of the plant's dry roots to get just about a quarter teaspoon of the substance -- a 0.002% yield."

That small yield is relevant for production, because if it is effective as a cancer treatment, pharmacists will need a lot more of it. Additionally, having an abundant supply of the compounds makes studying them easier, quicker and less expensive.

"That's why a new synthesis is so important," Dai said. "We can use the synthesis to produce more compounds in a purer form for biological study, allowing us to advance the field. From there, we can make analogs of the compound to improve its potency and decrease the potential for side effects."

The next step will be to test the compound to ensure that it is not toxic to humans, something the researchers are optimistic about since the shrub it came from has been used as a traditional medicine in a number of cultures. Already, researchers from other entities have reached out to test the compound on the cancers they study, bringing hope for renewed therapeutics for treating the disease.

"Many of our most successful cancer drugs have come from nature," Dai said. "A lot of the low-hanging fruit, the compounds that are easy to isolate or synthesize, have already been screened and picked over. We are looking for things no one has thought about before. Once we have the chemistry, we can build the molecules we're interested in and study their biological function."

Credit: 
Purdue University

Research brief: Improving drug efficacy against prostate cancer and related bone growths

In a study published in Advanced Functional Materials, University of Minnesota researcher Hongbo Pang led a cross-institutional team working to improve the efficacy of nucleotide-based drugs against prostate cancer and bone metastasis.

In this study, Pang and his research team looked at whether liposomes, when integrated with the iRGD peptide, would help concentrate antisense oligonucleotides (ASOs) in primary prostate tumors and their bone metastases. Liposomes are used as a drug carrier system, and ASOs are a type of nucleotide drug.

More importantly, they investigated whether this system helps more drugs across the vessel wall and deeply into the tumor tissue. This is critical because, although nucleotide drugs offer unique advantages in treating tumors and other diseases, they often suffer from a poor efficiency of crossing the blood vessels and entering the tumor tissue, where their targets reside. This problem greatly limits their clinical applicability and efficacy.

"Our system demonstrates a good ability to deliver more ASOs into both primary tumor tissue and bone metastases -- which is the primary site for prostate cancer metastasis," said Pang, an assistant professor in the College of Pharmacy and a member of the Masonic Cancer Center. "This further translates into a significant improvement of ASO efficacy to inhibit the growth of primary tumor and bone metastases. We expect this system to become a universal carrier system, to improve the clinical efficacy of ASOs and other nucleotide drugs."

The study found that:

iRGD-liposomes can increase the tumor accumulation and vascular/tissue penetration of ASOs against the disease-driving gene of prostate cancer;

the ability of ASOs to inhibit the growth of both primary tumors and bone metastases was significantly enhanced by iRGD-liposomes;

and, a long-term tumor inhibition study was also performed, showing that iRGD-liposomes significantly prolong the AR-ASO suppression of primary tumor growth.

Pang and his team say the study establishes iRGD-liposomes as a desirable delivery system for ASOs, one that holds promise for improving the clinical efficacy of nucleotide drugs in cancer therapies.

Credit: 
University of Minnesota

E-cigarette users in rural Appalachia develop more severe lung injuries

image: In a recent study, Sunil Sharma--section chief of pulmonary/critical care and sleep medicine at the WVU School of Medicine--and his colleagues found that rural e-cigarette users are older--and often get sicker--than their urban counterparts.

Image: 
Zane Lacko/WVU

Just as e-cigarette ingredients can vary from one region to another, the health effects of vaping can have regional characteristics as well. A new study out of West Virginia University suggests that rural e-cigarette users are older--and often get sicker--than their urban counterparts.

Researchers with the WVU School of Medicine are investigating severe lung injuries occurring among e-cigarette users in rural Appalachia. In a recent study, Sunil Sharma--section chief of pulmonary/critical care and sleep medicine at the School of Medicine--and his colleagues present a case study of patients with EVALI (electronic cigarettes and vaping-associated lung injury) admitted to WVU hospitals from August 2019 to March 2020.

The study, published in Hospital Practice, suggests that EVALI in rural Appalachia results in severe respiratory failure.

"Ours is the first rural study," Sharma said. "One of the real lessons we learned is we can't take data from urban centers and apply them to rural. We could be different, and we physicians need to treat the way that the science is showing in our areas."

Sharma and his team recorded demographics, baseline characteristics, health conditions and vaping behavior for 17 patients admitted to WVU hospitals with EVALI. They also evaluated lung specimens for signs of inflammation and analyzed patient-volunteered e-liquid materials using mass spectrometry to determine chemical composition.

Compared to other EVALI studies performed in urban centers, patients in the rural study were older, had higher rates of illicit drug use and were much sicker. The median age of patients in this study was 33, compared to 23 in a large national study.

Thirteen patients had a history of cigarette smoking, while four were never smokers. Urine testing determined that nine patients were also consuming THC and another nine were positive for other illicit drugs. Seven of the patients consuming THC required critical care, and four of the 17 had secondary infection of the lungs. Ten patients required mechanical or noninvasive ventilation while two required treatment with an extracorporeal membrane oxygenation machine, which pumps someone's blood outside of their body, oxygenates it and returns it to the body.

Sharma's analysis of e-liquids identified toxic volatile organic compounds--such as formaldehyde, acetaldehyde, acetone, propylene glycol and cyclohexane--in addition to nicotine. Higher levels of these VOCs were found in the e-liquids provided by the three most severely ill patients.
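As an illustration of how such a relationship might be quantified, the sketch below runs a rank correlation between VOC levels and a simple severity score. Every value is invented for demonstration; the study's actual statistical approach is not described here.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical data: total VOC level (arbitrary units) in each
# patient's e-liquid vs. a made-up severity score (0 = ward care,
# 1 = noninvasive ventilation, 2 = mechanical ventilation, 3 = ECMO).
voc_level = np.array([12, 8, 30, 55, 61, 15, 9, 48])
severity = np.array([0, 0, 1, 2, 3, 1, 0, 2])

rho, p = spearmanr(voc_level, severity)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```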

"We were the first ones to show that there was high correlation of volatile organic compounds, specifically in patients who were really sick," Sharma said. "Inhaling all these volatile organic compounds into your lungs at high temperatures, producing these really toxic gaseous compounds, can cause chemical burns in your lungs."

Sharma suspects the particularly high levels of VOCs may be due to the production of some e-liquids in local "garage labs." Garage labs are unregulated, and the labs likely use chemicals easily available to them.

"Volatile organic compounds are very common," he said. "They are very easily acquired, and they are cheap. And we found that there's a regional flavor to each one of the e-liquids, depending on what their garage labs have access to, what they think is cheaper and how it's made."

Despite the strong correlation between lung injury and high VOC levels, Sharma said many other factors determine how much e-cigarettes harm the lungs. These factors include the type of device or technique used to vape, the ratio of propylene glycol to vegetable glycerin used as the e-liquid base, what flavors were added, the age of the patient and whether the patient uses other drugs.

"Depending on these, you could have a mild chest pain and feeling of discomfort, which might go away once you stopped smoking, or you could get a lung injury so severe that it requires intubation, mechanical ventilation and sometimes even ECMO," he said.

Sharma believes that the age gap between urban and rural e-cigarette users stems from rural areas' relatively older populations. He also believes that older adults may become engaged in vaping as a way to quit smoking and may not know how dangerous it is, as most warnings from regulatory agencies have targeted teens.

Ad campaigns, such as the Food and Drug Administration's "The Real Cost," have been warning teens and young adults of the "real cost" of e-cigarettes and vaping since 2014. The FDA has also banned advertisements for e-cigarette products targeted at minors and banned enticing fruit and mint e-liquid flavors.

The results of Sharma's study indicate that it may be time to target messages about the dangers of vaping to older populations in rural areas.

"All the effort has been directed toward high schoolers and young people, but maybe in rural areas, we should be having awareness campaigns for older populations," he said.

E-cigarettes are commonly thought to be healthier or less dangerous than traditional cigarettes. Results of the study's follow-up interviews emphasize just how untrue this is, especially for older people. Six to 12 weeks after discharge, 12 patients had completely stopped vaping or smoking, but four of them still experienced persistent cough, trouble breathing and wheezing, and two needed home oxygen therapy.

"The older you are, the more dangerous vaping is for you, and you're going to end up in a very bad situation, probably in an ICU," Sharma said.

Three patients reportedly continued vaping after discharge and had persistent cough and trouble breathing.

"All of these are tied," Sharma said. "You have garage labs that are producing all kinds of chemicals--which are adulterants and not authorized by a regulatory agency--and the people who are buying them are a much older population. And I think these two conspired in order for us to see a very, very sick population in West Virginia."

Credit: 
West Virginia University

New liquid biopsy test to ID lymph node metastasis in early-stage T1 colorectal cancer

image: Ajay Goel, Ph.D., M.S., chair of the Department of Molecular Diagnostics and Experimental Therapeutics at City of Hope, and his colleagues developed a novel, noninvasive liquid biopsy test for detecting lymph node metastasis in individuals with high-risk T1 colorectal carcinoma.

Image: 
City of Hope

DUARTE, Calif. -- Scientists at City of Hope, a world-renowned independent research and treatment center for cancer and diabetes, have developed a novel, noninvasive liquid biopsy test for detecting lymph node metastasis in individuals with high-risk T1 colorectal carcinoma. Research on the development of the blood test was reported in a new study published in Gastroenterology, a journal of the American Gastroenterological Association.

This blood-based test is an example of the theranostic (a term that combines "therapeutic" and "diagnostic") approach at City of Hope, whose goal is to help every patient receive personalized treatment appropriate for their specific disease. Development of blood-based biopsies to detect and monitor tumors is one of the leading-edge technologies under investigation to help patients with cancer.

Most individuals suspected of having T1 colorectal cancer with lymph node metastasis undergo radical surgery to remove affected parts of the colon. Unfortunately, only 5-10% of these individuals are found on final examination of the removed colon to actually have lymph node metastasis, indicating that surgery was not necessary for the majority of these people.

"Since radical surgery dramatically reduces quality of life for patients, improving the success rate of identification of high-risk individuals with lymph node metastasis remains the challenge in T1 colorectal cancer diagnosis," said Ajay Goel, Ph.D., M.S., chair of the Department of Molecular Diagnostics and Experimental Therapeutics at City of Hope, and the study's senior author. "In the future, we hope to improve our confidence in identifying which individuals truly have lymph node metastasis via this novel biomarker-based liquid biopsy test for T1 colorectal cancer, in combination with clinical and pathological criteria."

The publication by Goel and colleagues focused on translating their previous tissue-based assay, which identified lymph node metastasis in T1 colorectal cancer, into a blood-based assay. After refining and validating the panel of RNA biomarkers, the blood test was found to accurately identify lymph node metastasis with high sensitivity (83.3%; a higher value means fewer cases of disease are missed) and specificity (76.2%; a higher value means the test can accurately identify those without disease).
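For readers unfamiliar with the two metrics, the sketch below computes them from confusion-matrix counts. The counts are hypothetical values chosen to reproduce the reported percentages; the study's actual case numbers may differ.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of truly node-positive
    cases the blood test catches. Specificity = TN / (TN + FP):
    fraction of node-negative cases it correctly clears."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen only to match the reported figures.
sens, spec = sensitivity_specificity(tp=10, fn=2, tn=32, fp=10)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
# sensitivity = 83.3%, specificity = 76.2%
```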

"Obtaining results with more than 80% accuracy (sensitivity) in finding lymph node metastasis in the blood was a completely unexpected finding and is a complete game changer," Goel said.

City of Hope researchers are excited about the potential of this new liquid biopsy test to complement current risk assessment for lymph node metastasis for individuals with early stage T1 colorectal cancer. The technology is patent pending.

Goel indicated that further work will be done to optimize the blood-based assay to improve sensitivity and continue to validate the assay in prospective clinical trials. "There are several steps between where we are now and where we want to go -- detecting lymph node metastasis in colon cancer from a blood sample -- but without doubt this is an encouraging first step."

Credit: 
City of Hope

Can extreme melt destabilize ice sheets?

image: In 2012, an extreme melt season in Greenland created a refrozen ice layer in the compacting snow near the surface of the ice sheet. In some places, this melt layer has continued to grow since then, limiting the ice sheet's future capacity to absorb and store meltwater.

Image: 
Farrin Abbott

Nearly a decade ago, global news outlets reported vast ice melt in the Arctic as sapphire lakes glimmered across the previously frozen Greenland Ice Sheet, one of the most important contributors to sea-level rise. Now researchers have revealed the long-term impact of that extreme melt.

Using a new approach to ice-penetrating radar data, Stanford University scientists show that this melting left behind a contiguous layer of refrozen ice inside the snowpack, including near the middle of the ice sheet where surface melting is usually minimal. Most importantly, the formation of the melt layer changed the ice sheet's behavior by reducing its ability to store future meltwater. The research appears in Nature Communications April 20.

"When you have these extreme, one-off melt years, it's not just adding more to Greenland's contribution to sea-level rise in that year - it's also creating these persistent structural changes in the ice sheet itself," said lead study author Riley Culberg, a PhD student in electrical engineering. "This continental-scale picture helps us understand what kind of melt and snow conditions allowed this layer to form."

The 2012 melt season was caused by unusually warm temperatures exacerbated by high atmospheric pressure over Greenland - an extreme event that may have been caused or intensified by climate change. The Greenland Ice Sheet has experienced five record-breaking melt seasons since 2000, with the most recent occurring in 2019.

"Normally we'd say the ice sheet would just shrug off weather - ice sheets tend to be big, calm, slow things," said senior author Dustin Schroeder, an assistant professor of geophysics at Stanford's School of Earth, Energy & Environmental Sciences (Stanford Earth). "This is really one of the first cases where you can say, shockingly, in some ways, these slow, calm ice sheets care a lot about a single extreme event in a particularly warm year."

Shifting scenarios

Airborne radar data, a major expansion to single-site field observations on the icy poles, is typically used to study the bottom of the ice sheet. But by pushing past technical and computational limitations through advanced modeling, the team was able to reanalyze radar data collected by flights from NASA's Operation IceBridge from 2012 to 2017 to interpret melt near the surface of the ice sheet, at a depth up to about 50 feet.

"Once those challenges were overcome, all of a sudden, we started seeing meltwater ice layers near the surface of the ice sheet," Schroeder said. "It turns out we've been building records that, as a community, we didn't fully realize we were making."

Melting ice sheets and glaciers are the biggest contributors to sea-level rise - and the most complex elements to incorporate into climate model projections. Ice sheet regions that haven't experienced extreme melt can store meltwater in the upper 150 feet, thereby preventing it from flowing into the ocean. A melt layer like the one from 2012 can reduce the storage capacity to about 15 feet in some parts of the Greenland Ice Sheet, according to the research.
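A toy calculation shows the scale of that lost buffer. The depths come from the article; the firn porosity is an assumed round value, so the result is an order-of-magnitude illustration rather than a number from the study.

```python
# Toy estimate of meltwater storage lost when an impermeable
# refrozen layer cuts off deep pore space. Porosity is assumed.
FT_TO_M = 0.3048
porosity = 0.4  # assumed fraction of pore space in firn (illustrative)

def storage_m(depth_ft):
    """Meltwater (meters water-equivalent per unit area) that a
    permeable firn column of the given depth could hold."""
    return depth_ft * FT_TO_M * porosity

before = storage_m(150)  # full ~150 ft of permeable firn
after = storage_m(15)    # only ~15 ft remains accessible
print(f"storage drops from {before:.1f} to {after:.1f} m w.e. "
      f"(~{1 - after / before:.0%} loss)")
```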

The type of melt followed by rapid freeze experienced in 2012 can be compared to wintry conditions in much of the world: snow falls to the ground, a few warm days melt it a little, then when it freezes again, it creates slick ice - the kind that no one would want to drive on.

"The melt event in 2012 is impacting the way the ice sheet responds to surface melt even now," Culberg said. "These structural changes mean the way the ice sheet responds to surface melting is going to be impacted longer term."

In the long run, meltwater that can no longer be stored in the upper part of the ice sheet may drain down to the ice bed, creating slippery conditions that speed up the ice and send chunks into the ocean, raising sea levels more quickly.

Polar patterns

Greenland currently experiences change much more rapidly than its South Pole counterpart. But lessons from Greenland may be applied to Antarctica when the seasons shift, Schroeder said.

"I think now there's no question that when you're trying to project into the future, a warming Antarctic will have all these processes," Schroeder said. "If we don't use Greenland now to better understand this stuff, our capacity to understand how a warmer world will be is not a hopeful proposition."

Credit: 
Stanford University

Clinical trial assesses stem cells' ability to prevent major cause of preemie deaths

image: Human umbilical cord blood-derived mesenchymal stem cells (intratracheal injection)

Image: 
AlphaMed Press

Durham, NC - A phase 2 clinical trial whose results were released today in STEM CELLS Translational Medicine might point to a way to overcome bronchopulmonary dysplasia (BPD), a major cause of death in preterm infants. The study, conducted by researchers at Samsung Medical Center, Sungkyunkwan University and Asan Medical Center Children's Hospital in Seoul, evaluates the effectiveness of treating these infants by transplanting umbilical cord blood-derived mesenchymal stem cells (UCB-MSCs) directly into their tracheas.

Early results showed signs of improvement for the most immature infants included in the trial.

BPD is a serious breathing disorder in which the lungs do not develop normally. Most infants who develop BPD are born more than 10 weeks before their due date, weigh less than 2 pounds at birth, and have breathing problems. Infections that occur before or shortly after birth also can contribute to the disorder. Despite recent advances in neonatal medicine, BPD remains a major cause of mortality and long-term respiratory and neurologic problems in premature infants.

Won Soon Park, M.D., Ph.D., of Samsung Medical Center and Sungkyunkwan University School of Medicine, and Ellen Ai-Rhan Kim, M.D., Ph.D., of Asan Medical Center Children's Hospital, University of Ulsan, are both pediatricians who have been searching for a way to treat BPD. The two co-authored the recent study, a double-blind, randomized, placebo-controlled phase II clinical trial designed specifically to learn whether human UCB-derived MSC intratracheal transplantation might mitigate BPD in extremely preterm infants.

"While intratracheal transplantation of human UCB-derived MSCs has proven to be safe and feasible in recently conducted clinical trials, its therapeutic efficacy had not been assessed in a clinical setting prior to this study," Dr. Park said.

Choosing the optimal MSCs for transplantation was also a critical point of their work, added Dr. Kim. "We focused on UCB-MSCs, as they exhibit several advantages over adult tissue-derived MSCs, including lower immunogenicity, higher proliferation capacity, paracrine potency and therapeutic efficacy both in vitro and in vivo," she said. "Moreover, allogenic transplantation of MSCs, which come from a donor rather than the patient, might have a logistic advantage as they can be used at an early active stage of disease."

Sixty-six preterm infants ranging in age from 23 to 28 gestational weeks (GW) were included in the study. All were on ventilators and each also had experienced significant respiratory difficulties between five and 14 days after birth. The infants were grouped by age - either 23-24 GW or 25-28 GW - and then further randomly divided into groups that received either UCB-MSCs or a placebo. The infants were then assessed for six months.

In their final analysis, the team verified that intratracheal transplantation of MSCs appears to be safe and feasible.

"At the same time, we learned that MSC transplantation did not significantly improve the primary outcome of death or severe/moderate BPD in infants in the 25-28 GW group. However, our subgroup analysis showed that the secondary outcome of severe BPD was significantly improved - dropping from 53 percent to 19 percent - in the younger 23-24 GW group," Dr. Park said.

Dr. Kim added, "Further study with a larger sample size is required to solidly prove therapeutic benefits for infants in this group who are at the highest risk for BPD or death. Accordingly, we are now conducting an additional larger and controlled phase II clinical trial focusing on 23-24 GW infants. We hope to publish these data in the near future."

"This study suggests that the transplantation of umbilical cord blood-derived mesenchymal stem cells directly into the tracheas of preterm babies might mitigate their breathing challenges," said Anthony Atala, M.D., Editor-in-Chief of STEM CELLS Translational Medicine and Director of the Wake Forest Institute for Regenerative Medicine. "The data are encouraging and useful and should lead to a larger, future clinical trial."

Credit: 
AlphaMed Press

Better marketing for a better world

Newly published research contained in the Special Issue of the Journal of Marketing features fourteen global author teams focused on the topic of Better Marketing for a Better World. Edited by Rajesh Chandy (London Business School), Gita Johar (Columbia University), Christine Moorman (Duke University), and John Roberts (University of New South Wales), this Special Issue brings together wide-ranging research to assess, illuminate, and debate whether, when, and how marketing contributes to a better world.

The Special Issue is built on the thesis that marketing has the power to improve lives, sustain livelihoods, strengthen societies, and benefit the world at large. It calls for a renewed focus by marketing scholars on how marketing can contribute to a better world and argues that scholars should examine the impact of marketing on outcomes beyond just what is good for the financial performance of firms. Better Marketing for a Better World emphasizes marketing's role in enhancing the welfare of the world's multiple stakeholders and institutions and asks marketing to engage with many of the world's most important challenges, including persistent poverty, inequity, illiteracy, insecurity, disease, climate change, pollution, and human trafficking, among many others.

Editor Rajesh Chandy, the Tony and Maureen Wheeler Chair in Entrepreneurship at London Business School, where he is also the Academic Director of the Wheeler Institute for Business and Development, notes, "This Special Issue represents a breakthrough in the academic study of marketing. Articles in the Special Issue bring scholarly scrutiny to the impact of marketing on the world around us. And they point to the wealth of possibilities for further study of how better marketing can help create a better world."

The articles in the Special Issue offer rich insights on how to use the power of marketing for good across four key topics:

On sustainability and climate concerns, articles address the adoption of eco-friendly pesticides in rural China, the use of high-end durable products with longer lifecycles, the design of programs to help consumers adopt alternatives to plastic bags, and the labeling of ugly produce to reduce food waste.

Considering economic and social empowerment, researchers document--through randomized controlled trials--how volunteer marketing consultants can help drive growth among entrepreneurs in Uganda, how popping the illusion of financial responsibility among consumers can improve personal savings, and how marketplace literacy training can improve personal well-being among subsistence consumers in India and Tanzania.

Turning to health and well-being, researchers examine the health costs of commonly used variable compensation systems for salespeople, the effectiveness of anti-tobacco policies and ads, improvements in organ donation registration from low cost, easy-to-scale marketing interventions, and the unintended risks of seeking to promote health by portraying humans as machines.

Finally, insights into prosocial giving include how to increase donations by offering people an opportunity to say something about who they are, using predictive models to identify how to best manage different types of donors, members, and member donors, and using price promotion tools to increase donations.

Editor John Roberts, the Scientia Professor of Marketing at the University of New South Wales, Sydney, Australia, points to solutions that seek win-win outcomes: "As the discipline tasked with understanding the customer and external stakeholder-facing activities of the organization, marketing has a unique potential to improve the alignment between the economic activities of firms and other providers with the outputs that consumers and other members of the society value. That potential is not always realized, and the papers in this Special Issue offer several important ways to achieve this."

The power of these articles lies in the way in which they bring theory and insight to very real and pressing problems. As Editor Gita Johar, the Meyer Feldberg Professor of Business at Columbia Business School, notes, "Some of the proposed interventions are small nudges and some are large-scale programs, but they are all innovative, implementable, and most importantly, scalable. These papers collectively illuminate the theory-practice interface needed to advance societal goals."

Editor in Chief Christine Moorman, the T. Austin Finch Sr. Professor of Business Administration at the Fuqua School of Business, Duke University, heralds this Special Issue for the field and points to the increased focus on using marketing for good. She notes, "The field is poised to serve the world in a way we have not yet witnessed. The 239 submissions we received for this Special Issue, the number of Ph.D. students involved in these projects, and the overall interest across marketing faculty around the world point to momentum for the creative exercise of envisioning how marketing can contribute."

To that end, the editors suggest marketing scholars and practitioners look at pressing social issues and ask themselves two simple questions: 1) Does this topic belong in marketing? 2) How could you frame this topic as a marketing question? From these questions, other questions will emanate: Why is the outcome important to marketing? Does marketing exacerbate the problem? Does marketing have the potential to provide a solution to or an explanation for the problem?

They close the editorial with this statement: We can do more. We can do better. Let's work together to develop better marketing for a better world.

Credit: 
American Marketing Association

Growth in home health care failing to keep up with surging demand, study finds

image: "Only 0.7 percent of physicians in Medicare provided home care regularly," said Nengliang "Aaron" Yao, PhD, a researcher with the University of Virginia School of Medicine's Section of Geriatric Medicine. "Targeted policies are needed to support home-based medical care."

Image: 
Dan Addison | UVA Communications

Recent growth in the number of healthcare workers providing home care for Medicare patients is "small and inadequate" compared with the increasing demand in an aging America, a new study suggests.

To have hope of keeping up, Medicare likely will need to reconsider how it compensates providers for home care, the researchers say.

"Only 0.7 percent of physicians in Medicare provided home care regularly," said Nengliang "Aaron" Yao, PhD, a researcher with the University of Virginia School of Medicine's Section of Geriatric Medicine. "Targeted policies are needed to support home-based medical care."

Trends in Home Care

Growth in the field of home care was "modest but steady" between 2012 and 2016, with most of the growth coming from increasing numbers of nurse practitioners providing home visits, the study found.

The total number of providers offering in-home care for Medicare patients grew from about 14,100 to approximately 16,600 between 2012 and 2016, the researchers report. But there was also strong churn in the field - approximately 4,000 providers began offering home visits each year, while roughly 3,000 stopped.

Demand for home care already exceeds supply in much of the country. Only about 15% of frail older adults receive medical care at home. America's aging population, growing numbers of patients with dementia and increasing preference for aging in place all will continue to drive demand, the researchers say.

"More and more older adults are homebound and have a hard time getting to their medical providers," said researcher Justin B. Mutter, MD, the section head of Geriatric Medicine at UVA Health, who provides home visits through UVA's Virginia at Home (VaH) program. "House calls bring the best of person-centered medical care to where many need it most: their home environment."

UVA Health launched the VaH program last summer, in collaboration with the Department of Neurology's Memory and Aging Care Clinic and the UVA Center for Health Humanities and Ethics. VaH's interprofessional team consists of Mutter, nurse practitioner Karen Duffy, clinical pharmacist Bethany Delk and care coordinator Tuula Ranta. The team helps patients age in place, provides caregiver support and offers house calls as well as telemedicine visits, in partnership with UVA's Center for Telehealth. VaH aims to bridge the gap between high demand for, and low supply of, home-based medical care for older adults in Central Virginia.

Obstacles to Home Care

The researchers note that there are many obstacles that hinder the delivery of home care across the country, including Medicare reimbursement rates, travel time and the complexity of many homebound patients' needs. A family doctor may see 20 patients a day in an office-based setting, while many home-care providers are unable to see half as many patients in that same time, the researchers say.

To overcome those challenges, Medicare likely would need to revisit how it compensates providers for home visits. "Home-based medical care ... has been described as a low-volume, high-value service that is not easily rewarded by fee-for-service payment," the researchers write in a new paper outlining their findings. For this reason, they say, integrating value-based payment options within traditional Medicare for homebound older adults will be essential.

The Virginia at Home program has benefited from the generous support of philanthropic gifts for its launch, but philanthropy must be complemented by sustained payment reform for all home-care providers, Mutter and Yao say.

Without such steps, America will continue to struggle to keep pace with the growing demand for home-based medical care in the years to come, the researchers say.

"Home-based medical care is care built around the patients and caregivers with goals tailored to their needs in their environment," Mutter said. "Now more than ever, we need health-care professionals trained and ready to provide this holistic service to our aging population."

Credit: 
University of Virginia Health System

Human land use wasn't always at nature's expense

image: Indigenous villagers in the heavily forested state of Odisha, India.

Image: 
Ganta Srinivas

Nearly three-quarters of Earth's land had been transformed by humans by 10,000 BC, but new research shows it largely wasn't at the expense of the natural world.

A study involving University of Queensland researchers combined global maps of population and land use over the past 12,000 years with current biodiversity data, demonstrating the effective environmental stewardship of Indigenous and traditional peoples.

UQ's Professor James Watson said the findings challenged the modern assumption that human 'development' inevitably led to environmental destruction.

"There's a paradigm among natural scientists, conservationists and policymakers that human transformation of terrestrial nature is mostly recent and inherently destructive," Professor Watson said.

"But lands now characterised as 'natural', 'intact', and 'wild' generally exhibit long histories of human use.

"Even 12,000 years ago, most of Earth's land had been shaped by humans, including more than 95 per cent of temperate lands and 90 per cent of tropical woodlands.

"And, importantly, current global patterns of vertebrate species richness and key biodiversity areas are strongly associated with past patterns of human land use, when compared to current, 'natural', recently-untouched landscapes.

"Humans have been intertwined with nature for most of humanity's existence and this is critical for how we should plan for conservation in the future."

The researchers argue that the modern world's biodiversity crisis has been caused by more complicated factors than simple human expansion.

"Modern environmental destruction has resulted from the appropriation, colonisation and intensifying use of biodiverse cultural landscapes, long shaped and sustained by prior societies," Professor Watson said.

"As such, we need to harness the knowledge of traditional and Indigenous peoples.

"We're in a biodiversity crisis - an enormous extinction event - and lessons learned through millennia of stewardship are, and will be, invaluable.

"Areas under Indigenous management today are now some of the most biodiverse areas remaining on the planet.

"Landscapes under traditional low-intensity use are generally much more biodiverse than those governed by high-intensity agricultural and industrial economies.

"Here in Australia, our Indigenous peoples have lived in sync with incredible biodiversity for the last 50,000 years."

Erle Ellis, Professor of Geography and Environmental Systems at the University of Maryland, said the results showed Indigenous collaboration was critical.

"Effective, sustainable and equitable conservation of biodiversity needs to recognise and empower Indigenous, traditional and local peoples and foster their cultural heritage of sustainable ecosystem management," Professor Ellis said.

Credit: 
University of Queensland

Asymmetric synthesis of aziridine with a new catalyst can help develop novel medicines

image: Scientists from Japan recently proposed a possible transition state for the reaction between 2H-azirines and oxazolones in the presence of a cinchona alkaloid sulfonamide catalyst, producing desirable aziridine-oxazolone compounds with high yields and enantioselectivity (purity).

Image: 
Image courtesy: Shuichi Nakamura from NITech

Unless you've studied chemistry in college, it's unlikely you've come across the name aziridine. Organic compounds with the molecular formula C2H4NH, aziridines are well-known among medicinal chemists, who use them to prepare pharmaceutical drugs such as Mitomycin C, a chemotherapeutic agent known for its anti-tumor activity. Importantly, many aziridines are chiral: they exist as pairs of "enantiomers", molecules that are mirror images of each other and cannot be superimposed on one another. A peculiarity of enantiomers is that the biological activity of one differs from that of its mirror image, and usually only one of the two is desirable for making drugs. Chemists therefore regularly opt for "asymmetric" or "enantioselective" synthesis techniques that yield the desired enantiomer in greater amounts.
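
A note on how that selectivity is quantified: chemists report "enantiomeric excess" (ee), the surplus of the major mirror-image form over the minor one. Here is a minimal illustrative snippet based on the standard definition; the function name is ours, not the paper's:

    # Enantiomeric excess (ee), the standard purity measure for mixtures of
    # enantiomers: ee = (major - minor) / (major + minor) * 100 per cent.
    def enantiomeric_excess(major, minor):
        return (major - minor) / (major + minor) * 100.0

    # A 99:1 mixture of the two mirror-image forms corresponds to 98% ee,
    # the selectivity level reported in the study described below.
    print(enantiomeric_excess(99, 1))  # 98.0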

One such technique that has recently attracted attention from the viewpoint of pharmaceutical synthesis involves the use of oxazolones, chemical compounds with the molecular formula C3H3NO2, to prepare aziridines. "Oxazolones are well-known for their versatility in affording biologically active compounds," explains Professor Shuichi Nakamura of Nagoya Institute of Technology (NITech), Japan, who studies asymmetric reactions. "However, the enantioselective reactions of 2H-azirines with oxazolones have not been very fruitful, despite being touted as one of the most efficient methods to synthesize aziridines."

In a new study recently published in Organic Letters, Prof. Nakamura and his colleagues from NITech and Osaka University, Japan, tackled this issue and, in a significant breakthrough, obtained aziridine-oxazolone compounds in high yields (up to 99%) and with high enantioselectivity, or purity (up to 98%), using an original catalyst of their own design.

The team started off by heating α-azidoacrylates at 150°C in the organic solvent tetrahydrofuran (THF) to prepare 2H-azirines, and then reacted these with oxazolones in the presence of various organocatalysts to produce different aziridine-oxazolone compounds. In particular, the team examined the effect of the catalyst cinchonine and of various heteroarenecarbonyl and heteroarenesulfonyl groups in organocatalysts derived from cinchona alkaloids, and found that reactions using catalysts with either a 2-pyridinesulfonyl group or an 8-quinolinesulfonyl group gave both high yields (81-99%) and high enantiopurities (93-98%). The team also observed that the reaction between a 2H-azirine bearing an ethyl ester group and an oxazolone with a 3,5-dimethoxyphenyl group, in the presence of the catalyst with the 8-quinolinesulfonyl group, gave similarly high yields (98-99%) and enantiopurities (97-98%).
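
To make those enantiopurity figures concrete, the ee formula above can be inverted: an ee of 93-98% corresponds to roughly a 96.5:3.5 to 99:1 ratio of the desired enantiomer over its mirror image. A small illustrative snippet (again, the helper name is ours):

    # Invert the ee formula to recover the implied major:minor ratio (in %).
    def ratio_from_ee(ee_percent):
        major = (100.0 + ee_percent) / 2.0
        return major, 100.0 - major

    for ee in (93, 98):
        major, minor = ratio_from_ee(ee)
        print(f"{ee}% ee -> {major}:{minor}")  # 93% ee -> 96.5:3.5; 98% ee -> 99.0:1.0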

The team then moved on to exploring the reaction between the 2H-azirine bearing an ethyl ester group and a wider variety of oxazolones in the presence of the catalyst with the 8-quinolinesulfonyl group. In all of these reactions they observed high yields (77-99%) and enantiopurities (94-99%), except in one case: an oxazolone bearing a benzyl group, combined with the catalyst carrying a 2-pyridinesulfonyl group, produced only a moderate yield (61%) and purity (86%). Moreover, the researchers were able to convert the obtained aziridines into various other compounds without any loss of enantiopurity.

Finally, the team proposed a catalytic mechanism and a transition state for the reaction of 2H-azirines with oxazolones: the catalyst activates both the oxazolone and the 2H-azirine, which then react to give an "addition product" that, in turn, yields the aziridine while regenerating the catalyst.

While the detailed mechanism is yet to be clarified, the researchers are excited by their findings and look forward to the method's application in medicine and pharmacology. "It has the potential to provide people with new medicines and create new drugs as well as drug candidates that are currently difficult to synthesize. Moreover, the catalyst used in this study can be used for many other stereoselective synthetic reactions," observes an optimistic Prof. Nakamura.

Some fascinating consequences to contemplate for sure!

Credit: 
Nagoya Institute of Technology

2D nanomaterial MXene: The perfect lubricant

image: The atomic layers can move relative to one another, reducing friction.

Image: 
TU Wien

You can lubricate a bicycle chain with oil, but what do you do with a Mars rover or a red-hot conveyor belt in the steel industry? TU Wien, together with research groups from Saarbrücken (Germany), Purdue University in the USA and the Universidad de Chile (Santiago, Chile), has now studied very special nanomaterials that can cope with such conditions.

The material class of MXenes (pronounced "maxene") has caused quite a stir in recent years in connection with novel battery technologies. But it now turns out that MXenes also make an excellent solid lubricant, one that is extremely durable and performs its task even under the most difficult conditions. These remarkable properties have now been reported in the renowned journal ACS Nano.

Like a stack of sheets of paper

Just like the carbon material graphene, MXenes belong to the class of so-called 2D materials: their properties are essentially determined by the fact that they consist of ultra-thin, single atomic layers with no strong bonds to the layers above or below.

"You first start with so-called MAX phases, which are special layer systems consisting of titanium, aluminium and carbon, for example," says Prof. Carsten Gachot, head of the Tribology Group at the Institute of Engineering Design and Product Development at TU Wien. "The crucial trick is to etch out the aluminium with hydrofluoric acid."

What then remains is a stack of atomically thin layers of titanium and carbon that lie loosely on top of each other, much like sheets of paper. Each layer is relatively stable on its own, but the layers can easily be shifted against each other.

This ability of the atomic layers to slide across one another makes the material an excellent dry lubricant: it enables extremely low-resistance sliding without generating abrasion. The friction between steel surfaces could thus be reduced to one sixth, with exceptionally high wear resistance: even after 100,000 movement cycles, the MXene lubricating layer still functioned without problems.
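
To put "reduced to one sixth" in concrete terms, here is a back-of-the-envelope comparison in Python; the baseline friction coefficient and load are illustrative textbook-style assumptions, not measurements from the paper:

    # Illustrative only: 0.6 is a common textbook value for dry steel-on-steel
    # sliding friction, used here purely as an assumption.
    mu_dry = 0.6
    mu_mxene = mu_dry / 6.0   # "reduced to one sixth"

    load = 100.0              # hypothetical normal load, in newtons
    print(f"friction force, dry: {mu_dry * load:.0f} N; with MXene: {mu_mxene * load:.0f} N")

Under these assumptions, the friction force on a 100 N load drops from 60 N to 10 N.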

This makes MXenes perfect for use under difficult conditions: while lubricating oil would evaporate immediately in a vacuum during space missions, for example, MXene in the form of a fine powder can be used even there.

Independent of atmosphere and temperature

"Similar things have been tried with other thin-film materials, such as graphene or molybdenum disulphide," says Carsten Gachot. "But they react sensitively to moisture in the atmosphere. Water molecules can change the bonding forces between the individual layers. With MXenes, on the other hand, this plays a lesser role."

Another decisive advantage is the heat resistance of MXenes: "Many lubricants oxidise at high heat and lose their lubricity. MXenes, on the other hand, are much more stable, and can even be used in the steel industry, where mechanically moving parts can sometimes reach a temperature of several hundred degrees Celsius," explains Gachot.

The powdery lubricant was investigated in several experiments at TU Wien by Dr. Philipp Grützmacher from Prof. Gachot's research group as well as at Saarland University in Saarbrücken and Purdue University in the USA. At the other end of the world, Prof. Andreas Rosenkranz in Chile played a major role in initiating and designing the work.

"There is also already great interest in these materials on the part of industry. We assume that such MXenes can soon be produced on a larger scale," says Carsten Gachot.

Credit: 
Vienna University of Technology