
Take a bath 90 minutes before bedtime to get better sleep

Image: Bathing 1-2 hours before bedtime can significantly increase your chances of getting a good night's rest.

Image credit: Cockrell School of Engineering, The University of Texas at Austin

Biomedical engineers at The University of Texas at Austin may have found a way for people to get better shuteye. Systematic review protocols -- methods used to search for and analyze relevant data -- allowed researchers to analyze thousands of studies linking water-based passive body heating, or bathing and showering with warm/hot water, with improved sleep quality. Researchers in the Cockrell School of Engineering found that bathing 1-2 hours before bedtime in water of about 104-109 degrees Fahrenheit can significantly improve sleep.

"When we looked through all known studies, we noticed significant disparities in terms of the approaches and findings," said Shahab Haghayegh, a Ph.D. candidate in the Department of Biomedical Engineering and lead author on the paper. "The only way to make an accurate determination of whether sleep can in fact be improved was to combine all the past data and look at it through a new lens."

The paper explaining their method was recently published in the journal Sleep Medicine Reviews.

In collaboration with the UT Health Science Center at Houston and the University of Southern California, the UT researchers reviewed 5,322 studies. They extracted pertinent information from publications meeting predefined inclusion and exclusion criteria to explore the effects of water-based passive body heating on a number of sleep-related conditions: sleep onset latency -- the length of time it takes to accomplish the transition from full wakefulness to sleep; total sleep time; sleep efficiency -- the amount of time spent asleep relative to the total amount of time spent in bed intended for sleep; and subjective sleep quality.

Meta-analytical tools were then used to assess consistency across the relevant studies and showed that an optimal water temperature of between 104 and 109 degrees Fahrenheit improved overall sleep quality. When bathing was scheduled 1-2 hours before bedtime, it also hastened falling asleep by an average of 10 minutes.
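
As a rough illustration of the kind of pooling such meta-analytical tools perform, the sketch below combines study-level estimates with inverse-variance weights and computes Cochran's Q as a consistency check. All numbers are hypothetical placeholders, not values from the paper.

    import numpy as np

    # Hypothetical per-study estimates: mean reduction in sleep onset latency
    # (minutes) and the standard error of each estimate.
    effects = np.array([8.0, 12.0, 9.5, 11.0])
    std_errs = np.array([3.0, 4.0, 2.5, 3.5])

    # Fixed-effect (inverse-variance) pooling
    w = 1.0 / std_errs**2
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    # Cochran's Q gauges consistency (heterogeneity) between studies
    Q = np.sum(w * (effects - pooled) ** 2)

    print(f"Pooled reduction in sleep onset latency: {pooled:.1f} +/- {pooled_se:.1f} min")
    print(f"Heterogeneity Q = {Q:.2f} on {len(effects) - 1} degrees of freedom")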

Much of the science to support links between water-based body heating and improved sleep is already well-established. For example, it is understood that both sleep and our body's core temperature are regulated by a circadian clock located within the brain's hypothalamus that drives the 24-hour patterns of many biological processes, including sleep and wakefulness.

Body temperature, which is involved in the regulation of the sleep/wake cycle, exhibits a circadian cycle, being 2-3 degrees Fahrenheit higher in the late afternoon/early evening than during sleep, when it is the lowest. The average person's circadian cycle is characterized by a reduction in core body temperature of about 0.5 to 1 F around an hour before usual sleep time, dropping to its lowest level between the middle and later span of nighttime sleep. It then begins to rise, acting as a kind of a biological alarm clock wake-up signal. The temperature cycle leads the sleep cycle and is an essential factor in achieving rapid sleep onset and high efficiency sleep.

The researchers found the optimal timing of bathing for cooling down of core body temperature in order to improve sleep quality is about 90 minutes before going to bed. Warm baths and showers stimulate the body's thermoregulatory system, causing a marked increase in the circulation of blood from the internal core of the body to the peripheral sites of the hands and feet, resulting in efficient removal of body heat and decline in body temperature. Therefore, if baths are taken at the right biological time -- 1-2 hours before bedtime -- they will aid the natural circadian process and increase one's chances of not only falling asleep quickly but also of experiencing better quality sleep.

The research team is now working with UT's Office of Technology Commercialization in the hopes of designing a commercially viable bed system using UT-patented Selective Thermal Stimulation technology. The technology allows thermoregulatory function to be manipulated on demand and provides dual-zone temperature control that can be tailored to maintain an individual's optimum temperatures throughout the night.

Credit: 
University of Texas at Austin

Discovering how diabetes leads to vascular disease

Image: The Navedo lab team is identifying how diabetes increases the risks of serious health conditions such as heart disease and stroke. From left to right are Debapriya Ghosh, Gopireddy Reddy, Arsalan Syed, Manuel Navedo, Madeline Nieves-Cintrón and Thanhmai Lee.

Image credit: UC Regents / UC Davis Health

A team of UC Davis Health scientists and physicians has identified a cellular connection between diabetes and one of its major complications -- blood vessel narrowing that increases risks of several serious health conditions, including heart disease and stroke.

The authors hope their work leads to diabetes treatments -- beyond blood sugar monitoring and insulin therapy -- that target the molecular source of its damaging effects on the vascular system.

The same team previously found that high blood glucose, the hallmark symptom of diabetes, activates an enzyme known as protein kinase A (PKA), which increases calcium channel activity and constricts blood vessels.

"This was a surprise, since PKA is typically associated with blood vessel widening and wasn't really on our radar," said senior author Manuel Navedo, professor of pharmacology at UC Davis Health. "We wanted to understand the molecular processes that created this opposite reaction."

For the new study, published in The Journal of Clinical Investigation, the Navedo lab team conducted a series of experiments on the effects of high glucose on cerebral blood vessels and arterial cells that control blood flow. The tests were conducted on a unique genetically modified mouse and two mouse models of diabetes that were developed at UC Davis for studies of cardiovascular health.

The researchers focused on the relationship between PKA and adenylyl cyclase (AC) -- an enzyme involved in cyclic AMP (cAMP) production, a cellular messenger with a critical role in vascular cell function. Their results showed that one AC in particular -- AC5 -- mediated cAMP and PKA activation, triggering increased calcium channel activity and blood vessel narrowing. They also found that AC5 was essential for blood-vessel constriction during diabetes.

The team now hopes to test the effects of the AC5 chain reaction in high-glucose conditions in human cells. This step could confirm it as a treatment target for reducing the vascular complications of diabetes, which can include eye, kidney, cerebral, gastrointestinal and cardiovascular disease.

"We see every day in our clinics the devastating impact of diabetes on the health and lives of our patients," said co-author Nipavan Chiamvimonvat, the Roger Tatarian Endowed Professor in Cardiovascular Medicine at UC Davis Health. "Our work brings into much clearer focus how high glucose can damage the vascular system and gives us a new target for blocking its effects."

Credit: 
University of California - Davis Health

NIH study links air pollution to increase in newborn intensive care admissions

Infants born to women exposed to high levels of air pollution in the week before delivery are more likely to be admitted to a newborn intensive care unit (NICU), suggests an analysis by researchers at the National Institutes of Health. Depending on the type of pollution, chances for NICU admission increased from about 4% to as much as 147%, compared to infants whose mothers did not encounter high levels of air pollution during the week before delivery. The study was led by Pauline Mendola, Ph.D., of the Epidemiology Branch at NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development. It appears in Annals of Epidemiology.

"Short-term exposure to most types of air pollutants may increase the risk for NICU admission," Dr. Mendola said. "If our findings are confirmed, they suggest that pregnant women may want to consider limiting their time outdoors when air quality advisories indicate unhealthy conditions."

Previous studies have linked elevated levels of certain kinds of air pollutants to higher risks for gestational diabetes and preeclampsia, a blood pressure disorder of pregnancy. Earlier research also has shown that infants born to women exposed to high levels of air pollutants are at risk of preterm birth, of being small for their gestational age at birth, and of growing more slowly than normal in the uterus. Given these associations, the study authors sought to determine whether prenatal exposure to air pollution might increase the chance for NICU admission.

Researchers analyzed data from the Consortium on Safe Labor, which compiled information on more than 223,000 births at 12 clinical sites in the United States from 2002 to 2008. They linked records from more than 27,000 NICU admissions to data modified from the Community Multiscale Air Quality Modeling System, which estimates environmental pollution concentrations in the United States. Researchers matched air quality data in the area where each birth occurred to the week before delivery, the day before delivery and the day of delivery. They then compared these time intervals to air quality data two weeks before delivery and two weeks after delivery to identify risk of NICU admission associated with pollution levels.

The researchers also examined the odds of NICU admission associated with high concentrations of particulate matter (pollution particles) less than 2.5 microns in diameter (PM2.5). These types of particles originate from various sources, among them diesel and gasoline engines, power plants, landfills, sewage facilities and industrial processes. Exposure to high concentrations of organic compounds in the air was associated with a 147% increase in risk of NICU admission. Elemental carbon and ammonium ions presented similar increases in risk (38% and 39%, respectively), while exposure to nitrate compounds was associated with a 16% higher risk of NICU admission.

Chances of NICU admission increased significantly with exposures to traffic-related pollutants on the day before and the day of delivery, compared to the week before delivery: 4% and 3%, respectively, for an approximately 300 parts per million (ppm) increase in carbon monoxide; 13% and 9% for an approximately 26 ppm increase in nitrogen dioxide; and 6% and 3% for an approximately 3 ppm increase in sulfur dioxide.

Researchers do not know why exposure to air pollution might increase the chances for NICU admission. They theorize, however, that pollutants increase inflammation, leading to impaired blood vessel growth, particularly in the placenta, which supplies oxygen and nutrients to the developing fetus.

The authors note that rising NICU admission rates present financial challenges for families and society, as average daily NICU costs may reach or exceed $3,000. If their results are confirmed by other studies, limiting pregnant women's exposure to high levels of air pollutants may provide a way to reduce NICU admissions.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

Is 2016 US presidential election associated with preterm births among Latina women?

Bottom Line: A national population-based study suggests the 2016 U.S. presidential election may have been associated with an increase in preterm births among Latina women in the United States. The study design is used to evaluate whether policies or other population-level changes interrupt a trend in an outcome. Using data on birth counts from 2009 through July 2017 from the Centers for Disease Control and Prevention, researchers compared preterm births (less than 37 weeks) to Latina women after the 2016 presidential election with the number expected had the election not taken place. Among nearly 32.9 million live births recorded during the study period, 11% of male and 9.6% of female births to Latina women were preterm, compared with 10.2% and 9.3%, respectively, for births to other women. In the nine-month period beginning in November 2016, an additional 1,342 male and 995 female preterm births to Latina women were observed above the expected number, about 3.2% to 3.6% more. This study cannot identify the reasons behind the findings, and other limitations include an inability to differentiate between native and nonnative Latina women in the U.S. The authors suggest future research examine the association of anti-immigration policies with population health.
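
The comparison of observed with expected births rests on projecting the pre-election trend forward. The minimal sketch below fits a linear trend plus month-of-year seasonality to hypothetical monthly counts and sums the post-November-2016 excess; it illustrates the general approach only, and is not the authors' model or data.

    import numpy as np

    rng = np.random.default_rng(0)
    months = np.arange(103)                      # Jan 2009 .. Jul 2017 (toy series)
    season = 1 + 0.05 * np.sin(2 * np.pi * months / 12)
    counts = rng.poisson(20000 * season)         # hypothetical monthly preterm counts

    cutoff = 94                                  # index of Nov 2016 in this toy series
    pre, post = months < cutoff, months >= cutoff

    # Design matrix: intercept, linear trend, and month-of-year indicators
    moy = months % 12
    X = np.column_stack([np.ones_like(months, dtype=float), months.astype(float)] +
                        [(moy == m).astype(float) for m in range(1, 12)])

    # Fit on the pre-election period, then project over the post-election months
    beta, *_ = np.linalg.lstsq(X[pre], counts[pre].astype(float), rcond=None)
    expected_post = X[post] @ beta

    excess = counts[post].sum() - expected_post.sum()
    print(f"Excess preterm births after the cutoff (toy data): {excess:.0f}")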

Authors: Alison Gemmill, Ph.D., of the Johns Hopkins Bloomberg School of Public Health, Baltimore, and coauthors

(doi:10.1001/jamanetworkopen.2019.7063)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

X-ray laser sight reveals drug targets

Image: Illustration.

Image credit: Elena Khavina/MIPT Press Office

Researchers from the Moscow Institute of Physics and Technology have published a review on serial femtosecond crystallography, one of the most promising methods for analyzing the tertiary structure of proteins. This technique has rapidly evolved over the past decade, opening new prospects for the rational design of drugs targeting proteins previously inaccessible to structural analysis. The article came out in the journal Expert Opinion on Drug Discovery.

X-ray crystallography

X-ray crystallography is one of the main methods for revealing the 3D structure of biological macromolecules, such as proteins. It has helped determine the structure of numerous pharmacologically important enzymes and receptors, enabling the design of drugs targeting these proteins.

The method involves crystallizing a protein and studying it via X-ray diffraction. First the protein is isolated and purified. Then the solvent gradually dries out. As a result, the molecules whose structure is being investigated form crystals, characterized by an internal order. By exposing a crystal to X-rays in a special device, researchers obtain a diffraction pattern. It contains information on the positions of atoms in the crystal. A close analysis of the pattern reveals the 3D structure of the constituent protein molecules.

Before the advent of this method, new drugs were mostly sought empirically: either by changing the structure of the molecules known to affect the target protein, or by sorting through arrays of molecules in chemical libraries. Now that the 3D structures of many target proteins are available, researchers can view them on a computer screen and quickly sort through millions of compounds searching for drug candidates. That way they save much time and money previously spent on chemical synthesis and "wet" experiments.

X-ray crystallography produces good results for crystals that are large, stable, and homogeneous -- that is, with no impurities or structural defects. To better detect a weak diffraction signal, a powerful radiation pulse is needed, but not so powerful as to destroy the crystal. In conventional X-ray crystallography, a protein crystal is rotated in the X-ray beam to produce diffraction patterns for various spatial orientations. This captures maximum information on the structure.

Method for tricky targets

Soon after X-ray crystallography emerged, it became clear that not all biological macromolecules can be crystallized. Some proteins are ordinarily dissolved in the inner cell medium. So it is fairly easy to put them into solution, evaporate it, and obtain a large regular crystal. But membrane proteins, many receptors among them, form crystals that are not large and pure enough for standard X-ray crystallography. That said, many of these proteins are involved in disease development, meaning their structure is of great interest to pharmacologists.

Less than a decade ago, a solution was found for membrane proteins. This new technique, called serial femtosecond X-ray crystallography, or SFX, relies on X-ray free-electron lasers (fig. 1), developed shortly before SFX.

Alexey Mishin, deputy head of the Laboratory for Structural Biology of Receptors at MIPT, who co-authored the study, explained: "What makes it a breakthrough technology is a very high energy density of the laser pulse. The object is exposed to such powerful radiation that it falls apart, inevitably and almost instantly. But before it does, some individual quanta of the laser pulse scatter off the sample and end up at the detector. This is the so-called diffraction-before-destruction principle for studying the structure of the original protein."

X-ray free-electron lasers proved useful outside biology: Over the past years, SFX has been used increasingly often by physicists and chemists, too. The first device became available to experimenters in 2009, and now there are five centers open to researchers in the U.S., Japan, South Korea, Germany, and Switzerland. A new one is being built in China, and the U.S. facility -- historically the first one -- has announced plans for modernization.

While the new technology has offered researchers a glimpse into the structure of proteins previously eluding analysis, it has also promoted novel technical and mathematical solutions. Conventional X-ray crystallography involves exposing one crystal to radiation from various angles and analyzing the resulting diffraction patterns collectively. In SFX, the crystal is instantly destroyed by the first interaction with a powerful X-ray pulse. So researchers need to repeat the process with many small crystals and analyze the "serial" data thus generated, hence the name of the method.

A further challenge is selecting the samples for SFX. In conventional X-ray crystallography, simply choosing the largest and highest-quality crystal was the way to go. This could be done manually, by eyeing the available samples. The new procedure requires working with a suspension of many small crystals of varying size and quality. Centrifuges and filters with known pore dimensions are used to separate the crystals by size.

Methods for delivering samples into the chamber also had to be developed. X-ray free-electron lasers have a maximum frequency at which they can emit radiation pulses, so to save time and expense, new crystals should be fed into the chamber at that same frequency. Two approaches have been developed so far. In the first, the crystals enter the chamber in a liquid suspension supplied by an injector; the jet leaving the injector is "squeezed" by a stream of gas so that a crystal passing through ends up precisely at the center of the laser beam (fig. 2, left). Alternatively, the protein crystals can be spread over a substrate transparent to X-rays and automatically fed into the laser beam before each pulse (fig. 2, right).

Since producing its first results in 2011, SFX has revealed over 200 protein structures. Among them are 51 targets potentially important for pharmacology -- membrane receptors, enzymes, viral proteins, etc. -- that used to be inaccessible to conventional analytical techniques.

The systematic review of the technology as applied to biology and pharmacology by the MIPT team will no doubt aid other researchers seeking to obtain the structures of key drug targets to develop new medications.

Credit: 
Moscow Institute of Physics and Technology

Smart irrigation model predicts rainfall to conserve water

Image: A predictive model combining information about plant physiology, real-time soil conditions and weather forecasts can save 40% of the water consumed by traditional irrigation strategies, according to new Cornell research.

Image credit: Cornell University

ITHACA, N.Y. - Fresh water isn't unlimited. Rainfall isn't predictable. And plants aren't always thirsty.

Just 3 percent of the world's water is drinkable, and more than 70 percent of that fresh water is used for agriculture. Unnecessary irrigation wastes huge amounts of water - some crops are watered twice as much as they need - and contributes to the pollution of aquifers, lakes and oceans.

A predictive model combining information about plant physiology, real-time soil conditions and weather forecasts can help make more informed decisions about when and how much to irrigate. This could save 40 percent of the water consumed by more traditional methods, according to new Cornell University research.

"If you have a framework to connect all these excellent sources of big data and machine learning, we can make agriculture smart," said Fengqi You, energy systems engineering professor.

You is the senior author of "Robust Model Predictive Control of Irrigation Systems With Active Uncertainty Learning and Data Analytics," which published in IEEE Transactions on Control Systems Technology. The paper was co-authored with Abraham Stroock, professor of chemical and biomolecular engineering, who is working on water conservation strategies with apple farmers in New York state and almond, apple and grape growers in drought-ridden regions of the West Coast.

"These crops, when grown in the semiarid, semidesert environment of California's Central Valley, are huge consumers of water - one gallon of water per almond," Stroock said. "So there's a real opportunity to improve the way we manage water in these contexts."

Controlling plant moisture precisely could also improve the quality of sensitive specialty crops such as wine grapes, he said.

The researchers' method uses historical weather data and machine learning to assess the uncertainty of the real-time weather forecast, as well as the uncertainty of how much water will be lost to the atmosphere from leaves and soil. This is combined with a physical model describing variations in the soil moisture.

Integrating these approaches, they found, makes watering decisions much more precise.
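
The sketch below is a toy version of that idea, not the authors' controller: a simple soil water balance driven by an uncertain rain forecast, with today's irrigation chosen as the smallest amount that keeps even the drier forecast scenarios above a target moisture level. All parameters are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    moisture = 0.55        # current soil moisture (fraction of capacity, hypothetical)
    target = 0.50          # lower bound the controller must respect
    et_per_day = 0.04      # daily evapotranspiration loss (hypothetical)
    horizon = 5            # forecast horizon in days

    # Mean forecast rainfall (in moisture units) plus sampled forecast error,
    # standing in for the machine-learned uncertainty model in the study.
    rain_mean = np.array([0.00, 0.02, 0.05, 0.01, 0.00])
    rain_scenarios = np.clip(rain_mean + rng.normal(0, 0.02, size=(200, horizon)), 0, None)

    def worst_case_low(irrigate_today):
        """5th-percentile of the minimum soil moisture over the horizon."""
        start = moisture + irrigate_today
        trajectories = start + np.cumsum(rain_scenarios - et_per_day, axis=1)
        return np.percentile(trajectories.min(axis=1), 5)

    # Smallest irrigation amount that keeps the worst-case trajectory above target;
    # irrigating no more than necessary is what saves water.
    for irrigation in np.arange(0.0, 0.21, 0.01):
        if worst_case_low(irrigation) >= target:
            break
    print(f"Irrigate {irrigation:.2f} moisture units today")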

Part of the challenge of the research is identifying the best method for each crop, and determining the costs and benefits of switching to an automated system from a human-operated one. Because apple trees are relatively small and respond quickly to changes in precipitation, they may not require weeks or months of weather data. Almond trees, which tend to be larger and slower to adapt, benefit from longer-term predictions.

"We need to assess the right level of complexity for a control strategy, and the fanciest might not make the most sense," Stroock said. "The experts with their hands on the valves are pretty good. We have to make sure that if we're going to propose that somebody invest in new technology, we've got to be better than those experts."

Credit: 
Cornell University

Predicting long-term risk of death from chest X-rays

What The Study Did: Researchers examined whether a computing system that analyzed data from thousands of chest x-rays of smokers and nonsmokers to develop a risk score could predict long-term risk of death.

Authors: Michael T. Lu, M.D., M.P.H., of Massachusetts General Hospital and Harvard Medical School in Boston is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.7416)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Routine blood tests could predict diabetes

Image: Dr. Mary Rhee, a physician-researcher with the Atlanta VA and Emory University, discusses the use of a glucometer with VA patient Joseph Fields.

Image credit: Lisa Pessin

Random plasma glucose tests could be used to predict which patients will develop diabetes, according to a study of Veterans Affairs treatment data. Researchers from several VA systems showed that levels of glucose found during standard outpatient medical testing revealed patients' likelihood of developing diabetes over the next five years, even when glucose levels did not rise to the level of diabetes diagnosis.

The findings could lead to earlier treatment and better outcomes. The research shows that glucose levels that normally would not be seen as indicating diabetes risk can in fact predict the disease's development. According to the researchers, using plasma glucose levels in blood samples taken during regular doctor visits "could signal the need for further testing, allow preventive intervention in high risk individuals before onset of disease, and lead to earlier identification of diabetes."

The results appear in the July 19, 2019, issue of PLOS ONE.

Diabetes is a major health problem in the United States, yet over 7 million Americans with diabetes go undiagnosed, according to the Centers for Disease Control and Prevention. Early diagnosis allows the use of lifestyle changes or medications that could help prevent or delay the progression from prediabetes to diabetes and help keep diabetes from worsening. When diagnosis is delayed, diabetes-related complications could develop before treatment starts.

"Although screening for prediabetes and diabetes could permit earlier detection and treatment, many in the at-risk population do not receive the necessary screening," notes Dr. Mary Rhee, lead author on the study and a physician-researcher with the Atlanta VA Health Care System and Emory University.

The American Diabetes Association recommends testing for diabetes using a few different methods: a fasting glucose level, an oral glucose tolerance test (which requires fasting and ingestion of a glucose load), an HbA1c level (a measure of average blood glucose levels over the previous two to three months), or a random plasma glucose when accompanied by symptoms caused by high glucose levels. A random plasma glucose of 200 mg/dL (milligrams per deciliter) or higher is usually the threshold for a diagnosis of diabetes. The test can be done at any time and does not require fasting or withholding meals.

These tests are frequently included in routine labwork patients undergo during or after outpatient medical appointments. However, since many patients are not fasting when they get their blood drawn, routine blood tests with random glucose levels below 200 mg/dL have not previously been deemed useful for diabetes screening.

VA researchers examined data on these routine blood tests to see whether random plasma glucose levels could in fact predict which patients would develop diabetes in the future. They studied data on more than 900,000 VA patients who were not already diagnosed with diabetes. All patients had at least three random plasma glucose tests during a single year. Most of these tests were likely obtained "opportunistically"--that is, during regular doctor visits not specifically related to diabetes screening.

Over a five-year follow-up, about 10% of the total study group developed diabetes. Elevated random plasma glucose levels, though not meeting the diagnostic threshold for diabetes, accurately predicted the development of diabetes within the following five years. Patients with at least two random plasma glucose measurements of 115 mg/dL or higher within a 12-month period were highly likely to be diagnosed with diabetes within a few years. Glucose levels of 130 mg/dL or higher were even more predictive of diabetes.

As expected, demographics and risk factors known to be related to diabetes also predicted development of the disease. Demographic factors included age, sex, and race. Other risk factors that predicted diabetes included a high body mass index (BMI), smoking, and high cholesterol. However, random plasma glucose tests were stronger predictors of diabetes than demographics or other risk factors, alone or combined.

Development of diabetes was infrequent in subjects whose highest random plasma glucose levels were below 110 mg/dL.

In light of these findings, the researchers recommend that patients receive follow-up diagnostic testing for diabetes, such as a fasting glucose or A1c test, if they have two random glucose tests showing levels 115 mg/dL or higher. This approach would very likely be cost-effective, they say, because clinicians can use testing that is already being done during routine outpatient visits. Using random glucose levels for screening could identify patients at higher risk for diabetes, leading to earlier intervention to prevent or control the disease.
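
As a concrete illustration of that screening rule (a sketch, not the study's code), the function below flags a patient for follow-up diabetes testing when at least two random plasma glucose results of 115 mg/dL or higher fall within any 12-month window; the patient record shown is hypothetical.

    from datetime import date, timedelta

    def needs_followup(results, threshold=115, window_days=365, min_hits=2):
        """results: list of (date, glucose in mg/dL) from routine, non-fasting labs."""
        high = sorted(d for d, glucose in results if glucose >= threshold)
        for i in range(len(high) - min_hits + 1):
            if high[i + min_hits - 1] - high[i] <= timedelta(days=window_days):
                return True
        return False

    # Hypothetical patient record
    labs = [(date(2018, 2, 10), 104), (date(2018, 9, 3), 118), (date(2019, 1, 21), 131)]
    print(needs_followup(labs))  # True -> recommend fasting glucose or A1c testing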

"These findings have the potential to impact care in the VA and in the general U.S. population," explains Rhee, "as random plasma glucose levels--which are convenient, low-cost, and 'opportunistic'--could appropriately prompt high-yield, focused diagnostic testing and improve recognition and treatment of prediabetes and early diabetes."

Credit: 
Veterans Affairs Research Communications

Understanding the mode of action of primaquine: New insights into a 70-year-old puzzle

Researchers at LSTM have taken significant steps in understanding the way that the anti-malarial drug primaquine (PQ) works, which they hope will lead to the development of new, safer and more effective treatments for malaria.

The work, which was predominantly funded by an award from the Medical Research Council, was carried out by LSTM's Centre for Drugs and Diagnostics (CDD). The results are published in the journal Nature Communications.

LSTM's Professor Giancarlo Biagini explained: "The antimalarial primaquine is a cornerstone of global efforts to eliminate malaria, for some 70 years it has been the only drug registered that has been demonstrated to be able to cure relapsing malaria and block transmission of the disease. However, little has previously been understood about the drug's mode of action, which is seriously undermining efforts to improve the safety and pharmacology profile of this important drug class."

The team at LSTM, working with key collaborators including Professor Paul O'Neill (University of Liverpool), Professor David Baker (London School of Hygiene and Tropical Medicine) and Professor Sangeeta Bhatia (Massachusetts Institute of Technology, USA), was able to replicate the interaction between the drug and the host enzymes that catalyse the generation of cytotoxic amounts of hydrogen peroxide from metabolites of PQ. The experiments demonstrated why the drug displays exquisite selectivity against specific parasite stages and also explained why only very small (nM) catalytic concentrations of metabolites are necessary to kill the parasites.

The search for new antimalarials is vital in the drive towards global malaria elimination, especially given that PQ is potentially lethal to people with the genetic disorder glucose-6-phosphate dehydrogenase (G6PD) deficiency, which affects millions of people in malaria-endemic countries.

"This is why an understanding of how the drug works is central to replicating its most significant elements." Continued Professor Biagini: "This work has been possible with CDD given the multidisciplinary nature of the team. The current study makes a significant advancement in our understanding of PQ mechanism of action. This new knowledge is key to the development of newer and safer, broad-spectrum antimalarial drugs, work currently underway within our group."

Credit: 
Liverpool School of Tropical Medicine

Using mathematics to trace neuro transitions

Image: Neuroscience journal cover inspired by Francesca Arese Lucini. It shows how conscious activity in the brain (depicted as the frontal cortex at the top of the image) evanesces into the unconscious part of the brain in the lower portion.

Image credit: Neuroscience

Unique in its application of a mathematical model to understand how the brain transitions from consciousness to unconscious behavior, a study at The City College of New York's Benjamin Levich Institute for Physico-Chemical Hydrodynamics may have just advanced neuroscience appreciably. The findings, which come, perhaps surprisingly, from physicists, suggest that the subliminal state is the most robust part of the conscious network; the study appears on the cover of the journal Neuroscience.

"It's a big step forward in terms of analysis and investigation of the brain," said City College PhD student Francesca Arese Lucini of the research, entitled "How the Brain Transitions from Conscious to Subliminal Perception," that she led. "A lot of companies and startups are now focused on understanding more on how (people) work by focusing a lot of energy and money on investigating the brain. This study will probably interest those entrepreneurs or researchers that will focus on artificial intelligence and human computer interaction devices."

Arese Lucini and a team of physicists analyzed functional MRI (fMRI) scans of different subjects to obtain the corresponding brain activation. From that activation they built functional networks for the conscious state and compared each network with its subliminal counterpart for every fMRI stream.

Reducing the conscious network by a k-core decomposition left only the k-core: the maximal subgraph in which every node has at least k neighbors within the subgraph. The k-core of the conscious state reduces to three active regions of the brain, the fusiform gyrus (left and right) and the precentral gyrus. These regions are the only ones active in the subliminal state.
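
The toy example below shows the k-core operation on a small graph, using networkx's k_core; the node labels echo the regions named above, but the graph itself is a hypothetical illustration, not the study's fMRI-derived network.

    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("fusiform_L", "fusiform_R"),
        ("fusiform_L", "precentral"),
        ("fusiform_R", "precentral"),            # a tightly connected core triangle
        ("frontal_A", "fusiform_L"),             # weakly attached peripheral nodes
        ("frontal_B", "fusiform_R"),
    ])

    # Keep the maximal subgraph in which every node has at least k = 2 neighbors
    core = nx.k_core(G, k=2)
    print(sorted(core.nodes()))  # ['fusiform_L', 'fusiform_R', 'precentral']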

"This implies that the k-core decomposition model could help identify in mathematical term what is the subliminal state of the brain and according to such model, the subliminal state, is identified as the most robust part of the conscious network," observed Arese Lucini. "We can then postulate that the nodes of the network which are involved in the process of conscious perception are built on top of the subliminal state which represents the core of the network."

Highlighting the significance of the research, Hernán A. Makse, professor in CCNY's Division of Science and a Fellow of the American Physical Society, pointed at its publication in a top neuroscience journal, a rarity for a paper by physicists.

Credit: 
City College of New York

Study finds key metabolic changes in patients with chemotherapy-associated cardiotoxicity

BOSTON - More and more patients are being treated successfully for cancer. However, some cancer treatments that are very effective for breast cancer - medications like anthracyclines and trastuzumab - can cause heart dysfunction and lead to heart failure. Heart-related side effects can limit the amount of cancer therapy that patients are eligible to receive. Currently, there is no effective way of predicting which patients will develop heart dysfunction during or after receiving these medications.

To learn more about the processes that lead to heart toxicity, a team of researchers at Beth Israel Deaconess Medical Center (BIDMC) embarked on a study to investigate whether early changes in energy-related metabolites in the blood - measured shortly after chemotherapy - could be used to identify patients who developed heart toxicity at a later time. The study, published in the Journal of Cardiovascular Translational Research, found that metabolites associated with the energy powerhouse of the cell - the mitochondria - changed differently in patients who later developed heart dysfunction compared to those who did not.

Using blood samples obtained from 38 patients treated with anthracyclines and trastuzumab for breast cancer, the researchers measured 71 energy-related metabolites. They then compared metabolite profiles between patients who developed heart toxicity and those who did not, identifying changes in citric acid and aconitic acid that differentiated the two groups of patients.

"In particular, levels of citric acid increased over time in patients who did not develop heart toxicity, but they remained the same or decreased in patients who did develop heart toxicity," said corresponding author Aarti Asnani, MD, Director of the Cardio-Oncology Program at BIDMC. "The ability to augment citric acid and related metabolites may be a protective response that is absent or defective in patients with heart toxicity." The researchers also observed changes in breakdown products of DNA that differentiated the two groups of patients.

"We hope these findings will ultimately lead to the development of biomarkers that could be used to determine which patients are at the highest risk of developing chemotherapy-related heart toxicity," said Asnani. "Identification of high-risk patients could allow us to consider medications that protect the heart before patients begin chemotherapy, or prompt the use of different chemotherapy regimens that are less toxic to the heart in those patients."

In their next phase of research to follow up on this pilot study, Asnani and colleagues will seek to confirm their results in larger patient populations.

Credit: 
Beth Israel Deaconess Medical Center

Adding a polymer stabilizes collapsing metal-organic frameworks

Image: Polymer braces, placed inside large-pore MOFs, help to inhibit the collapse of the framework.

Image credit: Li Peng (EPFL)

Metal-organic frameworks (MOFs) are a special class of sponge-like materials with nano-sized pores. The nanopores lead to record-breaking internal surface areas, up to 7,800 square meters in a single gram. This feature makes MOFs extremely versatile materials with multiple uses, such as separating petrochemicals and gases, mimicking DNA, producing hydrogen, and removing heavy metals, fluoride anions, and even gold from water - to name a few.

One of the key features is pore size. MOFs - and other porous materials - are classified based on the diameter of their pores: MOFs with pores up to 2 nm in diameter are called "microporous", and anything above that is called "mesoporous". Most MOFs today are microporous, so they are not useful in applications that require them to capture large molecules or catalyze reactions between them - basically, the molecules don't fit the pores.

So more recently, mesoporous MOFs have come into play because they show a lot of promise in large-molecule applications. Still, they aren't problem-free: when pore sizes get into the mesoporous regime, the frameworks tend to collapse. Understandably, this reduces the internal surface area of mesoporous MOFs and, with that, their overall usefulness. Since a major focus in the field is finding innovative ways to maximize MOF surface areas and pore sizes, addressing the collapsing problem is a top priority.

Now, Dr Li Peng, a postdoc at EPFL Valais Wallis, has solved the problem by adding small amounts of a polymer into the mesoporous MOFs. Because the polymer pins the MOF pores open, adding it dramatically increased accessible surface areas, by factors of 5 to 50. The study was led by the research group of Wendy Lee Queen, in collaboration with the labs of Berend Smit and Mohammad Khaja Nazeeruddin at EPFL's Institute of Chemical Sciences and Engineering (ISIC).

After adding the polymer to the MOFs, their high surface areas and crystallinity were maintained even after heating the MOFs at 150°C - temperatures that would previously be unreachable due to pore collapse. This new stability provides access to many more open metal coordination sites, which also increases the reactivity of the MOFs.

In the study, published in the Journal of the American Chemical Society, two PhD students, Sudi Jawahery and Mohamad Moosavi, used molecular simulations to investigate why pores collapse in mesoporous MOFs in the first place and proposed a mechanism to explain how polymers stabilize the structure at the molecular level.

"We envision that this method for polymer-induced stabilization will allow us to make a number of new mesoporous MOFs that were not before accessible due to collapse," says Queen. "Hence, this work can open up new, exciting applications involving the separation, conversion, or delivery of large molecules."

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Tornadoes, windstorms pave way for lasting plant invasions

Image: New research from the University of Illinois shows invasive plants thrive and persist more than a decade after windstorms hit.

Image credit: Melissa Daniels

URBANA, Ill. - When tornadoes touch down, we brace for news of property damage, injuries, and loss of life, but the high-speed wind storms wreak environmental havoc, too. They can cut through massive swaths of forest, destroying trees and wildlife habitat, and opening up opportunities for invasive species to gain ground.

A new University of Illinois study, published in the Journal of Ecology, shows that large blowdown areas in southern Illinois forests are more heavily invaded and slower to recover than smaller areas. The research guides management decisions for windstorm-prone forests.

"We used satellite imagery and grueling on-the-ground surveys to look at what was happening with invasive plants after a series of windstorms - a tornado in 2006, a derecho in 2009, and another tornado in 2017 - hit southern Illinois forests," says Eric Larson, assistant professor in the Department of Natural Resources and Environmental Sciences at U of I and co-author on the study. "We assume the forest recovers and those invaders get shaded out, but they may not. They could potentially prevent forest recovery or spread into surrounding areas."

Melissa Daniels, a former graduate student who led the project, adds, "Forest health impacts all of us. Forests provide a lot of important ecosystem services, including biodiversity and carbon sequestration, things that are important for our well-being as a society. We should all care about phenomena that impact our forests."

Larson and Daniels identified blowdown areas after each of the three storms using Landsat satellite imagery. For each affected area, the team identified a matching forest parcel, similar in tree type, size, elevation, slope, and distance to roads and trails, that had not been impacted by storms. Then Daniels visited all 62 sites in the summer of 2018 to survey for invasive plants.

"It was hands-down one of the most difficult things I've ever done, and I do a lot of difficult things, like ski mountaineering, long backpacking trips, and mountain climbing. Illinois summers are brutal, but they're even more difficult when you are walking up steep hills, climbing over logs that are half your height, pulling off two dozen ticks a day, and battling stinging nettle and poison ivy," Daniels says. "But it's so beautiful and we were seeing places that most people will never see."

Once she reached the sites, Daniels identified and measured invasive plant cover and took readings of the tree canopy. It was immediately clear that storm-damaged areas, especially recent ones, were brighter and more open than unaffected sites, offering more light to understory invaders such as multiflora rose, Japanese honeysuckle, Amur honeysuckle, autumn olive, and Oriental bittersweet.

Comparing the storm-affected areas from 2006, 2009, and 2017, the researchers found that invasive species cover decreased and tree cover increased over time.

"Since invasive plants decreased over time, some might interpret that as the forest healing on its own, and that they don't need to worry about it. The issue with that is, of our top five invaders, all of them demonstrate the ability to grow under closed canopy conditions. That means that even if they are being shaded out over time, they have the ability to spread into and persist in adjacent forest, potentially affecting forest regeneration," Daniels says.

Not surprisingly, the storm-damaged areas also were significantly more invaded than unaffected matching parcels, even 12 years after the first tornado hit. And larger damaged parcels were slower to recover, both in terms of decreasing invasions and increasing tree cover. The results suggest a couple of practical management recommendations.

"If you need to preferentially spend money on invasive species management, it makes more sense to focus treatment in larger blowdown areas," Daniels says. "We also found that the most common invaders in the blowdowns were also the most common invaders in unaffected forest, suggesting the disturbance is just releasing what's already present in the system. Therefore, we recommend land managers focus their treatment on the most abundant invasives in their system."

Although the research focused on forests in southern Illinois, specifically Shawnee National Forest, Giant City State Park, and Crab Orchard National Wildlife Refuge, Daniels and Larson suggest the general patterns - more invasive species and greater persistence in larger canopy gaps - are likely relevant anywhere major windstorms affect forested areas.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Scientists discover how and when a subterranean ocean emerged

Image: View of the Komati River in Barberton Mountain Land (South Africa).

Image credit: Alexander Sobolev

"The mechanism which caused the crust that had been altered by seawater to sink into the mantle functioned over 3.3 billion years ago. This means that a global cycle of matter, which underpins modern plate tectonics, was established within the first billion of the Earth's existence, and the excess water in the transition zone of the mantle came from the ancient ocean on the planet's surface," said project leader and co-author of the article Alexander Sobolev, a member of the Russian Academy of Sciences (RAS) and Doctor of Geological and Mineralogical Sciences who is a Professor at Vernadsky Institute for Geochemistry and Analytical Chemistry under the Russian Academy of Sciences.

The Earth's crust consists of large continuously moving blocks known as tectonic plates. Mountains are produced when these plates collide and rise up, and the shock of the collisions leads to earthquakes and tsunamis. These plates move very actively under the World Ocean: old oceanic crust, including the minerals that have absorbed seawater, sinks deep into the Earth's mantle. Some of this water is released again due to the effect of high temperatures and plays a role in volcanic eruptions, such as those that occur in Kamchatka, the Kuril Islands and Japan. The water that remains in minerals of the oceanic crust at higher temperatures continues to descend into the deep mantle and accumulates at a depth of 410-660 km in the structure of the minerals wadsleyite and ringwoodite, high-pressure modifications of olivine (magnesium iron silicate), the main mineral of the mantle. Experiments have shown that these minerals can contain significant quantities of water and chlorine. This is how the greatest part of the World Ocean could be "pumped" into the planet's interior over the billions of years of its existence.

This process is only a part of the global cycle of the Earth's matter, which is called convection and underpins plate tectonics, a feature that distinguishes our planet from all other bodies in the Solar System. Many scientists study this mechanism, trying to understand at which stage of the Earth's history it appeared.

In order to study the mantle of our planet and investigate its composition, geochemists (scientists who specialise in the chemical composition of the Earth and the processes of rock formation) use samples of volcanic rocks that consist of solidified magma of the mantle. This is a silicate melt enriched in volatile components, such as water, carbon dioxide, chlorine and sulphur. There are different types of magma: scientists commonly use basaltic lava (with a temperature of approximately 1200°C), but komatiitic magma, which was erupted during the early history of the Earth, is hotter (at 1500-1600°C). It can help to describe the evolution of the Earth's inner layers, as it matches the composition of the mantle more completely.

Komatiites are a type of volcanic rock that formed from komatiitic magma billions of years ago and whose composition has changed dramatically in the intervening epochs. They no longer provide information about the content of volatile components, such as water and chlorine. But these rocks still contain remnants of the magmatic mineral olivine, which trapped inclusions of solidified magma during the crystallisation process and protected them from subsequent changes. Such inclusions, just tens of microns across, retain detailed information about the composition of komatiitic melts, including the content of water and chlorine and the isotopic composition of hydrogen. In order to extract this information, inclusions of solidified magma must be heated to the natural melting point of over 1500°C and then immediately quenched to produce clear glass that can later be used for chemical analyses.

In 2016, an international group headed by scientists from the Vernadsky Institute for Geochemistry and Analytical Chemistry studied komatiitic magma of the Abitibi greenstone belt in Canada, which is 2.7 billion years old. Greenstone belts are territories consisting of magmatic rocks that contain greenish minerals. This was the first article that the team published in Nature as part of the project supported by the Russian Science Foundation grant. At that time, the scientists collected initial data on the content of water and a variety of labile elements, such as chlorine, lead and barium, in the transition zone between the upper and lower mantle layers at a depth of 410-660 km, which led them to hypothesise that an ancient subterranean water reservoir once existed that was comparable in mass to the present-day World Ocean. The scientists believe that such a quantity of water was accumulated at the early stages of the Earth's development.

"In the new article, we presented geochemical data indicating that the cycle of global immersion of oceanic crust into the mantle began much earlier than most experts believed, and it could have functioned as early as the first billion years of the Earth's history," noted Alexander Sobolev.

In the course of the work, the scientists once again investigated the composition of komatiite magma, but of a different origin: it was collected from the Barberton greenstone belt in South Africa, which is 3.3 billion years old. The magma was heated using a specialised high-temperature apparatus that can withstand temperatures of up to 1700°C. The geochemists found out that the previously discovered deep water-containing reservoir was already present in the Earth's mantle in the Palaeoarchaean era, 600 million years earlier than established in the previous study.

Credit: 
AKSON Russian Science Communication Association

Simulations fix the cracks in magnetic mirrors

When ring-shaped electromagnets are set up in linear arrangements, they can produce magnetic fields resembling a tube with a cone at each end; a structure which repels charged particles entering one cone back along their path of approach. Referred to as 'magnetic mirrors', these devices have been known to be a relatively easy way to confine plasma since the 1950s, but they have also proven to be inherently leaky. In a study published in EPJ D, physicists led by Wen-Shan Duan at Northwest Normal University, and Lei Yang at the Chinese Academy of Sciences, both in Lanzhou, China, show that these plasma leaks can be minimised if specific conditions are met. Using computer simulations, the physicists analysed the dynamic properties of a high-energy proton plasma beam within a magnetic mirror and fine-tuned the simulation settings to maximise its confinement.

Firstly, Duan, Yang, and their colleagues varied the 'mirror ratio' - defined as the strongest magnetic field in the mirror (at the tip of each cone) divided by the weakest field (on the surface of the tube). They found that higher mirror ratios, which can be achieved using finely-tuned electromagnet configurations, directly corresponded to longer confinement times and lower loss rates. Secondly, the team found that the initial conditions of the plasma beam itself had an important effect, including its density, temperature, velocity, and trajectory. When each of these properties was optimised, the simulated high-energy beam moved in a tight spiral pattern within the mirror, ensuring maximum confinement.
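
To connect the mirror ratio to confinement, the short sketch below evaluates the textbook loss-cone relation sin^2(theta_loss) = 1/R, a standard result of mirror theory rather than one taken from this paper: larger ratios shrink the loss cone and reduce the fraction of an isotropic particle population that escapes, consistent with the longer confinement times reported.

    import numpy as np

    def loss_cone_deg(mirror_ratio):
        """Half-angle of the loss cone for a given B_max / B_min."""
        return np.degrees(np.arcsin(np.sqrt(1.0 / mirror_ratio)))

    def lost_fraction(mirror_ratio):
        """Fraction of an isotropic particle population lying in the two loss cones."""
        return 1.0 - np.sqrt(1.0 - 1.0 / mirror_ratio)

    for R in (2, 5, 10):
        print(f"mirror ratio {R:2d}: loss cone {loss_cone_deg(R):5.1f} deg, "
              f"lost fraction {lost_fraction(R):.2f}")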

The insights gathered by Duan and Yang's team could solve a decades-old problem of low plasma confinement times and high loss rates in magnetic mirrors. This could make them ideal for intriguing new particle physics experiments, including the production and confinement of antihydrogen atoms and electron-positron plasmas, as well as the deceleration of high-energy antiprotons.

Credit: 
Springer