Tech

Fluorine enables separation-free 'chiral chromatographic analysis'

image: Mechanism and utilities of the 19F NMR-based separation-free enantiodifferentiation.

Image: 
ZHAO Yanchun

Researchers from the Shanghai Institute of Organic Chemistry of the Chinese Academy of Sciences recently developed a new platform for rapid chiral analysis, producing chromatogram-like output without the need for separation. The study was published in Cell Reports Physical Science.

Molecules not superimposable on their mirror images are chiral and play essential roles in the pharmaceutical industry, material sciences, and the origin of life. Distinguishing between mirror-image molecules is in great demand, but existing approaches usually achieve this goal at the cost of accuracy or efficiency.

Specifically, chromatographic methods resolve chiral molecules through easily interpretable chromatogram peaks, but require time-consuming separation. Chiroptical responsive systems allow in-situ chiral analysis, but are often plagued by sample interference issues.

In this study, the researchers developed a series of 19F-labeled chiral aluminum complexes that reversibly bind to various Lewis basic analytes.

The inclusion of chiral analytes in the confined binding pocket produced chromatogram-like 19F NMR signals, allowing rapid and unambiguous chiral analysis.

The method is operationally simple and effectively resolved a wide range of chiral molecules, including alcohols, ethers, amides, carbamates, oxazolidinones, sulfoxides, and sulfoximines.

The study shows that simultaneous analysis of three different classes of chiral compounds can be achieved in the absence of separation.

According to the researchers, the new method may also be used to determine the enantiopurity of crude reaction products and to assign absolute configurations. When used in conjunction with an autosampling NMR spectrometer, more than 1,000 chiral analyses can be performed per day.

"Such a method has the potential to simultaneously identify multiple analytes and existing interferences, which may eventually lead to real-time chiral analysis in complex biologically relevant systems," said ZHAO Yanchun, corresponding author of the study.

Credit: 
Chinese Academy of Sciences Headquarters

A path to new nanofluidic devices applying spintronics technology

Researchers in the ERATO Saitoh Spin Quantum Rectification Project in the JST Strategic Basic Research Programs have elucidated the mechanism of hydrodynamic power generation using spin currents (1) in micrometer-scale channels, finding that power generation efficiency improves drastically as the size of the flow is made smaller.

In a microchannel, the flow takes on a state referred to as laminar flow (2), in which micro-vortex-like liquid motion is distributed widely and smoothly throughout the channel. This leads to properties that are better suited to miniaturization and to an increase in power generation efficiency. Group leader Mamoru Matsuo et al. proposed the basic theory of fluid power generation using spin currents in 2017, and in the present study the researchers experimentally demonstrated the fluid power generation phenomenon in the laminar flow region. Their experiments confirmed that, in the laminar flow region, energy conversion efficiency increased by approximately 100,000 times.

The characteristics of the spin fluid power generation phenomenon in laminar flows elucidated in this research are that an electromotive force proportional to flow velocity can be obtained, and that conversion efficiency increases as flow size decreases. Also, whereas hydroelectric power generation (also known as fluid power generation) and magnetohydrodynamic power generation (3) require additional equipment such as turbines and coils, the phenomenon in this research requires almost no additional equipment, either inside or outside the flow channel. Owing to these characteristics, applications can be expected in spintronics-based nanofluidic devices, such as liquid metal flow cooling mechanisms in fast breeder reactors or semiconductor devices, as well as in flowmeters that electrically measure micro-flows.

(1) Spin current

The flow of spin angular momentum. For example, electrons have a charge (an electrical degree of freedom) and a spin angular momentum (a magnetic degree of freedom), where the flow of the former is called an electric current and the flow of the latter is called a spin current.

(2) Laminar flow

Flow within a channel is characterized primarily by flow velocity, channel size, and viscosity. In a low-velocity flow in a small channel, viscosity dominates, and the fluid flows regularly, in layers, along the channel axis. This is referred to as laminar flow.
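Whether a flow is laminar is conventionally judged by the dimensionless Reynolds number Re = ρvL/μ (density × velocity × channel size / viscosity): small Re means viscosity dominates and the flow stays laminar. A minimal sketch, using illustrative values for water in a 100-micrometer channel (not figures from the study):

```python
# Reynolds number: Re = rho * v * L / mu.
# Low Re (below roughly 2000 in a pipe) indicates laminar flow.

def reynolds(rho, v, length, mu):
    """Dimensionless Reynolds number for a channel flow."""
    return rho * v * length / mu

rho = 1000.0   # density of water, kg/m^3
mu = 1.0e-3    # viscosity of water, Pa*s
v = 0.01       # flow velocity, m/s (1 cm/s)
d = 100e-6     # channel diameter, m (100 micrometers)

print(reynolds(rho, v, d, mu))  # Re of order 1: deep in the laminar regime
```

At this scale even modest flow speeds keep Re far below the turbulent threshold, which is why microchannels sit naturally in the laminar regime described above.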

(3) Magnetohydrodynamic power generation

When a charged particle moves in a magnetic field, it is subjected to a force (Lorentz force) that is perpendicular to both the particle's direction of motion and the direction of the magnetic field. Particles with charges of the same polarity (positive or negative) are subjected to a force in the same direction, and move in one direction. As a result, electric charge accumulates at the destination of the particles' movement. Magnetohydrodynamic power generation is a power-generation method that uses the potential difference (electromotive force) generated from this accumulation.
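The magnitude of this effect can be sketched with the standard motional-EMF estimate: for a conducting fluid moving at velocity v through a magnetic flux density B across a channel of width w, the charge separation produces an electromotive force of roughly v·B·w. The numbers below are illustrative only, not values from the research:

```python
# Motional EMF across a channel in magnetohydrodynamic generation:
# the Lorentz force q * v x B drives charges of opposite sign toward
# opposite walls until the built-up field balances it, giving EMF ~ v * B * w.

def mhd_emf(v, b, w):
    """Open-circuit EMF (volts) for velocity v (m/s), field b (T), width w (m)."""
    return v * b * w

print(mhd_emf(10.0, 1.0, 0.05))  # 10 m/s flow, 1 T field, 5 cm channel: 0.5 V
```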

This research was conducted under the ERATO Saitoh Spin Quantum Rectification Project of the JST Strategic Basic Research Programs. The members of the project are as follows: Research Director, Eiji Saitoh (Professor, University of Tokyo), Group leader, Sadamichi Maekawa (senior researcher at RIKEN), Group leader, Mamoru Matsuo (former deputy chief researcher at the Japan Atomic Energy Agency, currently associate professor at the University of Chinese Academy of Sciences), Vice Group leader, Hiroyuki Chudo (deputy chief researcher at the Japan Atomic Energy Agency), Research Supporter, Ryo Takahashi (former postdoctoral researcher at the Japan Atomic Energy Agency, currently assistant professor at Ochanomizu University).

Credit: 
Japan Science and Technology Agency

The protein that stands between us and autoimmunity

image: Tet2/3-deficient B cells are activated by self-antigen and express exaggerated amounts of CD86. These B cells then stimulate autoreactive CD4+ T cells, resulting in an autoimmune response.

Image: 
Osaka University

Osaka, Japan - Our immune system is supposed to protect us from external microbial invaders, but sometimes it turns its efforts inward, potentially resulting in autoimmune diseases. In a new study, researchers from Osaka University discovered how reversible modifications to our DNA by certain proteins protect us from autoimmune diseases and, conversely, how the absence of these proteins paves the way to autoimmunity.

DNA contains all the information that cells in our body need to function, providing specific codes to produce specific proteins. Nonetheless, not all parts of DNA are accessible in all cells at all times. The regulated production of proteins ensures that different cells and organs can develop from the same DNA code. An important regulatory mechanism is the reversible addition (methylation) or removal (demethylation) of small chemical groups, so-called methyl groups, on segments of DNA. This modifies the readout of said DNA segment. Proteins of the ten-eleven translocation (Tet) family are known DNA demethylases that decrease the production of certain proteins in immune cells. How Tet proteins play into the development of autoimmune diseases has remained unknown—until now.

"Epigenetics deals with how reversible changes in DNA affect gene activity and protein expression," says corresponding author of the study Tomohiro Kurosaki. "Disrupting this machinery can have dramatic effects on cellular function. The goal of our study was to understand how epigenetic control in a specific type of immune cells, called B cells, affects the development of autoimmune diseases."

To achieve their goal, the researchers developed a novel mouse line in which B cells did not produce the epigenetic regulator proteins Tet2 and Tet3. They found that these mice developed a mild form of systemic lupus erythematosus, an autoimmune disease that can affect the joints, skin, kidneys and other organs, and for which there is currently no curative treatment. Similar to human patients, the mice showed increased serum levels of autoantibodies and damage to their kidneys, lungs and liver.

"These findings suggest that Tet2 and Tet3, as well as proteins whose expression is regulated by Tet2 and Tet3, might play a fundamental role in the development of systemic lupus erythematosus," says lead author of the study Shinya Tanaka. "We wanted to gain a deeper molecular understanding of the mechanism behind the effects of Tet2 and Tet3 on the immune system."

The researchers next investigated a different type of immune cell, called T cells, which often interact with B cells, and found that T cells were excessively activated in the Tet2/Tet3 knockout mice. By examining the molecular interaction between B and T cells closer, the researchers found that the protein CD86 was produced at higher levels in B cells of Tet2/Tet3 knockout mice, leading to aberrant T cell activation and autoimmunity.

"These are striking results that show how Tet proteins suppress autoimmune diseases by inactivating B cells and thus ultimately preventing them from attacking our bodies," says Kurosaki. "Our findings provide new insights into the contribution of epigenetics to the development of autoimmune disease. Regulating Tet proteins and their downstream effectors could be a novel treatment for autoimmune diseases."

Credit: 
Osaka University

Putting zinc on bread wheat leaves

image: Reza Keshavarz taking note of wheat growth stage.

Image: 
Photo courtesy of Reza Keshavarz

An estimated 17.3% of people worldwide are at risk of inadequate zinc intake, making zinc (Zn) deficiency a major human health concern. Increasing the Zn concentration in wheat grain is therefore highly important, and management strategies that enhance grain Zn concentration can play an important role in fighting this nutrient deficiency.

In a practice known as biofortification, researchers use agronomic practices, plant breeding, or biotechnology to increase the micronutrient content of food crops. It is an effective strategy to increase the Zn concentration in wheat grain.

In a recently published Agronomy Journal article, researchers reported the effect of zinc foliar application on the yield, protein, and grain zinc concentration of hard red spring wheat cultivars in a dryland system in Montana. Zinc sulfate was sprayed on the plant canopy at a rate of 1.12 kg Zn ha-1 either once (at heading) or twice (at heading and flowering).

Their results showed that the second application of Zn at flowering was necessary to produce grain with a Zn concentration above the target level of 40 mg kg-1 suggested by nutritionists. Zinc application also marginally increased grain yield; however, the yield increase did not offset the costs associated with Zn fertilization.

Given the need for producing grain with greater Zn concentration, price incentives or government payments are necessary to motivate farmers to adopt biofortification.

Credit: 
American Society of Agronomy

Cause of abnormal groundwater rise after large earthquake

image: Prior to the earthquake, Kumamoto City area groundwater had broad stable isotopic compositional features that included low elevation mountain springs, recharge area soil waters, and Shirakawa river waters (black frame of b, c).
After the large 2016 earthquakes, the compositional range became more limited (blue frame of b, c), and similar to the composition of the low elevation mountain springs.
a. Comparative samples, b. Groundwater samples in recharge areas, c. Groundwater samples in flow and runoff areas

Image: 
Associate Professor Takahiro Hosono

Increases in groundwater levels and volumes after large earthquakes have been observed around the world, but the details of this process have remained unclear due to a lack of groundwater data directly before and after an earthquake strikes. Fortunately, researchers from Kumamoto and Kwansei Gakuin Universities (Japan) and UC Berkeley (US) realized that they had a unique research opportunity to analyze groundwater level changes around Kumamoto City after large earthquakes struck the area in 2016.

Changes in the hydrological environment after an earthquake, like ponds or wells drying up, the sudden appearance of running water, or a rise in water levels, have been recorded since Roman times. Various theories have been proposed for the cause of such changes, such as fluctuations in pore water pressure (the pressure of groundwater held in the pores or gaps of rocks and soil), increased water permeability, and water movement through new cracks. To identify the actual cause, data must be collected from observation sites in wells, water sources, and rivers. However, especially in the case of inland earthquakes, it is generally rare for these sites to be spatiotemporally arranged in an area where a large earthquake has occurred. Additionally, it is even rarer to have enough data to compare before and after the disaster. These difficulties have been a roadblock to obtaining a clear picture of how hydrological environments change after earthquakes.

Kumamoto City, on the southern Japanese island of Kyushu, is famous for its water. Nearly 100% of the city's drinking water is sourced from groundwater in the area so there are many observation wells in the area that continuously record water level and quality data. In the early morning (Japan time) of April 16, 2016, a magnitude 7.0 earthquake struck the city which resulted in a wealth of groundwater data both before and after the earthquake. Kumamoto University researchers recognized this unique opportunity to assess how earthquakes can change hydrological environments in more detail than ever before, so they established an international collaboration to study the event.

An abnormal rise in groundwater level occurred after the main shock and was particularly noticeable in the recharge area of the groundwater flow system. Water levels peaked at around 10 meters within a year after the main shock and, although they have subsided since, were still elevated more than three years later. This was thought to be due to an inflow of water from a source outside the pre-earthquake hydrological cycle, so the researchers attempted to identify the sources using stable isotope ratios of water.

The stable isotope ratios of water on Earth's surface change slightly with various processes (evaporation, condensation, etc.) so they become unique marker values depending on location. These markers make it possible to determine the processes that affected a water sample as well as its source.
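These ratios are conventionally reported in delta notation, the per-mil deviation of a sample's isotope ratio from a reference standard (VSMOW for water): δ = (R_sample/R_standard - 1) × 1000. A brief sketch with a hypothetical sample ratio (the study's own measurements are not reproduced here):

```python
# Delta notation for stable isotopes, relative to the VSMOW water standard.
VSMOW_R_18O = 0.0020052  # 18O/16O isotope ratio of the VSMOW standard

def delta_permil(r_sample, r_standard=VSMOW_R_18O):
    """Per-mil deviation of a sample's isotope ratio from the standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A hypothetical sample slightly depleted in 18O, as is typical of
# precipitation that fell at higher elevation.
print(round(delta_permil(0.0019852), 2))  # about -9.97 per mil
```

Comparing such delta values before and after the earthquake is what allowed the researchers to tell mountain-aquifer water apart from soil water and river seepage.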

A comparison of the before-and-after sets of stable isotope ratios revealed that, prior to the earthquake, groundwater in the Kumamoto City area came mainly from low-elevation mountain aquifers, soil water in recharge areas, and seepage from the central Shirakawa river area. After the earthquake, the researchers believe that seismic fractures on the west side of Mt. Aso increased the permeability of the mountain aquifer, which released groundwater toward the recharge area of the flow system and increased water levels. Furthermore, groundwater levels in the outflow area, which had dropped immediately after the main shock, were nearly restored within just one year.

"Our research is the first to capture the hydrological environment changes caused by a large earthquake in detail," said study leader Associate Professor Takahiro Hosono. "The phenomenon we discovered can occur anywhere on Earth in areas with climate and geological conditions similar to Kumamoto. We hope our research will be useful both for academics and the establishment of guidelines for regional water use in a disaster."

Credit: 
Kumamoto University

New research examines links between religion and parental support from non-family members

image: Dr John Shaver, University of Otago

Image: 
University of Otago

"Be fruitful and multiply" says the Bible, and worldwide religious people tend to have more children than their secular counterparts. New research suggests that this "multiplying" may be the result of the higher levels of support from non-family members that church-going women receive, and that these greater levels of support are also associated with positive developmental outcomes for children.

The report Church attendance and alloparenting: An analysis of fertility, social support, and child development among English mothers, published this month in Philosophical Transactions of the Royal Society B, the world's oldest English language journal, explored how church attendance is associated with social support and fertility, and how help from outside the family influences child development.

Lead author and University of Otago Religion programme head Dr John Shaver says the research attempts to resolve a paradox.

"That religious people tend to have more children is relatively well known across the social sciences, but from an evolutionary perspective, religious communities' high fertility is puzzling."

Shaver says that previous studies have found that sibling number is negatively related to a child's cognitive and physiological development, as well as their socioeconomic success in adulthood, because parents have less time and fewer resources to invest in each child's development.

"The expectation, based on these findings, would be that due to differences in family sizes, children born to religious parents would exhibit poorer developmental outcomes than children born to secular parents. There haven't really been studies that compare the success of religious and secular children, but the available evidence suggests that children born to religious parents fare just as well as those born to secular parents. We've been interested in explaining this paradox of religious fertility."

Using 10 years of data collected from the Children of the 90s health study, the report's authors tested the hypotheses that religious cooperation extends to alloparenting (investment in children by people other than the child's parents) and that higher levels of social support for religious mothers were associated with their fertility and their children's development.

The study found that mothers who received help from members of their congregation had higher fertility over time.

The research also confirmed that children with more siblings scored lower on three cognitive tests: when they entered school (aged 4-5), one year later (aged 5-6), and when they were eight.

"Our study reveals known biases in these and similar cognitive tests - such as that the children of wealthier and better educated mothers scored higher on these tests. We found, though, that a mother's social support and aid from co-religionists were both associated with higher child test scores, particularly at later stages of development. This suggests that a woman's social networks positively affect her child's cognitive development, and our analyses also suggest that religious women have stronger support networks," Dr Shaver says.

Dr Shaver says while the findings only supported some hypotheses, they were mostly consistent with the idea that religions in modern environments support cooperative breeding strategies: women who receive help from members of their congregation have higher fertility, and this aid, as well as more general forms of social support, were both associated with improved child cognitive development.

"By positively influencing social support, religion in the UK may help some women have more children, without sacrificing the success of these children."

Researching the evolutionary dynamics surrounding religion's influence on family size and child success is not just of interest to the scholarly community.

"Due to its relevance for economic and social development, health, and demographic projections, we expect our project will be of significant interest to governments, NGOs, and public policy officials," he says.

Credit: 
University of Otago

Learn from the pandemic to prevent environmental catastrophe, scientists argue

The dynamics of the SARS-CoV-2 pandemic share "striking similarities" with the twin environmental crises of global heating and species extinction, argue a team of scientists and policy experts from the UK and US.

They say that lessons learned the hard way in containing COVID-19 - the need for early intervention to reduce death and economic damage; the curbing of some aspects of people's lifestyles for the good of all of us - should also be at the heart of averting environmental catastrophe.

"We've seen the consequences of delayed action in the fight against COVID-19. The consequences of continued inaction in the face of catastrophic climate change and mass extinction are too grave to contemplate," said Prof Andrew Balmford, from the University of Cambridge's Department of Zoology.

Writing in the journal Current Biology, Balmford and colleagues argue that the spread of coronavirus shares common characteristics with both global heating and the impending "sixth mass extinction".

For example, each new COVID-19 case can spawn others and so lead to escalating infection rates, just as hotter climates alter ecosystems, increasing emissions of the greenhouse gases that cause warming. "Both are dangerous feedback loops," argue the scientists.

The team also draw comparisons of what they term "lagged impacts". For coronavirus, the delay - or lag - before symptoms materialise means infected people spread the disease long before they feel effects and change behaviour.

The researchers equate this with the lag between our destruction of habitat and eventual species extinction, as well as lags between the emissions we pump out and the full effects of global heating, such as sea-level rise. As with viral infection, behaviour change may come too late.

"Like the twin crises of extinction and climate, the SARS-CoV-2 pandemic might have seemed like a distant problem at first, one far removed from most people's everyday lives," said coauthor Ben Balmford from the University of Exeter.

"But left unchecked for too long, the disease has forced major changes to the way we live. The same will be true of the environmental devastation we are causing, except the consequences could be truly irreversible."

The authors find parallels in the indifference that has long greeted warnings from the scientific community about both new zoonotic diseases and human-induced shifts in climate and habitat.

"The lagged impacts, feedback loops and complex dynamics of pandemics and environmental crises mean that identifying and responding to these challenges requires governments to listen to independent scientists," said Dr Brendan Fisher, a coauthor from the University of Vermont. "Such voices have been tragically ignored."

The similarities between the SARS-CoV-2 pandemic and environmental disaster lie not just in their nature but also in their mitigation, say the scientists, who write that "there is no substitute for early action".

The researchers include an analysis of the timing of lockdown across OECD countries, and conclude that if it had come just a week earlier then around 17,000 lives in the UK (up to 21 May 2020) would have been saved, and nearly 45,000 in the US.

They say that, just as delayed lockdown cost thousands of lives, delayed climate action that gives us 2°C of warming rather than 1.5°C will expose an estimated extra 62-457 million people - mainly the world's poorest - to "multi-sector climate risks" such as drought, flooding and famine.

Similarly, conservation programmes are less likely to succeed the longer they are delayed. "As wilderness disappears we see an accelerating feedback loop, as a given loss of habitat causes ever-greater species loss," explained Princeton Professor and co-author David Wilcove.

The scientists point out that delayed action resulting in more COVID-19 deaths will also cost those nations more in economic growth, according to IMF estimates, just as hotter and more disruptive climates will curtail economic prosperity.

Intervening to contain both the pandemic and the environmental crises requires decision-makers and citizens to act in the interests of society as a whole, argue the researchers.

"In the COVID-19 crisis we've seen young and working age people sacrificing education, income and social connection primarily for the benefit of older and more vulnerable people," said co-author Prof Dame Georgina Mace from UCL.

"To stem the impacts of climate change and address biodiversity loss, wealthier and older adults will have to forgo short-term material extravagance for the benefit of the present-day poor and future generations. It's time to keep our end of the social bargain," Mace said.

Cambridge's Andrew Balmford added: "Scientists are not inventing these environmental threats, just as they weren't inventing the threat of a pandemic such as COVID-19. They are real, and they are upon us."

Credit: 
University of Cambridge

To listen is to survive: Unravelling how plants process information

Plants must constantly integrate information on the availability of water and nutrients, or about the presence of pathogens, to produce the fruits and seeds needed for reproduction and heavily used for human consumption. Given the increasing threat of droughts and the requirements of sustainable plant protection, it is important to better understand the molecular mechanisms behind plants' information processing. Different plant hormones have long been known to trigger molecular signaling pathways that result in developmental transitions like fruit ripening or drought response. While these signaling pathways are well studied, exactly how information is exchanged between them has remained enigmatic.

Hundreds of new information exchange points identified

A team at the Institute of Network Biology at Helmholtz Zentrum München, with the participation of LMU biologists, charted the molecular protein network of plants by experimentally testing more than 17 million protein pairs for physical interactions using a next-gen robotics pipeline combined with the latest bioinformatics methods. The resulting network of more than 2,000 observed protein interactions was analyzed using mathematical approaches from statistics and graph theory to find the signaling pathways and potential information exchange points. In this way the researchers identified hundreds of such points that were not known before.
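As a toy illustration of this kind of graph analysis (the protein names and pathway labels below are hypothetical, not from the study), interactions whose two proteins carry different pathway annotations can be flagged as candidate information exchange points:

```python
# Hypothetical interaction map and pathway annotations.
interactions = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D")]
pathway = {"A": "auxin", "B": "auxin", "C": "jasmonate", "D": "ethylene"}

# An interaction bridging two differently annotated pathways is a
# candidate point of inter-pathway information exchange.
crosstalk = [(p, q) for p, q in interactions if pathway[p] != pathway[q]]
print(crosstalk)  # [('B', 'C'), ('C', 'D'), ('A', 'D')]
```

The real analysis works on thousands of interactions and uses statistical tests rather than a simple label comparison, but the underlying idea of looking for edges that bridge annotated pathways is the same.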

Most proteins function in multiple signaling pathways

Next, by using genetic tests, they could show that all tested information exchange points between proteins that were thought to function in single signaling pathways, in fact, organize the communication between different pathways. "This was one of the most striking new insights from this study: Most proteins function in multiple signaling pathways. Moreover, in contrast to single-gene analyses, our results revealed the high degree in which different pathways are physically and functionally intertwined. We believe that this is a fundamental principle and we need to pay more attention to it", says Dr. Melina Altmann, first-author of the study.

Future-proof plants

Prof. Pascal Falter-Braun, Director at the Institute of Network Biology and professor at LMU adds: "This insight might open new strategies for biotechnological development or breeding of plants to address the challenges of climate change in farming. We might be able to redirect the information in crops such that the plants require less fertilizer or pesticides or are more resistant against droughts."

Funding and collaboration

The finding builds on long-term research at the Institute of Network Biology on understanding molecular networks in plants and humans. The project was funded by the DFG via SFB924 "Molecular mechanisms regulating yield and yield stability in plants" and by an ERC consolidator grant awarded to Prof. Pascal Falter-Braun. For this study, the group collaborated with groups from the School of Life Science at the Technical University of Munich (TUM), colleagues from the Department of Environmental Sciences at the Helmholtz Zentrum München, and the University of Warwick, UK.

Credit: 
Helmholtz Munich (Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH))

SUNY Downstate study finds wide variation in trust of health information by Hispanics

BROOKLYN, N.Y. (July 1, 2020) - Hispanic adults vary widely in their reported trust of health information sources, suggesting that information tailored to specific ethnic subgroups and targeted by age group may be beneficial, according to results of a study by SUNY Downstate Assistant Professor Marlene Camacho-Rivera, MS, MPH, ScD. The study is highlighted in the July 2020 issue of Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

Hispanics comprise the largest ethnic minority in the United States, with significant disparities in cancer risk and survival, both between Hispanic Americans and whites, and among Hispanic subgroups.

"Lack of knowledge about cancer services, exacerbated by relatively limited access to those services, is considered a major contributor to those disparities," said the study's lead author, Marlene Camacho-Rivera, MS, MPH, ScD, assistant professor in the SUNY Downstate School of Public Health Department of Community Health Sciences. "Our aim was to assess trust in health information across various sources and evaluate how that trust may vary by gender, age, ethnic background, and socioeconomic background."

Dr. Camacho-Rivera and colleagues examined data from the National Cancer Institute's Health Information National Trends Survey (HINTS), a nationally representative data collection program. HINTS oversamples African-American and Hispanic households using U.S. Census data. The study reflects data from 1,521 people, 46 percent of whom were Mexican or Mexican American; 16 percent were Cuban or Puerto Rican; and 37 percent were of other Hispanic backgrounds.

Respondents reported the highest levels of trust in healthcare professionals, with 91 percent saying they had a high level of trust. The next most trusted sources were government health agencies (68 percent), the Internet (63 percent), and charitable organizations (53 percent).

"The findings of this study will help to narrow the racial disparities that minority communities continue to experience," said SUNY Downstate School of Public Health dean Kitaw Demissie, M.D., Ph.D. "Although many health messaging sources are identified by the Latino community as trustworthy, targeting those that are most reliable to disseminate appropriate health messages, and educating the community to focus on those health messaging sources, are likely to have the highest impact on reducing health disparities".

Dr. Camacho-Rivera notes that 84 percent of the Hispanic population is now routinely using the Internet, but they are more likely than whites to lose Internet access due to cost, and more likely to report frustration in their information-seeking.

"As a Latina, I want fellow Hispanics to know that not all health information may be credible and evidence-based," Dr. Camacho-Rivera said. "It is important to ask questions of healthcare providers in order to make informed decisions."

Camacho-Rivera said the results of the study indicate that the Hispanic community would benefit from culturally-tailored health information to narrow health disparities in cancer and other chronic conditions, such as diabetes and cardiovascular diseases.

"While we have seen increases in health information-seeking due to increased access to the Internet, smartphones, and social media, we also recognize the potential for technology to exacerbate health disparities," she continued. "It is not enough for us to simply put out tailored information and expect individuals to act on it; we must also support community spaces, public health programs, and social policies that can help people benefit from the information."

Credit: 
SUNY Downstate Health Science University

Alarming long-term effects of insecticides weaken ant colonies

image: Young colony of the black garden ant (Lasius niger) with queen, workers and brood (eggs, larvae and pupae) in a nesting tube.

Image: 
© Daniel Schlaeppi

"Ants are one of the most important animal groups on our planet. However, they are also affected by the recently observed global declines in abundance and diversity of insects," says Daniel Schläppi of the Institute of Bee Health of the University of Bern, main author of the study. Evidence suggests that pesticides are among the factors responsible for the observed declines. "One problem with these substances is their persistence and their potential to contaminate soils and water, even in areas in which they are not applied," says co-author Gaétan Glauser from the University of Neuchâtel.

But so far, no data existed to show how exposure to low concentrations, which do not induce direct mortality, affects ants in the long run. The data, collected at the University of Bern in cooperation with Agroscope and the University of Neuchâtel, clearly demonstrate previously overlooked long-term effects that are not detectable during the first year of colony development. The results are published in Communications Biology, an open-access journal of Nature. According to the authors, the study highlights the importance of developing sustainable agricultural practices that reduce the use of agro-chemicals, in order to prevent irreparable damage to natural ecosystems.

Worrying long-term impacts

Thiamethoxam, a neonicotinoid insecticide used to combat pest insects that threaten harvests, has a clear negative impact on the health of ants. Unfortunately, there is mounting evidence that thiamethoxam and similar agro-chemicals have negative consequences for other beneficial insects, including ants and honey bees.

"With our study we show that ants, which play very important roles in our ecosystems and provide valuable ecosystem services such as natural pest control, are negatively affected by neonicotinoids too," says Schläppi.

In the present work, colonies of black garden ants were chronically exposed to field-realistic concentrations of thiamethoxam over 64 weeks. Colonies were raised in the laboratory from queens captured in the field. Before the colonies' first overwintering, no effect of neonicotinoid exposure on colony strength was visible. By the second overwintering, however, it became apparent that colonies exposed to thiamethoxam were significantly smaller than control colonies. Because the number of workers is a very important factor for the success of an ant colony, the observed effects are likely to compromise colony survivorship. Considering the important role of ants in natural ecosystems, the results indicate that neonicotinoids pose a threat to ecosystem functioning.

The call for sustainable solutions

"The accumulating long-term impact of neonicotinoids on ants is alarming," says Prof. Peter Neumann of the Institute of Bee Health at the University of Bern. "This is an exemplary study showing how the negative effects of an environmental contaminant only become visible after long monitoring, but with potentially far-reaching consequences." The authors therefore stress the importance of including ants as model organisms and of fully incorporating long-term effects into future risk assessment schemes for more sustainable agriculture.

Credit: 
University of Bern

Study: 35% of excess deaths in pandemic's early months tied to causes other than COVID-19

image: In March and April of 2020, mortality rates from (clockwise from top left) heart disease, diabetes, Alzheimer's disease and stroke spiked in states that also had the most COVID-19 deaths: Massachusetts, Michigan, New Jersey, New York -- particularly New York City -- and Pennsylvania. (Courtesy of JAMA)

Image: 
(Courtesy of JAMA)

RICHMOND, Va. (July 1, 2020) -- Since COVID-19's spread to the United States earlier this year, death rates in the U.S. have risen significantly. But deaths attributed to COVID-19 only account for about two-thirds of the increase in March and April, according to a study published Wednesday in the Journal of the American Medical Association.

Researchers at Virginia Commonwealth University and Yale University found that, from March 1 to April 25, the U.S. saw 87,001 excess deaths -- or deaths above the number that would be expected based on averages from the previous five years. The study, "Excess Deaths from COVID-19 and Other Causes, March-April 2020," showed that only 65% of the excess deaths that occurred in March and April were attributed to COVID-19, meaning more than one-third were linked to other causes.
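As a back-of-the-envelope check, the headline figures can be reproduced with simple arithmetic. This is an illustrative sketch using the totals reported in the article, not the study's actual statistical model, which builds a seasonally adjusted baseline from five prior years of mortality data:

```python
# Back-of-the-envelope arithmetic behind the headline figures.
# Numbers are the totals reported in the article; the study's real
# baseline is a seasonally adjusted expectation, not a flat average.

excess_deaths = 87_001   # deaths above the expected baseline, Mar 1 - Apr 25
covid_share = 0.65       # fraction of excess deaths attributed to COVID-19

covid_excess = round(excess_deaths * covid_share)
other_excess = excess_deaths - covid_excess

print(f"Attributed to COVID-19: {covid_excess:,}")   # about 56,551
print(f"Linked to other causes: {other_excess:,}")   # about 30,450
```

The roughly 30,000 deaths in the second line are the "more than one-third" the study links to causes other than COVID-19.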

In 14 states, including two of the most populated -- California and Texas -- more than half of the excess deaths were tied to an underlying cause other than COVID-19, said lead author Steven Woolf, M.D., director emeritus of VCU's Center on Society and Health.

This data, Woolf said, suggests the COVID-19 death counts reported to the public underestimate the true death toll of the pandemic in the U.S.

"There are several potential reasons for this under-count," said Woolf, a professor in the Department of Family Medicine and Population Health at VCU School of Medicine. "Some of it may reflect under-reporting; it takes a while for some of these data to come in. Some cases might involve patients with COVID-19 who died from related complications, such as heart disease, and those complications may have been listed as the cause of death rather than COVID-19.

"But a third possibility, the one we're quite concerned about, is indirect mortality -- deaths caused by the response to the pandemic," Woolf said. "People who never had the virus may have died from other causes because of the spillover effects of the pandemic, such as delayed medical care, economic hardship or emotional distress."

Woolf and his team found that deaths from causes other than COVID-19 rose sharply in the states that had the most COVID-19 deaths in March and April. Those states were Massachusetts, Michigan, New Jersey, New York -- particularly New York City -- and Pennsylvania. At COVID-19's peak for March and April (the week ending April 11), diabetes deaths in those five states rose 96% above the expected number of deaths when compared to the weekly averages in January and February of 2020. Deaths from heart disease (89%), Alzheimer's disease (64%) and stroke (35%) in those states also spiked.

New York City's death rates alone rose a staggering 398% from heart disease and 356% from diabetes, the study stated.

Woolf said he and his team suspect that some of these were indirect deaths from the pandemic that occurred among people with acute emergencies, such as a heart attack or stroke, who may have been afraid to go to a hospital for fear of getting the virus. Those who did seek emergency care, particularly in the areas hardest hit by the virus, may not have been able to get the treatment they needed, such as ventilator support, if the hospital was overwhelmed by the surge.

Others may have died from a chronic health condition, such as diabetes or cancer, that was exacerbated by the effects of the pandemic, said Woolf, VCU's C. Kenneth and Dianne Wright Distinguished Chair in Population Health and Health Equity. Still others may have struggled to deal with the consequences of job loss or social isolation.

"We can't forget about mental health," Woolf said. "A number of people struggling with depression, addiction and very difficult economic conditions caused by lockdowns may have become increasingly desperate, and some may have died by suicide. People addicted to opioids and other drugs may have overdosed. All told, what we're seeing is a death count well beyond what we would normally expect for this time of year, and it's only partially explained by COVID-19."

Woolf and his co-authors, Derek Chapman, Ph.D., Roy Sabo, Ph.D., and Latoya Hill of VCU, and Daniel M. Weinberger, Ph.D., of Yale University, state that further investigation is needed to determine just how many deaths were from COVID-19 and how many were indirect deaths "caused by disruptions in society that diminished or delayed access to health care and the social determinants of health (e.g., jobs, income, food security)."

Woolf, also a family physician, said this paper's results underscore the need for health systems and public officials to make sure services are available not only for COVID-19 but for other health problems. His study showed what happened in the states that were overwhelmed by cases in March and April. Woolf worries that the same spikes in excess deaths may now be occurring in other states that are being overwhelmed.

"The findings from our VCU researchers' study confirm an alarming trend across the U.S., where community members experiencing a health emergency are staying home -- a decision that can have long-term, and sometimes fatal, consequences," said Peter Buckley, M.D., interim CEO of VCU Health System and interim senior vice president of VCU Health Sciences. "Health systems nationwide need to let patients know it is safe and important to seek care in a health emergency, whether it's through telehealth or in person."

Woolf, who serves in a community engagement role with the C. Kenneth and Dianne Wright Center for Clinical and Translational Research, said resources should be available for those facing unemployment, loss of income and food and housing insecurity, including help with the mental health challenges, such as depression, anxiety or addiction, that these hardships could present.

"Public officials need to be thinking about behavioral health care and ramping up their services for those patients in need," Woolf said. "The absence of systems to deal with these kinds of other health issues will only increase this number of excess deaths."

Credit: 
Virginia Commonwealth University

Energy-saving servers: Data storage 2.0

image: Diagram of a device architecture which employs the piezoelectric effect.

Image: 
ill./©: Kläui Lab

Whether it's sending the grandparents a few pictures of the kids, streaming a movie or music, or surfing the Internet for hours, the volume of data our society generates is increasing all the time. But this comes at a price, since storing data consumes huge amounts of energy. Assuming that data volumes continue to grow in future, the related energy consumption will also increase by several orders of magnitude. For example, it is predicted that energy consumption in the IT sector will rise to ten petawatt-hours, or ten trillion kilowatt-hours, by 2030. This would be equivalent to around half of the electricity produced worldwide.

But what can be done to reduce the amount of power needed by servers to function? Data is usually stored in a storage layer with the help of magnetization. To write or delete the data, electric currents are passed through ferromagnetic multilayer structures, where the flowing electrons generate an effective magnetic field. The magnetization in the storage layer "senses" this magnetic field and changes its direction accordingly. However, each electron can only be used once. An important step forward in energy-efficient data storage involves the construction of a ferromagnetic storage layer that includes a heavy metal such as platinum. As the current flows through the heavy metal, the electrons switch back and forth between the heavy metal and the ferromagnetic layer. The great advantage of this technique is that the electrons can be re-used multiple times, and the current required to write the data decreases by a factor of up to a thousand.

Doubling the efficiency of the storage process

A team of researchers at Johannes Gutenberg University Mainz (JGU), working in collaboration with researchers from Forschungszentrum Jülich, has now found a way to double the efficiency of this storage process once again. "Instead of using simple silicon as a substrate, as is usual practice, we employ a piezoelectric crystal," explained JGU scientist Mariia Filianina. "We attach the heavy metal layer and the ferromagnetic layer to this." If an electric field is then applied to the piezoelectric crystal, it generates mechanical strain in the crystal. This in turn increases the efficiency of the magnetic switching of the storage layer, the element in which the data is actually stored. The degree of enhancement is determined by the system and by the strength of the electric field. "We can directly measure the change in efficiency and consequently adjust the appropriate field strength - actually on the fly," said Filianina. In other words, it is possible to directly control the efficacy of the magnetic switching process by adjusting the strength of the electric field to which the piezoelectric crystal is exposed.

This not only comes with a significant reduction of energy consumption but also makes possible the use of complex architectures for information storage. The researchers propose that if the electric field is only applied to a small area of the piezoelectric crystal, the switching efficiency will only be increased at that location. If they now adjust the system so that the spin torques of the electrons can only be switched when the strain is amplified in the piezoelectric crystal, they can change the magnetization locally. "Using this method, we can easily realize multilevel memories and complex server architectures," stated Filianina, a doctoral candidate at the Materials Science in Mainz Graduate School of Excellence and the Max Planck Graduate Center.

"I am pleased that the collaboration with our colleagues at Jülich is working so well. Without the help of their theoretical analysis we would not be able to explain our observations. I am looking forward to continuing to work with them in connection with the recent jointly-obtained ERC Synergy Grant," emphasized Professor Mathias Kläui, who coordinated the experimental work.

Credit: 
Johannes Gutenberg Universitaet Mainz

Laser takes pictures of electrons in crystals

image: Electrons in the crystal of calcium fluoride. Credit: Christian Hackenberger

Image: 
University of Rostock

The researchers used powerful laser flashes to irradiate thin films of crystalline materials. These laser pulses drove crystal electrons into a fast wiggling motion. As the electrons bounced off the surrounding electrons, they emitted radiation in the extreme ultraviolet part of the spectrum. By analyzing the properties of this radiation, the researchers composed pictures that illustrate how the electron cloud is distributed among atoms in the crystal lattice of solids, with a resolution of a few tens of picometers - a picometer is a billionth of a millimeter. The experiments pave the way toward developing a new class of laser-based microscopes that could allow physicists, chemists, and materials scientists to peer into the details of the microcosm with unprecedented resolution, and to deeply understand and eventually control the chemical and electronic properties of materials. (Nature, July 1, 2020)

For decades scientists have used flashes of laser light to understand the inner workings of the microcosm. Such laser flashes can now track ultrafast microscopic processes inside solids. Still, they cannot spatially resolve electrons - that is, see how electrons occupy the minute space among atoms in crystals and how they form the chemical bonds that hold atoms together. The reason has long been known: it was discovered by Abbe more than a century ago. Visible light can only discern objects comparable in size to its wavelength, which is approximately a few hundred nanometers. To see electrons, microscopes would have to increase their magnification power a few thousand times.
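Abbe's argument can be stated as a formula. For a microscope with numerical aperture NA, the smallest resolvable feature is roughly (a standard textbook statement, not taken from the article itself):

```latex
d \approx \frac{\lambda}{2\,\mathrm{NA}}
```

With visible light (wavelength roughly 400-700 nanometers) and NA near 1, the resolvable feature size is on the order of 200-350 nanometers, whereas electron-density features between atoms measure tens of picometers. Bridging that gap requires a resolving power a few thousand times greater, which is the factor the article cites.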

To overcome this limitation, Goulielmakis and coworkers took a different path. They developed a microscope that works with powerful laser pulses, dubbing their device the Light Picoscope. "A powerful laser pulse can force electrons inside crystalline materials to become the photographers of the space around them." When the laser pulse penetrates the crystal, it can grab an electron and drive it into a fast-wiggling motion. "As the electron moves, it feels the space around it, just like your car feels the uneven surface of a bumpy road," said Harshit Lakhotia, a researcher in the group. When a laser-driven electron crosses a bump made by other electrons or atoms, it decelerates and emits radiation at a frequency much higher than that of the laser. "By recording and analyzing the properties of this radiation, we can deduce the shape of these minute bumps, and we can draw pictures that show where the electron density in the crystal is high or low," said Hee-Yong Kim, a doctoral researcher in the Extreme Photonics Labs. "Laser picoscopy combines the capability of peering into the bulk of materials, like X-rays, and that of probing valence electrons. The latter is possible with scanning tunneling microscopes, but only on surfaces."

"With a microscope capable of probing the valence electron density, we may soon be able to benchmark the performance of computational solid-state physics tools," said Sheng Meng, from the Institute of Physics, Beijing, and a theoretical solid-state physicist in the research team. "We can optimize modern, state-of-the-art models to predict the properties of materials in ever finer detail. This is an exciting aspect that laser picoscopy brings in," he continues.

Now the researchers are working on developing the technique further. They plan to probe electrons in three dimensions and to further benchmark the method with a broad range of materials, including 2D and topological materials. "Because laser picoscopy can readily be combined with time-resolved laser techniques, it may soon become possible to record real movies of electrons in materials. This is a long-sought goal in ultrafast sciences and microscopies of matter," Goulielmakis concludes.

Credit: 
University of Rostock

Long-term culture of human pancreatic slices reveals regeneration of beta cells

image: A non-profit organization supporting the Diabetes Research Institute at the University of Miami Miller School of Medicine.

Image: 
Diabetes Research Institute Foundation

MIAMI, FL - June 29, 2020 - Scientists from the Diabetes Research Institute (DRI) at the University of Miami Miller School of Medicine have developed a method allowing for the long-term culture of "pancreatic slices" to study the regeneration of the human pancreas in real time. The results, published this week in the journal Nature Communications, demonstrate for the first time that extended cultures of near-intact human pancreatic tissue retain the ability of the live organ to replenish insulin-producing beta cells. The use of this system as a model to study pancreatic regeneration could have important therapeutic implications for the treatment of diabetes.

Pancreatic slices are very thin live sections of the organ that preserve the cellular architecture and cell-to-cell interactions of the native organ. The generation of slices from human donors was first accomplished at the DRI in the context of an initiative sponsored by the Network for Pancreatic Donors with Diabetes (nPOD). However, slices are fragile and usually disintegrate in culture very quickly due to self-digestion and lack of sufficient oxygenation. Such rapid deterioration was incompatible with the study of the regeneration of insulin-producing beta cells, which are destroyed by autoimmunity in type 1 diabetes. However, DRI scientists circumvented this problem by placing the slices on a culture device that enhances the oxygenation of the tissue. This led to the extended survival and biological function of the slices for many days.

"The ability to keep human pancreatic slices alive for nearly two weeks is a technical breakthrough that allows us to witness the regeneration of beta cells in a human model that strongly resembles the real pancreas," said Dr. Juan Domínguez-Bendala, Director of Stem Cell Development for Translational Research and Research Associate Professor of Surgery at the Diabetes Research Institute, University of Miami Miller School of Medicine, and principal investigator of this work. "We now have a window into the native human organ that simply wasn't possible prior to this study."

The DRI's research on regeneration focused on a population of stem-like "progenitor" cells that had been previously characterized by this team in the human pancreas - a pool of cells that remains intact even in patients with long-standing type 1 diabetes. These cells were shown to respond to a natural growth factor, BMP-7, by proliferating and subsequently giving rise to insulin-producing beta cells.

"Extending the life of these slices in culture was key to observe beta cell regeneration in real time following the addition of BMP-7, even in slices obtained from diabetic donors. This gives us hope that we may be able to apply this approach to living patients one day," said Ricardo Pastori, Ph.D., Research Professor of Medicine, Immunology, and Microbiology and the Director of the Molecular Biology Laboratory at the Diabetes Research Institute, University of Miami Miller School of Medicine and co-principal investigator of this study.

The findings and system utilized here will continue to impact future studies of the human pancreas. Considering that preclinical work in animals often translates poorly to humans, this in vitro model of the human pancreas could be used to screen for additional regenerative agents with an unparalleled degree of accuracy when it comes to predicting responses in patients. This could lead to a faster and more certain path to clinical trials for type 1 and type 2 diabetes.

Credit: 
Diabetes Research Institute Foundation

UM Bio Station researchers unlock mystery of subterranean stoneflies

image: FLBS summer intern Grant Marshall holds benthic Pteronarcys californica stoneflies found in surface waters of the Nyack floodplain.

Image: 
UM photo

FLATHEAD LAKE - In a new study published in the scientific journal Ecology, researchers from the University of Montana's Flathead Lake Biological Station may have unlocked a mystery surrounding unique aquatic insects in the Flathead watershed.

"There's a surprising adaptation of stoneflies in alluvial aquifers that allows them to use low-oxygen or oxygen-free environments," said FLBS researcher Rachel Malison, lead author on the study. "These aquifers are hotspots of biodiversity, and this study highlights the vital role gravel-bed river floodplains play on the landscape."

River floodplains are among the most biodiverse landscapes on earth. They provide an important habitat for aquatic and terrestrial organisms, and their aquifers (i.e., shallow groundwater beneath and adjacent to the river) are key components of complex ecosystems worldwide. The Nyack floodplain of the Middle Fork Flathead River outside Glacier National Park, for instance, sustains everything from microbes to grizzly bears and is home to over half of the 100-plus species of stoneflies known in the state of Montana.

But there's a unique mystery at work within these river floodplains. Out of sight and under the surface, alluvial aquifers are composed of unconsolidated materials and offer limited sources of carbon for sustaining organisms and food webs. Alluvial aquifers also can contain extreme environmental conditions and an abundance of methane gas, which is typically produced in freshwater ecosystems within anoxic (zero-oxygen) or hypoxic (significantly low-oxygen) environments.

To this point, most stoneflies have been thought to require highly oxygenated water to survive. But in the alluvial aquifer of the Nyack floodplain, large populations of subterranean stoneflies thrive in low-oxygen environments, and significant portions of their biomass carbon derive from methane.

How these stoneflies survive, and possibly access food, in such an inhospitable, low-oxygen environment is the question Malison and her team of researchers set out to address.

"It was in the early 1990s that [FLBS researcher] Bonnie Ellis first discovered that a species of stonefly in the Nyack floodplain had the ability to survive anoxia exposure, and it's been a mystery ever since," Malison said. "No other stoneflies have this adaptation, so we wanted to investigate to better understand how large populations of stoneflies might be supported in aquifer food webs."

Through the course of their study, Malison and her fellow researchers tested the anoxic and hypoxic responses of nearly 2,500 stonefly individuals in three alluvial aquifer species and nine river species. Compared to their surface-dwelling relatives, the aquifer stoneflies performed better in low-oxygen and oxygen-free conditions, surviving an average of three times longer than their above-ground counterparts.

Additionally, the aquifer stoneflies were still able to keep moving and crawling when exposed to 76 hours without oxygen, which has important implications for how these species may be able to access different food resources in the aquifer.

Delving into the DNA of the stoneflies, the researchers showed that the aquifer stoneflies have gene sequences for hemocyanin, an oxygen-transport respiratory protein, which could represent a possible mechanism for the stoneflies' ability to survive at low-oxygen levels.

The results of the study show that subterranean stoneflies likely are able to exploit rich carbon resources in anoxic zones, which may explain their extraordinarily high abundance in gravel-bed floodplain aquifers. Additionally, their remarkable ability to perform well in low-oxygen and oxygen-free conditions is unique within the entire order of stoneflies.

It's a discovery that suggests unconventional and surprising methane sources likely support a crucial component of biodiversity and productivity in floodplains all over the world.

"These findings begin to help us understand how vulnerable different stoneflies might be to climate change," Malison said. "As waters warm they contain less oxygen, potentially causing stress and negatively influencing populations of the more sensitive species."

Credit: 
The University of Montana