
RUDN University doctors uncover the role of macrophages in liver regeneration

image: RUDN University doctors determined what role macrophages play in the recovery of the liver after removal of a significant part of the organ.

Image: 
RUDN University

RUDN University doctors have determined what role macrophages play in the recovery of the liver after removal of a significant part of the organ. The results are published in the journal Biomedicine & Pharmacotherapy.

The liver is the most regenerative internal organ in mammals: it can restore its original size from as little as 25% of the preserved tissue. Macrophages, cells that can engulf and digest particles, play an important role in this process. It is known, for example, that when the liver is damaged by foreign substances, including drugs, macrophages migrate to the organ, absorb harmful microorganisms and dead cells, and trigger inflammation, thereby contributing to the organ's restoration. However, it remains unclear how macrophages affect the regrowth of the liver after resection, that is, the removal of a large part of the organ. RUDN University doctors investigated this question in an experiment with laboratory mice.

"The role of macrophages in the liver growth after massive resections is uncertain. Some studies reveal the lack of immigration of macrophages to the liver during its recovery from partial resection, whereas other studies demonstrate such possibility. So, we focused our study on the macrophage population dynamics after 70% liver resection in mouse mode", Andrey Elchaninov, MD, PhD, researcher at th Department of Histology, Cytology and Embryology of RUDN University.

The doctors used 184 laboratory mice of the BALB/c line. In 132 of them, they removed 70% of the liver. Immediately after the operation, and then one day, three days, and a week later, the scientists took liver samples for analysis. The cells were studied using an immunohistochemical method: the sections were labelled with specific antibodies to the glycoproteins CD68 and CD206 and other compounds found on the surface of macrophages. The antibodies carry fluorescent dyes and glow when attached to macrophages, so their number can be counted. The RUDN University doctors also measured the proliferation and death rates of the macrophages.

It turned out that after resection, a large number of macrophages migrate to the liver. For example, one day after surgery, the number of CD68-positive macrophages in the liver doubles, and this increase persists after a week. The resection also led to significant changes in the ratio of different macrophage types: the proportion of Ly6C-positive cells increased 4-fold in the week after surgery, from 5% to 22%, while the proportion of CD86-positive cells fell from 50% to 15%. The role of these macrophages is ambiguous. On the one hand, they release chemicals (chemoattractants) that attract the white blood cells responsible for the body's inflammatory response. On the other hand, they regulate the proliferation of liver cells and the metabolism of the organ.

"Corresponding profiles of macrophages in regeneration liver cannot be unambiguously defined as pro- or anti-inflammatory. Their typical features include elevated expression of leukocyte chemoattractant factors, and many of the differentially expressed sequences are related to the control of cell growth and metabolic processes in the liver. Our findings revealed essential roles of macrophages and macrophages proliferation in the mouse liver during its recovery from a massive resection", from RUDN University.

Credit: 
RUDN University

Third of Americans use gray market caregivers to aid the elderly and those with dementia

Nearly a third of Americans who arranged for paid care for an older person or someone with dementia employed workers who were not hired through a regulated agency, according to a new RAND Corporation study.

Individuals who hired gray market caregivers were less likely to be employed and more likely to also use unpaid care for their family members. In addition, people who lived in rural areas had almost five times higher odds of arranging dementia care through gray markets than those who lived in urban areas.
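As a rough illustration of what "five times higher odds" means, the sketch below computes an odds ratio from a 2x2 table of respondents. All counts are invented for illustration; they are not the RAND survey data.

```python
# Hypothetical 2x2 table: respondents by residence and type of paid care.
# These counts are made up to illustrate the arithmetic of an odds ratio.
rural_gray, rural_agency = 30, 20
urban_gray, urban_agency = 50, 150

odds_rural = rural_gray / rural_agency   # odds of gray market care, rural
odds_urban = urban_gray / urban_agency   # odds of gray market care, urban
odds_ratio = odds_rural / odds_urban

print(odds_ratio)  # 4.5 -- "almost five times higher odds"
```

A published study would estimate this with a regression model adjusting for other factors, but the raw odds ratio conveys the basic comparison.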

The study is the first national survey to probe the use of gray market care for older adults and people with dementia. The findings are published in the Journal of Applied Gerontology.

"Gray market care represents a substantial proportion of paid, long-term care for older adults and may fill gaps in access to care," said Regina A. Shih, the study's lead author and a senior policy researcher at RAND, a nonprofit research organization. "Better understanding of the use of gray market caregivers for older Americans is important to meet the needs of the nation's aging population."

The study defined gray market caregivers as paid providers who are unrelated to the recipient, not working for a regulated agency, and potentially unscreened and untrained.

The rapid aging of the U.S. is expected to increase the demand for long-term care and supports to help with the activities of daily living. Demographic and social trends are reducing the number of family caregivers available to help older adults. As a result, the need for home health aides and personal care aides is expected to grow by 36% from 2019 to 2029, much faster than the average for all occupations.

Many older adults who need help do not qualify for Medicaid-sponsored long-term services and supports, and may be unable or unwilling to pay out-of-pocket to hire nurses or aides through a home health agency.

To explore the use of gray market caregivers, RAND researchers in August 2017 surveyed a random sample of 1,037 members of the RAND American Life Panel, a nationally representative internet panel of adults. Those surveyed were asked about whether they had sought care for an older adult and where their formal caregiver was employed.

Among survey participants, 28% had arranged aging-related long-term care for themselves or someone they love. Of respondents who arranged any paid care (including those who combined paid and unpaid care), 31% hired a gray market provider.

Similarly, 31% of respondents who arranged paid care for someone with dementia also sought gray market care. Among those who were gray market consumers, 65% also arranged for or provided unpaid care themselves.

Researchers say that people with dementia who need long-term care and live in rural areas may have more difficulty accessing or paying for regulated home- and community-based providers than those who live in urban locales.

Regulations for home health care agencies vary by state, but they usually are required to perform criminal background checks, verify education or training, and maintain clinical records.

When workers are employed by an agency, they are generally covered by disability and liability insurance to protect consumers and providers in the event of on-the-job accidents. Agency-based employees also may be eligible for or contribute to social insurance and employee benefit programs.

"Without agency oversight, the quality of care provided by gray market caregivers is unknown, and the potential for exploitation or abuse -- of both the care recipient or the care provider -- has not been systematically studied," Shih said.

Researchers say that more research about the use of gray market care is needed to identify factors contributing to its use, improve the quality of gray market care, and provide training for dementia care skills among providers.

Credit: 
RAND Corporation

New research unlocks the mystery of New England's beaches

image: Woodruff's team making a transect of a rocky New England beach.

Image: 
Jon Woodruff

AMHERST, Mass. - Millions of Americans will visit New England's beaches this summer to cool off, play in the waves and soak up the sun. Until now, the factors governing which beaches slope gradually to the sea and which ones end abruptly in a steep drop-off have been largely unknown. However, new research from the University of Massachusetts Amherst reveals, with unprecedented detail, how the grain size of beach sand relates to the slope of the beach itself. These new findings are critical to understanding how New England's beaches will respond to both rising sea levels and increased storm activity.

Many of New England's beaches are made up of a mixture of sand and small stones. Or, to be more precise, the grain sizes on these beaches are "bi-modal" - composed of very large pieces of gravel, from 10 to 64 millimeters, and medium-to-coarse sand, from 0.25 to 1 millimeter, but with very little in between.

"I challenge you to find a handful of grains from a New England beach that are about 5 millimeters (or just under one-quarter of an inch) in diameter," says Jon Woodruff, a professor in UMass Amherst's department of geosciences and lead author of a recently published paper in Marine Geology that details his team's research. "There just aren't many."

It turns out that grain size is one of the crucial determinants of a beach's slope, and researchers have long known that the finer the sand, the more gradually pitched the beach - up to a point. "The relationship between grain size and slope falls apart for coarser-grained beaches," says Woodruff. Many New England beaches are made up of coarse-grained particles, yet they still slope gradually to the water's edge. Until now, no one knew why.

"Past researchers have always focused on either the mean or median grain size," says Woodruff. It's a method that works well for finer-grained beaches. But in a bi-modal, New England beach, the median grain size falls right in that gap between 1 and 10 millimeters. Woodruff and his team took over 1,000 samples from 18 beaches in Massachusetts from which they assembled the largest, publicly available dataset on New England beaches.

The UMass research group also included Steve Mabee and Nick Venti from the Massachusetts Geological Survey, as well as an army of students led by UMass co-authors Doug Beach and Alycia DiTroia. What Woodruff's team discovered is that in bi-modal beaches, it's only the finer-grained sand that determines a beach's slope. "That smaller handful of sand grains," says Woodruff, "is why beachgoers have a place to sunbathe in New England."
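The statistical point - that an overall median misleads on a bi-modal beach, while the sand mode alone is informative - can be sketched with synthetic grain sizes. The counts and the 5-millimeter split below are assumptions for illustration, not the team's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic bi-modal "New England beach" sample (grain diameters in mm):
# a sand mode (0.25-1 mm) and a gravel mode (10-64 mm), nothing in between.
sand = rng.uniform(0.25, 1.0, 500)
gravel = rng.uniform(10.0, 64.0, 500)
grains = np.concatenate([sand, gravel])

# The overall median lands in the 1-10 mm gap, where almost no real
# grains exist -- it describes neither mode.
overall_median = np.median(grains)

# Splitting the modes first (5 mm separates them cleanly) and taking the
# median of the sand fraction yields a statistic that, per the study,
# actually tracks the beach's slope.
sand_median = np.median(grains[grains < 5.0])

print(overall_median, sand_median)
```

With an even split between the modes, the overall median falls in the empty gap, which is exactly why mean- or median-based methods break down on these beaches.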

This new research, which was conducted in partnership with the Massachusetts Office of Coastal Zone Management and supported by the federal Bureau of Ocean Energy Management and the Northeast Climate Adaptation Science Center, has implications far beyond your next summer vacation. "Understanding how beach sand grain size influences the makeup of our beaches is critical for making projections as to how beaches will respond to storms and sea-level rise," says Woodruff. "Especially given the attempts to preserve beaches from erosion, which cost many millions of dollars every year, we need to know what determines the shape and defining grain size characteristics of these beaches."

Credit: 
University of Massachusetts Amherst

Japanese, Italian, US physicists reveal new measurements of high-energy cosmic rays

image: Iron spectrum (multiplied by E2.6) measured by CALET and other experiments.

Image: 
CALET

New findings published this week in Physical Review Letters, "Measurement of the Iron Spectrum in Cosmic Rays from 10 GeV/n to 2.0 TeV/n with the Calorimetric Electron Telescope on the International Space Station," suggest that cosmic ray nuclei of hydrogen, carbon and oxygen travel through the galaxy toward Earth in a similar way, but, surprisingly, that iron arrives at Earth differently.

A series of recent publications based on results from the CALorimetric Electron Telescope, or CALET, instrument on the International Space Station, or ISS, have cast new light on the abundance of high-energy cosmic ray nuclei -- atoms stripped of their electrons moving through space at nearly the speed of light -- that arrive at Earth from outside the Solar System.

"For many years, the standard model to explain the origin of high-energy cosmic rays, or HECR, has been based on the assumption that these particles are produced by nuclear reactions inside stars, accelerated to high energies by the turbulent shock waves in sources like supernovae, and then ejected into the interstellar medium and scattered by the magnetic fields they encounter as they propagate through the Galaxy to Earth," said Michael Cherry, LSU Department of Physics & Astronomy professor emeritus and co-author on the new publication.

"The study of cosmic rays is the study of how the universe generates and distributes matter, and how that affects the evolution of the galaxy," said John Krizmanic, senior scientist at the University of Maryland's Center for Space Science and Technology, or CSST, and co-author.

Measurements of the HECR nuclear composition and precise energy spectra - the intensity of particles at each energy - provide a means to understand the sources of high-energy matter and how those cosmic ray nuclei propagate from their distant sources through the galaxy to Earth.

The new CALET results in the December 18, 2020 and June 18, 2021 issues of the journal Physical Review Letters confirm in detail the previously observed flattening of the carbon and oxygen spectra at energies around 200 billion electron volts, or giga-electron volts (GeV), per nucleon -- but find no such flattening for iron -- and extend the earlier results to higher energies with improved statistics.

CALET has been collecting data about cosmic rays since 2015. The data include details such as how many and what kinds of particles are arriving, and how much energy they're arriving with. The Japanese, Italian and American teams that manage CALET collaborated on the new research.

The current HECR measurements suggest that the particles travel well outside the disk of the Milky Way Galaxy and are confined in a diffuse halo extending well beyond the disk. Independent evidence of the halo can be obtained from observations of radio synchrotron emission of cosmic ray electrons and gamma rays produced by HECR interactions in the halo material. As a result, the diffusion of HECR through the galaxy provides a probe of the galaxy's structure and of how the particles propagate, in particular through observations of the abundances of individual HECR elements and their intensity vs. particle energy, i.e. their energy spectra.

Iron on the move

CALET detects cosmic ray nuclei at energies from a few billion electron volts per nucleon to over 2 trillion electron volts per nucleon. The CALET instrument is one of very few in space able to deliver fine detail about the cosmic rays it detects. Graphs of the spectra for hydrogen, carbon and oxygen cosmic rays are very similar, showing a characteristic change in slope near 200 billion electron volts per nucleon. The key finding from the new paper is that the spectrum for iron is different, with no change in slope at that energy. Iron behaves differently from the lighter elements.
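The "change in slope" can be visualized with a smoothly broken power law, a common parameterization for cosmic-ray spectra. The indices, break energy, and smoothness below are illustrative guesses, not CALET fit values.

```python
import numpy as np

def flux(E, norm=1.0, g1=2.8, g2=2.6, E_break=200.0, s=5.0):
    """Smoothly broken power law: the spectral index hardens from g1 to g2
    around E_break (in GeV/n). All parameter values are illustrative."""
    return norm * E**-g1 * (1.0 + (E / E_break)**s)**((g1 - g2) / s)

E = np.logspace(1, 3.3, 100)     # 10 GeV/n to ~2 TeV/n
weighted = E**2.6 * flux(E)      # the E^2.6 weighting used in such figures

# Below the break the weighted spectrum declines (index steeper than 2.6);
# above it, the curve levels off -- the "flattening" seen for C and O.
# A pure power law with no break (g1 == g2) would show no such feature,
# which is the behavior reported here for iron.
```

Multiplying the flux by E^2.6, as in the figure caption, stretches out the vertical axis so that a subtle change in slope becomes a visible flattening.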

Interestingly, the CALET results agree well with results from the earlier PAMELA and CREAM experiments, and with the flattening of the curves reported by the AMS experiment, also on the space station -- though they differ from AMS in the absolute magnitudes of the measured fluxes. These results are also consistent with the measurement of the cosmic ray proton spectrum from 50 GeV to 10 trillion electron volts, or TeV, that CALET published in Physical Review Letters in May 2019. The similarity of the spectral breaks in the proton, carbon and oxygen spectra indicates a potential common source, but the cause of these changes in the spectra is not understood.

"There are several possibilities to explain the differences between iron and the three lighter elements. The cosmic rays could accelerate and travel through the Galaxy differently, although scientists generally believe they understand how cosmic rays propagate," explains Krizmanic.

Scientists generally believe that exploding stars, or supernovae, are a primary source of high-energy cosmic rays, but neutron stars or very massive stars could be other potential sources.

"The new measurements may indicate that the sources of the hydrogen, carbon and oxygen may be different from the sources of the iron," Cherry said.

Next-level precision

The largest cosmic ray detectors are located on the ground or flown on balloons high in the atmosphere. But by the time cosmic rays reach those instruments, they have already interacted with Earth's atmosphere and broken down into secondary particles. Precisely identifying how many primary cosmic rays arrive, which elements they are and what energies they carry requires observations from space. CALET, on the ISS above the atmosphere, can measure the particles directly and distinguish individual elements precisely.

Iron is a particularly useful element to analyze, explains Nick Cannady, a postdoctoral researcher with CSST and NASA Goddard Space Flight Center and former PhD student at LSU. On their way to Earth, cosmic rays can break down into secondary particles, and it can be hard to distinguish between original particles ejected from a source, like a supernova, and secondary particles. That complicates deductions about where the particles originally came from.

"As things interact on their way to us, then you'll get essentially conversions from one element to another," Cannady said. "Iron is unique, in that being one of the heaviest things that can be synthesized in regular stellar evolution, we're pretty certain that it is pretty much all primary cosmic rays. It's the only pure primary cosmic ray, where with others you'll have some secondary components feeding into that as well."

CALET was optimized to detect cosmic ray electrons and positrons, but also detects the atomic nuclei of cosmic rays very precisely. The electrons and positrons are important because their spectrum potentially contains information about their specific sources. Now the cosmic ray nuclei are providing additional information about the sources and/or propagation through the galaxy.

"We didn't expect that the nuclei - the carbon, oxygen, protons, iron - would really start showing some of these detailed differences that are clearly pointing at things we don't know," Cherry said.

A global effort

The Japanese space agency launched CALET and today leads the mission in collaboration with the U.S. and Italian teams. In the U.S., the CALET team consists of researchers from LSU, NASA Goddard, University of Maryland - Baltimore County, University of Maryland - College Park, University of Denver and Washington University. The LSU scientists include Professor T. Gregory Guzik, Professor Emeritus John Wefel, PhD student Anthony Ficklin, Research Associates Doug Granger and Aaron Ryan and Cherry. The new paper is the fifth from this highly successful international collaboration published in Physical Review Letters.

The current data set based on five years of exposure has allowed CALET to directly measure the rare flux of cosmic rays up to energies of 2 trillion electron volts per nucleon. CALET has been approved to continue operating at least through 2024.

The latest finding creates more questions than it answers, emphasizing that there is still more to learn about how matter is generated and moves around the galaxy.

"The study of cosmic rays is the study of how the universe generates and distributes matter, and how that affects the evolution of the galaxy," Krizmanic adds. "So really it's studying the astrophysics of this engine we call the Milky Way that's throwing all these elements around."

Credit: 
Louisiana State University

Vegetation of planet Earth: Researchers publish unique database as Open Access

It's a treasure trove of data: the global geodatabase of vegetation plots "sPlotOpen" is now freely accessible. It contains data on vegetation from 114 countries and from all climate zones on Earth. The database was compiled by an international team of researchers led by Martin Luther University Halle-Wittenberg (MLU), the German Centre for Integrative Biodiversity Research (iDiv) and the French National Centre for Scientific Research (CNRS). Researchers around the world finally have a balanced, representative dataset of the Earth's vegetation at their disposal, as the team reports in the journal Global Ecology & Biogeography.

Global issues and questions require global answers. "If we want to understand or predict how climate change will affect biodiversity in all regions on Earth, we have to consider them in their entirety. We need more than just data from a few well-studied regions," explains Professor Helge Bruelheide, a geobotanist at MLU and a member of iDiv. Instead, a global geodatabase is required with information on vegetation from all continents and climate zones.

Now, with the launch of "sPlotOpen", this database not only exists but can be accessed by anyone who is interested. Around 100,000 vegetation plots from 114 countries have been entered into the database, which has been released as open access. Each data point contains information on all of the plant species co-occurring at that location, alongside geographical, temporal and methodological metadata. This enables researchers to see exactly when, where and by whom the data point was collected.

Most importantly, sPlotOpen provides information about the complete plant community found at each location. This is one of the great advantages of the new database, explains Dr Francesco Maria Sabatini from MLU and iDiv, who coordinates the project together with Dr Jonathan Lenoir from the CNRS. "There are already several databases that show the distribution of individual plant species worldwide. In reality, however, plant species rarely occur alone and in isolation," says the researcher. With the aid of "sPlotOpen", scientists can determine whether the data were collected in a forest - with various tree and grass species - or in a meadow. According to the biologist, this is important when selecting individual surveys for specific research projects that are meant to investigate, for instance, only forests or only grasslands.

Another advantage is that the team has attempted to balance the data. "There is already an incredible wealth of data on vegetation in Europe, North America and Australia. But, for many different reasons, there is much less data on other regions," says Sabatini. If the data from all countries were compared without weighting, false conclusions could be drawn. The team therefore tried to establish comparability between the datasets from different regions. For example, not all data from Western Europe was included in the project - only a representative selection. The team balanced the data in advance for the benefit of future users; this prevents a striking imbalance of data relative to regions that are less well documented - for example, regions with warm or cold, dry or humid climates. "Our database has the potential to be a game-changer for research projects in the field of macroecology and beyond," explains Sabatini. It could also be used in the geosciences, for example to train algorithms for analysing satellite images: if you know where a certain community of plants exists, you can associate that plant community with a specific pattern in satellite images - a specific spectral community - and search other images for the same pattern.
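The balancing idea - capping contributions from over-sampled regions - can be sketched as below. The real sPlotOpen resampling is more sophisticated (it selects a representative, stratified subset), and the region names, counts, and flat cap here are invented for illustration.

```python
import random

# Invented plot counts per region -- not the actual sPlotOpen holdings.
plots_by_region = {
    "Western Europe": list(range(5000)),   # data-rich
    "Central Africa": list(range(40)),     # data-poor
    "Amazonia": list(range(120)),
}

def balanced_sample(plots_by_region, cap=100, seed=0):
    """Draw at most `cap` plots per region so well-studied regions
    cannot dominate the global sample; small regions keep everything."""
    rng = random.Random(seed)
    return {
        region: rng.sample(plots, min(cap, len(plots)))
        for region, plots in plots_by_region.items()
    }

sample = balanced_sample(plots_by_region)
print({region: len(plots) for region, plots in sample.items()})
# {'Western Europe': 100, 'Central Africa': 40, 'Amazonia': 100}
```

After capping, no single well-studied region dominates the dataset, which is the property that makes cross-regional comparisons meaningful.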

In order to create this unique set of data, the team from Germany and France relied on the support of 161 researchers from 57 countries who participated in the project by providing data. "Behind these individual datasets are countless colleagues and students who went into the field to collect data in the first place. The sPlotOpen database could not exist without them," says Sabatini. The data also comes from national projects, such as the German Vegetation Reference Database, which is managed at MLU by Helge Bruelheide's research group.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

Nerve tumors in children: better-tolerated chemotherapy without loss of efficacy

image: Prof. Ruth Ladenstein, MD, comments, "We are happy to announce that we have established a new standard of care for children suffering from high-risk neuroblastoma."

Image: 
St. Anna Children's Cancer Research Institute (René van Bakel)

The initial chemotherapy of aggressive childhood nerve tumors, so-called high-risk neuroblastomas, is crucial for ultimate survival. It has now been shown that the chemotherapy regimen used by the European Neuroblastoma Study Group is as efficacious as, but better tolerated than, a highly effective regimen from the US. This was the conclusion of an international trial coordinated by St. Anna Children's Cancer Research Institute. The study was published in the prestigious Journal of Clinical Oncology.

For particularly aggressive nerve tumors in children, so-called high-risk neuroblastomas, various combination chemotherapies are used with the intention of shrinking the tumor before surgery (i.e., induction chemotherapy). The efficacy of such induction therapy significantly impacts the chances of survival. The European Neuroblastoma Study Group of the International Society of Pediatric Oncology (SIOPEN) has now compared two of the most effective combination therapies in an international study coordinated by St. Anna Children's Cancer Research Institute (St. Anna CCRI). The result: equal efficacy, but significantly lower rates of side effects with the regimen considered the standard of the European Neuroblastoma Group SIOPEN.

"With the European rCOJEC regimen, high-grade vomiting, nausea, diarrhea, infections and stomatitis were significantly lower than in the US reference group. We will therefore implement it as a standard of care for high-risk neuroblastoma," says Prof. Ruth Ladenstein, MD, MBA, cPM, senior author of the study and head of the Studies & Statistics for Integrated Research and Projects (S2IRP) group at St. Anna CCRI.

Equal efficacy, fewer side effects

The study team compared the treatment regimen of the renowned Memorial Sloan Kettering Cancer Center, i.e., MSKCC-N5 regimen, which has the best efficacy results to date, with the so-called rCOJEC regimen (see also "About the Study"), the standard SIOPEN treatment in Europe. In the present international phase III study, a total of 630 patients were randomly assigned to one of the two regimens. Prof. Ladenstein and her international colleagues intended to evaluate whether the MSKCC-N5 regimen improves the response of tumor metastases (metastatic complete response) compared with rCOJEC. Furthermore, the clinicians investigated whether MSKCC-N5 therapy reduces the likelihood of relapse within three years (3-year event-free survival), compared with the rCOJEC regimen. Neither was the case. This study showed that the efficacy did not significantly differ between the two treatment regimens. However, the rate of high-grade (grade 3-4), acute adverse events was significantly higher with the MSKCC-N5 regimen.

According to Prof. Ladenstein, further studies should now examine the efficacy of immunotherapies added to the rCOJEC regimen, given that only about 60 and 65 percent of children receiving the rCOJEC and MSKCC-N5 regimens, respectively, are still alive after three years. Prof. Ladenstein comments, "We urgently need to further improve the survival outcome. For example, the combination of chemotherapy with anti-GD2 antibodies is very promising - a combination that will be tested in randomized trials by SIOPEN and others."

High need for randomized trials

Thanks to continuously improving therapies, an increasing number of children with high-risk neuroblastoma nowadays survive their disease. Furthermore, those who survive have a long life expectancy. "For this reason it is even more important today to also investigate sequelae resulting from intensive chemotherapies," explains Prof. Ladenstein.
"Our study is only the fourth randomized trial investigating induction therapies in high-risk neuroblastoma. That's because randomized clinical trials require a lot of organizational effort at the international level due to small case numbers in each participating country. But they are essential for providing the best treatment for childhood cancer patients," adds Prof. Ladenstein.

About the study

The SIOPEN international randomized phase III HR-NBL1.5 trial included a total of 630 subjects. The median age was 3.2 years. The study included children and adolescents between 12 months and 20 years of age with stage 4 neuroblastoma, as well as infants younger than one year of age with stage 4S disease and MYCN amplification. Study participants were randomly assigned in a 1:1 ratio to the rCOJEC regimen (cisplatin, vincristine, etoposide, cyclophosphamide, and carboplatin) or the MSKCC-N5 regimen (cyclophosphamide, doxorubicin, vincristine, and cisplatin). Induction therapy with one of these regimens was followed by tumor surgery, high-dose chemotherapy (busulphan, melphalan), radiotherapy, and isotretinoin combined with immunotherapy with dinutuximab beta. Co-primary study endpoints were metastatic complete response and 3-year event-free survival.

The metastatic complete response rate was not significantly different between the rCOJEC (32%) and MSKCC-N5 (35%, p=0.368) groups. There were also no significant differences in 3-year event-free survival between the groups (44±3% vs. 47±3%, p=0.527). High-grade adverse events (grade 3-4) were significantly more frequent in the MSKCC-N5 group. These included non-hematologic side effects (68% vs. 48%).

Credit: 
St. Anna Children's Cancer Research Institute

New method for molecular functionalization of surfaces

image: A high-resolution scanning tunneling microscopy image of the ordered NHC single layer on silicon; NHC stands for "N-heterocyclic carbenes"

Image: 
Dr. Martin Franz

One vision currently driving materials scientists is to combine organic molecules (and their diverse functionalities) with the technological possibilities offered by extremely sophisticated semiconductor electronics. Thanks to modern methods of micro- and nanotechnology, semiconductor electronics yields ever more efficient components for a wide variety of applications. However, it is also increasingly reaching its physical limits: ever smaller structures for functionalizing semiconductor materials such as silicon cannot be produced using the approaches of classical technology. Scientists have now presented a new approach in the journal Nature Chemistry: they show that stable and yet very well-ordered molecular single layers can be produced on silicon surfaces - by self-assembly. To do this, they use N-heterocyclic carbenes - small, reactive organic ring molecules whose structure and properties can be varied in many ways and tailored by different "functional" groups.

Researchers led by Prof. Dr. Mario Dähne (TU Berlin, Germany), Prof. Dr. Norbert Esser (TU Berlin and Leibniz Institute for Analytical Sciences, Germany), Prof. Dr. Frank Glorius (University of Münster, Germany), Dr. Conor Hogan (Institute of Structure of Matter, National Research Council of Italy, Rome, Italy) and Prof. Dr. Wolf Gero Schmidt (University of Paderborn, Germany) were involved in the study.

Technological miniaturization reaches its limits

"Instead of trying to artificially produce smaller and smaller structures with increasing effort, it is obvious to learn from molecular structures and processes in nature and to merge their functionality with semiconductor technology," says chemist Frank Glorius. "This would make an interface, so to speak, between molecular function and the electronic user interface for technical applications." The prerequisite is that the ultra-small molecules with variable structure and functionality would have to be physically incorporated with the semiconductor devices, and they would have to be reproducible, stable and as simple as possible.

Harnessing the self-organization of molecules

The self-organization of molecules on a surface, as an interface to the device, can perform this task very well. Molecules with a defined structure can be adsorbed on surfaces in large numbers and arrange themselves into a desired structure that is predetermined by the molecular properties. "This works quite well on surfaces of metals, for example, but unfortunately not at all satisfactorily for semiconductor materials so far," explains physicist Norbert Esser. This is because in order to be able to arrange themselves, the molecules must be mobile (diffuse) on the surface. But molecules on semiconductor surfaces do not do that. Rather, they are so strongly bound to the surface that they stick wherever they hit the surface.

N-Heterocyclic carbenes as a solution

Being simultaneously mobile and yet stably bonded to the surface is the crucial problem and at the same time the key to potential applications. And it is precisely here that the researchers now have a possible solution at hand: N-heterocyclic carbenes. Their use for surface functionalization has attracted a lot of interest over the past decade. On surfaces of metals such as gold, silver and copper, for example, they have proven to be very effective surface ligands, often outperforming other molecules. However, their interaction with semiconductor surfaces has remained virtually unexplored.

Formation of a regular molecular structure

Certain properties of the carbenes made it possible, for the first time, to produce molecular single layers on silicon surfaces: N-heterocyclic carbenes, like other molecules, form very strong covalent bonds with silicon and are thus stably bound. However, side groups of the molecule simultaneously keep them "at a distance" from the surface, so they can still move about on it. Although they do not travel very far, only a few atomic distances, this is sufficient to form an almost equally regular molecular structure on the surface of the regularly structured silicon crystal.

Interdisciplinary collaboration

Using a complementary multi-method approach of organic chemical synthesis, scanning probe microscopy, photoelectron spectroscopy and comprehensive material simulations, the researchers clarified the principle of this novel chemical interaction in their interdisciplinary collaboration. They also demonstrated the formation of regular molecular structures in several examples. "This opens a new chapter for the functionalization of semiconductor materials, such as silicon in this case," emphasizes physicist Dr. Martin Franz, first author of the study.

Credit: 
University of Münster

How do developing spinal cords choose 'heads' or 'tails'?

image: Neural organoids developed by Gladstone scientists--like the ones in the vial shown here--mimic one of the key steps in the early development of the nervous system in embryos.

Image: 
Photo: Michael Short/Gladstone Institutes

SAN FRANCISCO, CA--June 21, 2021--The progression from a round ball of cells to an embryo with a head and a tail is one of the most critical steps in an organism's development. But just how cells first start organizing themselves with directionality along this head-to-tail axis is hard to study because it happens in the earliest days of embryonic development, in the confines of a mammal's uterus.

Now, scientists at Gladstone Institutes have created an organoid--a three-dimensional cluster of cells grown in the lab--that mimics the earliest developmental steps of the nervous system in embryos. The organoid is the first to show how human spinal cord cells become oriented in an embryo, and could shed light on how environmental exposures or toxins can make this process go awry, causing early miscarriages and birth defects.

"This is such a critical point in the early development of any organism, so having a new model to observe it and study it in the lab is very exciting," says Gladstone Senior Investigator Todd McDevitt, PhD, senior author of the new paper published in the journal Development.

In a weeks-old human embryo, cells destined to become the backbone and spinal cord start assembling--beginning at the embryo's head end and growing to form the tail (or tailbone) end of the embryo. The process is known as axial elongation.

Although scientists have studied this process in chick and mouse embryos, they have not been able to study molecules that help signal "heads" or "tails" to cells. What's more, differences in the body plans of humans compared to other animals--such as the lack of a tail--might mean that observations in these model organisms don't hold true in humans.

Members of McDevitt's team were working on a new organoid, made from a population of brain cells, when they noticed that certain conditions allowed cell clusters to begin to spontaneously elongate, forming tadpole-shaped structures reminiscent of developing spinal cords. The extended organoids had made the same transition that an embryo undergoes when the spinal cord develops--switching from a ball of cells to something with a distinct head and a tail, marking the top and bottom of the spine.

"Organoids don't typically have head-tail directionality, and we didn't originally set out to create an elongating organoid, so the fact that we saw this at all was very surprising," says Gladstone Graduate Student Nick Elder, a co-author of the new paper along with fellow Graduate Student Emily Bulger and former Graduate Students Ashley Libby, PhD, and David Joy, PhD.

The researchers worked to narrow down exactly what made the organoid elongate, and homed in on a handful of required signaling molecules. Then, they analyzed which genes were turned on or off in cells throughout the elongating organoid over the course of about 2 weeks. They found that the organoid had cellular and molecular patterns similar to those previously found in early developing mouse embryos.

"That means we can now use this model to better dissect the cellular and molecular details of human spinal cord elongation in the developing embryonic environment," says Bulger.

For instance, the researchers used CRISPR-Cas9 genome-editing technology to silence a gene from stem cells thought to be important in spinal cord development, and then created organoids from the edited cells. Without the gene, the team showed, the organoids didn't elongate normally, verifying the importance of the gene in axial elongation.

The organoid model could also be used to screen drugs or environmental exposures for their effect on a developing embryo, the researchers say.

"We can use this organoid to get at unresolved human developmental questions in a way that doesn't involve human embryos," says Libby. "For instance, you could add chemicals or toxins that a pregnant woman might be exposed to, and see how they affect the development of the spinal cord."

For now, McDevitt's team is planning to continue refining their approach to create elongating organoids under different conditions and using different cell types in an attempt to further understand the complex interactions that are required to build the spinal cord.

Credit: 
Gladstone Institutes

Researchers explore microbial ecosystem in search of drugs to fight SARS-CoV-2

Washington, DC - June 20, 2021 - Researchers from Yonsei University in South Korea have found that certain commensal bacteria that reside in the human intestine produce compounds that inhibit SARS-CoV-2. The research will be presented on June 20 at World Microbe Forum, an online meeting of the American Society for Microbiology (ASM), the Federation of European Microbiological Societies (FEMS), and several other societies that will take place online June 20-24.

Previous clinical findings have shown that some patients with moderate to severe COVID-19 have gastrointestinal symptoms, while others show signs of infection solely in the lungs.

"We wondered whether gut resident bacteria could protect the intestine from invasion of the virus," said Mohammed Ali, a Ph.D. student in Medicine at Yonsei University, Seoul, South Korea.

To investigate this hypothesis, the researchers screened dominant bacteria inhabiting the gut for activity against SARS-CoV-2. Their search revealed that Bifidobacteria, which have previously been shown to suppress other bacteria such as H. pylori and have proven active against irritable bowel syndrome, had such activity, said Ali.

The investigators also used machine learning to search for potential illness-fighting compounds in databases of microbially produced molecules, discovering some that might also prove useful against SARS-CoV-2. "To train our model we leveraged previous coronavirus datasets in which several compounds were tested against targets from coronaviruses," said Ali. "This approach seems to be significant as those targets share features in common with SARS-CoV-2."
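The release does not describe the model itself, so as a minimal illustration of the general idea, the sketch below ranks hypothetical candidate molecules by fingerprint similarity to compounds known to be active against coronavirus targets. All fingerprints here are randomly generated stand-ins, and similarity search is only a simple baseline for the kind of trained model the researchers describe.

```python
import numpy as np

rng = np.random.default_rng(0)

def tanimoto(a, b):
    """Tanimoto similarity between two binary fingerprint vectors."""
    both = np.sum(a & b)      # bits set in both fingerprints
    either = np.sum(a | b)    # bits set in at least one
    return both / either if either else 0.0

# Hypothetical 64-bit fingerprints of compounds known to be active
# against coronavirus targets (stand-ins for a curated dataset).
known_actives = rng.integers(0, 2, size=(10, 64))

# Hypothetical fingerprints of microbially produced candidate molecules.
candidates = rng.integers(0, 2, size=(50, 64))

# Score each candidate by its best similarity to any known active,
# then rank candidates from most to least promising.
scores = np.array([max(tanimoto(c, a) for a in known_actives)
                   for c in candidates])
ranking = np.argsort(scores)[::-1]
print(ranking[:5])
```

In practice a screening pipeline would use real chemical fingerprints (e.g. of characterized microbial metabolites) and a model trained on measured activity labels rather than raw similarity, but the ranking step looks the same.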

Ali emphasized the ecological nature of his approach to this work, observing that many existing antibiotics and cancer therapies are compounds that bacteria use to compete with each other within the gastrointestinal tract, and that these were initially purified from microbial secretions.

"Finding microbes that secrete anti-coronavirus molecules will be a promising method to develop natural or engineered probiotics to expand our therapeutics prevention techniques, to provide a more sustainable way to combat the viral infection," said Ali.

Credit: 
American Society for Microbiology

An acceleration of coastal overtopping around the world

image: Schematic diagram of coastal overtopping

Image: 
© Rafaël Almar et al., Nature Communications.

By combining satellite data and digital models, the researchers have shown that coastal overtopping, and consequently the risk of flooding, is set to further accelerate over the 21st century, by up to 50-fold under a high emission global warming scenario, especially in the tropics. This increase is principally caused by a combination of sea level rise and ocean waves.

Low-lying coastal regions host nearly 10% of the world's population. In addition to ongoing erosion and rising sea levels, these areas and their unique ecosystems face destructive hazards, including episodic flooding due to overtopping of natural or artificial protection, as in the case of Hurricane Katrina, which hit the United States in 2005, Cyclone Xynthia in Europe in 2010, and Typhoon Haiyan in Asia in 2013 (one of the most powerful tropical cyclones ever recorded). These episodic events are expected to become more severe and more frequent due to global warming, while their consequences will also grow due to increased anthropogenic pressure, such as coastal and infrastructure development and rapid urbanisation. Although the magnitude and frequency of these events remain uncertain, scientists believe that countries in the tropics will be particularly affected.

Despite the significant role ocean waves play in determining coastal sea levels, their contribution to coastal flooding had previously been largely overlooked, mainly due to a lack of accurate coastal topographic information.

Measuring past events to estimate future risks

In this study, French researchers from IRD, CNES and Mercator Océan, together with Dutch, Brazilian, Portuguese, Italian and Nigerian colleagues, combined an unprecedented global digital surface-elevation model with new estimates of extreme sea levels. These extreme water levels combine tides, wind-driven waves and existing measurements of natural and artificial coastal defences.

The study started by quantifying the increase in global submersion events that occurred between 1993 and 2015. To accomplish this, satellite data were used to derive two key coastal-topography parameters: the local beach slope and the maximum subaerial elevation of the coast. The extreme coastal water level was then calculated at hourly timesteps in order to identify the potential annual number of hours during which coastal defences could be overtopped in each area.
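The counting step described above can be sketched in a few lines: build an hourly water-level series, compare it to the local coastal elevation, and count the exceedances. The numbers below (tidal amplitude, wave and surge distributions, a 1.6 m crest elevation, a 0.3 m sea-level-rise offset) are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly extreme water levels for one year (metres):
# semi-diurnal tide + wind-wave runup + a meteorological residual.
hours = 365 * 24
tide = 1.0 * np.sin(2 * np.pi * np.arange(hours) / 12.42)
wave_runup = rng.gamma(shape=2.0, scale=0.15, size=hours)
surge = rng.normal(0.0, 0.1, size=hours)
water_level = tide + wave_runup + surge

def overtopping_hours(levels, coastal_elevation):
    """Count hours in which the total water level exceeds the local
    defence crest / maximum subaerial coastal elevation."""
    return int(np.sum(levels > coastal_elevation))

baseline = overtopping_hours(water_level, coastal_elevation=1.6)
# A sea-level-rise offset shifts every hourly level upward, so the
# count of overtopping hours grows faster than the offset itself:
# more of the tide-plus-wave distribution clears the fixed threshold.
with_slr = overtopping_hours(water_level + 0.3, coastal_elevation=1.6)
print(baseline, with_slr)
```

This threshold-exceedance behaviour is what makes the projected growth in overtopping hours nonlinear in sea-level rise, as the study's results illustrate.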

"The combination of tides and episodes of large waves is the main contributor to episodes of coastal overflow," says Rafaël Almar, a researcher in coastal dynamics at IRD, and the coordinator of the study. "We identified hot-spots, where the increase in risks of overtopping is higher, such as in the Gulf of Mexico, the Southern Mediterranean, West Africa, Madagascar and the Baltic Sea."

Acceleration during the 21st century

The scientists also performed an initial global assessment of potential coastal overtopping over the 21st century, taking into account different sea-level rise scenarios. Results show that the number of overtopping hours could increase at a faster pace than the average rate of sea-level rise. "The frequency of overtopping is accelerating exponentially and will be clearly perceptible as early as 2050, regardless of the climate scenario. By the end of the century, the intensity of the acceleration will depend on the future trajectories of greenhouse gas emissions and therefore the rise in sea level. In the case of a high emissions scenario, the number of overtopping hours globally could increase fifty-fold compared with current levels," Rafaël Almar warns. "As the 21st century progresses, more and more regions will be exposed to overtopping and consequent coastal flooding, especially in the tropics, north-western United States, Scandinavia, and the Far East of Russia."

Further studies will be needed on the local and regional levels to flesh out these global projections, which provide a solid basis for proposing effective adaptation measures in the hotspots identified.

Credit: 
Institut de recherche pour le développement

mRNA vaccine yields full protection against malaria in mice

Scientists from the Walter Reed Army Institute of Research and Naval Medical Research Center partnered with researchers at the University of Pennsylvania and Acuitas Therapeutics to develop a novel vaccine based on mRNA technology that protects against malaria in animal models, publishing their findings in npj Vaccines.

In 2019, there were an estimated 229 million cases of malaria and 409,000 deaths globally, creating an extraordinary cost in terms of human morbidity, mortality, economic burden, and regional social stability. Worldwide, Plasmodium falciparum is the parasite species which causes the vast majority of deaths. Those at highest risk of severe disease include pregnant women, children and malaria naïve travelers. Malaria countermeasures development has historically been a priority research area for the Department of Defense as the disease remains a top threat to U.S. military forces deployed to endemic regions.

A safe, effective malaria vaccine has long been an elusive target for scientists. The most advanced malaria vaccine is RTS,S, a first-generation product developed in partnership with WRAIR. RTS,S is based on the circumsporozoite protein of P. falciparum, the most dangerous and widespread species of malaria parasite. While RTS,S is an impactful countermeasure in the fight against malaria, field studies have revealed limited sterile efficacy and duration of protection. The limitations associated with RTS,S and other first-generation malaria vaccines have led scientists to evaluate new platforms and second-generation approaches for malaria vaccines.

"Recent successes with vaccines against COVID-19 highlight the advantages of mRNA-based platforms--notably highly targeted design, flexible and rapid manufacturing and ability to promote strong immune responses in a manner not yet explored," said Dr. Evelina Angov, a researcher at WRAIR's Malaria Biologics Branch and senior author on the paper. "Our goal is to translate those advances to a safe, effective vaccine against malaria."

Like RTS,S, the vaccine relies on P. falciparum's circumsporozoite protein to elicit an immune response. However, rather than administering a version of the protein directly, this approach uses mRNA--accompanied by a lipid nanoparticle that protects the mRNA from premature degradation and helps stimulate the immune system--to prompt cells to produce the circumsporozoite protein themselves. Those proteins then trigger a protective response against malaria but cannot actually cause infection.

"Our vaccine achieved high levels of protection against malaria infection in mice," said Katherine Mallory, a WRAIR researcher at the time of the article's submission and lead author on the paper. "While more work remains before clinical testing, these results are an encouraging sign that an effective, mRNA-based malaria vaccine is achievable."

Credit: 
Walter Reed Army Institute of Research

Surprising spider hair discovery may inspire stronger adhesives

image: Scanning Electron Microscopy (SEM) image of the bases of pretarsal (i.e., on the lowest part of the leg) adhesive hairs. (A) On the left are the hair shafts of the adhesive hairs closest to the exoskeleton. At their insertion, the hair shaft becomes thinner, and a stopper-like structure on the exoskeleton meets and attaches to it. (B) Further magnification of the same region: the asterisk marks the pivot point where hairs can bend upwards. Distal vs. proximal here means away from vs. towards the claw on the tip of the leg.

Image: 
B Poerschke, SN Gorb and F Schaber

Just how do spiders walk straight up -- and even upside-down across -- so many different types of surfaces? Answering this question could open up new opportunities for creating powerful, yet reversible, bioinspired adhesives. Scientists have been working to better understand spider feet for the past several decades. Now, a new study in Frontiers in Mechanical Engineering is the first to show that the characteristics of the hair-like structures that form the adhesive feet of one species -- the wandering spider Cupiennius salei -- are more variable than previously thought.

"When we started the experiments, we expected to find a specific angle of best adhesion and similar adhesive properties for all of the individual attachment hairs," says the group leader of the study, Dr Clemens Schaber of the University of Kiel in Germany. "But surprisingly, the adhesion forces largely differed between the individual hairs, e.g. one hair adhered best at a low angle with the substrate while the other one performed best close to perpendicular."

The feet of this species of spider are made up of close to 2,400 tiny hairs or 'setae' (one hundredth of one millimeter thick). Schaber, and his colleagues Bastian Poerschke and Stanislav Gorb, collected a sample of these hairs and then measured how well they stuck to a range of rough and smooth surfaces, including glass. They also looked at how well the hairs performed at various contact angles.

Different types of hair work together

Unexpectedly, each hair showed unique adhesive properties. When the team looked at the hairs under a powerful microscope, they also found that each one showed clearly different -- and previously unrecognized -- structural arrangements. The team believes that this variety may be key to how spiders can climb so many surface types.

This current work studied only a small number of the thousands of hairs on each foot, and it's beyond the scope of existing resources to consider studying them all. But the team expects that not all of the hairs are unique, and that it might be possible to find clusters or repeating patterns instead.

Bioinspired applications possible

"Although it is still very difficult to fabricate nanostructures like those of the spider--and especially to achieve the stability and reliability of the natural materials -- our findings can further optimize existing models for reversible and residue-free artificial adhesives," says Schaber. "The principle of different shapes and alignments of adhesive contacts as found in the spider attachment system can improve the attachment ability of bioinspired materials to a broad range of substrates with different properties."

Credit: 
Frontiers

Organic farming could feed Europe by 2050

image: Diagram of a possible agro-ecological scenario for 2050.

Image: 
© Gilles Billen

Food has become one of the major challenges of the 21st century. According to a study carried out by CNRS scientists, an organic, sustainable, biodiversity-friendly agro-food system could be implemented in Europe and would allow a balanced coexistence between agriculture and the environment. The proposed scenario is based on three levers. The first would involve a change in diet, with less consumption of animal products, making it possible to limit intensive livestock farming and eliminate feed imports. The second lever would require the application of the principles of agroecology, with the generalization of long, diversified crop rotation systems incorporating nitrogen-fixing legumes, making it possible to do without synthetic nitrogen fertilizers and pesticides. The final lever would consist in bringing together crops and livestock, which are often disconnected and concentrated in ultra-specialized regions, allowing optimal recycling of manure. According to this scenario, it would be possible to reinforce Europe's autonomy, feed the predicted population in 2050, continue to export cereals to countries which need them for human consumption, and above all substantially reduce water pollution and greenhouse gas emissions from agriculture. This study was published in One Earth on June 18, 2021.

Credit: 
CNRS

Moderate and vigorous physical activity attenuate arterial stiffening already in children

image: Physical activity can curb arterial stiffening already in childhood.

Image: 
University of Jyväskylä

According to a recent Finnish study, higher levels of moderate and vigorous physical activity can curb arterial stiffening already in childhood. However, neither sedentary time nor aerobic fitness was linked to arterial health. The results, based on the ongoing Physical Activity and Nutrition in Children (PANIC) Study conducted at the University of Eastern Finland, were published in the Journal of Sports Sciences. The study was conducted in collaboration among researchers from the University of Jyväskylä, the University of Eastern Finland, the Norwegian School of Sport Sciences, and the University of Cambridge.

Arterial stiffening predisposes to heart diseases, but physical activity reduces the risk

Stiffened arteries are one of the first signs of increased risk of cardiovascular diseases, and stiffening of the arteries has been observed even in children. High levels of physical activity, reduced sedentary time and good physical fitness form the basis for prevention of cardiovascular diseases in adulthood, but little is known about their role in promoting arterial health in primary school children.

"Our study showed that increased levels of moderate and vigorous physical activity were linked to more elastic arteries and better dilatation capacity," says Dr. Eero Haapala from the Faculty of Sport and Health Sciences at the University of Jyväskylä. "However, our results also suggest that the positive effects of moderate and vigorous physical activity on arterial health are partly explained by their positive effects on body composition."

Moderate and vigorous physical activity are important for cardiovascular health

The researchers found the healthiest arteries in children with the highest levels of moderate and vigorous physical activity, but similar associations were not observed with sedentary time or light intensity activity.

"The key message of our study is that, starting from childhood, increasing moderate and vigorous physical activity is central in the prevention of cardiovascular diseases," says Haapala. "However, it is worth remembering that every step is important, because reducing sedentary time and increasing light physical activity have various health effects, even though they may not have direct effects on the arteries."

The study investigated the associations of physical activity, sedentary time, and aerobic fitness, and changes in them over a two-year follow-up, with arterial stiffness and dilatation capacity in 245 children aged 6 to 8 years at the beginning of the study. Physical activity was measured using a combined heart rate and movement monitor, and arterial stiffness and dilatation capacity using pulse contour analysis. Body composition was measured using a DXA device.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

Atomic-scale tailoring of graphene approaches macroscopic world

Graphene consists of carbon atoms arranged in a chicken-wire-like pattern. This one-atom-thick material is famous for its many extraordinary properties, such as extreme strength and a remarkable capability to conduct electricity. Since its discovery, researchers have looked for ways to further tailor graphene through controlled manipulation of its atomic structure. However, until now, such modifications have only been confirmed locally, because of challenges in atomic-resolution imaging of large samples and analysis of large datasets.

Now a team led by Jani Kotakoski at the University of Vienna, together with Nion Co., has combined an experimental setup built around an atomic-resolution Nion UltraSTEM 100 microscope with new approaches to imaging and data analysis through machine learning, bringing atomic-scale control of graphene towards macroscopic sample sizes. The experimental procedure is shown in Figure 1.

The experiment begins by cleaning graphene via laser irradiation, after which it is controllably modified using low energy argon ion irradiation. After transferring the sample to the microscope under vacuum, it is imaged at atomic resolution with an automatic algorithm. The recorded images are passed to a neural network which recognizes the atomic structure providing a comprehensive overview of the atomic-scale alteration of the sample.
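The core of the analysis step is locating every atom in each recorded image automatically. The published pipeline uses a trained neural network for this; as a simplified, self-contained stand-in, the sketch below simulates a tiny STEM-like image of a lattice and finds atom positions with a naive local-maximum detector. The lattice spacing, peak width and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a small STEM-like image: bright Gaussian peaks at atom
# sites on a square lattice, plus detector noise. (Real input would
# come from the microscope; graphene's lattice is hexagonal.)
size, spacing, sigma = 64, 8, 1.5
yy, xx = np.mgrid[0:size, 0:size]
image = np.zeros((size, size))
sites = [(y, x) for y in range(4, size, spacing)
                for x in range(4, size, spacing)]
for (y, x) in sites:
    image += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma ** 2))
image += rng.normal(0, 0.05, image.shape)

def find_atoms(img, threshold=0.5, radius=3):
    """Naive atom finder: report pixels that are brighter than
    `threshold` and are the maximum of their local neighbourhood.
    (The study's pipeline uses a neural network instead, which also
    classifies defects such as vacancies.)"""
    atoms = []
    for y in range(radius, img.shape[0] - radius):
        for x in range(radius, img.shape[1] - radius):
            patch = img[y - radius:y + radius + 1,
                        x - radius:x + radius + 1]
            if img[y, x] > threshold and img[y, x] == patch.max():
                atoms.append((y, x))
    return atoms

detected = find_atoms(image)
print(len(sites), len(detected))
```

Comparing the detected positions against the expected lattice is what turns a stack of images into the "comprehensive overview of the atomic-scale alteration of the sample" described above: missing or displaced maxima mark the defects created by ion irradiation.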

"The key to the successful experiment was the combination of our unique experimental setup with the new automated imaging and machine learning algorithms," says Alberto Trentino, the lead author of the study. "Developing all the necessary pieces was a real team effort, and now they can be easily used for follow-up experiments," he continues. Indeed, having confirmed atomic-scale modification of graphene over a large area, the researchers are already extending the method, using the created structural imperfections to anchor impurity atoms to the structure. "We are excited about the prospect of creating new materials designed from the atomic level up, based on this method," concludes Jani Kotakoski, the leader of the research team.

Credit: 
University of Vienna