Tech

Unprecedented single-cell studies in virtual embryo

video: A 4D visualization of single-cell expression patterns.

Image: 
Hanna Sladitschek/EMBL

"How are the many different cell types in the body generated during embryonic development from an egg, which is only a single cell? This is one of the most fundamental questions in biology," explains Dr. Pierre Neveu, group leader at EMBL Heidelberg, setting out the rationale behind the research he and his group have performed in collaboration with the group of Dr. Lars Hufnagel.

While answering this question is essential to understand how multicellular organisms form, studying the developmental mechanisms driving this cellular diversification at the single-cell, genome-wide, and whole-embryo level is a challenging task. "So far we have lacked a comprehensive understanding of the gene expression programmes that instruct individual cells to form the different cell types necessary to build an embryo," explains Dr. Hanna Sladitschek, first author of the study - a former postdoc at EMBL Heidelberg and now at the University of Padua School of Medicine. Despite recent advances in the field, a complete representation of embryonic development, accounting for every single cell in space and time, has not been achieved until now.

The EMBL researchers were able to solve this problem by constructing a 'virtual embryo' of Phallusia mammillata - a type of marine organism known as a sea squirt, which is found in the Mediterranean Sea and the Atlantic Ocean. This species was picked as a model system because it is related to vertebrates and each individual has the same number of cells, making it easier to combine observations from many specimens.

This virtual embryo describes the gene expression and morphology of every single cell of an embryo at every cell division in the early stages of development - showing the evolution from a single cell to the 64-cell stage. After these first seven cell divisions, the fates of the future nerve cord, brain, germ cells, blood cell precursors, and muscles are already specified. This makes it the first full description of early development accounting for every single cell in an embryo. It describes both gene expression - how a cell's genetic information is expressed and appears - and spatial position. To generate this comprehensive atlas, the researchers combined high-resolution single-cell transcriptomics and light-sheet imaging.

"Our model shows that it is possible to know the location and history of an individual cell by analysing its gene expression," says Neveu. "In addition, we find that while the regulation of gene expression is very precise within an embryo, differences in developmental timing explain the observed variation between individual embryos."

Gene expression is generally thought to be a noisy process - in other words, one that shows an element of randomness - yet the new results show that it is remarkably reproducible and coordinated across cells in a given embryo. "How is such coordination achieved? How does the embryo coordinate between the two mirror-symmetric embryo halves?" says Neveu, highlighting some of the new questions that the scientists would like to answer.

"Our studies represent a leap forward in the emerging field of developmental genomics," says Hufnagel. "Now that we have worked with an organism with a small number of cells, it will of course be very interesting to extend our work to mammals, which have many more cells!"

Credit: 
European Molecular Biology Laboratory

Supercomputers and Archimedes' law enable calculating nanobubble diffusion in nuclear fuel

image: Fuel aging

Image: 
Daria Sokol/MIPT Press Office

Researchers from the Moscow Institute of Physics and Technology have proposed a method that speeds up the calculation of nanobubble diffusion in solid materials. This method makes it possible to create significantly more accurate fuel models for nuclear power plants. The paper was published in the Journal of Nuclear Materials.

Why does nuclear fuel 'age'?

During reactor operation, fission fragments flying at high speeds through the crystal lattice of the nuclear fuel material create various defects -- vacancies, interstitial atoms, and their clusters. As they combine, such vacancies form bubbles that fill up with gaseous fission products during fuel burnout. The diffusion of such nanobubbles significantly affects the properties of the fuel and the release of gaseous fission products from it.

Modeling to the rescue

Fuel aging processes are hard to study experimentally. On the one hand, such processes are very slow; on the other, gathering experimental data during reactor operation is almost impossible. Therefore, integrated models are currently being developed that allow calculation of the evolution of fuel material properties during the burnout process. The nanobubble diffusion coefficient is one of the key parameters in such models. This study is a joint project of MIPT and the Joint Institute for High Temperatures of the Russian Academy of Sciences.

From Schrödinger equation to dynamics of hundreds of thousands of atoms

The researchers from the Laboratory of Supercomputer Methods in Condensed Matter Physics at MIPT examined atomistic models of the material comprising hundreds of thousands of atoms. Using supercomputers, the team calculated their trajectories over hundreds of millions or even billions of integration steps. The gamma uranium interatomic interaction model used was obtained by the physicists in the course of their previous work, based on resolving the quantum mechanical problem for a multielectron system.

MIPT doctoral student Alexander Antropov, a co-author of the paper, explained: "For the nanobubble to move, it is necessary for the lattice atoms to cross over to the other side of the bubble. This is similar to an air bubble moving in water. However, in solid materials, this process is much slower. When working on the project, we demonstrated that there is another difference: The pores in the lattice take the form of polyhedra and the stable faces inhibit the diffusion process. In the 1970s, the possibility of such an effect was predicted theoretically based on general considerations. Our method makes it possible to obtain quantitative results for a specific material."

"Due to the fact that the diffusion of nanobubbles is very slow, the only real way to model their movement is to somehow give them a push. The problem, however, is how do you push a void? While working on the project, we proposed and established a method, in which an external force acts on the material surrounding the nanopore. The bubble begins to float upwards, similarly to a bubble in water under the buoyant force of Archimedes' principle. The proposed method is based on the Einstein-Smoluchowski relation and makes diffusion coefficient calculations dozens of times faster. In the future, we plan to use it for other materials that are exposed to severe radiation damage in nuclear reactors," commented Vladimir Stegailov, MIPT professor, the head of the MIPT Laboratory of Supercomputer Methods in Condensed Matter Physics.
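The Einstein-Smoluchowski relation that the method rests on connects the mobility measured in such a forced-drift simulation to the diffusion coefficient: D = μ·kB·T, with μ = v_drift / F. A minimal sketch of that final step; the temperature, force, and drift velocity below are purely illustrative assumptions, not values from the paper:

```python
# Sketch of the Einstein-Smoluchowski step: from a forced-drift measurement
# to a diffusion coefficient. All input numbers are hypothetical.
kB = 1.380649e-23  # Boltzmann constant, J/K
T = 1500.0         # temperature, K (assumed, typical fuel-like conditions)
F = 2.0e-12        # net force applied to atoms around the pore, N (assumed)
v_drift = 5.0e-6   # drift velocity of the bubble under that force, m/s (assumed)

mu = v_drift / F   # mobility of the nanobubble
D = mu * kB * T    # Einstein-Smoluchowski relation

print(f"mobility mu = {mu:.3e} m/(N*s)")
print(f"diffusion coefficient D = {D:.3e} m^2/s")
```

Measuring a drift velocity under a known force converges far faster than waiting for unforced random-walk displacement, which is why the push-based approach shortens the calculation.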

Credit: 
Moscow Institute of Physics and Technology

Survey: Food insecurity in Vermont rose 33% during pandemic

image: Empty shelves at the grocery store have been a common sight during the coronavirus pandemic. A new survey found that food insecurity in Vermont has increased 33% since the start of the outbreak.

Image: 
Ingrid Cold

Food insecurity in Vermont has increased by one-third during the coronavirus pandemic, from 18.3% to 24.3%, according to a statewide survey conducted by the University of Vermont at the end of March and announced in a series of briefs today.
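The headline one-third figure follows directly from the two prevalence numbers; a quick arithmetic check:

```python
# Relative increase implied by the two survey prevalences.
before, after = 18.3, 24.3  # percent of Vermonters who are food insecure
relative_increase = (after - before) / before * 100
print(f"relative increase: {relative_increase:.1f}%")  # -> 32.8%, i.e. about one-third
```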

The increase in food insecurity was strongly correlated with employment status. Among survey respondents overall, 45% had lost their jobs, been furloughed or had their hours reduced during the pandemic. Among food insecure Vermonters, two-thirds (66%) had experienced job losses or work disruptions since the outbreak of the pandemic.

"Our data suggests that the growth of food insecurity is related to job layoffs and other employment disruptions," said Meredith Niles, assistant professor in UVM's Department of Nutrition and Food Sciences, a fellow in the Gund Institute for Environment and the principal investigator on the study. "People who had lost their jobs or had their work disrupted were far more likely to be food insecure compared with those who remained employed."

While job losses during the pandemic created many newly food insecure people, a sizable share of respondents who had been food insecure before the pandemic -- 84% -- remained so, a telling statistic for Niles.

"These are already vulnerable people and households who may be even more vulnerable now," she said. "They were experiencing challenges with food access before the pandemic, and this event has not helped them."

Surprisingly, less than 30% of respondents experiencing food insecurity participated in food assistance programs, Niles said.

In general, respondents with food insecurity expressed greater worry about food access than survey respondents overall. And they were more likely to adopt coping strategies to address food access challenges, like buying foods that would last longer (77%), buying different and/or cheaper foods (66%) or eating less (66%).

The last category is worrisome, said Farryl Bertmann, a lecturer in the Department of Nutrition and Food Sciences and a member of the research team.

"When people start eating less or disrupting their current eating patterns, we become concerned," she said. "When forced to skip or stretch meals, people increase their risk for nutrition-related diseases, decrease their immune function and may negatively impact their mental and emotional health."

For respondents experiencing food insecurity, the most helpful assistance strategies included receiving additional money for food and bills, greater trust in the safety of stores, and increased benefits from government programs.

The average amount of additional money they said would be helpful for food and bills, if they had trouble affording food, was $110 per week.

"That's not a huge number, but it is significantly more than they get from public assistance programs like 3SquaresVT," said Emily Morgan, assistant professor in the Department of Nutrition and Food Sciences, another researcher involved in the survey project. "It shows there is a greater need than the current programs allow for."

The coronavirus changed food habits and practices for respondents overall, the survey found. Eighty-seven percent said they usually or always reduced the number of trips they made to the grocery store to avoid exposure to the virus, and 58% said they usually or always spent more time cooking.

While respondents experiencing food insecurity expressed greater concern about, and more challenges in, accessing food, most respondents overall were unable to find all the food their households were accustomed to.

"We are all feeling the impacts of the coronavirus on the food system," Niles said.

In other survey findings:

--Vermonters are helping each other. The percent of people reporting that "someone brings me food" doubled from 10% to 20% since the start of the outbreak.

--Respondents said that increased trust in the safety of going to stores and more food in stores would be the most helpful actions.

--Respondents worried most about food becoming unaffordable and running out of food if they were unable to go out.

--Respondents perceive their actions differently than average U.S. households. For example, only 49% reported buying many more items in a single trip to the grocery store, but 88% of respondents felt that the average U.S. household did so.

--Compared to food secure respondents, those who were food insecure reported more frequent challenges related to food access after the 'Stay home, stay safe' order was put in place. Challenges included buying as much of, or the types of, food needed, food affordability and food pantry access.

--Compared to food secure respondents, food insecure households were less likely to use a farm CSA, local farmstand or specialty store (a coop, health food store or ethnic market, for instance) but were not less likely to have used a farmers' market in the past year. With the closure of farmers' markets in Vermont, this could indicate that food insecure households may have limited ability to access fresh, local Vermont products.

--Respondents who reported food insecurity in the year prior to the coronavirus outbreak were more likely to be people of color, female, live in households with children and live in larger households.

The survey has a margin of error of 2%. A total of 3,251 Vermonters responded. The survey launched on March 29th and was concluded on April 12th. The survey was developed in collaboration with researchers at Johns Hopkins University and fielded by the University of Vermont team. The research team intends to conduct the survey in other states and nationally, and to conduct additional future surveys in Vermont to assess changes in the situation.
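The reported margin of error is consistent with a standard back-of-envelope calculation for a sample of this size, assuming a simple random sample and the conservative worst case p = 0.5 (both assumptions ours, not stated in the survey methodology):

```python
import math

# Back-of-envelope 95% margin of error for a proportion.
n = 3251  # respondents
p = 0.5   # worst-case proportion (maximizes the margin of error)
z = 1.96  # z-score for a 95% confidence level

moe = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error ≈ {moe * 100:.1f} percentage points")
```

The result, about 1.7 percentage points, rounds to the reported 2%.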

Credit: 
University of Vermont

Optimizing a new spraying method for ceramic coatings

image: A ceramic film produced by powder aerosol deposition on a porous gas-permeable electrode, such as those required in fuel cells.

Image: 
Jörg Exner

For a long time, the production of ceramic coatings has only been possible by means of sintering techniques conducted at more than 1,000 degrees Celsius. However, a novel spraying method, Powder Aerosol Deposition (PAD), enables their production at normal room temperature. It is therefore highly attractive for industrial applications. Engineering scientists at the University of Bayreuth under the direction of Prof. Dr.-Ing. Ralf Moos are at the forefront of the ongoing development of this technology. In the journal Advanced Materials, they present its advantages and show how the functional properties of ceramic films can be optimized for high-tech applications.

With PAD, dense ceramic films can be applied to very different types of materials, such as steel, glass, silicon, or even plastic. To achieve this, a dry ceramic powder is first converted into an aerosol, i.e. a mixture of gas and solid particles, with the aid of a carrier gas. The aerosol is then transported into a vacuum chamber, accelerated to several hundred meters per second through a nozzle, and directed onto the material to be coated. On impact, the tiny ceramic particles fracture. The resulting fragments, only a few nanometers in size, feature fresh, active surfaces. They form tightly adhering, dense coatings with a thickness of between 1 and 100 micrometers.

"Thanks to their dense microstructure, the coatings already exhibit excellent mechanical properties even directly after the deposition. They are extraordinarily hard and have good chemical resistance," explains Dr.-Ing. Jörg Exner, first author of the study, who was a driving force in the research work on PAD at the University. However, as it turned out, some functional properties of the coatings, especially the electrical conductivity, proved inadequate without further processing steps. In their new study, however, the Bayreuth engineers report effective methods of optimization.

Crystalline structures are of crucial importance in this context. The strong impact of the ceramic particles on the materials causes structural defects in the resulting fragments. This not only affects electrical conductivity, but also other functional properties. "By a thermal post-treatment, or so-called tempering, these defects can be almost completely eliminated. We have been able to show that the required temperatures are generally much lower than for conventional sintering. The avoidance of these extremely high temperatures is what makes PAD so attractive. It therefore remains true: This technology offers very high industrial potential, especially where high-quality ceramic coatings are required," Exner concludes.

What type of ceramic materials are processed depends on the intended technological applications: Dielectric ceramics are suitable for capacitors, electrically conductive functional ceramics are preferred for sensors, and yttrium-stabilized zirconium oxide is used in high-temperature fuel cells. Even lithium-ion batteries can be produced in this way.

The scientific understanding of the ceramic film structures and of their functional properties, gained at the University of Bayreuth, will contribute significantly to the goal of integrating high-quality coated components into complex systems in a sustainable way. New technologies, for example, in the fields of energy storage and conversion, or for the purpose of environmental monitoring, therefore stand to benefit considerably from powder aerosol deposition applications.

Credit: 
Universität Bayreuth

Scientists discover new features of molecular elevator

image: This is a molecular elevator.

Image: 
Daria Sokol/MIPT Press Office

Biophysicists from the Moscow Institute of Physics and Technology and the University of Groningen in the Netherlands have visualized a nearly complete transport cycle of the mammalian glutamate transporter homologue from archaea. They confirmed that the transport mechanism resembles that of an elevator: A "door" opens, ions and substrate molecules come in, the door closes, and they travel through the membrane. Presumably the mammalian transporters operate the same way, so this discovery is potentially important for developing new treatments for schizophrenia and other mental illnesses caused by malfunctioning of these transporters. The research was published in the journal Nature Communications.

Nerve impulses travel through the human body in the form of chemical signals or electric charges, as ion currents. Neurons, the cells of the nervous system, can generate and propagate electrical signals. A neuron consists of a cell body with projections of two types: multiple dendrites and a single axon. The cell body and the dendrites serve as an antenna picking up signals from other neurons. By summing and processing all of the input signals, the neuron generates its own impulses that are then passed on to the neighboring neuron. The electric impulse in an axon is similar to the electric current in wires, but it is carried by sodium and potassium ions, rather than electrons. That said, electrical signal transmission is only possible within a neuron. The signals transmitted between neurons are of a chemical nature and involve special structures, called synapses.

The signal in a synapse is usually carried by chemicals called neurotransmitters. A neuron releases neurotransmitters into the synaptic cleft, and the membrane of the receiving neuron recognizes the neurotransmitter via a dedicated receptor.

Another hidden yet vital stage in this process is that the neurotransmitter molecules must be removed from the synaptic cleft to enable the next pulse transmission. Otherwise, the receiving neuron will be overstimulated. Neurotransmitters are cleared out by dedicated transporters that pump these molecules from the synaptic cleft back into the cell body. These transporters are located either in the synapses of neurons or in the so-called glial cells, which provide support and protection for neurons (fig. 1).

Glutamate is the main excitatory neurotransmitter in the human brain. When glutamate is released into the synaptic cleft, this excites the next neuron in the sequence. The human nervous system also has inhibitory neurotransmitters, for example GABA (gamma-Aminobutyric acid), which snuff out any potential in the neuron when released.

The glutamate transporter clears out glutamate from the synaptic cleft. This process is crucial to the functioning of the human brain. The inhibition of glutamate removal from the cleft is linked to many neurodegenerative diseases and mental disorders, including schizophrenia.

Quite often we can learn a lot about someone by just looking at their relatives. The same holds true for evolutionarily similar proteins, called homologues. The group of Russian and Dutch scientists has resolved a conformational ensemble of the aspartate transporter from archaea, which is homologous to the glutamate transporters in humans.

Until recently, X-ray crystallography was the main technique for studying the 3D structures of proteins. The main challenge faced by that method is crystallizing proteins to obtain diffraction images from crystals. Membrane proteins tend not to form well-diffracting crystals easily.

To overcome this bottleneck, another technique called cryo-electron microscopy can be used. In cryo-EM a vitrified sample is irradiated by an electron beam and the collected images are combined, yielding a three-dimensional reconstruction of the protein. The obtained model is analyzed and can be used to design new drugs.

The structure of the mammalian glutamate transporter homologue was determined using a cryo-electron microscope at the University of Groningen in the Netherlands. These proteins consist of three individual molecules, hence they form trimers. Each individual protomer consists of two parts: the immobile part fixed in the membrane and the mobile transport domain resembling an elevator. The study has revealed 15 protomer structures (in five trimers), including intermediate conformations. The team also confirmed independent movements of transport domains.

"These structures help us explain how these proteins prevent sodium leakage," the head of the MIPT Laboratory of Structural Electron Microscopy of Biological Systems, Albert Guskov explained. "Just like in an elevator, the transport domain has a door, and as long as it stays open, the elevator will not move. But once the sodium ions and the substrate -- in this case, the aspartate molecules -- enter the elevator, the door closes, and off it goes. So, if there are only sodium ions present, this is not enough to close the door."

"This makes the transport very efficient, which is particularly important in the case of human proteins, since it's not merely about eating up the aspartate -- like in archaea -- but about information transfer between neurons," the scientist added.

The Laboratory of Structural Electron Microscopy of Biological Systems, led by Professor Guskov, is establishing a modern scientific infrastructure at MIPT, enabling full-cycle single-particle cryo-EM research in Russia. In 2019, the team launched a research platform based on the FEI Polara G2 cryo-electron microscope, with plans to upgrade it to a state-of-the-art instrument.

"The competences of the laboratory are in high demand in the Russian scientific community, and the expanding international academic network enables the access to modern scientific infrastructure. Such infrastructure opens new opportunities for studying the fundamental questions of biology, such as the mechanisms of functions of ion channels and transporters, interactions within protein complexes, etc. It also helps us find industrial partners that would conduct research toward applying our findings in drug design and elsewhere in medicine," Professor Guskov commented.

Credit: 
Moscow Institute of Physics and Technology

Relying on 'local food' is a distant dream for most of the world

image: Optimized distance between food production and consumption graphic.

Image: 
Aalto University

Globalisation has revolutionised food production and consumption in recent decades, and cultivation has become more efficient. As a result, diets have diversified and food availability has increased in various parts of the globe. However, it has also led to a situation where the majority of the world population lives in countries that are at least partially dependent on imported food. This can intensify vulnerabilities during any kind of global crisis, such as the current COVID-19 pandemic, as global food supply chains are disrupted.

Aalto University dissertation researcher Pekka Kinnunen says: 'There are big differences between different areas and the local foliage. For example, in Europe and North America, temperate crops, such as wheat, can be obtained mostly within a radius of 500 kilometres. In comparison, the global average is about 3,800 kilometres'.

The recent study, published in Nature Food and led by Kinnunen, modelled the minimum distance between crop production and consumption that humans around the world would need to be able to meet their food demand. The study was conducted in collaboration with Columbia University, the University of California, the Australian National University and the University of Göttingen. The study factored in six key crop groups for humans: temperate cereals (wheat, barley, rye), rice, corn, tropical grains (millet, sorghum), tropical roots (cassava) and pulses. The researchers modelled globally the distances between production and the consumer for both normal production conditions and scenarios where production chains become more efficient due to reduced food waste and improved farming methods.

It was shown that 27% of the world's population could get their temperate cereal grains within a radius of less than 100 kilometres. The share was 22% for tropical cereals, 28% for rice and 27% for pulses. In the case of maize and tropical roots, the proportion was only 11-16%, which Kinnunen says shows the difficulty of relying solely on local resources.

Foodsheds as areas of self-sufficiency

'We defined foodsheds as areas within which food production could be self-sufficient. In addition to food production and demand, foodsheds describe the impact of transport infrastructure on where food could be obtained', Kinnunen explains.

The study also showed that foodsheds are mostly relatively compact areas for individual crops. When crops are looked at as a whole, foodsheds formed larger areas, spanning the globe. This indicates that the diversity of our current diets creates global, complex dependencies.

According to Associate professor Matti Kummu, who was also involved in the study, the results clearly show that local production alone cannot meet the demand for food; at least not with current production methods and consumption habits. Increasing the share of effectively managed domestic production would probably reduce both food waste and greenhouse gas emissions. However, at the same time, it could lead to new problems such as water pollution and water scarcity in very densely populated areas, as well as vulnerabilities during such occurrences as poor harvests or large-scale migration.

'The ongoing COVID-19 epidemic emphasises the importance of self-sufficiency and local food production. It would be important also to assess the risks that dependence on imported agricultural inputs such as animal feed proteins, fertilisers and energy, might cause', says Kummu.

Credit: 
Aalto University

Study of sewage finds link between different rates of sepsis in UK and presence of E. coli in the community

A study to be presented at the European Congress of Clinical Microbiology and Infectious Diseases (ECCMID)* shows that rates of Escherichia coli related sepsis in different regions of the UK could be directly linked to the levels of pathogenic (disease causing) E. coli in the community, as determined by its presence in sewage in that area. The study is by Dr Mark Toleman, Cardiff University, UK and colleagues.

UK E. coli sepsis rates have been rising for the last 20 years. Good information on the rates of increase is available from the public health agencies of England, Wales and Scotland. E. coli bacteraemia (blood infection) rates have been closely monitored since mandatory surveillance was initiated for acute NHS trusts in 2011. For example, rates have risen by 49% in Wales (60.3-89.8 per 100,000 population from 2010-2017), 71% in England (45-77.7 per 100,000, 2009-2018) and 31% in Scotland (66.6-87.3 per 100,000, 2009-2018). However, the reason behind this consistent year-on-year increase is to date unknown.

The sepsis rate also varies greatly between NHS geographic regions** and considerably between London (64 cases per 100,000 population) and South Wales (85 per 100,000). In this study, the authors tested the theory that the different rates could be due to differing prevalence of pathogenic E. coli types (type B2) in the different UK NHS regions.

Sewage was collected from multiple sites: Longreach (about 20km East of London on the River Thames near Dartford), Marlow (Buckinghamshire), Reading (Berkshire), Bristol (Avon), Ponthir (South Wales) and Cardiff (South Wales) sewage works from the period 19 to 26 September 2019. The authors chose these particular locations to focus the study along the M4 motorway corridor knowing that the sepsis rates were lowest in London and highest in South Wales.

The authors explain: "We were constrained a little by availability to access certain sewage works but essentially we managed to get sewage from the majority of plants targeted. We then randomly chose 100 E. coli isolates from each location and performed genetic analysis on them."

The average prevalence of pathogenic B2 phylotype E. coli was considerably higher in South Wales than across the English locations, 32.5% versus 17.8%. E. coli B2 phylogenetic prevalence at each location was: Ponthir (33%), Cardiff (32%), Bristol (24%), Reading (16%), Marlow (13%), Longreach (18%) with prevalence lowest in the London region (15.6% overall - an average of the sites that were all within 40 miles of central London: Reading, Marlow and Longreach).
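The regional averages quoted above can be reproduced from the per-site prevalences as simple unweighted means (an assumption on our part about how they were computed; the study may have pooled isolates instead):

```python
# Per-site B2 phylotype prevalence (%), as reported in the article.
welsh = {"Ponthir": 33, "Cardiff": 32}
english = {"Bristol": 24, "Reading": 16, "Marlow": 13, "Longreach": 18}

def mean(sites):
    """Unweighted mean prevalence across sites."""
    return sum(sites.values()) / len(sites)

print(f"South Wales average: {mean(welsh):.1f}%")      # -> 32.5%
print(f"English sites average: {mean(english):.2f}%")  # -> 17.75%, reported as 17.8%
```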

A method called multiplex PCR was then used to detect known sepsis-causing pathogenic E. coli ST95, ST131, ST73 and ST69 (all part of the B2 group routinely found to cause sepsis in hospital). Most sepsis-causing E. coli sequence types belong to the E. coli B2 group, and the prevalence of these specific ST was also considerably higher in South Wales than in England, 11% versus 7.8%.

The highest rate of specific sepsis-causing E. coli STs was found in Bristol, mostly due to a very high prevalence of ST95 (9%) in the Bristol community. Overall, however, Bristol had less diversity of sepsis-causing STs than the Welsh sites, where several additional sepsis-causing types were commonly found.

The authors say "Our study showed that firstly, the carriage rate of B2 types is very high in the UK especially in Wales in this study and secondly, that the specific sequence types within the B2 group known to cause sepsis in our hospitals are commonly carried in the community."

They further explain: "Most sepsis events start from common community acquired infections such as urinary tract infections (UTIs) that cause bladder and then kidney infections before entering the blood stream (sepsis). The origin of the E. coli organisms causing these infections is typically the patients' own gut. Therefore, if more people in the community are carrying pathogenic B2 types of E. coli such as ST73, ST131 etc we would expect the UTI rate to increase and also subsequently the sepsis rate. This is exactly what appears to be happening."

The authors have also done other work in this area, including a study of the UK, Saudi Arabia and Kazakhstan that indicates that the UK sepsis rates are directly related to carriage of pathogenic E. coli types. They have also published data on rates of B2 carriage in Pakistan and Northern India (which are very low in comparison to the UK), and have an on-going study comparing the situation between the UK and South America (Brazil). In addition, they are currently seeking funding to do a UK wide study.

On the root causes of the different rates of E. coli carriage in different places, they say: "Our previous research has shown very low rates of human carriage of pathogenic E. coli strains in other geographic locations such as Saudi Arabia, Bangladesh, Pakistan and India. This may reflect differences in E. coli predators, such as bacteriophages, between different human communities and locations. Similarly, diverse mixes of cultures and ethnicities in global hubs such as London would allow mixing of bacterial and bacteriophage populations, altering the prevalence of individual E. coli strains."

They add: "Rates are also very high in the North of England, Scotland and Northern Ireland. We are currently planning the logistics of doing an in-depth UK wide study incorporating the majority of sewage works in the UK and additionally measuring resistance rates at different places."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

KIST and UNIST joint research team develop a high-capacity battery material using salmon DNA

image: A Korean research team has succeeded in developing a next-generation high-capacity cathode material for lithium-ion batteries. The team stabilized the surface of over-lithiated layered oxides using DNA from salmon and carbon nanotubes, and confirmed improved performance and lifespan through the use of integrated advanced analytical techniques.

Image: 
Korea Institute of Science and Technology (KIST)

A Korean research team has succeeded in developing a next-generation high-capacity cathode material for lithium-ion batteries. The Korea Institute of Science and Technology (KIST, Acting President Seok-jin Yoon) announced that the joint research team of Dr. Kyung Yoon Chung (head of the Center for Energy Storage Research at KIST), Prof. Sang-Young Lee (Professor at the Ulsan National Institute of Science and Technology (UNIST)), and Dr. Wonyoung Chang (Principal Researcher at the Center for Energy Storage Research at KIST) has developed a high-performance cathode material by stabilizing the surface of *over-lithiated layered oxides (OLO) using the DNA of salmon.

*Over-lithiated layered oxides (OLO): materials that contain an excess of lithium, created by substituting lithium for some of the transition metal atoms in the material's layered structure.

In a lithium-ion secondary battery, the amount of lithium ions moving back and forth between the cathode and anode during charging and discharging determines the energy density of the battery system. In other words, developing a high-capacity cathode material is essential to increasing the capacity of a lithium-ion battery.

Over-lithiated layered oxides (OLO) have a high reversible capacity of 250 mAh/g (compared with the roughly 160 mAh/g of existing commercialized materials) and have long received attention as a next-generation cathode material that could improve the energy storage capacity of batteries by more than 50%. However, OLO have a major weakness: during charge/discharge cycling, their layered structure can collapse, causing swelling and rendering the battery unusable.
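
As a quick sanity check, the quoted improvement follows directly from the two capacity figures above (a simple illustration, not code from the study):

```python
# Capacity figures quoted above, in mAh/g.
OLO_CAPACITY = 250.0          # over-lithiated layered oxide
COMMERCIAL_CAPACITY = 160.0   # existing commercialized cathode material

# Relative gain in reversible capacity.
gain_pct = (OLO_CAPACITY - COMMERCIAL_CAPACITY) / COMMERCIAL_CAPACITY * 100
print(f"Capacity gain: {gain_pct:.2f}%")  # 56.25%, consistent with "more than 50%"
```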

The KIST research team used **transmission electron microscopy to analyze changes in the crystallographic structure in specific regions from the surface to the interior of the OLO particles. The analysis confirmed that the metal layers of the OLO begin to collapse at the surface under repeated charge/discharge cycling.

**Transmission electron microscopy: a technique that provides morphological, crystallographic, and elemental information about materials down to the atomic scale, using the diffraction of electrons accelerated by a high voltage.

The joint research team used salmon DNA, which has a strong affinity for lithium ions, to control the OLO's surface structure, the cause of the material's degradation. However, salmon DNA tends to aggregate in aqueous solutions. To solve this problem, the team synthesized a composite coating material combining ***carbon nanotubes (CNTs) with the salmon DNA. The DNA/CNT mixture attached uniformly to the surface of the OLO, resulting in a new cathode material.

***Carbon nanotube: a cylindrical nanostructure consisting only of carbon atoms.

The research team at KIST applied integrated advanced analytical techniques (investigating a range of scales, from individual particles to whole electrodes) and found that both the electrochemical characteristics and the structural stability of the OLO were improved. In situ X-ray analysis of the developed OLO confirmed that structural degradation was suppressed during charge/discharge cycling and that thermal stability was improved.

Professor Sang-Young Lee from UNIST said of the significance of the development, "Unlike preexisting attempts, this study uses DNA, the basic unit of life, suggesting a new direction for the development of high-performance battery materials." Kyung Yoon Chung, head of the Center for Energy Storage Research at KIST, said, "This research is very meaningful as it presents design factors for a stabilized high-capacity cathode material using integrated advanced analytical techniques. Based on this research, we will devote more efforts to developing a new material that can replace existing commercialized materials."

The work done at UNIST was supported by the U.S. Army Research Office (ARO), the Basic Science Research Program, and the Wearable Platform Materials Technology Center through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT, and Future Planning. This work was also supported by the Korea Forest Research Institute and Batteries R&D of LG Chem. The work done at KIST was supported by the Technology Development Program to Solve Climate Changes of the National Research Foundation (NRF), funded by the Ministry of Science & ICT, and by the KIST Institutional Program. The results of the research were published as a front cover article in the latest issue of Advanced Energy Materials (March 3, Volume 10, Issue 9), a prestigious international journal on energy (IF: 24.884, top 1.69% of JCR).

Credit: 
National Research Council of Science & Technology

Lighting the way to safer heart procedures

In the first study of its kind, Johns Hopkins researchers provide evidence that an alternative imaging technique could someday replace current methods that require potentially harmful radiation.

The findings, published in the April issue of IEEE Transactions on Medical Imaging, detail success in a heart procedure but could potentially be applied to any procedure that uses a catheter, such as in vitro fertilization, or to surgeries using the da Vinci robot, where clinicians need a clearer view of large vessels.

"This is the first time anyone has shown that photoacoustic imaging can be performed in a live animal heart with anatomy and size similar to that of humans. The results are highly promising for future iterations of this technology," says Muyinatu Bell, assistant professor of electrical and computer engineering at The Johns Hopkins University, director of the Photoacoustic & Ultrasonic Systems Engineering (PULSE) Lab, and the study's senior author.

Bell's team of PULSE Lab members and cardiologist collaborators tested the technology during a cardiac intervention, a procedure in which a long, thin tube called a catheter is inserted into a vein or artery, then threaded up to the heart to diagnose and treat various heart diseases such as abnormal heartbeats. Doctors currently most commonly use a technique called fluoroscopy, a sort of X-ray movie, which shows only the shadow of where the catheter tip is and provides no detailed depth information. Additionally, Bell adds, this visualization technology requires ionizing radiation, which can be harmful to both the patient and the doctor.

Photoacoustic imaging, simply explained, is the use of light and sound to produce images. When energy from a pulsed laser lights up an area in the body, that light is absorbed by photoabsorbers within the tissue, such as the protein that carries oxygen in blood (hemoglobin), which results in a small temperature rise. This increase in temperature creates rapid heat expansion, which generates a sound wave. The sound wave can then be received by an ultrasound probe and reconstructed into an image.
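
The chain of events described above is often summarized by the standard photoacoustic relation p0 = Gamma * mu_a * F, where Gamma is the Grüneisen parameter, mu_a the optical absorption coefficient, and F the laser fluence. The sketch below is a generic illustration with assumed parameter values, not numbers from this study:

```python
# Initial photoacoustic pressure rise: p0 = Gamma * mu_a * F.
# All parameter values below are illustrative assumptions, not from the study.
GAMMA = 0.2      # Grueneisen parameter of soft tissue (dimensionless, ~0.2)
MU_A = 200.0     # optical absorption coefficient of the absorber, 1/m
FLUENCE = 100.0  # laser fluence delivered to the tissue, J/m^2

p0 = GAMMA * MU_A * FLUENCE  # pressure rise in pascals
print(f"Initial pressure rise: {p0 / 1000:.1f} kPa")  # 4.0 kPa
```

It is this small, localized pressure rise that launches the sound wave the ultrasound probe picks up.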

Past studies of photoacoustic imaging mostly looked at its use outside of the body, such as for dermatology procedures, and few have tried using such imaging with a laser light placed internally. Bell's team wanted to explore how photoacoustic imaging could be used to reduce radiation exposure by testing a new robotic system to automatically track the photoacoustic signal.

For this study, Bell's team first placed an optical fiber inside a catheter's hollow core, with one end of the fiber connected to a laser to transmit light; this way, the optical fiber's visualization coincided with the visualization of the catheter tip.

Bell's team then performed cardiac catherization on two pigs under anesthesia and used fluoroscopy to initially map the catheter's path on its way to the heart.

Bell's team also successfully used robotic technology to hold the ultrasound probe and maintain constant visualization of the photoacoustic signal, receiving image feedback every few millimeters.

Finally, the team examined the pigs' cardiac tissue after the procedures and found no laser-related damage. While the team needs to perform more experiments to determine whether the robotic photoacoustic imaging system can be miniaturized and used to navigate more complicated pathways, as well as perform clinical trials to definitively prove safety, they say these findings are a promising step forward.

"We envision that ultimately, this technology will be a complete system that serves the four-fold purpose of guiding cardiologists towards the heart, determining their precise locations within the body, confirming contact of catheter tips with heart tissue and concluding whether damaged hearts have been repaired during cardiac radiofrequency ablation procedures," says Bell.

Credit: 
Johns Hopkins University

Trade friction: Adaptiveness of swarms of complex networks

image: Changes in B2B networks over time.

Image: 
Copyright© 2020 Springer Nature

Trade friction between industries involved in information and communication technology (ICT) has become apparent in recent years, with a striking impact on various industries. Industries and companies in the affected regions must adapt to these economic fluctuations to survive. Such phenomena have been difficult to analyze, however, because the required datasets could not be obtained synchronously and spatiotemporally. Social media and other new forms of data collection are now making more analysis possible in this field.

Research conducted by PhD candidate Yusaku Ogai and Professor Yoshiyuki Matsumura of Shinshu University et al. aimed to demonstrate how business-to-business (B2B) networks of industries change due to the exchange rate, an indicator of economic fluctuation. The research used datasets from the Japanese textile and apparel industry to show the statistical properties of B2B networks and the changing relationships with the USD/JPY exchange rate.

Some previous studies of complex networks have shown that networks consist of core and peripheral parts. Movie-industry networks of creators comprise core and peripheral networks (Cattani and Ferriani 2008). Studies of Twitter account networks have also shown that the network structures are scale-free (Ikegami et al. 2017) and comprise core and peripheral networks. The behavior of core and periphery has also been discussed in relation to the adaptiveness of swarms and the process of generating new ideas (Craig 1987). The research by the Matsumura Lab builds on these studies by examining adaptive behavior in complex networks comprising core and peripheral parts under economic fluctuation. There have been studies of how the exchange rate affects the performance of Japanese companies, but numerical studies focused on the B2B networks themselves are new.

The study found that the cores of networks are more adaptive to environmental changes, in the sense of complex systems. However, adaptive behavior alone cannot create new ideas or methods for adjusting to environmental change. When the most adaptive communities cannot adjust, the peripheries can introduce new ideas and methods to the entire system. This is why interactions and networking between core and periphery are important. An analogous balance appears in multi-agent systems, where reinforcement learning is commonly implemented with epsilon-greedy algorithms (Sutton 1990): an agent's control policy combines the most adaptive (exploitative) behavior with exploratory behavior.
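
The epsilon-greedy scheme cited above (Sutton 1990) can be sketched in a few lines; this is a generic illustration of the algorithm, not code from the study:

```python
import random

def epsilon_greedy(estimates, epsilon=0.1, rng=random):
    """With probability epsilon, pick a random action (the exploratory,
    'peripheral' attitude); otherwise exploit the best current estimate
    (the adaptive, 'core' attitude)."""
    if rng.random() < epsilon:
        return rng.randrange(len(estimates))  # exploratory choice
    # Exploitative choice: index of the highest estimated value.
    return max(range(len(estimates)), key=estimates.__getitem__)

# With epsilon = 0 the agent always exploits the best-known option:
print(epsilon_greedy([0.1, 0.9, 0.5], epsilon=0.0))  # -> 1
```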

The first approach examined the power-law statistical properties of the entire network. The results showed that the network consists of a few companies with high degrees of connectivity and many companies with low degrees. The few high-degree companies form the core networks of international trading companies, while the peripheral networks consist of domestic companies with fewer connections.
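
This "few hubs, many low-degree nodes" structure is characteristic of scale-free networks and can be reproduced with a minimal preferential-attachment model (a generic sketch, not the study's dataset or method):

```python
import random
from collections import Counter

def preferential_attachment(n, seed=0):
    """Grow a network in which each new node links to an existing node with
    probability proportional to that node's degree, yielding a few
    high-degree hubs and many low-degree peripheral nodes."""
    rng = random.Random(seed)
    targets = [0, 1]      # each node appears here once per edge endpoint
    edges = [(0, 1)]
    for new_node in range(2, n):
        partner = rng.choice(targets)   # degree-proportional selection
        edges.append((new_node, partner))
        targets.extend((new_node, partner))
    return edges

edges = preferential_attachment(500)
degree = Counter(v for edge in edges for v in edge)
avg = sum(degree.values()) / len(degree)
print(f"average degree {avg:.2f}, largest hub degree {max(degree.values())}")
```

The largest hubs end up far above the average degree, mirroring the core/periphery split described above.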

The second approach found correlations through regression analyses of the network indexes against the USD/JPY exchange rate. This showed that the peripheral networks correlated negatively with the USD/JPY exchange rate, while the core networks correlated positively. These results not only reflect the economics of importing and exporting but also demonstrate how the B2B networks change.
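
The kind of correlation found in this second approach can be illustrated with an ordinary least-squares fit; the series below are synthetic numbers for illustration only, not data from the study:

```python
def slope_and_corr(xs, ys):
    """Ordinary least-squares slope and Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

usd_jpy = [80, 90, 100, 110, 120]        # synthetic exchange-rate series
core_index = [1.0, 1.2, 1.3, 1.5, 1.6]   # synthetic core-network index
slope, r = slope_and_corr(usd_jpy, core_index)
print(f"slope = {slope:.4f}, r = {r:.3f}")  # positive r, as reported for the core
```

A negative r for a peripheral-network index against the same exchange-rate series would correspond to the peripheral result reported above.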

This research elucidated strategies by which complex systems adapt to trade friction, focusing on the Japanese textile and apparel industry to show how global currency movements affect B2B networks as complex structures. A company's network connections affect its performance. Peripheral networks, such as domestic industries, can survive the fluctuating economy by taking the most adaptive strategies, while the core networks do not have to. The variety of ideas and interactions in the networks makes industries sustainable under economic fluctuation.

In future work, the authors will try to demonstrate changing B2B networks using computational models. Such research can be conducted with agent-based modeling, a simulation tool for social science. The ultimate goal of the Matsumura Lab is to evaluate the adaptive power of industrial strategies, for example the adaptive power of swarm strategies of artificial intelligence in the ICT industry.

Credit: 
Shinshu University

Papua New Guinea highland research redates Neolithic period

image: University of Otago Professor of Archaeology Professor Glenn Summerhayes with field crew in Papua New Guinea.

Image: 
University of Otago

A new report published in Science Advances on the emergence of agriculture in highland Papua New Guinea shows that advancements often associated with a later Neolithic period occurred about 1000 years earlier than previously thought.

University of Otago Archaeology Programme Professor and report co-author Glenn Summerhayes says the findings in "Emergence of a Neolithic in highland New Guinea by 5000 to 4000 years ago" provide insights into when and how the highlands were first occupied; the role of economic plants in this process; the development of trade routes that led to the translocation of plants and technologies; and an associated record of landscape, environment and climate change through time.

The report details the earliest figurative stone carving and formally manufactured pestles in Oceania, dating to 5050 to 4200 years ago, which were found at a dig site in Waim. Also found were the earliest planilateral axe-adzes uncovered in New Guinea to date, and the first evidence for fibrecraft and interisland obsidian transfer from neighbouring islands over distances of at least 800km.

"The new evidence from Waim fills a critical gap in our understanding of the social changes and technological innovations that have contributed to the developing cultural diversity in New Guinea," Professor Summerhayes says.

The combination of symbolic social systems, complex technologies, and highland agricultural intensification supports an independent emergence of a Neolithic around 1000 years before the arrival of Neolithic migrants, the Lapita, from Southeast Asia. When considered together with a growing corpus of studies indicating expansion and intensification of agricultural practices, these combined cultural elements represent the development of a regionally distinct Neolithic.

The research establishes dating for other finds at the site, including a fire lighting tool, postholes, and a fibrecraft tool with ochre, possibly used for colouring string fibre.

The report suggests increased population pressure on the uneven distribution of natural resources likely drove this process, which is further inferred by language and genetic divergence.

The project arose out of an Australian Research Council Grant awarded to Dr Judith Field (University of New South Wales) and Professor Summerhayes.

"Former Otago postgraduate student Dr Ben Shaw was employed as a postdoctoral fellow to do the 'leg work in the field', and Dr Anne Ford (Otago Archaeology Programme) contributed to understandings of the stone tool technologies. As it worked out, many of these rich discoveries were made by Dr Shaw. It was one of the best appointments Dr Field and I have ever made. I am proud of our Otago graduates, who are some of the best in the world."

Professor Summerhayes and his team had previously completed a Marsden-funded project in the Ivane Valley of Papua New Guinea, establishing the beginning of human occupation at 50,000 years ago. The results of this work were published in Science in 2010.

"This project is a follow-on where we wanted to construct a chronology of human presence in the Simbai/Kaironk Valley of Papua New Guinea by systematic archaeological survey with subsequent excavation and analysis of a select number of sites.

"This work tracks long-term patterns of settlement history, resource use and trade, and establishes an environmental context for these developments by compiling vegetation histories, with particular attention paid to fire histories, indicators of landscape disturbance and markers of climate variability. This will add to understandings of peoples' impact on the environment."

Professor Summerhayes received a Marsden grant in late 2019 for his project "Crossing the divide from Asia to the Pacific: Understanding Austronesian colonisation gateways into the Pacific". This will involve work in the Ramu Valley, which was once part of an inland sea, and will tie in the developments of Highland New Guinea, with the movements of Austronesian speakers into the Pacific.

Credit: 
University of Otago

The lipid code

image: Molecular probes (in blue) for the analysis of lipid messengers.

Image: 
Schuhmacher et al., MPI-CBG

Lipids, or fats, have many functions in our body: They form membrane barriers, store energy or act as messengers, which regulate cell growth and hormone release. Many of them are also biomarkers for severe diseases. So far, it has been very difficult to analyze the functions of these molecules in living cells. Researchers at the Max Planck Institute of Molecular Cell Biology and Genetics (MPI-CBG) in Dresden and the Leibniz Research Institute for Molecular Pharmacology (FMP) in Berlin have now developed chemical tools that can be activated by light and used to influence lipid concentration in living cells. This approach could enable medical doctors to work with biochemists to identify what molecules within a cell actually do. The study was published in the journal PNAS.

Every cell can create thousands of different lipids (fats). However, little is known about how this chemical diversity contributes to the transport of messages within the cell; in other words, the lipid code of the cell is still unknown. This is mainly due to the lack of methods to quantitatively study lipid function in living cells. Understanding how lipids work is important because they control the function of proteins throughout the cell and are involved in bringing important substances into the cell through the cell membrane. A fascinating aspect of this process is that only a limited number of lipid classes on the inside of the cell membrane act as messenger molecules, yet they relay messages from thousands of different receptor proteins. It is still not clear how this abundance of messages can be recognized and transmitted.

The research groups led by André Nadler at the MPI-CBG and Alexander Walter at the FMP, in collaboration with the TU Dresden, have developed chemical tools to control the concentration of lipids in living cells. These tools can be activated by light. Milena Schuhmacher, the lead author of the study, explains: "Lipids are actually not individual molecular structures, but differ in tiny chemical details. For example, some have longer fatty acid chains and some have slightly shorter ones. Using sophisticated microscopy in living cells and mathematical modelling approaches, we were able to show that the cells are actually able to recognize these tiny changes through special effector proteins and thus possibly use them to transmit information. It was important that we were able to control exactly how much of each individual lipid was involved." André Nadler, who supervised the study, adds: "These results indicate the existence of a lipid code that cells use to re-encode information, detected on the outside of the cell, on the inner side of the cell."

The results of the study could enable membrane biophysicists and lipid biochemists to verify their results with quantitative data from living cells. André Nadler adds: "Clinicians could also benefit from our newly developed method. In diseases such as diabetes and high blood pressure, more lipids that act as biomarkers are found in the blood. This can be visualized with a lipid profile. With the help of our method, doctors could now see exactly what the lipids are doing in the body. That wasn't possible before."

Credit: 
Forschungsverbund Berlin

Virginia Tech's fog harp harvests water even in the lightest fog

image: Jonathan Boreyko and Brook Kennedy inspect a fog harp at Kentland Farm. Photo by Peter Means for Virginia Tech.

Image: 
Virginia Tech

What do you get when you cross a novel approach to water harvesting with a light fog? The answer: a lot more water than you expected.

The development of the fog harp, a Virginia Tech interdisciplinary pairing of engineering with biomimetic design, was first reported in 2018. The hope behind the fog harp's development was simple: in areas of the world where water is scarce but fog is present, pulling usable water from fog could become a sustainable option. While fog nets are already in use, the superior efficiency of the fog harp could dramatically increase the number of regions worldwide where fog harvesting is viable. The difference comes in the fog harp's uncanny ability to derive water from less dense fog than its predecessors.

The partnered approach has been a combination of new design with existing science. The science originated with Assistant Professor Jonathan Boreyko from the Department of Mechanical Engineering within the College of Engineering. His group hypothesized the harp approach and characterized the performance of the harp prototypes. Design development has been led by Associate Professor Brook Kennedy from the Department of Industrial Design in the College of Architecture and Urban Studies. Kennedy's product development and materials knowledge brought the project to the point where it could be prototyped and tested in real-world environments. Early funding came from the Institute for Creativity, Arts, and Technology.

"Billions of people face water scarcity worldwide," Kennedy said. "We feel that the fog harp is a great example of a relatively simple, low-tech invention that leverages insight from nature to help communities meet their most basic needs."

The "harp" design uses parallel wires to collect ambient water from fog, whereas current technology in use around the globe relies primarily on a screen mesh. The lab-proven theory for the new device was that parallel wires are more efficient at gathering water, avoiding clogs and enhancing drainage into the collector. The researchers' small-scale early tests showed that in high-fog conditions, their harps outpaced those with meshes by a factor of two to one.

Testing then literally moved to the field. In the open fields of Virginia Tech's Kentland Farm, then-undergraduate Brandon Hart built roofed structures to prevent rainfall from impacting findings. Under these coverings, fog harps were placed side-by-side with three different mesh harvesters: one with wire diameters equivalent to the harp, one with a wire size more optimal to harvesting, and one using Raschel mesh -- a mesh made of flat-panel ribbons in v-shaped arrays between horizontal supports. This v-shaped mesh is currently the most popular among fog harvesting sites around the world.

Whereas heavy fog conditions were used in the lab, the actual fog conditions surrounding Virginia Tech are generally much lighter. As field tests began, Boreyko and Kennedy were skeptical that the available fog would provide the feedback they needed to do adequate testing. They were pleasantly surprised.

As fog began rolling over the hills of the New River Valley, the fog harps always showed results. In thin fog, the collection pipes of the mesh collectors were completely devoid of drips. Even as fog density increased, the harps continued outperforming their companions. Depending on the density of the fog, this ranged from twice as much output to almost 20 times.

Bringing together lab studies and field data, the researchers determined that collection potential is the result of multiple factors. Greatest among these is the difference in the size of water droplet that mesh and harp designs can collect. In both cases, water must be caught on the mesh or harp as air passes through, then travel downward into collection points by gravity. Fog harps use only vertical wires, creating an unimpeded path for mobile drops. Mesh collectors, by contrast, have both horizontal and vertical wires, and water droplets must grow significantly larger to cross the horizontal pieces. In field tests, droplets on mesh collectors routinely had to reach a size roughly 100 times larger than those on harps before descending. Water that never drops simply evaporates and cannot be collected.

"We already knew that in heavy fog, we can get at least two times as much water," said Boreyko. "But realizing in our field tests that we can get up to 20 times more water on average in a moderate fog gives us hope we can dramatically enhance the breadth of regions where fog harvesting is a viable tool for getting decentralized, fresh water."

Credit: 
Virginia Tech

Cancer drug resistance study raises immune red flags

image: Lead author Mark Sundrud, PhD, associate professor of Immunology and Microbiology at Scripps Research, Florida, examines cells in his lab.

Image: 
Scott Wiseman for Scripps Research

JUPITER, Fla.--April 17, 2020--Sooner or later, most cancer patients develop resistance to the very chemotherapy drugs designed to kill their cancer, forcing oncologists to seek alternatives. Even more problematic, once a patient's tumor is resistant to one type of chemotherapy, it is much more likely to be resistant to other chemotherapies as well, a conundrum long known as multidrug resistance. Once patients reach this point, the prognosis is often poor, and for the last 35 years scientists have attempted to understand and block multidrug resistance in cancer by using experimental medicines.

A new study from scientists at Scripps Research in Florida raises red flags about this strategy. Inhibiting the key gene involved in cancer drug resistance has unintended side effects on specialized immune system cells called CD8+ cytotoxic T lymphocytes (CTLs), the team found. This could dull anti-cancer immune responses, and potentially increase vulnerability to infection, since CTLs are "killer" T cells, essential in the fight against both viral and bacterial infections and tumors, says lead author Mark Sundrud, PhD, associate professor of Immunology and Microbiology at Scripps Research.

Several genes are now recognized for contributing to multidrug resistance in cancer, but the first and most prominent of these is called multidrug resistance-1 (MDR1). Its discovery more than three decades ago set off a race to develop drugs that would inhibit expression of MDR1. But those MDR1 inhibitor drugs have consistently disappointed in clinical trials. The reasons behind these failures have remained enigmatic.

In a new study published Thursday in the Journal of Experimental Medicine, Sundrud and colleagues including Scripps Research immunologist Matthew Pipkin, PhD, suggest that the repeated failure of MDR1 inhibitors in human cancer trials may be due to a previously unrecognized--and essential--function of the MDR1 gene in CD8+ cytotoxic T lymphocytes.

Using new genetic approaches to visualize and functionally assess MDR1 expression in mouse cells, the team found that CTLs were unique in their constant and high-level expression of MDR1. In addition, preventing MDR1 expression in CTLs, or blocking its function using inhibitors previously tested in human cancer trials, sets off a chain reaction of CTL dysfunction, ultimately disabling these cells from fighting off viral or bacterial infections.

Considering that these cells are also necessary for warding off most cancerous tumors, blocking MDR1 with existing inhibitors could also cripple natural immune responses to cancers, Sundrud says.

"With the help of our collaborators at New York University Medical Center, we looked at mouse immune cells from five major lymphoid and nonlymphoid tissues: bone marrow, thymus, spleen, lung, and small intestine," Sundrud says. "It became clear that the types of cells that are key to fighting infections and cancers are among those most sensitive to blocking MDR1 function."

It has been known for decades that CTLs, as well as "natural killer" cells, a type of white blood cell, express high levels of the MDR1 gene. But because MDR1 has historically been viewed only through the lens of creating multidrug resistance in cancer cells, few researchers thought to ask what MDR1 does during normal immune responses; those that did found confusing and often contradictory results, Sundrud says, likely due to the use of non-specific animal model systems.

Convinced that MDR1 might impact natural immune responses, Sundrud and colleagues sought to devise more specific mouse models to directly visualize and functionally characterize MDR1 expression in vivo. Additional experiments revealed that blocking MDR1 function hampered the earliest stages of the CTL response to infections, when these cells multiply rapidly to reach the numbers needed to kill all viral and bacterial invaders. In line with this result, MDR1 inhibition also affected long-lived immunity to infections that have been previously seen and eradicated. It also affected the cells' energy organelles, called mitochondria.

"We think that MDR1 plays a special role in helping mitochondria provide energy to growing cells," Sundrud says. "So, if you take this away, it makes sense that these cells can't support the metabolic demand of cell division, and that they ultimately die."

On one hand, Sundrud says, the research raises questions about the safety and utility of using systemic MDR1 inhibitors as cancer therapies. At the same time, the work reveals important new mechanisms that determine how the immune system fights off infections and develops long-lived memory.

"These insights become all the more pertinent today, given all the questions and concerns related to immunity against the pandemic coronavirus that causes COVID-19," Sundrud says.

The team is now looking to use this new knowledge to finally nail down a unifying function of MDR1 in all cells, whether it is in CTLs responding to infections, or cancer cells trying to deal with chemotherapeutic agents.

In the shorter term, Sundrud and colleagues plan to explore new approaches to re-design existing MDR1 inhibitors to specifically target only cancer cells.

"This way you might be able to prevent multidrug resistance in cancer cells, without affecting the immune cells that are trying to fight off the tumor," Sundrud says.

Credit: 
Scripps Research Institute

Under pressure: New bioinspired material can 'shapeshift' to external forces

image: In the JHU team's experiment, increased force (downward arrow) applied to the material generated more electrical charge and, in turn, more mineralization.

Image: 
Pam Li/Johns Hopkins University

Inspired by how human bone and colorful coral reefs adjust their mineral deposits in response to their surroundings, Johns Hopkins researchers have created a self-adapting material that changes its stiffness in response to applied force. This advance could one day open the door to materials that self-reinforce in anticipation of increased force or to stop further damage.

A report of the findings was published today in Advanced Materials.

"Imagine a bone implant or a bridge that can self-reinforce where a high force is applied without inspection and maintenance. It will allow safer implants and bridges with minimal complication, cost and downtime," says Sung Hoon Kang, an assistant professor in the Department of Mechanical Engineering, Hopkins Extreme Materials Institute, and Institute for NanoBioTechnology at The Johns Hopkins University and the study's senior author.

While other researchers have attempted to create similar synthetic materials before, doing so has proved challenging: such materials are difficult and expensive to create, require active maintenance once made, and are limited in how much stress they can bear. Materials with adaptable properties, like those of wood and bone, could provide safer structures, save money and resources, and reduce harmful environmental impact.

Natural materials can self-regulate by using resources in the surrounding environment; for example, bones use cell signals to control the addition or removal of minerals taken from blood around them. Inspired by these natural materials, Kang and colleagues sought to create a materials system that could add minerals in response to applied stress.

The team started by using materials that convert mechanical force into electrical charge as scaffolds, or support structures, which generate charge proportional to the external force placed on them. The hope was that these charges could serve as signals telling the material to begin mineralizing from mineral ions in the surrounding environment.

Kang and colleagues immersed polymer films of these materials in a simulated body fluid mimicking the ionic concentrations of human blood plasma. After the materials incubated in the simulated body fluid, minerals started forming on their surfaces. The team also discovered that they could control which minerals formed by adjusting the fluid's ion composition.

The team then set up a beam anchored at one end so that the applied stress increased gradually from one end of the material to the other, and found that regions under more stress accumulated more mineral; the mineral height was proportional to the square root of the applied stress.
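The square-root scaling reported above has a simple consequence: quadrupling the stress only doubles the mineral height. The short sketch below illustrates that relationship; the proportionality constant and stress values are made up for illustration and are not taken from the study.

```python
import math

def mineral_height(stress, k=1.0):
    """Predicted mineral height under the reported scaling h = k * sqrt(stress).

    The constant k is a hypothetical placeholder; the actual calibration
    comes from the measurements reported in the Advanced Materials paper.
    """
    return k * math.sqrt(stress)

# Quadrupling the stress doubles the predicted mineral height:
h_low = mineral_height(100.0)
h_high = mineral_height(400.0)
ratio = h_high / h_low  # sqrt(400) / sqrt(100) = 2.0
```

This kind of sublinear growth means the material reinforces high-stress regions preferentially, but with diminishing returns as stress rises.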

Their methods, the researchers say, are simple, low-cost and don't require extra energy.

"Our findings can pave the way for a new class of self-regenerating materials that can self-reinforce damaged areas," says Kang. Kang hopes that these materials can someday be used as scaffolds to accelerate treatment of bone-related disease or fracture, smart resins for dental treatments or other similar applications.

Additionally, these findings contribute to scientists' understanding of dynamic materials and how mineralization works, which could shed light on ideal environments needed for bone regeneration.

Credit: 
Johns Hopkins University