Tech

Abandoned cropland helps make Europe cooler

If you've ever sat in the cool shade of a tree on a hot summer day, you already know that shaded areas are cooler than open fields. But is that kind of cooling enough to make a difference in the hotter world of the future?

When a team of researchers looked at more than 20 years of recent land use changes for Europe and combined that with a climate model to provide information on temperatures during the same period, they found the answer to this question is a clear yes.

"When we put all the land cover changes together and looked at how these affected climate, we found a widespread seasonal cooling -- up to one degree C in the summer -- in western Europe," said Francesco Cherubini, the senior author of a newly published paper on the findings in Nature Communications and head of the Industrial Ecology Programme at the Norwegian University of Science and Technology.

Cherubini and his colleagues say this kind of information is vital to helping Europe plan policies that will encourage the right kind of land use for a warmer future.

"We can couple the global challenge of mitigation with the local need for climate adaption if we choose the right combination of land uses," he said.

The article is entitled "Predominant regional biophysical cooling from recent land cover changes in Europe."

More than just CO2

Cherubini was one of the lead authors of the IPCC special report on Climate Change and Land published last autumn, so he's well aware of the role that land use plays in determining local and regional climate.

The 2019 IPCC land report demonstrated that land use can help stabilize temperature rises to a relatively low level, he said.

For example, the IPCC study showed that decreasing the amount of land used for grazing animals can free up land for growing forests, which soak up CO2 as they grow.

The new study goes beyond looking at how land can help store CO2, however, by looking at other ways in which land cover affects the climate.

"Usually we look at carbon in or carbon out," Cherubini said. "But here we assess the other effects through which the land interacts with climate systems, not just carbon."

These other effects include how different kinds of land cover reflect or absorb sunlight -- which clearly affects surface temperature -- along with humidity levels and evapotranspiration. Evapotranspiration describes the combined water loss from evaporation from water bodies and from transpiration, the water that trees release through their leaves.

All these factors are important, he says, because policy makers need to look at all the different pieces of the climate puzzle, not just carbon dioxide.

"By having policies that only focus on carbon, you completely overlook these other effects, which are important from a regional climate perspective," he said.

"The ambition here is to have land management planning, where you can tackle the global challenges of carbon storage through land management, combined with strategies that have local cooling benefits," he said.

Climate model and satellite data

The researchers relied on the European Space Agency's satellite information on land cover, which has data on changes in vegetation cover from 1992 to 2015.

This incredibly detailed dataset allowed the researchers to map land cover for the 24-year period under eight broad categories: evergreen needleleaf forest, deciduous broadleaf forest, open shrubland, cropland, urban and built-up, cropland/natural vegetation mosaic, wetland, and grassland.

They then combined these maps of vegetation changes with a regional climate model that simulated the climate for the same 24-year period.

"The model used actual observed atmospheric conditions," said Bo Huang, a postdoc at the Industrial Ecology Programme and first author of the paper. "This gave us realistic information about how the changes we saw in land cover also affected changes in climate over the period."

They were also able to compare their results with other empirical studies from different parts of Europe, which confirmed their findings.

Area of cropland loss the size of Switzerland

The researchers found that approximately 25 million hectares (Mha) of agricultural land was abandoned in Europe during the 24 years for which they had data, although cropland expansion elsewhere in Europe of about 20 Mha meant that the net loss of cropland was 5 Mha. That net loss is a little larger than the area of Switzerland.

When cropland was abandoned, it was mostly taken over by forests and, to a lesser extent, urban settlements. Cherubini said cropland was abandoned mainly because of socioeconomic factors.

"People might have gotten tired of living in the countryside, or they don't want to work on their farm anymore," he said. "We saw this especially in the former Soviet Union after the fall of the (Berlin) Wall, because farmers were exposed to agricultural trade and international markets."

As a consequence of agricultural abandonment, forested areas in Europe increased by about 23 Mha, for a net gain of about 7 Mha. Some of these gains in forest area resulted when trees colonized wetlands and peatlands that had dried out over the period due to warmer summers and less precipitation. This last change -- the drying out of wetlands in eastern Europe -- also had significance for temperatures there, especially in the summer.

Cooler in western Europe, warmer in eastern Europe

When the researchers put all their data together, they saw that cropland abandonment in western Europe was associated with a regional cooling of roughly 1 degree C in the spring and summer, and lesser amounts of cooling in the autumn and winter.

But eastern Europe, especially in the northeast, showed the opposite trend with warming of up to 1 degree C in some areas during the spring and summer.

The reason for this warming is partly because wetlands in this region are drying out, said Xiangping Hu, a researcher at NTNU's Industrial Ecology Programme and one of the paper's contributing authors.

"When the sun shines on a 'wet' wetland, much of the energy from the sun goes to evaporating the water in the wetland rather than heating the surface of the wetland," he said. "In a 'dry' wetland, most of the sun's energy goes to heating the surface of the wetland, so the air above it also warms."

The researchers saw this clear trend in their temperature modelling of the area.
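
Hu's point can be expressed with a simple surface energy balance. The sketch below is a toy illustration, not a calculation from the paper: it splits an assumed amount of absorbed solar energy between evaporation (latent heat) and surface heating (sensible heat) using illustrative Bowen ratios for a wet versus a dried-out wetland.

# Toy surface energy balance: drying shifts the sun's energy from evaporating
# water to heating the surface. Bowen ratio values are illustrative assumptions.
def partition(net_radiation_w_m2, bowen_ratio):
    """Split net radiation into sensible (heating) and latent (evaporation) flux."""
    sensible = net_radiation_w_m2 * bowen_ratio / (1.0 + bowen_ratio)
    latent = net_radiation_w_m2 - sensible
    return sensible, latent

for label, bowen in [("wet wetland", 0.3), ("dried-out wetland", 2.0)]:
    sensible, latent = partition(500.0, bowen)     # assume 500 W/m2 of absorbed energy
    print(f"{label}: {sensible:.0f} W/m2 heats the surface, "
          f"{latent:.0f} W/m2 goes to evaporation")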

One of the main outcomes of the study, Cherubini said, was showing that the climate response to vegetation regrowth differs between eastern and western Europe because of different local conditions.

For example, eastern Europe is drier than western Europe, so when trees revegetate cropland, they don't have access to as much soil water for transpiration as their counterparts in western Europe. That difference is enough to overcome the benefits of cropland abandonment in eastern Europe, which is another reason why the researchers' analysis showed warming in eastern Europe but cooling in western Europe with cropland abandonment.

In contrast to both eastern and western Europe, however, Scandinavia showed relatively little change in temperatures linked to land cover changes over the period. That's because there was little change in land use, the researchers found.

Creating win-win situations

An awareness of these local and regional effects can allow European policymakers to create incentives that will help mitigate temperature increases to come.

For example, in northern Europe, policymakers could find ways to prevent wetlands from drying out as a way to limit temperature increases, Cherubini said. In western Europe, he said, policymakers could have "specific planning and incentives for revegetation of open land, considering the local cooling benefits as a synergy of global climate change mitigation."

Lowering food waste and farming existing agricultural land more efficiently would also reduce the amount of land needed for primary agricultural production.

Cherubini pointed out that warming is occurring much faster over land than the global average.

"We are already at a mean warming of about 1.8 degrees C on the land, and we will be about 3 degrees on the land even if we are successful at stabilizing the average global temperature at 1.5 degrees C," he said. "That means we need to take action to adapt to a warming climate, and land use planning is one action that can bring local cooling benefits."

"The message is quite clear," Cherubini said. "Abandoned cropland -- or land cover change more generally -- and its role in regional climate can help to us adapt and mitigate the effects of climate change. And by improving agricultural systems, we can free up land for multiple uses."

Credit: 
Norwegian University of Science and Technology

Researchers make asthma breakthrough

Researchers from Trinity College Dublin have made a breakthrough that may eventually lead to improved therapeutic options for people living with asthma. The researchers have uncovered a critical role for a protein (Caspase-11), which had previously never been implicated in the disease.

They report their findings today [Wednesday 26th February 2020] in the leading journal Nature Communications.

Lead author Zbigniew Zaslona, working with a team led by Luke O'Neill, Professor of Biochemistry in the School of Biochemistry and Immunology in the Trinity Biomedical Sciences Institute, has been exploring the role that inflammation plays in asthma - a very common and often serious disease of childhood.

Ireland has one of the highest incidences of asthma in Europe, and in its most severe form the disease remains difficult to treat and can be fatal. Caspase-11 is a protein with an important role in defending against bacteria, but the team in Trinity has found that when it is over-active it can provoke a damaging inflammatory reaction. When this happens, it is likely to be a key driver of allergic inflammation in the lungs of asthmatics.

Dr Zaslona said:

"Caspase-11 can cause cells to die, which is a very inflammatory event as the cells then release their contents, which can irritate tissues in our body. We have found that Caspase-11 is a key driver of inflammation in the airways in asthma. This causes the signs and symptoms of asthma which most notably involves difficulty breathing."

Although symptoms of mild asthma can be managed with current therapies, severe asthma remains very difficult to treat and asthma rates are constantly on the rise.

Dr Zaslona added:

"A variety of irritants such as airborne pollutants, certain types of pollen and house dust mites can induce cell death in the lungs. Our work suggests that Caspase-11 is sensing these noxious things and causing disease."

Professor O'Neill said:

"Caspase-11 - or it's human equivalent, which is Caspase-4 - has never been implicated in asthma before so we think it holds great promise as a possible target for new drugs to treat this common, debilitating disease."

Credit: 
Trinity College Dublin

MOF co-catalyst allows selectivity of branched aldehydes of up to 90%

image: Micropores of MOFs with certain topologies increase the density of the olefins while partially preventing the adsorption of the synthesis gas.

Image: 
@PSI

Efforts to develop heterogeneous catalysts that appeal to the fine chemical industry have been limited by underwhelming results. Although some approaches have shown promising catalytic activity, "heterogenization" itself is not enough. To be adopted by industry, heterogeneous catalysts must promote selectivity that is difficult or even impossible to obtain with existing catalytic systems--the chemical properties of any proposed heterogeneous catalysts must go beyond easier separation and recycling.

The chemical flexibility, tuneable pore size, and chemical and structural stability of metal-organic frameworks (MOFs) make them ideal for designing active sites at the molecular level. Able to selectively adsorb different molecules depending on their structure, they can direct selectivity and reaction performance. Many promising catalytic applications using MOFs as precursors for novel materials, as well as model systems for understanding heterogeneous catalysis processes, have been described. The field of catalysis by MOFs is still in its infancy, though, since most examples are proofs of concept and do not offer attractive advantages over existing catalysts.

In the paper Metal-organic frameworks as kinetic modulators for branched selectivity in hydroformylation, researchers from the Paul Scherrer Institute's Syncat Group, led by Marco Ranocchiari, and EPFL's Laboratory of Molecular Simulation, a computational group led by Berend Smit, used the example of hydroformylation to show how the adsorption properties of MOFs can be exploited in catalysis to obtain results that are otherwise inaccessible. The methods presented can be used to predict the effect of such microporous co-catalysts in increasing selectivity in any homogeneous or heterogeneous catalytic reaction.

Hydroformylation, or oxo synthesis, is an industrial process for obtaining aldehydes from olefins. Current catalytic processes yield both linear aldehydes, which are key intermediates for the detergent and polymer industry, and branched ones, which are considered a powerful tool for the fine chemical industry because of their possible use in producing enantioenriched products, that is, products featuring a greater proportion of a given enantiomer of a chiral substance.

The linear isomers are typically formed with rhodium catalysts. Branched aldehydes are formed with rhodium catalysts bearing bidentate ligands with directing groups that enhance selectivity. Producing the sought-after branched isomers without these directing groups is still a challenge, though, and can only be achieved with complex Rh catalysts. These have been shown to give, for instance, a selectivity of up to 75% for 2-methylhexanal from 1-hexene and up to 86% for 2-methylbutanal from 1-butene.
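
Here "branched selectivity" means the share of the aldehyde product that is the branched isomer rather than the linear one. A quick illustration of the arithmetic, with hypothetical amounts rather than figures from the paper:

# Branched selectivity = branched aldehyde / (branched + linear aldehyde).
# The amounts below are hypothetical, chosen only to show the calculation.
branched_mmol = 9.0
linear_mmol = 1.0
selectivity = branched_mmol / (branched_mmol + linear_mmol)
print(f"branched selectivity: {selectivity:.0%}")                  # 90%
print(f"branched-to-linear ratio: {branched_mmol / linear_mmol:.0f}:1")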

The researchers first screened several catalytic conditions to maximize the yield of the branched product that could be obtained with homogeneous catalysis. They then showed how they could go beyond this limit and achieve much higher branched selectivity by adding MOFs to the reaction mixture. They also tested different MOF topologies to understand the role of the MOF environment in such a change in selectivity.

The group was able to show that the micropores of MOFs push the cobalt-catalyzed hydroformylation of olefins to kinetic regimes that favor high branched selectivity, without the use of any directing groups. The addition of MOFs allowed branched selectivity of up to 90% in these cases, a feat that cannot be achieved with existing catalysts. Monte Carlo and density functional theory simulations combined with kinetic models show that the micropores of MOFs with certain topologies increase the density of the olefins while partially preventing the adsorption of the synthesis gas--this is what leads to the high branched selectivity.

Though the research focused on aldehydes, the methods presented can be used to predict the effect of microporous co-catalysts in increasing selectivity in any homogeneous or heterogeneous catalytic reaction. Researchers can determine the microporous material that has the best chances of increasing selectivity by first choosing those that can adsorb the catalyst while being inert under reaction conditions, and by then using simulations to determine how the microporous materials might change the local concentration of the selectivity determining reactant(s) within the micropores.

Credit: 
National Centre of Competence in Research (NCCR) MARVEL

A tactile robot finger with no blind spots

image: A tactile finger progressing through its manufacturing stages: 3D-printed skeleton, flexible circuit board, transparent silicone layer, and reflective skin.

Image: 
Pedro Piacenza / Columbia Engineering

Columbia Engineers first to demonstrate a robotic finger with a highly precise sense of touch over a complex, multicurved surface.

New York, NY--February 26, 2020--Researchers at Columbia Engineering announced today that they have introduced a new type of robotic finger with a sense of touch. Their finger can localize touch with very high precision over a complex, multicurved surface.

"There has long been a gap between stand-alone tactile sensors and fully integrated tactile fingers--tactile sensing is still far from ubiquitous in robotic manipulation," says Matei Ciocarlie, associate professor in the departments of mechanical engineering and computer science, who led this work in collaboration with Electrical Engineering Professor Ioannis (John) Kymissis. "In this paper, we have demonstrated a multicurved robotic finger with accurate touch localization and normal force detection over complex 3D surfaces."

Current methods for building touch sensors have proven difficult to integrate into robot fingers due to multiple challenges, including difficulty in covering multicurved surfaces, high wire count, or difficulty fitting into small fingertips, thus preventing use in dexterous hands. The Columbia Engineering team took a new approach: the novel use of overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

By measuring light transport between every emitter and receiver, they showed that they can obtain a very rich signal data set that changes in response to deformation of the finger due to touch. They then demonstrated that purely data-driven deep learning methods can extract useful information from the data, including contact location and applied normal force, without the need for analytical models. Their final result is a fully integrated, sensorized robot finger, with a low wire count, built using accessible manufacturing methods and designed for easy integration into dexterous hands.

The study, published online in IEEE/ASME Transactions on Mechatronics, demonstrates the two aspects of the underlying technology that combine to enable the new results. Firstly, in this project, the researchers use light to sense touch. Under the "skin," their finger has a layer made of transparent silicone, into which they shined light from more than 30 LEDs. The finger also has more than 30 photodiodes that measure how the light bounces around. Whenever the finger touches something, its skin deforms, so light shifts around in the transparent layer underneath. Measuring how much light goes from every LED to every diode, the researchers end up with close to 1,000 signals that each contain some information about the contact that was made. Since light can also bounce around in a curved space, these signals can cover a complex 3D shape such as a fingertip.

"The human finger provides incredibly rich contact information--more than 400 tiny touch sensors in every square centimeter of skin!" says Ciocarlie. "That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered--we essentially built a tactile robot finger with no blind spots."

Secondly, the team designed this data to be processed by machine learning algorithms. Because there are so many signals, all of them partially overlapping with each other, the data is too complex to be interpreted by humans. Fortunately, current machine learning techniques can learn to extract the information that researchers care about: where the finger is being touched, what is touching the finger, how much force is being applied, etc.
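
A minimal sketch of that mapping is shown below, using a small fully connected network in PyTorch. The input size, output targets and architecture are assumptions made for illustration; the paper's actual network and training setup may differ.

# Sketch: map the roughly 900 light-transport signals (every LED-photodiode pair)
# to a contact location (x, y, z) plus normal force. Sizes and layers are
# illustrative assumptions, not the authors' model.
import torch
import torch.nn as nn

N_LEDS, N_PHOTODIODES = 30, 30
N_SIGNALS = N_LEDS * N_PHOTODIODES        # roughly 900 overlapping signals

model = nn.Sequential(
    nn.Linear(N_SIGNALS, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 4),                    # outputs: x, y, z, normal force
)

signals = torch.rand(1, N_SIGNALS)        # one fake sensor reading
prediction = model(signals)               # meaningful only after training on labeled touches
print(prediction)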

"Our results show that a deep neural network can extract this information with very high accuracy," says Kymissis. "Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms."

In addition, the team built the finger so it, and others, can be put onto robotic hands. Integrating the system onto a hand is easy: thanks to this new technology, the finger collects almost 1,000 signals, but only needs a 14-wire cable connecting it to the hand, and it needs no complex off-board electronics. The researchers already have two dexterous hands (capable of grasping and manipulating objects) in their lab being outfitted with these fingers--one hand has three fingers, and the other one four. In the next months, the team will be using these hands to try and demonstrate dexterous manipulation abilities, based on tactile and proprioceptive data.

"Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics, and is one of the technologies that, in the longer term, are needed to enable personal robotic assistance in other areas, such as healthcare or service domains," Ciocarlie adds.

Credit: 
Columbia University School of Engineering and Applied Science

England off track to meet government's 2030 smoke-free target

England will fail to be smoke-free by 2030 if current smoking trends continue, according to a report* released today (Tuesday) from Cancer Research UK.

New figures reveal that England is not expected to reach smoke-free until 2037 - seven years behind the ambitious target set last year. And unless smoking in the poorest communities is tackled, health inequalities will remain rife.**

The new projections show around a 20-year gap in smoking rates between the least and most deprived people in England, with the richest expected to achieve smoke-free in 2025, and the poorest not reaching it until the mid-2040s.

The Government hopes that 2030 will be the year when England can call itself smoke-free - getting the overall proportion of adults who smoke down to 5%. Currently, 14% of adults in England smoke cigarettes.*** But plans on how to achieve this are yet to be set out. Reinvesting in stop smoking services and national education campaigns that encourage smokers to quit will be essential, as both have had significant cuts in recent years.

For the Government to reach its ambition, smoking rates need to drop 40% faster than projected. Cancer Research UK believes there are actions the Government must urgently take to achieve this, including a fixed annual charge on the tobacco industry, which would provide funding to reduce the £11 billion burden that smoking-related illnesses cost society in England every year.

This money could help provide more funding for stop smoking services, which give smokers the best support to give up for good. But the Government needs to ensure the tobacco industry is not involved in how this money is spent.

If these actions are taken and the Government reaches its target, there could be around 3.4 million fewer smokers in England by 2030 compared with today.

Dr Katrina Brown, Cancer Research UK statistics manager and report co-author, said: "Our modelling suggests that if the 2030 target is achieved, there could be around 3.4 million fewer smokers in England compared with today. But unless Government acts to make smoking rates fall faster, we're unlikely to reach the target.

"Smoking is the biggest cause of cancer, leading to around 120 cases of cancer in England every day, so it's vital that the government tackles tobacco to prevent illness and suffering."

Smoking takes the lives of around 115,000 people in the UK every year.

Alison Cox, Cancer Research UK's director of cancer prevention, said: "Smoking - and its catastrophic impact on health - remains more common within poorer communities. So more funding is needed to help these disadvantaged groups to quit as they are increasingly being left behind.

"The tobacco industry makes more money every year than Coca Cola, Disney, Google, McDonalds and FedEx combined, while its products continue to kill people. It should be made to pay for the damage it causes, which is why we're calling on the Government to introduce an annual charge on the industry to fund these vital services that will help get England smoke-free by 2030.

"The government must act now if they are to see this smoke-free ambition become a reality."

Credit: 
Cancer Research UK

Electrolyte supplements don't prevent illness in athletes, study finds

Electrolyte supplements popular with endurance runners can't be relied on to keep essential sodium levels in balance, according to researchers at the Stanford University School of Medicine and their collaborators.

Rather, longer training distances, lower body mass and avoidance of overhydration were shown to be more important factors in preventing illness caused by electrolyte imbalances, the researchers found. Their study also showed that hot weather increased the rates of these types of illnesses.

"Electrolyte supplements are promoted as preventing nausea and cramping caused by low salt levels, but this is a false paradigm," said Grant Lipman, MD, professor of emergency medicine at Stanford and director of Stanford Wilderness Medicine. "They've never been shown to prevent illness or even improve performance -- and if diluted with too much water can be dangerous."

Lipman is the lead author of the study, which will be published online Feb. 25 in the Clinical Journal of Sports Medicine. Brian Krabak, MD, a sports and rehabilitation medicine specialist at the University of Washington-Seattle, is the senior author.

Providing care for ultramarathoners

Lipman and several of his co-authors are experienced at providing medical care for ultramarathoners and compete in ultramarathons themselves, so they are familiar with the challenges the athletes face. Ultramarathons are any foot race longer than a marathon. The idea for this study grew out of seeing firsthand how often endurance athletes use electrolyte supplements -- whether taken in pill, powder or liquid form -- and wanting to know if they prevented illness.

"In the past, athletes were told to make sure they're taking electrolyte supplements and drinking as much water as they can," Lipman said. "It was generally thought that that would prevent things like muscle cramping, electrolyte imbalances and dizziness. But there is currently no evidence to show this is true."

Sodium levels that are too high or too low during exercise can harm athletes. This study focused on two conditions: hypernatremia, which occurs when sodium levels are too high and is associated with dehydration, and exercise-associated hyponatremia, or EAH, which is caused by a drop in sodium levels. EAH can lead to altered mental status, seizures, pulmonary edema and even death. There have been 14 such documented fatalities since 1985, according to previous studies.
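
The article does not give the study's exact cutoffs, but the standard clinical definitions are a serum sodium below about 135 mmol/L for hyponatremia and above about 145 mmol/L for hypernatremia. A toy classifier using those assumed thresholds:

# Classify a finish-line sodium reading using standard clinical cutoffs
# (assumed here; the study's exact thresholds are not stated in the article).
def classify_sodium(serum_na_mmol_per_l):
    if serum_na_mmol_per_l < 135:
        return "hyponatremia (EAH range)"
    if serum_na_mmol_per_l > 145:
        return "hypernatremia (dehydration range)"
    return "normal"

for na in (128, 140, 149):
    print(na, "mmol/L ->", classify_sodium(na))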

"Most athletes worry about dehydration, but that won't kill you," Lipman said. "There have been multiple deaths from EAH, not only among endurance athletes but among military, football players and half-marathoners, as well."

'The perfect outdoor lab'

To conduct the study, researchers recruited 266 ultramarathoners from RacingThePlanet's weeklong athletic events, which involve running 155 miles over seven days across rough terrain in extreme weather conditions at different deserts around the world.

"It's the perfect outdoor lab for easily generalizable results," said Lipman, who is on the medical advisory board and is research director for the events. Each of the study participants ran in one of five different races held in 2017 and 2018 in South America, Namibia and Mongolia. Data was collected on the fifth day of the event, when the athletes ran 50 miles. Ninety-eight of the runners competed in temperatures that averaged over 93 F.

"It's a bit crazy," said co-author Patrick Burns, MD, assistant professor of emergency medicine at Stanford and a participant in the trial, who completed one of the colder races -- held in the Patagonia region of Chile. "The race started at 8 a.m. and I finished at 8 p.m. We ran trails through the woods with thousand-foot climbs and multiple river crossings up to your waist. It was cold and overcast and raining. I was soaked. My Achilles tendons were on fire."

Data was collected from athletes at the beginning and end of the 50-mile race, when the exhausted, thirsty participants finally crossed the finish line. Prior to the race, the participants had been asked what electrolyte supplements they planned to use, how often they planned to take them and what their drinking strategy was -- whether they planned to drink at regular intervals or just when they got thirsty. They also reported their previous training programs and were weighed. At the finish line, before the runners hydrated or rested, researchers weighed them again and asked how closely they had followed their plans for drinking and taking supplements. A blood sample was also taken to measure sodium levels.

"People have different strategies in these races," Lipman said. "Some people take a salt tablet every hour. Some prefer to put the supplements in one water bottle, then alternate with a bottle with just water. Some like a diluted mixture with powder or tablets. There are multiple different methods. However, most electrolyte strategies end up with a drink that has a lower sodium concentration than what is found in the body. This is why drinking too much electrolyte solutions can result in EAH. "

Sodium's essential roles

Sodium plays several essential roles in the body, such as maintaining blood pressure and regulating the function of muscles and nerves. Keeping sodium levels in balance while exercising is particularly important to prevent a variety of problems, including nausea, muscle cramping, dizziness and fatigue. Both high and low levels can cause these symptoms.

Past evidence has shown that electrolyte supplements don't protect against EAH. Usually, the disorder is caused by drinking too much while exercising, which dilutes salt levels. "The reality is dehydration is not as dangerous as overhydrating," Lipman said. "Dehydration and hypernatremia can cause similar symptoms to EAH, which can be easily confused, especially in the heat, but it's rarely fatal."

Analysis of the data showed that 41 of the athletes had sodium imbalances by the end of the race: 11 were found to have EAH due to too little sodium, and 30 were dehydrated, with too much sodium in their blood. Also, 88% of the sodium imbalances recorded occurred during the hot races, indicating that heat and hydration levels were far more predictive of sodium imbalances than either the manner or type of electrolyte supplements taken. Each of the participants took supplements, although the type, amount and manner of ingestion showed little to no effect on sodium levels.

"Overhydrating can reduce electrolyte levels, and electrolyte supplements aren't going to protect you," said Lipman. "You have to be smart while exercising, especially in the heat when you are sweating more and have greater hydration requirements."

Further analysis of the data also showed that participants with EAH had, on average, shorter training programs, weighed more and took five to six hours longer to complete the race. Researchers concluded that running in hot temperatures was an independent risk factor for illnesses from sodium imbalances, avoidance of overhydration was the most important factor in preventing EAH, and avoidance of dehydration prevented hypernatremia.

Burns said that this study raises questions about exactly what the benefits of electrolyte supplements are, but he still plans to keep using them during athletic competitions until more research is done.

And, as Lipman said, "Listen to your body. Stop drinking if you feel bloated or nauseous." Drink to thirst, not at regularly scheduled intervals, he said.

Credit: 
Stanford Medicine

Seeds in Tibet face impacts from climate change

image: Climate resilience of plants like these wildflowers of the Tibetan Plateau could depend on soil seed bank health.

Image: 
Photo courtesy of Scott Collins.

Seeds offer a level of resilience to the harmful effects of climate change in ecosystems across the globe. When seeds are dropped into the soil, often becoming dormant for many years until they are ready to grow into plants, they become part of the natural storage of seeds in "soil seed banks." These banks have been thought to withstand extreme conditions better than the sprouted vegetation above ground.

A new study published in the Ecological Society of America's journal Ecological Applications examines how warming and increased precipitation (rain and snow) harm the seeds in the ground of the Tibetan Plateau and elsewhere.

"Soil seed banks are essentially the last resort of natural resilience in ecosystems," says Scott Collins, professor at New Mexico University and an author on the paper. "Too often we focus on what we see above ground and base management decisions just on the appearance of the plant community."

The Tibetan Plateau, a place that has been grazed for thousands of years, is an ideal place to study direct and indirect climate effects on vegetation in a fragile environment. The study states that as the highest plateau in the world, averaging over 4,000 meters (about 13,000 feet) in elevation, it is regarded as the third pole of the Earth. The plateau is warming at nearly 1.5 times the global average rate, and annual rainfall has increased in most of its areas.

Because the growing season is relatively short on the plateau, the soil samples and the plant surveys were all collected in one year. Researchers from Lanzhou University in China visited 57 sample collection sites at different elevations and in different ecosystem types in the northeastern part of the plateau. They gathered 1,026 soil samples and surveyed the aboveground plant community, which is composed of the grown plants and reflects the types of seeds dropped into the ground over time. Next, the researchers germinated the samples and grew them in experimental plots to study how different conditions affect the soil seed banks of Tibet.

While some plants appear to grow well under increasing precipitation and warming, these changes have different, harmful effects on the seeds that lie dormant and resilient in the soil.

"Climate change effects the ability of seeds to germinate, grow and survive," says Collins. "Although climate change affects adult plants, seedlings are delicate and stress from climate - drought, freezing, etc. - can cause high mortality of seedlings."

The study states that temperature is a primary factor in controlling seed dormancy. With warmer temperatures, seeds may be triggered to sprout too early when conditions are not ideal for healthy growth. An abnormally warm spell of a few days - which is becoming more common - during an otherwise harsh winter can trigger those seeds to grow but ultimately make them fail. Many seeds might also be triggered to sprout too soon by higher moisture levels in the soil.

Increasing temperature and precipitation can also affect seeds indirectly, by changing the environment around them. Pathogens (microscopic disease-causing organisms) that are harmful to seeds become more prolific under warmer and wetter soil conditions. The acidity of the soil can also change, which strongly affects microbial communities and the abundance of those pathogens. Extra nitrogen in the soil, also brought on by changing conditions, allows some plant species to dominate others and leads to a decline in overall species diversity, which translates to lower diversity of seeds in the ground.

Collins believes the study should compel ecosystem managers and scientists to pay attention to both the direct and indirect effects of global environmental change on belowground systems. "Even when the aboveground community seems badly degraded," he says, "the soil seed bank may still provide an important but underappreciated source of ecosystem resilience following prolonged disturbance."

With continued changing climate conditions, however, that resilience continues to be tested.

Credit: 
Ecological Society of America

Scientists develop enzyme produced from agricultural waste for use as laundry detergent

image: Dr Pattanathu Rahman

Image: 
University of Portsmouth

An international team of researchers has developed an enzyme produced from agricultural waste that could be used as an important additive in laundry detergents.

By using an enzyme produced from a by-product of mustard seeds, they hope to develop a low-cost, naturally derived version of lipase, the second largest commercially produced enzyme, which is used in various industries for the production of fine chemicals, cosmetics, pharmaceuticals, biodiesel and detergents.

Thousands of tons of lipase are used annually in laundry detergents, either as an additive or as a replacement for chemical detergents, because it is eco-friendly and better at removing oil stains without harming the texture of the cloth.

Lipase is one of the most rapidly growing industrial enzymes on the market and is worth $590.5 million. However, the cost of biotechnologically produced lipases has always been a challenge, mainly due to the high cost of feedstocks.

In this collaborative project, Dr Pattanathu Rahman, a microbial biotechnologist from the Centre for Enzyme Innovation at the University of Portsmouth worked with Professor Subudhi and scientists from the Centre for Biotechnology at Siksha O Anusandhan University in Odisha, India, where Dr Rahman is also a visiting Professor.

They examined a lipase produced from mustard oil cakes, which are the by-products of oil extraction from mustard seeds. Oil cakes are a very good resource for growing the microbes that produce enzymes. To produce the lipase enzyme, the researchers fermented the oil cakes with the bacterium Anoxybacillus sp. ARS-1, which lives in the tropical Taptapani hot spring in Odisha, India.

Mustard is the third most produced oilseed crop in the world after soybean and oil palm. The seeds are grown in countries and regions such as Bangladesh, Pakistan and northern India, and the mustard oil extracted from them is used as cooking oil. Oil cakes, the by-products of oil extraction, contain relatively high amounts of protein with small amounts of anti-nutritional compounds such as glucosinolates and their breakdown products, phenolics and phytates.

Dr Rahman said: "We further investigated suitability of the lipase enzyme in detergent formulations. Anoxybacillus sp. ARS-1 produced lipase was found to be stable and resist almost all chemical detergents as well as common laundry detergent such as Ezee, Surf, Ariel and Ghadhi, proving it to be a prospective additive for incorporation in the new detergent formulations."

Credit: 
University of Portsmouth

Instrument may enable mail-in testing to detect heavy metals in water

image: MIT graduate student Emily Hanhauser demonstrates a new device that may simplify the logistics of water monitoring for trace metal contaminants, particularly in resource-constrained regions.

Image: 
Melanie Gonick/MIT

Lead, arsenic, and other heavy metals are increasingly present in water systems around the world due to human activities, such as pesticide use and, more recently, the inadequate disposal of electronic waste. Chronic exposure to even trace levels of these contaminants, at concentrations of parts per billion, can cause debilitating health conditions in pregnant women, children, and other vulnerable populations.

Monitoring water for heavy metals is a formidable task, however, particularly for resource-constrained regions where workers must collect many liters of water and chemically preserve samples before transporting them to distant laboratories for analysis.

To simplify the monitoring process, MIT researchers have developed an approach called SEPSTAT, for solid-phase extraction, preservation, storage, transportation, and analysis of trace contaminants. The method is based on a small, user-friendly device the team developed, which absorbs trace contaminants in water and preserves them in a dry state so the samples can be easily dropped in the mail and shipped to a laboratory for further analysis.

The device resembles a small, flexible propeller, or whisk, which fits inside a typical sampling bottle. When twirled inside the bottle for several minutes, the instrument can absorb most of the trace contaminants in the water sample. A user can either air-dry the device or blot it with a piece of paper, then flatten it and mail it in an envelope to a laboratory, where scientists can dip it in a solution of acid to remove the contaminants and collect them for further analysis in the lab.

"We initially designed this for use in India, but it's taught me a lot about our own water issues and trace contaminants in the United States," says device designer Emily Hanhauser, a graduate student in MIT's Department of Mechanical Engineering. "For instance, someone who has heard about the water crisis in Flint, Michigan, who now wants to know what's in their water, might one day order something like this online, do the test themselves, and send it to a lab."

Hanhauser and her colleagues recently published their results in the journal Environmental Science and Technology. Her MIT co-authors are Chintan Vaishnav of the Tata Center for Technology and Design and the MIT Sloan School of Management; John Hart, associate professor of mechanical engineering; and Rohit Karnik, professor of mechanical engineering and associate department head for education, along with Michael Bono of Boston University.

From teabags to whisks

The team originally set out to understand the water monitoring infrastructure in India. Millions of water samples are collected by workers at local laboratories all around the country, which are equipped to perform basic water quality analysis. However, to analyze trace contaminants, workers at these local labs need to chemically preserve large numbers of water samples and transport the vessels, often over hundreds of kilometers, to state capitals, where centralized labs have facilities to properly analyze trace contaminants.

"If you're collecting a lot of these samples and trying to bring them to a lab, it's pretty onerous work, and there is a significant transportation barrier," Hanhauser says.

In looking to streamline the logistics of water monitoring, she and her colleagues wondered whether they could bypass the need to transport the water and instead transport the contaminants themselves, in a dry state.

They eventually found inspiration in dry blood spotting, a simple technique that involves pricking a person's finger and collecting a drop of blood on a card of cellulose. When dried, the chemicals in the blood are stable and preserved, and the cards can be mailed off for further analysis, avoiding the need to preserve and ship large volumes of blood.

The team started thinking of a similar collection system for heavy metals, and looked through the literature for materials that could both absorb trace contaminants from water and keep them stable when dry.

They eventually settled on ion-exchange resins, a class of material that comes in the form of small polymer beads, several hundreds of microns wide. These beads contain groups of molecules bound to a hydrogen ion. When dipped in water, the hydrogen comes off and can be exchanged with another ion, such as a heavy metal cation, that takes hydrogen's place on the bead. In this way, the beads can absorb heavy metals and other trace contaminants from water.
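
In schematic form, the exchange for a divalent metal such as lead can be written as the generic textbook reaction below (with R-H standing for a resin site carrying a hydrogen ion; this notation is an illustration, not taken from the paper):

\[ 2\,\mathrm{R{-}H} \;+\; \mathrm{Pb^{2+}} \;\rightleftharpoons\; \mathrm{R_{2}Pb} \;+\; 2\,\mathrm{H^{+}} \]

Soaking the beads in acid later pushes this equilibrium back toward the left, which is how the captured metals can be released again for analysis.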

The researchers then looked for ways to immerse the beads in water, and first considered a teabag-like design. They filled a mesh-like pocket with beads and dunked it in water they spiked with heavy metals. They found, though, that it took days for the beads to adequately absorb the contaminants if they simply left the teabag in the water. When they stirred the teabag around, turbulence sped the process somewhat, but it still took far too long for the beads, packed into one large teabag, to absorb the contaminants.

Ultimately, Hanhauser found that a handheld stirring design worked best to take up metal contaminants in water within a reasonable amount of time. The device is made from a polymer mesh cut into several propeller-like panels. Within each panel, Hanhauser hand-stitched small pockets, which she filled with polymer beads. She then stitched each panel around a polymer stick to resemble a sort of egg beater or whisk.

Testing the waters

The researchers fabricated several of the devices, then tested them on samples of natural water collected around Boston, including the Charles and Mystic rivers. They spiked the samples with various heavy metal contaminants, such as lead, copper, nickel, and cadmium, then stuck a device in the bottle of each sample, and twirled it around by hand to catch and absorb the contaminants. They then placed the devices on a counter to dry overnight.

To recover the contaminants from the device, they dipped the device in hydrochloric acid. The hydrogen in the solution effectively knocks away any ions attached to the polymer beads, including heavy metals, which can then be collected and analyzed with instruments such as mass spectrometers.

The researchers found that by stirring the device in the water sample, the device was able to absorb and preserve about 94 percent of the metal contaminants in each sample. In their recent trials, they found they could still detect the contaminants and predict their concentrations in the original water samples, with an accuracy range of 10 to 20 percent, even after storing the device in a dry state for up to two years.
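
The recovery step implies a simple back-calculation from the acid eluate to the original water sample. The numbers below are illustrative assumptions (only the 94 percent uptake figure comes from the article), intended to show the arithmetic rather than the paper's calibration procedure.

# Back-calculate the original water concentration from the metal recovered off
# the device. Volumes and the measured eluate concentration are hypothetical.
uptake_fraction = 0.94           # fraction of metal captured by the device (from the article)
sample_volume_l = 1.0            # hypothetical water sample volume
eluate_volume_l = 0.05           # hypothetical volume of acid used for recovery
eluate_conc_ppb = 188.0          # hypothetical lead concentration measured in the acid

mass_recovered_ug = eluate_conc_ppb * eluate_volume_l              # ppb is roughly ug/L
estimated_water_conc_ppb = mass_recovered_ug / (uptake_fraction * sample_volume_l)
print(f"estimated lead in original sample: {estimated_water_conc_ppb:.1f} ppb")   # about 10 ppb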

With a cost of less than $2, the researchers believe that the device could facilitate transport of samples to centralized laboratories, collection and preservation of samples for future analysis, and acquisition of water quality data in a centralized manner, which, in turn, could help to identify sources of contamination, guide policies, and enable improved water quality management.

Credit: 
Massachusetts Institute of Technology

The do's and don'ts of monitoring many wildlife species at once

image: UMass Amherst ecologist Kadambari Devarajan setting up a trail camera for her field work in Zambia's grasslands, where she used the traps to study a community of carnivores including lions, leopards, cheetahs and hyenas.

Image: 
UMass Amherst/K. Devarajan

AMHERST, Mass. - A new analysis of 92 studies from 27 countries conducted by ecologists at the University of Massachusetts Amherst suggests that many recent multi-species studies of wildlife communities often incorrectly use the analytical tools and methods available.

Technologies such as trail cameras and drones have "revolutionized wildlife monitoring studies" in recent years, says organismic and evolutionary biology doctoral student Kadambari Devarajan, who led the study, "but if not properly used in well-designed research, they will compromise the reliability of the results obtained."

Devarajan and co-authors report that the number of studies reporting community-level data has dramatically increased from fewer than five in 2009 to more than 50 in 2019. They believe that given the growth of ecological studies at the community level, it is important to identify the pitfalls that could arise from incorrectly applying certain methods and inconsistencies in reporting results, as well as what should be improved going forward.

Reporting in a special issue of the journal Ecography, Devarajan and her collaborators Toni Lyn Morelli of UMass Amherst's Northeast Climate Adaptation Science Center and Simone Tenan of the Museo delle Scienze, Trento, Italy, considered how published studies approached a variable known as 'species occupancy,' which among other things takes into account how easily detected each species is in the study area.

Devarajan explains, "If it's rare you can account for that, and if it's not rare, you can account for that as well, but a very important assumption made in occupancy is that you must identify the species correctly." However, that didn't always happen in the sample of studies they analyzed. "We lay out the consequences of violating assumptions associated with this method, and how to handle them correctly," she adds.

Devarajan, who has done field studies on carnivore communities in Zambia and India, points out that as methods and tools used in the past to study single species are extended to entire wildlife communities, there can be "a disconnect," reducing the accuracy and precision of the inferences made. Her new paper not only points out difficulties but also offers recommendations and guidelines for future work, with a focus on multi-species occupancy models.

As she explains, "A trail camera set up to study black bears often captures the presence of other wildlife in the area, such as deer, bobcats, coyotes and rabbits. With the use of appropriate methods, scientists can use the trail camera data to simultaneously monitor populations of the entire wildlife community. At a time when human-driven climate and global change are affecting wildlife in myriad ways, multispecies methods are increasingly important as they are often more cost- and resource-efficient than conservation efforts for individual species."

In their analysis, Devarajan and colleagues show that under-reporting of necessary details about community-level patterns of wildlife occurrence was common. Researchers also failed to report crucial aspects of study design, such as justifying the choice of study sites and the duration of the project, as well as the placement of detectors such as camera traps and how long they were deployed.

With a background in computer science now applied to ecology and conservation, Devarajan also observed reporting bias and an increased potential for violation of assumptions. These factors could result in over- or under-estimating wildlife numbers, leading to spurious results, she notes. "Scientists are not always reporting assumptions and justifying the methods they used to support their conclusions, due to which the inferences drawn from such studies of wildlife communities have to be viewed with caution," she points out.

"To do these community-level studies well, you have to make certain assumptions and if you have a lot of reporting bias, there is increased potential for violating those assumptions," she adds. For example, one important assumption for mapping the distribution of multiple species in a defined area is correctly identifying the different species.

"In our paper we lay out the consequences for violating assumptions by looking at 10 years of studies of wildlife communities. We use this review of past studies to come up with a roadmap for how such studies should ideally be conducted and offer recommendations to make these studies more robust," she says.

Other issues explored by Devarajan and colleagues include whether research groups included a local collaborator for field studies, the kinds of organisms studied and how many community-level occupancy models looked at marine and aquatic environments compared to terrestrial ones. "It was good news to see that for many of the studies, there is diversity in the research and inclusivity in taxa," she notes.

The researchers report that the studies they examined ranged from plants and insects to birds, mammals and reptiles, "a whole variety of taxa," but one bias they identified was little representation of marine or aquatic communities and an over-reliance on terrestrial systems, Devarajan notes.

Credit: 
University of Massachusetts Amherst

Stabilizing freeze-dried cellular machinery unlocks cell-free biotechnology

image: Preservative formulations have been discovered to improve storage of cell-free components at room temperature. Through the use of a machine learning algorithm, researchers can now identify preservatives that will enable their cell-free biotechnology applications outside of the lab for on-demand protein synthesis, point-of-care biosensing or therapeutic production, and biochemical education.

Image: 
Nicole Gregorio

Researchers at California Polytechnic State University have developed a low-cost approach that improves cell-free biotechnology's utility for bio-manufacturing and portability for field applications.

Cell-free protein synthesis (CFPS) is a biotechnology that harnesses active cellular machinery in a test tube without the presence of living cells, allowing researchers to directly access and manipulate biochemical processes. Scientists and engineers are looking to utilize cell-free biotechnology for numerous applications including on-demand biomanufacturing of biomaterials and therapeutics, point-of-care diagnostics of disease biomarkers and environmental pollutants, and transformative biochemical education platforms.

Cell-free biotechnology researchers have already made many of these applications a reality in the lab, but getting them to work in the field, clinic and classroom is more difficult. The cellular machinery extracted for use in cell-free biotechnology contains biomolecules such as proteins and RNAs, which break down at warmer temperatures, greatly limiting the shelf life of the cellular machinery. Transporting it from one laboratory to another or taking it out of the lab for field applications requires refrigeration to maintain its activity. Being tethered to the "cold chain" is a fundamental limit to meeting cell-free biotechnology's potential.

Inspired by storage optimizations of biological materials like cow's milk, researchers have previously extended the shelf life of extracts by freeze-drying them, resulting in a product similar to powdered milk that can be stored at room temperature for extended time periods. However, unlike powdered milk, freeze-dried cellular machinery cannot be stored for more than a few days without continual loss of activity. Researchers at California Polytechnic State University have discovered low-cost preservatives that allow freeze-dried cellular machinery to retain full activity when stored at room temperature for up to two weeks.

To accomplish this, a team of undergraduate student researchers pursued an interdisciplinary approach led by professors Javin Oza, Katharine Watts and Pratish Patel. As published in the journal ACS Synthetic Biology, researchers selected 10 preservatives with four distinct mechanisms of action and systematically identified the best performers, which were then tested in combinations of two or three. This approach allowed the researchers to identify combinations of preservatives that could maintain the full productivity of the cellular machinery for two weeks at room temperature. Researchers also discovered that certain combinations of preservatives could enhance the protein-producing capacity of the cellular machinery nearly two-fold.

Researchers demonstrated that the utility of any given preservative for stabilizing biological materials is highly context dependent. To help overcome this limitation, their data was used to develop a machine learning algorithm to allow other users to identify preservative formulations that are ideal for their specific application of the cell-free biotechnology. Access to the machine learning algorithm through a user-friendly interface will soon be available to the public on http://www.oza-lab.com.
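
The article does not describe the algorithm itself, but the general pattern of mapping a preservative formulation to retained activity can be sketched with an off-the-shelf regression model. Everything below, including the feature encoding, the synthetic data and the choice of a random forest, is an illustrative assumption rather than the Cal Poly group's code.

# Illustrative sketch: predict retained cell-free activity from a preservative
# formulation. Data are synthetic; model and encoding are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
# Each row encodes concentrations of 10 candidate preservatives (arbitrary units);
# the target is the fraction of protein-synthesis activity retained after storage.
X = rng.uniform(0, 1, size=(60, 10))
y = np.clip(0.4 + 0.5 * X[:, 2] + 0.3 * X[:, 7] + rng.normal(0, 0.05, 60), 0, 1)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
candidate = rng.uniform(0, 1, size=(1, 10))        # a new formulation to evaluate
print("predicted activity retained:", round(float(model.predict(candidate)[0]), 2))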

These advances represent a step toward unlocking the potential for cell-free biotechnology applications. More information about this work can be found in the publication entitled "Unlocking applications of cell-free biotechnology through enhanced shelf-life and productivity of E. coli extracts."

Credit: 
California Polytechnic State University

Breaking the temperature barrier in small-scale materials testing

image: Materials science and engineering professor Shen Dillon uses electron microscopy and targeted laser heating for ultra-high temperature testing of aeronautical materials.

Image: 
Photo by Steph Adams

CHAMPAIGN, Ill. -- Researchers have demonstrated a new method for testing microscopic aeronautical materials at ultra-high temperatures. By combining electron microscopy and laser heating, scientists can evaluate these materials much more quickly and inexpensively than with traditional testing.

The findings of the new study, conducted by Shen Dillon, a professor of materials science and engineering at the University of Illinois at Urbana-Champaign, and collaborators from Sandia National Laboratories, are published in the journal Nano Letters.

A decade ago, advancements in aeronautical materials involved testing large, expensive models and years of development. Scientists and engineers now use micro-scale experimentation to help create new materials and understand the chemical and physical properties that lead to material failure.

"Micro-scale mechanical testing provides opportunities to break the materials down into their components and see defects at the atomic level," Dillon said.

Until now, researchers have been unable to conduct successful micro-scale materials tests at the extreme temperatures experienced by critical components during flight.

"Unfortunately, it's really difficult to perform experiments with new materials or combinations of existing materials at ultra-high temperatures above 1,000 C because you run into the problem of destroying the testing mechanisms themselves," Dillon said.

This temperature barrier has slowed the development of new materials for commercial applications such as rockets and vehicles, which require testing at temperatures well above the limit of existing micro-scale methods, "a few hundred degrees Celsius," he said. "The method we demonstrate in the paper will significantly reduce the time and expense involved in making these tests possible."

Their ultra-high temperature test combined two commonly used tools in a unique way. Using a transmission electron microscope and targeted laser heating, they were able to see and control where and how the material deformed at the highest temperature possible before the sample evaporated.

"We were able to bring the laser together with the mechanical tester so precisely with the TEM that we could heat the sample without overheating the mechanical tester," Dillon said. "Our test allows you to grow a thin film of the material without any special processing and then put it in the microscope to test a number of different mechanical properties."

As proof of concept, the study tested zirconium dioxide - used in fuel cells and thermal barrier coatings - at temperatures up to 2,050 C, "a temperature well above anything that you could do previously," Dillon said.

Dillon says the paper will result in "more people using this technique for high-temperature tests in the future because they are much easier to do and the engineering interest is definitely there."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Researchers improve safety of lead-based perovskite solar cells

Researchers at the National Renewable Energy Laboratory (NREL) and Northern Illinois University (NIU) have developed a technique to sequester the lead used to make perovskite solar cells, a highly efficient emerging photovoltaic technology.

The light-absorbing layer in a perovskite solar cell contains a minute amount of lead. The presence of this toxic material in the developing technology could turn some consumers away when perovskite solar cells become commercially available, said Kai Zhu, a senior scientist in the Chemistry and Nanoscience Center at NREL.

Zhu and other researchers at NREL and NIU outlined their solution in a paper newly published in Nature, titled "On-Device Lead Sequestration for Perovskite Solar Cells."

"This is a big step in the correct direction," Zhu said. His co-authors are Fei Zhang and Joseph Berry of NREL, Haiying He of Valparaiso University, and Xun Li and Tao Xu from NIU. Xu served as the lead researcher from NIU.

"Lead toxicity has been one of the most vexing, last-mile challenges facing perovskite solar cells," Xu said. "Our on-device lead-sequestration method renders a 'safety belt' for this fascinating photovoltaic technology."

A lead-based perovskite solar cell's highest efficiency--its ability to turn sunlight into electricity--runs close to 25%. Without the lead, that efficiency is cut in half.

Silicon solar panels, the industry's dominant technology, contain lead solder, but that lead is not water soluble. The lead used in perovskites can be dissolved in water. While existing analyses show this is not a major concern, the researchers developed a method to ensure the lead is sequestered should a cell become damaged. They coated the front and back of a perovskite solar cell with two different lead-absorbing films. Then, they damaged the two sides of the cell--slashing one with a knife and smashing the other with a hammer.

The researchers then immersed the damaged cells in several types of water, including pure water, acidic water, and even flowing water to simulate heavy rain. They found that the lead-absorbing films prevented more than 96% of the lead from leaking out of the damaged cells into the water.

The addition of the lead-absorbing layers did not affect the performance of the solar cell, the researchers found.

Credit: 
DOE/National Renewable Energy Laboratory

Shrinking sea ice is creating an ecological trap for polar bears

San Diego Zoo Global researchers studying the effects of climate change on polar bears are using innovative technologies to understand why polar bears in the Southern Beaufort Sea show divergent movement patterns in summer. In recent decades, about a quarter of this population has chosen to come ashore instead of staying on the shrinking summer sea ice platform. Historically, the polar bears in this region remained on the ice year-round. Each bear's decision to stay on the ice or move to land appears to be linked to the energetic cost or benefit of either option, including the potential need to swim to reach land.

"We found that bears who moved to land expended more energy on average during the summer than bears that remained on the receding sea ice," said Anthony Pagano, Ph.D., a postdoctoral research fellow co-mentored between San Diego Zoo Global, the U.S. Geological Survey and Polar Bears International. "And in the late summer, as the ice became even more restricted, a greater percentage of energy was expended by bears swimming to land. This means the immediate cost of moving to land exceeded the cost of remaining on the receding summer pack ice--even though bears are having to move greater distances to follow the retreating sea ice than they would have historically."

However, prior research has shown that bears on land in this region have access to whale carcasses in the summer, while bears on the sea ice appear to be fasting. Researchers are concerned that individual bears' decisions to stay on the ice are creating an ecological trap that may be contributing to the declines already documented in this population.

The Southern Beaufort Sea subpopulation of polar bears has experienced increased sea ice retreat in recent decades. A basic understanding of polar bear energetics that can be applied to this research has come from studies that include polar bears at the San Diego Zoo and at the Oregon Zoo.

"The polar bear conservation program at the San Diego Zoo has supported research such as this by engaging in studies to measure the energetic costs of polar bear metabolism," said Megan Owen, Ph.D., director of Population Sustainability, San Diego Zoo Global. "These studies have enhanced the capacity of field researchers to interpret data collected on free-ranging bears, providing a better understanding of what it costs a polar bear to move about their rapidly changing habitat."

"The research underscores the importance of taking action to reduce the greenhouse gas emissions that are causing sea ice to melt," said Steven Amstrup, Ph.D., chief scientist at Polar Bears International. "It's yet another piece in the climate puzzle, showing the impacts of global climate warming on polar bears and how the bears are responding to sea ice retreat."

Credit: 
San Diego Zoo Wildlife Alliance

Regioselective magnetization enabled chiral semiconducting heteronanorods

image: a, Schematic illustration of magnetically induced chiroptical activity. b, Model of a magnetite nanodomain grown at one apex of a ZnxCd1-xS semiconductor nanorod.

Image: 
ZHUANG et al.

A team led by Prof. Shu-Hong Yu at the University of Science and Technology of China (USTC), collaborating with Prof. Zhiyong Tang (National Center for Nanoscience and Technology, China) and Prof. Edward H. Sargent (University of Toronto), has shed new light on chiral inorganic nanomaterials. The researchers demonstrated a regioselective magnetization strategy that yields a library of semiconducting heteronanorods with chiroptical activity.

The research article entitled "Regioselective magnetization in semiconducting nanorods" was published in Nature Nanotechnology on Jan 20th.

Chirality - the property of an object that cannot be superimposed on its mirror image - is of widespread interest in physics, chemistry, and biology. The chiroptical activity of a material can be tuned through its electric and magnetic transition dipoles. To date, chiral nanomaterials have mainly been constructed chemically by introducing chiral molecules or geometrically helical structures to provide this modulation; but these approaches suffer from environmental instability - the chirality disappears under illumination, heating, or harsh chemical conditions - and from poor conductivity, since charge transfer toward surface reactants and electrodes is impeded. These limitations hamper further practical applications of chiral materials.

Designing magneto-optical nanomaterials offers an opportunity to modulate the interactions between electric and magnetic dipoles via a local magnetic field, pointing to another promising route to chirality. To materialize such chiroptically active media, magnetic units must be grown at targeted locations on the parent nanomaterials. One-dimensional chalcogenide semiconductor nanorods are compelling candidates for the parent material because of their high geometric anisotropy, large electric dipole moment along the rod axis, readily tuned composition and size, and promising applications in catalysis, photonics, and electronics. The technical challenge, however, lies in achieving epitaxial growth between host and guest materials with large lattice and chemical mismatches - let alone doing so regioselectively.

Taking up the challenge, the researchers reported a double-buffer-layer engineering strategy to achieve selective growth of magnetic material at specific locations on a wide variety of semiconducting nanorods (i.e. ZnxCd1-xS, where 0 ≤ x ≤ 1). The authors sequentially integrated Ag2S and Au intermediate layers at one apex of each nanorod to catalyze the site-specific growth of Fe3O4 nanodomains. Owing to the location-specific magnetic field, the resulting magnetized heteronanorods exhibit a deflected electric dipole moment. In this way, the non-zero interaction between the electric and magnetic transition dipoles induces chiroptical activity in the absence of chiral ligands, helical structures, or chiral lattices - an effect that appears only with the magnetic modulation. The regioselective magnetization strategy opens a new avenue to designing optically active nanomaterials for chirality and spintronics applications.
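For background - this expression is standard chiroptical theory, not a result taken from the Nature Nanotechnology paper - the circular dichroism of a transition between states g and e is commonly quantified by the Rosenfeld rotatory strength,

R_{ge} = \operatorname{Im}\left( \boldsymbol{\mu}_{ge} \cdot \mathbf{m}_{eg} \right),

where \boldsymbol{\mu}_{ge} is the electric transition dipole moment and \mathbf{m}_{eg} the magnetic one. R_{ge} vanishes when the two dipoles are orthogonal or out of phase, which is why deflecting the nanorod's electric dipole with a localized magnetic field can switch on chiroptical activity without any chiral ligand or helical geometry.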

Credit: 
University of Science and Technology of China