Tech

Illinois, Nebraska scientists propose improvements to precision crop irrigation

URBANA, Ill. - With threats of water scarcity complicating the need to feed a growing global population, it is more important than ever to get crop irrigation right. Overwatering can deplete local water supplies and lead to polluted runoff, while underwatering can lead to sub-optimal crop performance. Yet few farmers use science-based tools to help them decide when and how much to water their crops.

A new University of Illinois-led study identifies obstacles and solutions to improve the performance and adoption of irrigation decision support tools at the field scale.

"We wanted to offer our perspective on how to achieve field-scale precision irrigation with the most recent and advanced technologies on data collection, plant water stress, modeling, and decision-making," says Jingwen Zhang, postdoctoral researcher in the Department of Natural Resources and Environmental Sciences (NRES) at Illinois and lead author on the article in Environmental Research Letters.

Zhang says many farmers rely on traditional rules of thumb, including visual observation, crop calendars, and what the neighbors are doing, to decide when and how much to water. Better data and more advanced technologies exist to help make those decisions, but they aren't being leveraged currently to their full potential.

For example, some fields are equipped with soil moisture sensors or cameras that detect changes in crop appearance, but there aren't enough of them to provide accurate information across entire fields. Satellites can monitor vegetation from space, but the spatial and temporal resolution of satellite images is often too coarse to support decisions at the field scale.

Kaiyu Guan, assistant professor in NRES, Blue Waters professor with the National Center for Supercomputing Applications, and project leader on the study, pioneered a way to fuse high-resolution and high-frequency satellite data into one integrated high spatial-temporal resolution product to help track soil and plant conditions.

"Based on remote sensing fusion technology and advanced modeling, we can help farmers get a fully scalable solution remotely," he says. "That's powerful. It can potentially be a revolutionary technology for farmers, not only in the U.S., but also smallholder farmers in developing countries."

With modern satellite technology and Guan's fusion model, data acquisition won't be a limiting factor in future precision irrigation products. But it's still important to define plant water stress appropriately.

Historically, irrigation decisions were based solely on measures of soil moisture. Guan's group recently called for the agricultural industry to redefine drought, not based on soil moisture alone, but on its interaction with atmospheric dryness.

"If we consider the soil-plant-atmosphere-continuum as a system, which reflects both soil water supply and atmospheric water demand, we can use those plant-centric metrics to define plant water stress to trigger irrigation," Zhang says. "Again, if we use our data fusion methods and process-based modelling, we can achieve precision irrigation with very high accuracy and also high resolution."
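The trigger logic Zhang describes can be sketched in a few lines. Everything below is a hypothetical illustration: the `needs_irrigation` function, the Tetens-based vapor pressure deficit estimate, and both threshold values are placeholders for illustration, not the study's actual metrics or models.

```python
import math

def vpd_kpa(temp_c, rel_humidity):
    """Vapor pressure deficit (kPa), a measure of atmospheric water demand."""
    # Tetens formula for saturation vapor pressure over water.
    e_sat = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    return e_sat * (1 - rel_humidity)

def needs_irrigation(soil_moisture, temp_c, rel_humidity,
                     supply_thresh=0.25, demand_thresh=1.5):
    """Trigger irrigation when soil water supply is low AND atmospheric
    demand is high. Thresholds are placeholders, not values from the study."""
    return soil_moisture < supply_thresh and vpd_kpa(temp_c, rel_humidity) > demand_thresh

print(needs_irrigation(0.20, 32.0, 0.30))  # dry soil, hot dry air
print(needs_irrigation(0.35, 22.0, 0.70))  # moist soil, mild air
```

A production tool would feed such a trigger with the fused satellite estimates and a process-based crop model rather than point readings, but the core idea, weighing soil supply against atmospheric demand before irrigating, is the same.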

The researchers also looked at challenges regarding farmer adoption of existing decision support tools. Because current products are based on less-than-ideal data sources, Guan says producers are reluctant to switch from traditional rule-of-thumb methods to tools that may not be much more reliable. Non-intuitive user interfaces, data privacy, and inflexible timing compound the problem.

Trenton Franz, associate professor at the University of Nebraska-Lincoln (UNL) and a coauthor, says farmers will be more likely to adopt precision irrigation decision tools if they are accurate down to the field scale, flexible, and easy to use. His and Guan's teams are working on technologies to fill this need and are actively testing the technology in irrigated fields in Nebraska. This includes participating with Daran Rudnick, assistant professor at UNL and co-author of the study, in the UNL Testing Ag Performance (TAPS) program, which focuses on technology adoption and education for producers across the region.

"We're pretty close. We have real-time evapotranspiration data, and we're adding the soil moisture component and the irrigation component. Probably in less than a year this will be launched as a prototype and can be tested among the farmer community," Guan says.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Fasting lowers blood pressure by reshaping the gut microbiota

Nearly half of adults in the United States have hypertension, a condition that raises the risk for heart disease and stroke, which are leading causes of death in the U.S.

At Baylor College of Medicine, Dr. David J. Durgan and his colleagues are dedicated to better understanding hypertension, in particular the emerging evidence suggesting that disruption of the gut microbiota, known as gut dysbiosis, can have adverse effects on blood pressure.

"Previous studies from our lab have shown that the composition of the gut microbiota in animal models of hypertension, such as the SHRSP (spontaneously hypertensive stroke-prone rat) model, is different from that in animals with normal blood pressure," said Durgan, assistant professor of anesthesiology at Baylor.

The researchers also have shown that transplanting dysbiotic gut microbiota from a hypertensive animal into a normotensive (having a healthy blood pressure) one results in the recipient developing high blood pressure.

"This result told us that gut dysbiosis is not just a consequence of hypertension, but is actually involved in causing it," Durgan said. "This ground work led to the current study in which we proposed to answer two questions. First, can we manipulate the dysbiotic microbiota to either prevent or relieve hypertension? Second, how are the gut microbes influencing the animal's blood pressure?"

Can manipulating the gut microbiota regulate blood pressure?

To answer the first question, Durgan and his colleagues drew on previous research showing that fasting was both one of the major drivers of the composition of the gut microbiota and a promoter of beneficial cardiovascular effects. These studies, however, had not provided evidence connecting the microbiota and blood pressure.

Working with the SHRSP model of spontaneous hypertension and normal rats, the researchers set up two groups. One group had SHRSP and normal rats that were fed every other day, while the other group, called control, had SHRSP and normal rats with unrestricted food availability.

Nine weeks after the experiment began, the researchers observed that, as expected, the rats in the SHRSP control group had higher blood pressure than the normal control rats. Interestingly, in the group that fasted every other day, the SHRSP rats had significantly reduced blood pressure compared with the SHRSP rats that had not fasted.

"Next, we investigated whether the microbiota was involved in the reduction of blood pressure we observed in the SHRSP rats that had fasted," Durgan said.

The researchers transplanted the microbiota of the rats that had either fasted or been fed without restriction into germ-free rats, which have no microbiota of their own.

Durgan and his colleagues were excited to see that the germ-free rats that received the microbiota of normally fed SHRSP rats had higher blood pressure than the germ-free rats receiving microbiota from normal control rats, just like their corresponding microbiota donors.

"It was particularly interesting to see that the germ-free rats that received microbiota from the fasting SHRSP rats had significantly lower blood pressure than the rats that had received microbiota from SHRSP control rats," Durgan said. "These results demonstrated that the alterations to the microbiota induced by fasting were sufficient to mediate the blood pressure-lowering effect of intermittent fasting."

How the microbiota regulates blood pressure

The team proceeded to investigate the second question of their project. How does the gut microbiota regulate blood pressure?

"We applied whole genome shotgun sequence analysis of the microbiota as well as untargeted metabolomics analysis of plasma and gastrointestinal luminal content. Among the changes we observed, alterations in products of bile acid metabolism stood out as potential mediators of blood pressure regulation," Durgan said.

The team discovered that the SHRSP hypertensive animals that were fed normally had lower bile acids in circulation than normotensive animals. On the other hand, SHRSP animals that followed an intermittent feeding schedule had more bile acids in the circulation.

"Supporting this finding, we found that supplementing animals with cholic acid, a primary bile acid, also significantly reduced blood pressure in the SHRSP model of hypertension," Durgan said.

Taken together, the study shows for the first time that intermittent fasting can be beneficial in terms of reducing hypertension by reshaping the composition of gut microbiota in an animal model. The work also provides evidence that gut dysbiosis contributes to hypertension by altering bile acid signaling.

"This study is important to understand that fasting can have its effects on the host through microbiota manipulation," Durgan said. "This is an attractive idea because it can potentially have clinical applications. Many of the bacteria in the gut microbiota are involved in the production of compounds that have been shown to have beneficial effects as they make it into the circulation and contribute to the regulation of the host's physiology. Fasting schedules could one day help regulate the activity of gut microbial populations to naturally provide health benefits."

Credit: 
Baylor College of Medicine

Exploring extremes -- When is it too hot to handle?

image: SEM of anode

Image: 
Heriot-Watt University

Extreme environments place significant operational challenges on the engineering systems we depend upon to explore them safely and, at times, to operate within them.

Within high-value and safety-critical applications, such as space exploration or sub-surface drilling, the extreme and at times dynamic operating conditions can make it challenging to understand the life expectancy of critical components and sub-systems. The situation is highly complex, and at times impossible, to understand accurately and therefore to predict.

To have safe, resilient and economically viable operations within these challenging environments, it is vital to understand the effect of high temperatures on critical devices such as electrochemical capacitors (ECs). In comparison to a battery, an EC, also known as a supercapacitor, ultracapacitor, or electrochemical double-layer capacitor, can withstand high discharge-charge currents and is thus suitable for meeting peak power demands. An EC's long cycle life when operated in a high-temperature environment makes it ideal for challenging and extreme environments.

Within extreme environment systems, it is common to operate components beyond the limits of the manufacturer's recommendations. This makes the ability to understand and forecast the End of Life of such components a significant challenge. To address this challenge, in our research we focus on ECs operated at temperatures of up to 200°C, specifically the operation of ECs onboard downhole drilling equipment for geothermal or oil and gas exploration. Downhole tools are complex electromechanical systems that perform critical functions in drilling operations and are designed to withstand extreme temperatures, shocks, and vibrations.

In our research we deploy a machine learning algorithm to predict degradation trends for electrochemical double-layer capacitors beyond the knee-point onset when cycled at high temperature in an oil and gas drilling environment. Because operation at high temperature accelerates EC degradation, we investigate the worst-case scenario for this application.

The research, recently published in IEEE Access, was led by Professor Flynn's Smart Systems Group at Heriot-Watt University, in partnership with Baker Hughes, the University of Maryland and the Lloyd's Register Foundation.

Professor David Flynn stated: "This research demonstrates a significant advancement in the ability to understand and predict the life expectancy of critical components. Our experimental results show that end of life, defined as a 30% decrease in capacitance, occurs at 1,000 cycles when the environmental temperature exceeds the maximum operating temperature by 30%. Using lifecycle test data, something that is not readily available and very challenging to obtain, we create a machine learning model that has an average root mean squared percentage error of less than 2% and a mean calibration score of 93% when referenced to a 95% confidence interval. Our model can be utilized to determine the EC degradation rate at a range of operating temperature values."
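The two model-quality figures Flynn quotes can be made concrete. The sketch below is illustrative only: the helper functions and the sample capacitance data are hypothetical, not the study's model or lifecycle dataset; it simply shows how a root mean squared percentage error and a calibration score against a prediction interval are typically computed.

```python
import numpy as np

def rmspe(y_true, y_pred):
    """Root mean squared percentage error, in percent."""
    return np.sqrt(np.mean(((y_true - y_pred) / y_true) ** 2)) * 100

def calibration_score(y_true, lower, upper):
    """Percentage of true values falling inside the predicted interval."""
    return np.mean((y_true >= lower) & (y_true <= upper)) * 100

# Hypothetical normalized capacitance over successive cycle blocks,
# with a made-up predicted degradation curve and interval half-width.
y_true = np.array([1.00, 0.95, 0.88, 0.80, 0.72])
y_pred = np.array([1.00, 0.94, 0.89, 0.79, 0.73])
half_width = 0.03

print(rmspe(y_true, y_pred))          # ~1%, i.e. under the 2% reported
print(calibration_score(y_true, y_pred - half_width, y_pred + half_width))
```

A well-calibrated model referenced to a 95% interval should score near 95; the paper's reported mean of 93% indicates intervals that capture the truth only slightly less often than nominal.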

Credit: 
Heriot-Watt University

Wearable glucose monitors shed light on progression of Type 2 diabetes in Hispanic adults

image: David Kerr is the director of research and innovation at Sansum Diabetes Research Institute.

Image: 
Image courtesy of Sansum Diabetes Research Institute

HOUSTON - (April 29, 2021) - In one of the first studies of its kind, medical and engineering researchers have shown wearable devices that continuously monitor blood sugar provide new insights into the progression of Type 2 diabetes among at-risk Hispanic/Latino adults.

The findings by researchers from Sansum Diabetes Research Institute (SDRI) and Rice University are available online this week in EClinicalMedicine, an open-access clinical journal published by The Lancet.

"The fresh look at the glucose data sheds new light on disease progression, which could have a direct impact on better management," said Rice study co-author Ashutosh Sabharwal, professor and department chair in electrical and computer engineering and founder of Rice's Scalable Health Labs. "An important aspect of our analysis is that the results are clinically interpretable and point to new directions for improved Type 2 diabetes care."

The study builds on SDRI's groundbreaking research to address Type 2 diabetes in underserved Hispanic/Latino communities. SDRI's Farming for Life initiative assesses the physical and mental health benefits of providing medical prescriptions for locally sourced fresh vegetables to people with or at risk of Type 2 diabetes, with a focus on the Hispanic/Latino community. SDRI recently added a digital health technology called continuous glucose monitoring to this research.

Continuous glucose monitors track blood sugar levels around the clock and allow trends in blood glucose to be displayed and analyzed over time. The devices typically consist of two parts, a small electrode sensor affixed to the skin with an adhesive patch and a receiver that gathers data from the sensor.

"We found that the use of this technology is both feasible and acceptable for this population, predominantly Mexican American adults," said study co-author David Kerr, SDRI's director of research and innovation. "The results also provided new insights into measurable differences in the glucose profiles for individuals at risk of as well as with noninsulin-treated Type 2 diabetes. These findings could facilitate novel therapeutic approaches to reduce the risk of progression of Type 2 diabetes for this underserved population."

Sabharwal, who is also a co-investigator of the Precise Advanced Technologies and Health Systems for Underserved Populations (PATHS-UP) engineering research center, said, "The collaboration with SDRI aligns with our mission to use technology as an important building block to reduce health care disparities."

"We are excited about the application of digital health technologies for underserved populations as a way to eliminate health disparities and improve health equity," Kerr said. "This opens up potential for a larger number of collaborations to support SDRI's evolving focus on precision nutrition and also the expanded use of digital health technologies for both the prevention and management of all forms of diabetes."

Credit: 
Rice University

Lightning and subvisible discharges produce molecules that clean the atmosphere

image: Nitrogen, oxygen and water vapor molecules are broken apart by lightning and associated weaker electrical discharges, generating the reactive gases NO, O3, HO2, and the atmosphere's cleanser, OH.

Image: 
Jena Jenkins, Penn State

Lightning bolts break apart nitrogen and oxygen molecules in the atmosphere and create reactive chemicals that affect greenhouse gases. Now, a team of atmospheric chemists and lightning scientists have found that lightning bolts and, surprisingly, subvisible discharges that cannot be seen by cameras or the naked eye produce extreme amounts of the hydroxyl radical -- OH -- and hydroperoxyl radical -- HO2.

The hydroxyl radical is important in the atmosphere because it initiates chemical reactions and breaks down molecules like the greenhouse gas methane. OH is the main driver of many compositional changes in the atmosphere.

"Initially, we looked at these huge OH and HO2 signals found in the clouds and asked, what is wrong with our instrument?" said William H. Brune, distinguished professor of meteorology at Penn State. "We assumed there was noise in the instrument, so we removed the huge signals from the dataset and shelved them for later study."

The data was from an instrument on a plane flown above Colorado and Oklahoma in 2012 looking at the chemical changes that thunderstorms and lightning make to the atmosphere.

But a few years ago, Brune took the data off the shelf, saw that the signals were really hydroxyl and hydroperoxyl, and then worked with a graduate student and research associate to see if these signals could be produced by sparks and subvisible discharges in the laboratory. Then they reanalyzed the thunderstorm and lightning dataset.

"With the help of a great undergraduate intern," said Brune, "we were able to link the huge signals seen by our instrument flying through the thunderstorm clouds to the lightning measurements made from the ground."

The researchers report their results online today (April 29) in Science First Release and the Journal of Geophysical Research -- Atmospheres.

Brune notes that airplanes avoid flying through the rapidly rising cores of thunderstorms because it is dangerous, but can sample the anvil, the top portion of the cloud that spreads outward in the direction of the wind. Visible lightning happens in the part of the anvil near the thunderstorm core.

"Through history, people were only interested in lightning bolts because of what they could do on the ground," said Brune. "Now there is increasing interest in the weaker electrical discharges in thunderstorms that lead to lightning bolts."

Most lightning never strikes the ground, and the lightning that stays in the clouds is particularly important for affecting ozone, an important greenhouse gas, in the upper atmosphere. It was known that lightning can split water to form hydroxyl and hydroperoxyl, but this process had never before been observed in thunderstorms.

What confused Brune's team initially was that their instrument recorded high levels of hydroxyl and hydroperoxyl in areas of the cloud where there was no lightning visible from the aircraft or the ground. Experiments in the lab showed that weak electrical current, much less energetic than that of visible lightning, could produce these same components.

While the researchers found hydroxyl and hydroperoxyl in areas with subvisible lightning, they found little evidence of ozone and no evidence of nitric oxide, which requires visible lightning to form. If subvisible lightning occurs routinely, then the hydroxyl and hydroperoxyl these electrical events create need to be included in atmospheric models. Currently, they are not.

According to the researchers, "Lightning-generated OH (hydroxyl) in all storms happening globally can be responsible for a highly uncertain but substantial 2% to 16% of global atmospheric OH oxidation."

"These results are highly uncertain, partly because we do not know how these measurements apply to the rest of the globe," said Brune. "We only flew over Colorado and Oklahoma. Most thunderstorms are in the tropics. The whole structure of high plains storms is different than those in the tropics. Clearly we need more aircraft measurements to reduce this uncertainty."

Credit: 
Penn State

Combining solar panels and lamb grazing increases land productivity, study finds

image: Sheep grazing underneath solar panels at Oregon State University.

Image: 
Sean Nealon, Oregon State University

CORVALLIS, Ore. - Land productivity could be greatly increased by combining sheep grazing and solar energy production on the same land, according to new research by Oregon State University scientists.

This is believed to be the first study to investigate livestock production under agrivoltaic systems, where solar energy production is combined with agricultural production, such as planting agricultural crops or grazing animals.

The researchers compared lamb growth and pasture production in pastures with solar panels and traditional open pastures. They found less overall but higher quality forage in the solar pastures and that lambs raised in each pasture type gained similar amounts of weight. The solar panels, of course, provide value in terms of energy production, which increases the overall productivity of the land.

Solar panels also benefit the welfare of the lambs by providing shade, which allows the animals to preserve energy. Also lamb grazing alleviates the need to manage plant growth under the solar panels through herbicides or regular mowing, which require extra labor and costs.

"The results from the study support the benefits of agrivoltaics as a sustainable agricultural system," said Alyssa Andrew, a master's student at Oregon State who is the lead author of the paper published in Frontiers in Sustainable Food Systems.

Solar photovoltaic installation in the U.S. has increased by an average of 48% per year over the past decade, and current capacity is expected to double again over the next five years, the researchers say.

Past research has found that grasslands and croplands in temperate regions are the best places to install solar panels for maximum energy production. However, photovoltaic energy production requires large areas of land, potentially putting it in competition with agricultural uses.

Agrivoltaics looks to defuse that competition by measuring the economic value of energy production and agricultural use of the same land. Past research has focused on crops and solar panels and found that some crops, particularly types that like shade, can be more productive in combination with solar panels.

Another recent Oregon State study found that shade provided by solar panels increased the abundance of flowers under the panels and delayed the timing of their bloom, both findings that could aid the agricultural community.

The just-published study with lambs and solar panels was carried out in 2019 and 2020 at Oregon State's campus in Corvallis. Findings included:

The lambs gained almost the same amount of weight in the two pasture types in both years.

The daily water consumption of the lambs in the two pasture types was similar during early spring 2019, but lambs in open pastures consumed more water than those grazing under solar panels in the late spring period. No difference in water intake was observed in spring 2020.

Over the two years, solar pastures produced 38% less forage than open pastures.

Overall, the return from grazing was $1,046 per hectare (one hectare equals 2.47 acres) per year in open pastures and $1,029 per hectare per year in pastures with solar panels.
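As a quick check on these figures (one hectare equals 2.47 acres), the per-acre equivalents and the relative gap between the two systems work out as follows; the arithmetic below uses only the numbers reported in the study.

```python
# Reported grazing returns (USD per hectare per year) from the study.
open_pasture = 1046
solar_pasture = 1029
ACRES_PER_HECTARE = 2.47

open_per_acre = open_pasture / ACRES_PER_HECTARE    # ~423.5 USD per acre
solar_per_acre = solar_pasture / ACRES_PER_HECTARE  # ~416.6 USD per acre
pct_diff = (open_pasture - solar_pasture) / open_pasture * 100

print(f"{open_per_acre:.2f} {solar_per_acre:.2f} {pct_diff:.1f}%")
```

The gap is about 1.6%, before crediting any value for the electricity the panels generate.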

"The overall return is about the same, and that doesn't take into account the energy the solar panels are producing," said Serkan Ates, an assistant professor in the Oregon State's Department of Animal and Rangeland Sciences and a co-author of the paper. "And if we designed the system to maximize production we would likely get even better numbers."

Andrew is now working on a follow-up to this study in which she is quantifying forage and lamb production from three different pasture types under solar panels.

Credit: 
Oregon State University

Hubble watches how a giant planet grows

image: This illustration of the newly forming exoplanet PDS 70b shows how material may be falling onto the giant world as it builds up mass. By employing Hubble's ultraviolet light (UV) sensitivity, researchers got a unique look at radiation from extremely hot gas falling onto the planet, allowing them to directly measure the planet's mass growth rate for the first time. The planet PDS 70b is encircled by its own gas-and-dust disk that's siphoning material from the vastly larger circumstellar disk in this solar system. The researchers hypothesize that magnetic field lines extend from its circumplanetary disk down to the exoplanet's atmosphere and are funneling material onto the planet's surface. The illustration shows one possible magnetospheric accretion configuration, but the magnetic field's detailed geometry requires future work to probe. The remote world has already bulked up to five times the mass of Jupiter over a period of about five million years, but is anticipated to be in the tail end of its formation process. PDS 70b orbits the orange dwarf star PDS 70 approximately 370 light-years from Earth in the constellation Centaurus.

Image: 
NASA, ESA, STScI, Joseph Olmsted (STScI)

NASA's Hubble Space Telescope is giving astronomers a rare look at a Jupiter-sized, still-forming planet that is feeding off material surrounding a young star.

"We just don't know very much about how giant planets grow," said Brendan Bowler of the University of Texas at Austin. "This planetary system gives us the first opportunity to witness material falling onto a planet. Our results open up a new area for this research."

Though over 4,000 exoplanets have been cataloged so far, only about 15 have been directly imaged to date by telescopes. And the planets are so distant and small that they appear as mere dots in even the best photos. The team's fresh technique for using Hubble to directly image this planet paves a new route for further exoplanet research, especially during a planet's formative years.

This huge exoplanet, designated PDS 70b, orbits the orange dwarf star PDS 70, which is already known to have two actively forming planets inside a huge disk of dust and gas encircling the star. The system is located 370 light-years from Earth in the constellation Centaurus.

"This system is so exciting because we can witness the formation of a planet," said Yifan Zhou, also of the University of Texas at Austin. "This is the youngest bona fide planet Hubble has ever directly imaged." At a youthful five million years, the planet is still gathering material and building up mass.

Hubble's ultraviolet light (UV) sensitivity offers a unique look at radiation from extremely hot gas falling onto the planet. "Hubble's observations allowed us to estimate how fast the planet is gaining mass," added Zhou.

The UV observations, which add to the body of research about this planet, allowed the team to directly measure the planet's mass growth rate for the first time. The remote world has already bulked up to five times the mass of Jupiter over a period of about five million years. The present measured accretion rate has dwindled to the point where, if the rate remained steady for another million years, the planet would only increase by approximately an additional 1/100th of a Jupiter-mass.
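A back-of-envelope comparison makes the "dwindled" claim concrete. Using only the figures in the article (five Jupiter masses accumulated over roughly five million years, versus about 1/100th of a Jupiter mass over the next million years), the current accretion rate comes out to about 1% of the long-term average:

```python
# Rough accretion-rate comparison for PDS 70b, using figures from the article.
JUPITER_MASS = 1.898e27  # kg

mass_now = 5 * JUPITER_MASS          # accumulated over ~5 million years
avg_rate = mass_now / 5e6            # long-term average, kg per year

current_gain = 0.01 * JUPITER_MASS   # ~1/100th Jupiter mass over the next Myr
current_rate = current_gain / 1e6    # present rate, kg per year

print(current_rate / avg_rate)       # fraction of the historical average
```

That hundredfold drop relative to the historical average is what leads the team to place the planet at the tail end of its formation process.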

Zhou and Bowler emphasize that these observations are a single snapshot in time - more data are required to determine if the rate at which the planet is adding mass is increasing or decreasing. "Our measurements suggest that the planet is in the tail end of its formation process."

The youthful PDS 70 system is filled with a primordial gas-and-dust disk that provides fuel to feed the growth of planets throughout the entire system. The planet PDS 70b is encircled by its own gas-and-dust disk that's siphoning material from the vastly larger circumstellar disk. The researchers hypothesize that magnetic field lines extend from its circumplanetary disk down to the exoplanet's atmosphere and are funneling material onto the planet's surface.

"If this material follows columns from the disk onto the planet, it would cause local hot spots," Zhou explained. "These hot spots could be at least 10 times hotter than the temperature of the planet." These hot patches were found to glow fiercely in UV light.

These observations offer insights into how gas giant planets formed around our Sun 4.6 billion years ago. Jupiter may have bulked up on a surrounding disk of infalling material. Its major moons would have also formed from leftovers in that disk.

A challenge to the team was overcoming the glare of the parent star. PDS 70b orbits at approximately the same distance as Uranus does from the Sun, but its star is more than 3,000 times brighter than the planet at UV wavelengths. As Zhou processed the images, he very carefully removed the star's glare to leave behind only light emitted by the planet. In doing so, he improved the limit of how close a planet can be to its star in Hubble observations by a factor of five.

"Thirty-one years after launch, we're still finding new ways to use Hubble," Bowler added. "Yifan's observing strategy and post-processing technique will open new windows into studying similar systems, or even the same system, repeatedly with Hubble. With future observations, we could potentially discover when the majority of the gas and dust falls onto their planets and if it does so at a constant rate."

The researchers' results were published in April 2021 in The Astronomical Journal.

Credit: 
NASA/Goddard Space Flight Center

Less innocent than it looks

image: A hydrogen vacancy (the black spot left of center), created by removing hydrogen from a methylammonium molecule, traps carriers in the prototypical hybrid perovskite, methylammonium lead iodide (CH3NH3PbI3).

Image: 
Xie Zhang

Researchers in the materials department in UC Santa Barbara's College of Engineering have uncovered a major cause of limitations to efficiency in a new generation of solar cells.

Various possible defects in the lattice of what are known as hybrid perovskites had previously been considered as the potential cause of such limitations, but it was assumed that the organic molecules (the components responsible for the "hybrid" moniker) would remain intact. Cutting-edge computations have now revealed that missing hydrogen atoms in these molecules can cause massive efficiency losses. The findings are published in a paper titled "Minimizing hydrogen vacancies to enable highly efficient hybrid perovskites," in the April 29 issue of the journal Nature Materials.

The remarkable photovoltaic performance of hybrid perovskites has created a great deal of excitement, given their potential to advance solar-cell technology. "Hybrid" refers to the embedding of organic molecules in an inorganic perovskite lattice, which has a crystal structure similar to that of the perovskite mineral (calcium titanium oxide). The materials exhibit power-conversion efficiencies rivaling that of silicon, but are much cheaper to produce. Defects in the perovskite crystalline lattice, however, are known to create unwanted energy dissipation in the form of heat, which limits efficiency.

A number of research teams have been studying such defects, among them the group of UCSB materials professor Chris Van de Walle, which recently achieved a breakthrough by discovering a detrimental defect in a place no one had looked before: on the organic molecule.

"Methylammonium lead iodide is the prototypical hybrid perovskite," explained Xie Zhang, lead researcher on the project. "We found that it is surprisingly easy to break one of the bonds and remove a hydrogen atom on the methylammonium molecule. The resulting 'hydrogen vacancy' then acts as a sink for the electric charges that move through the crystal after being generated by light falling on the solar cell. When these charges get caught at the vacancy, they can no longer do useful work, such as charging a battery or powering a motor, hence the loss in efficiency."

The research was enabled by advanced computational techniques developed by the Van de Walle group. Such state-of-the-art calculations provide detailed information about the quantum-mechanical behavior of electrons in the material. Mark Turiansky, a senior graduate student in Van de Walle's group who was involved in the research, helped build sophisticated approaches for turning this information into quantitative values for rates of charge carrier trapping.
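The quantitative trapping rates described above feed into the standard nonradiative-recombination picture for defects. As a rough illustration only (this is not the group's actual quantum-mechanical method, and every number below is an assumption), a Shockley-Read-Hall-style estimate relates a capture coefficient and a defect concentration to the carrier lifetime:

```python
# Rough Shockley-Read-Hall estimate of the nonradiative carrier
# lifetime limited by a single trap species. All numbers are
# illustrative assumptions, not values from the study.

def srh_lifetime(capture_coefficient_cm3_s, trap_density_cm3):
    """Carrier lifetime tau = 1 / (C * N_t) for one trap species."""
    return 1.0 / (capture_coefficient_cm3_s * trap_density_cm3)

C = 1e-8     # assumed capture coefficient (cm^3/s)
N_t = 1e15   # assumed hydrogen-vacancy density (cm^-3)

tau = srh_lifetime(C, N_t)  # seconds
print(f"Estimated carrier lifetime: {tau:.1e} s")
```

The shorter this lifetime, the more charges are lost to heat before they can do useful work, which is why suppressing the vacancy density matters for efficiency.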

"Our group has created powerful methods for determining which processes cause efficiency loss," Turiansky said, "and it is gratifying to see the approach provide such valuable insights for an important class of materials."

"The computations act as a theoretical microscope that allows us to peer into the material with much higher resolution than can be achieved experimentally," Van de Walle explained. "They also form a basis for rational materials design. Through trial and error, it has been found that perovskites in which the methylammonium molecule is replaced by formamidinium exhibit better performance. We are now able to attribute this improvement to the fact that hydrogen defects form less readily in the formamidinium compound.

"This insight provides a clear rationale for the empirically established wisdom that formamidinium is essential for realizing high-efficiency solar cells," he added. "Based on these fundamental insights, the scientists who fabricate the materials can develop strategies to suppress the harmful defects, enabling further efficiency gains in solar cells."

Credit: 
University of California - Santa Barbara

An OU-led study sheds new insight on forest loss and degradation in Brazilian Amazon

image: Interannual changes of forest area, aboveground biomass (AGB), active fire area, burned area, and atmospheric CO2 concentration in the Brazilian Amazon

Image: 
Xiangming Xiao

An international team led by Xiangming Xiao, George Lynn Cross Research Professor in the Department of Microbiology and Plant Biology, University of Oklahoma College of Arts and Sciences, published a paper in the April issue of the journal Nature Climate Change that has major implications for forest policies, conservation and management practices in the Brazilian Amazon. Xiao also is director of OU's Center for Earth Observation and Modeling. Yuanwei Qin, a research scientist at the Center for Earth Observation and Modeling, is the lead author of the study.

For the study described in the paper, "Carbon loss from forest degradation exceeds that from deforestation in the Brazilian Amazon," Xiao, Qin and a team of research scientists and faculty from institutes and universities in the United States, France, the United Kingdom, Denmark and China used satellite data to track spatial-temporal changes of forest area and aboveground biomass in the Brazilian Amazon from 2010 to 2019. They discovered that carbon loss from forest degradation was greater than that resulting from deforestation in the region, which indicates forest degradation should become a high priority in policies, conservation and management.
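The comparison at the heart of the finding is a carbon-accounting one: degradation removes less biomass per hectare than clear-cutting, but over a much larger area. A toy sketch of that bookkeeping (all areas, biomass densities, and the helper function below are invented placeholders; only the accounting pattern reflects the study's comparison) looks like this:

```python
# Toy comparison of above-ground carbon loss from deforestation vs.
# forest degradation. All inputs are made-up placeholders; only the
# accounting pattern mirrors the kind of comparison in the study.

CARBON_FRACTION = 0.47  # commonly assumed carbon fraction of dry biomass

def carbon_loss_Mg(area_km2, agb_loss_Mg_per_ha):
    """Carbon loss = area (in ha) x AGB loss per ha x carbon fraction."""
    return area_km2 * 100 * agb_loss_Mg_per_ha * CARBON_FRACTION

# Hypothetical scenario: degradation touches a far larger area but
# removes less biomass per hectare than outright deforestation.
deforestation = carbon_loss_Mg(area_km2=9_000, agb_loss_Mg_per_ha=250)
degradation = carbon_loss_Mg(area_km2=36_000, agb_loss_Mg_per_ha=80)

print(f"deforestation: {deforestation:.3g} Mg C")
print(f"degradation:   {degradation:.3g} Mg C")
print("degradation exceeds deforestation:", degradation > deforestation)
```

Even with the smaller per-hectare loss, the larger footprint can dominate the total, which is the pattern the satellite record revealed for 2010-2019.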

Tropical forests in the Amazon account for approximately 50% of the world's rainforests, Qin notes, and are important for global biodiversity, hydrology, climate and the carbon cycle. Accurate and timely data on vegetation aboveground biomass and forest area in the region at various spatial and temporal scales are also essential for data-based policies and decision making. This international team harnessed diverse data for monitoring, reporting and verification of tropical forests. The paper published in Nature Climate Change is a follow-up of a previous study published in Nature Sustainability in 2019, which reported improved estimates of forest areas in the Brazilian Amazon.

Credit: 
University of Oklahoma

How to level up soft robotics

The field of soft robotics has exploded in the past decade, as ever more researchers seek to make real the potential of these pliant, flexible automata in a variety of realms, including search and rescue, exploration and medicine.

For all the excitement surrounding these new machines, however, UC Santa Barbara mechanical engineering professor Elliot Hawkes wants to ensure that soft robotics research is more than just a flash in the pan. "Some new, rapidly growing fields never take root, while others become thriving disciplines," Hawkes said.

To help guarantee the longevity of soft robotics research, Hawkes, whose own robots have garnered interest for their bioinspired and novel locomotion and for the new possibilities they present, offers an approach that moves the field forward. His viewpoint, written with colleagues Carmel Majidi from Carnegie Mellon University and Michael T. Tolley of UC San Diego, is published in the journal Science Robotics.

"We were looking at publication data for soft robotics and noticed a phase of explosive growth over the last decade," Hawkes said. "We became curious about trends like this in new fields, and how new fields take root."

The first decade of widespread soft robotics research, according to the group, "was characterized by defining, inspiring and exploring," as roboticists took to heart what it meant to create a soft robot, from materials systems to novel ways of navigating through and interacting with the environment.

However, the researchers argue, "for soft robotics to become a thriving, impactful field in the next decade, every study must make a meaningful contribution." According to Hawkes, whether a rapidly growing field endures over the long term often comes down to whether its initial exploratory research matures.

With that in mind, the group presents a three-tiered categorization system to apply to future soft robotics work.

"The three-tier system categorizes studies within the field, not the field as a whole," Hawkes explained. "For example, there will be articles coming out this year that will be Level 0, Level 1 and Level 2. The goal is to push as many Level 0 studies as possible toward Level 1 and Level 2."

From Baseline to Broad Contribution

"Soft for soft's sake" could be used to characterize Level 0 in the categorization system, as researchers have, for the past decade, rapidly and broadly explored new materials and mechanisms that could fall under the notion of "soft robot." While these studies were necessary to define the field, according to the authors, maintaining research at this level puts soft robotics at the risk of stagnation.

With the benefits of a solid foundation, present and future roboticists are now encouraged to identify areas for performance improvement and solutions to gaps in the knowledge of soft robotics -- the hallmark of Level 1. These studies will push the field forward, the researchers said, as novel results could elevate technological performance of soft systems.

However, they say, "whenever possible, we should strive to push beyond work that only contributes to our field." Studies in the Level 2 category go beyond soft robotics to become applications in the broader field of engineering. Here, softness is more than an artificial constraint, according to the paper; rather, it "advances state-of-the-art technology and understanding across disciplines" and may even displace long-used conventional technologies.

One way to move beyond Level 0 lies in the training of the next generation of roboticists, the researchers said. Consolidating the best available knowledge contributed by previous work will prime those just entering the field to "ask the right questions" as they pursue their research.

"We hope that the categorization we offer will serve the field as a tool to help improve contribution, ideally increasing the impact of soft robotics in the coming decade," Hawkes said.

Credit: 
University of California - Santa Barbara

New test detects residual cancer DNA in the blood without relying on tumor data

BOSTON - After patients with cancer undergo surgery to remove a tumor and sometimes additional chemotherapy, tools are used to identify patients at highest risk of recurrence. Non-invasive tools to detect microscopic disease are of especially high value. In a new study published in Clinical Cancer Research, a team led by investigators at Massachusetts General Hospital (MGH) has evaluated the first "tumor-uninformed" test that detects cancer DNA circulating in the blood of patients following treatment.

The test, called Guardant Reveal, developed by precision oncology company Guardant Health, is "tumor-uninformed" because, unlike previous tests for circulating tumor DNA (ctDNA) in the blood, this test does not require knowing the particular mutations that were present in the patient's tumor.

"The use of ctDNA, which is a type of 'liquid biopsy', is a powerful prognostic tool to detect residual disease, and many prospective trials are under way in the United States, Europe, Asia and Australia to use ctDNA to guide treatment decision-making," says lead author Aparna R. Parikh, MD, an investigator in the Division of Hematology and Oncology at MGH and an assistant professor of Medicine at Harvard Medical School. "Most studies have used a tumor-informed ctDNA approach that requires testing of the tumor and knowledge of tumor-specific alterations, which can't be used when a patient has insufficient tumor tissue for analysis."

In this study, Parikh and her colleagues at MGH and Guardant Health evaluated the first tumor-uninformed ctDNA assay to detect residual cancer cells in patients who underwent surgery for colorectal cancer. Instead of relying on DNA sequencing of individual patients' tumors, the approach looked for known cancer-specific alterations.

When the researchers analyzed ctDNA results from 84 patients and examined how accurately the results correlated with cancer recurrence, they found that this "plasma only" approach was similar in sensitivity and specificity to tumor-informed approaches.
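Sensitivity and specificity here are the usual two-by-two quantities: how often the test flags patients who go on to recur, and how often it stays negative in patients who do not. A minimal sketch of that calculation (the counts below are invented for illustration and are not the study's data):

```python
# Sensitivity and specificity from a 2x2 tabulation of ctDNA result
# vs. observed recurrence. The counts are invented placeholders,
# not data from the 84-patient study.

def sensitivity(tp, fn):
    """Fraction of recurrences the test detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of recurrence-free patients the test cleared."""
    return tn / (tn + fp)

# Hypothetical counts among patients followed for recurrence:
tp, fn = 11, 9   # recurred: ctDNA-positive vs. ctDNA-negative
tn, fp = 60, 4   # recurrence-free: ctDNA-negative vs. ctDNA-positive

print(f"sensitivity: {sensitivity(tp, fn):.2f}")
print(f"specificity: {specificity(tn, fp):.2f}")
```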

"This is one of the first studies to report on a plasma-only approach. There are advantages and disadvantages to each of the approaches," says Parikh. She notes that ongoing prospective studies will provide additional information on the performance of this assay for detecting residual cancer cells and for guiding treatment decisions.

Credit: 
Massachusetts General Hospital

Open-source GPU technology for supercomputers

image: Vladimir Stegailov, HSE University professor

Image: 
Vladimir Stegailov

Researchers from the HSE International Laboratory for Supercomputer Atomistic Modelling and Multi-scale Analysis, JIHT RAS and MIPT have compared the performance of popular molecular modelling programs on GPU accelerators produced by AMD and Nvidia. In a paper published in the International Journal of High Performance Computing Applications, the scholars ported LAMMPS to the new open-source GPU technology, AMD HIP, for the first time.

The scholars thoroughly analysed the performance of three molecular modelling programs - LAMMPS, Gromacs and OpenMM - on Nvidia and AMD GPU accelerators with comparable peak parameters. For the tests, they used a model of ApoA1 (Apolipoprotein A1) -- an apolipoprotein in blood plasma and the main carrier protein of 'good cholesterol'. They found that the performance of research calculations is influenced not only by hardware parameters, but also by the software environment. It turned out that inefficiencies in AMD drivers under complicated scenarios of parallel kernel launch can lead to considerable delays. Open-source solutions still have their disadvantages.

In the recently published paper, the researchers were the first to port LAMMPS to a new open-source GPU technology, AMD HIP. This developing technology looks very promising since it helps effectively use one code both on Nvidia accelerators and on new GPUs by AMD. The developed LAMMPS modification has been released as open source and is available in the official repository: users from all over the world can use it to accelerate their calculations.

'We thoroughly analysed and compared the GPU accelerator memory sub-systems of Nvidia Volta and AMD Vega20 architectures. I found a difference in the logics of parallel launch of GPU kernels and demonstrated it by visualizing the program profiles. Both the memory bandwidth and the latencies of different levels of GPU memory hierarchy as well as the effective parallel execution of GPU kernels -- all these aspects have a major impact on the real performance of GPU programs,' said Vsevolod Nikolskiy, HSE University doctoral student and one of the paper's authors.

The paper's authors argue that participation in the technological race of the contemporary microelectronics giants demonstrates an obvious trend toward greater variety of GPU acceleration technologies.

'On the one hand, this fact is positive for end users, since it stimulates competition, growing effectiveness and the decreasing cost of supercomputers. On the other hand, it will be even more difficult to develop effective programs due to the need to consider the availability of several different types of GPU architectures and programming technologies,' commented Vladimir Stegailov, HSE University professor. 'Even supporting program portability for ordinary processors on different architectures (x86, Arm, POWER) is often complicated. Portability of programs between different GPU platforms is a much more complicated issue. The open-source paradigm eliminates many barriers and helps the developers of big and complicated supercomputer software.'

In 2020, the market for graphic accelerators experienced a growing deficit. The popular areas of their use are well-known: cryptocurrency mining and machine learning tasks. Meanwhile, scientific research also requires GPU accelerators for mathematical modelling of new materials and biological molecules.

'Creating powerful supercomputers and developing fast and effective programs is how tools are prepared for solving the most complex global challenges, such as the COVID-19 pandemic. Computation tools for molecular modelling are used globally today to search for ways to fight the virus,' said Nikolay Kondratyuk, researcher at HSE University and one of the paper's authors.

The most important programs for mathematical modelling are developed by international teams and scholars from dozens of institutions. Development is carried out within the open-source paradigm and under free licenses. The competition between two contemporary microelectronics giants, Nvidia and AMD, has led to the emergence of a new open-source infrastructure for GPU accelerator programming, AMD ROCm. The open-source character of this platform gives hope for maximum portability of codes developed with it to supercomputers of various types. This AMD strategy differs from Nvidia's approach, whose CUDA technology is a closed standard.

It did not take long to see the response from the academic community. Projects for the largest new supercomputers based on AMD GPU accelerators are close to completion. LUMI in Finland, with 0.5 exaFLOPS of performance (comparable to roughly 1,500,000 laptops), is being built rapidly. This year, a more powerful supercomputer, Frontier (1.5 exaFLOPS), is expected in the USA, and in 2023 an even more powerful El Capitan (2 exaFLOPS) is expected.
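The "1,500,000 laptops" comparison is easy to sanity-check with back-of-envelope arithmetic: dividing LUMI's peak by the laptop count gives the per-laptop performance the comparison implies.

```python
# Back-of-envelope check of the "0.5 exaFLOPS ~ 1,500,000 laptops"
# comparison: what per-laptop performance does it imply?

lumi_flops = 0.5e18      # 0.5 exaFLOPS
laptops = 1_500_000

per_laptop_gflops = lumi_flops / laptops / 1e9
print(f"implied laptop performance: {per_laptop_gflops:.0f} GFLOPS")
```

The result, a few hundred GFLOPS per laptop, is in the right ballpark for a modern consumer machine, so the comparison holds up.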

Credit: 
National Research University Higher School of Economics

Social media and science show how ship's plastic cargo dispersed from Florida to Norway

image: An ink cartridge washed up on a beach in Cornwall and recovered by the Lost at Sea Project

Image: 
Tracey Williams, Lost at Sea Project

A ship's container lost overboard in the North Atlantic has resulted in printer cartridges washing up everywhere from the coast of Florida to northern Norway, a new study has shown.

It has also resulted in the items weathering to form microplastics that are contaminated with a range of metals such as titanium, iron and copper.

The spillage is thought to have happened around 1,500 km east of New York, in January 2014, with the first beached cartridges reported along the coastline of the Azores in September the same year.

Since then, around 1,500 more have been reported on social media, with the greatest quantities along the coastlines of the UK and Ireland but also as far south as Cape Verde and north to the edge of the Arctic Circle.

The study was conducted by the University of Plymouth and the Lost at Sea Project, who have previously worked together on research suggesting LEGO bricks could survive in the ocean for up to 1,300 years.

For this new research, they combined sightings data reported by members of the public and oceanographic modelling tools to show how the cartridges reached their resting place.

Some were carried by the Azores and Canary currents around the North Atlantic Gyre, while others were transported northwards with the North Atlantic and Norwegian currents.

Writing in the journal Environmental Pollution, the researchers say the dates of first sightings suggested the cartridges travelled on average between 6 cm and 13 cm per second, demonstrating how quickly buoyant items can be dispersed across the oceans.
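The 6-13 cm/s figure follows from simple distance-over-time arithmetic between the spill date and the first beaching reports. A minimal consistency check (the spill-to-Azores distance and exact dates below are rough assumptions, not figures from the study):

```python
# Consistency check of the reported 6-13 cm/s average drift speed.
# The ~2,500 km spill-to-Azores distance and the exact dates are
# rough assumptions for illustration, not values from the study.

from datetime import date

distance_m = 2_500_000  # assumed distance from spill site to the Azores
elapsed_s = (date(2014, 9, 1) - date(2014, 1, 15)).days * 86_400

speed_cm_s = distance_m / elapsed_s * 100
print(f"implied drift speed: {speed_cm_s:.1f} cm/s")
```

The implied speed lands within the 6-13 cm/s range the researchers report, illustrating how first-sighting dates translate into drift-speed estimates.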

Through microscopic and X-ray fluorescence analyses, they also revealed a high degree of exterior weathering that resulted in the cartridge surfaces becoming chalky and brittle.

This has resulted in the formation of microplastics rich in titanium, the chemical fouling of interior ink foams by iron oxides, and, in some cases, the presence of an electronic chip containing copper, gold and brominated compounds.

Significantly, the study's authors say, the latter characteristic classifies the cartridges as electrical and electronic waste, meaning the finds are not governed by current, conventional regulations on plastic cargo lost at sea.

Lead author Dr Andrew Turner, Associate Professor (Reader) in Environmental Sciences at the University of Plymouth, said: "Cargo spillages are not common, but estimates suggest there could be several thousand containers lost at sea every year. They can cause harm to the seabed but, once ruptured, their contents can have an impact both where they are lost and - as shown in this study - much more widely. This research has also shown once again how plastics not designed to be exposed to nature can break down and become a source of microplastics in the environment. It also calls into question the relevance and robustness of current instruments and conventions that deal with plastic waste and its accidental loss at sea."

Tracey Williams, founder of the Cornwall-based Lost at Sea Project, added: "This study also highlights the potential usefulness of social media-led citizen science to marine research. Over many years, members of the public have helped us to show the amount of plastic in our seas and on our beaches. It is something people care passionately about and are committed to trying to solve."

Credit: 
University of Plymouth

Study: New York City nurses experienced anxiety, depression during first wave of COVID-19

New York nurses caring for COVID-19 patients during the first wave of the pandemic experienced anxiety, depression, and illness--but steps their hospitals took to protect them and support from their coworkers helped buffer against the stressful conditions, according to a study led by researchers at NYU Rory Meyers College of Nursing.

"A critical part of the public health response to the COVID-19 pandemic should be supporting the mental health of our frontline workers. Our study demonstrates that institutional resources--such as supportive staff relationships, professional development, providing temporary housing, and access to personal protective equipment--were associated with lower levels of anxiety and depression among nurses," said Christine T. Kovner, RN, PhD, the Mathey Mezey Professor of Geriatric Nursing at NYU Meyers and the study's lead author.

The COVID-19 pandemic has strained health systems around the world. The public health crisis has subjected nurses--the largest group of healthcare professionals responding to the pandemic--and other frontline workers to situations of unparalleled stress, as routine roles and responsibilities were disrupted. Not only have nurses worked tirelessly to care for very ill patients, many of whom died, but they themselves have been at risk of exposure to a life-threatening disease and worry about bringing it home to their loved ones.

Research shows that nurses responding to disasters can experience anxiety and depression, but a variety of factors--both personal and in the workplace--can help nurses cope with, adapt to, and recover from stressful conditions. This study, published in Nursing Outlook, examined what factors helped nurses responding to COVID-19 thrive and what factors may have challenged their mental health.

Kovner and her colleagues surveyed 2,495 nurses across four hospitals in the New York City area that are part of NYU Langone Health. This study was conducted from May through July 2020, during the first wave of the pandemic.

Key findings include:

Anxiety and depression were common among nurses during COVID-19's first wave.

Roughly 27 percent of nurses surveyed reported anxiety and 17 percent reported depression.

The more that nurses cared for patients with COVID-19, the higher their depression and anxiety.

Younger nurses were more likely to be anxious and depressed than were older nurses.

Nurses working in intensive care units were more likely to be depressed than were those in other settings.

Workplace support protected nurses' wellbeing.

When asked what helped the nurses to carry out their care of patients, the most common responses were coworker support (75 percent), followed by support from family and friends (58 percent).

More than half of the respondents were assigned to a new unit as part of their hospital's response to the pandemic. Of those, 77 percent felt that they had received sufficient support from staff at the new unit.

Less anxiety and depression were associated with more support in the workplace, better physician-nurse work relations, and access to hospital resources (e.g., adequate personal protective equipment, or PPE). Anxiety and depression were higher among those with more organizational constraints.

Nurses experienced COVID-19's impact not only at work, but in their personal and home lives as well.

Thirteen percent of nurses reported having contracted COVID-19 and 24 percent had family or a close friend with the illness.

Almost half of the nurses surveyed had to self-isolate and nearly one in five lived in temporary housing provided by the hospital.

Conflict between work and home responsibilities was linked to higher levels of depression and anxiety; residing in temporary housing was linked to lower depression and anxiety.

Nurses valued professional development and training tailored to the pandemic.

Training in the proper donning, doffing, and disposal of PPE was one of the top factors the majority of nurses identified as helping them care for patients with COVID-19.

Having a sense of mastery at work was the most protective factor against depression and anxiety.

Prior education and experience did not necessarily translate to a pandemic with a novel virus: 24 percent of nurses had prior experience with epidemics, and only 23 percent reported that their nursing education was helpful in caring for COVID-19 patients.

"Hospitals can play a role in building and sustaining resiliency in their workforces by understanding the triggers that contribute to stress, depression, and anxiety, and by developing resources to minimize these factors, particularly during crises," said Kovner.

Credit: 
New York University

Simple device improves care after kidney transplantation

image: The geko™ device, manufactured by Sky Medical Technology Ltd and distributed in Canada by Trudell Healthcare Solutions Inc., is a muscle pump activator which significantly improves blood flow by stimulating the body's 'muscle pumps.' Patients using the device following kidney transplantation experienced shorter hospital stays and reduced surgical site infections by nearly 60 per cent.

Image: 
Lawson Health Research Institute

LONDON, ON - In a published study, a team from Lawson Health Research Institute has found that a simple device can reduce swelling after kidney transplantation. The geko™ device, manufactured by Sky Medical Technology Ltd and distributed in Canada by Trudell Healthcare Solutions Inc., is a muscle pump activator which significantly improves blood flow by stimulating the body's 'muscle pumps.' Patients using the device following kidney transplantation experienced shorter hospital stays and reduced surgical site infections by nearly 60 per cent.

Kidney and simultaneous pancreas-kidney transplantations can significantly reduce mortality and improve the quality of life for patients with end stage renal disease. "After surgery, many of these organ recipients require a longer hospital stay due to delayed kidney function, infection, lack of mobility or edema," says Dr. Alp Sener, Lawson Scientist and Transplant Surgeon in the Multi-Organ Transplant Program at London Health Sciences Centre (LHSC).

Edema is swelling caused by excess fluid trapped in the body's tissues which can impact wound healing. The current standard of care for managing lower-limb edema and improving blood flow is thrombo-embolic-deterrent ("TED") stockings used with compression devices. Sleeves pumped with air squeeze the lower legs to boost circulation. They can be uncomfortable to wear, and the large pump can inhibit early mobility and disrupt sleep after surgery.

In a randomized controlled clinical trial spanning two years, 221 transplant recipients at LHSC either wore the standard TED stocking and pump or the geko™ device for six days after surgery. Dr. Sener's research team found that wearing the device increased urine output by 27 per cent and lowered weight gain by over a kilogram. With more urine produced and less fluid retention, patients experienced 31 per cent less swelling. The duration of costly hospitalization was shortened by over one day after kidney transplantation compared to the standard of care.

A 60 per cent reduction in wound infection rates was a striking observation. "Transplant patients are at a higher risk of infection due to the immunosuppressant medications needed after surgery," explains Dr. Sener, who is also the President of the Urologic Society for Transplantation and Renal Surgery, a global organization affiliated with the American Urological Association. "Reducing infection means a much better outcome for the patient and considering that recent data shows wound infections can cost the health care system thousands of dollars per person, it's a win-win situation."

Some of the study participants wore pedometers to track their steps, and those using the geko™ device had improved mobility after surgery. The team suspects this may be due to reduced swelling which could improve ease and comfort when moving.

"The study results have been both surprising and exciting. Not only have we cut down wound infection rates but we have also seen a considerable improvement in the new organ's function following transplantation. Patients report feeling more satisfied with the transplant process and are more mobile," says Dr. Sener. The geko™ device is now being offered to patients at LHSC in recovery after receiving a new kidney.

Ruben Garcia, 68 years old, recently received a new kidney from his daughter, Ruby, who was a match as a living kidney donor. Following his surgery, Garcia found it difficult to get out of bed due to the pain and swelling, and the function of his new kidney was very low. "My surgeon explained in very simple terms that it was as if my new kidney wasn't awake yet," describes Garcia.

Dr. Sener recommended that Garcia use the geko™ device to help stimulate blood flow in a way that is similar to walking. Garcia was soon able to sit up on a chair and by the next day he was walking. "My kidney woke up and started working again! I could feel the device working and it was comfortable to wear, almost like a massage for my legs. I'm very grateful for the care that I received."

Dr. Sener adds that "using a muscle pump activator could be a game changer for other procedures like orthopedic implants where wound infection can have disastrous consequences or in surgeries where wound infections are more common such as in cancer and intestinal surgery."

The geko™ device is non-invasive, self-adhering, battery-powered and recyclable. It generates neuromuscular electro-stimulation and systemic blood flow that equates to 60 per cent of that achieved by walking. Pain-free muscle contraction compresses deep veins in the lower legs to create better blood flow in these vessels and return blood to the heart. It is particularly well suited to hospital settings as it is portable and requires minimal training. For the indications for the use of the geko™ device, go to http://www.gekodevices.com

"The results of the study provide further evidence that the geko™ device is an effective treatment option that can improve outcomes for patients and help them return home sooner, while reducing costs for the health-care system," says George Baran, Executive Chairman of the Trudell Medical Group and a Director of Sky Medical.

The study "Daily use of a muscle pump activator device reduces duration of hospitalization and improves early graft outcomes post-kidney transplantation: A randomized controlled trial" is published in CUAJ.

Credit: 
Lawson Health Research Institute