Tech

Stirling research evaluates effectiveness of conservation efforts

Image: Research lead (Credit: University of Stirling)

New research from the University of Stirling into the effectiveness of international conservation projects could help to save endangered species from extinction.

The research, led by Laura Thomas-Walters of the Faculty of Natural Sciences, has been published in the respected journal People and Nature. It focused on evaluating a conservation campaign in São Tomé and Principe that aimed to persuade local people to stop eating sea turtle meat and eggs.

Animal conservation faces a number of challenges, including climate change and the wildlife trade, as well as pressures on time and money. Knowing which measures work best and are most appropriate is essential to delivering effective conservation activity, but previous efforts to evaluate projects have encountered difficulties and delivered varying results.

The Stirling research team sought to address those difficulties by exploring better ways to evaluate conservation projects that work to change human behaviour. The team conducted interviews and completed surveys with local people to determine if the campaign, which used tactics including television advertisements and cooking competitions, had been successful in reducing their consumption of sea turtles.

Laura Thomas-Walters, research project lead, said: "In conservation we are faced with a whole host of pressing issues that human actions cause, but we are short on time and money. Evaluating our projects is really important if we want to learn from them and improve future interventions.

"Unusually, for this research we measured both behavioural outcomes like attitudes or consumption, and conservation impacts, like sea turtle poaching. We found a decrease in self-reported sea turtle egg consumption and a decrease in the poaching of adult sea turtles.

"However, a simultaneous increase in law enforcement targeted at poaching may also have impacted the figures. Our recommendations for future projects include combining different outcome measures to triangulate hard-to-measure behaviours, and using theory-based evaluation methods."

The campaign, 'Tataluga - Mém Di Omali', which means "Sea Turtle - Mother of the Sea" in the local Forrô dialect, featured community events, such as cooking contests to promote alternative food products, and theatre performances, as well as a football competition. It also utilised mass media, including billboards, and television and radio adverts.

The Stirling team interviewed local people about their habits and sea turtle consumption, both before and after the campaign took place. The number of people who indicated they ate sea turtle eggs decreased from 40 per cent to 11 per cent during the campaign but this may also have been due to increased anti-poaching law enforcement.

Credit: University of Stirling

Climate change presents new challenges for the drinking water supply

Image: The Rappbode Reservoir in the Harz region is Germany's largest drinking water reservoir. (Credit: André Künzelmann)

The Rappbode Reservoir in the Harz region is Germany's largest drinking water reservoir, supplying around one million people with drinking water in areas including the Halle region and the southern part of the state of Saxony-Anhalt. Water temperatures in the reservoir now have the potential to increase significantly due to climate change. If average global warming reaches between 4 and 6 degrees by the year 2100, as the current trend suggests, temperature conditions in the Rappbode Reservoir will become comparable to those in Lake Garda and other lakes south of the Alps. In an article in the journal Science of the Total Environment, a team of researchers led by the Helmholtz Centre for Environmental Research (UFZ) writes that the reservoir's operators could partially offset the impacts this will have on the drinking water supply - to do so, they would have to change the way the reservoir is managed.

The impacts of climate change can already be seen in the Rappbode Reservoir: Over the past 40 years, the water surface temperature in the reservoir has increased by around 4 degrees in the summer months. This trend could continue, as has now been demonstrated by a team of researchers led by Dr Karsten Rinke, who researches lakes at UFZ. Working on the basis of a lake model developed by US researchers, the team took into account potential reservoir management strategies to forecast the impacts climate change could have on water temperatures and on the lake's physical structure, which control the stratification and seasonal mixing of the body of water. Their research looked at three scenarios for future greenhouse gas emissions. The so-called "representative concentration pathways" (RCPs) describe whether greenhouse gas emissions will be halted (RCP 2.6), will continue to rise (RCP 6.0) or will continue to increase unabated (RCP 8.5) by 2100. According to the Intergovernmental Panel on Climate Change (IPCC), the latter case would result in average global warming of more than 4 degrees by the end of this century.

For the RCP 2.6 and RCP 6.0 scenarios, the study's authors projected that the average temperature on the water surface of the Rappbode Reservoir is set to increase by 0.09 degrees or 0.32 degrees respectively every decade by the year 2100. This would correspond to a total increase of around 0.7 degrees (RCP 2.6) and around 2.6 degrees (RCP 6.0) by the end of this century. As expected, the increase in temperatures would be the highest under the RCP 8.5 scenario, which would see the water temperature increasing by 0.5 degrees every decade or approx. 4 degrees by 2100.
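For readers who want to check the arithmetic, the totals quoted above follow from multiplying each per-decade rate by the roughly eight decades remaining until 2100. The short sketch below simply reproduces the article's figures; the eight-decade horizon is an assumption made here for illustration, not a parameter taken from the study.

```python
# Reproduce the article's warming totals: per-decade rate x decades remaining to 2100.
# The eight-decade horizon (roughly 2020-2100) is an assumption for illustration.
rates_per_decade = {"RCP 2.6": 0.09, "RCP 6.0": 0.32, "RCP 8.5": 0.50}  # degrees per decade
decades_to_2100 = 8

for scenario, rate in rates_per_decade.items():
    print(f"{scenario}: ~{rate * decades_to_2100:.1f} degrees by 2100")
# Prints roughly 0.7, 2.6 and 4.0 degrees, matching the totals given in the text.
```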

However, in terms of drinking water use, what happens in the deeper strata of the reservoir - i.e., at depths of 50 metres and below - is more serious, as this is where raw water is taken out before being treated to prepare it as drinking water. Impacts by 2100 would be relatively minor under the RCP 2.6 and RCP 6.0 scenarios, as the water temperature would continue to be around 5 degrees year-round. However, water temperatures will increase significantly under the RCP 8.5 scenario - by nearly 3 degrees by the end of the century. As a result, the water in the depths of the reservoir would warm to around 8 degrees. "This would turn a reservoir in Germany's northernmost highlands into a body of water comparable to Lake Maggiore or Lake Garda nowadays," says UFZ scientist Rinke. An increase of this magnitude would have consequences because it would significantly accelerate the speed of biological metabolic processes. "A temperature increase to 8 degrees nearly doubles oxygen demand, that is, the amount of oxygen organisms consume during their respiration and degradation processes," says lead author Chenxi Mi, who is focusing on climate impacts on the Rappbode Reservoir in his doctorate at UFZ. Increased oxygen consumption will place additional pressure on the water's oxygen budget, because the duration of summer stagnation - the phase of stable temperature stratification in lakes in which the deep water is closed off from oxygen supply from the atmosphere - is already extending due to climate change. In addition, warmer water is unable to absorb as much oxygen. Potential consequences include intensified release of nutrients and dissolved metals from the sediment, algae growth and an increase in blue-green algae.

In other words, the 8.5 scenario would have impacts on the drinking water supply if it were to occur. The reservoir's operators draw the raw water from the lowermost strata for good reason, as the water there is cold and contains only low levels of suspended substances, dissolved metals, algae, bacteria and potentially pathogenic microorganisms. If the oxygen content there decreases more rapidly due to the rising water temperature, the risk of contamination increases, for example due to substances released from the sediment and greater bacteria growth. Treating the water would therefore require a greater effort on the part of the operators, and they would have to deal with higher demands in terms of the treatment capacity they would need to reserve. "This means preventing the deep water from warming is also worthwhile from the perspective of the drinking water supply, and the ideal way to do this is ambitious climate policies that limit warming," says Rinke.

But the operators are not completely powerless against the warming of the deep water in the reservoir. The model simulations set up by Rinke's team show that a share of the heat can be exported by using a clever system to withdraw the water. This concerns the water that is released to the downstream waters - that is, the water that is withdrawn and drains into the watercourse below the reservoir in order to keep the discharge conditions there stable. This so-called downstream discharge would need to be withdrawn not from the lower strata, as it has been thus far, but rather from near the surface. "This approach would allow the additional heat caused by climate change to be released again," Rinke explains. However, he adds, it would be impossible to prevent the deep water from heating up if the air temperature increases beyond 6 degrees. "Even though operators have had to cope more with a shortage of water due to the very dry years we've had recently, it's just as important to think about the quality of the water. In terms of reservoir management, we definitely have options and can respond to new conditions caused by climate change. In this way, we can alleviate certain negative impacts through climate adaptation measures."

The operators of the Rappbode Reservoir at the Talsperrenbetrieb Sachsen-Anhalt company are aware of this. They have been working closely with Karsten Rinke and his team of researchers at UFZ for many years to assess the impacts of climate change and to discuss potential options for adapting the Rappbode Reservoir. The Talsperrenbetrieb is already planning new infrastructure that will make it possible to implement the new management strategies.

Credit: Helmholtz Centre for Environmental Research - UFZ

COVID-19: Politecnico di Milano research uncovers the secrets of viral sequences

Image: From the SARS-CoV-2 genome (a), sequences of nucleotides and amino acids are extracted (b); sequences are then deposited in worldwide open repositories: GenBank, GISAID, COG-UK (c), and imported into the centralized database at Politecnico, where the search engine ViruSurf is accessed (d). (Credit: Politecnico di Milano)

Since the beginning of 2020, labs from all around the world have been sequencing the material from positive tests of people affected by COVID-19 and depositing the sequences mostly in three collection points: GenBank, COG-UK, and GISAID. Fast exploration of this huge amount of data is important for understanding how the genome of the virus is changing. To enable fast "surfing" over these data, the research group of Politecnico di Milano led by Prof. Stefano Ceri has developed ViruSurf, a search engine operating on top of a centralized database stored at Politecnico. The database is periodically reloaded from the three sources and as of today contains 200,516 sequences of SARS-CoV-2, the virus causing COVID-19, and 33,256 sequences of other viral species also associated with epidemics affecting humans, such as SARS, MERS, Ebola, and Dengue.

Every sequence is described from four perspectives: the biological features of the virus and the host, the sequencing technology, the project that produced the original data, and the mutations of the whole sequence of nucleotides and of gene-specific amino acids. The advantage provided by ViruSurf is the use of an algorithm for computing viral mutations homogeneously across sources, using cloud computing. The database is optimized to give quick responses to users of the search engine.
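As a purely hypothetical illustration of those four perspectives (the field names below are invented for this sketch and are not ViruSurf's actual schema), a single sequence record might be organized like this:

```python
# Hypothetical sequence record organized along the four descriptive perspectives
# named above. All field names and values are illustrative, not ViruSurf's schema.
sequence_record = {
    "biology": {"virus_species": "SARS-CoV-2", "host": "Homo sapiens"},
    "technology": {"sequencing_platform": "Illumina", "coverage": 1000},
    "project": {"repository": "GISAID", "submitting_lab": "Example Lab"},
    "mutations": {
        "nucleotide_variants": ["A23403G"],
        "amino_acid_variants": [{"gene": "S", "change": "D614G"}],
    },
}
```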

Among the future developments of ViruSurf, the most important, funded by a six-month project by EIT Digital, is a bioinformatic service for ingesting new viral sequences, which highlights the presence of viral mutations associated with enhanced or reduced severity and virulence as they are discovered. Used in clinics, particularly when the pandemic is spreading less acutely, it will support the addition of critical information to the patient health record; other uses will be possible in the context of animal farming or the food chain. The system will soon allow the tracing of epitopes - amino acid sequences that are used in vaccine design - for instance to associate epitopes with mutations of the virus that could be present in given countries of the world and that could affect vaccines.

"In the GeCo project, financed by the European Research Council, we had already developed a search engine for datasets describing the human genome, called GenoSurf; at the beginning of the pandemic, there was no such system for viral sequences. To better understand its requirements, we interviewed about twenty expert virologists from all over the world. The result is a user-friendly system: any researcher can connect to it and perform queries, for instance, about when a viral mutation started and how it has spread in the world"--says Stefano Ceri, the project leader. The article is published on a high relevance journal, Nucleic Acids Research, in the database issue that every year collects the descriptions of the most significant biological databases. The article is authored also by Pietro Pinoli, algorithm designer, Arif Canakoglu, software architect, Anna Bernasconi, data designer, Tommaso Alfonsi, designer of the data loading pipeline, and Damianos P. Melidis from L3S (Hannover), author of some algorithms.

Credit: Politecnico di Milano

Minimizing the impact of restaurant shutdowns and restrictions in China amid the COVID-19 crisis

Image: Assistant Professor, University of Houston Conrad N. Hilton College of Hotel and Restaurant Management (Credit: University of Houston)

Shortly after the COVID-19 pandemic first broke out in Wuhan, China, in January, government-issued lockdowns and business restrictions were implemented across the country, affecting more than 1.2 billion people and all types of businesses. With social distancing mandates in full effect, the restaurant industry was particularly hard hit - forced to close dining rooms while pivoting to curbside or delivery services only.

While the financial devastation caused by the pandemic and subsequent lockdowns has been well-documented, a new study led by the University of Houston Conrad N. Hilton College of Hotel and Restaurant Management identifies aspects of restaurant operations that benefitted the bottom line despite the turmoil.

Among the findings: revenue from delivery services dramatically increased, discounts offered to customers weren't effective, and there was significant variation in financial performance across service types.

The researchers analyzed sales data collected by Meituan.com, the largest delivery service provider in China, from more than 86,000 small- and medium-sized restaurants across nine Chinese cities. The study is published in the International Journal of Hospitality Management.

The average annual growth for delivery sales was more than 26%. Meanwhile, the difference in delivery sales between casual dining and fine dining establishments substantially increased from 10% in 2019 to more than 64% in the first quarter of 2020. These findings suggest the need for fine dining restaurants to diversify their source of sales by adding delivery and curbside pickup, just as their less fancy casual dining counterparts were able to do.

The data analysis also revealed that discounts offered during the public health emergency failed to increase sales, indicating consumers are much more concerned about their health than price or value. Discounts are typically a surefire way to increase revenue during normal operations, said Kim.

"In a normal situation, people are very price sensitive, but when there is a health concern like COVID-19, then they are willing to spend more money to consume food if they believe it's safe and convenient," said Jaewook Kim, lead study author and UH assistant professor who conducted the analysis with Jewoo Kim and Yiqi Wang from Iowa State University.

Average delivery time also increased, from 38 minutes to 47 minutes, indicating that substantially increased demand and enlarged delivery areas slowed the pace of delivery services.

The research team is currently performing a second analysis of sales data collected from restaurants in the United States, which currently leads the world in coronavirus cases and deaths, to compare with their findings from China. They hope the data provides partial solutions for restaurateurs trying to minimize the impacts of business shutdowns, and to help them prepare for future crises.

"The restaurant industry is so vulnerable, given such thin margins, and it's been made much worse by the pandemic, so understanding the data is very important to improving the service model and adapting to expanded delivery services," said Kim. "It's a new world now and important for restaurateurs to consider every resource available, so they need to challenge themselves to change their operational characteristics and invest capital to employ different types of delivery services."

Credit: University of Houston

CCNY researchers overcome barriers for bio-inspired solar energy harvesting materials

Image: Top image adapted from the Journal of Physical Chemistry Letters; bottom image, Kara Ng, Nature Chemistry 2020.

Inspired by nature, researchers at The City College of New York (CCNY) have demonstrated a synthetic strategy to stabilize bio-inspired solar energy harvesting materials. Their findings, published in the latest issue of Nature Chemistry, could be a significant breakthrough in functionalizing molecular assemblies for future solar energy conversion technologies.

In almost every corner of the world, despite extreme heat or cold, you will find photosynthetic organisms striving to capture solar energy. Uncovering nature's secrets on how to harvest light so efficiently and robustly could transform the landscape of sustainable solar energy technologies, especially in the wake of rising global temperatures.

In photosynthesis, the first step (that is, light-harvesting) involves the interaction between light and the light-harvesting antenna, which is composed of fragile materials known as supra-molecular assemblies. From leafy green plants to tiny bacteria, nature designed a two-component system: the supra-molecular assemblies are embedded within protein or lipid scaffolds. It is not yet clear what role this scaffold plays, but recent research suggests that nature may have evolved these sophisticated protein environments to stabilize their fragile supra-molecular assemblies.

"Although we can't replicate the complexity of the protein scaffolds found in photosynthetic organisms, we were able to adapt the basic concept of a protective scaffold to stabilize our artificial light-harvesting antenna," said Dr. Kara Ng. Her co-authors include Dorthe M. Eisele and Ilona Kretzschmar, both professors at CCNY, and Seogjoo Jang, professor at Queens College.

Thus far, translating nature's design principles to large-scale photovoltaic applications has been unsuccessful.

"The failure may lie in the design paradigm of current solar cell architectures," said Eisele. However, she and her research team, "do not aim to improve the solar cell designs that already exist. But we want to learn from nature's masterpieces to inspire entirely new solar energy harvesting architectures," she added.

Inspired by nature, the researchers demonstrate how small, cross-linking molecules can overcome barriers towards functionalization of supra-molecular assemblies. They found that silane molecules can self-assemble to form an interlocking, stabilizing scaffold around an artificial supra-molecular light-harvesting antenna.

"We have shown that these intrinsically unstable materials, can now survive in a device, even through multiple cycles of heating and cooling," said Ng. Their work provides proof-of-concept that a cage-like scaffold design stabilizes supra-molecular assemblies against environmental stressors, such as extreme temperature fluctuations, without disrupting their favorable light-harvesting properties.

Credit: City College of New York

Scientists organize to tackle crisis of coral bleaching

COLUMBUS, Ohio - An international consortium of scientists has created the first-ever common framework for increasing comparability of research findings on coral bleaching.

"Coral bleaching is a major crisis and we have to find a way to move the science forward faster," said Andréa Grottoli, a professor of earth sciences at The Ohio State University and lead author of a paper on guidelines published Saturday, Nov. 21 in the journal Ecological Applications.

The common framework covers a broad range of variables that scientists generally monitor in their experiments, including temperature, water flow, light and others. It does not dictate what levels of each should be present during an experiment into the causes of coral bleaching; rather, it offers a common framework for increasing comparability of reported variables.

"Our goal was to create a structure that would allow researchers to anchor their studies, so we would have a common language and common reference points for comparing among studies," said Grottoli, who also is director of the consortium that developed the common framework.

Coral bleaching is a significant problem for the world's ocean ecosystems: When coral becomes bleached, it loses the algae that live inside it, turning it white. Coral can survive a bleaching but being bleached puts coral at higher risk for disease and death. And that can be catastrophic: Coral protects coastlines from erosion, offers a boost to tourism in coastal regions, and is an essential habitat to more than 25% of the world's marine species.

Bleaching events have been happening with greater frequency and in greater numbers as the world's atmosphere -- and oceans -- have warmed because of climate change.

"Reefs are in crisis," Grottoli said. "And as scientists, we have a responsibility to do our jobs as quickly, cost-effectively, professionally and as well as we can. The proposed common framework is one mechanism for enhancing that."

The consortium leading this effort is the Coral Bleaching Research Coordination Network, an international group of coral researchers. Twenty-seven scientists from the network, representing 21 institutions around the world, worked together as part of a workshop at Ohio State in May 2019 to develop the common framework.

The goal, Grottoli said, is to allow scientists to compare their work, make the most of the coral samples they collect, and find ways to create a common framework for coral experimentation.

Their recommendations include guidelines for experiments that help scientists understand what happens when coral is exposed to changes in light or temperature over a short period of time, a moderate period, and long periods. The guidelines include a compendium of the most common methods used for recording and reporting physical and biological parameters in a coral bleaching experiment.

That such a framework hasn't already been established is not surprising: The scientific field that seeks to understand the causes of and solutions for coral bleaching is relatively young. The first reported bleaching occurred in 1971 in Hawaii; the first widespread bleaching event was reported in Panama and was connected with the 1982-83 El Niño.

But experiments to understand coral bleaching didn't really start in earnest until the 1990s -- and a companion paper by many of the same authors found that two-thirds of the scientific papers about coral bleaching have been published in the last 10 years.

Researchers are still trying to understand why some coral species seem to be more vulnerable to bleaching than others, Grottoli said, and setting up experiments with consistency will help the science move forward more quickly and economically.

"Adopting a common framework for experiments around coral bleaching would make us more efficient as a discipline," Grottoli said.

"We'd be able to better collaborate, and to build on one another's work more easily. It would help us progress in our understanding of coral bleaching -- and because of climate change and the vulnerability of the coral, we need to progress more quickly."

Credit: Ohio State University

A rich source of nutrients under the Earth's ice sheets

Image: A glacial meltwater river that has drained from the Greenland Ice Sheet. These rivers contain high amounts of suspended glacial flour as the ice sheet acts like a natural bulldozer, giving the rivers a grey milky color. The meltwater is enriched in surprisingly high concentrations of biologically essential micronutrients like iron, manganese or zinc. (Credit: Jon Hawkings, GFZ)

Trace elements such as iron, manganese and zinc are an integral part of the biogeochemical processes on the Earth's surface. As micronutrients, they play an essential role for the growth of all kinds of organisms and thus the Earth's carbon cycle. Below ice sheets, which cover around ten percent of the Earth's land surface, larger quantities of these substances are mobilised than previously assumed. This is shown by new data from Greenland and Antarctica, which were collected and analysed by an international research team led by Jon Hawkings from the GFZ German Research Centre for Geosciences in Potsdam and Florida State University (USA). They provide important insights into previously unknown processes at the boundary of ice, meltwater and rock. Because the ice masses are significantly influenced by global warming, new perspectives are emerging on the consequences climate change has for critical biogeochemical processes, including those in surrounding ecosystems such as oceans, lakes and wetlands. The study is published today in the journal PNAS.

Under the Earth's ice sheets, meltwater forms an extensive hidden wetland of rivulets, rivers and lakes. During the last forty years, over 400 subglacial lakes have been discovered in Antarctica alone, some as large as the Great Lakes of North America. At the boundary between ice, water and rock, a complex ensemble of chemical, physical and microbiological forces is at work, breaking up and grinding rock and releasing trace elements into the meltwater, which is carried downstream. These chemical elements occur only in very low concentrations, hence the name. Nevertheless they are - like vitamins - essential as nutrients for all living things.

How and in which quantities trace elements are released under the Greenland and Antarctic ice sheets and eventually flow into the adjacent ecosystems, and what role they play in these ecosystems and the global carbon cycle at large, has not yet been studied in detail. This is because measurement campaigns in these remote regions of the world are an enormous logistical and technical challenge.

Elaborate sampling

In order to collect samples from the waters under the Greenland and Antarctic ice sheets and analyse them in the laboratory, Jon Hawkings from GFZ collaborated with an international and interdisciplinary research team. Colleagues Mark Skidmore and John Priscu from Montana State University (USA) led a project to drill more than 1000 metres into the Antarctic ice sheet as part of their SALSA project. This enabled them to tap into the nine-kilometre long and 15-metre deep Mercer Subglacial Lake. "There's a science reason for looking at that specific lake, but then there is the context of these lakes being part of this greater hydrological system," Mark Skidmore said. "So, we want to see what's being generated beneath the ice sheet and how that connects to the coastal environments."

Jon Hawkings himself and colleagues under the lead of Jemma Wadham of the University of Bristol (UK) took samples from sub-ice waters emerging from Leverett Glacier in Greenland over a three-month period in the summer melt season.

The samples were analysed in ultra-clean laboratories to avoid contamination. The researchers filtered the meltwater samples at multiple levels to sort the sample concentrations by size, as many of these trace elements can exist as extremely small nanoparticulate minerals. They determined their chemical composition using particularly sensitive mass-spectrometry methods.

Surprisingly high concentration of iron & Co.

Hawkings and his colleagues discovered that significant amounts of trace elements are released in the meltwaters below the ice masses. They found these meltwater concentrations can exceed those in rivers and the open ocean by many times. For example, the value for dissolved iron in the Antarctic subglacial lake was more than 1000 micrograms per litre and not around five, as would be expected in dilute ice melt.

"For a long time it was assumed that in the icy regions of the earth trace elements are present in such miniscule quantities that they are of little importance for global elemental cycles. On the contrary, our results show that ice sheets may play a key role in regional mobilization of these elements. The impacts of this need to be further monitored and analysed in the context of climate change. We have now laid a baseline for this," says Jon Hawkings.

Insights into weathering processes under the ice

Furthermore, the concentrations of the individual elements as well as their ratios and the size distribution between dissolved and nanoparticulate mineral forms tell the researchers something about the source material, the sub-ice sheet weathering processes and the paths taken by the water before sampling. For example, it is known that the element vanadium occurs primarily in silicate rock minerals rather than carbonate rock minerals. Elevated concentrations found in this study indicate that higher rates of silicate mineral weathering are occurring under ice sheets than previously thought. Importantly, silicate mineral weathering is a sink for carbon dioxide. Iron, on the other hand, is known to oxidise in an oxygen-rich environment, resulting in precipitated "rust". Large quantities of dissolved iron therefore indicate that some of the water may originate from a region with little oxygen. The researchers also found trace elements like aluminium, iron and titanium occurred in higher concentrations in Antarctica than in Greenland. They therefore hypothesise that the meltwater in the southern polar region has much longer residence times under the ice sheet and greater hydrological isolation than in the northern polar region.

Consequences for ideas on iron fertilization

The new findings are particularly relevant for our understanding of nutrient cycling in the Southern Ocean. There the water is considered to be rich in nutrients like nitrogen and phosphorus but depleted in iron. For this reason, phytoplankton, the plants of the ocean, the base of the global food pyramid and an important CO2 sink, do not grow to their maximum potential. This "iron limitation" has been the subject of previous geoengineering projects to sequester carbon dioxide from the atmosphere by seeding the ocean with iron. The results of Hawkings and his colleagues are consistent with observations of higher quantities of iron and phytoplankton in the immediate vicinity of the Antarctic Ice Sheet. Their results suggest that the ice sheet may naturally fertilize the coastal regions of the Southern Ocean by providing a supply of iron for phytoplankton. To what degree and how this might change in the future with climatic warming remain open questions for further research.

On the trail of life's limits

Hawkings and his collaborators investigated 17 different trace elements. "Each of these tells us its own story and we work like detectives, trying to make a coherent overall narrative out of all the data," says the geoscientist. "We are interested in exploring the limits of life on Earth in terms of the availability of energy and nutrients, and this helps tell us part of that story. We are only just beginning to understand the importance of these large ice masses in this context. Hopefully our research also helps in starting to answer many important outstanding scientific questions, which include the influence of climate change: How will these biogeochemical cycles change if more ice melts? Will this release more and more trace elements or will these processes be slowed down? In addition, it is still open what happens to the substances on their way into the oceans and how much ultimately reaches marine organisms."

Collaborator and SALSA project lead John Priscu points out the importance of interdisciplinary work for scientific discoveries: "This paper intersects many disciplines and shows the power of international collaboration. Results in this manuscript have transformed our view of how polar ice sheets influence the Earth System."

Credit: GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

UCF researcher zeroes in on critical point for improving superconductors

ORLANDO, Nov. 23, 2020 - The search for a superconductor that can work under less extreme conditions than hundreds of degrees below zero or at pressures like those near the center of the Earth is a quest for a revolutionary new power -- one that's needed for magnetically levitating cars and ultra-efficient power grids of the future.

But developing this kind of "room temperature" superconductor is a feat science has yet to achieve.

A University of Central Florida researcher, however, is working to move this goal closer to realization, with some of his latest research published recently in Communications Physics, a Nature Research journal.

In the study, Yasuyuki Nakajima, an assistant professor in UCF's Department of Physics, and co-authors showed they could get a closer look at what is happening in "strange" metals.

These "strange" metals are special materials that show unusual temperature behavior in electrical resistance. The "strange" metallic behavior is found in many high-temperature superconductors when they are not in a superconducting state, which makes them useful to scientists studying how certain metals become high-temperature superconductors.

This work is important because insight into the quantum behavior of electrons in the "strange" metallic phase could allow researchers to understand a mechanism for superconductivity at higher temperatures.

"If we know the theory to describe these behaviors, we may be able to design high-temperature superconductors," Nakajima says.

Superconductors get their name because they are the ultimate conductors of electricity. Unlike ordinary conductors, they have zero resistance; resistance acts like an electronic "friction" that causes electricity to lose power as it flows through a conductor such as copper or gold wire.

This makes superconductors a dream material for supplying power to cities as the energy saved by using resistance-free wire would be huge.

Powerful superconductors also can levitate heavy magnets, paving the way for practical and affordable magnetically levitating cars, trains and more.

To turn a conductor into a superconductor, the metal must be cooled to an extremely low temperature, at which point it abruptly loses all electrical resistance, a transition that physics has yet to explain with a fully comprehensive theory.

These critical temperatures at which the switch is made often fall between roughly -220 degrees Fahrenheit and temperatures approaching absolute zero (about -460 degrees Fahrenheit), and reaching them typically involves an expensive and cumbersome cooling system using liquid nitrogen or helium.

Some researchers have achieved superconductors that work at about 59 degrees Fahrenheit, but only at a pressure of more than 2 million times that at the Earth's surface.

In the study, Nakajima and the researchers were able to measure and characterize electron behavior in a "strange" metallic state of non-superconducting material, an iron pnictide alloy, near a quantum critical point at which electrons switch from having predictable, individual behavior to moving collectively in quantum-mechanical fluctuations that are challenging for scientists to describe theoretically.

The researchers were able to measure and describe the electron behavior by using a unique metal mix in which nickel and cobalt were substituted for iron in a process called doping, thus creating an iron pnictide alloy that didn't superconduct down to -459.63 degrees Fahrenheit, far below the point at which a conductor would typically become a superconductor.

"We used an alloy, a relative compound of high temperature iron-based superconductor, in which the ratio of the constituents, iron, cobalt and nickel in this case, is fine-tuned so that there's no superconductivity even near absolute zero," Nakajima says. "This allows us to access the critical point at which quantum fluctuations govern the behavior of the electrons and study how they behave in the compound."

They found the behavior of the electrons was not described by any known theoretical predictions, but that the scattering rate at which the electrons were transported across the material can be associated with what's known as the Planckian dissipation, the quantum speed limit on how fast matter can transport energy.
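For reference, the commonly quoted form of the Planckian bound (a standard expression in the literature, not spelled out in the article) ties the scattering rate to temperature alone:

```latex
% Planckian dissipation: the scattering rate is set by temperature,
% with k_B the Boltzmann constant and \hbar the reduced Planck constant.
\frac{1}{\tau} \sim \frac{k_{\mathrm{B}} T}{\hbar}
```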

"The quantum critical behavior we observed is quite unusual and completely differs from the theories and experiments for known quantum critical materials," Nakajima says. "The next step is to map the doping-phase diagram in this iron pnictide alloy system."

"The ultimate goal is to design higher temperature superconductors," he says. "If we can do that, we can use them for magnetic resonance imaging scans, magnetic levitation, power grids, and more, with low costs."

Unlocking ways to predict the resistance behavior of "strange" metals would not only improve superconductor development but also inform theories behind other quantum-level phenomena, Nakajima says.

"Recent theoretical developments show surprising connections between black holes, gravity and quantum information theory through the Planckian dissipation," he says. "Hence, the research of 'strange' metallic behavior has also become a hot topic in this context."

Credit: University of Central Florida

Proving viability of injection-free microneedle for single-administration of vaccines

A single-use, self-administered microneedle technology developed by UConn faculty to provide immunization against infectious diseases has recently been validated by preclinical research trials.

Recently published in Nature Biomedical Engineering, the development and preclinical testing of the microneedle patches was reported by UConn researchers in the lab of Thanh Nguyen, assistant professor in the Departments of Mechanical Engineering and Biomedical Engineering.

The concept of a single-injection vaccine, which is recognized as a preferable vaccination approach by the World Health Organization (WHO), has been investigated for many years. Previous efforts to create such a single-injection vaccine include a technology called SEAL (StampEd Assembly of Polymer Layer), developed in 2017 by Nguyen, to create single-injection vaccine microparticles which can deliver vaccines after several defined periods, simulating multiple bolus injections.

However, these microparticles require a large needle for the injection. Additionally, only a limited number of particles can be loaded into the needle, which means only a limited vaccine dose can be delivered. Ultimately, the microparticles still require traditional injections, which are painful and produce unfavorable biohazard waste in the form of disposed sharp syringes.
Image: A tiny microneedle patch being held between the gloved fingers of a UConn researcher. (Courtesy of Thanh Nguyen)

"It has been recognized for a long time that there is a need to eliminate many injections in conventional vaccination process," Thanh says. "While booster and repeated shots of vaccines are important to sustain immune-protection, these injections are associated with pain, high costs, and complicated injection schedules, causing a very low patient compliance. The issue becomes more problematic for patients in developing countries due to their limited access to health care providers. In such places, parents struggle to remember the schedule and cannot afford to repeatedly travel long distances with their children to medical centers to receive multiple booster doses of vaccines."

As detailed in Nature Biomedical Engineering, to overcome these problems, Nguyen's lab at UConn developed a microneedle skin patch, which only requires a single administration to perform exactly the same programmable delayed release of vaccine, as that obtained from the SEAL microparticles.

The microneedle patch avoids any painful injections, offering a significant enhancement from the perspective of patients. Extensive research has shown microneedle skin patches are almost painless, and could even be self-administered by patients at home. The patch is small, portable, and similar to a nicotine patch, which could be easily distributed to all people over the world for self-administration in the case of a pandemic such as the COVID-19 crisis to quickly create a pan-immunity at the global scale.

The microneedles have a core-shell microstructure, in which the microneedle shells are made with a biodegradable medical polymer that is FDA-approved for implants and offers unique drug-release kinetics, which allows a preprogrammed burst release of vaccine loads over a period of a few days to more than a month from a single administration. The microneedles can be easily inserted and fully embedded inside the dermal layer, thanks to the minuscule tips and smooth geometry of the needles.

To create this vaccine microneedle patch, Khanh Tran, a PhD student in Nguyen's lab and the lead author of the published work, adapted the SEAL technology to assemble different microneedle components, including a cap, shell, and vaccine core. These components are manufactured in an additive manner, similar to the approach of 3D printing, to create arrays of core-shell microneedles over a large area.

Nguyen's team devised several new approaches to overcome many issues of the existing SEAL technology. The key novelty of their new manufacturing process is to micro-mold vaccines into the shape of the microneedle core and insert all of the molded vaccine cores into arrays of microneedle shells at the same time, offering a fabrication method similar to the manufacturing process of computer chips.

"This is a tremendous advantage, compared to the previously-reported SEAL and other traditional methods to fabricate vaccine carriers, in which vaccine is often filled slowly one by one into each polymeric shell/carrier," Tran says.

In the preclinical trials, the researchers inserted microneedles loaded with a clinically available vaccine (Prevnar-13) into the skin of rats in a minimally invasive manner. The patch application caused no skin irritation during long-term implantation, and triggered a high immune protection response against a lethal dose of infectious pneumococcal bacteria. The results from the one-time administration were similar to that obtained from multiple injections of the same vaccine over a period of approximately two months.

"We are very excited for this achievement, as for the first time, a onetime-use and injection-free skin patch can be pre-programmed to release vaccines at different times to provide a long-term and effective immune protection," Nguyen says. "The microneedle patch could facilitate the global effort for a complete vaccination process to eradicate dangerous infectious diseases and enable a quick distribution of vaccines. This could create a pan-community immune-protection at a global scale in the case of a pandemic such as the COVID-19," Nguyen says.

In this regard, Nguyen and his collaborator, Associate Professor Steve Szczepanek in the Department of Pathobiology and Veterinary Science in the College of Agriculture, Health, and Natural Resources, have also received a $432,990 contract from the Biomedical Advanced Research and Development Authority (BARDA) of the U.S. Department of Health and Human Services (HHS) to develop this technology.

Looking into the future, more research is needed in order to bring the microneedle patch into clinical use. While the researchers have shown the ability to use the patch for the pneumococcal vaccines, different vaccines would require different strategies for stabilization so they can be functional over a long period of implantation inside the skin.

The researchers are also working on the optimization and automation of the fabrication process, which can reduce the cost of the microneedle skin patch for clinical use. Future work on larger animal models that closely mimic human immune systems is also needed to verify the safety and efficacy of the microneedle platforms.

Credit: University of Connecticut

Flow physics could help forecasters predict extreme events

Image: Brian Elbing (left) holds a microphone with storm chaser Val Castor (right) in front of his storm chasing truck, in which the researchers mounted an infrasound sensor for monitoring tornadoes. (Credit: Brian Elbing)

VIRTUAL MEETING (CST), November 22, 2020 -- About 1,000 tornadoes strike the United States each year, causing billions of dollars in damage and killing about 60 people on average. Tracking data show that they're becoming increasingly common in the southeast, and less frequent in "Tornado Alley," which stretches across the Great Plains. Scientists lack a clear understanding of how tornadoes form, but a more urgent challenge is to develop more accurate prediction and warning systems. It requires a fine balance: Without warnings, people can't shelter, but if they experience too many false alarms, they'll become inured.

One way to improve tornado prediction tools might be to listen better, according to mechanical engineer Brian Elbing at Oklahoma State University in Stillwater, in the heart of Tornado Alley. He doesn't mean any sounds audible to human ears, though. As long ago as the 1960s, researchers reported evidence that tornadoes emit signature sounds at frequencies that fall outside the range of human hearing. People can hear down to about 20 Hertz--which sounds like a low rumble--but a tornado's song likely falls somewhere between 1 and 10 Hertz.

Brandon White, a graduate student in Elbing's lab, discussed their recent analyses of the infrasound signature of tornadoes at the 73rd Annual Meeting of the American Physical Society's Division of Fluid Dynamics.

Elbing said these infrasound signatures had seemed like a promising avenue of research, at least until radar emerged as a frontrunner technology for warning systems. Acoustic-based approaches took a back seat for decades. "Now we've made a lot of advances with radar systems and monitoring, but there are still limitations. Radar requires line of sight measurements." But line of sight can be tricky in hilly places like the Southeast, where the majority of tornado deaths occur.

Maybe it's time to revisit those acoustic approaches, said Elbing. In 2017, his research group recorded infrasound bursts from a supercell that produced a small tornado near Perkins, Oklahoma. When they analyzed the data, they found that the vibrations began before the tornado formed.

Researchers still know little about the fluid dynamics of tornadoes. "To date there have been eight trusted measurements of pressure inside a tornado, and no classical theory predicts them," said Elbing. He doesn't know how the sound is produced, either, but knowing the cause isn't required for an alarm system. The idea of an acoustics-based system is straightforward.

"If I dropped a glass behind you and it shattered, you don't need to turn around to know what happened," said Elbing. "That sound gives you a good sense of your immediate environment." Infrasound vibrations can travel over long distances quickly, and through different media. "We could detect tornadoes from 100 miles away."

Members of Elbing's research group also described a sensor array for detecting tornadoes via acoustics and presented findings from studies on how infrasound vibrations travel through the atmosphere. The work on infrasound tornado signatures was supported by a grant from NOAA.

Other sessions during the Division of Fluid Dynamics meeting similarly addressed ways to study and predict extreme events. During a session on nonlinear dynamics, MIT engineer Qiqi Wang revisited the butterfly effect, a well-known phenomenon in fluid dynamics that asks whether a butterfly flapping its wings in Brazil could trigger a tornado in Texas.

What's unclear is whether the butterfly wings can lead to changes in the longtime statistics of the climate. By investigating the question computationally in small chaotic systems, he found that small perturbations can, indeed, effect long-term changes, a finding that suggests even small efforts can lead to lasting changes in the climate of a system.

During the same session, mechanical engineer Antoine Blanchard, a postdoctoral researcher at MIT, introduced a smart sampling algorithm designed to help quantify and predict extreme events--like extreme storms or cyclones, for example. Extreme events occur with low probability, he said, and therefore require large amounts of data, which can be expensive to generate, computationally or experimentally. Blanchard, whose background is in fluid dynamics, wanted to find a way to identify outliers more economically. "We're trying to identify those dangerous states using as few simulations as possible."

The algorithm he designed is a kind of black box: Any dynamical state can be fed as an input, and the algorithm will return a measure of the dangerousness of that state.

"We're trying to find the doors to danger. If you open that particular door, will the system remain quiescent, or will it go crazy?" asked Blanchard. "What are the states and conditions--like weather conditions, for example--that if you were to evolve them over time could cause a cyclone or storm?"

Blanchard said he's still refining the algorithm but hopes to start applying it to real data and large-scale experiments soon. He also said it may have implications beyond the weather, in any system that produces extreme events. "It's a very general algorithm."
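To make the idea concrete, here is one generic way such a "smart sampling" loop can be organized: fit a cheap surrogate to the handful of expensive simulations run so far, then spend the next simulation on the candidate state whose danger is both predicted to be high and highly uncertain. This is a minimal sketch under assumptions, not Blanchard's published algorithm; the toy "dangerousness" function and all parameters are invented for illustration.

```python
# Generic rare-event "smart sampling" sketch: a surrogate model chooses which
# expensive simulation to run next. Illustrative only, not the meeting's algorithm.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(state):
    # Stand-in for a costly solver that evolves an initial state forward in time
    # and returns a scalar "dangerousness" score (peak intensity reached, say).
    return float(np.exp(-50.0 * np.sum((state - 0.7) ** 2)))

rng = np.random.default_rng(0)
candidates = rng.uniform(0.0, 1.0, size=(2000, 2))     # candidate initial states
X = rng.uniform(0.0, 1.0, size=(5, 2))                 # small initial design
y = np.array([expensive_simulation(x) for x in X])

for _ in range(20):                                     # tight simulation budget
    surrogate = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mean, std = surrogate.predict(candidates, return_std=True)
    pick = np.argmax(mean + 2.0 * std)                  # likely-dangerous AND uncertain
    X = np.vstack([X, candidates[pick]])
    y = np.append(y, expensive_simulation(candidates[pick]))

print("Most dangerous state found:", X[np.argmax(y)], "score:", y.max())
```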

Credit: American Physical Society

Snorkeling gear, animal noses inspire better personal protective equipment

VIRTUAL MEETING (CST), November 22, 2020 -- In March, Stanford University bioengineer Manu Prakash flew from France to his home in California and spent two weeks in personal quarantine. After being holed up in the room where he stores his snorkeling and scuba equipment, Prakash emerged with an idea for addressing two of the pandemic's most pressing challenges.

First, he saw that the global supply chain for disposable N95 masks had broken down, and many hospitals lacked adequate personal protective equipment, or PPE. Second, "the masks that are out there, that we put in the hands of our frontline workers, are not that good," said Prakash. They're often ill-fitting and uncomfortable, and if they don't fit, they don't protect.

Prakash's idea addressed both problems. He wanted to repurpose full-face snorkel masks, outfit them with 3D-printed filter-holders, and use them as a way to meet the growing demand for PPE. The idea became a tweet, which spawned an international collaboration, which led to design and testing in Prakash's lab and elsewhere, which resulted in tens of thousands of snorkel masks being shipped and used around the world. The device functions as a combination of mask and face shield.

Mechanical engineer Laurel Kroo, who works in Prakash's lab, described the design, testing, and distribution of "pneumask" at the 73rd Annual Meeting of the American Physical Society's Division of Fluid Dynamics. The Pneumask Consortium includes universities and companies from all over the world. Researchers in the coalition have published protocols for how to decontaminate the device, making it suitable for reuse. Clinical tests suggest that it can be worn comfortably for the entire duration of an eight-hour shift.

"From a fluid dynamics perspective, a mask is a hydrodynamic device," said Prakash. "A lot is happening when you breathe in and breathe out. You have to have the right kind of filters. You have to think about rebreathing, and comfort."

Prakash's lab has pivoted to focus on many COVID-19 related projects. They helped launch the 1000x1000 project, which repurposes cotton candy machines to produce protective, N95-grade mask material. To make cotton candy, the machine melts and spins out liquid sugar in fine threads; to make filter material, the repurposed machines spin out nanofibers that can trap minuscule particles. And together with partners at other universities and companies, the group helped develop the "Pufferfish," an open-source, low-cost ICU ventilator. Hongquan Li, from Prakash's lab, described that device during the same session on November 24.

Sunghwan Jung at Cornell University, who studies animals through the lens of fluid dynamics, has been working with researchers including Saikat Basu at South Dakota State University, in Brookings, and Leonardo Chamorro from the University of Illinois Urbana-Champaign on masks that take their shape from the nasal cavities of animals. The work was funded by a grant from the National Science Foundation.

Animals like dogs, opossums, and pigs are renowned for their super-sensitive sniffers, said Jung. "They have a very complicated nasal structure, and we tried to mimic that structure in our filters."

The human nose is fairly straightforward and vacuous, said Jung. But dogs and pigs are different. They have twisty, tortuous nasal cavities, and that's partly why they have such strong senses of smell. "Fluid mechanics tells us that if you have such a tortuous air pathway, you have more chances to capture more particles," said Jung.

The researchers capitalized on that idea to design a mask filter that can be 3D-printed to have a similarly tortuous structure. Lab tests showed that it can block micron-sized particles and has a low pressure drop--which means people wouldn't have to breathe hard while wearing it. The mask hasn't been approved or used in hospitals, said Jung.

Credit: American Physical Society

Making sense of a universe of corn genetics

Image: Seed banks across the globe store and preserve the genetic diversity of millions of varieties of crops, including corn. Iowa State University researchers are developing ways to predict the traits of corn varieties based on their genomes. (Credit: Jianming Yu)

AMES, Iowa - Seed banks across the globe store and preserve the genetic diversity of millions of varieties of crops. This massive collection of genetic material ensures crop breeders access to a wealth of genetics with which to breed crops that yield better or resist stress and disease.

But, with a world of corn genetics at their disposal, how do plant breeders know which varieties are worth studying and which ones aren't? For most of history, that required growing the varieties and studying their performance in the real world. But innovative data analytics and genomics could help plant breeders predict the performance of new varieties without having to go to the effort of growing them.

Jianming Yu, a professor of agronomy at Iowa State University and the Pioneer Distinguished Chair in Maize Breeding, has devoted much of his research to "turbo charging" the seemingly endless amount of genetic stocks contained in the world's seed banks. Yu and his colleagues have published an article in the Plant Biotechnology Journal that details their latest efforts to predict traits in corn based on genomics and data analytics.

Plant breeders searching for varieties to test might feel lost in a sea of genomic material. Yu said applying advanced data analytics to all those genomes can help breeders narrow down the number of varieties they're interested in much faster and more efficiently.

"We're always searching for the best genetic combinations, and we search the various combinations to see what varieties we want to test," said Xiaoqing Yu (no relation), a former postdoctoral research associate in Yu's lab and the first author of the study. "Having these predictions can guide our searching process."

The study focused on predicting eight corn traits based on the shoot apical meristem (SAM), a microscopic stem cell niche that generates all the above-ground organs of the plant. The researchers used their analytical approach to predict traits in 2,687 diverse maize inbred varieties, based on a model they developed from 369 inbred varieties that had been grown and had their shoot apical meristems imaged and measured under the microscope.

The researchers then validated their predictions with data from 488 inbreds, finding that prediction accuracy ranged from 37% to 57% across the eight traits they studied.
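
The release does not spell out the statistical model, but genomic prediction of this kind is commonly illustrated with a whole-genome regression such as ridge regression: train on the lines that were actually grown and measured, then score every other genotype. The sketch below uses simulated marker data and scikit-learn as a hedged stand-in for the authors' actual pipeline; the sample sizes simply echo the numbers in the study.

```python
# Minimal genomic-prediction sketch: train on measured inbreds, predict the rest.
# Simulated genotypes and ridge regression stand in for the study's real markers and model.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_train, n_validate, n_predict, n_markers = 369, 488, 2687, 2000  # sizes echo the study

# Simulated 0/1/2 allele counts and a polygenic trait (e.g., a SAM measurement).
X = rng.integers(0, 3, size=(n_train + n_validate + n_predict, n_markers)).astype(float)
marker_effects = rng.normal(0.0, 0.05, n_markers)
y = X @ marker_effects + rng.normal(0.0, 1.0, X.shape[0])  # phenotype = genetics + noise

X_train, y_train = X[:n_train], y[:n_train]
X_val, y_val = X[n_train:n_train + n_validate], y[n_train:n_train + n_validate]
X_new = X[n_train + n_validate:]  # the unmeasured seed-bank lines

model = Ridge(alpha=100.0).fit(X_train, y_train)  # shrinkage copes with markers >> lines

# Prediction accuracy is often reported as the correlation between predicted
# and observed trait values in an independent validation set.
accuracy = np.corrcoef(model.predict(X_val), y_val)[0, 1]
print(f"validation accuracy (correlation): {accuracy:.2f}")

predicted_traits = model.predict(X_new)  # ranks candidate lines for field testing
```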

"We wanted to connect the research in foundational biological mechanisms of cell growth and differentiation with agronomic improvement of corn," said Mike Scanlon, a professor of developmental biology at Cornell University and the lead investigator of the multi-institutional team behind the study. "SAM morphometric measurements in corn seedlings allow a quick completion of the study cycle. It not only enables that connection, but also extends the practice of genomic prediction into the microphenotypic space."

Jianming Yu said plant breeders can bump up the accuracy of those genomic predictions by increasing the number of plants measured per inbred and by using improved prediction algorithms. More importantly, plant breeders can fine-tune their selection of which inbreds to study closely by leveraging "U values," a statistical concept that accounts for the reliability of estimates. Yu said the study shows that a selection process accounting for both the prediction and its statistical reliability can help plant breeders zero in on desirable crop genetics faster.

For instance, analytical models might predict a particular inbred to have only modest potential for a given trait, but the U value, or the upper bound for reliability, might indicate a high degree of uncertainty in that prediction. So plant breeders might elect to test inbreds that don't score as well in the predictive model simply because they are genetically unique, being less related to the varieties used to build the prediction models.
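
The release describes the "U value" only as an upper bound tied to prediction reliability; one hedged way to picture the selection logic is an upper-confidence-bound rule, in which a candidate's score is its prediction plus an allowance that grows as reliability shrinks. The numbers below are invented for illustration, and the formula is an assumption, not the study's exact statistic.

```python
# Illustrative upper-bound selection: genetically unique lines with unreliable
# (low-reliability) predictions can still outrank confidently mediocre ones.
# The U value used here is a stand-in, not the statistic from the study.
import numpy as np

rng = np.random.default_rng(1)
predicted = rng.normal(0.0, 1.0, size=10)      # hypothetical predicted trait values
reliability = rng.uniform(0.2, 0.9, size=10)   # hypothetical reliability, 0 (none) to 1 (high)

u_value = predicted + 2.0 * np.sqrt(1.0 - reliability)  # bound widens as reliability drops

print("selection order by U value:   ", np.argsort(u_value)[::-1])
print("selection order by prediction:", np.argsort(predicted)[::-1])
```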

"We found that there can be a balance between selecting for optimizing short-term gain and mining diversity," Yu said. "It's a tricky balance for plant breeders. Those considerations sometimes go in different directions. Genetic improvement can be viewed as space exploration, either of the vast amount of existing genetic materials in seed banks or of the innumerable breeding progenies constantly being generated. We want to develop better tools to guide those decisions in the process."

Credit: 
Iowa State University

Big cats and small dogs: solving the mystery of canine distemper in wild tigers

image: Amur tigers share their taiga forest habitat with wild carnivores that act as a reservoir of canine distemper virus.

Image: 
Wildlife Conservation Society

ITHACA, N.Y. - If you think getting your cat to the veterinarian is tricky, new Cornell Wildlife Health Center research has revealed that vaccination of endangered Amur (Siberian) tigers is the only practical strategy to protect them from a dangerous disease in their natural habitat in the Russian Far East.

Canine distemper virus (CDV) causes a serious disease in domestic dogs, and also infects other carnivores, including threatened species like the Amur tiger, which numbers fewer than 550 individuals in the Russian Far East and neighbouring China. It is often assumed that domestic dogs are the primary source of CDV, but in a new study published in the Proceedings of the National Academy of Sciences, the Cornell Wildlife Health Center's Dr. Martin Gilbert and colleagues found that other local wildlife was the primary source of CDV transmission to tigers instead.

"Understanding how tigers are catching distemper is absolutely crucial to helping us design effective measures to minimize the conservation impact of the virus," said Gilbert. "Vaccinating tigers is hard to do, but our research shows that immunizing just two tigers within a small population each year can reduce the risk that CDV will cause extinction by almost seventy-five percent. At least in the Russian Far East, vaccinating local domestic dogs would not be an effective strategy to protect tigers."

The research, led by Cornell University, the Wildlife Conservation Society and the University of Glasgow, relied on several lines of evidence to build a picture of CDV epidemiology in the tigers' habitat. Using samples from domestic dogs, tigers, and other wild carnivores, they compared viral genetic sequence data and used antibodies to assess patterns of exposure in each population.

"The taiga forest where the tigers live supports a rich diversity of 17 wild carnivore species" said study co-author Dr. Nadezhda Sulikhan with the Federal Scientific Center of the East Asia Terrestrial Biodiversity of Russian Academy of Sciences. "Our findings suggest that more abundant small-bodied species like martens, badgers and raccoon dogs are the most important contributors to the CDV reservoir."

Controlling CDV in these abundant wild carnivore populations is not possible, as there are no CDV oral vaccines that could be distributed to these populations through baited food.

That left only one viable possibility -- using an injectable vaccine on the tigers themselves. To determine whether currently available CDV vaccines could protect wild tigers, the researchers showed in the laboratory that serum from tigers vaccinated in captivity was able to neutralize the strain of CDV that they had detected in Russia. They then developed a computer model to show that even a low rate of vaccination (two tigers per year) could reduce the tigers' risk of extinction significantly, at a cost of only US $30,000 per year or less if vaccines are given opportunistically when tigers are captured for routine radio collaring studies.
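
The published population model is not reproduced in the release, but the shape of the argument can be sketched with a toy stochastic simulation: a small tiger population subject to occasional distemper outbreaks, run with and without two vaccinations per year. Every rate and population size below is a made-up illustration, not a parameter from the study.

```python
# Toy stochastic model of a small tiger population with occasional CDV outbreaks.
# All parameters are hypothetical; this only illustrates the logic, not the published model.
import random

def extinction_risk(vaccinated_per_year=0, years=50, trials=5000, seed=0):
    random.seed(seed)
    extinct = 0
    for _ in range(trials):
        population, immune = 25, 0
        for _ in range(years):
            population += round(population * random.gauss(0.03, 0.05))  # demographic noise
            immune = min(population, immune + vaccinated_per_year)      # protected animals
            if random.random() < 0.10:                                  # an outbreak this year
                susceptible = population - immune
                population -= sum(random.random() < 0.4 for _ in range(susceptible))
            if population <= 2:
                extinct += 1
                break
    return extinct / trials

print("extinction risk, no vaccination:   ", extinction_risk(0))
print("extinction risk, two tigers a year:", extinction_risk(2))
```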

Gilbert and his colleagues contend that vaccination can be a valuable conservation strategy. As wildlife populations become more fragmented through the effects of habitat destruction, poaching and climate change, they become increasingly vulnerable to local extinctions caused by infectious diseases like distemper.

According to study contributor Dr. Sarah Cleaveland of the University of Glasgow, "This work shows that CDV in the Amur tiger is a solvable problem -- a rare piece of good news for the tiger conservation community."

Credit: 
Cornell University

Supersized wind turbines generate clean energy--and surprising physics

image: A team from Oklahoma State University attached sensors to robotic aircraft to take more cohesive measurements of building wakes, or the disturbed airflow around buildings.

Image: 
Jamey Jacob

VIRTUAL MEETING (CST), November 22, 2020 -- Twenty years ago, wind energy was mostly a niche industry that contributed less than 1% to the total electricity demand in the United States. Wind has since emerged as a serious contender in the race to develop clean, renewable energy sources that can sustain the grid and meet the ever-rising global energy demand. Last year, wind energy supplied 7% of domestic electricity demand, and across the country--both on and offshore--energy companies have been installing giant turbines that reach higher and wider than ever before.

"Wind energy is going to be a really important component of power production," said engineer Jonathan Naughton at the University of Wyoming, in Laramie. He acknowledged that skeptics doubt the viability of renewable energy sources like wind and solar because they're weather dependent and variable in nature, and therefore hard to control and predict. "That's true," he said, "but there are ways to overcome that."

Naughton and Charles Meneveau at Johns Hopkins University in Baltimore, Maryland, organized a mini-symposium at the 73rd Annual Meeting of the American Physical Society's Division of Fluid Dynamics, where researchers described the promise and fluid dynamics challenges of wind energy.

In order for wind energy to be useful--and accepted--researchers need to design systems that are both efficient and inexpensive, Naughton said. That means gaining a better understanding of the physical phenomena that govern wind turbines, at all scales. Three years ago, the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) brought together 70 experts from around the world to discuss the state of the science. In 2019, the group published grand scientific challenges that need to be addressed for wind energy to contribute up to half of the demand for power.

One of those challenges was to better understand the physics of the part of the atmosphere where the turbines operate. "Wind is really an atmospheric fluid mechanics problem," said Naughton. "But how the wind behaves at the levels where the turbines operate is still an area where we need more information."

Today's turbines have blades that can stretch 50 to 70 meters, said Paul Veers, Chief Engineer at NREL's National Wind Technology Center, who provided an overview of the challenges during the symposium. Their towers rise 100 meters or more above the surrounding landscape. "Offshore, they're getting even bigger," said Veers.

The advantage of building bigger turbines is twofold: a wind power plant needs fewer machines to build and maintain, and taller towers reach the more powerful winds high above the ground. But giant power plants function at a scale that hasn't been well studied, said Veers.
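
The push toward bigger machines follows directly from how wind power scales: extractable power grows with the swept rotor area (the square of the blade length) and with the cube of the wind speed, and winds are usually stronger higher above the ground. A back-of-the-envelope sketch, using representative rather than manufacturer-specific numbers:

```python
# Why bigger, taller turbines pay off: P = 0.5 * rho * A * v^3 * Cp.
# Blade lengths, wind speeds, and the power coefficient are representative assumptions.
import math

RHO_AIR = 1.225  # air density at sea level, kg/m^3
CP = 0.45        # assumed power coefficient (the Betz limit is about 0.59)

def turbine_power_mw(blade_length_m, wind_speed_ms):
    swept_area = math.pi * blade_length_m ** 2  # rotor disc area
    return 0.5 * RHO_AIR * swept_area * wind_speed_ms ** 3 * CP / 1e6

# A 50 m rotor at 8 m/s versus a 70 m rotor in the slightly faster winds found higher up.
print(f"50 m blades at 8 m/s: {turbine_power_mw(50, 8.0):.1f} MW")  # ~1.1 MW
print(f"70 m blades at 9 m/s: {turbine_power_mw(70, 9.0):.1f} MW")  # ~3.1 MW
```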

"We have a really good ability to understand and work with the atmosphere at really large scales," said Veers. "And scientists like Jonathan and Charles have done amazing jobs with fluid dynamics to understand small scales. But between these two, there's an area that has not been studied all that much."

Another challenge will be to study the structural and system dynamics of these giant rotating machines. The winds interact with the blades, which bend and twist. The spinning blades give rise to high Reynolds numbers, "and those are areas where we don't have a lot of information," said Naughton.
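
For a sense of the scale involved, the Reynolds number at a blade section depends on the local relative airspeed and the chord length; the values below are rough assumptions for a large modern blade, not figures quoted at the symposium.

```python
# Rough blade-section Reynolds number: Re = rho * U * c / mu.
# Tip speed and chord length are assumed values for a large utility-scale blade.
rho, mu = 1.225, 1.8e-5  # air density (kg/m^3) and dynamic viscosity (Pa*s)
tip_speed = 80.0         # m/s, a typical blade-tip speed (assumption)
chord = 3.0              # m, a representative outer-blade chord (assumption)

reynolds = rho * tip_speed * chord / mu
print(f"Re ~ {reynolds:.1e}")  # on the order of 10^7 -- far above most lab-scale experiments
```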

Powerful computational approaches can help reveal the physics, said Veers. "We're really pushing the computational methods as far as possible," he said. "It's taking us to the fastest and biggest computers that exist right now."

A third challenge, Naughton noted, is to study the behavior of groups of turbines. Every turbine produces a wake in the atmosphere, and as that wake propagates downstream it interacts with the wakes from other turbines. Wakes may combine, and they may interfere with other turbines or with anything else in the area. "If there's farmland downwind, we don't know how the change in the atmospheric flow will affect it," said Naughton.
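
A standard back-of-the-envelope description of a single wake is the Jensen (top-hat) model, in which the velocity deficit behind a turbine decays as the wake expands downstream. The sketch below uses that textbook model with assumed parameters; it is not one of the simulations discussed at the meeting.

```python
# Jensen (top-hat) wake model: wind speed recovery downstream of a single turbine.
# Rotor size, thrust coefficient, and wake-decay constant are assumed textbook values.
import math

def wake_velocity(u_inf, x, rotor_diameter=120.0, ct=0.8, k=0.05):
    """Wind speed (m/s) a distance x (m) directly downstream of a turbine."""
    r0 = rotor_diameter / 2.0
    deficit = (1.0 - math.sqrt(1.0 - ct)) / (1.0 + k * x / r0) ** 2
    return u_inf * (1.0 - deficit)

for spacing in (3, 5, 7, 10):  # downstream distance in rotor diameters
    u = wake_velocity(u_inf=8.0, x=spacing * 120.0)
    print(f"{spacing:2d} D downstream: {u:.1f} m/s")  # slower air hitting the next turbine
```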

He called wind energy the "ultimate scale problem." Because it connects small-scale problems like the interactions of turbines with the air to giant-scale problems like atmospheric modeling, wind energy will require expertise and input from a variety of fields to address the challenges. "Wind is among the cheapest forms of energy," said Naughton. "But as the technology matures, the questions get harder."

Credit: 
American Physical Society

Eye exam could lead to early Parkinson's disease diagnosis

image: An example of a fundus eye image taken from the UK Biobank.

Image: 
Radiological Society of North America

OAK BROOK, Ill. - A simple eye exam combined with powerful artificial intelligence (AI) machine learning technology could provide early detection of Parkinson's disease, according to research being presented at the annual meeting of the Radiological Society of North America (RSNA).

Parkinson's disease is a progressive disorder of the central nervous system that affects millions of people worldwide. Diagnosis is typically based on symptoms like tremors, muscle stiffness and impaired balance--an approach that has significant limitations.

"The issue with that method is that patients usually develop symptoms only after prolonged progression with significant injury to dopamine brain neurons," said study lead author Maximillian Diaz, a biomedical engineering Ph.D. student at the University of Florida in Gainesville, Florida. "This means that we are diagnosing patients late in the disease process."

Disease progression is characterized by nerve cell decay that thins the walls of the retina, the layer of tissue that lines the back of the eyeball. The disease also affects the microscopic blood vessels, or microvasculature, of the retina. These characteristics present an opportunity to leverage the power of AI to examine images of the eyes for signs of Parkinson's disease.

For the new study, Diaz collaborated with graduate student Jianqiao Tian and University of Florida neurologist Adolfo Ramirez-Zamora, M.D., under the direction of Ruogu Fang, Ph.D., director of the J. Crayton Pruitt Department of Biomedical Engineering's Smart Medical Informatics Learning and Evaluation Lab (SMILE).

The researchers deployed a type of AI called support vector machine (SVM) learning that has been around since 1989. Using pictures of the back of the eye from both patients with Parkinson's disease and control participants, they trained the SVM to detect signs on the images suggestive of disease.

The results indicated that the machine learning networks can classify Parkinson's disease based on retina vasculature, with the key features being smaller blood vessels. The proposed methods further support the idea that changes in brain physiology can be observed in the eye.
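
The team's exact feature pipeline is not described in the release, but the classification step can be pictured with a short scikit-learn example: vessel-derived measurements feeding a support vector machine. Here the features are randomly simulated, with the "smaller blood vessels" effect built into the patient group by assumption; this is an illustrative stand-in, not the published model.

```python
# Illustrative SVM classifier on simulated retinal-vasculature features.
# The simulated group difference and all numbers are assumptions; the real study
# derives its features from fundus photographs of patients and controls.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_per_group, n_features = 100, 6  # e.g., mean vessel width, branching density, ...

controls = rng.normal(loc=1.00, scale=0.15, size=(n_per_group, n_features))
patients = rng.normal(loc=0.90, scale=0.15, size=(n_per_group, n_features))  # smaller vessels
X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = control, 1 = Parkinson's

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```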

"The single most important finding of this study was that a brain disease was diagnosed with a basic picture of the eye," Diaz said. "This is very different from traditional approaches where to find a problem with the brain you look at different brain images."

Diaz noted that those traditional imaging approaches with MRI, CT and nuclear medicine techniques can be very costly. In contrast, the new approach uses basic photography with equipment commonly available in eye clinics to get an image. The images can even be captured by a smartphone with a special lens.

"It's just a simple picture of the eye, you can have it done in less than a minute, and the cost of the equipment is much less than a CT or MRI machine," Diaz said. "If we can make this a yearly screening, then the hope is that we can catch more cases sooner, which can help us better understand the disease and find a cure or a way to slow the progression."

The approach may also have applications in identifying other diseases that affect the structure of the brain, such as Alzheimer's disease and multiple sclerosis, Diaz said.

Credit: 
Radiological Society of North America