Earth

Compounding impact of severe weather events fuels marine heatwave in the coastal ocean

Several coastal communities are picking up the pieces after being ravaged by hurricanes in the past month. Hurricane Laura, a category 4 storm, and Hurricane Sally, a category 2, meandered across the Gulf of Mexico, constantly shifting forecasts and keeping meteorologists on their toes. In the hours before these storms struck land, they seemed to explode in intensity.

Researchers at the Dauphin Island Sea Lab with support from the Jet Propulsion Laboratory can offer insight into why these storms intensified quickly as they moved across the continental shelf.

"Surprisingly, both Hurricane Laura and Hurricane Sally appeared to have similar setups to Hurricane Michael, with both storm events being preceded by smaller storms (Hurricanes Hanna and Marco, respectively)," Dr. Brian Dzwonkowski explained. "This pre-storm setup of the oceanic environment likely contributed to the intensification prior to landfall. Importantly, this pre-landfall intensification was not well predicted by hurricane models or forecasts, which as you can imagine is critical information for evacuation and disaster preparation."

Dzwonkowski and his team's publication, "Compounding impact of severe weather events fuels marine heatwave in the coastal ocean", outlines how one storm could impact the intensity of another storm by restructuring the thermal properties of the water column. Nature Communications published the findings in its September issue.

The research focuses on Hurricane Michael, which devastated Mexico Beach, Florida, and the surrounding communities on October 10, 2018. The category 5 storm intensified in the hours before making landfall.

Dzwonkowski, a physical oceanographer with the Dauphin Island Sea Lab and Associate Professor at the University of South Alabama in the Department of Marine Sciences, and his team tracked down the key events and processes that pushed the coastal waters in the Gulf of Mexico to an extremely warm state (i.e. a marine heatwave), likely contributing to the intensification of a storm so close to shore.

Unlike the deep ocean, the continental shelf has a shallow bottom that limits how much cold water can be mixed up to the surface, a process that would otherwise cool the sea surface and weaken approaching storms. Dzwonkowski and his team focused on how a strong mixing event pushes surface heat downward and depletes the bottom water's cold reserve. If such mixing is followed by a period of rewarming, such as an atmospheric heatwave, the shelf's oceanic environment can be primed to fuel extreme storm events like Hurricane Michael.

"This work shows that understanding the preceding weather conditions in a region where a storm is going to make landfall can improve interpretation of hurricane model forecasts and what the storm is likely to do prior to landfall," says Dr. Dzwonkowski.

In mapping out heat flux and mixing, the team focused on the Mississippi Bight in late summer and early fall, using data gathered by a mooring site off Dauphin Island's coastline. The mooring collects data throughout the water column, allowing the full heat content of the shelf to be determined. The period prior to Hurricane Michael's landfall turned out to have the warmest ocean conditions in the 13-year record.
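The shelf heat content that such a full-water-column mooring makes possible is, in essence, a depth integral of temperature. As a rough illustration (all numbers below are made up for the example, not the mooring's data, and the reference temperature is an arbitrary assumption), excess heat stored in a shallow water column can be estimated like this:

```python
import numpy as np

# Hypothetical temperature profile on a shallow shelf (illustrative values,
# not the Dauphin Island mooring's actual measurements).
depths = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # m below surface
temps = np.array([30.0, 29.5, 29.0, 28.0, 27.0])  # deg C

RHO = 1025.0   # typical seawater density, kg/m^3
CP = 3985.0    # typical seawater specific heat, J/(kg K)
T_REF = 26.0   # assumed reference temperature, deg C

# Depth-integrated excess heat content, Q = rho * cp * integral of (T - T_ref) dz,
# evaluated with the trapezoidal rule.
excess = temps - T_REF
dz = np.diff(depths)
heat_content = RHO * CP * np.sum(0.5 * (excess[:-1] + excess[1:]) * dz)
print(f"heat content above {T_REF} C: {heat_content:.2e} J/m^2")
```

A strong mixing event redistributes this heat downward without removing it, which is why a subsequent surface rewarming can leave the whole column anomalously warm.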

"Turns out hurricanes and atmospheric heatwaves will be getting stronger in a warming world, which would indicate the identified sequence of events that generate these extreme conditions may become more frequent," Dzwonkowski said. "The occurrence of extreme heat content events, like marine heatwaves, has significant implications for a broad range of scientific management interests beyond hurricane intensity."

Importantly, the mechanisms that generated this marine heatwave are expected to become more frequent and intense in the future due to climate change, increasing the likelihood of such extreme conditions.

For example, coral reefs and hypoxia-prone shelves are already stressed by long-term warming trends. These temperature-specific benthic communities and habitats are typically of significant societal and economic value. As such, the newly identified sequence of compounding processes is expected to impact a range of coastal interests and should be considered in management and disaster response decisions.

Credit: 
Dauphin Island Sea Lab

NASA catches Tropical Storm Dolphin swimming north   

image: On Sept. 22, 2020, NASA's Terra satellite provided a visible image of Tropical Storm Dolphin as it continued moving north through the Northwestern Pacific Ocean.

Image: 
NASA/NRL

NASA's Terra satellite obtained visible imagery of Tropical Storm Dolphin as it continued moving north through the Northwestern Pacific Ocean on a track toward east central Japan.

NASA Satellite View: Dolphin's Organization

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite captured a visible image of Tropical Storm Dolphin on Sept. 22 at 0345 UTC (Sept. 21 at 11:45 p.m. EDT). The visible image revealed strong storms around the low-level center of circulation with bands of thunderstorms in the northeastern quadrant. Microwave satellite imagery revealed an eye feature.

The Joint Typhoon Warning Center noted that another satellite image indicates that the upper-level circulation center is tilted about 30 nautical miles east of the low-level center due to increasing westerly vertical wind shear.

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Dolphin on Sept. 22

At 5 a.m. EDT (0900 UTC) on Sept. 22, Tropical Storm Dolphin was located near latitude 27.9 degrees north and longitude 135.5 degrees east, about 493 nautical miles south-southwest of Yokosuka, Japan. Dolphin was moving to the north and had maximum sustained winds of 60 knots (69 mph/111 kph).

Dolphin will move north, later turning to the northeast. It is expected to weaken gradually before becoming extra-tropical near Tokyo.

NASA Researches Earth from Space

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Warming temperatures are driving Arctic greening

image: When Arctic tundra greens, undergoing increased plant growth, it can impact wildlife species including reindeer and caribou.

Image: 
Logan Berner/Northern Arizona University

As Arctic summers warm, Earth's northern landscapes are changing. Using satellite images to track global tundra ecosystems over decades, a new study found the region has become greener, as warmer air and soil temperatures lead to increased plant growth.

"The Arctic tundra is one of the coldest biomes on Earth, and it's also one of the most rapidly warming," said Logan Berner, a global change ecologist with Northern Arizona University in Flagstaff, who led the recent research. "This Arctic greening we see is really a bellwether of global climatic change - it's a biome-scale response to rising air temperatures."

The study, published this week in Nature Communications, is the first to measure vegetation changes spanning the entire Arctic tundra, from Alaska and Canada to Siberia, using satellite data from Landsat, a joint mission of NASA and the U.S. Geological Survey (USGS). Other studies have used the satellite data to look at smaller regions, since Landsat data can be used to determine how much actively growing vegetation is on the ground. Greening can represent plants growing more, becoming denser, and/or shrubs encroaching on typical tundra grasses and moss.

When the tundra vegetation changes, it impacts not only the wildlife that depend on certain plants, but also the people who live in the region and depend on local ecosystems for food. While active plants will absorb more carbon from the atmosphere, the warming temperatures could also be thawing permafrost, thereby releasing greenhouse gases. The research is part of NASA's Arctic Boreal Vulnerability Experiment (ABoVE), which aims to better understand how ecosystems are responding in these warming environments and the broader social implications.

Berner and his colleagues used the Landsat data and additional calculations to estimate the peak greenness for a given year for each of 50,000 randomly selected sites across the tundra. Between 1985 and 2016, about 38% of the tundra sites across Alaska, Canada, and western Eurasia showed greening. Only 3% showed the opposite browning effect, which would mean fewer actively growing plants. To include eastern Eurasian sites, they compared data starting in 2000, when Landsat satellites began regularly collecting images of that region. With this global view, 22% of sites greened between 2000 and 2016, while 4% browned.
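The per-site trend classification described above can be sketched in a few lines. The data below are simulated stand-ins (random trends plus noise), not Landsat values, and the fixed slope cutoff is an illustrative simplification of the formal trend tests a study like this would use:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for per-site annual peak greenness (e.g. maximum NDVI)
# over 1985-2016. The real study derives these values from Landsat imagery.
years = np.arange(1985, 2017)
n_sites = 1000
true_trends = rng.normal(0.0005, 0.002, n_sites)        # NDVI change per year
noise = rng.normal(0.0, 0.02, (n_sites, years.size))
ndvi = 0.5 + true_trends[:, None] * (years - years[0]) + noise

# Least-squares slope of peak greenness vs. year at each site
# (np.polyfit accepts a 2-D y, one column per site).
slopes = np.polyfit(years, ndvi.T, 1)[0]

# Classify sites with an arbitrary illustrative cutoff; the study applied
# formal trend tests rather than a fixed threshold.
greening = np.mean(slopes > 0.001)
browning = np.mean(slopes < -0.001)
print(f"greening: {greening:.0%}, browning: {browning:.0%}")
```

Because the simulated trends skew slightly positive, the greening fraction comes out larger than the browning fraction, qualitatively mirroring the 38%-vs-3% pattern the study reports.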

"Whether it's since 1985 or 2000, we see this greening of the Arctic evident in the Landsat record," Berner said. "And we see this biome-scale greening at the same time and over the same period as we see really rapid increases in summer air temperatures."

The researchers compared these greening patterns with other factors, and found that it's also associated with higher soil temperatures and higher soil moisture. They confirmed these findings with plant growth measurements from field sites around the Arctic.

"Landsat is key for these kinds of measurements because it gathers data on a much finer scale than what was previously used," said Scott Goetz, a professor at Northern Arizona University who also worked on the study and leads the ABoVE Science Team. This allows the researchers to investigate what is driving the changes to the tundra. "There's a lot of microscale variability in the Arctic, so it's important to work at finer resolution while also having a long data record," Goetz said. "That's why Landsat is so valuable."

Credit: 
NASA/Goddard Space Flight Center

Parylene photonics enable future optical biointerfaces

image: A Parylene photonic waveguide surrounded by neurons.

Image: 
Carnegie Mellon University College of Engineering

Carnegie Mellon University's Maysam Chamanzar and his team have invented an optical platform that will likely become the new standard in optical biointerfaces. He's labeled this new field of optical technology "Parylene photonics," demonstrated in a recent paper in Microsystems & Nanoengineering, a Nature partner journal.

There is a growing and unfulfilled demand for optical systems for biomedical applications. Miniaturized and flexible optical tools are needed to enable reliable ambulatory and on-demand imaging and manipulation of biological events in the body. Integrated photonic technology has mainly evolved around developing devices for optical communications. The advent of silicon photonics was a turning point in bringing optical functionalities to the small form-factor of a chip.

Research in this field has boomed in the past couple of decades. However, silicon is a rigid material, poorly suited to interacting with soft tissue in biomedical applications. Its stiffness increases the risk of tissue damage and scarring for patients, especially as soft tissue undulates against the inflexible device during respiration and other processes.

Chamanzar, an Assistant Professor of Electrical and Computer Engineering (ECE) and Biomedical Engineering, saw the pressing need for an optical platform tailored to biointerfaces with both optical capability and flexibility. His solution, Parylene photonics, is the first biocompatible and fully flexible integrated photonic platform ever made.

To create this new photonic material class, Chamanzar's lab designed ultracompact optical waveguides by fabricating silicone (PDMS), an organic polymer with a low refractive index, around a core of Parylene C, a polymer with a much higher refractive index. The contrast in refractive index allows the waveguide to pipe light effectively, while the materials themselves remain extremely pliant. The result is a platform that is flexible, can operate over a broad spectrum of light, and is just 10 microns thick--about 1/10 the thickness of a human hair.
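The light-guiding ability of such a core/cladding pair follows directly from the index contrast. Using approximate literature values for the two polymers in the visible range (assumed here for illustration; the paper's exact figures may differ), the numerical aperture and critical angle for total internal reflection work out roughly as:

```python
import math

# Approximate literature refractive indices (assumptions, not figures
# taken from the paper):
n_core = 1.64   # Parylene C
n_clad = 1.41   # PDMS (silicone)

# Numerical aperture: NA = sqrt(n_core^2 - n_clad^2)
na = math.sqrt(n_core**2 - n_clad**2)

# Critical angle for total internal reflection at the core/cladding interface:
# theta_c = arcsin(n_clad / n_core)
theta_c = math.degrees(math.asin(n_clad / n_core))
print(f"NA ~ {na:.2f}, critical angle ~ {theta_c:.0f} deg")
```

A numerical aperture in this neighborhood (well above that of standard silica fiber) is consistent with the article's point that the high index contrast confines light tightly, which is also what permits sharp bends with low loss.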

"We were using Parylene C as a biocompatible insulation coating for electrical implantable devices, when I noticed that this polymer is optically transparent. I became curious about its optical properties and did some basic measurements," said Chamanzar. "I found that Parylene C has exceptional optical properties. This was the onset of thinking about Parylene photonics as a new research direction."

Chamanzar's design was created with neural stimulation in mind, allowing for targeted stimulation and monitoring of specific neurons within the brain. Crucial to this is the creation of 45-degree embedded micromirrors. While prior optical biointerfaces have stimulated a large swath of brain tissue beyond what could be measured, these micromirrors create a tight overlap between the volume being stimulated and the volume being recorded. The micromirrors also enable integration of external light sources with the Parylene waveguides.

ECE alumna Maya Lassiter (MS, '19), who was involved in the project, said, "Optical packaging is an interesting problem to solve because the best solutions need to be practical. We were able to package our Parylene photonic waveguides with discrete light sources using accessible packaging methods, to realize a compact device."

The applications for Parylene photonics range far beyond optical neural stimulation, and could one day replace current technologies in virtually every area of optical biointerfaces. These tiny flexible optical devices can be inserted into the tissue for short-term imaging or manipulation. They can also be used as permanent implantable devices for long-term monitoring and therapeutic interventions.

Additionally, Chamanzar and his team are considering possible uses in wearables. Parylene photonic devices placed on the skin could be used to conform to difficult areas of the body and measure pulse rate, oxygen saturation, blood flow, cancer biomarkers, and other biometrics. As further options for optical therapeutics are explored, such as laser treatment for cancer cells, the applications for a more versatile optical biointerface will only continue to grow.

"The high index contrast between Parylene C and PDMS enables a low bend loss," said ECE Ph.D. candidate Jay Reddy, who has been working on this project. "These devices retain 90% efficiency as they are tightly bent down to a radius of almost half a millimeter, conforming tightly to anatomical features such as the cochlea and nerve bundles."

Another unconventional possibility for Parylene photonics is in communication links, bringing Chamanzar's pursuit full circle. Current chip-to-chip interconnects usually use rather inflexible optical fibers, and anywhere flexibility is needed, signals must be transferred to the electrical domain, which significantly limits bandwidth. Flexible Parylene photonic cables offer a promising high-bandwidth alternative to both approaches and could enable advances in optical interconnect design.

"So far, we have demonstrated low-loss, fully flexible Parylene photonic waveguides with embedded micromirrors that enable input/output light coupling over a broad range of optical wavelengths," said Chamanzar. "In the future, other optical devices such as microresonators and interferometers can also be implemented in this platform to enable a whole gamut of new applications."

With Chamanzar's recent publication marking the debut of Parylene photonics, it's impossible to say just how far-reaching the effects of this technology could be. The implications of this work, however, are likely to mark a new chapter in the development of optical biointerfaces, similar to what silicon photonics enabled in optical communications and processing.

Credit: 
College of Engineering, Carnegie Mellon University

Muslims, atheists more likely to face religious discrimination in US

Muslims and atheists in the United States are more likely than those of Christian faiths to experience religious discrimination, according to new research led by the University of Washington.

In the study, which focused on public schools because they are government-run, community-facing institutions, the researchers tested responses to an individual’s expression of religious belief. In addition to finding greater bias against religious minorities, the researchers also saw that ardent expressions of faith, regardless of religious tradition, were more prone to discrimination.

“The U.S. is becoming a much more culturally diverse society than in the past, and the rate of change is happening very swiftly. So we wanted to ask: How are our public institutions keeping up? Can they provide equal accommodations and protection under the law?” said Steve Pfaff, a University of Washington professor of sociology and lead author of the study, which published Aug. 30 in Public Administration Review.

Religious bias may be a very serious problem, but it has been studied less than other types of discrimination, such as race- or gender-based discrimination, Pfaff added.

“Schools bear this enormous responsibility and perform this important service, and one thing that’s changing quickly, among the population, is religion. So how are schools handling all that change?” he said.

Pfaff points to national statistics that reflect the change: The percentage of Americans who identify as “unchurched” has increased from 16% to 23% in the past decade; the percentage of Americans who identify as Muslim, while small, is expected to double to 2% by 2050.

For this study, which was conducted in spring 2016, researchers sent an email to some 45,000 school principals in 33 states, including Washington. The email was presented as a note from a family new to the community. The randomized messages varied by belief system — Catholic, Protestant, Muslim or atheist, signaled by a faith-oriented quote in the email signature — and also varied the degree of religious expression in the body of the note. The basic version asked for a meeting to learn about the school; a second version sought to find a school that was the right fit for their beliefs; and a third inquired about accommodation of religious needs at school. A control email presented the sender as a family new to the community, interested in learning about the school, but included no religious expression and no faith-oriented email signature.
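The design amounts to a 4 x 3 grid of belief-by-expression conditions plus a control. The sketch below is a hypothetical reconstruction of how such conditions might be randomly assigned across recipients; the labels and assignment scheme are illustrative, not the authors' actual procedure:

```python
import random

random.seed(42)

# Illustrative condition labels (not the study's internal coding).
beliefs = ["Catholic", "Protestant", "Muslim", "atheist"]
expression = ["basic", "fit", "accommodation"]

# Full factorial design plus a no-signal control condition: 4 * 3 + 1 = 13.
conditions = [(b, e) for b in beliefs for e in expression] + [("control", "basic")]

# Randomly assign one condition to each of ~45,000 hypothetical recipients.
principals = [f"principal_{i}" for i in range(45000)]
assignment = {p: random.choice(conditions) for p in principals}

n_control = sum(1 for c in assignment.values() if c[0] == "control")
print(f"{len(conditions)} conditions; ~{n_control} control emails")
```

Random assignment is what lets the researchers attribute differences in response rates to the signaled belief and expression level rather than to differences among the schools contacted.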

The research team chose the audit approach, with its contrived email, over a more standard survey on the assumption that asking people questions about religious bias may not yield genuinely honest answers, Pfaff said. Much research in the social sciences suggests that subjects are unlikely to volunteer what they think will be perceived as discriminatory opinions.

While none of the principals’ responses were explicitly negative, Pfaff said, it was the lack of response that indicated a pattern. Somewhat less than half of emails got a response. But those that signaled affiliation with Islam or atheism, indicated by a famous quote attributed to either the Prophet Muhammad or Richard Dawkins in the email signature, were about 5 percentage points less likely to receive a response than the control emails. Email response rates to notes accompanied by a quote from Pope Benedict XVI or the Rev. Billy Graham mirrored those of the control emails.

Principals displayed across-the-board bias in response to the more overt emails that suggested that their schools might have to accommodate religious requests from parents. In the presence of such treatment language in the email text, the probability of a response declined by as much as 13 percentage points for atheists, nearly 9 percentage points for Muslims, 7.8 percentage points for Catholics and 5.5 percentage points for Protestants. This finding strongly suggests that some bias may result from the perception that religious accommodations of various kinds might be difficult or costly, burdens that principals want to avoid, Pfaff said.

The study’s findings were evident nationwide; emails were sent to principals in 33 states, and even when testing the potential effect of local demographics — whether an urban, diverse community, Democratic or Republican-leaning, or a more homogenous, rural one — the observed discrimination against religious minorities was consistent. That suggests that religious discrimination can occur anywhere, that it’s not reflective of geography or political ideology, Pfaff said.

“Religious bias in response to a routine inquiry from a public school official, amounting to a 5 to 13 percent lower chance of response, reflects substantial evidence of bias,” Pfaff said.

The findings may indicate that the perceived cost in dealing with a person or situation factors in to whether they receive a reply. In that case, the perceived cost may have as much to do with community attitudes and norms about religion in school, or about specific religions, as the principal’s own beliefs or biases, Pfaff said. However, bias against Muslims and atheists on the basis of self-identification, without any request for accommodation, could stem from discrimination rooted in moral judgments. From a discrimination-research perspective, frontline bureaucrats, such as city or county workers or school principals, are an appropriate study pool — perhaps even more than elected officials — because they have so many seemingly routine interactions with the public.

Many other factors could also contribute to a lack of response, Pfaff added, not least workload and competing demands on a principal’s time. But the pattern of who did not receive a response was clear: discrimination against Muslim and atheist emails, as well as against more overt expressions of any faith.

Judaism was not included in the study because at the time previous research suggested little discrimination, and very positive public attitudes, toward Jews. In the years since, however, public expressions of anti-Semitism have increased, and in retrospect, including Judaism in the study would have been valuable, Pfaff acknowledged.

The focus of the study on public schools suggests specific policy solutions, such as briefing school administrators and staff about existing laws and constitutional standards concerning non-discrimination and legitimate religious accommodations, Pfaff said. Hiring a more diverse administrative staff and maintaining a general awareness of changing neighborhood demographics and public values could help better prepare school officials to serve their communities.

Credit: 
University of Washington

Global change ecologist leads NASA satellite study of rapid greening across Arctic tundra

image: The study is the first to measure vegetation changes across the Arctic tundra, from Alaska and Canada to Siberia, using satellite data from Landsat, a joint mission of NASA and the U.S. Geological Survey.

Image: 
Logan Berner, Northern Arizona University

As Arctic summers warm, Earth's northern landscapes are changing. Using satellite images to track global tundra ecosystems over decades, a team of researchers finds the region has become greener as warmer air and soil temperatures lead to increased plant growth.

"The Arctic tundra is one of the coldest biomes on Earth, and it's also one of the most rapidly warming," said Logan Berner, assistant research professor with Northern Arizona University's School of Informatics, Computing, and Cyber Systems (SICCS), who led the research in collaboration with scientists at eight other institutions in the U.S., Canada, Finland and the United Kingdom. "This Arctic greening we see is really a bellwether of global climatic change - it's this biome-scale response to rising air temperatures."

The study, published this week in Nature Communications, is the first to measure vegetation changes across the Arctic tundra, from Alaska and Canada to Siberia, using satellite data from Landsat, a joint mission of NASA and the U.S. Geological Survey. Scientists use Landsat data to determine how much actively growing vegetation is on the ground - greening can represent plants growing more, becoming denser or shrubs encroaching on typical tundra grasses and moss.

When the tundra vegetation changes, it impacts not only the wildlife that depend on certain plants, but also the people who live in the region and depend on local ecosystems for food. While active plants will absorb more carbon from the atmosphere, the warming temperatures are also thawing permafrost, releasing greenhouse gases. The research is part of NASA's Arctic Boreal Vulnerability Experiment (ABoVE), which aims to better understand how ecosystems are responding in these warming environments, and the broader implications.

Berner and his colleagues, including SICCS faculty Patrick Jantz and Scott Goetz along with postdoctoral researcher Richard Massey and research associate Patrick Burns, used the Landsat data and additional calculations to estimate the peak greenness for a given year for each of 50,000 randomly selected sites across the tundra. Between 1985 and 2016, about 38 percent of the tundra sites across Alaska, Canada and western Eurasia showed greening. Only 3 percent showed the opposite browning effect, which would mean fewer actively growing plants.

To include eastern Eurasian sites, the team compared data starting in 2000, which was when Landsat satellites began collecting regular images of that region. With this global view, 22 percent of sites greened between 2000 and 2016, while 4 percent browned.

"Whether it's since 1985 or 2000, we see this greening of the Arctic evident in the Landsat record," Berner said. "And we see this biome-scale greening over the same period as we see really rapid increases in summer air temperatures."

The researchers compared these greening patterns with other factors and found that they are also associated with higher soil temperatures and higher soil moisture. They confirmed these findings with plant growth measurements from field sites around the Arctic.

"Landsat is key for these kinds of measurements because it gathers data on a much finer scale than what was previously used," said NAU professor Goetz, who contributed to the study and leads the ABoVE science team. That allows the researchers to investigate what is driving the changes to the tundra. "There's a lot of microscale variability in the Arctic, so it's important to work at finer resolution while also having a long data record," Goetz said. "That's why Landsat's so valuable."

Credit: 
Northern Arizona University

Diabetes dramatically reduces the kidney's ability to clean itself

image: Drs. Zheng Dong and Zhengwei Ma, MCG research associate and the study's first author.

Image: 
Kim Ratliff, Augusta University photographer

AUGUSTA, Ga. (September 22, 2020) - The kidneys often become bulky and dysfunctional in diabetes, and now scientists have found that one path to this damage dramatically reduces the kidney's ability to clean up after itself.

The natural cleanup is called autophagy, which literally means "self-eating," and it's a constant throughout our bodies as debris, like misfolded proteins and damaged cell powerhouses called mitochondria, get packaged into a double-membrane sack, then destroyed by enzymes to help keep cells and organs functioning at a premium.

Autophagy tends to increase in the face of disease challenges like diabetes, but the scientists have found a pathway that then quickly decreases autophagy in both animal models and humans with varying stages of this chronic condition, leaving the kidneys more vulnerable, Dr. Zheng Dong and his colleagues report in the Journal of Clinical Investigation.

"This is the first time we understand there is a novel pathway that leads to autophagy's dysfunction in a chronic kidney disease condition like diabetes," says Dong, cellular biologist and Leon H. Charbonnier Endowed Chair in the Department of Cellular Biology and Anatomy at the Medical College of Georgia at Augusta University.

"I think it also suggests that we have a targeted pathway to prevent or slow progression of kidney failure," says Dong, the study's corresponding author, who is also a senior career scientist and director of research development at the Charlie Norwood Veterans Affairs Medical Center in Augusta. Dr. Zhengwei Ma, MCG research associate, is first author.

Diabetes affects about 100 million adults in the United States. Diabetic kidney disease, a major complication of diabetes, is the leading cause of chronic kidney disease and kidney failure, according to the Centers for Disease Control and Prevention.

Scientists like Dong studying the hazardous intersection of diabetes and the kidneys had been giving conflicting reports on the state of autophagy: some said it went up, others found it went down. Dong and his colleagues have now found both were correct.

They found autophagy levels naturally, initially increase in response to the stress of diabetes, but quickly drop off as a protein best known as a tumor suppressor activates a microRNA not normally involved in regulating autophagy. What follows is unhealthy overgrowth -- of individual cells and the organ -- called hypertrophy, scarring, inflammation and protein spilling into the urine, a classic sign the kidneys are failing.

"This helps us not just understand the phenomenon -- that there is initial activation, then it comes down -- but we understand what actually causes that down-regulation in a disease state," Dong says.

The known tumor suppressor is p53, the microRNA is miR-214 and the autophagy gene this pair conspires to suppress is ULK1, one of the first genes found to regulate this important metabolic function.

To measure the impact of this pathway on autophagy, they used levels of the protein LC3, found in the membranes of those sacks where cell parts get piled for elimination, which is considered a reliable biomarker of autophagy levels.

They consistently found, in both animals and humans, higher activity of p53 and miR-214 and lower ULK1 and autophagy over a relatively short period of time.

In a mouse model of type 1 diabetes, for example, at age nine weeks there was no difference in the autophagy biomarker compared to healthy controls. Two weeks later the biomarker was somewhat lower in the diabetic mice, and by age 11 weeks, it was significantly lower. The diabetic mice were smaller overall but had larger kidneys because of hypertrophy and signs of renal tubule damage, including scarring and inflammation.

"Initially there is an increase, then a decrease and it keeps going down," Dong says, noting the apparent link between less autophagy and greater hypertrophy. By 20 weeks, there was protein in the urine and kidney failure. Kidney cells incubated in high glucose also show this initial increase then, as little as a day later, an autophagy decrease, Dong says.

When they knocked out the autophagy gene, both kidney failure and hypertrophy got even worse, and when they activated the gene it prevented the downward progression.

They found in this dynamic pathway that autophagy went down because levels of the autophagy activating gene ULK1 went significantly down, and that ULK1 went down because miR-214 went way up. There are about 30 genes involved in autophagy and they found ULK1 most consistently down in their diabetes models and in kidney biopsies from patients with diabetes.

So they backed up a step to find what was regulating miR-214 and found p53, the well-known tumor suppressor that also regulates the cell cycle and which they believe acts as a stress responder, likely helping to eliminate bad cells.

When they knocked out or suppressed miR-214 in the cells of the long tubules right off the kidneys' filtering units, where much of the resorption of valuable items like proteins and amino acids normally happens, ULK1 expression and autophagy activity did not decrease in diabetes. There was less unhealthy overgrowth of the kidney and better kidney function, indicated by less spilling of protein. So they went back one more step, this time blocking p53, and saw that miR-214 levels did not go up and that the downstream benefits were essentially the same.

When they compared human samples from patients with diabetes to biopsies of healthy kidneys, none of the normal samples had signs of p53, but it was present in 15 of the 20 biopsies of patients who had diabetes and varying stages of related kidney problems. There were similar findings for miR-214. The healthy kidneys showed high levels of ULK1, as they should, and the diabetes biopsies had low levels of this well-established frontline autophagy gene.

"We have a signaling pathway that is very important to hypertrophy, for cell death and inflammation in diabetes that finally leads to kidney failure," Dong says, and interfering with that pathway could make a big difference in the lives of patients whose kidneys are in jeopardy.

"Instead of progressing in 10 to 12 years into kidney failure, we may be able to slow it down to 20 or 30 years or prevent it from happening," he says.

miR-214 is a logical intervention target because it clearly stands in the way of autophagy in the face of diabetes and normally has no obvious role in kidney function, Dong says. p53 is likely not a good target for a chronic problem like diabetic kidney disease because of its clear function as a tumor suppressor, he says; however, increasing ULK1 levels is another treatment possibility.

Hypertrophy is part of the pathogenic process that leads to kidney failure, Dong says, as tubular cells have to work harder and get bigger as a result, like an arm muscle responding to weight lifting, but at some point it results in dysfunction. Autophagy is activated to protect those renal tubule cells.

"This helps the cell combat stress to enable it to still survive," Dong says. And it may be a key difference in the resilience of kidneys following an acute kidney injury, which as the name implies, indicates some sort of sudden damage to the kidneys like a blockage of blood flow or a serious burn.

"We know in patients with acute kidney injury, the outcome is very different in different people," Dong says. The kidneys of generally healthy people tend to recover with proper care, while those with existing health problems that affect the kidneys, like hypertension and diabetes, or who are simply over age 75, may proceed to kidney failure, which requires dialysis or an organ transplant to survive. It was known that autophagy is protective and activated in an acute kidney injury, and Dong got the idea that maybe in diabetes that protection is lost or at least impaired. A subsequent study by another investigator showed that people with obesity-related type 2 diabetes did have impaired autophagy, which matched what they were seeing in the lab, so he decided to focus on why the decrease occurs.

Metabolic changes occur in diabetes, like high levels of glucose, inflammation and oxidative stress; cell powerhouses, called mitochondria, can malfunction; and the endoplasmic reticulum, where proteins are properly folded, can be disrupted. The kidney can grow larger, initially likely to accommodate the challenging work environment, but if the growth continues, the kidney, much like a heart in failure, may become enlarged and its function diminished or destroyed, Dong says. Patients with diabetes who develop hypertrophy are more likely to develop end-stage renal disease, the scientists write. While he is not certain that impaired autophagy completely explains the unhealthy growth, he is confident it contributes to it.

Cell stress is a major instigator of the low, so-called basal, level of autophagy that is going on all the time throughout the body. Problems with autophagy already are associated with diseases from cancer to Alzheimer's and heart disease, and it's also implicated in diabetes.

Some cells need more rigorous autophagy, like high-functioning neurons that are supposed to last a lifetime, and the podocytes that wrap around tiny capillaries in the approximately 1 million filtering units inside each kidney.

"You need to keep them in a healthy condition their whole life," Dong says of the podocytes, which keep a tight grip on capillaries with little foot-like extensions and are essential to the kidney's filtering role, like keeping proteins from being lost in the urine.

Renal tubule cells, the focus of the study, carry urine out of the filtering units toward the bladder, filtering salt and water along the way. They are the most prominent kidney cell type and normally function well with relatively low levels of autophagy. Evidence of inadequate clearance of dysfunctional components has been found in the tubule cells of the kidney, and there is a consensus that autophagy is impaired in this common kidney condition, Dong says.

Both type 1 and type 2 diabetes tend to be lifelong conditions, and the kidneys are an early target of high levels of glucose in the blood, which damage blood vessels, which can drive up blood pressure, which causes more damage. Ultimately the oversized kidneys will shrink because of degeneration and scarring.

For the study, the researchers examined biopsies from 20 people with diabetes and 14 people without.

Credit: 
Medical College of Georgia at Augusta University

Childhood sexual abuse: Mental and physical after-effects closely linked

A new Canadian study reveals that the psychological and physical effects of childhood sexual abuse are closely tied.

The finding could help healthcare professionals develop more effective interventions and ultimately improve mental and physical health outcomes for survivors of abuse in childhood.

Authored by Pascale Vézina-Gagnon, a PhD candidate at Université de Montréal's Department of Psychology, under the supervision of Professor Isabelle Daigneault, the study is published today in Health Psychology.

Twice as many diagnoses

The long-term consequences of childhood sexual abuse on survivors' health have only been recognized recently.

An initial study of 1,764 children and adolescents, published in 2018, showed that girls who survived substantiated cases of sexual abuse received 2.1 times as many diagnoses of urinary health issues and 1.4 times as many diagnoses of genital health issues as girls in the general population.

This finding prompted a subsequent study to determine why and how sexual-abuse survivors suffered from genitourinary problems more often than their peers in the general population.

Specifically, the second study aimed to gain a better understanding of this phenomenon by testing the theory that increased psychological distress is partly responsible for the higher incidence of genitourinary issues - such as urinary tract infections, vaginitis and pain during sex or menstruation - among childhood sexual-abuse survivors.

'A combined approach to treatment'

"The key takeaway from this study is that one-sided treatment - one that addresses just the psychological after-effects or just the physical trauma - is inadequate," said Vézina-Gagnon. "We need to follow a combined approach to treatment that doesn't view these issues as separate."

She added: "Interdisciplinary care is increasingly becoming the standard, and that's the message we hope our research sends to general practitioners, pediatricians, urologists, gynecologists, psychologists and psychiatrists so that they can help children recover as much as possible."

This is the first study to look at the relationship between genitourinary and psychological issues over such a long period of time - more than a decade - in such a large sample of child survivors of substantiated sexual abuse versus a comparison group.

1,322 girls studied

The researchers used medical data provided by Quebec's public health insurance agency, the Régie de l'assurance maladie du Québec, and the Quebec Ministry of Health and Social Services. The study involved 661 girls between the ages of 1 and 17 who survived one or more instances of substantiated sexual abuse and a comparison group of 661 girls from the general population.

The researchers had access to anonymized data on genitourinary and mental health diagnoses received following medical consultations or hospital stays the girls went through between 1996 and 2013. Several variables were taken into account, such as socioeconomic status, the number of years of access to medical data, and individual predispositions to genitourinary health problems before the sexual abuse occurred.

Childhood sexual abuse includes fondling and petting, oral sex, actual or attempted penetration, voyeurism, indecent exposure, inducement to engage in sexual activity and sexual exploitation (prostitution).

'A wider range of psychiatric issues'

"The results show that girls who were sexually abused were more likely to see a health professional for a wider range of psychiatric issues--anxiety, mood disorders, schizophrenia or substance abuse--than girls in the comparison group," said Vézina-Gagnon. "These consultations were also associated with more frequent medical appointments or hospitalizations for genital and urinary issues in the years after the sexual abuse was reported."

The researchers also found that the more often girls consulted their doctors or were hospitalized for multiple psychiatric issues (so-called comorbid psychiatric disorders) after experiencing abuse, the more strongly this explained subsequent genital health issues (62%) and urinary health issues (23%). The difference observed between genital and urinary health (62% vs. 23%) may be explained by factors not included in this study, said Vézina-Gagnon.

"Additional studies are needed to investigate this difference and determine whether other important variables - ones that we didn't have information on, such as the severity, length and frequency of the abuse - could be associated with more severe genitourinary health outcomes," she said.

Two hypotheses offered

"On an emotional and behavioural level, two hypotheses can be formulated to explain these findings," said Vézina-Gagnon. The first is that the association is due to a hypervigilant response. Survivors of sexual abuse who are affected by several mental health issues - such as anxiety, depression and post-traumatic stress disorder - may become hypervigilant or more likely to notice symptoms related to their genital or urinary health, which would lead them to see their doctor more frequently.

"In contrast," she continued, "the second hypothesis is that the association is caused by avoidant behaviour. Survivors may put off or avoid asking for help or seeing a doctor for genitourinary issues, thereby increasing the risk that such problems deteriorate or become chronic conditions. Gynecological care may trigger memories of past abuse (due to the imbalance of power between patients and doctors, the removal of clothing, feelings of vulnerability and physical pain) and it may therefore be especially difficult for these girls."

Toward a holistic approach

"The study's findings align with the scientific literature on health psychology and abuse, and once again highlight how important it is to consider the relationship between physical and mental health," said Vézina-Gagnon. A holistic, body-mind approach is therefore needed to help girls recover from sexual trauma, she maintains.

"In light of these findings, healthcare practitioners should assess the level of psychological distress experienced by survivors of childhood sexual abuse who report genitourinary issues and direct them to the right mental health resources," Vézina-Gagnon said.

The researchers behind this study believe that early and targeted intervention to reduce psychological distress among survivors may help prevent genitourinary issues from deteriorating or turning into chronic conditions.

Credit: 
University of Montreal

Personal interactions are important drivers of STEM identity in girls

image: Participants in the 2019 SciGirls Coding Camp at the National High Magnetic Field Laboratory practice their skills.

Image: 
Stephen Bilenky/National MagLab

As head of the educational outreach arm of the Florida State University-headquartered National High Magnetic Field Laboratory, Roxanne Hughes has overseen dozens of science camps over the years, including numerous sessions of the successful SciGirls Summer Camp she co-organizes with WFSU.

In a new paper published in the Journal of Research in Science Teaching, Hughes and her colleagues took a much closer look at one of those camps, a coding camp for middle school girls.

They found that nuanced interactions between teachers and campers as well as among the girls themselves impacted how girls viewed themselves as coders.

The MagLab offers both co-ed camps and summer camps for girls, about science in general as well as about coding in particular. Hughes, director of the MagLab's Center for Integrating Research and Learning, wanted to study the coding camp because computer science is the only STEM field where the representation of women has actually declined since 1990.

"It's super gendered in how it has been advertised, beginning with the personal computer," Hughes said. "And there are stereotypes behind what is marketed to girls versus what is marketed to boys. We wanted to develop a conceptual framework focusing specifically on coding identity -- how the girls see themselves as coders -- to add to existing research on STEM identity more broadly."

This specific study focused on the disparate experiences of three girls in the camp. The researchers looked at when and how the girls were recognized for their coding successes during the camp, and how teachers and peers responded when the girls demonstrated coding skills.

"Each girl received different levels of recognition, which affected their coding identity development," Hughes said. "We found that educators play a crucial role in amplifying recognition, which then influences how those interactions reinforce their identities as coders."

Positive praise, for example, often resulted in a girl pursuing more challenging activities, strengthening her coding identity.

Exactly how teachers praised the campers played a role in how that recognition impacted the girls. Being praised in front of other girls, for example, had more impact than a discreet pat on the back. More public praise prompted peer recognition, which further boosted a girl's coding identity.

The type of behavior recognized by teachers also appeared to have different effects. A girl praised for demonstrating a skill might feel more like a coder than one lauded for her persistence, for example. Lack of encouragement was also observed: One girl who sought attention for her coding prowess went unacknowledged, while another who was assisting her peers received lots of recognition, responses that seem to play into gender stereotypes, Hughes said. Even in a camp explicitly designed to bolster girls in the sciences, prevailing stereotypes can undermine best intentions.

"To me, the most interesting piece was the way in which educators still carry the general gender stereotypes, and how that influenced the behavior they rewarded," Hughes said. "They recognized the girl who was being a team player, checking in on how everyone was feeling -- all very stereotypically feminine traits that are not necessarily connected to or rewarded in computing fields currently."

Messaging about science is especially important for girls in middle school, Hughes said. At that developmental stage, their interest in STEM disciplines begins to wane as they start to get the picture that those fields clash with their other identities.

The MagLab study focused on three girls -- one Black, one white and one Latina -- as a means to develop a framework for future researchers to understand coding identity. Hughes says this is too small a data set to tease out definitive conclusions about roles of race and gender, but the study does raise many questions for future researchers to examine with the help of these findings.

"The questions that come out of the study to me are so fascinating," Hughes said. "Like, how would these girls be treated differently if they were boys? How do the definitions of 'coder' that the girls develop in the camp open or constrain opportunities for them to continue this identity work as they move forward?"

The study has also prompted Hughes to think about how to design more inclusive, culturally responsive camps at the MagLab.

"Even though this is a summer camp, there is still the same carryover of stereotypes and sexism and racism from the outer world into this space," she said. "How can we create a space where girls can behave differently from the social gendered expectations?"

The challenge will be to show each camper that she and her culture are valued in the camp and to draw connections between home and camp that underscore that. "We need to show that each of the girls has value -- in that camp space and in science in general," Hughes said.

Credit: 
Florida State University

Just add water: Biodiversity resurgence in effluent-fed desert riverbeds

image: Nearly 70 years after the historic downtown reach of the Santa Cruz River ran dry, water returned in the form of 2.8 million gallons of reclaimed water released daily through the City of Tucson's Santa Cruz River Heritage Project. Within the first day of the water's release, several species of dragonflies, including this one, were found near the river's banks.

Image: 
Courtesy of Michael Bogan

Throughout the late 19th century, rivers across the southwestern United States were parceled out, and flows were diverted through irrigation canals and trapped behind dams. Growing populations put new demands on groundwater sources. Coupled with changing climate conditions, these pressures caused water tables to sink and perennial streams to run dry.

The fate of the Santa Cruz River in southeastern Arizona was no different.

The banks of the river, described in 1855 by explorer Julius Froebel as "covered with poplars and willows, ash-trees and plantains, oaks and walnut trees," would be unrecognizable by the late 1940s.

Nearly 70 years after the historic downtown reach of the Santa Cruz River ran dry, water returned in the form of 2.8 million gallons of reclaimed water released daily through the city of Tucson's Santa Cruz River Heritage Project.

When the water valves opened, researchers from the University of Arizona School of Natural Resources and the Environment were there to witness the occasion. Their driving question: When and how would the aquatic biodiversity return?

"Within the first day, we saw seven different species of dragonflies," said Michael Bogan, an assistant professor of natural resources with the College of Agriculture and Life Sciences.

After 10 months, the team found over 40 species.

"Which is what you would see in a site that had been flowing for a very long time," Bogan said.

More than a year of effluent-fed river flow later, the results of the "grand experiment" - published this week in the environmental science-focused, open-access journal PeerJ - are simple: Just add water.

"If you can put water back into these river systems that have been essentially de-watered, aquatic life will respond," Bogan said. "It will come back, even after a long absence, as is the case with the river in downtown Tucson."

The Good, the Bad and the Stinky:

Historically, effluent-dependent streams have been widely seen as degraded ecosystems, with much lower biodiversity than natural streams.

"This was due to relatively poor water quality being discharged into streams from wastewater treatment facilities," said Hamdhani, a graduate student in the School of Natural Resources and the Environment.

Hamdhani led a review study of water quality in effluent-fed streams across the globe, recently published in the journal Freshwater Biology.

"It revealed that some critical water quality parameters are often negatively impacted in portions of streams, typically closest to the effluent outfalls," Hamdhani said. "Some common issues include elevated temperature, nutrients, trace organic contaminants and low dissolved oxygen."

However, many wastewater treatment plants have been upgraded to provide better reclaimed water quality, and this better water is now supporting surprisingly high levels of biodiversity, as is the case in the Santa Cruz River.

"A lot of that is due to changes that the county made about eight years ago to both of the treatment plants that feed the northern reaches of the river," Bogan said. "Prior to 2012, they discharged lower quality water. You used to be able to smell the river when you drove in on I-10."

Both treatment plants went through a major upgrade in the early 2010s, where they improved the water treatment process.

"After that, the smell disappeared," Bogan said. "And biodiversity returned."

Crossing the Border, Changing Directions:

The Santa Cruz River, a tributary of the Gila River, runs approximately 184 miles. It springs from the grasslands of the San Rafael Valley east of Patagonia, Arizona, and is the only river in the U.S. to cross the border twice. Running south into Mexico for roughly 10 miles, it changes course and turns north to cross the border again just east of Nogales.

"The river channel flows all the way to the confluence of the Gila River south of the Phoenix area," said Drew Eppehimer, a doctoral student in arid lands resource sciences. "Traditionally, it never really flowed that far. Maybe in huge monsoon flood events, but the formerly perennial sections of the river, here near town, dried up in the early 1900s because of groundwater pumping."

Eppehimer studies water management along the borderlands, focused particularly on treated effluent-fed streams and river systems as a potential source of novel habitats for aquatic life in the desert.

In Arizona, there are about a dozen effluent-dependent river and stream systems, making up about 90 miles of effluent flow.

Reclaimed water released from two wastewater treatment plants just north of Tucson helps the Santa Cruz flow as far as the Pinal County line. Beginning in 2017, Eppehimer delved into the at-times waist-deep waters in these northern reaches of the Santa Cruz River to study the resurgence of aquatic life taking root.

"There wasn't a lot of historical research on that reach of the river, but there was some. When you look at what they found, they would see a handful of insects, five or 10 different kinds," Bogan said. "Whereas Drew went out there and found over 150 different kinds. That's all thanks to the upgrade of these treatment plants and the higher water quality that we have now."

"Mayflies and caddisflies are quite abundant in that part of the river, which kind of shocked us," Eppehimer said. "You think of this effluent as maybe being of poorer quality than a natural stream system, and a lot of other studies view treated wastewater as a detriment to aquatic environments, but here it's the only water we have that sustains year-round flows in the river."

The biodiversity in the northern reaches of the effluent-dependent Santa Cruz River is encouraging; caddisflies and mayflies are not only indicators of a fairly healthy water system but signs of a comeback, according to the researchers.

New Reaches, New Beginnings:

The new Santa Cruz River Heritage Project is a third discharge location along the river near Tucson, south of downtown and farther upstream. Eppehimer's findings in the northern reaches of the river served as the baseline for additional research.

"The downtown reach gave us that opportunity to say, OK, from day one, who arrives first, which species do well, and how long does it take for that same level of biodiversity that we see in the northern reaches to build up in this new reach? In the case of the dragonfly, it actually became that best case scenario within a year," Bogan said.

Innovative new projects using effluent to restore flow in rivers, like the Santa Cruz River Heritage Project, are showing almost immediate positive biodiversity effects, and the return of species to these rivers after very long dry spells can be incredibly fast.

Still, questions remain. Despite improved wastewater treatment technologies, trace levels of contaminants, such as per- and polyfluoroalkyl substances and pharmaceuticals, pass through treatment and persist in streams.

"Further questions surround the things we cannot readily observe, or those mixtures of environmentally persistent compounds that are not yet regulated and slip by treatment processes," said David Walker, an assistant research scientist in the UArizona Department of Environmental Science.

Walker adds that these compounds may have cumulative biological and behavioral effects on organisms and aquatic life, and says it is often the mixture of these micropollutants in treated effluent discharged into rivers that accounts for the biggest unknowns.

"Effluent can potentially still pose water quality impairments that may cause biological community degradation," Hamdhani said. "However, when wastewater treatment standards are high, effluent-fed streams can serve as refuges for aquatic biodiversity and corridors of ecological connectivity, especially in semi-arid and arid regions where natural streams have been depleted."

Credit: 
PeerJ

Corona-induced CO2 emission reductions are not yet detectable in the atmosphere

image: On the Zugspitze, KIT researchers monitor CO2 concentration and other parameters of the atmosphere.

Image: 
(Photo: Markus Rettinger, KIT)

Based on current data from the energy, industry, and mobility sectors, the restrictions on social life during the corona pandemic are predicted to reduce worldwide carbon dioxide emissions by up to eight percent in 2020. According to the Intergovernmental Panel on Climate Change (IPCC), cumulative reductions of about this magnitude would be required every year to reach the goals of the Paris Agreement by 2030. Recent measurements by researchers of Karlsruhe Institute of Technology (KIT) reveal that the concentration of carbon dioxide (CO2) in the atmosphere has not yet changed detectably as a result of the estimated emission reductions. The results are reported in Remote Sensing (DOI: 10.3390/rs12152387).

The corona pandemic has changed both our working and our private lives. People increasingly work from home, have video conferences instead of business trips, and spend their holidays in their home country. The lower traffic volume also reduces CO2 emissions. Reductions of up to eight percent are estimated for 2020. "In spite of the reduced emissions, our measurements show that CO2 concentration in the atmosphere has not yet decreased," says Ralf Sussmann from the Atmospheric Environmental Research Division of KIT's Institute of Meteorology and Climate Research (IMK-IFU), KIT's Campus Alpine, in Garmisch-Partenkirchen. "To reduce CO2 concentration in the atmosphere in the long run, restrictions imposed during the corona pandemic would have to be continued for decades. But even this would be far from being sufficient."

To illustrate this, the researchers additionally studied a long-term scenario that can be verified well with atmospheric measurements: the goal of the Paris Climate Agreement to limit global warming to 1.5 degrees Celsius can only be reached by an immediate significant reduction of CO2 emissions and a further decrease down to zero by 2055. "The restrictions imposed during the corona crisis, however, are far from being sufficient. They have just resulted in a one-time reduction by eight percent. To reach zero emissions in the coming decades, cumulative reductions of the same magnitude would be required every year, i.e. 16 percent in 2021, 24 percent in 2022, and so on. For this, political measures have to be taken to directly initiate fundamental technological changes in the energy and transport sectors," Sussmann says.
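The year-by-year arithmetic in Sussmann's statement can be sketched as a simple calculation. This is an illustrative back-of-the-envelope based on the figures quoted above (an 8-point step per year), not the study's model:

```python
# Illustrative sketch of the arithmetic quoted above: a one-time cut
# of about 8% in 2020 would have to deepen by the same increment every
# year (16% in 2021, 24% in 2022, ...) for emissions to head toward zero.

def required_reduction(year, base_year=2020, annual_step=8):
    """Cumulative reduction (percent of pre-2020 emissions) required
    in `year`, capped at 100 once emissions would reach zero."""
    return min(100, annual_step * (year - base_year + 1))

print([required_reduction(y) for y in range(2020, 2024)])  # [8, 16, 24, 32]
```

Under this simple arithmetic, an 8-point annual step would drive the required reduction to 100 percent in the early 2030s; a gentler path to zero by 2055 corresponds to a smaller annual step, but the same compounding logic applies.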

For the study, the team used data from the Total Carbon Column Observing Network (TCCON), which measures concentrations in different layers of the atmosphere above Garmisch-Partenkirchen and at other sites around the globe. "High-tech infrared spectrometers are applied, which use the sun as a light source. The measurement method is highly precise; uncertainties are in the range of a few thousandths," Sussmann adds.

Long Life of CO2 Prevents Early Detection

According to the researchers, the long atmospheric lifetime of CO2 and the high background concentrations that have accumulated since the start of industrialization prevent the changes from being detected in the atmosphere. "Natural effects also make early detection difficult: anthropogenic emissions, the main cause of the long-term increase in atmospheric CO2, are superposed by annual fluctuations of the growth rate due to natural climate variability of ocean sinks and land vegetation," Sussmann says. Successful emission reduction is therefore hard to detect by atmospheric measurements.

For their study, the researchers compared the TCCON measurements with projections of the atmospheric growth rate for 2020 - with and without corona restrictions. "Precision analysis of the atmosphere measurements revealed that the impact of COVID-19 measures on the atmosphere could be detected after little more than six months, provided the reference state without COVID-19 could be predicted precisely," the climate researcher explains. "In any case, we would be able to find out within presumably two and a half years whether global political and social measures will help us find viable alternatives to fossil fuels and reach the goals of the Paris Climate Agreement."

Credit: 
Karlsruher Institut für Technologie (KIT)

Extra stability for magnetic knots

Tiny magnetic whirls that can occur in materials - so-called skyrmions - hold high promise for novel electronic devices or magnetic memories in which they are used as bits to store information. A fundamental prerequisite for any application is the stability of these magnetic whirls. A research team of the Institute of Theoretical Physics and Astrophysics of Kiel University has now demonstrated that previously neglected magnetic interactions can play a key role for skyrmion stability and can drastically enhance skyrmion lifetime. Their work, published today (September 21, 2020) in Nature Communications, also opens the prospect of stabilizing skyrmions in new material systems in which the previously considered mechanisms are not sufficient.

Intensive research on stability at room temperature

Their unique magnetic structure - more precisely, their topology - lends skyrmions their stability and protects them from collapse; skyrmions are therefore described as knots in the magnetization. However, on the atomic lattice of a solid this protection is imperfect and there is only a finite energy barrier (Figure 1). "The situation is comparable to a marble lying in a trough, which thus needs a certain impetus - energy - to escape from it. The larger the energy barrier, the higher the temperature at which the skyrmion is stable," explains Professor Stefan Heinze from Kiel University. Skyrmions with diameters below 10 nanometers, which are needed for future spin-electronic devices, have so far only been detected at very low temperatures. Since applications typically operate at room temperature, enhancing the energy barrier is a key objective of today's research on skyrmions.
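The link Heinze describes between barrier height and thermal stability is commonly modeled with an Arrhenius law, in which the mean skyrmion lifetime grows exponentially with the ratio of the energy barrier to the thermal energy. The sketch below illustrates that standard picture; the barrier values are illustrative assumptions, not numbers from the paper:

```python
# Standard Arrhenius picture (illustrative, not the paper's data):
# mean lifetime ~ tau0 * exp(dE / (kB * T)), so the lifetime relative
# to the attempt timescale tau0 is the Boltzmann factor computed below.
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def lifetime_ratio(barrier_ev, temperature_k):
    """Lifetime relative to the attempt timescale for an energy
    barrier of barrier_ev (eV) at temperature_k (K)."""
    return math.exp(barrier_ev / (K_B * temperature_k))

# Doubling the barrier squares the lifetime ratio at fixed temperature,
# which is why even a modest barrier enhancement matters so much:
r_small = lifetime_ratio(0.1, 300.0)  # hypothetical 0.1 eV barrier at room temperature
r_large = lifetime_ratio(0.2, 300.0)  # hypothetical 0.2 eV barrier at room temperature
```

In this picture, raising the temperature shrinks the Boltzmann factor, which is consistent with small skyrmions (with small barriers) having so far only been observed at very low temperatures.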

Previously, a standard model of the relevant magnetic interactions contributing to the barrier had been established. A team of theoretical physicists from the research group of Professor Stefan Heinze has now demonstrated that one type of magnetic interaction has so far been overlooked. In the 1920s, Werner Heisenberg explained the occurrence of ferromagnetism by the quantum mechanical exchange interaction, which results from the spin-dependent "hopping" of electrons between two atoms. "If one considers electron hopping among more than two atoms, higher-order exchange interactions occur," says Dr. Souvik Paul, first author of the study (Figure 2). However, these interactions are much weaker than the pairwise exchange proposed by Heisenberg and were thus neglected in research on skyrmions.

Weak higher-order exchange interactions stabilize skyrmions

Based on atomistic simulations and quantum mechanical calculations performed on the supercomputers of the North-German Supercomputing Alliance (HLRN), the scientists from Kiel have now shown that these weak interactions can nevertheless make a surprisingly large contribution to skyrmion stability. In particular, the cyclic hopping over four atomic sites (see red arrows in Fig. 2) influences the energy of the transition state extraordinarily strongly (see Fig. 1, highest point on the upper right), where only a few atomic bar magnets are tilted against each other. The simulations even revealed stable antiskyrmions, which are advantageous for some future data-storage concepts but typically decay too quickly.
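As a rough illustration of how such a term enters the energetics, one can write a classical toy model: alongside the pairwise Heisenberg term, a four-site ring-exchange term couples products of spin pairs around a plaquette. The functional form below is the generic textbook four-spin term with arbitrary coupling values, not the Hamiltonian used in the study:

```python
def dot(a, b):
    """Dot product of two classical spin vectors."""
    return sum(x * y for x, y in zip(a, b))

def pair_exchange(spins, bonds, J=1.0):
    """Pairwise Heisenberg exchange: E = -J * sum over bonds of S_i . S_j."""
    return -J * sum(dot(spins[i], spins[j]) for i, j in bonds)

def four_spin_ring(spins, plaquettes, K=0.05):
    """Higher-order exchange from cyclic electron hopping around a
    four-site plaquette (i, j, k, l): products of pair scalar products."""
    e = 0.0
    for i, j, k, l in plaquettes:
        si, sj, sk, sl = spins[i], spins[j], spins[k], spins[l]
        e += K * (dot(si, sj) * dot(sk, sl)
                  + dot(si, sl) * dot(sj, sk)
                  - dot(si, sk) * dot(sj, sl))
    return e

# Four aligned spins on a square plaquette: the four-spin term is small
# here (K << J), but in a tilted transition-state configuration its
# relative contribution to the energy barrier can be much larger.
up = (0.0, 0.0, 1.0)
spins = [up, up, up, up]
print(pair_exchange(spins, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # -4.0
print(four_spin_ring(spins, [(0, 1, 2, 3)]))                   # 0.05
```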

Higher-order exchange interactions appear in many magnetic materials used for potential skyrmion applications, such as cobalt or iron. They can also stabilize skyrmions in magnetic structures in which the previously considered magnetic interactions cannot occur or are too small. The present study therefore opens promising new routes for research on these fascinating magnetic knots.

Credit: 
Kiel University

Climate: risks and future strategies in Italy

image: Climate change could cost Italy up to 8% of GDP per capita, exacerbate the differences between north and south and between society's rich and poor, and affect a number of Italy's strategic sectors: it is a risk accelerator for many aspects of both the economy and society. The report, "Risk Analysis. Climate Change in Italy", produced by the CMCC Foundation, Euro-Mediterranean Center on Climate Change, is the first integrated analysis of climate risk in Italy.

Image: 
Fondazione CMCC

Climate change could cost Italy up to 8% of GDP per capita, exacerbate the differences between north and south and between society’s rich and poor, and affect a number of Italy’s strategic sectors: it is a risk accelerator for many aspects of both the economy and society. The report, “Analisi del rischio. I cambiamenti climatici in Italia – Risk Analysis. Climate Change in Italy”, has now been published. Produced by the CMCC Foundation, the Euro-Mediterranean Center on Climate Change, it is the first integrated analysis of climate risk in Italy. The document draws on climate projections for the coming years while focusing on specific sectors, so as to provide information on what to expect from the future, and is a valuable support tool for concrete resilience and sustainable development strategies.

Risks associated with climate change affect all Italian regions and their economic sectors. Although different areas are affected in different ways, no region can be considered immune from climate risks, which have already increased in recent years, particularly with regard to extreme events.

The analysis carried out by the CMCC Foundation starts from climate scenarios that, through an advanced use of high-resolution climate models applied to the Italian context, provide information on Italy’s future climate. This information is then applied to risk analysis for a number of sectors of the Italian socio-economic system. What emerges is a picture in which, over the coming decades, risk grows in many areas, with significant economic and financial costs for the country. Furthermore, the impacts will affect disadvantaged members of society more severely and will involve all sectors, not least infrastructure, agriculture and tourism.

Risk, scientific knowledge and response strategies
“This report contains the most up-to-date and advanced knowledge of the impacts and integrated risk analysis of climate change in Italy”, explains Donatella Spano, member of the CMCC Foundation and professor at the University of Sassari, who coordinated the thirty authors of the five chapters that make up the research study. “Analysis of risks and their effects on environmental, natural, social and economic capital allows us to take the response options identified by scientific research into account and to develop integrated and sustainable management plans for Italy, enhancing the specific features, peculiarities and competences of the different territorial contexts”, continues Spano. “This knowledge is the result of innovative research, networking between the universities that contribute to the CMCC Foundation’s work, and international collaborations. It is also the product of top-level computing infrastructure at a global level. Putting all these aspects together in a multidisciplinary research perspective is a scientific community endeavour, the results of which are at the service of society and produce knowledge that benefits the entire country.”

“The challenge of risk connected to climate change – concludes Donatella Spano – starts from scientific knowledge and integrates adaptation and the solutions needed to face risks, in all phases of the decision-making processes including public policies, investment programs and the planning of public expenditure, so as to guarantee sustainable development at all territorial and governance levels.”

The report addresses the issues summarized below and is accompanied by a series of key messages, infographics and an executive summary that facilitate its reading and use (available at this link).

Italy’s expected climate. The different climate models used concur in projecting a temperature increase of up to 2°C in the period 2021-2050 (compared to 1981-2010). In the worst-case scenario, the increase may reach 5°C. Summer precipitation decreases in the central and southern regions, whereas intense precipitation events increase. In all scenarios, the number of hot days and of periods without rain increases. The consequences of climate change for the marine and coastal environment will affect the coastal “ecosystem goods and services” that sustain socioeconomic systems through the provision of food and climate regulation. (See the infographic).

Aggregate risk for Italy. Adaptation capacity and resilience are concerns for the entire country, from north to south. Although the northern regions are richer and more developed, they are not exempt from climate change impacts, nor are they better prepared to face them. As for extreme events, the probability of risk in Italy has increased by 9% over the last twenty years.

Economic costs, tools and financial resources. The costs of climate change impacts in Italy increase exponentially as temperatures rise in the different scenarios, with values ranging between 0.5% and 8% of GDP by the end of the century. Climate change increases economic inequality between regions. All sectors of the Italian economy are negatively affected, but the greatest losses concern the country’s networks and infrastructure, as well as agriculture and tourism, in both summer and winter. Climate change will require sizeable investments and represents an opportunity for sustainable development, which the European Green Deal recognizes as the only possible development model for the future. Today is the best moment for new ways of doing business and new forms of sustainable land management to become part of the know-how of companies and of local and national public bodies. (See the infographic)

Cities and the urban environment. Children, the elderly, the disabled and the most vulnerable members of society will suffer the most from the increase in average and extreme temperatures and from the greater frequency (and duration) of heat waves and intense precipitation events. Indeed, an increase in mortality is expected from ischemic heart disease, stroke, nephropathy and metabolic disorders caused by thermal stress, along with an increase in respiratory diseases due to the link between urban temperature rise (heat islands) and concentrations of ozone (O3) and fine particulate matter (PM10). (See the infographic)

Geo-hydrological risk. The combined analysis of anthropogenic factors and climate scenarios points to the worsening of an already very complex situation. Rising temperatures and an increase in localized precipitation events both play an important role in exacerbating risks. Rising temperatures melt snow, ice and permafrost, making the Alpine and Apennine regions the areas most affected by changes in the magnitude and seasonality of instability phenomena. Intense precipitation, in turn, further increases the hydraulic risk for small basins and the risk of shallow landslides in areas with more permeable soils. (See the infographic)

Water resources. Most of the impacts of climate change on water resources involve a reduction in the quantity of both surface water and groundwater in almost all semi-arid regions, with a consequent increase in risks for Italy’s sustainable development. The expected changes in climate (drought, extreme events, changes in the rainfall regime, reduced flow rates) entail risks for water quality and availability. The most relevant risks for water availability are linked to strong competition between sectors (civil, agricultural, industrial, environmental, energy production), which worsens in the hot season, when resources are scarce and demand increases (for example for agriculture and tourism). (See infographic)

Agriculture. Agricultural systems may suffer from increased production variability, with a tendency towards yield reduction for many cultivated species, accompanied by a probable decline in the quality of the produce. However, impacts vary significantly depending on the geographical area and the specific crops in question. Negative impacts are also expected for the livestock sector, with both direct and indirect effects on farmed animals and consequent repercussions on the quality and quantity of production. (See infographic)

Wildfires. Rising temperatures, a reduction in average annual precipitation, and the greater frequency of extreme weather events such as heat waves and drought will interact with the effects of the abandonment of cultivated areas, pastures and formerly managed forests, a strong exodus towards cities and coastal areas, and increasingly efficient monitoring, prevention and active control activities. Climate change is expected to further exacerbate specific components of fire risk, with impacts on vulnerable people, assets and ecosystems in the most exposed areas. Expected changes include greater wildfire danger, a shift in the altitudes considered vulnerable, a longer fire season and more extremely dangerous days. This may translate into an increase in burnt areas, with a consequent rise in emissions of greenhouse gases and particulate matter, and impacts on human health and the carbon cycle. (See infographic)

 

Risk Analysis. Climate Change in Italy
Spano D., Mereu V., Bacciu V., Marras S., Trabucco A., Adinolfi M., Barbato G., Bosello F., Breil M., Coppini G., Essenfelder A., Galluccio G., Lovato T., Marzi S., Masina S., Mercogliano P., Mysiak J., Noce S., Pal J., Reder A., Rianna G., Rizzo A., Santini M., Sini E., Staccione A., Villani V., Zavatarelli M., 2020. “Risk Analysis. Climate Change in Italy”.
DOI: 10.25424/cmcc/analisi_del_rischio

All material available at the following link: https://www.cmcc.it/it/analisi-del-rischio-i-cambiamenti-climatici-in-italia

DOI

10.25424/cmcc/analisi_del_rischio

Credit: 
CMCC Foundation - Euro-Mediterranean Center on Climate Change

Bio-based inhibition of gas hydrate formation

image: Synthesis of epoxidized sunflower oil (ESFO) and phosphorylated polyol (Phospol)

Image: 
Kazan Federal University

Copper stearate was used as the basis for this catalyst test and proved effective for in-situ oil combustion.

"This development is undoubtedly very promising. In-situ combustion is an attractive and effective method of thermal oil extraction enhancement. One of the problems here is the initiation of combustion front and its further support, which can be stifled by a number of issues. Copper stearate is a strong natural catalyst," comments co-author Yuan Chengdong, Senior Research Associate of the Rheological and Thermochemical Research Lab.

Copper stearate has clearly proven to give an impetus to in-situ combustion. It performed well at low temperatures, which makes it more economically viable.

"The composition that can be used in the petroleum industry is basically ready. We are now trying to increase its efficiency with other compounds, that is to say, to make the combustion process more gradual," says Junior Research Associate Dmitry Yemelyanov.

The publication is available online and is set to appear in print in November 2020.

Credit: 
Kazan Federal University

Researchers discover new molecules for tracking Parkinson's disease

image: The chemical structure of an alpha-synuclein fibril with an "exemplar" molecule, shown as colored spheres, bound to a previously identified binding site.

Image: 
E. James Petersson

For many of the 200,000 patients diagnosed with Parkinson's disease in the United States every year, the diagnosis occurs only after the appearance of severe symptoms such as tremors or speech difficulties. With the goal of recognizing and treating neurological diseases earlier, researchers are looking for new ways to image biological molecules that indicate disease progression before symptoms appear. One such candidate, and a known hallmark of Parkinson's disease, is the formation of clumps of alpha-synuclein protein. Yet while this protein was identified more than 20 years ago, a reliable way to track alpha-synuclein aggregates in the brain has yet to be developed.

Now, a new study published in Chemical Science describes an innovative approach for identifying molecules that can help track the progression of Parkinson's disease. Conducted by researchers in the labs of E. James Petersson, Robert Mach, and Virginia Lee, this proof-of-concept study could change the paradigm for how researchers screen and test new molecules for studying a wide range of neurodegenerative diseases.

Studying these types of protein aggregates requires new tracers: radioactive molecules that clinicians use to image tissues and organs with positron emission tomography (PET). Mach, a senior researcher in the field of PET tracer development, and his group worked for several years with the Michael J. Fox Foundation to develop an alpha-synuclein tracer, but without data on the protein's structure they were unable to find candidates selective enough to be used as a diagnostic tool.

Then, with the first publication of alpha-synuclein's structure and an increase in tools available from the field of computational chemistry, Mach and Petersson started collaborating on developing an alpha-synuclein PET tracer. By combining their respective expertise in radiochemistry and protein engineering, they were able to confirm experimentally where on the alpha-synuclein protein potential tracer molecules were able to bind, crucial information to help them discover and design molecules that would be specific to alpha-synuclein.

In their latest study, the researchers developed a high-throughput computational method that allowed them to screen millions of candidate molecules to see which ones would bind to the known binding sites on alpha-synuclein. Building on a previously published method, their approach first identifies an "exemplar", a pseudo-molecule that fits perfectly into the binding site of alpha-synuclein. That exemplar is then compared with commercially available molecules to see which ones have a similar structure. The researchers then use other computer programs to narrow down the list of candidates for testing in the lab.
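The screening logic amounts to a similarity ranking: each library compound is reduced to a fingerprint of structural features, scored against the exemplar, and only the top scorers go on to lab testing. In the toy sketch below, the compound names and features are hypothetical, and a simple Tanimoto coefficient over feature sets stands in for the far richer scoring used in the actual study:

```python
def tanimoto(a: frozenset, b: frozenset) -> float:
    """Tanimoto similarity between two structural fingerprints (feature sets)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def screen(exemplar: frozenset, library: dict, top_n: int = 20):
    """Rank a compound library by similarity to the exemplar pseudo-molecule
    and keep the best candidates for laboratory testing."""
    ranked = sorted(library.items(),
                    key=lambda kv: tanimoto(exemplar, kv[1]),
                    reverse=True)
    return ranked[:top_n]

# Hypothetical fingerprints: each compound is a set of structural features.
exemplar = frozenset({"aromatic_ring", "h_bond_acceptor", "planar", "amine"})
library = {
    "cmpd_A": frozenset({"aromatic_ring", "h_bond_acceptor", "planar"}),
    "cmpd_B": frozenset({"aliphatic_chain", "ester"}),
    "cmpd_C": frozenset({"aromatic_ring", "amine", "planar"}),
}
top = screen(exemplar, library, top_n=2)
print([name for name, _ in top])  # ['cmpd_A', 'cmpd_C']
```

The appeal of this kind of pipeline is that the expensive step (scoring) is cheap enough per compound to run over millions of entries, leaving only a short list for experimental validation.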

To evaluate the performance of their screening method, the scientists selected a small subset of 20 promising candidates from the 7 million compounds screened and found that two had extremely high binding affinity for alpha-synuclein. The researchers also used mouse brain tissue provided by the Lee group to further validate the new method. They were impressed, and pleasantly surprised, by their success rate, which they attribute to the specific nature of their search method. "There's certainly a bit of luck involved as well," Petersson adds. "Probably the biggest surprise is just how well it worked."

The idea of using the exemplar method to tackle this problem came to first author and Ph.D. graduate John "Jack" Ferrie while he was learning computational chemistry methods at the Institute for Protein Design at the University of Washington as part of a Parkinson's Foundation Summer Fellowship. "The summer fellowship is designed to train students in new methods that can be applied to Parkinson's disease research, and that's exactly what happened here," says Petersson. "The ideas that Jack came back with formed the basis of a big effort in both my lab and Bob Mach's lab to identify PET tracers computationally."

Now, as part of a large multi-institutional grant, Petersson, Mach, Lee, and many other collaborators are poised to take the lessons learned from this finding to develop PET tracers for Parkinson's and other neurodegenerative diseases. "I really see this as being a game changer on how we do PET probe development," says Mach. "The significance is that we're able to screen millions of compounds within a very short period of time, and we're able to identify large numbers of compounds that will likely bind with high affinity to alpha-synuclein. We're also going to apply this same method to the development of other probes that are important but have presented challenges to the field."

Credit: 
University of Pennsylvania