Tech

A multipronged approach to addressing childhood adversity and promoting resilience

image: The Pediatric Academic Societies (PAS) Meeting connects thousands of pediatricians and other health care providers worldwide.

Image: 
PAS

A hot topic symposium session during the Pediatric Academic Societies (PAS) 2021 Virtual Meeting will discuss a multipronged approach to addressing childhood adversity and promoting resilience - at the clinical, systems, community and educational levels.

The effect of adverse childhood experiences (ACEs) on health outcomes across the lifespan is well recognized among pediatric practitioners. Increasing the ability of healthcare providers to recognize and respond to ACEs can buffer the long-term negative physical and mental health impacts of adversity and increase patient-centered care.

"In the era of COVID-19, employing a trauma-informed approach to care is of even graver importance because the reverberations of the overactivity of the biological stress response during this time will affect populations both in the near and distant future," said Binny Chokshi, MD. "Understanding the biological impact of stress and childhood adversity and recognizing ways in which to mitigate this impact and build resiliency is key. Our panel will serve this function."

For pediatric practitioners, there are multiple ways to address ACEs. This symposium will review approaches at the patient, systems (clinic/hospital), community and education level. It will also highlight the importance of interdisciplinary collaborations in moving this work forward.

At the patient level, the session will review the experience of Atrium Health Levine Children's Hospital as a pilot site for the National Pediatric Practice Community of the Center for Youth Wellness. Shivani Mehta, MD, MPH, will discuss the facilitators and barriers to ACE screening implementation in both academic and community primary care pediatric settings and review the use of resource referrals as a key intervention in promoting wellbeing and resilience.

At the clinic and hospital level, the Substance Abuse and Mental Health Services Administration delineates a framework to guide the creation of trauma-informed systems. Anita Shah, DO, MMS, MPH, will review the experience of Cincinnati Children's Hospital Medical Center in developing a trauma-informed strategic plan with multidisciplinary partners.

Community partnerships can be critical in securing resources to build resilience and prevent childhood adversity. Nia I. Bodrick, MD, MPH, FAAP, will highlight two exemplary community partnerships, the Early Childhood Innovation Network and the Building Communities Resilience National Coalition.

Lastly, education on ACEs and trauma-informed care is essential in assuring the sustainability and integration of approaches to confront adversity. Heather Forkey, MD, will describe the Pediatric Approach to Trauma, Treatment, and Resilience (PATTeR) program, supported by the American Academy of Pediatrics. The PATTeR program has trained over 400 pediatricians and clinic team members about childhood adversity and trauma-informed care.

Credit: 
Pediatric Academic Societies

Observation of antichiral edge states in a circuit lattice

image: (a) Photo of electric circuit in experiments. (b) The measured antichiral edge states. (c), (d) A Möbius strip configuration consisting of electric circuit and the propagation of antichiral edge states in experiments, respectively.

Image: 
© Science China Press

Originally formulated in the context of condensed matter physics, the Haldane model is an influential model of a two-dimensional topological insulator. It has also been realized in classical-wave metamaterial analogues of topological insulators, such as photonic crystals, acoustic crystals, and electric LC circuits.

Recently, theorists E. Colomés and M. Franz of the Department of Physics and Astronomy and the Quantum Matter Institute, University of British Columbia, have shown that a modification to the Haldane model exhibits the novel phenomenon of antichiral edge states. Unlike the chiral edge states associated with the standard Haldane model, antichiral edge states propagate in the same direction on opposite edges of a sample; the current carried by the edge states is compensated by counter-propagating bulk states. However, the modified Haldane model had thus far never been realized in any experimental setting. In condensed matter, it is extremely challenging to achieve because it requires an unusual configuration of magnetic vector potentials within each crystalline unit cell.
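To make the distinction concrete, here is a minimal numerical sketch (ours, not the authors' circuit model) of the bulk standard and modified Haldane Hamiltonians, with assumed hopping values t1, t2 and flux phi. It shows the feature that underlies antichiral edge states: in the modified model the two Dirac points shift to opposite energies, so the bulk supplies the counter-propagating states that compensate the co-propagating edge modes.

```python
import numpy as np

t1, t2, phi, M = 1.0, 0.1, np.pi / 2, 0.0      # assumed hoppings, Haldane flux, sublattice mass

# honeycomb geometry (nearest-neighbour distance = 1)
delta = np.array([[0.0, 1.0],
                  [-np.sqrt(3) / 2, -0.5],
                  [np.sqrt(3) / 2, -0.5]])       # A -> B nearest-neighbour vectors
b = np.array([delta[1] - delta[2],               # next-nearest-neighbour vectors
              delta[2] - delta[0],
              delta[0] - delta[1]])

def bands(kx, ky, modified):
    """Eigenvalues of the 2x2 bulk Bloch Hamiltonian at (kx, ky)."""
    k = np.array([kx, ky])
    f = t1 * np.sum(np.exp(1j * delta @ k))                  # nearest-neighbour term
    c = 2 * t2 * np.cos(phi) * np.sum(np.cos(b @ k))
    s = 2 * t2 * np.sin(phi) * np.sum(np.sin(b @ k))
    if modified:      # same NNN phase on both sublattices: energy tilt, no Haldane mass
        eA, eB = c + s, c + s
    else:             # standard Haldane: opposite phases give the mass term
        eA, eB = c + s, c - s
    H = np.array([[eA + M, f], [np.conj(f), eB - M]])
    return np.linalg.eigvalsh(H)

K = np.array([4 * np.pi / (3 * np.sqrt(3)), 0.0])            # Dirac point of the honeycomb lattice
for name, modified in (("standard", False), ("modified", True)):
    print(f"{name} Haldane model:  E(K) = {bands(*K, modified).round(3)},"
          f"  E(K') = {bands(*(-K), modified).round(3)}")
# standard: both valleys gapped symmetrically about E = 0 (chiral edge states)
# modified: gapless valleys pushed to opposite energies (tilted cones, antichiral edge states)
```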

An article coauthored by Profs. ZhiHong Hang and Yidong Chong, of Soochow University and Nanyang Technological University respectively, reports the first experimental realization of the modified Haldane model, built from an electrical circuit lattice. The study, entitled "Observation of antichiral edge states in a circuit lattice", was published in SCIENCE CHINA Physics, Mechanics & Astronomy. The researchers provide direct experimental evidence that the circuit lattice exhibits edge states and that their propagation is antichiral. Their experimental results are in good agreement not only with theory, but also with numerical circuit simulations. "It opens the door to further experimental explorations of the properties of antichiral edge states in a real physical system," the researchers stated.

"This work demonstrates the flexibility of electrical circuit lattices as an experimental platform for realizing lattice models with effective magnetic vector potentials", they explained. Previous researchers had shown how to use circuits to implement a very well-studied topological phase (the Chern insulator, i.e. the simplest two-dimensional topological insulator), whereas in their study they implemented a very novel and nontrivial model (the modified Haldane model).

These experiments also point to the intriguing possibility of using circuit lattices to explore models with Möbius strip geometries and other exotic configurations, which may be a rich avenue for future research.

Credit: 
Science China Press

Polarization-sensitive photodetection using 2D/3D perovskite heterostructure crystal

image: (a) Schematic structure of the polarized light detector. (b) Photoconductivity parallel and perpendicular to the interface. (c) Photoconductivity anisotropy versus excitation power. (d) Angle-resolved photocurrent as a function of polarization angle measured at 405 nm under zero bias. (e) Experimental polarization ratios of some reported polarized light detectors. (f) Angle-dependent photocurrent of the present device measured at different temperatures.

Image: 
© Science China Press

Polarization-sensitive photodetectors based on anisotropic semiconductors offer broad advantages in specialized applications such as astronomy, remote sensing, and polarization-division multiplexing. For the active layer of such detectors, recent research has focused on two-dimensional (2D) organic-inorganic hybrid perovskites, in which inorganic slabs and organic spacers are alternately stacked in parallel layers. Importantly, compared with inorganic 2D materials, the solution processability of hybrid perovskites makes it possible to obtain large crystals at low cost, offering exciting opportunities to exploit the crystals' out-of-plane anisotropy for polarization-sensitive photodetection. However, because the absorption anisotropy of the material structure is limited, the polarization sensitivity of such devices remains low. A new strategy for designing 2D hybrid perovskites with large anisotropy for polarization-sensitive photodetection is therefore urgently needed.

Heterostructures provide a clue to addressing this challenge. On the one hand, constructing heterostructures can improve the optical absorption and free-carrier densities of the composite. On the other hand, the built-in electric field at the heterojunction can spatially separate photogenerated electron-hole pairs, significantly reducing the recombination rate and further enhancing sensitivity. Constructing single-crystalline heterostructures of anisotropic 2D hybrid perovskites should therefore yield devices with high polarization sensitivity.

In a new research article published in the Beijing-based journal National Science Review, scientists at the Fujian Institute of Research on the Structure of Matter, Chinese Academy of Sciences, create a 2D/3D heterostructure crystal combining a 2D hybrid perovskite with its 3D counterpart, and achieve polarization-sensitive photodetection with record-high performance. Unlike previous work, devices based on the heterostructure crystal deliberately leverage both the anisotropy of the 2D perovskite and the built-in electric field of the heterostructure, permitting the first demonstration of a perovskite heterostructure-based polarization-sensitive photodetector that operates without an external power supply. Notably, the polarization sensitivity of the device surpasses that of all reported perovskite-based devices and is competitive with conventional inorganic heterostructure-based photodetectors. Further studies show that the built-in electric field formed at the heterojunction efficiently separates photogenerated excitons, reducing their recombination rate and thereby enhancing the performance of the resulting polarization-sensitive photodetector.
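For a rough sense of how a polarization ratio like the one in panels (d) and (e) is typically extracted, the sketch below runs a generic Malus-type fit on synthetic numbers; it is an illustration under assumed values, not the authors' measurement pipeline. The angle-resolved photocurrent is modeled as I(theta) = I_par*cos^2(theta) + I_perp*sin^2(theta), and the polarization ratio is I_par/I_perp.

```python
import numpy as np
from scipy.optimize import curve_fit

# Malus-type model for a linearly dichroic detector
def photocurrent(theta, i_par, i_perp):
    return i_par * np.cos(theta) ** 2 + i_perp * np.sin(theta) ** 2

# synthetic angle-resolved data standing in for a zero-bias measurement (arbitrary units)
theta = np.deg2rad(np.arange(0, 361, 10))
rng = np.random.default_rng(0)
measured = photocurrent(theta, 2.4, 1.0) + rng.normal(0, 0.03, theta.size)

(i_par, i_perp), _ = curve_fit(photocurrent, theta, measured, p0=(1.0, 1.0))
print(f"I_par = {i_par:.2f}, I_perp = {i_perp:.2f}, polarization ratio = {i_par / i_perp:.2f}")
```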

"High polarization sensitivity is successfully achieved in self-driven polarization-sensitive photodetector based on a single-crystalline 2D/3D hybrid perovskite heterostructure which is grown via a delicate solution method," the author claims, "This innovative study broadens the choice of materials that can be used for high-performance polarization-sensitive photodetectors, and correspondingly, the design strategies."

Credit: 
Science China Press

Study explores how private equity acquisitions impact hospitals

HOUSTON - (May 4, 2021) - Private equity investment in hospitals has grown substantially in the 21st century, and it accelerated in the years leading up to the COVID-19 pandemic. Now a new study of short-term acute care hospitals acquired by private equity firms finds they not only have higher markups and profit margins, they're also slower to expand their staffs.

In a study published in Health Affairs, a multi-institutional team of investigators led by Dr. Anaeze C. Offodile II, a nonresident scholar in the Center for Health and Biosciences at Rice University's Baker Institute for Public Policy, the Gilbert Omenn Fellow at the National Academy of Medicine and an assistant professor of plastic and reconstructive surgery at The University of Texas MD Anderson Cancer Center, examined private equity involvement in short-term acute care hospitals, combing through proprietary databases to identify private equity transactions involving such hospitals between 2003 and 2017.

"Hospital spending accounts for one-third of U.S. health care expenditures, and there have been multiple news accounts of private equity investments in hospitals. But we know so little about how private equity acquisition influences hospitals' finance and operations," said Marcelo Cerullo, a surgical resident at Duke University Medical Center and the second author on the paper.

The researchers compared private equity-acquired hospitals to hospitals not involved in private equity deals using information drawn from Medicare cost reports and Area Health Resource Files. The study examined differences among private equity versus nonprivate equity hospitals in 2003 (before the acquisitions occurred) and 2017 (after the private equity deals were completed).

Over that period, private equity firms acquired 282 unique hospitals (some more than once) in leveraged buyouts in 36 different states. The study estimates that of all patient discharges in 2017, about 11% were from a hospital that had at one point been acquired by private equity. In 2003, hospitals that were subsequently acquired by private equity investors had higher charge-to-cost ratios, higher profit margins and comparable staffing ratios. However, post-acquisition in 2017, private equity-acquired hospitals had developed higher markups and even higher profit margins, and saw slower growth in staffing ratios. Notably, private equity-acquired hospitals had lower overall costs per patient discharged.

"Our main findings highlight the overall solvency of private equity-acquired facilities, with strong baseline financial performance," Offodile said. "Our results challenge the prevailing narrative of financially distressed institutions seeking infusion of outside private equity capital. Post-acquisition, these hospitals appeared to continue to boost profits by restraining growth in cost per patient, in part by limiting staffing growth."

Hospitals acquired by private equity firms between 2003 and 2017 had operating margins that were 5.6 percentage points higher than those of non-private equity hospitals at the start of the period; that gap widened to 8.6 percentage points by 2017.
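Read as a simple before-and-after comparison, those figures imply the following; this is our arithmetic on the reported numbers, not the study's full regression analysis.

```python
# Operating-margin gap between private equity-acquired and other hospitals, in percentage points
gap_2003, gap_2017 = 5.6, 8.6                 # values reported in the study
print(f"Gap widened by {gap_2017 - gap_2003:.1f} percentage points between 2003 and 2017")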

The authors also surveyed press releases and industry-specific reports covering the private equity deals. Private equity firms often touted large capital investments that would maintain employment and commitments to the hospital's central mission. "We found a mixed track record of investors keeping specific promises after acquisition," Cerullo said. "Often, private equity firms would announce the dollar figure attached to renovations or new facilities, or give a time frame for staying open or maintaining raises."

The study forms a foundation for future investigations into how private equity acquisitions change hospital operations and clinical quality, the authors said.

"We need to conduct more detailed analyses before making specific policy recommendations involving private equity-acquired hospitals," said co-author Vivian Ho, a professor of economics at Rice and professor of medicine at Baylor College of Medicine. "For-profit hospitals have operated for several decades, but we need to determine whether the added presence of private equity induces additional efforts to maximize profits at the cost of harming patient welfare."

Credit: 
Rice University

New app makes Bitcoin more secure

image: "More than 90% of users are unaware of whether their wallet is violating this decentralized design principle based on the results of a user study," researchers said. And if an app violates this principle, it can be a huge security risk for the user.

Image: 
Creative commons via Pxhere

A computer science engineer at Michigan State University has a word of advice for the millions of bitcoin owners who use smartphone apps to manage their cryptocurrency: don't. Or at least, be careful. Researchers from MSU are developing a mobile app to act as a safeguard for popular but vulnerable "wallet" applications used to manage cryptocurrency.

"More and more people are using bitcoin wallet apps on their smartphones," said Guan-Hua Tu, an assistant professor in MSU's College of Engineering who works in the Department of Computer Science and Engineering. "But these applications have vulnerabilities."

Smartphone wallet apps make it easy to buy and trade cryptocurrency, a relatively new digital currency that can be challenging to understand in just about every way except one: It's very clearly valuable. Bitcoin was the most valuable cryptocurrency at the time of writing, with one bitcoin being worth more than $55,000.

But Tu and his team are uncovering vulnerabilities that can put a user's money and personal information at risk. The good news is that the team is also helping users better protect themselves by raising awareness about these security issues and developing an app that addresses those vulnerabilities.

The researchers showcased that app -- the Bitcoin Security Rectifier -- in a paper presented at the Association for Computing Machinery's Conference on Data and Application Security and Privacy. In terms of raising awareness, Tu wants to help wallet users understand that these apps can leave them vulnerable by violating one of Bitcoin's central principles, something called decentralization.

Bitcoin is a currency that's not tied to any central bank or government. There's also no central computer server that stores all the information about bitcoin accounts, such as who owns how much.

"There are some apps that violate this decentralized principle," Tu said. "The apps are developed by third parties. And, they can let their wallet app connect with their proprietary server that then connects to Bitcoin."

In essence, such third-party apps can introduce a middleman that Bitcoin omits by design. Users often don't know this, and app developers aren't necessarily forthcoming with the information.

"More than 90% of users are unaware of whether their wallet is violating this decentralized design principle based on the results of a user study," Tu said. And if an app violates this principle, it can be a huge security risk for the user. For example, it can open the door for an unscrupulous app developer to simply take a user's bitcoin.

Tu said that the best way users can safeguard themselves is to not use a smartphone wallet app developed by untrusted developers. He instead encourages users to manage their bitcoin using a computer -- not a smartphone -- and resources found on Bitcoin's official website, bitcoin.org. For example, the site can help users make informed decisions about wallet apps.

But even wallets developed by reputable sources may not be completely safe, which is where the new app comes in.

Most smartphone programs are written in a programming language called Java. Bitcoin wallet apps make use of a Java code library known as bitcoinj, pronounced "bitcoin jay." The library itself has vulnerabilities that cybercriminals could exploit, as the team demonstrated in its recent paper.

These attacks can have a variety of consequences, including compromising a user's personal information. For example, they can help an attacker deduce all the Bitcoin addresses that wallet users have used to send or receive bitcoin. Attacks can also send loads of unwanted data to a user, draining batteries and potentially resulting in hefty phone bills.

Tu's app is designed to run at the same time on the same phone as a wallet, where it monitors for signs of such intrusions. The app alerts users when an attack is happening and provides remedies based on the type of attack, Tu said. For example, the app can add "noise" to outgoing Bitcoin messages to prevent a thief from getting accurate information.

"The goal is that you'll be able to download our tool and be free from these attacks," Tu said.

The team is currently developing the app for Android phones and plans to have it available for download in the Google Play app store in the coming months. There's currently no timetable for an iPhone app because of the additional challenges and restrictions posed by iOS, Tu said.

In the meantime, though, Tu emphasized that the best way users can protect themselves from the insecurities of a smartphone bitcoin wallet is simply by not using one, unless the developer is trusted.

"The main thing that I want to share is that if you do not know your smartphone wallet applications well, it is better not to use them since any developer -- malicious or benign -- can upload their wallet apps to Google Play or Apple App Store," he said.

Credit: 
Michigan State University

New application of AI just removed one of the biggest roadblocks in astrophysics

image: Simulations of a region of space 100 million light-years square. The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled the low-res model to create a high-resolution simulation (right). That simulation captures the same details as a conventional high-res model (middle) while requiring significantly fewer computational resources.

Image: 
Y. Li et al./Proceedings of the National Academy of Sciences 2021

Using a bit of machine learning magic, astrophysicists can now simulate vast, complex universes in a thousandth of the time it takes with conventional methods. The new approach will help usher in a new era in high-resolution cosmological simulations, its creators report in a study published online May 4 in Proceedings of the National Academy of Sciences.

"At the moment, constraints on computation time usually mean we cannot simulate the universe at both high resolution and large volume," says study lead author Yin Li, an astrophysicist at the Flatiron Institute in New York City. "With our new technique, it's possible to have both efficiently. In the future, these AI-based methods will become the norm for certain applications."

The new method developed by Li and his colleagues feeds a machine learning algorithm with models of a small region of space at both low and high resolutions. The algorithm learns how to upscale the low-res models to match the detail found in the high-res versions. Once trained, the code can take full-scale low-res models and generate 'super-resolution' simulations containing up to 512 times as many particles.

The process is akin to taking a blurry photograph and adding the missing details back in, making it sharp and clear.

This upscaling brings significant time savings. For a region in the universe roughly 500 million light-years across containing 134 million particles, existing methods would require 560 hours to churn out a high-res simulation using a single processing core. With the new approach, the researchers need only 36 minutes.

The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large with 134 billion particles, the researchers' new method took 16 hours on a single graphics processing unit. Existing methods would take so long that they wouldn't even be worth running without dedicated supercomputing resources, Li says.

Li is a joint research fellow at the Flatiron Institute's Center for Computational Astrophysics and the Center for Computational Mathematics. He co-authored the study with Yueying Ni, Rupert Croft and Tiziana Di Matteo of Carnegie Mellon University; Simeon Bird of the University of California, Riverside; and Yu Feng of the University of California, Berkeley.

Cosmological simulations are indispensable for astrophysics. Scientists use the simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations may then confirm whether the simulations' predictions match reality. Creating testable predictions requires running simulations thousands of times, so faster modeling would be a big boon for the field.

Reducing the time it takes to run cosmological simulations "holds the potential of providing major advances in numerical cosmology and astrophysics," says Di Matteo. "Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes."

So far, the new simulations only consider dark matter and the force of gravity. While this may seem like an oversimplification, gravity is by far the universe's dominant force at large scales, and dark matter makes up about 85 percent of all the matter in the cosmos. The particles in the simulation aren't literal dark matter particles but are instead used as trackers to show how bits of dark matter move through the universe.

The team's code used neural networks to predict how gravity would move dark matter around over time. Such networks ingest training data and run calculations using the information. The results are then compared to the expected outcome. With further training, the networks adapt and become more accurate.

The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution models. The other network tries to tell those simulations apart from ones made by conventional methods. Over time, both neural networks get better and better until, ultimately, the simulation generator wins out and creates fast simulations that look just like the slow conventional ones.
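The adversarial loop itself is compact. The toy PyTorch skeleton below is a generic sketch of the two alternating updates described above, not the authors' code; random tensors stand in for matched low- and high-resolution density fields, and the network sizes are arbitrary.

```python
import torch
import torch.nn as nn

# generator: upsample a low-res field; discriminator: judge whether a high-res field is real
generator = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Upsample(scale_factor=2, mode="nearest"),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
discriminator = nn.Sequential(
    nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Flatten(), nn.Linear(64 * 16 * 16, 1),
)
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    low_res = torch.randn(8, 1, 32, 32)      # stand-ins for matched training pairs
    high_res = torch.randn(8, 1, 64, 64)

    # discriminator update: real fields -> 1, generated fields -> 0
    fake = generator(low_res).detach()
    d_loss = bce(discriminator(high_res), torch.ones(8, 1)) + \
             bce(discriminator(fake), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # generator update: try to make the discriminator label its output as real
    g_loss = bce(discriminator(generator(low_res)), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```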

"We couldn't get it to work for two years," Li says, "and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn't tell which one was 'real' and which one was 'fake.'"

Despite only being trained using small areas of space, the neural networks accurately replicated the large-scale structures that only appear in enormous simulations.

The simulations don't capture everything, though. Because they focus only on dark matter and gravity, smaller-scale phenomena -- such as star formation, supernovae and the effects of black holes -- are left out. The researchers plan to extend their methods to include the forces responsible for such phenomena, and to run their neural networks 'on the fly' alongside conventional simulations to improve accuracy. "We don't know exactly how to do that yet, but we're making progress," Li says.

Credit: 
Simons Foundation

Machine learning accelerates cosmological simulations

image: The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled the low-res model to create a high-resolution simulation (right). That simulation captures the same details as a conventional high-res model (middle) while requiring significantly fewer computational resources.

Image: 
Y. Li et al./Proceedings of the National Academy of Sciences 2021

A universe evolves over billions upon billions of years, but researchers have developed a way to create a complex simulated universe in less than a day. The technique, published in this week's Proceedings of the National Academy of Sciences, brings together machine learning, high-performance computing and astrophysics and will help to usher in a new era of high-resolution cosmology simulations.

Cosmological simulations are an essential part of teasing out the many mysteries of the universe, including those of dark matter and dark energy. But until now, researchers faced the common conundrum of not being able to have it all -- simulations could focus on a small area at high resolution, or they could encompass a large volume of the universe at low resolution.

Carnegie Mellon University Physics Professors Tiziana Di Matteo and Rupert Croft, Flatiron Institute Research Fellow Yin Li, Carnegie Mellon Ph.D. candidate Yueying Ni, University of California Riverside Professor of Physics and Astronomy Simeon Bird and University of California Berkeley's Yu Feng surmounted this problem by teaching a machine learning algorithm based on neural networks to upgrade a simulation from low resolution to super resolution.

"Cosmological simulations need to cover a large volume for cosmological studies, while also requiring high resolution to resolve the small-scale galaxy formation physics, which would incur daunting computational challenges. Our technique can be used as a powerful and promising tool to match those two requirements simultaneously by modeling the small-scale galaxy formation physics in large cosmological volumes," said Ni, who performed the training of the model, built the pipeline for testing and validation, analyzed the data and made the visualization from the data.

The trained code can take full-scale, low-resolution models and generate super-resolution simulations that contain up to 512 times as many particles. For a region in the universe roughly 500 million light-years across containing 134 million particles, existing methods would require 560 hours to churn out a high-resolution simulation using a single processing core. With the new approach, the researchers need only 36 minutes.
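For context, the reported runtimes imply a speedup of roughly three orders of magnitude; the snippet below is simple arithmetic on the numbers quoted above, not a figure from the paper.

```python
hours_conventional = 560      # single-core conventional high-resolution run
minutes_new = 36              # new super-resolution approach
print(f"~{hours_conventional * 60 / minutes_new:.0f}x faster")   # ~933x
```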

The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large with 134 billion particles, the researchers' new method took 16 hours on a single graphics processing unit. Using current methods, a simulation of this size and resolution would take a dedicated supercomputer months to complete.

Reducing the time it takes to run cosmological simulations "holds the potential of providing major advances in numerical cosmology and astrophysics," said Di Matteo. "Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes."

Scientists use cosmological simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations then confirm whether the simulations' predictions match reality.

"With our previous simulations, we showed that we could simulate the universe to discover new and interesting physics, but only at small or low-res scales," said Croft. "By incorporating machine learning, the technology is able to catch up with our ideas."

Di Matteo, Croft and Ni are part of Carnegie Mellon's National Science Foundation (NSF) Planning Institute for Artificial Intelligence in Physics, which supported this work, and members of Carnegie Mellon's McWilliams Center for Cosmology.

"The universe is the biggest data sets there is -- artificial intelligence is the key to understanding the universe and revealing new physics," said Scott Dodelson, professor and head of the department of physics at Carnegie Mellon University and director of the NSF Planning Institute. "This research illustrates how the NSF Planning Institute for Artificial Intelligence will advance physics through artificial intelligence, machine learning, statistics and data science."

"It's clear that AI is having a big effect on many areas of science, including physics and astronomy," said James Shank, a program director in NSF's Division of Physics. "Our AI planning Institute program is working to push AI to accelerate discovery. This new result is a good example of how AI is transforming cosmology."

To create their new method, Ni and Li harnessed these fields to create a code that uses neural networks to predict how gravity moves dark matter around over time. The networks take training data, run calculations and compare the results to the expected outcome. With further training, the networks adapt and become more accurate.

The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution models. The other network tries to tell those simulations apart from ones made by conventional methods. Over time, both neural networks get better and better until, ultimately, the simulation generator wins out and creates fast simulations that look just like the slow conventional ones.

"We couldn't get it to work for two years," Li said, "and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn't tell which one was 'real' and which one was 'fake.'"

Despite only being trained using small areas of space, the neural networks accurately replicated the large-scale structures that only appear in enormous simulations.

The simulations didn't capture everything, though. Because they focused on dark matter and gravity, smaller-scale phenomena -- such as star formation, supernovae and the effects of black holes -- were left out. The researchers plan to extend their methods to include the forces responsible for such phenomena, and to run their neural networks 'on the fly' alongside conventional simulations to improve accuracy.

Credit: 
Carnegie Mellon University

Without commuter traffic, pandemic-era drivers are speeding up, increasing noise pollution

image: Boston University biologists Richard Primack and Carina Terry are pictured here collecting noise samples at Hall's Pond Sanctuary in Brookline, Massachusetts.

Image: 
Photo courtesy of Richard Primack

As pandemic lockdowns went into effect in March 2020 and millions of Americans began working from home rather than commuting to offices, heavy traffic in America’s most congested urban centers—like Boston—suddenly ceased to exist. Soon afterwards, the air was noticeably cleaner. But that wasn’t the only effect. A team of Boston University biologists who study how human-related sounds impact natural environments seized the opportunity to learn how the reduced movement of people would impact local ecosystems. They found—surprisingly—that sound levels increased in some nature conservation areas, a result of cars driving faster on roads no longer choked by traffic.

BU ecologist Richard Primack and Carina Terry, an undergraduate student working in Primack’s research lab, ventured into Boston-area parks, iPhones in hand, to take environmental sound recordings to see how sound levels had changed in comparison to pre-pandemic times, when there were more people out and about, construction underway, and cars on the road. Primack, a BU College of Arts & Sciences professor of biology, has studied noise pollution for over four years and has trained over a hundred students and citizen conservationists to collect noise samples in nature sanctuaries across Massachusetts. 

The team focused their study on three locations in Massachusetts: Hammond Pond Reservation in Newton, Hall’s Pond Sanctuary in Brookline, and Blue Hills Reservation—by far the largest of the three—which covers parts of Milton, Quincy, Braintree, Canton, Randolph, and Dedham. They collected noise samples from all three parks using a specialized sound-sensing app on iPhones, called SPLnFFT. Then, by referencing the Primack lab’s huge library of previously collected sound data, the study authors compared sound levels collected in the months during the pandemic to measurements collected before the pandemic began. The resulting paper was recently published in the journal Biological Conservation.

They found that Hammond Pond Reservation and Hall’s Pond Sanctuary, both located in suburban residential areas, had lower levels of noise. But at Blue Hills Reservation, they found the opposite—sound levels increased substantially in all areas of the park, “which was very surprising,” Terry says. Blue Hills is a popular destination for local hikes and is intersected by several major highways and roadways. While there are fewer cars on the roads these days, the researchers say their sound recordings indicate cars are moving much faster, generating more noise. This finding aligns with a trend observed nationwide—the pandemic has seen traffic jams replaced with increased reports of recklessly fast drivers speeding on open roadways.

“Before the pandemic, traffic was going relatively slow on [I-93] because it was so congested,” says Primack, the study’s senior author. Now, noise from faster-moving cars is “penetrating the entire park,” he says, measuring about five decibels noisier, even in the interior of the park, compared to pre-pandemic times.
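Because decibels are logarithmic, a five-decibel rise is larger than it sounds: it corresponds to roughly a threefold increase in sound intensity. The snippet below is standard acoustics arithmetic, not a figure from the study.

```python
delta_db = 5
print(f"Intensity ratio for a {delta_db} dB increase: {10 ** (delta_db / 10):.2f}x")   # ~3.16x
```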

“It’s not so much the [number] of cars, but the speed,” says Terry, the study’s lead author. The study grew out of her undergraduate honors thesis in the department of earth and environment and the Kilachand Honors College, from which she graduated in 2020, and won her the Francis Bacon Award for Writing Excellence in the Natural Sciences.

For animals, road noise (and other forms of noise pollution like leaf blowers and airplanes overhead) can interfere with their ability to hear threats and communicate with each other, especially for certain birds who are vulnerable to predators or who have calls that can’t penetrate through the noise. Noise pollution can then impact which species are able to survive in areas with high noise levels from human activity. 

“There’s an increasing volume of studies that say wildlife is very sensitive to noise pollution,” Primack says. “Animals rely strongly on their hearing for detecting predators and social interactions.” 

“The big impact [of noise pollution] is the filtering out of which species can live in an area, because if you have a species you need to conserve, you can’t conserve them if they won’t be able to survive in a loud area, or if the conservation area is right by a road,” Terry says. 

There are also well-measured health effects of noise pollution on people, according to the researchers, including elevated blood pressure, heart attacks, inability to sleep, increasing irritability, mood changes, and anxiety. 

“When you’re [recreating] in a protected [nature conservation] area, people want to relax and experience a natural environment especially after being in the city all day,” Primack says. “If people are hearing a lot of noise, it means they can’t get the rejuvenating effects of the park.” 

Primack and his lab will continue to measure noise pollution levels in Boston-area parks and around BU’s campus, documenting how noise levels change as vaccinated people begin to repopulate offices, drive more, and resume more normal activities. Terry is applying to graduate school, where she hopes to pursue further research on wildlife ecology and human impacts on the environment.

And for nature lovers behind the wheel, the takeaway from the study is clear: slow down.

Journal

Biological Conservation

DOI

10.1016/j.biocon.2021.109039

Credit: 
Boston University

Powering Discovery: A new expert panel report from the CCA

image: The Expert Panel on International Practices for Funding Natural Science and Engineering Research

Image: 
Council of Canadian Academies (CCA)

Research funding agencies around the world are testing creative approaches to address urgent needs while laying the foundation for discoveries that will meet the unpredictable demands of the future. According to a new expert panel report from the Council of Canadian Academies (CCA), Canada can bolster its research capacity by reducing administrative burdens, experimenting with funding approaches, and cultivating a robust, resilient, and diverse scientific workforce.

"In the past year we have seen the power and promise of transformative research and the ability of researchers and funding organizations to pivot in times of crisis," said Shirley M. Tilghman, PhD, O.C., FRS, Chair of the Expert Panel. "But the pandemic has also exacerbated existing inequalities within the research community and highlighted the challenge funders face in attempting to balance immediate needs with preserving support for fundamental, curiosity-driven research. This report highlights many practices used internationally to address these and other issues."

The Panel found that segmenting awards by career stage can help reduce barriers for early career researchers and improve support for researchers across their careers, while longer duration grants and support for collaboration benefit high-risk and interdisciplinary research. Experimentation with alternative funding practices, such as short pre-applications and partial lotteries, may also allow funders to reduce burdens on grant applicants and reviewers.

Funding agencies also play an important role in increasing diversity in the professoriate and attracting and retaining research talent. Promising practices to support equity, diversity, and inclusion (EDI) in the research community include explicit diversity targets, dedicated funding programs, equality charters, and initiatives to reduce bias in peer review.

"Funding agencies serve a vital function by channeling public investments into new discoveries and research to address pressing societal needs," said Eric M. Meslin, PhD, FRSC, FCAHS, President and CEO of the CCA. "We anticipate that this report will help to inform their decisions when considering how to structure their funding programs and maximize their impact."

The Natural Sciences and Engineering Research Council of Canada (NSERC) asked the CCA for an evidence-based, independent assessment that examines successful international practices for funding natural sciences and engineering (NSE) research and how these could be applied in Canada. Powering Discovery explores funding practices that could improve support for researchers across their careers, enhance EDI, and support interdisciplinary and high-risk research. It also sheds light on novel approaches for increasing funding efficiency and promoting and measuring impact.

Visit http://www.cca-reports.ca to download the report.

Credit: 
Council of Canadian Academies

3D 'lung-on-a-chip' model developed to test new therapies for COVID-19 and other lung conditions

video: 3D and 2D visuals

Image: 
Brigham and Women's Hospital

First-of-its-kind model replicates human alveolar lung tissue

Allows researchers to study effects of COVID-19 on cell growth and development

Provides insight as to how various drugs impact viral spread

Globally, lung failure is one of the leading causes of death. Many conditions can affect and damage the lungs, including asthma, chronic obstructive pulmonary disease, influenza, pneumonia, and, most recently, COVID-19. To better understand respiratory diseases and develop new drugs faster, investigators from Brigham and Women's Hospital designed a 3D "lung-on-a-chip" model of the distal lung and alveolar structures, the tiny air sacs that take in oxygen as you breathe. With this innovation, researchers are actively studying how COVID-19 viral particles travel through airways and impact pulmonary cells. Notably, this technology enables scientists to investigate how various COVID-19 therapies, such as remdesivir, impact the replication of the virus. Their results are published in Proceedings of the National Academy of Sciences.

"We believe that it is a true innovation," said Y. Shrike Zhang, PhD, associate bioengineer in the Brigham's Department of Medicine and Division of Engineering in Medicine. "This is a first-of-its-kind in vitro model of the human lower lung that can be used to test many of the biological mechanisms and therapeutic agents, including anti-viral drugs for COVID-19 research."

Understanding and developing treatments for COVID-19 requires human clinical trials, which are time- and resource-intensive. With better laboratory models, such as the lung-on-a-chip, researchers may be able to evaluate drugs much faster and help select the drug candidates most likely to succeed in clinical trials.

Zhang and colleagues developed this technology to mirror the biological characteristics of the human distal lung. Previous models have been based on flat surfaces and oftentimes made with plastic materials, which do not incorporate the curvature of the alveoli and are much stiffer than the human tissue. Researchers created this new model with materials more representative of human alveolar tissue and stimulated cell growth within these 3D spaces.

In testing the model's effectiveness, researchers found that the 3D alveolar lung effectively grew cells over multiple days and that these cells adequately populated airway surfaces. Through genome sequencing, scientists observed that the alveolar lung model more closely resembled the human distal lung than previous 2D models did. Additionally, the lung-on-a-chip model successfully simulated breaths of air at the normal human breathing frequency.

Beyond COVID-19, Zhang's research team intends to use this technology to study a broad range of pulmonary conditions, including various lung cancers. To replicate smoking's impact on the lungs, scientists allowed smoke to seep into the model's air chambers, then simulated a breathing event that moved the smoke deeper into the model lung. From there, they measured the smoke's impact and the cell damage it caused.

While this innovation holds the potential to vastly expand the possibilities of studying and treating pulmonary diseases, this model is still in its early stages, said Zhang. Currently, the alveolar lung-on-a-chip only incorporates two out of the 42 cell types existing in the lung. In the future, researchers hope to incorporate more cell types into the model to make it more clinically representative of human lungs.

Going forward, Zhang also hopes to study how COVID-19 variants may travel through airways and impact pulmonary cells, and how they respond to COVID-19 therapies. He believes that using this model in tandem with other 3D organ models, such as the intestines, could enable researchers to study how oral drugs impact cells in the lower lungs. Zhang also hopes that in the future, this technology could be implemented to urgently understand and develop treatments for emerging contagious diseases.

"In terms of COVID-19, we've had very minimal timelines for developing therapies. In the future, if we have these models ready in hand, we can easily use them to study and test therapeutics in urgent situations where clinical trials are limited," said Zhang.

Credit: 
Brigham and Women's Hospital

A physics perspective on wound healing

image: Fluorescent microscopy image of a proliferating cell front, whose edge is indicated in green. The rat epithelial cells, with cytoplasm (blue) and nuclei (red), form a 2-dimensional culture which can be used to model wound healing.

Image: 
© Guillaume Rapin, UNIGE

In material physics understanding how systems interact across the interfaces separating them is of central interest. But can physical models clarify similar concepts in living systems, such as cells? Physicists at the University of Geneva (UNIGE), in collaboration with the University of Zurich (UZH), used the framework of disordered elastic systems to study the process of wound healing - the proliferation of cell fronts which eventually join to close a lesion. Their study identified the scales of the dominant interactions between cells which determine this process. The results, published in the journal Scientific Reports, will allow better analysis of cell front behaviour, in terms of both wound healing and tumour development. In the future, this approach may offer personalised diagnostics to classify cancers and better target their treatment, and identify new pharmacological targets for transplantation.

By focusing on macroscopic properties of large datasets, statistical physics makes it possible to extract an overview of system behaviour independent of its specific microscopic character. Applied to biological elements, such as the cell fronts bordering a wound, this approach makes it possible to identify the various interactions which play a defining role during tissue growth, differentiation, and healing, but above all to highlight their hierarchy at the different scales observed. Patrycja Paruch, professor in the Department of Quantum Matter Physics at the UNIGE Faculty of Science, explains: "For cancer tumour invasion, or in the event of a wound, cell front proliferation is crucial, but the speed and morphology of the front is highly variable. However, we believe that only a few dominant interactions during this process will define the dynamics and the shape - smooth or rough, for example - of the cell colony edge. Experimental observations across multiple lengthscales to extract general behaviours can allow us to identify these interactions in healthy tissue and diagnose at what level pathological changes can occur, to help combat them. This is where statistical physics comes in."

The many scales of wound healing

In this multidisciplinary study, the UNIGE physicists collaborated with the team of Professor Steven Brown from the UZH. Using rat epithelial cells, they established flat colonies (2D) in which the cells grow around a silicone insert, subsequently removed to mimic an open lesion. The cell fronts then proliferate to fill the opening and heal the tissue. "We reproduced five possible scenarios by 'handicapping' the cells in different ways, in order to see what impact this has on wound healing, i.e. on the speed and roughness of the cell front", explains Guillaume Rapin, a researcher in Patrycja Paruch's team. The idea is to see what happens in normal healthy tissue, or when processes such as cell division and communication between neighbouring cells are inhibited, when cell mobility is reduced or when cells are permanently pharmacologically stimulated. "We took some 300 images every four hours for about 80 hours, which allowed us to observe the proliferating cell fronts at very different scales", continues Guillaume Rapin. "By applying high-performance computational techniques, we were able to compare our experimental observations with the results of numerical simulations", adds Nirvana Caballero, another researcher in Patrycja Paruch's team.

Zooming out for greater effect

The scientists observed two distinct roughness regimes: at less than 15 micrometres, below the size of a single cell, and between 80 and 200 micrometres, when several cells are involved. "We have analysed how the roughness exponent evolves over time to reach its natural dynamic equilibrium, depending on the pharmacochemical conditions we have imposed on the cells, and how this roughness increases depending on the scale at which we look", emphasises Nirvana Caballero. "In a system with a single dominant interaction, we expect to see the same roughness exponent at all scales. Here, we see a changing roughness if we look at the scale of one cell or of ten cells."
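One standard way to quantify this kind of scale-dependent roughness is a local-width analysis: measure the interface width W inside windows of size L and fit the scaling W(L) ~ L^alpha; a change in the fitted exponent between small and large windows signals different dominant interactions at those scales. The sketch below runs this generic analysis on a synthetic profile and is not the study's exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.cumsum(rng.normal(size=4096))          # synthetic rough front (random walk, alpha ~ 0.5)

def local_width(profile, window):
    """RMS width of the front measured in non-overlapping windows of a given size."""
    n = len(profile) // window
    segs = profile[: n * window].reshape(n, window)
    return np.sqrt(np.mean((segs - segs.mean(axis=1, keepdims=True)) ** 2))

windows = np.array([8, 16, 32, 64, 128, 256])
widths = np.array([local_width(h, w) for w in windows])
alpha = np.polyfit(np.log(windows), np.log(widths), 1)[0]   # log-log slope
print(f"Estimated roughness exponent alpha ~ {alpha:.2f}")
```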

The Geneva and Zurich teams revealed only minor variations in the roughness exponent below 15 micrometres, whatever the conditions imposed on the cell fronts. On the other hand, they found that between 80 and 150 micrometers, the roughness is altered by all pharmacological inhibitors, significantly reducing the roughness exponent. Moreover, they observed that proliferation speed varied greatly between the different pharmacochemical conditions, slowing when cell division and motility were inhibited, and accelerating when cells were stimulated. "More surprisingly, the fastest proliferation speed was achieved when gap-junction communication between cells was blocked", says Guillaume Rapin. This observation suggests that such communication may be targeted in future therapies, either to promote healing of burns or wounds, or to slow cancer tumour invasion.

These results show that medium-scale interactions play a crucial role in determining the healthy proliferation of a cell front. "We now know at what scale biologists should look for problematic behaviour of cell fronts, which can lead to the development of tumours," says Nirvana Caballero. Scientists will now be able to focus on these key lengthscales to probe tumour cell fronts and directly compare their pathological interactions with those of healthy cells.

Credit: 
Université de Genève

Lead found in rural drinking water supplies in West Africa

Scientists are warning that drinking water supplies in parts of rural West Africa are being contaminated by lead-containing materials used in small community water systems such as boreholes with handpumps and public taps.

They analysed scrapings taken from the plumbing of 61 community water supply systems in Ghana, Mali and Niger. Eighty percent of the tested systems had at least one component that contained lead in excess of international guidance.

Lead is released into the water when the components corrode.

The study, by a research team from the University of Leeds, University of North Carolina at Chapel Hill and Boston University, also took samples of the water from those 61 water distribution systems, and from a further 200 taps and boreholes with handpumps.

Sixty percent of the samples contained lead - nine percent were at a level that exceeded World Health Organisation guidelines.

The researchers found that lead contamination was significantly associated with the use of lead-containing components in the water systems.

There is no known safe level of exposure to lead. It accumulates in the body, crosses the blood-brain barrier and can cause irreversible damage to cognitive and neurological development, particularly in children and babies in the womb.

Lead contamination in plumbing systems has been a recognised problem for decades. In urban areas served by large piped water systems it has been controlled by implementing corrosion control and using lead-free or low-lead components, enforced through testing and monitoring, building codes and regulations.

Evidence shows there is still a problem in higher-income countries with lead contamination in water from private wells and small, piped systems. The picture in low and middle-income countries has been less well studied, although the problem is believed to be widespread - and the potential implications for public health are much greater because of the global scale and the number of people who rely on small community water-supply systems.

In Sub-Saharan Africa alone, it is estimated that 184 million people use boreholes with handpumps to access water and 112 million people use rural piped supplies.

'Opportunity for effective prevention'

Jamie Bartram, Professor of Public Health and Environment at the University of Leeds, who supervised the latest research, said the evidence demonstrated the need for coordinated and urgent remedial action.

"We have an opportunity for effective prevention and improved water supply practice world-wide. The required actions are overdue and unquestionably beneficial. The cost of ensuring that components are lead-safe is negligible," he said.

"Using certified-safe components has multiple benefits, minimizing the risk to other hazardous contaminants. In contrast, delay carries further disease burden, increases the ultimate cost of protecting populations, and accumulates remediation burdens for future generations."

The study, Occurrence of Lead and other Toxic Metals Derived from Drinking-Water Systems in three West African Countries, is published in the journal Environmental Health Perspectives.

The International Plumbing Code (IPC), from the International Code Council, recommends that lead in a plumbing component should not exceed 0.25 percent, based on weight.

Of the 130 plumbing components tested by the research team, 82 percent had a lead level that exceeded the IPC recommended maximum. Brass components were the most problematic. The researchers say the use of brass in a water system increased the expected lead concentrations in drinking-water samples by a factor of 3.8.

Where drinking water was contaminated, the mean value of lead in the water was approximately 8 micrograms per litre of water - where a microgram is one-millionth of a gram. The individual values (the 95 percent confidence limits) ranged from 0.5 micrograms/litre to 15 micrograms/litre. The World Health Organisation guideline value is 10 micrograms/litre.

Dr Michael Fisher, Assistant Professor at the University of North Carolina at Chapel Hill, led the study. He said: "It is clear that lead is present in most tested systems in this study and finds its way into drinking water at levels of concern.

"These findings suggest several affordable, feasible, no-regrets opportunities to reduce or prevent this contamination from occurring. Collaboration among multiple stakeholders will be required to achieve rapid success.

"Lead exposure from other sources like paint and petrol has been successfully phased out and lead can be successfully eliminated from drinking water systems through concerted and collaborative responses."

Need for coordinated action

The scientists say manufacturers could discourage the use of unsuitable components, for example through explicit labelling and engagement in professional networks.

They write in the paper: "This contamination may be readily addressed through cost-effective preventive action, such as consistent use of components and materials compliant with IPC codes. Supply chain improvements with verification of compliance would reduce the availability and use of unsuitable components, such as leaded brass parts, in drinking-water systems.

"Governments may develop or update regulations related to lead-free water system components and their implementation, including verification schemes."

The research team say importers and wholesalers should ensure that product suitability and specifications are conspicuous and intelligible. Professional associations should disseminate knowledge and foster understanding and good practices throughout their memberships. Several governments are already taking action.

The United Nations Sustainable Development Goal SDG 6 states that everybody should have access to safe and affordable drinking water.

Credit: 
University of Leeds

NTU Singapore scientists invent catheter system to deliver electricity-activated glue patch

video: A team of researchers led by Nanyang Technological University, Singapore (NTU Singapore) has developed a device that offers a quicker and less invasive way to seal tears and holes in blood vessels, using an electrically-activated glue patch applied via a minimally invasive balloon catheter.

This device could eventually replace the need for open or keyhole surgery to patch up or stitch together internal blood vessel defects.

Image: 
NTU Singapore

A team of researchers led by Nanyang Technological University, Singapore (NTU Singapore) has developed a device that offers a quicker and less invasive way to seal tears and holes in blood vessels, using an electrically-activated glue patch applied via a minimally invasive balloon catheter.

This device could eventually replace the need for open or keyhole surgery to patch up or stitch together internal blood vessel defects.

After inserting the catheter into an appropriate blood vessel, the glue patch - nicknamed 'Voltaglue' - can be guided through the body to where the tear is located and then activated using retractable electrodes to glue it shut in a few minutes, all without making a single surgical cut.

Patented by NTU and Massachusetts Institute of Technology (MIT) scientists, Voltaglue is a new type of adhesive that works in wet environments and hardens when a voltage is applied to it.

The catheter device that deploys Voltaglue is jointly developed by Associate Professor Terry Steele from the NTU School of Materials Science and Engineering, former NTU PhD student Dr Manisha Singh, now at MIT, and Associate Professor Ellen Roche from the Department of Mechanical Engineering and Institute for Medical Engineering and Science at MIT, USA.

This catheter device is the first proof-of-concept application of Voltaglue in a medical setting since it was invented by Assoc Prof Steele in 2015.

Their research was published in the peer-reviewed scientific journal Science Advances in April.

Assoc Prof Steele said: "The system that we developed is potentially the answer to the currently unmet medical need for a minimally-invasive technique to repair arteriovenous fistulas (an abnormal connection between an artery and a vein) or vascular leaks, without the need for open surgery. With Voltaglue and the catheter device, we open up the possibility of not having to make surgical incisions to patch something up inside - we can send a catheter-based device through to do the job."

A new way to mend broken blood vessels

The catheter system is made up of two components:

i) ePATCH, an adhesive patch containing Voltaglue that is applied to the catheter's balloon; and

ii) CATRE, a modified catheter with retractable wires that carry an electrical current.

The team showed in lab experiments on a pig's heart that the Voltaglue patch can be safely and effectively administered in a variety of situations, including withstanding the high pulsatile pressure of blood in arteries like the aorta.

The device was used to close a 3 mm defect in an explanted pig aorta connected to a mock heart under a continuous blood flow of 10 ml per minute.

The flexible catheter is first inserted and guided through the blood vessel. Once at the site of the break, the balloon is expanded so that the injury is covered by the Voltaglue patch.

A small electrical charge is sent through the two wires to activate the patch. The glue's hardness can be adjusted by changing the amount of voltage applied to it, a process called electrocuring. This allows the patch to adapt to various types of tissue surfaces, from relatively smooth aortic tissue to more irregular, uneven surfaces of synthetic vascular grafts.

The patch starts to set after 20 seconds and fully hardens in 3 to 5 minutes. Upon hardening, the patch effectively 'glues' the broken vessel together, thereby sealing the two broken ends shut. The wires, deflated balloon, and catheter are then withdrawn.

In this experiment, the team left the patch on the pig heart for 1,000 physiological stress/strain cycles (heartbeats), which, at 70 beats per minute, was around 15 minutes. When the aorta was examined after the experiment, the patch was found to be still successfully sealing the gap.
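
As a quick check of that figure, the test duration follows directly from the cycle count and heart rate; the short Python sketch below reproduces the arithmetic using the values quoted above.

```python
# Reproduce the durability-test arithmetic described above.
cycles = 1000            # physiological stress/strain cycles (heartbeats)
beats_per_minute = 70    # heart rate assumed in the experiment

duration_minutes = cycles / beats_per_minute
print(f"{cycles} cycles at {beats_per_minute} bpm last about {duration_minutes:.1f} minutes")
# Output: 1000 cycles at 70 bpm last about 14.3 minutes, i.e. roughly 15 minutes
```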

The paper's first author Dr Manisha Singh, formerly from NTU School of Materials Science and Engineering, said: "Voltaglue is unlike other adhesives in the market as it is voltage-activated, is stable in wet environments and can stick onto soft tissue, making it suitable and effective for repairing blood vessels. By combining it with existing, commercially available catheters, we have developed a new delivery mechanism that is minimally invasive, yet flexible and adaptable. This system shows promise for a diverse range of medical applications, as the suitability of the patch could be tailored according to the needs of the patient."

A safe way to patch up tears in a variety of organs and vessels

The catheter is designed for use in vessels ranging from 7.5 to 30 mm in size, making it suitable for sealing defects in organs and vessels such as the aorta, intestine, and oesophagus.

Both Voltaglue and the patch are made from bioresorbable materials, which are entirely degradable and dissolve after a few weeks.

These properties make the catheter suitable for potential applications such as vascular grafting, a common surgical procedure to redirect blood flow from one area to another, or to seal off blood flow to tumours, in order to kill them off.

Giving an independent comment on this innovation, Associate Professor Andrew Chin, senior consultant in the Department of Hand and Reconstructive Microsurgery at Singapore General Hospital, said: "The clinical application of this device in delivery of bio-adhesive has tremendous potential not just for vascular anastomoses (vessel connection), but other soft tissue fixation which significantly cuts down on the time taken to complete at this current point in time where suture materials are being used."

Drawing on their findings, the researchers foresee that the catheter device may someday be used to deliver patches to repair birth defects such as holes in the wall of the heart.

The research team has filed a joint patent for this device, shared between MIT and NTUitive, NTU's innovation and enterprise company.

The commercial potential of the catheter system highlights NTU's commitment to innovation in its recently announced 2025 strategic plan, which aims to translate research into products and outcomes that enhance the quality of life.

Credit: 
Nanyang Technological University

Strong and flexible cofactors

Iron-sulfur clusters play a vital role in a number of biological processes, where they act as cofactors to enzymes. Research published in the journal Angewandte Chemie now shows that cubic clusters can support unusual bonding states: the cluster copes well with a multiple bond between iron and nitrogen--a structural motif that may be involved in biological nitrogen fixation.

Clusters made of iron and sulfur atoms are essential cofactors for a number of enzymes, especially in biological processes involving electron transfer. As an example, nitrogen-fixing bacteria use iron-sulfur clusters to convert nitrogen from the air into useful nitrogen compounds. To understand this important biological process, scientists dig deep into the bonding relationships possible between nitrogen and iron atoms in such clusters.

Daniel Suess and colleagues, from Massachusetts Institute of Technology in Cambridge, USA, have now investigated the cluster's capability to form unusual bonds between iron and nitrogen. A double bond, which is part of a chemical group called an imide, may play a role in nitrogen fixation.

To construct the imide, the team began by producing a cube-shaped iron-sulfur cluster. The eight corners of the cube are occupied by alternating iron and sulfur atoms; three of the iron atoms are protected by chemical species serving as ligands. These ligands do not bond directly to the atoms; they simply shield them. The remaining unshielded iron atom of the cluster was bound to a replaceable chloride ligand. Careful selection of reagents enabled the team to swap out the chloride ion and then, by oxidation with a nitrogen-containing reagent, to form the tricky double bond between the unique iron atom and the nitrogen atom--and thus the imide group.

The researchers expected that the iron-nitrogen double bond could strongly distort the cluster's structure. Instead, to their surprise, they observed only minor structural changes. The authors' spectroscopic studies explain this finding: the electron-rich imide pushes away electron density from the neighboring sulfur and iron atoms, and the totality of these minor effects is what allows the cluster to accommodate the imide bond. "These findings demonstrate a dynamic interplay between iron-nitrogen, iron-sulfur, and iron-iron bonding," state the authors.

The new imide-bound cluster was able to cleave weak carbon-hydrogen bonds from organic reagents. The authors intend to use these studies as a starting point for further investigation of the reactivity of imide-bound iron-sulfur clusters. "This highlights the promise of exploiting the synergy between the structural robustness and electronic flexibility of these fundamental cofactors," Suess says.

Credit: 
Wiley

Scientists warn: Humanity does not have effective tools to resist tsunamis

image: Maria Gritsevich recalled that the impact of an asteroid into the Gulf of Mexico 65 million years ago led to the extinction of a large number of animal species, including dinosaurs.

Image: 
UrFU / Grigory Tkachenko

An international team of scientists from 20 countries identified 47 problems that hinder the successful prevention of tsunami disasters and the elimination of their consequences. Based on this analysis, the world's leading experts on natural hazards have outlined directions for further scientific research. The research group's review is published in a special issue of Frontiers in Earth Science.

The main problems identified in the review relate to large gaps and uncertainties in knowledge about tsunamis, the lack of well-documented observations, and imperfect methods for processing the available information. One of the underlying reasons is a lack of coordination among the countries for which studying and predicting tsunamis, forecasting the associated risks, and preparing to counter the threat are vital.

"Generally accepted approaches have not yet been determined, potentially incompatible probabilistic methods are used in different regions of the world, and different sources of tsunamis are often considered independently of each other," said authors of the research.

Maria Gritsevich, senior researcher at the Extra Terra Consortium laboratory at the Ural Federal University and at the Finnish Geospatial Research Institute, and adjunct professor in planetary sciences at the University of Helsinki, points out that the asteroid-comet hazard is also associated with the origin of tsunamis.

"Science knows more than one million asteroids in the solar system," says Maria Gritsevich. "In total, according to the estimates, more than 150 million asteroids exceeding 100 meters in size revolve around the Sun. Since the ocean occupies more than 70% of the Earth's surface, collision of any of these celestial bodies with our planet may cause a strong tsunami. Let's recall that the impact of an asteroid into the Gulf of Mexico 65 million years ago led to the extinction of a large number of animal species, including dinosaurs."

The main terrestrial sources of tsunamis are abnormally strong and rapid fluctuations in atmospheric pressure, volcanic eruptions, earthquakes (on land and underwater), crustal movement, and landslides. Often these forces are interconnected. However, humanity does not have the reliable historical and detailed modern data needed to account for the interdependence of these factors. This makes it difficult to predict the time and place of the next tsunami.

Moreover, because of these uncertainties, scientists studying the natural phenomena that cause tsunamis often ignore this connection, even though the resulting tsunamis can be more destructive and deadly than the triggering events themselves. According to the authors of the review, this approach is typical, for example, of volcanologists; as a result, systematic analysis of information about tsunamis is often omitted from volcano studies. In addition, the computing power available for tsunami prediction is insufficient to meet the challenge, and the numerical models themselves are too complex and costly.

For a combination of reasons, many coastal cities, especially in developing countries, are not ready to "receive" a tsunami or to adequately assess the possible damage and losses. This is reflected, for example, in the construction of buildings and structures. Schools and hospitals, industrial enterprises, harbors, roads and bridges, power plants (including nuclear ones), gas and oil storage facilities, and various communications are under threat of destruction. And most importantly, so are the lives of many people.

"Buildings are often used as evacuation shelters," says the authors of the review. "Tsunamis affect the lower floors of a high-rise building, while seismic loads affect the upper ones. But tsunami effects such as basement erosion and debris impact are rarely modeled. These effects remain to be investigated."

Thus, there is no clear idea of the potential economic damage or of the costs required to combat tsunamis and their consequences. The quality of disaster risk management - who and what to protect, from what harm, at what cost, and how - often leaves much to be desired. In most cases, assistance arrives late, leaving affected communities in a vulnerable position, especially in the first hours and days after the event, the authors of the review state.

"We call for the creation and continuous enrichment of unified databases, for conduction of the necessary research and regular exchange of information, for improving the methods of analysis and modeling, and careful planning of actions in case of cascading natural disasters," says Maria Gritsevich. "We are convinced that with proper funding, with the availability of the necessary scientific equipment and technology, it is quite possible to bridge the gaps in understanding the tsunami phenomenon that we have identified."

Credit: 
Ural Federal University