Tech

Separating beer waste into proteins for foods, and fiber for biofuels

WASHINGTON, April 6, 2021 -- Home brewing enthusiasts and major manufacturers alike experience the same result of the beer-making process: mounds of leftover grain. Once all the flavor has been extracted from barley and other grains, what's left is a protein- and fiber-rich powder that is typically used in cattle feed or put in landfills. Today, scientists report a new way to extract the protein and fiber from brewer's spent grain and use it to create new types of protein sources, biofuels and more.

The researchers will present their results today at the spring meeting of the American Chemical Society (ACS). ACS Spring 2021 is being held online April 5-30. Live sessions will be hosted April 5-16, and on-demand and networking content will continue through April 30. The meeting features nearly 9,000 presentations on a wide range of science topics.

"There is a critical need in the brewing industry to reduce waste," says Haibo Huang, Ph.D., the project's principal investigator. His team partnered with local breweries to find a way to transform leftover grain into value-added products.

"Spent grain has a very high percentage of protein compared to other agricultural waste, so our goal was to find a novel way to extract and use it," says Yanhong He, a graduate student who is presenting the work at the meeting. Both Huang and He are at Virginia Polytechnic and State University (Virginia Tech).

Craft brewing has become more popular than ever in the U.S. This increased demand has led to an increase in production, generating a major uptick in waste material from breweries, 85% of which is spent grain. This byproduct comprises up to 30% protein and up to 70% fiber, and while cows and other animals may be able to digest spent grain, it is difficult for humans to digest it because of its high fiber content.

In order to transform this waste into something more functional, Huang and He developed a novel wet milling fractionation process to separate the protein from the fiber. Compared to other techniques, the new process is more efficient because the researchers do not have to dry the grain first. They tested three commercially available enzymes -- alcalase, neutrase and pepsin -- in this process and found that alcalase treatment provided the best separation without losing large amounts of either component. After a sieving step, the result was a protein concentrate and a fiber-rich product.

Up to 83% of the protein in the spent grain was recaptured in the protein concentrate. Initially the researchers proposed using the extracted protein as a cheaper, more sustainable replacement for fishmeal to feed farmed shrimp. But more recently, Huang and He have started to explore using the protein as an ingredient in food products, catering to the consumer demand for alternate protein sources.

However, that still left the remaining fiber-rich product without a specific use. Last year, Huang's postdoctoral researcher Joshua O'Hair, Ph.D., reported finding a new strain of Bacillus licheniformis in a spring at Yellowstone National Park. In the paper, they noted that the bacteria could convert various sugars to 2,3-butanediol, a compound that is used to make many products, such as synthetic rubber, plasticizers and 2-butanol, a fuel. So, He pretreated the extracted fiber with sulfuric acid to break its cellulose and hemicellulose down into sugars. She then fed those sugars to the microbe, producing 2,3-butanediol.

Next, the team plans to work on scaling up the process of separating the protein and fiber components in order to keep up with the volume of spent grain generated at breweries. They are also working with colleagues to determine the economic feasibility of the separation process, as the enzymes currently used to separate the protein and fiber components are expensive. Huang and He hope to find suitable enzymes and green chemicals to make this process even more sustainable, scalable and affordable.

Credit: 
American Chemical Society

COVID-19 pandemic highlights the urgent global need to control air pollution

image: COVID-19 health effects are a wake-up call to control air pollution.

Image: 
ATS

April 06, 2021 -- A new commentary published online in the Annals of the American Thoracic Society provides an exhaustive examination of published research that discusses whether air pollution may be linked to worse COVID-19 outcomes. The studies that the authors examined look at several potential disease mechanisms, and also at the relationship between pollution, respiratory viruses and health disparities.

In "COVID-19 Pandemic: A Wake-Up Call for Clean Air," Stephen Andrew Mein, MD, Department of Medicine, Beth Israel Deaconess Medical Center, Boston, and colleagues discuss several ways that the COVID-19 pandemic highlights the urgent need to address the global problem of air pollution through sustainable local and national policies to improve respiratory health and equity worldwide. More than 91 percent of the world's population lives in areas that exceed the World Health Organization's air quality guidelines and more people are impacted by worsening air quality each year.

The commentary focuses on the health effects of ambient air pollution. Ambient air pollution consists of potentially harmful pollutants, such as small particles and toxic gases, emitted by industries, households, cars and trucks. International studies have shown that exposure to these pollutants worsens viral respiratory infections and new studies are showing a similar association with ambient pollution and COVID-19 outcomes.

"There are a multitude of studies showing that exposure to higher long-term ambient air pollution is associated with both increased risk of infection and death from COVID-19," Dr. Mein said. "Historically, air pollution has been linked with worse outcomes, including higher mortality, due to other respiratory viruses like influenza."

He added, "Research that we examined on pollution during the COVID-19 pandemic has found similar detrimental effects. New research on COVID-19 adds further evidence of the adverse effects of ambient air pollution and the urgent need to address the public health crisis of pollution."

One of the most prominent studies that the authors examined, in which COVID-19 mortality was modeled, found that each small (1 µg/m³) increase in long-term fine particulate matter (PM2.5) exposure was associated with an 8 percent increase in mortality during the pandemic. Another study concluded that air pollution contributed about 15 percent to COVID-19 mortality worldwide.
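To put the first of those estimates in concrete terms, the short sketch below simply compounds the reported 8 percent-per-µg/m³ figure across a larger exposure gap. The multiplicative scaling and the 3 µg/m³ example are illustrative assumptions, not details taken from the study itself.

```python
# Illustrative sketch only: compounds the reported 8% increase in COVID-19
# mortality per 1 µg/m³ of long-term PM2.5 exposure. The multiplicative
# scaling assumption and the example exposure gap are hypothetical.

def relative_mortality_risk(delta_pm25: float, rr_per_unit: float = 1.08) -> float:
    """Relative COVID-19 mortality risk for a PM2.5 difference in µg/m³."""
    return rr_per_unit ** delta_pm25

# Hypothetical example: two regions differing by 3 µg/m³ in long-term PM2.5.
print(f"{relative_mortality_risk(3.0):.2f}")  # 1.26, i.e. roughly 26% higher mortality
```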

"The studies we reviewed evaluated whether long-term, ambient air pollution exposure that occurred years prior to the pandemic was associated with worse COVID-19 outcomes," Dr. Mein stated.

The exact mechanisms for the association between long-term pollution and poor COVID-19 outcomes are not fully known. However, scientists have suggested several theories. Long-term exposure to air pollution may impair the immune system, leading to both increased susceptibility to viruses and more severe viral infections.

Higher air pollution exposure is associated with higher rates of heart disease and metabolic disorders such as diabetes, which are known to be risk factors for severe disease and death from COVID-19. These chronic effects would have occurred prior to the reported reductions in air pollution since the start of the COVID-19 pandemic.

A major point of the authors' commentary is that improved air quality (due to less travel and industrial activity) during the pandemic may have reduced morbidity and mortality from non-communicable diseases. "Research evaluating associations between the dramatic reduction in ambient air pollution during the global lockdowns and health care utilization for respiratory conditions would further confirm the impact of ambient air pollution on non-communicable diseases and the need to reduce air pollution to improve overall health."

The authors also noted that much of the research about ambient air pollution and the COVID-19 pandemic is just emerging. "While the primary association between air pollution and COVID-19 outcomes has been generally consistent, there is still much research to be done. In particular, there is a need for studies that adjust for individual-level risk factors, since current studies have been restricted to county or municipal-level exposure and outcome data. Research also needs to be conducted to evaluate whether air pollution is contributing to the stark differences in COVID-19 outcomes among minority groups."

Racially and ethnically diverse communities are more likely to be located closer to sources of industrial pollution, such as PM2.5 and nitrogen dioxide, and to work in jobs that expose them to more air pollution. These inequalities in residential and occupational air pollution exposure may be one of the causes of the stark disparities of the COVID-19 pandemic along racial and ethnic lines.

In conclusion, the authors state, "The COVID-19 pandemic has highlighted the widespread health consequences of ambient air pollution, including acute effects on respiratory immune defenses and chronic effects that lead to higher risk of chronic cardiopulmonary disease and acute respiratory distress syndrome (ARDS). These chronic health effects likely explain the higher COVID-19 mortality among those exposed to more air pollution. The pandemic has also provided a glimpse into the health benefits of cleaner air. As we emerge from this devastating public health crisis, COVID-19 is a wake-up call for the need to adopt stricter air quality standards and end our tolerance for pollution in disadvantaged neighborhoods. As part of our post-COVID-19 recovery, we must clean up the air to improve respiratory health and equality worldwide."

Credit: 
American Thoracic Society

Canada-wide ban on menthol cigarettes leads to significant increases in quitting among smokers

Bans on menthol cigarettes across Canada from 2016 to 2017 led to significant increases in quit attempts and successful quitting among smokers, and to lower rates of relapse among former smokers, according to a new research study from the International Tobacco Control Policy Evaluation Project (the ITC Project) at the University of Waterloo.

Menthol is the most common flavoring for cigarettes in many countries. Menthol creates a cooling sensation, which reduces the harshness of cigarette smoke. Because of this, menthol leads to increased experimentation and progression to regular smoking among new smokers, especially among youth.

"Our study demonstrates the substantial benefits of banning menthol cigarettes," said Geoffrey T. Fong, Professor of Psychology and Public Health and Health Systems at Waterloo, and principal investigator of the ITC Project. "The enormous success of the Canadian menthol ban makes it even clearer now that the U.S. should finally ban menthol, which the tobacco industry has used for decades to attract new smokers and to keep many of them as customers, especially among the African-American community.

"The positive effects of the Canada menthol ban suggest that a U.S. menthol ban would lead to greater benefits since menthol cigarettes are much more popular in the U.S. From our findings, we estimate that banning menthol cigarettes in the U.S. would lead an additional 923,000 smokers to quit, including 230,000 African-American smokers."

The study conducted by Fong and his team examined the impact of menthol bans across seven Canadian provinces, covering 83 per cent of the Canadian population, which saw menthol cigarettes banned between August 2016 and October 2017. Canada was one of the first countries to implement a ban on menthol cigarettes, and the first country where a menthol ban has been evaluated.

"The Canadian menthol ban did not lead to a high level of illicit menthol cigarette purchasing, which has been a concern by regulators considering a menthol ban," said Fong. "Fewer than 10 per cent of menthol smokers reported still smoking a menthol brand after the ban."

Scientific reviews conducted by the Tobacco Products Scientific Advisory Committee to the U.S. Food and Drug Administration (FDA), the FDA itself, and the World Health Organization have also concluded that banning menthol would have significant public health benefits. 

The harms of menthol cigarettes in the U.S. have been much greater among African-Americans. Menthol cigarettes are smoked by 85 per cent of African-American smokers, over 2.8 times the percentage among white smokers.

A national sample of 1098 non-menthol and 138 menthol smokers participating in the ITC Canada Smoking and Vaping Survey was surveyed both before the menthol ban (in 2016) and after the menthol ban (in 2018).

The survey demonstrated three benefits of the Canadian menthol ban. Menthol smokers were significantly more likely than non-menthol smokers to attempt to quit after the menthol ban (58.7 per cent vs. 49 per cent). 

Daily menthol smokers were almost twice as likely as daily non-menthol smokers to quit after the menthol ban (21 per cent vs. 11.6 per cent).

Finally, those menthol smokers who had quit smoking before the menthol ban were significantly less likely than non-menthol smokers who had quit smoking to have relapsed back to smoking.

Credit: 
University of Waterloo

Cancer discovery could revive failed treatments for solid tumors

image: Research from UVA's Jogender Tushir-Singh, PhD, explains why the antibody approaches effectively killed cancer tumors in lab tests but proved ineffective in people.

Image: 
Dan Addison | UVA Communications

New research from UVA Cancer Center could rescue once-promising immunotherapies for treating solid cancer tumors, such as ovarian, colon and triple-negative breast cancer, that ultimately failed in human clinical trials.

The research from Jogender Tushir-Singh, PhD, explains why the antibody approaches effectively killed cancer tumors in lab tests but proved ineffective in people. He found that the approaches had an unintended effect on the human immune system that potentially disabled the immune response they sought to enhance.

The new findings allowed Tushir-Singh to increase the approaches' effectiveness significantly in lab models, reducing tumor size and improving overall survival. The promising results suggest the renewed potential for the strategies in human patients, he and his team report.

"So far, researchers and protein engineers around the globe, including our research group, were focused on super-charging and super-activating tumor cell-death receptor targeting antibodies in the fight against cancer. Here at UVA, we took a comprehensive approach to harness the power of the immune system to create dual-specificity and potentially clinically effective oncologic therapeutics for solid tumors," said Tushir-Singh, of the UVA School of Medicine's Department of Biochemistry and Molecular Genetics. "Our findings also have significant potential to improve further the clinical efficacy of currently FDA-approved PD-L1 targeting antibodies in solid tumors, particularly the ones approved for deadly triple-negative breast cancer."

Immunotherapy for Solid Tumors

Immunotherapy aims to harness the body's immune system to recognize and destroy cancer cells. Lab-engineered antibodies remain the core facilitator of immunotherapies and CAR T-cell therapies, which have generated tremendous excitement in the last decade. But these therapies have proved less effective against solid tumors than against melanoma (skin cancer) and leukemia (blood cancers). One major obstacle: It is difficult for immune cells to make their way efficiently into the core of solid tumors.

To overcome that problem, scientists have developed an approach that selectively uses antibodies to target a receptor on the cancer cells' surface called death receptor-5 (DR5). This approach essentially tells the cancer cells to die and enhances the permeation of the body's immune cells into a solid tumor. And it does so without the toxicity associated with chemotherapy.

Previously tested DR5-targeting antibodies have worked very well in lab tests and reduced tumor size in immune-deficient mouse models. But when tested in phase-II human clinical trials, these antibodies consistently failed to improve survival in patients - despite many big-name pharmaceutical companies spending billions of dollars on them.

Tushir-Singh, an antibody engineer, and his collaborators wanted to understand what was happening - why didn't this promising approach work in patients who need it desperately? They found that the anti-DR5 antibody approaches unintentionally triggered biological processes that suppress the body's immune response. This allowed the cancer tumors to evade the immune system and continue to grow.

Tushir-Singh and his team could restore the potency of the DR5-based antibody approach in human cancer cells and immune-sufficient mouse models by co-targeting the negative biological processes with improved, immune-activating therapy. The new combination therapy "markedly" increased the effectiveness of cancer killer immune cells known as T cells, shrinking tumors and improving survival in lab mice, they report in a new scientific paper.

That is an encouraging sign for the combination therapy's potential in patients with solid tumors, such as ovarian cancer and triple-negative breast cancer - the deadliest cancers in women.

"We would like to see these strategies in clinical trials, which we strongly believe have huge potential in solid tumors," Tushir-Singh said. "Our findings are extraordinary: Along with the translational impact, our work also explains, after more than 60 years of research in the field, why most approaches targeting apoptosis [cell death] have not done well in clinical trials and ultimately develop resistance to therapies."

Credit: 
University of Virginia Health System

Origins of life could have started with DNA-like XNAs

image: Some scientists think that XNA evolved into RNA, which then evolved into DNA, forming the very beginnings of life.

Image: 
Keiji Murayama

Nagoya University scientists in Japan have demonstrated how DNA-like molecules could have come together as a precursor to the origins of life. The findings, published in the journal Nature Communications, not only suggest how life might have begun, but also have implications for the development of artificial life and biotechnology applications.

"The RNA world is widely thought to be a stage in the origin of life," says Nagoya University biomolecular engineer Keiji Murayama. "Before this stage, the pre-RNA world may have been based on molecules called xeno nucleic acids (XNAs). Unlike RNA, however, XNA replication probably didn't require enzymes. We were able to synthesize an XNA without enzymes, strongly supporting the hypothesis that an XNA world might have existed before the RNA world."

XNAs are formed of chains of linked nucleotides, similar to DNA and RNA but with a different sugar backbone. XNAs can carry genetic code very stably because the human body can't break them down. Some researchers have reported that XNAs containing specific sequences can act as enzymes and bind to proteins. This makes XNAs exciting in the field of synthetic genetics, with potential biotechnology and molecular medicine applications.

Murayama, Hiroyuki Asanuma and colleagues wanted to find out if conditions likely present on early Earth could have led to XNA chain formation. They synthesized fragments of acyclic (non-circular) L-threoninol nucleic acid (L-aTNA), a molecule that is thought to have existed before RNA came to be. They also made a longer L-aTNA with a nucleobase sequence that complemented the sequences of the fragments, similar to how DNA strands match up.

When placed together in a test tube under controlled temperature, the shorter L-aTNA fragments came together and linked up with each other on the longer L-aTNA template. Critically, this happened in the presence of a compound called N-cyanoimidazole and a metal ion such as manganese, both of which were plausibly present on early Earth. The fragments interlinked when a phosphate at the end of one chemically attached to a hydroxyl group at the end of its neighbour, without the help of an enzyme.

"To the best of our knowledge, this is the first demonstration of template-driven, enzyme-free extension of acyclic XNA from a random fragment pool, generating phosphodiester bonding," says Murayama.

The team also demonstrated that L-aTNA fragments could interlink on DNA and RNA templates. This suggests that genetic code could be transferred from DNA and RNA onto L-aTNA and vice versa.

"Our strategy is an attractive system for experimenting with the construction of artificial life and the development of highly functional biological tools composed of acyclic XNA," says Murayama. "The data also indicate that L-aTNA could have been an RNA precursor."

The team plans to continue their investigations to clarify whether L-aTNA could have been synthesized under early Earth 'pre-life' conditions and to examine its potential for developing advanced biological tools.

Credit: 
Nagoya University

Mapping North Carolina's ghost forests from 430 miles up

image: Emily Ury measures soil salinity in a ghost forest.

Image: 
Photo by Emily Bernhardt, Duke University

DURHAM, N.C. -- Emily Ury remembers the first time she saw them. She was heading east from Columbia, North Carolina, on the flat, low-lying stretch of U.S. Highway 64 toward the Outer Banks. Sticking out of the marsh on one side of the road were not one but hundreds of dead trees and stumps, the relics of a once-healthy forest that had been overrun by the inland creep of seawater.

"I was like, 'Whoa.' No leaves; no branches. The trees were literally just trunks. As far as the eye could see," said Ury, who recently earned a biology Ph.D. at Duke University working with professors Emily Bernhardt and Justin Wright.

In bottomlands throughout the U.S. East Coast, trees are dying off as rising seas and higher storm surges push saltwater farther inland, poisoning soils far from shore.

While these "ghost forests" are becoming a more common sight in North Carolina's coastal plain, scientists had only a rough idea of their extent. Now, satellite images are providing new answers.

In a study published April 4 in the journal Ecological Applications, a Duke-led team mined 35 years of satellite images of a 245,000-acre area in the state's Albemarle-Pamlico Peninsula.

The images show that, between 1985 and 2019, 11% of the area's tree cover was taken over by ghost forests. Instead of mirroring the gradual pace of sea level rise, most of this spread occurred abruptly in the wake of extreme weather events such as hurricanes and droughts, which can concentrate salts or send them surging into the region's interior.

The study focused on the Alligator River National Wildlife Refuge, which was established in 1984 to protect the area's unique forested wetlands and the endangered red wolves, red-cockaded woodpeckers and other wildlife that live there.

Here, the Duke team is monitoring what Bernhardt and other researchers call "the leading edge of climate change."

From 1900 to 2000, the sea rose about a foot in this part of coastal North Carolina, faster than the global average. By the end of this century, it could rise two to five feet more.

Shrinking shorelines dominate most discussions of sea-level rise, as oceans submerge coastlines and chew away at beachfront property. Yet less talked about is what's happening farther inland.

Long before beaches shrink and disappear under the rising sea, seawater starts creeping into low-lying regions.

Most of the Alligator River National Wildlife Refuge sits less than two feet above sea level, "which makes it all the more vulnerable to sea level rise," Ury said.

Add to that the hundreds of miles of ditches and canals that crisscross the region. Built during the mid-1900s to drain water out, they now act as a conduit for seawater -- which is about 400 times saltier than freshwater -- to flow in.

With no barriers in the way, seawater gets pushed inland through these channels, leaving its salty fingerprints on the soils. As the salt moves in, it draws water out of plant cells and strips seeds of their moisture, making it harder for new tree seedlings to sprout. Salt-sensitive tree species first fail to reproduce and eventually die off, as freshwater forest turns to salt marsh.

Using pictures taken by 430-mile-high Landsat satellites, the team was able to map the spread of ghost forests in the refuge over time.

Each pixel in the satellite images represents the wavelengths of light bouncing off the Earth below, in an area on the ground roughly the size of a baseball diamond.

The team fed the satellite images to a computer algorithm, which in turn analyzed each pixel and determined whether it was dominated by pines, hardwoods, shrubs, grassy marsh, open water or dead trees. Any pixel with as many as 20 to 40 visibly dead trees present at once was labeled as ghost forest.
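The article does not describe the classifier's internals, but a per-pixel land-cover classification of this kind is commonly built with a supervised learner over the Landsat band values. The sketch below is a generic illustration under that assumption, not the study's actual algorithm; the class list, band count and training data are all placeholders.

```python
# Generic sketch of per-pixel land-cover classification, NOT the study's
# actual algorithm. Each row stands for one pixel's Landsat band values;
# real work would use labeled training pixels, not random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

CLASSES = ["pine", "hardwood", "shrub", "marsh", "open_water", "ghost_forest"]

rng = np.random.default_rng(0)
X_train = rng.random((500, 6))                # placeholder band values
y_train = rng.integers(0, len(CLASSES), 500)  # placeholder class labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Classify every pixel of a new scene, then measure how much of it the
# model labels as ghost forest.
X_scene = rng.random((100, 6))
pred = clf.predict(X_scene)
ghost_share = np.mean(pred == CLASSES.index("ghost_forest"))
print(f"Ghost forest fraction of scene: {ghost_share:.1%}")
```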

The view from space changed over the 35 years of the study.

More than three-fourths of the study area was covered in trees in 1985. Since then, even without any logging or development, the refuge has lost more than 46,950 acres of forest, or a quarter of its 1985 tree cover.
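Those numbers hang together, as a quick back-of-the-envelope check confirms (figures taken from this article; the study's own accounting may differ in detail):

```python
# Back-of-the-envelope check using figures quoted in this article.
study_area_acres = 245_000
tree_cover_1985 = 0.75 * study_area_acres   # "more than three-fourths" forested
lost_acres = 46_950

# ~26%, i.e. about a quarter of the 1985 tree cover
print(f"Share of 1985 tree cover lost: {lost_acres / tree_cover_1985:.0%}")
```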

More than half of these losses occurred in the interior of the refuge, more than a kilometer from any coast, the study revealed.

"It's not just the fringe that's getting wetter," Ury said.

Of the more than 21,000 acres of ghost forest that formed between 1985 and 2019, the most noticeable die-off was in 2012. The area had just endured a five-year drought and then a potent strike by Hurricane Irene in 2011, when a 6-foot wall of seawater was pushed ashore. The storm surge swept across the refuge, cresting over Highway 264, more than 1.2 miles inland from the coast. Within months, entire stands of dying and downed trees were visible from space.

What is happening in eastern North Carolina is happening elsewhere, too, the researchers say. In coastal regions across the globe, saltwater is starting to reach areas that haven't seen it before, even reducing crop yields and jeopardizing freshwater aquifers that people rely on for drinking water.

The Duke team is collaborating with other researchers to expand their study to other parts of the Atlantic and Gulf coastal plains, from Cape Cod to Texas.

"Because of its geological location, North Carolina is just ahead of other coastal areas in terms of how far sea level rise has progressed," Ury said. "Lessons learned here could help manage similar transitions in other places," or pinpoint areas that are likely to be vulnerable in the future.

Credit: 
Duke University

Transportation noise pollution and cardio- and cerebrovascular disease

Epidemiological studies have found that transportation noise increases the risk of cardiovascular morbidity and mortality, with high-quality evidence for ischaemic heart disease. According to the WHO, at least 1.6 million healthy life-years are lost annually to traffic-related noise in Western Europe. Traffic noise at night causes fragmentation and shortening of sleep, elevation of stress hormone levels, and increased oxidative stress in the vasculature and the brain. These factors can promote vascular dysfunction, inflammation and hypertension, thereby elevating the risk of cardiovascular disease.

In this Review, the authors -- including Mette Sørensen of the Danish Cancer Society, Copenhagen, Denmark, and the Department of Natural Science and Environment, Roskilde University, Denmark, as well as Thomas Münzel, MD, and Andreas Daiber, PhD, of the University Medical Center Mainz at the Johannes Gutenberg University, Mainz, Germany -- focus on the indirect, non-auditory cardiovascular health effects of transportation noise. They provide an updated overview of epidemiological research on the effects of transportation noise on cardiovascular risk factors and disease, discuss the mechanistic insights from the latest clinical and experimental studies, and propose new risk markers to address noise-induced cardiovascular effects in the general population. The authors also explain, in detail, the potential effects of noise on alterations of gene networks, epigenetic pathways, gut microbiota, circadian rhythm, signal transduction along the neuronal-cardiovascular axis, oxidative stress, inflammation and metabolism. Lastly, they describe current and future noise-mitigation strategies and evaluate the status of the existing evidence on noise as a cardiovascular risk factor.

Thomas Münzel, MD, lead author of the review and director of Cardiology at University Medical Center Mainz, Johannes Gutenberg University, Mainz, Germany, said, "As the percentage of the population exposed to detrimental levels of transportation noise will rise again when the COVID pandemic is over, noise mitigation efforts and legislation to reduce noise are highly important for future public health."
(DOI: 10.1038/s41569-021-00532-5)

Credit: 
Department of Cardiology - University Medical Center Mainz

Human activities sound an alarm for sea life

image: Marine ecosystems are severely impacted by noise pollution generated by shipping vessels, seismic surveys searching for oil and gas, sonar mapping of the ocean floor, coastal construction and wind farms.

Image: 
© 2021 Morgan Bennett Smith

Humans have altered the ocean soundscape by drowning out natural noises relied upon by many marine animals, from shrimp to sharks.

Sound travels fast and far in water, and sea creatures use sound to communicate, navigate, hunt, hide and mate. Since the industrial revolution, humans have introduced their own underwater cacophony from shipping vessels, seismic surveys searching for oil and gas, sonar mapping of the ocean floor, coastal construction and wind farms. Global warming could further alter the ocean soundscape as the melting Arctic opens up more shipping routes and wind and rainfall patterns change.

Yet noise has been conspicuously absent from global assessments of ocean health.

A team led by Carlos Duarte, distinguished professor at KAUST, trawled more than 10,000 papers and extracted the most rigorous quantitative studies of how noise affects marine animals. "The research goes back nearly 50 years, but this is the first time all the scattered evidence has been assembled and systematically assessed," says Duarte.

Examining the literature, Duarte's team sought to describe ocean soundscapes, how these soundscapes have changed in the Anthropocene, the negative impacts they have on marine animals, and possible solutions. "We were stunned by the contrast between the wealth of evidence and the general neglect of the problem in scientific debates and policymaking," says Duarte.

Of the 538 carefully chosen papers, 90 percent revealed significant impacts of human sounds on marine mammals, while 80 percent identified effects on fish and invertebrates, such as jellyfish. "But there's still a gap in our understanding of the impacts on diving birds and sea turtles," adds Duarte.

Shipping is a widespread problem, disrupting travel, foraging, communication behaviors and the ability of young fish to learn to avoid predators. Particularly concerning was the effect of dampened soundscapes on the hearing of larvae trying to navigate to suitable habitats. "If they miss the call home, they will likely starve to death or be eaten," says Duarte.

Noise pollution has driven many marine animals from their natural territory, although escape is not always possible for species with specific ranges, such as the endangered Maui dolphin. In 2020, when global lockdowns were enforced during the COVID-19 pandemic, noise from shipping fell by 20 percent, and dolphins and sharks were spotted swimming through formerly busy, noisy waterways. "This is promising evidence of an almost immediate response of marine life to a relaxing of acoustic pressures," says Duarte.

The researchers highlight ways to alleviate anthropogenic noise, such as reducing shipping speeds, fitting boats with quieter propellers and using floating wind turbines. "Retrofitting only one-tenth of the noisiest boats with better propellers would have widespread benefits," says Duarte. However, marine ecosystems and noise pollution cross international boundaries, meaning that solutions require binding global agreements for restoring a sustainable ocean economy.

Credit: 
King Abdullah University of Science & Technology (KAUST)

TPU scientists first obtain high-entropy carbide in electric arc plasma

Scientists at Tomsk Polytechnic University have synthesized a high-entropy carbide consisting of five different metals using a vacuum-free electric arc method. The research findings are published in the Journal of Engineering Physics and Thermophysics.

High-entropy carbides are a new class of materials consisting of four or more different metals plus carbon. Their main feature is the capability to endure high temperatures and high energy flux densities. By combining different elements in the composition, it is possible to obtain the required mix of properties (melting point, oxidation temperature, specific weight and others).

"High-entropy materials are called in such a way due to a relatively high degree of disorder in the crystalline lattice, as an atom of every chemical element possesses a certain size in the crystalline lattices.

It causes structural distortions and can positively affect material properties," Alexander Pak, Research Fellow of the TPU Research Center - Ecoenergy 4.0, explains.

The TPU scientists managed to synthesize a high-entropy carbide consisting of Ti, Zr, Nb, Hf, Ta and C. The carbide was obtained using vacuum-free electric arc synthesis. The reaction requires high temperatures so that each primary component reacts with the carbon and is incorporated into a face-centered cubic lattice, forming an ultra-refractory carbide. The scientists used electric arc plasma to reach these temperatures.

"We became the first who could obtain high-entropy carbide using a vacuum-free electric arc method. It is a great rarity and success for us to synthetize a material that has recently been discovered and to use our method at electric arc reactors created by our research team.

We are planning to improve a synthesis process to obtain a clearer and uncontaminated material, to reduce energy intensity, as well as to research material properties and synthetize high-entropy carbides of the other chemical composition," Alexander Pak adds.

Credit: 
Tomsk Polytechnic University

Distinctive MJO activity during 2015/2016 super El Niño

image: Warm SST could increase water vapor in the troposphere, stimulating convection.

Image: 
Wenjun Zhang

El Niño-Southern Oscillation (ENSO) is one of the most prominent ocean-atmosphere interactions that varies year-to-year. This process exerts significant impacts on global weather and climate. El Niño is the warm phase of ENSO, which can be strong, moderate, or even weak. Within the past four decades, climatologists observed three super El Niño events (1982/83, 1997/98 and 2015/16). These extreme phases impacted global climate far more than moderate or weak events.

El Niño has a profound effect on the Madden-Julian Oscillation (MJO), which is the most significant sub-seasonal variability element of the tropical atmosphere. The MJO is a major force that drives monsoon sub-seasonal variability, bringing sustained wet or dry weather to Asia. Extreme El Niño events have similar severity and evolution processes. However, scientists have sought to understand whether El Niño altered the behavior of the MJO in the same way during each of these three super El Niño winters.

A research group, led by Dr. Wenjun Zhang from the Nanjing University of Information Science and Technology analyzed MJO activity of the super El Niño event during the Northern Hemisphere winter of 2015/16. Observations show that the western Pacific MJO activity was strongly suppressed during the peak phase of the 1982/83 and 1997/98 super El Niño events. However, during the crest of the 2015/16 super El Niño event, western Pacific MJO-related convection was enhanced.

"It is apparent that the enhanced western Pacific MJO is mainly related to its sea surface temperature (SST) anomaly distribution and the associated background thermodynamic conditions." said Dr. Zhang. His team's complete research and data are published in Advances in Atmospheric Sciences.

When compared to the previous super El Niño events, the warm SST anomaly, or change from average, of the 2015/16 El Niño was located farther west than during the other two extreme events. Additionally, no significant cold SST anomaly was detected in the western Pacific. Accordingly, moisture and air temperature tended to increase in the central-western Pacific during the winter of 2015/16, unlike in the previous super El Niño events.

This research highlights that climatologists should consider the SST anomaly distribution of super El Niño events for future MJO activity studies.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Silencing vibrations in the ground and sounds underwater

Metamaterials that can control the refractive direction of light, or absorb it to enable invisibility cloaks, are gaining attention. Recently, a research team at POSTECH has designed a metasurface that can control acoustic and elastic waves. The design is drawing attention because it could be used to protect structures from threatening earthquakes or to build submarines untraceable by SONAR.

Professor Junsuk Rho of POSTECH's departments of mechanical engineering and chemical engineering and Ph.D. candidate Dongwoo Lee of the Department of Mechanical Engineering, in collaboration with Professor Jensen Li of HKUST, have designed an artificial structure that can control not only underwater sound but also vibrations. The research team presented an underwater stealth metasurface that evades SONAR by tailoring the acoustic resonance to absorb incoming waves. They also confirmed that the propagation of waves through a curved plate, such as vibrations, can be drastically altered, and presented a methodology that can actually achieve the cloaking effect with a singularity of infinite refractive index, which had been considered impossible to demonstrate until now. These research findings were recently published in Journal of Applied Physics and Physical Review Applied, respectively.

When light encounters a substance in nature, it generally refracts in the positive (+) direction. Metamaterials can instead be designed to refract light in a negative (-) direction, to have a zero (0) refractive index that allows complete transmission, or to act as a complete absorber. This is the reason why things look transparent when they encounter metamaterials.

The research team theoretically confirmed a metasurface that can significantly absorb sound waves without reflecting them by tailoring the resonance of sound. Through layers of split-orifice-conduit (SOC) hybrid resonators, a thin metasurface was designed to absorb sound waves in broadband (14 kHz to 17 kHz). The metasurface designed this way can achieve underwater stealth capability untraceable by SONAR, which detects objects using the information between transmitted and reflected waves.

The research team confirmed that it is possible to transmit or change the direction of elastic waves - like seismic waves - according to the design of curved plates. Applying Albert Einstein's general theory of relativity, which states that the path of light changes in the warping of space-time caused by a mass-induced change in the gravitational field, the research team proposed a platform capable of extreme control of elastic waves on a curved plate. As an example, a refractive-index-singularity lens - a metasurface lens that approaches near-zero thickness - was implemented to demonstrate an elastic version of Eaton lenses that can bend waves by 90 and 180 degrees over a broad frequency range (15 kHz to 18 kHz) on thin curved plates.

In addition, by proposing a methodology that can actually implement the cloaking effect whose singularity had existed only in theory, the work is expected to allow extreme celestial phenomena, such as black holes created by gravitational fields, to be explored in the near future using the elastic platform as a test bed. Based on this understanding of the refractive-index singularity, it is anticipated to enable technologies that protect nuclear power plants or buildings from earthquakes, or that control the wave energy generated when tectonic plates collide or split.

"Until now, metamaterial research has focused on light and electromagnetic waves, but we have confirmed that it can be applied to sound waves and seismic waves," remarked Professor Junsuk Rho, renown worldwide for his research on metamaterials. "We anticipate that it will be applicable to building untraceable submarines or nuclear power plants that can stay intact even during earthquakes."

Credit: 
Pohang University of Science & Technology (POSTECH)

Longer stay, greater costs related to late-week laminectomy & discharge to specialty care

image: Box-and-whisker plot depicting predicted lengths of hospital stay with respect to discharge disposition of patients surgically treated early or late in the week. Acute Rehab = acute rehabilitation center; Early Week = Monday + Tuesday + Wednesday; Late Week = Thursday + Friday; SNF = skilled nursing facility.

Image: 
Copyright 2021 AANS.

CHARLOTTESVILLE, VA (APRIL 6, 2021). New research by a team from the Cleveland Clinic and the London School of Economics and Political Science (LSE) has determined that surgeries performed late in the workweek, and those culminating in discharge to a specialty care facility, are associated with higher costs and unnecessarily longer stays in the hospital following a common elective spine surgery.

Sebastian Salas-Vega, PhD, and colleagues retrospectively reviewed the data for all adult patients who underwent elective lumbar laminectomy over a nearly three-year period at any Ohio hospital included within the Cleveland Clinic system. The laminectomies were performed for degenerative stenosis of the lower spine and included both open and minimally invasive procedures. The surgeries were performed to release pressure on spinal nerves at one or more sites in the lumbar spine, reducing patients' pain and improving their neurological function.

Following surgery, recuperation in the hospital over several days can be very expensive and, in some cases, unnecessary. In efforts to curb expenses, shortening the hospital length of stay is paramount--as long as it is clinically justified and does not negatively impact patient outcomes.

The authors' goal in this study was to distinguish between clinical and nonclinical factors that drive the length of a patient's stay in the hospital for this common spine procedure and to understand the relationships between these factors and both patient outcomes and hospital costs. The findings can be found in a new article, "Late-week surgery and discharge to specialty care associated with higher costs and longer lengths of stay after elective lumbar laminectomy," published today in the Journal of Neurosurgery: Spine.

Using generalized linear modeling, the researchers assessed relationships between the day of the week when surgery was performed and the length of hospital stay (LOS) and patient discharge to specialized care, while making adjustments for underlying patient health risks and other non-clinical factors, including health insurance and the facility where the surgery was performed.

The researchers found that patients' mean LOS varied according to what day of the workweek the surgery was performed: 2.01 days on Monday, 2.04 days on Tuesday, 2.16 days on Wednesday, 2.64 days on Thursday, and 2.47 days on Friday. For patients treated on Thursdays or Fridays, the mean LOS was approximately 20% to 30% longer than that for patients treated on Mondays or Tuesdays.
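Those percentages follow directly from the reported means, as the arithmetic below shows (this is a plain comparison of the quoted averages, not the paper's risk-adjusted model):

```python
# Plain arithmetic on the mean LOS values quoted above; the paper's own
# 20-30% figure comes from a risk-adjusted model, not this raw comparison.
mean_los = {"Mon": 2.01, "Tue": 2.04, "Wed": 2.16, "Thu": 2.64, "Fri": 2.47}

for late_day in ("Thu", "Fri"):
    for early_day in ("Mon", "Tue"):
        pct = (mean_los[late_day] / mean_los[early_day] - 1) * 100
        print(f"{late_day} vs {early_day}: +{pct:.0f}%")
# Prints increases from about +21% (Fri vs Tue) to +31% (Thu vs Mon),
# in line with the "approximately 20% to 30% longer" statement.
```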

The association between surgery performed late in the week (Thursday or Friday) and a prolonged LOS was apparent regardless of the extent of patients' comorbidities or the occurrence of postoperative complications. These findings also "persisted even after adjusting for patient demographics, ... hospital surgery site, and insurance."

The paper reveals that total costs of care also vary according to when surgery is performed, with surgical costs being about 20% higher for patients who undergo surgery on Friday than for patients treated on Monday. Among patients who are later discharged to specialized care facilities, the costs are 24% higher for those treated late in the week than for those treated earlier. The trend for higher costs among patients treated late in the week is also observed across all health insurance groups (private/commercial, Medicare, Medicaid, and so forth), pointing to potential cost savings irrespective of insurer.

The researchers suggest that costs and lengths of hospital stay could be lowered by improving the surgical scheduling process and care coordination. They believe that predictive analytics could, for instance, assist in identifying which patients might be more likely to require specialty care after hospital discharge. Surgeries for those patients could then be scheduled earlier in the week, helping to avoid weekend discharge and resulting in shorter lengths of stay and lower costs to patients, insurers, and healthcare providers.

When asked about the findings of this study, Dr. Salas-Vega responded, "Without considering what care is delivered, our study suggests that changing how it is scheduled can help improve patients' surgical experience, reduce hospital lengths of stay, and lower costs. These findings remind us that even some of the more mundane aspects of patient care can significantly impact their hospital course. If the ambition for providers around the world is to deliver the highest quality care, optimizing surgical scheduling and care coordination may represent low-hanging fruit among efforts to reduce unnecessarily long postoperative stays."

Credit: 
Journal of Neurosurgery Publishing Group

Simple fetal heartbeat monitoring still best to reduce unnecessary cesarean sections

Newer is not always better; a study in CMAJ (Canadian Medical Association Journal) led by researchers at the University of Warwick shows that simple fetal heartbeat monitoring is still the best method for determining whether a baby is in distress during delivery and whether cesarean delivery is needed: http://www.cmaj.ca/lookup/doi/10.1503/cmaj.202538.

Cesarean delivery is the most common surgical procedure worldwide, performed to expedite birth and avoid neonatal complications.

Listening to the fetal heart rate using a stethoscope -- intermittent auscultation -- has been used for years to assess the fetal state and whether the baby is experiencing distress that might require a cesarean delivery. Other monitoring techniques have become common in recent years, including echocardiograms and blood tests.

"Despite extensive investment in clinical research, the overall effectiveness of such methods in improving maternal and neonatal outcomes remains debatable as stillbirth rates have plateaued worldwide, while cesarean delivery rates continue to rise," writes Dr. Bassel Al Wattar, Warwick Medical School, University of Warwick, Coventry, United Kingdom, with coauthors.

Researchers from the United Kingdom and Spain reviewed 33 studies that included more than 118,000 women, mainly from high-income countries as well as India and Tanzania, to evaluate the effectiveness of different monitoring methods in improving outcomes for mothers and babies and reducing the number of cesarean deliveries.

They found that all methods had similar outcomes for babies, but only intermittent auscultation reduced the risk of cesarean deliveries without increased risk to babies' health. The researchers estimate that intermittent auscultation led to an average 30% reduction in emergency cesareans compared to other methods.

"Our analysis suggests that all additional methods introduced to improve the accuracy of electronic fetal heart monitoring have failed to reduce the risk of adverse neonatal or maternal outcomes beyond what intermittent auscultation achieved 50 years ago, and this may have contributed to the increased incidence of unnecessary emergency cesarean deliveries," write the authors.

The authors urge investment in developing novel techniques to monitor fetuses to make delivery safer for mothers and their babies.

Credit: 
Canadian Medical Association Journal

Cannabis legalization and link to increase in fatal collisions

Legalization of recreational cannabis may be associated with an increase in fatal motor vehicle collisions based on data from the United States, and authors discuss the implications for Canada in an analysis in CMAJ (Canadian Medical Association Journal).

"Analyses of data suggest that legalization of recreational cannabis in United States jurisdictions may be associated with a small but significant increase in fatal motor vehicle collisions and fatalities, which, if extrapolated to the Canadian context, could result in as many as 308 additional driving fatalities annually," says Ms. Sarah Windle, Lady Davis Institute/McGill University, Montreal, Quebec, with coauthors.

In Canada, the proportion of people reporting cannabis consumption increased from 14% in 2018 (before legalization) to 17% in 2019 (after legalization). Among cannabis users with a driver's licence, 13% reported driving within 2 hours of cannabis consumption, and the number of individuals who reported driving after recent cannabis use increased from 573,000 to 622,000. An analysis of 2012 data estimated the cost of cannabis-related collisions in Canada at $1.1 billion annually in societal and economic costs, with drivers aged 34 years and younger responsible for the bulk of the costs.

Health care providers can play a role in educating patients, and the authors suggest resources to help.

"Health care professionals have an opportunity to educate patients about the safer use of cannabis products, including advising against cannabis use and driving (especially in combination with alcohol), with a suggested wait time of at least 6 hours before driving," the authors say.

Government regulation and public awareness could also help reduce the risk of injuries and deaths from driving after cannabis use.

"Implementation of impaired driving regulations and educational campaigns, including federal THC driving limits and public awareness of these limits, may contribute to the prevention of potential increases in cannabis-impaired driving in Canada," the authors conclude.

Credit: 
Canadian Medical Association Journal

Making the case for adjusting quality measures for social risk factors

image: David Nerenz, Ph.D., Director Emeritus of Henry Ford Health System's Center for Policy and Health Services Research and the study's lead author.

Image: 
Henry Ford Health System

Henry Ford Health System-led report says adjustments would enhance quality.

DETROIT (April 5, 2021) - A new analysis by a team of researchers led by Dr. David Nerenz of Henry Ford Health System suggests that accounting for social risk factors like poverty, housing instability and transportation insecurity can have meaningful impact on healthcare quality measures without compromising quality of care.

In a report published today in Health Affairs, researchers make the case for using social risk factors in specific circumstances to "level the playing field" for adjusting quality measures used in quality reporting and value-based purchasing programs. Social risk adjustment would apply to scenarios in which providers cannot mitigate the impact of social risk factors and when those risk factors directly impact care outcomes.

"We acknowledge the challenges in knowing when social risk adjustment is appropriate, but we have shown when it should be done and not done," said David Nerenz, Ph.D., Director Emeritus of Henry Ford's Center for Policy and Health Services Research and the study's lead author.

"In our analysis we show that adjusting for social risk factors will not necessarily mask or excuse poor quality. Instead, it can demonstrate exceptional levels of quality among safety-net providers."

Adjusting quality measures for social risk factors has been the focus of ongoing debate for the last decade since the Medicare Hospital Readmission Reduction Program (HRRP) was shown to disproportionately penalize safety-net hospitals financially. Safety net hospitals have long held they are unfairly penalized because social risk factors put their patients at higher risk for readmission and are not adjusted for in HRRP.

In 2014, an expert panel convened by the National Quality Forum recommended using social risk factors under certain circumstances. However, the Assistant Secretary of Planning and Evaluation, the principal advisor to the secretary of the U.S. Department of Health and Human Services, issued a report in June 2020 opposing adjustments for social risk factors.

"This has been a contentious issue, but there is growing consensus that social risk matters," said
Karen Joynt Maddox, M.D., MPH, a co-author of the Health Affairs report and Assistant Professor of Medicine in the Cardiovascular Division at Washington University's School of Medicine in Missouri.

"To move towards high-value care that improves the health of populations, equity has to be central rather than an afterthought. We this analysis moves the needle on making that happen."

Dr. Nerenz and Dr. Maddox led a study published in Health Services Research in 2019 that found a risk adjustment model including social factors could reduce the financial penalty for at least half of all safety-net hospitals, which care for patients regardless of their insurance status or ability to pay. In some cases, the study said, the adjustment model could render them free from any penalty.

Conversely, more affluent hospitals - those that care for higher-income, better-educated patients - could see their penalty for readmission rates increase.

Quality measures like mortality rates, readmission rates, complication rates, and average functional improvement are used to compare doctors, hospitals, home health agencies, and health plans. These measures are then applied to determine financial rewards and penalties for providers that perform relatively well or poorly.

Because providers don't treat the same mix of patients, and some patients are at higher or lower risk of poor outcome than others, some form of statistical adjustment is applied to level the playing field for making comparisons, Dr. Nerenz said.

"While clinical risk factors like age, presence of other illnesses, and severity of illness are used routinely in these adjustments, social risk factors have not," Dr. Nerenz said. "At issue is whether the financial penalties reflect true differences in quality of care, or they reflect factors other than quality that are outside of providers' control."

Credit: 
Henry Ford Health