Tech

Building cities with wood would store half of cement industry's current carbon emissions

image: The Aikalava pavilion was built to celebrate Finland's 100th birthday.

Image: 
Photo: Vesa Loikas

Buildings around us create a whopping one-third of global greenhouse gas emissions -- that is about ten times more than air traffic worldwide. In Europe alone about 190 million square metres of housing space are built each year, mainly in the cities, and the amount is growing quickly at the rate of nearly one percent a year.

A recent study by researchers at Aalto University and the Finnish Environment Institute shows that shifting to wood as a construction material would significantly reduce the environmental impact of building construction. The results show that if 80 percent of new residential buildings in Europe were made of wood, and wood were used in the structures, cladding, surfaces, and furnishings of houses, the buildings would together store 55 million tons of carbon dioxide a year. That is equivalent to about 47 percent of the annual emissions of Europe's cement industry.

'This is the first time that the carbon storage potential of wooden building construction has been evaluated on the European level, in different scenarios,' explains Ali Amiri, who is completing his doctorate at Aalto University. 'We hope that our model could be used as a roadmap to increase wooden construction in Europe.'

The study is based on an extensive analysis of the literature. Drawing on 50 case studies, the researchers divided buildings into three groups according to how much wood they use and, as a consequence, how much carbon dioxide they store.

The group with the least amount of wood stored 100 kg of carbon dioxide per square metre, the middle group stored 200 kg, and the group with the greatest amount of wood stored 300 kg per square metre. The potential carbon storage capacity was not generally related to building type, wood type, or even building size; rather, capacity depends on the number and volume of wooden components, from beams and columns to walls and finishings.

The researchers also looked at how Europe could reach this level of storage by modelling a path to 55 million tons per year by 2040. If, say, in 2020, 10 percent of new residential buildings were made of wood, each storing 100 kg of carbon dioxide per square metre, the share of wood-built buildings would need to grow steadily to 80 percent by 2040. At the same time, the scenario demands a shift to wooden buildings that store even more carbon dioxide, with more buildings falling into the 200 kg and, eventually, the 300 kg storage groups.
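
As a rough illustration of this pathway, the adoption share and the storage mix can be combined into an annual storage estimate. The sketch below is a simplification, not the study's model: it assumes a linear ramp for both the wood-built share and the average storage per square metre, applied to the roughly 190 million square metres of new housing cited above.

```python
# Illustrative sketch of the scenario described above (assumptions, not
# the study's model): the wood-built share ramps linearly from 10% in
# 2020 to 80% in 2040, while new wooden buildings gradually shift from
# the 100 kg CO2/m2 storage group toward the 300 kg CO2/m2 group.

NEW_FLOOR_AREA_M2 = 190e6  # new housing built in Europe each year (m2)

def wood_share(year):
    """Share of new residential buildings made of wood (linear ramp)."""
    return min(0.8, 0.1 + (0.8 - 0.1) * (year - 2020) / 20)

def storage_kg_per_m2(year):
    """Average CO2 stored per square metre of new wooden building."""
    return min(300.0, 100.0 + (300.0 - 100.0) * (year - 2020) / 20)

for year in (2020, 2030, 2040):
    kg = NEW_FLOOR_AREA_M2 * wood_share(year) * storage_kg_per_m2(year)
    print(f"{year}: about {kg / 1e9:.0f} million tons of CO2 stored per year")
```

This simplified ramp lands in the same ballpark as the study's 55-million-ton figure for 2040; the study's own model also counts wood in cladding, surfaces, and furnishings, so the exact numbers differ.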

Energy efficiency is the most frequently used instrument for measuring the environmental impact of buildings. However, improving energy efficiency demands more insulation, more efficient heat recovery, and better windows, all of which take energy and materials to produce. In fact, about half of the carbon footprint of zero-energy houses is incurred before anyone has even lived in them.

As the energy used in housing comes increasingly from renewable sources, the construction phase accounts for an even larger share of a building's total environmental impact.

'Certificates for green buildings used around the world, such as LEED and BREEAM, could better take the climate benefits of wood construction into account. So far, they are strongly focused on how energy is consumed during use,' Amiri says.

In terms of wood products, a wooden building provides longer-term storage for carbon than pulp or paper. According to the study findings, a wooden building of 100 m2 has the potential to store 10 to 30 tons of carbon dioxide. The upper range corresponds to an average motorist's carbon dioxide emissions over ten years.

'Wood construction is sustainable only if the wood comes from forests that are grown in a sustainable manner. Shifting from short-lived products, like paper, to products with a long life-cycle, like wooden construction materials, would help minimise the impact on European forests and the crucial carbon sinks they hold,' says postdoctoral researcher Juudit Ottelin.

Credit: 
Aalto University

Cockroaches and lizards inspire new robot developed by Ben-Gurion University researcher

image: A new high-speed amphibious robot inspired by the movements of cockroaches and lizards, developed by Ben-Gurion University of the Negev (BGU) researchers, swims and runs on top of water at high speeds and crawls on difficult terrain.

Image: 
Ben-Gurion University

BEER-SHEVA, Israel...November 2, 2020 - A new high-speed amphibious robot inspired by the movements of cockroaches and lizards, developed by Ben-Gurion University of the Negev (BGU) researchers, swims and runs on top of water at high speeds and crawls on difficult terrain.

The mechanical design of the AmphiSTAR robot and its control system were presented virtually last week at IROS (the International Conference on Intelligent Robots and Systems) by Dr. David Zarrouk, director of the Bioinspired and Medical Robotics Laboratory in BGU's Department of Mechanical Engineering, and graduate student Avi Cohen.

"The AmphiSTAR uses a sprawling mechanism inspired by cockroaches, and it is designed to run on water at high speeds like the basilisk lizard," says Zarrouk. "We envision that AmphiSTAR can be used for agricultural, search and rescue and excavation applications, where both crawling and swimming are required."

The palm-size AmphiSTAR, part of the family of STAR robots developed at the lab, is a wheeled robot fitted underneath with four propellers whose axes can be tilted using the sprawl mechanism. The propellers act as wheels over ground and as fins that propel the robot over water, where it can swim and run at speeds of up to 1.5 m/s. Two air tanks enable it to float and to transition smoothly between planing over water at high speed, swimming at lower speeds, and crawling, and vice versa.

The experimental robot can crawl over gravel, grass and concrete as fast as the original STAR robot and can attain speeds of 3.6 m/s (about 8 mph).

"Our future research will focus on the scalability of the robot and on underwater swimming," Zarrouk says.

This study was supported in part by the BGU Helmsley Charitable Trust through the Agricultural, Biological and Cognitive Robotics Initiative, and by the Marcus Endowment Fund, both at Ben-Gurion University of the Negev. The Marcus legacy gift, of over $480 million, was donated in 2016 to American Associates, Ben-Gurion University of the Negev by Dr. Howard and Lottie Marcus. The donation is the largest gift given to any Israeli university and is believed to be the largest gift to any Israeli institution.

Credit: 
American Associates, Ben-Gurion University of the Negev

A 40-year-old catalyst unveils its secrets

'Titanium silicalite-1' (TS-1) is not a new catalyst: It has been almost 40 years since its development and the discovery of its ability to convert propylene into propylene oxide, an important basic chemical in the chemical industry. Now, by combining various methods, a team of scientists from ETH Zurich, the University of Cologne, the Fritz Haber Institute and BASF has unveiled the surprising mechanism of action of this catalyst. The University of Cologne was involved through the working group of Professor Dr. Albrecht Berkessel at the Department of Chemistry. These findings will help catalyst research take an important step forward.

Propylene oxide is used in industry to make products such as polyurethanes, antifreeze additives and hydraulic fluids. More than 11 million metric tons of propylene oxide are produced annually in the chemical industry worldwide, of which 1 million metric tons are already produced by the oxidation of propylene with hydrogen peroxide. This chemical reaction is catalyzed by TS-1, a microporous, crystalline material made up of silicon and oxygen and containing small amounts of titanium. The catalyst has been used successfully for 40 years and experts assumed that the active centre in TS-1 contains individual, isolated titanium atoms that ensure the special reactivity of the catalyst.

A team of researchers from ETH Zurich, the University of Cologne, the Fritz Haber Institute and BASF questioned this assumption. 'In recent years, doubts have arisen as to whether the assumption about the mechanism of action is correct, as it relies primarily on analogies with comparable catalysts and less on experimental evidence. But if you try to optimize a catalyst on the basis of a wrong assumption, it is very difficult and can lead you in the completely wrong direction. It was therefore important to examine this assumption more closely,' explains BASF scientist Dr. Henrique Teles, one of the co-authors of the publication, describing the starting point for the collaboration.

In a study now published in Nature, the team was able to show, using solid-state NMR studies and computer modelling, that two neighbouring titanium atoms are necessary to explain the particular catalytic activity. This led the research team to conclude that the titanium atoms are not isolated; rather, the catalytically active centre consists of a titanium pair. 'None of the methods we used in the study are fundamentally new, but none of the research groups involved in the study could have carried out the investigation on their own,' emphasizes Prof. Christophe Copéret from ETH Zurich, the corresponding author of the publication. 'Only the combination of different fields of expertise and various techniques made it possible to more closely examine the active center of the catalyst.'

'We have worked for many years to elucidate the reaction mechanism of a homogeneous titanium catalyst and found that - contrary to the assumptions in the literature - the hydrogen peroxide is activated by a titanium pair. It was really a special moment when we saw in the current study that the findings from homogeneous catalysis also apply to heterogeneous catalysis', said co-author Prof. Albrecht Berkessel from the University of Cologne. And Dr. Thomas Lunkenbein, co-author from the Fritz Haber Institute in Berlin, adds: 'We are very pleased that we were able to make a contribution to this study. With our analytics, we were able to substantiate the conclusions. The knowledge of a diatomic active centre is of fundamental importance and opens up new possibilities in catalyst research.'

The team is convinced that the findings of this study will not only help to improve existing catalysts, but also to develop new homogeneous and heterogeneous catalysts.

Credit: 
University of Cologne

Mobile phones help Americans encounter more diverse news

image: Mobile access increases the reach of news sources and audience engagement. It also increases co-exposure to news, diversifying information diets.

Image: 
Sandra González-Bailón/Annenberg School for Communication

In recent years, we've heard a lot about "news bubbles" and "echo chambers," the idea that to validate their own worldviews, liberals read liberal news and conservatives read conservative news. The proliferation of partisan online news sites, the thinking goes, only makes it worse. Numerous studies have supported these ideas. However, they all have one thing in common: They don't take into account the news people read on their mobile devices.

In a new study published in the Proceedings of the National Academy of Sciences, researchers at the University of Pennsylvania's Annenberg School for Communication analyzed the news consumption of tens of thousands of Americans over a five-year period on their desktop computers, tablets, and mobile phones. They found that contrary to the conventional wisdom about segregated news bubbles, mobile devices expose Americans to a much greater variety of news, diversifying the stories that people encounter and exposing them to a breadth of information sources.

In an era when mobile phones are increasingly the gateway to the internet for the majority of people worldwide, this insight into mobile news exposure is an encouraging sign.

"Given our observation that echo chambers are not as significant as many assume they are, our data suggest that Americans might still be willing to hear from the other side," says lead author Tian Yang, an Annenberg School for Communication doctoral candidate. "Also, getting information from a diverse pool of news sources might help reduce the effects of misinformation, a major concern in the current information environment."

The study shows that exposure to news through desktop computers has resulted in people reading from a narrower set of sources over time. But when the analysis takes into account the full range of devices on which people access the internet, including mobile phones, a picture emerges in which Americans encounter a much wider range of news sources, exposure that is largely not dictated by partisan beliefs. On average, the political ideology of news audiences accounted for only 16% of the variation in their exposure to news outlets.

"Our results are counterintuitive because we are taking into account a large percentage of the online population that is often excluded from the data," says senior author Sandra González-Bailón, associate professor of communication at the Annenberg School for Communication. "Examining mobile phone usage totally changes the picture of how people encounter information online."

The researchers theorize that the structure of mobile apps, news aggregators, and social media networks leads people to click on stories on a variety of sites that they might not otherwise visit.

While the data paint a pro-democratic picture of a broadly informed electorate, there was another, less optimistic finding: Half of Americans who are online do not read news on any of their digital devices. While we think about Americans as split between two news bubbles, in fact half of the U.S. population is opting out of online news altogether.

"What this tells us is that the left-right ideological divide may be less important than the divide between those who are informed and those who are not," says González-Bailón. "And this finding matters because news-avoiders are likely to be more vulnerable to misinformation and manipulation, and we have to do more to research that."

Credit: 
University of Pennsylvania

Study provides clues on curbing the aggressive nature of coronavirus

image: Figure 1

Image: 
University of Tartu

A recent study by Estonian researchers at the University of Tartu explains how the coronavirus is activated before attacking the cell and what could help to impede that. The study, published in Scientific Reports, takes us a step closer to understanding why the spread of SARS-CoV-2 has been so rapid and aggressive. The studied virus activation mechanism is also a potential target for developing drugs for the treatment of COVID-19.

Molecular biologists Mihkel Örd and Ilona Faustova from the University of Tartu, together with Professor Mart Loog, compared the cell entry mechanisms of SARS-CoV-2, the cause of the current corona pandemic, with those of other coronaviruses (SARS-CoV-1 and MERS-CoV, the cause of Middle East respiratory syndrome, with outbreaks in 2003 and 2012, respectively). The study focused on the spike protein that forms the spikes we know well from images of the virus. The SARS-CoV-2 spike protein consists of two parts (see Figure 1). The tip of the spike works like a detector, looking for a suitable cell to attach to. In cells, the exchange of information is mediated by various proteins that send and receive chemical signals. Viruses take advantage of these proteins to enter cells.

One protein the SARS-CoV-2 spike protein exploits is the enzyme furin, which is present in many cells of the human body. Furin's task is to cleave proteins, acting like biological scissors. Different enzymes affect different proteins. Among other proteins, furin also cleaves the SARS-CoV-2 spike protein, removing its detector-like tip. After that, the remaining part of the spike protein sets to work, fusing with the cell membrane and allowing the virus to enter the cell.

"Many viruses use a similar logic. They have a receptor domain and a binding domain, but use different enzymes of the host. To some extent, this also affects the way the virus functions in the body. Different enzymes are expressed in different tissues. Furin is expressed throughout the human body, so the range of cells the virus can enter is enormous. We can suppose that this is why the virus is particularly aggressive," explained Mihkel Örd, a doctoral student in biomedical technology at the University of Tartu (UT).

From cell division to taming the virus

In their everyday work, Mihkel Örd and UT Senior Research Fellow in Molecular Biology Ilona Faustova study what affects the division of cells and how. They pay special attention to phosphorylation - a chemical reaction in which a phosphoryl group is attached to a protein. As a result, the original task or activity of the protein may change, which makes it possible to control intracellular processes under certain conditions.

The way enzymes process proteins depends on the patterns, or short linear motifs, of the amino acids in the protein. Proteins are composed of 20 different amino acids, and short linear motifs of amino acids may encode important biological information. When analysing the sequence of the spike protein of the novel coronavirus, the researchers from Tartu soon noticed that the critical site where the spike protein is supposed to be cut in two can contain three motifs, each with a different task. When the spread of coronavirus turned into a pandemic, Örd and Faustova decided to study whether these predicted motifs function in real life and how they can affect the life cycle of the virus.
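
To make the idea of motif scanning concrete, here is a minimal sketch of the kind of pattern search involved. The R-X-X-R pattern is the textbook minimal furin cleavage consensus, and the sequence fragment is the region around the spike's S1/S2 junction; both are illustrative stand-ins, not the study's actual analysis, which also considered the surrounding phosphorylation motifs.

```python
import re

# Minimal furin cleavage consensus: R-X-X-R, cleaved after the final R.
# (Real motif analysis uses position-specific scoring, not a bare regex.)
FURIN_MOTIF = re.compile(r"R..R")

# Fragment around the S1/S2 junction of the SARS-CoV-2 spike protein,
# shown here purely for illustration.
spike_fragment = "QTQTNSPRRARSVASQSII"

for m in FURIN_MOTIF.finditer(spike_fragment):
    print(f"candidate furin site '{m.group()}' ending at position {m.end()}")
    # Cleavage would occur immediately after the final R of the match.
```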

"We ordered DNA sequences of coronavirus, synthesised and separated the proteins from the cells, created mutations to detect furin-sensitive sites and to understand how phosphorylation affects them. There are more than one thousand amino acids in the spike protein, but for this process, only one tiny part is important. We devoted our attention to that site and started to study it in more detail," Örd explained.

The analysis revealed that in the case of one motif, the furin cleavage site was very specifically encoded in the novel coronavirus. "What was surprising was how efficiently furin cleaved the SARS-CoV-2 spike protein motif. In comparison, we analysed the corresponding motifs in the viruses causing MERS and SARS, which are only slightly different: furin failed to cleave them. Another interesting finding was that the cleavage site was surrounded by two other motifs whose phosphorylation completely inhibited the furin cleavage of the spike protein. These findings provide new information about which processes may affect the life cycle of the virus and why the novel virus has been more successful than the previous two," explained Faustova.

"In our research, we have always been guided by the idea that the better we understand different molecular mechanisms, the more possibilities we have to intentionally affect or control them. Since the studied motifs are repeated in different proteins, their description allows making predictions about the tasks and regulation of many proteins. The first published findings already confirm that furin inhibitors curb the replication of SARS-CoV-2 and may have a potential pharmaceutical application," said Faustova.

The corona crisis put the research system to the test

Faustova adds that the corona pandemic has revealed the collaboration capacity of the global research community. "We have studied the encoded enzyme signals of different motifs for years and this allowed us to contribute to understanding the mechanisms of coronavirus based on our expertise. Hundreds of labs across the world have done the same. In the crisis, other projects were halted to jointly focus on the new virus. Over the past nine months, life scientists of the world have made enormous progress. Never before have we managed to join all of our forces and conduct a detailed study of a single critical process. High-quality publications in the field are published every day. This gives us hope that the corona vaccine and drugs may reach us pretty soon."

According to Mart Loog, all this can also bring us to a sobering conclusion: "In the future, the transmission of viruses from birds and animals to humans can happen again and even more often. The corona pandemic showed that many other essential studies (for instance, on cancer) were halted because of one virus. To solve the key problems of humanity, it is crucial to significantly increase the global capacity of life sciences. This requires an exponential increase in funding. With the current resources, we would not be able to solve several simultaneous or consecutive crises." Loog added that bioterrorism, too, is increasingly seen as a growing security risk. Research must always stay one step ahead of terrorists, and for that, we need to invest in it.

Credit: 
Estonian Research Council

Malaria test as simple as a bandage

image: A bandagelike test for malaria features an array of microneedles that collects interstitial fluid from the skin and delivers results on a test strip within minutes.

Image: 
Jeff Fitlow/Rice University

HOUSTON - (Nov. 2, 2020) - Testing for malaria could become as simple as putting on a bandage.

That's the idea behind a platform developed by Rice University engineers who introduced a microneedle patch for rapid diagnostic testing that does not require extracting blood.

The device, detailed in the Nature journal Microsystems and Nanoengineering, draws upon protein biomarkers in dermal interstitial fluid, which people generally recognize as the fluid inside blisters but which surrounds all of the cells in the skin.

This fluid contains a multitude of biomarkers for various diseases, such as malaria, which can be used for rapid testing. The disposable patches could be programmed to detect other diseases, potentially including COVID-19, said mechanical engineer Peter Lillehoj of Rice's Brown School of Engineering.

"In this paper, we focus on malaria detection because this project was funded by the Bill and Melinda Gates Foundation, and it's a big priority for them," said Lillehoj, who joined Rice in January as an associate professor of mechanical engineering. "But we can adapt this technology to detect other diseases for which biomarkers appear in interstitial fluid."

The self-contained test developed by Lillehoj and lead author Xue Jiang, a Rice postdoctoral researcher, delivers a result in about 20 minutes and does not require medical expertise or any equipment.

The sticky patch has 16 hollow microneedles in a 4-by-4 array on one side, coupled with an antibody-based lateral-flow test strip on the other. The antibodies react when they sense protein biomarkers for malaria and turn two readout lines on the strip's exposed surface red. If the test is negative, only one line turns red.

The needles are treated to be hydrophilic -- that is, attracted to water -- so the fluid is drawn in and flows through to the test strip. Once the test is complete, the device can be removed like any bandage.

While both microneedles and antibody test strips have been extensively studied, Lillehoj said his lab is the first to combine them into a simple, inexpensive package that will be easy to deploy at the point of need, especially in developing regions where finger-prick blood sampling and access to trained medical personnel for diagnosing samples can be challenging.

The hollow needles are 375 microns wide and 750 microns long, enough to reach the interstitial fluid within skin that is typically between 800 and 1,000 microns thick. The needles are sharp enough to withstand the mechanical stress of entering the skin.

"Xue and I have applied the patch to our skin, and it doesn't feel painful at all compared to a finger prick or a blood draw," Lillehoj said. "It's less painful than getting a splinter. I would say it feels like putting tape on your skin and then peeling it off."

They think the familiar form factor may provide some comfort, especially to children.

"We didn't intend for it to look like a bandage," he said. "We started with a rectangular shape and then just rounded the edges to make it a little more presentable. We didn't plan for that, but perhaps it makes the patch more relatable to the general public."

He estimated individual patches could cost about $1 if and when they are produced in bulk.

Credit: 
Rice University

To predict how crops cope with changing climate, 30 years of experiments simulate future

image: A review from the University of Illinois synthesizes 30 years of data from 14 facilities worldwide that simulate future climate conditions, finding that rising carbon dioxide could be counterproductive for crops.

Pictured: The SoyFACE 'Free-Air Concentration Enrichment' (FACE) facility at the University of Illinois simulates rising carbon dioxide and temperature as well as changing rainfall patterns.

Image: 
University of Illinois

Five years ago, the United Nations committed to achieving the Sustainable Development Goal of Zero Hunger by 2030. Since then, however, world hunger has continued to rise. Nearly 9 percent of our global population is now undernourished, according to a 2020 report from the FAO, and climate variability is a leading factor driving us off course.

Over the past 30 years, a network of 14 long-term research facilities spanning five continents has simulated future levels of carbon dioxide (CO2) to forecast the impact on crops. Importantly, these 'Free-Air Concentration Enrichment' (FACE) experiments are conducted outside in real-world field conditions to capture the complex environmental factors that impact crop growth and yield.

Today, a review published in Global Change Biology synthesizes 30 years of FACE data to assess how global crop production may be affected by rising CO2 levels and other factors. The study portends a less optimistic future than the authors' previous review published 15 years ago in New Phytologist. "There are likely genetic solutions, should society decide to act on these--however, time is short," said co-author Stephen Long, Ikenberry Endowed University Chair of Crop Sciences and Plant Biology at the University of Illinois.

"It's quite shocking to go back and look at just how much CO2 concentrations have increased over the lifetime of these experiments," said co-author Lisa Ainsworth, a research plant physiologist with the U.S. Department of Agriculture, Agricultural Research Service (USDA-ARS). "We are reaching the concentrations of some of the first CO2 treatments 30 years back. The idea that we can check the results of some of the first FACE experiments in the current atmosphere is disconcerting."

The reviews consider two groups of plants: most crops are C3 (including soybean, cassava, and rice), which are less efficient at turning CO2 and light into energy through the process of photosynthesis. C4 plants, such as corn and sugarcane, supercharge photosynthesis by using some of the light energy they receive to concentrate CO2 within their leaves, making them up to 60 percent more efficient.

In C3 crops, oxygen inhibits photosynthesis; this inhibition is diminished by increasing the concentration of CO2. That is why many commercial greenhouse growers raise CO2 levels to boost photosynthesis and the yields of tomatoes, peppers, cucumbers, and other greenhouse crops. So will the increase in atmospheric CO2--which we are causing through our use of fossil fuels and deforestation--have the same effect?

Results

As in their previous review, but this time with ten times more studies, the authors show that elevation of CO2 to levels expected for the second half of this century could increase C3 crops' yields by 18% with adequate nutrients and water.

"So should we anticipate a bounty as CO2 rises?" said Long, who is a member of the Carl R. Woese Institute for Genomic Biology. "Sadly not because rising CO2 is the primary cause of change in the global climate system. The anticipated 2° C rise in temperature, caused primarily by this increase in CO2, could halve yields of some of our major crops, wiping out any gain from CO2."

While CO2 increased yields, it also caused important quality losses: many crops showed lower mineral nutrient and protein contents. The yield response is also much smaller under low nitrogen fertilization, the situation for many farmers in the world's poorer countries. Alarmingly, what has become apparent since the first review is that our major food crops become considerably more vulnerable to pests and diseases at higher CO2.

"Lots of people have presumed that rising CO2 is largely a good thing for crops: assuming more CO2 will make the world's forests greener and increase crop yields," Ainsworth said. "The more recent studies challenge that assumption a bit. We're finding that when you have other stresses, you don't always get a benefit of elevated CO2."

On a more positive note, the authors show that there is sufficient genetic variation within our major crops to overcome some of these negative effects and capitalize on the yield benefit of higher CO2. "Where genetic variation is lacking, there are some bioengineering solutions with one already demonstrated to prevent yield loss when the temperature is raised with CO2," Long said. "But, given the time taken to develop new crop cultivars, this potential could only be realized if we start now."

Future Research

"We are driven by a motivation to prepare for the future and to identify the traits that are going to be important for maximizing this CO2 response while dealing with the aspects of global change that may drive down yields," Ainsworth said. "The last 15 years have taught us to account more for the complex interactions from other factors like drought, temperature, nutrients, and pests."

Researchers should explore a wider variety of crops and genotypes as well as different management practices, such as seeding density, tillage, and cover crops, to find other solutions that are less burdensome on the environment, Ainsworth said.

The FACE community also needs to make all of its experimental results more accessible.

"We don't have a formal database of all of the FACE results from the last two decades of research," Ainsworth said. "There's an opportunity to put all of the information together in one place and make it openly accessible for everyone to use and to encourage more people to use the data to think about solutions."

"The ideal solution will be that we dramatically decrease our release of CO2 into the atmosphere and quickly achieve carbon neutrality," Long said. "But we also need to take out an insurance policy against this not being achieved. That is, we need to breed and engineer future-proof crops and systems that can be sustainable and nutritious under the combined changes in atmospheric composition and climate to help realize the goal of zero hunger."

Credit: 
Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Printing plastic webs to protect the cellphone screens of the future

image: Polycarbonate webs synthesized using additive manufacturing absorb up to 96% of impact energy.

Image: 
Shibo Zou

Follow the unbreakable bouncing phone! A Polytechnique Montréal team recently demonstrated that a fabric designed using additive manufacturing absorbs up to 96% of impact energy - all without breaking. The journal Cell Reports Physical Science recently published an article detailing this innovation, which paves the way for the creation of unbreakable plastic coverings.

The concept and accompanying research revealed in the article are relatively simple. Professors Frédérick Gosselin and Daniel Therriault from Polytechnique Montréal's Department of Mechanical Engineering, along with doctoral student Shibo Zou, wanted to demonstrate how plastic webbing could be incorporated into a glass pane to prevent it from shattering on impact.

It seems a simple enough concept, but further reflection reveals that there's nothing simple about this plastic web.

The researchers' design was inspired by spider webs and their amazing properties. "A spider web can resist the impact of an insect colliding with it, due to its capacity to deform via sacrificial links at the molecular level, within silk proteins themselves," Professor Gosselin explains. "We were inspired by this property in our approach."

Biomimicry via 3D printing

The researchers used polycarbonate to achieve their results; when heated, polycarbonate becomes viscous, like honey. Using a 3D printer, Professor Gosselin's team harnessed this property to "weave" a series of fibres less than 2 mm thick, then repeated the process, printing a new series of fibres perpendicular to the first, working quickly before the entire web solidified.

It turns out that the magic is in the process itself - that's where the final product acquires its key properties.

As it's slowly extruded by the 3D printer to form a fibre, the molten plastic creates circles that ultimately form a series of loops. "Once hardened, these loops turn into sacrificial links that give the fibre additional strength. When impact occurs, those sacrificial links absorb energy and break to maintain the fibre's overall integrity - similar to silk proteins," researcher Gosselin explains.

In an article published in 2015, Professor Gosselin's team demonstrated the principles behind the manufacturing of these fibres. The latest Cell Reports Physical Science article reveals how these fibres behave when intertwined to take the shape of a web.

Study lead author Shibo Zou used the opportunity to illustrate how such a web could behave inside a protective screen. After embedding a series of webs in transparent resin plates, he conducted impact tests. The result? The plastic wafers dispersed up to 96% of the impact energy without breaking. Instead of cracking, they deformed in certain places, preserving the wafers' overall integrity.

According to Professor Gosselin, this nature-inspired innovation could lead to the manufacture of a new type of bullet-proof glass or to the production of more durable protective plastic smartphone screens. "It could also be used in aeronautics as a protective coating for aircraft engines," Professor Gosselin notes. In the meantime, he certainly intends to explore the possibilities that this approach may open up.

Credit: 
Polytechnique Montréal

UNH research: Longer mud season, no snow could alter northeast rivers by 2100

DURHAM, N.H. - As temperatures begin to drop and fall transitions into winter, snow will soon blanket the northern regions of the United States. But researchers at the University of New Hampshire have found that snow cover is on the decline in this area due to climate change, and the shift from winter to spring, known as the vernal window, is getting longer. By the end of the century, the scientists say, the vernal window, sometimes referred to in the northeast as mud season, could be two to four weeks longer, meaning significantly less melting snow, which could be detrimental to key spring conditions in rivers and surrounding ecosystems.

"We found that climate change could alter the vernal window so much that by the year 2100, 59% of northeastern North America - which goes from Maine to Virginia - would not accumulate any snow," said Danielle Grogan research scientist in UNH's Earth Systems Research Center and lead author. "Historically, an average of 27% of the northeast goes without snow but by the end of century states like Connecticut and Pennsylvania could be snow free."

In their study, recently published in the journal Environmental Research Letters, the researchers looked at a variety of ecological factors that determine the length of the vernal window, which scientists define as the transition time from winter to spring when there isn't any snowpack or forest canopy. They focused on three key vernal window events: snow disappearance, spring runoff, and budburst - the appearance of buds on plants that signals the end of the vernal window. Climate datasets projecting future temperature and precipitation were used to drive simulation models assessing the shift in both the opening and closing of the vernal window, as well as the effect on rivers and surrounding ecosystems, over the period from 1980 to 2099.

"Snow melt is a major event for rivers and forests in the northeast," said Grogan. "It moves nutrients from the land to the rivers, boosts water levels and triggers essential spring happenings like the migration of fish. Losing the snow and the melt would change ecosystems on many levels and remove key signals that would disrupt natural patterns like fish mating."

Previous studies have examined how climate change will alter the vernal window, but few have explored the impact on rivers and surrounding areas during this transitional period. The researchers say that a longer vernal window and diminished snow melt could make conditions in this area similar to those in southern snow-free regions - a fundamental change in the hydrologic character of the northeast.

Credit: 
University of New Hampshire

Tunable THz radiation from 3D topological insulator

image: Generation of elliptically and circularly polarized terahertz beams, from Haihui Zhao et al., doi 10.1117/1.AP.2.6.066003

Image: 
Haihui Zhao et al., doi 10.1117/1.AP.2.6.066003

Terahertz (THz) waves, located between the millimeter-wave and far-infrared frequency ranges, form an electromagnetic band that is as yet incompletely explored and understood. Xiaojun Wu of Beihang University leads a group of researchers actively seeking ways to understand, generate, and control THz radiation. Wu notes that THz waves have great potential for expanding real applications--from imaging to information encryption--but the development of THz science and technology has been hindered by a lack of sufficiently efficient sources.

Wu's research group has been investigating a three-dimensional topological insulator of bismuth telluride (Bi2Te3) as a promising basis for an effective THz system. They recently systematically investigated THz radiation from Bi2Te3 nanofilms driven by femtosecond laser pulses. Their report published in Advanced Photonics demonstrates efficient generation of chiral THz waves with an arbitrarily adjustable polarization state that allows control of chirality, ellipticity, and principal axis.

According to Wu, bismuth telluride is a great candidate for future on-chip topological insulator-based terahertz systems; it has already exhibited excellent prospects in THz emission, detection, and modulation. The well-studied topological insulator presents a special spin-momentum locked surface state, which can also be accurately adjusted by various factors such as the number of atomic layers. Wu explains that this kind of THz source can efficiently radiate linearly and circularly polarized THz waves, with adjustable chirality and polarization. This will enable the development of THz science and applications in such areas as ultrafast THz opto-spintronics, polarization-based THz spectroscopy and imaging, THz biosensing, line-of-sight THz communications, and information encryption.

Generation and manipulation of linearly polarized THz waves

Wu's group systematically investigated the THz radiation from topological insulator Bi2Te3 nanofilms driven by femtosecond laser pulses. They found that the linearly polarized THz wave originates from the shift current formed by the ultrafast redistribution of electron density between the Bi and Te atoms after the topological insulator is excited by linearly polarized pump light. The ultrafast shift current contributes to the linearly polarized THz radiation. Due to the lattice characteristics of Bi2Te3, the radiated THz waves are always linearly polarized, with the polarization angle following a three-fold rotational symmetry that depends on the sample azimuthal angle. This reliability makes it very convenient to arbitrarily manipulate the THz polarization angle by controlling the polarization direction of the incident laser.

Generation and manipulation of circularly polarized THz waves

Wu explains that, in order to produce circularly polarized THz pulses, it was necessary to simultaneously tune the pump laser polarization and the sample azimuthal angle. When the sample azimuthal angle was fixed, the researchers also obtained elliptical THz beams with various ellipticities and principal axes, due to the combination of a linear photogalvanic effect (LPGE) and a circular photogalvanic effect (CPGE) and the intrinsic time delay between the LPGE-driven and CPGE-driven THz electric field components. As expected, they were able to manipulate the chirality of the emitted THz waves by varying the incident laser helicity.
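
The underlying geometry is generic: two orthogonal field components of different amplitudes, offset by a time (phase) delay, trace out an ellipse whose principal axis and ellipticity depend on that delay. A minimal numerical sketch, with made-up amplitudes and delay rather than the paper's measured values:

```python
import numpy as np

# Two orthogonal THz field components, standing in for the LPGE- and
# CPGE-driven contributions, with a phase delay between them.
Ex0, Ey0 = 1.0, 0.6   # relative amplitudes (made up)
delta = np.pi / 3     # phase delay between the components (made up)

# Stokes parameters give the orientation and ellipticity of the
# polarization ellipse traced by (Ex0*cos(t), Ey0*cos(t + delta)).
S0 = Ex0**2 + Ey0**2
S1 = Ex0**2 - Ey0**2
S2 = 2 * Ex0 * Ey0 * np.cos(delta)
S3 = 2 * Ex0 * Ey0 * np.sin(delta)

psi = 0.5 * np.degrees(np.arctan2(S2, S1))   # principal-axis angle
chi = 0.5 * np.degrees(np.arcsin(S3 / S0))   # ellipticity angle
print(f"principal axis: {psi:.1f} deg, ellipticity angle: {chi:.1f} deg")
# delta = 0 gives linear polarization; delta = pi/2 with equal amplitudes
# gives circular polarization (chi = +/- 45 deg).
```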

Wu explains, "Helicity-dependent current is the critical reason why we can obtain spin-polarized THz pulses because we can continuously tune the magnitude and polarity of it by changing the helicity." Specific discussion of the implementation and control of circularly polarized THz radiation is included in their paper.

The authors are optimistic that their work will help further collective understanding of femtosecond coherent control of ultrafast spin currents in light-matter interaction and will also provide an effective way to generate spin-polarized THz waves. Wu notes that the manipulation of polarization is a step toward the goal of tailoring twisted THz waves efficiently at the source.

Credit: 
SPIE--International Society for Optics and Photonics

Excessive alcohol consumption during the COVID-19 pandemic

image: Researchers at McLean warn of increases in alcohol consumption during the pandemic and encourage more screening and support for alcohol addiction

Image: 
McLean Hospital

Alcohol consumption is a common coping response to stress, and historically it has increased in the United States following catastrophic events, such as terrorist attacks and large-scale natural disasters. With COVID-19 in mind, experts at McLean Hospital have published a viewpoint article in the Journal of General Internal Medicine that examines potential ways to moderate and reduce rising alcohol consumption in the face of the pandemic.

Because the COVID-19 pandemic is longer lasting and more extensive than previous traumatic events--with widespread social disruption and isolation, limited social support and access to medical care, and negative domestic and global economic impacts--it could have an even greater effect on population-wide alcohol use.

"We hope this article will call attention to the pandemic's effects on alcohol use and offer mitigating approaches to this under-recognized public health concern," said co-author Dawn E. Sugarman, PhD, a research psychologist in the Center of Excellence in Alcohol, Drugs, and Addiction at McLean Hospital.

The article stresses that public health messages should include education about managing stress and anxiety without using alcohol, drinking within safe limits during physical distancing and social isolation, and knowing when an individual ought to be concerned about themselves or someone else.

The authors also call for greater efforts to screen for alcohol use disorders during primary care visits and to provide treatments for individuals at risk for relapse or exacerbation of heavy drinking. Telehealth services that use mobile and online programs may help provide access to such care. Ensuring adequate insurance for treatment will be essential with the added concern that many individuals have lost their employer-based health insurance and may have reduced access to health care and addiction treatment programs.

"Increasing identification of harmful alcohol use in patients and intervening early are key components of addressing this problem. In addition, recognition of the problem from policymakers could lead to changes in federal regulations--such as we have seen with telehealth--and improvements in access to health care," said co-author Shelly F. Greenfield, MD, MPH, director of the Alcohol, Drug, and Addiction Clinical and Health Services Research Program at McLean Hospital.

Sugarman and Greenfield note that the full impact of COVID-19 on alcohol use is not yet known, but rising rates during the first few months of the pandemic point to the urgent need for effective public health and medical responses.

Credit: 
McLean Hospital

Microfluidics helps MTU engineers watch viral infection in real time

image: Before they burst, infected cell membranes form structures called blebs, which change the electric signal measured in a microfluidic device.

Image: 
Phil Mcleod/Michigan Tech

Get your popcorn. Engineers and virologists have a new way to watch viral infection go down.

The technique uses microfluidics -- the submillimeter control of fluids within a precise, geometric structure. On what is basically a tricked-out microscope slide, chemical engineers from Michigan Technological University have been able to manipulate viruses in a microfluidic device using electric fields. The study, published this summer in Langmuir, looks at changes in the cell membrane and gives researchers a clearer idea of how antivirals work in a cell to stop the spread of infection.

Viruses carry around an outer shell of proteins called a capsid. The proteins act like a lockpick, attaching to and prying open a cell's membrane. The virus then hijacks the cell's inner workings, forcing it to mass produce the virus's genetic material and construct many, many viral replicas. Much like popcorn kernels pushing away the lid of an overfilled pot, the new viruses explode through the cell wall. And the cycle continues with more virus lockpicks on the loose.

"When you look at traditional techniques -- fluorescent labeling for different stages, imaging, checking viability -- the point is to know when the membrane is compromised," said Adrienne Minerick, study co-author, dean of the College of Computing and a professor of chemical engineering. "The problem is that these techniques are an indirect measure. Our tools look at charge distribution, so it's heavily focused on what's happening between the cell membrane and virus surface. We discovered with greater resolution when the virus actually goes into the cell."

Watching the viral infection cycle and monitoring its stages is crucial for developing new antiviral drugs and gaining a better understanding of how a virus spreads. Dielectrophoresis occurs when polarizable particles, such as cells, are pushed around in a nonuniform electric field. The movement of these cells is handy for diagnosing diseases, blood typing, studying cancer and many other biomedical applications. When applied to studying viral infection, it's important to note that viruses have a surface charge, so within the confined space of a microfluidic device, dielectrophoresis reveals the electric conversation between the virus capsid and the proteins of a cell membrane.
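
For reference, the standard textbook expression for the time-averaged DEP force on a spherical particle is F = 2π ε_m r³ Re[K(ω)] ∇|E_rms|², where K is the Clausius-Mossotti factor. The sketch below evaluates it with illustrative values; the parameters are generic stand-ins, not numbers from the Langmuir study.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def clausius_mossotti(eps_p, sigma_p, eps_m, sigma_m, freq_hz):
    """Complex Clausius-Mossotti factor K(w) for a sphere in a medium."""
    w = 2 * np.pi * freq_hz
    ep = eps_p - 1j * sigma_p / w  # complex permittivity, particle
    em = eps_m - 1j * sigma_m / w  # complex permittivity, medium
    return (ep - em) / (ep + 2 * em)

# Illustrative values only (not from the study):
eps_m, sigma_m = 78 * EPS0, 1e-2   # aqueous medium
eps_p, sigma_p = 60 * EPS0, 1e-1   # polarizable particle (e.g., a cell)
r = 5e-6                           # particle radius, m
grad_E2 = 1e13                     # gradient of |E_rms|^2, V^2/m^3

K = clausius_mossotti(eps_p, sigma_p, eps_m, sigma_m, freq_hz=1e6)
F = 2 * np.pi * eps_m * r**3 * K.real * grad_E2  # time-averaged force, N
direction = "toward" if K.real > 0 else "away from"
print(f"Re[K] = {K.real:+.2f} -> F_DEP = {F:.2e} N ({direction} high-field regions)")
```

The sign and magnitude of Re[K] shift as the electrical properties at the particle's surface change relative to the medium, which is why the measured signal can track events at the virus capsid and cell membrane.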

"We studied the interaction between the virus and cell in relation to time using microfluidic devices," said Sanaz Habibi, who led the study as a doctoral student in chemical engineering at Michigan Tech. "We showed we could see time-dependent virus-cell interactions in the electric field."

Watching a viral infection happen in real time is like a cross between a zombie horror film, paint drying and a Bollywood epic on repeat. The cells in the microfluidic device dance around, shifting into distinct patterns with a dielectric music cue. There needs to be the right ratio of virus to cells to watch infection happen -- and it doesn't happen quickly. Habibi's experiment runs in 10-hour shifts, following the opening scenes of viral attachment, a long interlude of intrusion, and eventually the tragic finale when the new viruses burst out, destroying the cell in the process.

Before they burst, cell membranes form structures called blebs, which change the electric signal measured in the microfluidic device. That means the dielectrophoresis measurements grant high-resolution understanding of the electric shifts happening at the surface of the cell through the whole cycle.

Viral infections are top of mind right now, but not all viruses are the same. While microfluidic devices that use dielectrophoresis could one day be used for on-site, quick testing for viral diseases like COVID-19, the Michigan Tech team focused on a well-known and closely studied virus, the porcine parvovirus (PPV), which infects kidney cells in pigs.

But then the team wanted to push the envelope: They added the osmolyte glycine, an important intervention their collaborators study in viral surface chemistry and vaccine development.

"Using our system, we could show time-dependent behavior of the virus and cell membrane. Then we added the osmolyte, which can act as an antiviral compound," Habibi explained. "We thought it would stop the interaction. Instead, it looked like the interaction continued to happen at first, but then the new viruses couldn't get out of the cell."

That's because glycine likely interrupts formation of new capsids for the replicated viruses within the cell itself. While that portion of the viral dance happens behind the curtain of the cell wall, the dielectric measurements show a shift between an infected cell where capsid formation happens and an infected cell where capsid formation is interrupted by glycine. This difference in electrical charge indicates that glycine prevents the new viruses from forming capsids and stops the would-be viral lockpickers from hitting their targets.

"When you are working with such small particles and organisms, when you're able to see this process happening in real time, it's rewarding to track those changes," Habibi said.

This new view of the interactions between virus capsids and cell membranes could speed up testing and characterizing viruses, cutting out expensive and time-consuming imaging technology. Perhaps in a future pandemic, there will be point-of-care, handheld devices to diagnose viral infections and we can hope medical labs will be outfitted with other microfluidic devices that can quickly screen and reveal the most effective antiviral medications.

Credit: 
Michigan Technological University

NIST researchers advance efforts to accurately measure glyphosate pesticide in oats

image: NIST researcher Jacolin Murray records the mass of an oat flour sample.

Image: 
J. Murray/NIST

Pesticides help farmers increase food production, reduce costly damage to crops, and even prevent the spread of insect-borne diseases, but since the chemicals can also end up in human food, it's essential to ensure that they are safe. For a commonly used pesticide known as glyphosate, concerns exist over how high a level is safe in food as well as the safety of one of its byproducts, known as AMPA. Researchers at the National Institute of Standards and Technology (NIST) are advancing efforts to measure glyphosate and AMPA accurately in the oat-based food products where they frequently appear by developing reference materials.

The Environmental Protection Agency (EPA) establishes tolerances for pesticide levels in food that are still considered safe for consumption. Food manufacturers test their products to make sure they meet EPA regulations. But in order to make sure their measurements are accurate, they need a reference material (RM) with known levels of glyphosate against which to compare their products.

There is no reference material available for measuring glyphosate, the active ingredient in the commercial product Roundup, in the oatmeal or oat-based products in which the pesticide is heavily used. However, there are a small number of food-based RMs available for measuring other pesticides. In efforts to develop one for glyphosate and meet the immediate needs of the manufacturers, NIST researchers have optimized a test method to analyze glyphosate in 13 commercially available oat-based food samples to identify candidate reference materials. They detected glyphosate in all the samples, and they also found AMPA (short for aminomethylphosphonic acid) in three of the samples.

The researchers have published their findings in the journal Food Chemistry.

For decades, glyphosate has been one of the most dominant pesticides in the United States and worldwide. In 2014 alone, 125,384 metric tons of glyphosate were used in the U.S., according to a 2016 study. It is an herbicide, a type of pesticide for destroying weeds or unwanted plants that are detrimental to crops.

Sometimes pesticides remain in small amounts, known as residues, on food produce. In the case of glyphosate, it can also break down into AMPA, which can also remain on fruit, vegetables and grains. The potential effects of AMPA on human health are not well understood and remain an active area of study. Glyphosate is also heavily used on other cereals and grains such as barley and wheat, but oats are a special case.

"Oats are unique, as grains go," said NIST researcher Jacolin Murray. "We chose oats as our first material because food producers use the glyphosate as a desiccant to dry out the crop before they harvest it. Oats tend to have a high amount of glyphosate." Crop desiccation allows for an earlier harvest and improves uniformity of crops. Because of its wide use, glyphosate is typically found at higher levels compared with other pesticides, according to co-author Justine Cruz.

The 13 oat samples in the study included oatmeal, slightly to highly processed oat-based breakfast cereals, and oat flour from conventional and organic farming practices.

The researchers analyzed the samples for glyphosate and AMPA using a modified method for extracting glyphosate from solid food, in conjunction with the standard techniques of liquid chromatography and mass spectrometry. In the first step, the solid sample is dissolved into a liquid mixture and the glyphosate is extracted from the food. Next, liquid chromatography separates the glyphosate and AMPA in the extract from other components of the sample. Finally, mass spectrometry measures the mass-to-charge ratio of ions to identify the different chemical compounds in the sample.

Their results showed that the lowest levels of glyphosate were detected in the organic breakfast cereal sample (26 nanograms per gram) and organic oat flour sample (11 nanograms per gram). The highest levels of glyphosate (1,100 nanograms per gram) were detected in conventional instant oatmeal samples. AMPA levels were much lower than glyphosate levels in both organic and conventional oatmeal and oat-based samples.

All glyphosate and AMPA levels in the oatmeal and oat-based cereals were well below the EPA tolerance of 30 micrograms per gram. "The highest glyphosate levels we measured were 30 times lower than the regulatory limit," said Murray.
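
The comparison behind that quote is a one-line unit conversion, sketched here with the figures reported above:

```python
# Arithmetic behind the "30 times lower" comparison quoted above.
epa_tolerance_ug_per_g = 30    # EPA tolerance: 30 micrograms per gram
highest_ng_per_g = 1100        # highest level found (instant oatmeal)

highest_ug_per_g = highest_ng_per_g / 1000   # 1 microgram = 1,000 nanograms
margin = epa_tolerance_ug_per_g / highest_ug_per_g
print(f"highest level: {highest_ug_per_g} ug/g, "
      f"about {margin:.0f}x below the tolerance")
# -> highest level: 1.1 ug/g, about 27x below the tolerance
```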

Based on the results of this study and initial discussions with stakeholders interested in using an RM for oatmeal and oat-based cereals, the researchers concluded that it might be beneficial to develop a low-level RM (50 nanograms per gram) and a high-level one (500 nanograms per gram). These RMs would be useful to agricultural and food testing labs as well as to food producers, who need to test their source material for pesticide residues and to have an accurate standard against which to compare their measurements.

NIST's RMs are used not just in the United States but also worldwide, so it was important for researchers also to consider the regulatory limits abroad, such as in Europe, where the limit is 20 micrograms per gram.

"Our researchers have to balance the needs of food testing labs based in the U.S. and beyond to make reference materials with a global reach," said NIST researcher Katrice Lippa.

The researchers were able to identify three potential RM candidates for glyphosate in oat-based cereals and two candidates for AMPA. They also conducted a preliminary stability study showing that glyphosate was stable in oats over a six-month period at a constant temperature of 40 degrees Celsius, an important consideration for developing a future RM, which could potentially be based on one or more of these products.

Next, the researchers plan to evaluate the feasibility of the RMs through an interlaboratory study and then conduct more long-term stability studies of both glyphosate and AMPA in their materials. The NIST team will continue to engage stakeholders to make sure that the RM will meet their needs.

Credit: 
National Institute of Standards and Technology (NIST)

Covid-19 "super-spreading" events play outsized role in overall disease transmission

CAMBRIDGE, MA -- There have been many documented cases of Covid-19 "super-spreading" events, in which one person infected with the SARS-CoV-2 virus infects many other people. But how much of a role do these events play in the overall spread of the disease? A new study from MIT suggests that they have a much larger impact than expected.

The study of about 60 super-spreading events shows that events in which one person infects more than six other people are much more common than would be expected if transmission rates followed the statistical distributions commonly used in epidemiology.

Based on their findings, the researchers also developed a mathematical model of Covid-19 transmission, which they used to show that limiting gatherings to 10 or fewer people could significantly reduce the number of super-spreading events and lower the overall number of infections.

"Super-spreading events are likely more important than most of us had initially realized. Even though they are extreme events, they are probable and thus are likely occurring at a higher frequency than we thought. If we can control the super-spreading events, we have a much greater chance of getting this pandemic under control," says James Collins, the Termeer Professor of Medical Engineering and Science in MIT's Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering and the senior author of the new study.

MIT postdoc Felix Wong is the lead author of the paper, which appears this week in the Proceedings of the National Academy of Sciences.

Extreme events

For the SARS-CoV-2 virus, the "basic reproduction number" is around 3, meaning that on average, each person infected with the virus will spread it to about three other people. However, this number varies widely from person to person. Some individuals don't spread the disease to anyone else, while "super-spreaders" can infect dozens of people. Wong and Collins set out to analyze the statistics of these super-spreading events.

"We figured that an analysis that's rooted in looking at super-spreading events and how they happened in the past can inform how we should propose strategies of dealing with, and better controlling, the outbreak," Wong says.

The researchers defined super-spreaders as individuals who passed the virus to more than six other people. Using this definition, they identified 45 super-spreading events from the current SARS-CoV-2 pandemic and 15 additional events from the 2003 SARS-CoV outbreak, all documented in scientific journal articles. During most of these events, between 10 and 55 people were infected, but two of them, both from the 2003 outbreak, involved more than 100 people.

Given commonly used statistical distributions in which the typical patient infects three others, events in which the disease spreads to dozens of people would be considered very unlikely. For instance, a normal distribution would resemble a bell curve with a peak around three and a rapidly tapering tail in both directions. In this scenario, the probability of an extreme event declines exponentially as the number of infections moves farther from the average of three.
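To make that intuition concrete, here is a minimal sketch, assuming a Poisson distribution with mean 3 as the thin-tailed model (one common choice in epidemiology; the exact reference distributions the authors compared against are not specified here). Under such a model, infecting more than six people is already uncommon, and infecting twenty is essentially impossible:

```python
from scipy.stats import poisson

mean_secondary_infections = 3  # basic reproduction number of roughly 3

# Probability that one infected person infects more than six others
print(poisson.sf(6, mean_secondary_infections))   # ~0.034

# Probability that one infected person infects 20 or more others
print(poisson.sf(19, mean_secondary_infections))  # ~8e-11, vanishingly small
```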

However, the MIT team found that this was not the case for coronavirus super-spreading events. To perform their analysis, the researchers used mathematical tools from the field of extreme value theory, which quantifies the risk of so-called "fat-tail" events: situations in which extreme outcomes form a heavy tail rather than a rapidly tapering one. The theory is often applied in fields such as finance and insurance to model the risk of extreme events, and it is also used to model the frequency of catastrophic weather events such as tornadoes.
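The qualitative difference is easy to see by comparing survival functions, i.e. the probability of exceeding a given number of infections. In the sketch below, an exponential distribution stands in for the thin-tailed case and a generalized Pareto distribution for the fat-tailed case; the scale and shape parameters are made-up values for illustration, not fitted values from the paper:

```python
from scipy.stats import expon, genpareto

scale = 3    # both distributions calibrated to a scale of about 3
shape = 0.5  # hypothetical Pareto tail index; shape > 0 means a fat tail

for n in (10, 30, 100):
    thin = expon.sf(n, scale=scale)              # decays exponentially
    fat = genpareto.sf(n, c=shape, scale=scale)  # decays like a power law
    print(f"P(infections > {n:3d}): thin = {thin:.1e}, fat = {fat:.1e}")
```

The exponential tail collapses from about 4e-2 at 10 infections to about 3e-15 at 100, while the Pareto tail falls only from about 1e-1 to 3e-3, which is the slow decay Wong describes.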

Using these mathematical tools, the researchers found that the distribution of coronavirus transmissions has a large tail, implying that even though super-spreading events are extreme, they are still likely to occur.

"This means that the probability of extreme events decays more slowly than one would have expected," Wong says. "These really large super-spreading events, with between 10 and 100 people infected, are much more common than we had anticipated."

Stopping the spread

Many factors may contribute to making someone a super-spreader, including their viral load and other biological factors. The researchers did not address those in this study, but they did model the role of connectivity, defined as the number of people that an infected person comes into contact with.

To study the effects of connectivity, the researchers created and compared two mathematical network models of disease transmission. In each model, the average number of contacts per person was 10. However, they designed one model to have an exponentially declining distribution of contacts, while the other model had a fat tail in which some people had many contacts. In that model, many more people became infected through super-spreader events. Transmission stopped, however, when people with more than 10 contacts were taken out of the network and assumed to be unable to catch the virus.
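For readers who want to experiment, here is a minimal sketch of this kind of comparison. It is not the authors' model: the network size, transmission probability, and tail exponent are all made-up illustration values, and the simulation is a crude percolation-style spread rather than a full epidemic model. It builds two random contact networks with an average of 10 contacts per person, one with an exponentially declining (geometric) degree distribution and one with a fat-tailed (Pareto) distribution, runs the same outbreak on both, and then mimics the intervention of removing people with more than 10 contacts:

```python
import random
import numpy as np
import networkx as nx

random.seed(0)
rng = np.random.default_rng(0)

N = 2000           # number of people (illustration value)
MEAN_DEGREE = 10   # average contacts per person, as in the study

def build_graph(degrees):
    """Random graph whose degree sequence approximates `degrees`."""
    degrees = [int(d) for d in degrees]
    if sum(degrees) % 2:        # configuration model needs an even degree sum
        degrees[0] += 1
    g = nx.Graph(nx.configuration_model(degrees, seed=0))  # drop parallel edges
    g.remove_edges_from(nx.selfloop_edges(g))
    return g

# Thin-tailed contacts: geometric (exponentially declining) distribution
thin = rng.geometric(1 / MEAN_DEGREE, size=N)

# Fat-tailed contacts: Pareto draws rescaled to the same mean of 10
fat = rng.pareto(2.2, size=N) + 1       # 2.2 is a hypothetical tail exponent
fat = fat * MEAN_DEGREE / fat.mean()

def outbreak_size(g, p_transmit=0.05, seeds=5):
    """Each infected node infects each susceptible neighbour
    independently with probability p_transmit."""
    infected = set(random.sample(list(g.nodes), seeds))
    frontier = set(infected)
    while frontier:
        nxt = set()
        for node in frontier:
            for nb in g.neighbors(node):
                if nb not in infected and random.random() < p_transmit:
                    nxt.add(nb)
        infected |= nxt
        frontier = nxt
    return len(infected)

for name, degrees in [("thin-tailed", thin), ("fat-tailed", fat)]:
    print(name, "outbreak size:", outbreak_size(build_graph(degrees)))

# Mimic the intervention: remove highly connected people (>10 contacts)
g_fat = build_graph(fat)
g_fat.remove_nodes_from([n for n, d in dict(g_fat.degree()).items() if d > 10])
print("fat-tailed, capped at 10 contacts:", outbreak_size(g_fat))
```

In runs like this, the fat-tailed network typically produces larger outbreaks, and removing the highly connected nodes sharply curtails the spread, mirroring the qualitative result reported above.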

The findings suggest that preventing super-spreading events could have a significant impact on the overall transmission of Covid-19, the researchers say.

"It gives us a handle as to how we could control the ongoing pandemic, which is by identifying strategies that target super-spreaders," Wong says. "One way to do that would be to, for instance, prevent anyone from interacting with over 10 people at a large gathering."

The researchers now hope to study how biological factors might also contribute to super-spreading.

Credit: 
Massachusetts Institute of Technology

Biomarker combination predicts kidney injury in critically ill children

Researchers at the University of Liverpool have identified a unique method for detecting the early signs of a potentially serious condition known as Acute Kidney Injury (AKI).

The condition commonly occurs in critically ill children admitted to paediatric intensive care units, often because of reduced blood supply to the kidneys (for example, due to dehydration or sepsis), heart bypass surgery, or medicines that can injure the kidneys. Children who develop AKI have poorer immediate outcomes, including longer hospital stays and increased mortality. They also have an increased risk of a long-term reduction in kidney function (Chronic Kidney Disease).

Diagnosis of AKI depends on identifying elevation of a substance called creatinine in the blood. However, creatinine rises only slowly after a kidney injury, so recognition of AKI is frequently delayed. If doctors could identify children at high risk of AKI early on, then pre-emptive therapy, such as renal replacement therapy (dialysis), could be instituted earlier to protect the kidneys from further injury.

In this study, led by Dr Rachel McGalliard and Dr Steve McWilliam, the team looked at the potential of two early markers to predict severe AKI in children admitted to the Paediatric Intensive Care Unit (PICU) at Alder Hey Children's NHS Foundation Trust. The first is a protein called Neutrophil Gelatinase-Associated Lipocalin (NGAL), measured in urine and blood; the second is a clinical score called the Renal Angina Index (RAI).
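The paper itself is the authority on how these markers were applied, but for orientation, the RAI as commonly formulated in the paediatric critical care literature is simply the product of two sub-scores. The sketch below illustrates that structure; the specific strata and thresholds are taken from the general literature, not from this study, and should be treated as illustrative only:

```python
def renal_angina_index(risk_score: int, injury_score: int) -> int:
    """Renal Angina Index as commonly formulated: the product of a
    risk stratum and an injury score (values from the general
    literature, shown here for illustration only).

    risk_score:   1 (ICU admission), 3 (e.g. transplant recipient),
                  5 (mechanical ventilation plus vasoactive support)
    injury_score: 1, 2, 4 or 8, graded by decline in estimated
                  creatinine clearance and/or percent fluid overload
    """
    return risk_score * injury_score

# A score of 8 or more is conventionally read as "renal angina
# positive", flagging a child at elevated risk of severe AKI.
print(renal_angina_index(risk_score=3, injury_score=4))  # 12 -> positive
```

Because the index uses only routinely available clinical data, it could, as Dr McWilliam notes below, be computed automatically from electronic medical records at PICU admission.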

After an extensive study, the team found that a combination of the RAI and urinary NGAL on the first day of PICU admission accurately predicted severe AKI, a potentially life-saving finding for many critically ill children.

In the study, 16% of children admitted to PICU developed severe AKI within 72 hours of admission, and 7% required renal replacement therapy. In keeping with previous studies, development of AKI was associated with prolonged PICU admission and increased mortality. A novel finding in this study was that AKI was also associated with an increased risk of hospital-acquired infection.

Dr McGalliard, an NIHR Academic Clinical Fellow in Paediatric Infectious Diseases and first author on the paper, said: "I am most proud of the analysis being in a clinically heterogeneous paediatric population. Potentially, these results could lead to earlier identification of acute kidney injury (AKI) in critically ill children and could be used to trial pre-emptive therapy, if validated in further studies."

Dr McWilliam, a Senior Lecturer in Paediatric Clinical Pharmacology, said: "The results of this study are extremely exciting. The renal angina index is a simple calculation that could easily be automatically provided in electronic medical records at PICU admission. NGAL assays compatible with most clinical laboratory analysers are now widely available, making its real-time measurement in these critically unwell children realistic. We are now in a great position to investigate whether having these results available in real-time in PICUs could lead to reduced rates of AKI in these children, and whether they can be used to effectively target protective strategies to those at high risk of AKI."

Credit: 
University of Liverpool