
Computer can determine whether you'll die from COVID

Using patient data, artificial intelligence can assess with 90 percent accuracy whether a person will die from COVID-19, according to new research at the University of Copenhagen. Body mass index (BMI), gender and high blood pressure are among the most heavily weighted factors. The research can be used to predict how many hospitalized patients will need a respirator and to determine who ought to be first in line for vaccination.

Artificial intelligence is able to predict who is most likely to die from the coronavirus. In doing so, it can also help decide who should be at the front of the line for the precious vaccines now being administered across Denmark.

The result is from a newly published study by researchers at the University of Copenhagen's Department of Computer Science. Since the COVID pandemic's first wave, researchers have been working to develop computer models that can predict, based on disease history and health data, how badly people will be affected by COVID-19.

Based on patient data from the Capital Region of Denmark and Region Zealand, the results of the study demonstrate that artificial intelligence can determine, with up to 90 percent certainty, whether a person who is not yet infected will die of COVID-19 if they are unfortunate enough to become infected. Once a patient is admitted to the hospital with COVID-19, the computer can predict with 80 percent accuracy whether they will need a respirator.

"We began working on the models to assist hospitals, as during the first wave, they feared that they did not have enough respirators for intensive care patients. Our new findings could also be used to carefully identify who needs a vaccine," explains Professor Mads Nielsen of the University of Copenhagen's Department of Computer Science.

Older men with high blood pressure are at highest risk

The researchers fed a computer program with health data from 3,944 Danish COVID-19 patients. This trained the computer to recognize patterns and correlations both in the patients' prior illnesses and in their bouts with COVID-19.

"Our results demonstrate, unsurprisingly, that age and BMI are the most decisive parameters for how severely a person will be affected by COVID-19. But the likelihood of dying or ending up on a respirator is also heightened if you are male, have high blood pressure or a neurological disease," explains Mads Nielsen.

The diseases and health factors that, according to the study, have the most influence on whether a patient ends up on a respirator after being infected with COVID-19 are in order of priority: BMI, age, high blood pressure, being male, neurological diseases, COPD, asthma, diabetes and heart disease.
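To make the approach concrete, here is a rough, self-contained sketch of how such a risk model can be trained and its factors ranked, using scikit-learn on synthetic stand-in data. The feature set, model choice and data are illustrative assumptions; the Copenhagen group's actual pipeline is not reproduced here.

```python
# Hedged sketch: train a classifier on synthetic "patient" records and rank
# risk factors by permutation importance. Features, data and model choice
# are illustrative assumptions, not the Copenhagen group's actual pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000  # roughly the cohort size described above
X = np.column_stack([
    rng.normal(27, 5, n),      # BMI
    rng.integers(20, 95, n),   # age
    rng.integers(0, 2, n),     # high blood pressure (0/1)
    rng.integers(0, 2, n),     # male (0/1)
    rng.integers(0, 2, n),     # neurological disease (0/1)
])
names = ["BMI", "age", "hypertension", "male", "neurological disease"]

# Synthetic outcome loosely tied to the features, for demonstration only.
logit = 0.08 * (X[:, 0] - 27) + 0.05 * (X[:, 1] - 60) + 0.6 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))

# Shuffling one feature at a time and measuring the accuracy drop ranks the
# factors, analogous to the prioritized list in the paragraph above.
imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for i in np.argsort(imp.importances_mean)[::-1]:
    print(f"{names[i]}: {imp.importances_mean[i]:.3f}")
```

Permutation importance simply asks how much held-out performance degrades when a feature's values are shuffled, which is one standard way to produce an ordering like the one above.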

"For those affected by one or more of these parameters, we have found that it may make sense to move them up in the vaccine queue, to avoid any risk of them becoming inflected and eventually ending up on a respirator," says Nielsen.

Predicting respirator needs is a must

Researchers are currently working with the Capital Region of Denmark to take advantage of this fresh batch of results in practice. They hope that artificial intelligence will soon be able to help the country's hospitals by continuously predicting the need for respirators.

"We are working towards a goal that we should be able to predict the need for respirators five days ahead by giving the computer access to health data on all COVID positives in the region," says Mads Nielsen, adding:

"The computer will never be able to replace a doctor's assessment, but it can help doctors and hospitals see many COVID-19 infected patients at once and set ongoing priorities."

However, technical work is still needed to make the region's health data available to the computer and then to calculate the risk for infected patients. The research was carried out in collaboration with Rigshospitalet and Bispebjerg and Frederiksberg Hospital.

Credit: 
University of Copenhagen - Faculty of Science

Technion researchers discover new pathway for attacking cancer cells

When treating cancer, researchers are always searching for ways to remove cancer cells while minimizing damage to the rest of the body. One possible approach is to find processes unique to cancer cells, and which would allow specific targeting. If such a process can be disrupted, only those cells would be affected.

A process (or the absence of one) can be unique to some types of cancer and not present in others. In such cases, we would want a simple way to recognize whether a particular tumor possesses the unique trait, because the answer determines whether the tumor would respond to a given treatment. That would allow us to match a treatment to the patient likely to be helped by it, rather than going by trial and error.

Professor Tomer Shlomi's research group discovered just such a process - one that may be targeted in cancer cells without causing damage to healthy ones, findings that have been published in Cell Metabolism.

The folate cycle is a process essential to DNA and RNA production. As a result, it is highly important to both cancer cells and healthy cells. Because DNA production is a critical stage in cell division, and thus in tumor growth, the folate cycle is a common target for chemotherapy. However, for the very same reason, there are significant side effects to attacking it.

There are, in fact, two folate cycles - one happening in the mitochondria (an organelle inside the cell), and one in the cytosol (the fluid that fills the cell). A healthy cell can switch from one to the other. A variety of tumor cells, Professor Shlomi's group discovered, rely on the cytosolic pathway exclusively. The implication is, if treatment were to target the cytosolic folate cycle, healthy cells would switch to the mitochondrial cycle and would not be harmed, leaving tumor cells to die.

It remains to recognize whether a particular tumor is indeed one in which the mitochondrial folate cycle is non-functional, and here too Shlomi's team delivered. RFC is a transporter protein that regulates intracellular folate levels: low RFC means low folate, and low folate, the group discovered, is devastating to the mitochondrial cycle. Low-RFC tumors are therefore the ones that would be affected by treatments blocking the cytosolic cycle.

Both the pathway that may be attacked and the way to recognize which tumors the attack would be effective against have thus been found.

Credit: 
Technion-Israel Institute of Technology

Machine learning generates realistic genomes for imaginary humans

image: A chromosome emerges from random digital noise.

Image: 
Burak Yelmen

Machines, thanks to novel algorithms and advances in computer technology, can now learn complex models and even generate high-quality synthetic data such as photo-realistic images or resumes of imaginary humans. A study recently published in the international journal PLOS Genetics uses machine learning to mine existing biobanks and generate chunks of human genomes which do not belong to real humans but have the characteristics of real genomes.

"Existing genomic databases are an invaluable resource for biomedical research, but they are either not publicly accessible or shielded behind long and exhausting application procedures due to valid ethical concerns. This creates a major scientific barrier for researchers. Machine-generated genomes, or artificial genomes as we call them, can help us overcome the issue within a safe ethical framework," said Burak Yelmen, first author of the study and Junior Research Fellow of Modern Population Genetics at the University of Tartu.

The pluridisciplinary team performed multiple analyses to assess the quality of the generated genomes compared to real ones. "Surprisingly, these genomes emerging from random noise mimic the complexities that we can observe within real human populations and, for most properties, they are not distinguishable from other genomes from the biobank we used to train our algorithm, except for one detail: they do not belong to any gene donor," said Dr Luca Pagani, one of the senior authors of the study and a Mobilitas Pluss fellow.
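For readers curious about the mechanics, the generative-adversarial idea behind such models can be sketched in a few lines of PyTorch: a generator maps random noise to per-site allele probabilities while a discriminator tries to tell generated genomes from real ones. The architecture, dimensions and stand-in data below are assumptions for illustration, not the published model.

```python
# Minimal sketch of the generative idea: a GAN whose generator maps random
# noise to binary SNP vectors ("artificial genomes"). Architecture, sizes,
# and data are illustrative assumptions, not the study's published model.
import torch
import torch.nn as nn

N_SNPS = 1000   # length of the genome chunk (assumption)
NOISE = 100     # latent noise dimension (assumption)

generator = nn.Sequential(
    nn.Linear(NOISE, 256), nn.ReLU(),
    nn.Linear(256, N_SNPS), nn.Sigmoid(),   # per-site allele probabilities
)
discriminator = nn.Sequential(
    nn.Linear(N_SNPS, 256), nn.ReLU(),
    nn.Linear(256, 1), nn.Sigmoid(),        # P(input is a real genome)
)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
loss_fn = nn.BCELoss()

real = (torch.rand(64, N_SNPS) < 0.3).float()  # stand-in for biobank haplotypes

for step in range(1000):
    # 1) Train the discriminator to separate real from generated genomes.
    z = torch.randn(64, NOISE)
    fake = generator(z).detach()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    z = torch.randn(64, NOISE)
    g_loss = loss_fn(discriminator(generator(z)), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Round the generator's output to 0/1 alleles to obtain artificial genomes.
artificial = (generator(torch.randn(5, NOISE)) > 0.5).int()
```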

The study additionally involves the assessment of the proximity of artificial genomes to real genomes to test whether the privacy of the original samples is preserved. "Although detecting privacy leaks among thousands of genomes could appear as looking for a needle in a haystack, combining multiple statistical measures allowed us to check all models carefully. Excitingly, the detailed exploration of complex leakage patterns can lead to improvements in generative model evaluation and design, and will fuel back the machine learning field," said Dr Flora Jay, the coordinator of the study and CNRS researcher in the Interdisciplinary computer science laboratory (LRI/LISN, Université Paris-Saclay, French National Centre for Scientific Research).
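One simple version of such a proximity check can be illustrated as follows; the study combines several statistical measures, so this single nearest-neighbour test is only a hedged sketch on invented binary genotype matrices.

```python
# Simplified illustration of one privacy check: compare each artificial
# genome's distance to its nearest real training genome against the typical
# real-to-real nearest-neighbour distance. The actual study combines several
# statistical measures; this single test is only a sketch.
import numpy as np

def nearest_hamming(queries, reference):
    """For each query row, Hamming distance to its closest reference row."""
    dists = (queries[:, None, :] != reference[None, :, :]).sum(axis=2)
    return dists.min(axis=1)

rng = np.random.default_rng(1)
real = rng.integers(0, 2, (200, 500))        # stand-in training genomes
artificial = rng.integers(0, 2, (200, 500))  # stand-in generated genomes

aa = nearest_hamming(artificial, real)
# Real-to-real baseline, avoiding trivial self-matches via a split.
rr = nearest_hamming(real[:100], real[100:])

# If artificial genomes sit much closer to the training set than real genomes
# sit to each other, the generator may be memorizing (leaking) donors.
print("artificial->real NN distance:", aa.mean())
print("real->real     NN distance:", rr.mean())
```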

All in all, machine learning approaches have provided faces, biographies and multiple other features to a handful of imaginary humans; now we know more about their biology. These imaginary humans with realistic genomes could serve as proxies for all the real genomes which are not publicly available or require long application procedures or collaborations, hence removing an important accessibility barrier in genomic research, in particular for underrepresented populations.

Credit: 
Estonian Research Council

Study highlights risk of new SARS-CoV-2 mutations emerging during chronic infection

SARS-CoV-2 mutations similar to those in the B.1.1.7 UK variant could arise in cases of chronic infection, where treatment over an extended period can provide the virus multiple opportunities to evolve, say scientists.

Writing in Nature, a team led by Cambridge researchers report how they were able to observe SARS-CoV-2 mutating in the case of an immunocompromised patient treated with convalescent plasma. In particular, they saw the emergence of a key mutation also seen in the new variant that led to the UK being forced once again into strict lockdown, though there is no suggestion that the variant originated from this patient.

Using a synthetic version of the virus Spike protein created in the lab, the team showed that specific changes to its genetic code - the mutation seen in the B.1.1.7 variant - made the virus twice as infectious as the more common strain.

SARS-CoV-2, the virus that causes COVID-19, is a betacoronavirus. Its RNA - its genetic code - is composed of a series of nucleotides (chemical structures represented by the letters A, C, G and U). As the virus replicates itself, this code can be mis-copied, leading to errors known as mutations. Coronaviruses have a relatively modest mutation rate, at around 23 nucleotide substitutions per year.

Of particular concern are mutations that might change the structure of the 'spike protein', which sits on the surface of the virus, giving it its characteristic crown-like shape. The virus uses this protein to attach to the ACE2 receptor on the surface of the host's cells, allowing it entry into the cells where it hijacks their machinery to allow it to replicate and spread throughout the body. Most of the current vaccines in use or being trialled target the spike protein and there is concern that mutations may affect the efficacy of these vaccines.

UK researchers within the Cambridge-led COVID-19 Genomics UK (COG-UK) Consortium have identified a particular variant of the virus that includes important changes that appear to make it more infectious: the ΔH69/ΔV70 amino acid deletion in part of the spike protein is one of the key changes in this variant.

Although the ΔH69/ΔV70 deletion has been detected multiple times, until now scientists had not seen it emerge within an individual patient. However, in a study published today in Nature, Cambridge researchers document how these mutations appeared in a COVID-19 patient admitted to Addenbrooke's Hospital, part of Cambridge University Hospitals NHS Foundation Trust.

The individual concerned was a man in his seventies who had previously been diagnosed with marginal B cell lymphoma and had recently received chemotherapy, meaning that his immune system was seriously compromised. After admission, the patient was provided with a number of treatments, including the antiviral drug remdesivir and convalescent plasma - that is, plasma containing antibodies taken from the blood of a patient who had successfully cleared the virus from their system. Despite his condition initially stabilising, he later began to deteriorate. He was admitted to the intensive care unit and received further treatment, but later died.

During the patient's stay, 23 viral samples were available for analysis, the majority from his nose and throat. These were sequenced as part of COG-UK. It was in these sequences that the researchers observed the virus's genome mutating.

Between days 66 and 82, following the first two administrations of convalescent sera, the team observed a dramatic shift in the virus population, with a variant bearing the ΔH69/ΔV70 deletions, alongside a mutation in the spike protein known as D796H, becoming dominant. Although this variant initially appeared to die away, it re-emerged when the third course of remdesivir and convalescent plasma therapy was administered.

Professor Ravi Gupta from the Cambridge Institute of Therapeutic Immunology & Infectious Disease, who led the research, said: "What we were seeing was essentially a competition between different variants of the virus, and we think it was driven by the convalescent plasma therapy.

"The virus that eventually won out - which had the D796H mutation and ΔH69/ΔV70 deletions - initially gained the upper hand during convalescent plasma therapy before being overtaken by other strains, but re-emerged when the therapy was resumed. One of the mutations is in the new UK variant, though there is no suggestion that our patient was where they first arose."

Under strictly controlled conditions, the researchers created and tested a synthetic version of the virus carrying the ΔH69/ΔV70 deletions and the D796H mutation, both individually and together. The combined mutations made the virus less sensitive to neutralisation by convalescent plasma, though it appears that the D796H mutation alone was responsible for the reduced susceptibility to the antibodies in the plasma. The D796H mutation alone also led to a loss of infectivity in the absence of plasma, typical of mutations that viruses acquire in order to escape immune pressure.

The researchers found that the ΔH69/ΔV70 deletion by itself made the virus twice as infectious as the previously dominant variant. The researchers believe the role of the deletion was to compensate for the loss of infectiousness due to the D796H mutation. This paradigm is classic for viruses, whereby escape mutations are followed by or accompanied by compensatory mutations.

"Given that both vaccines and therapeutics are aimed at the spike protein, which we saw mutate in our patient, our study raises the worrying possibility that the virus could mutate to outwit our vaccines," added Professor Gupta.

"This effect is unlikely to occur in patients with functioning immune systems, where viral diversity is likely to be lower due to better immune control. But it highlights the care we need to take when treating immunocompromised patients, where prolonged viral replication can occur, giving greater opportunity for the virus to mutate."

Credit: 
University of Cambridge

Birds living in natural habitats can help inform captive care

Bird species that live in their natural habitats can help zoos learn how to manage those in captivity, according to a new review.

Birds are the most diverse group housed by zoos around the world, but zoo-based research tends not to focus on birds.

A new article published in the journal Birds, by Dr Paul Rose of the University of Exeter, suggests zoos can improve management of birds by looking at how species live in their natural habitats.

Likewise, birds living under the care of humans can also help guide and develop conservation action for those living in the wild.

"Research into wild birds is extremely useful for furthering how birds are managed in zoos," said Dr Rose.

"For species of conservation concern, zoo professionals can be linked with field biologists to share information on how to best care for these species in captivity and how to develop and formulate conservation actions.

"We can use proxy species - those common in zoos - to develop practices for conservation that can be used for less familiar species that might be of concern and need help from information gathered through things such as captive breeding.

"Or we can promote the threats that these not-in-the-zoo species face by using the commoner species as an ambassador.

"We do this through my work at the Wildfowl & Wetlands Trust, promoting the rarer species of flamingo that are in the wild using the commoner ones we keep in the living collection."

In the review, Dr Rose uses hornbills as an example: birds that are essential to the long-term viability and sustainability of rainforest biodiversity.

The helmeted hornbill, a critically endangered species, plays an important role in the dispersal of seeds within pristine, undisturbed areas of south-east Asian rainforests.

The population decline of the helmeted hornbill has been caused by poaching of the birds for their "ivory", the large casque on the bird's head and bill that can be up to 10% of its overall body mass.

Whilst the helmeted hornbill is not found in captivity, other species of large hornbill are.

By looking at the ecological role of the helmeted hornbill in its natural habitat, zoos have been able to design enclosures that will increase chances of reproduction.

For example, by identifying the temperature and humidity ranges of wild hornbill nesting sites that are more likely to hatch eggs, zoos have been able to match these environmental conditions as closely as possible.

A similar approach was taken with the Guam kingfisher, a species that is extinct in the wild and reliant on captive breeding for its survival.

Data from the nesting locations of the closely related Pohnpei kingfisher, found on a neighbouring island, showed that temperatures were hotter than those sometimes provided for captive Guam kingfishers.

The findings led zoos to raise nesting-site temperatures to improve breeding success for the species.

Zoos have also been able to guide conservation action for hornbills living in the wild. By monitoring the behaviour of these birds, researchers discovered that nest boxes enhance the quality of breeding habitat, which has led to such boxes being built in areas of the helmeted hornbill's range in Borneo.

Expertise and financial support have been provided by several large zoological collections in the European Association of Zoos and Aquaria (EAZA) and the North American Association of Zoos and Aquariums (AZA); this support has seen wild rhinoceros hornbills, a species listed as vulnerable, successfully fledge a chick from an artificial nest box in the Bornean rainforest.

"The effect of visitors on zoos can also help direct future research questions and increase understanding of birds under human care," adds Dr Rose.

"Developing zoo bird exhibits to theme them around specific conservation messages can be used to promote wider understanding of the threats faced by wild birds specifically and hopefully encourage human behaviour change that benefits ecosystem health."

Credit: 
University of Exeter

Link found between time perception, risk for developmental coordination disorder

image: A researcher helping a child participant wearing an EEG cap, a non-invasive approach to measure the brain waves.

Image: 
Auditory Development Lab, McMaster University

Neuroscientists at McMaster University have found a link between children who are at risk for developmental coordination disorder (DCD), a common condition that can cause clumsiness, and difficulties with time perception such as interpreting changes in rhythmic beats.

Accurate time perception is crucial for basic skills such as walking and processing speech and music.

"Many developmental disorders, including dyslexia or reading difficulties, autism and attention deficits have been linked to deficits in auditory time perception," says Laurel Trainor, senior author of the study and founding director of the McMaster Institute for Music and the Mind.

Previous research has shown the brain networks involved in time perception often overlap with the motor control networks required for such activities as catching a ball or tapping along to musical beats. Until now, researchers had not investigated whether children with DCD tended to have auditory timing deficits, despite being at risk for dyslexia and attention deficits.

The study, published online in the journal Child Development, provides new evidence about that connection in children.

Developmental coordination disorder is a common but little-studied condition that affects approximately five to 15 per cent of all children, who can experience a wide range of difficulties with fine and/or gross motor skills. It can have profound and lifelong effects on everyday tasks such as getting dressed, writing, and engaging in sports or play, and often interferes with learning, academic performance and socialization.

For this study, researchers recruited more than 60 children aged 6 and 7 years old, who underwent motor skills tests and were assessed either to be at risk for DCD or to be developing typically.

During the first study, each child was asked in a series of trials to pinpoint which of two sounds was shorter in time or had an off-beat rhythm. From this, researchers measured the threshold or smallest time difference at which each child could just barely make the correct judgement.
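A common way to estimate such a threshold is to fit a psychometric function to proportion-correct responses and invert it at a criterion level such as 75% correct. The sketch below uses invented data and SciPy; the study's exact psychophysical procedure may differ.

```python
# Sketch of psychometric threshold estimation: fit a logistic curve to
# proportion-correct data over time differences and read off the smallest
# difference supporting 75% correct. Data are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, midpoint, slope):
    """Logistic rising from 0.5 (chance in a two-choice task) to 1.0."""
    return 0.5 + 0.5 / (1 + np.exp(-slope * (x - midpoint)))

time_diff_ms = np.array([5, 10, 20, 40, 80, 160])              # stimulus differences
prop_correct = np.array([0.52, 0.55, 0.65, 0.80, 0.93, 0.99])  # one child's data

(midpoint, slope), _ = curve_fit(psychometric, time_diff_ms, prop_correct,
                                 p0=[40, 0.1])

# Invert the fitted curve at the chosen criterion to get the threshold.
criterion = 0.75
threshold_ms = midpoint - np.log(0.5 / (criterion - 0.5) - 1) / slope
print(f"estimated threshold: {threshold_ms:.1f} ms")
```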

"We saw that indeed, children at risk for DCD were much less sensitive to time changes compared to typically developing children," says Andrew Chang, the lead researcher and graduate student in the Department of Psychology, Neuroscience & Behaviour at McMaster.

In the second experiment, researchers used EEG to measure the brain waves of children as they listened to a sequence of sounds that had been tweaked to include occasional timing deviations. Children at risk for DCD had slower brain activity in response to the unexpected timing deviants.

There are no medications to treat DCD, but physiotherapy and occupational therapy can help children improve muscle strength, balance and coordination.

"We know anecdotally that therapists sometimes incorporate regular rhythms into the physical therapy they give to children with DCD, and they have the impression this helps - for example that children can walk better when they walk to a rhythm." Chang explains.

"Although our current study did not directly investigate any intervention effects, the results suggest that music with salient and regular beats could be used for physiotherapy to help treat children," he says.

He points to motor rehabilitation featuring auditory cueing with metronomes or musical beats, which helps adult patients who have Parkinson's disease or are recovering from a stroke. Further research could help to determine whether similar therapies are useful for children with DCD, he says.

Credit: 
McMaster University

Pangolin coronavirus could jump to humans

Scientists at the Francis Crick Institute have found important structural similarities between SARS-CoV-2 and a pangolin coronavirus, suggesting that a pangolin coronavirus could infect humans.

While SARS-CoV-2 is thought to have evolved from a bat coronavirus, its exact evolutionary path is still a mystery. Uncovering its history is challenging as there are likely many undiscovered bat coronaviruses and, due to differences between bat coronaviruses and SARS-CoV-2, it is thought that the virus may have passed to humans via at least one other species.

In their study, published in Nature Communications, the scientists compared the structures of the spike proteins found on SARS-CoV-2, the most similar currently identified bat coronavirus RaTG13, and a coronavirus isolated from Malayan pangolins which were seized by authorities after being smuggled to China. They found that the pangolin virus was able to bind to receptors from both pangolins and humans. This differs from the bat coronavirus, which could not effectively bind with human or pangolin receptors.

Antoni Wrobel, co-lead author and postdoctoral training fellow in the Structural Biology of Disease Processes Laboratory at the Crick, says: "By testing if the spike protein of a given virus can bind with cell receptors from different species, we're able to see if, in theory, the virus could infect this species."

"Importantly here, we've shown two key things. Firstly, that this bat virus would unlikely be able to infect pangolins. And secondly that a pangolin virus could potentially infect humans."

The team used cryo-electron microscopy to uncover in minute detail the structure of the pangolin coronavirus' spike protein, which is responsible for binding to and infecting cells. While some parts of the pangolin virus' spike were found to be incredibly similar to SARS-CoV-2, other areas differed.

In terms of understanding the evolutionary path of SARS-CoV-2, this work does not confirm whether or not this pangolin virus is definitely part of the chain of evolution for SARS-CoV-2. But the findings do support various possible scenarios for how the coronavirus jumped from bats to humans. One potential route is that SARS-CoV-2 originated from a different, currently unknown bat coronavirus which could infect pangolins, and from this species it then moved to humans. Or alternatively, RaTG13 or a similar bat coronavirus might have merged with another coronavirus in a different intermediate species, other than a pangolin.

Donald Benton, co-lead author and postdoctoral training fellow in the Structural Biology of Disease Processes Laboratory at the Crick, says: "We still don't have evidence to confirm the evolutionary path of SARS-CoV-2 or to prove definitively that this virus did pass through pangolins to humans."

"However, we have shown that a pangolin virus could potentially jump to humans, so we urge caution in any contact with this species and the end of illegal smuggling and trade in pangolins to protect against this risk."

Steve Gamblin, group leader of the Structural Biology of Disease Processes Laboratory at the Crick says: "A lot is still to be uncovered about the evolution of SARS-CoV-2, but the more we know about its history and which species it passed through, the more we understand about how it works, and how it may continue to evolve."

This work builds upon previous studies from the Crick team, including research published in July 2020, which found that the bat coronavirus RaTG13 could not effectively bind to human receptors.

The team are continuing to examine the spikes of SARS-CoV-2 and related coronaviruses, including other bat viruses, to better understand the mechanisms of infection and evolution.

Credit: 
The Francis Crick Institute

Not all banking crises involve panics

A banking crisis is often seen as a self-fulfilling prophecy: The expectation of bank failure makes it happen. Picture people lining up to withdraw their money during the Great Depression or customers making a run on Britain's Northern Rock bank in 2007.

But a new paper co-authored by an MIT professor suggests we have been missing the bigger picture about banking crises. Yes, there are sometimes panics about banks that create self-reinforcing problems. But many banking crises are quieter: Even without customers panicking, banks can suffer losses serious enough to create subsequent economy-wide downturns.

"Panics are not needed for banking crises to have severe economic consequences," says Emil Verner, the MIT professor who helped lead the study. "But when panics do occur, those tend to be the most severe episodes. Panics are an important amplification mechanism for banking crises, but not a necessary condition."

Indeed, in an ambitious piece of research, spanning 46 countries and going back to 1870, the study surveys banking crises that occurred with and without panics. When there is a panic and bank run, the research finds, a 30 percent decline in banking-sector equity predicts a 3.4 percent drop in real GDP (gross domestic product adjusted for inflation) after three years. But even without any creditor panic, a 30 percent decline in bank equity predicts a 2.7 percent drop in real GDP after three years.

Thus, virtually all banking crises, not just history's greatest hits, create long-term macroeconomic damage, since banks are less able to furnish the credit used for business expansion.
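The headline estimates behave like coefficients from a predictive regression of the three-year-ahead change in real GDP on the size of the bank-equity decline. Here is a minimal sketch of that relationship on synthetic data; it is not the authors' 46-country panel or code.

```python
# Sketch of the kind of predictive regression behind these estimates:
# regress the 3-year-ahead real GDP change on the bank-equity crash size.
# Numbers are synthetic; the paper uses a 46-country panel back to 1870.
import numpy as np

rng = np.random.default_rng(2)
n = 300
bank_equity_decline = rng.uniform(0, 0.6, n)   # e.g. 0.30 = a 30% crash
# Synthetic "true" relationship: ~11.3% GDP loss per 100% equity decline,
# so a 30% decline maps to roughly a 3.4% GDP drop, as reported above.
gdp_change_3y = -0.113 * bank_equity_decline + rng.normal(0, 0.02, n)

# Ordinary least squares with an intercept.
X = np.column_stack([np.ones(n), bank_equity_decline])
beta, *_ = np.linalg.lstsq(X, gdp_change_3y, rcond=None)
print(f"intercept: {beta[0]:+.4f}, slope: {beta[1]:+.4f}")
print(f"predicted GDP change after a 30% equity decline: "
      f"{beta[0] + beta[1] * 0.30:+.3f}")
```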

"Banking crises do often come with very severe recessions," says Verner, who is the Class of 1957 Career Development Professor and an assistant professor of finance at the MIT Sloan School of Management.

The paper, "Banking Crises Without Panics," appears in the February issue of the Quarterly Journal of Economics. The authors are Matthew Baron, an assistant professor of finance at Cornell University; Verner; and Wei Xiong, a professor of finance and economics at Princeton University.

A rigorous, quantitative approach

To conduct the study, the researchers constructed a new dataset of bank stock prices and dividends in 46 countries from 1870 through 2016, using existing databases and adding information from historical newspaper archives. They also gathered nonbank stock prices, monthly credit spread information, and macroeconomic information such as GDP and inflation.

"People had looked historically at defining and identifying different episodes of banking crises, but there wasn't that much of a rigorous, quantitative approach to defining these episodes," Verner says. "There was a bit more of a 'know it when you see it' approach."

Scholars examining past banking crises divide roughly into two camps. One group has focused on panics, with the implication that if bank runs could be prevented, then banking crises would not be as bad. Another group has looked more at bank assets and focused on circumstances in which banks' decisions lead to big losses -- through bad loans, for instance.

"We come down in the middle, in some sense," Verner says. Panics make bank troubles worse, but nonetheless, "There are a number of examples of banking crises where banks suffered losses and cut back lending, and businesses and households had a harder time getting access to credit, but there weren't runs or panics by creditors. Those episodes still led to bad economic outcomes."

More specifically, the study's close look at the monthly dynamics of banking crises shows how often these crises are in fact presaged by an erosion of banks' portfolios, and by investors' recognition of that erosion.

"The panics don't just come out of the blue. They tend to be preceded by bank stocks declining," Verner says. "The bank equity investors recognize the bank is going to suffer loses on the loans it has. And so what that suggests is that panics are really often the consequences, rather than the fundamental cause, of troubles that have already built up in the banking system due to bad loans."

The study also quantifies how impaired bank activity becomes in these situations. After banking crises with visible panics involved, the average bank credit-to-GDP ratio was 5.7 percentage points lower after three years; that is, there was less bank lending as a basis for economic activity. When a "quiet" banking crisis hit, with no visible panic, the average bank credit-to-GDP ratio was 3.5 percentage points lower after three years.

Historical detective work

Verner says the researchers are pleased they were "able to do some historical detective work and find some episodes that had been forgotten." The study's expanded set of crises, he notes, comprises "new information that other researchers are already using."

Formerly overlooked banking crises in this study include a welter of episodes from the 1970s, Canada's struggles during the Great Depression, and various 19th century banking failures. The researchers have presented versions of this study to an array of policymakers, including some regional U.S. Federal Reserve boards and the Bank for International Settlements, and Verner says he hopes such officials will keep the work in mind.

"I think it's valuable going forward, and not just for historical perspective," he says. "Having a broad sample across many countries is important for recognizing what the lessons are when new crises happen."

The researchers are continuing their research in this area with further studies about patterns in the loans banks make before losing value -- for instance, identifying the kinds of businesses that are less likely to repay bank loans. When banks start lending more heavily to certain kinds of companies -- possibly including restaurant, construction, and real estate companies -- it may be a sign of incipient trouble.

Credit: 
Massachusetts Institute of Technology

Bioplastics in the sustainability dilemma

Plastics made from crops such as maize or sugarcane instead of fossil fuels are generally considered sustainable. One reason is that plants bind CO2, which compensates for the carbon released into the atmosphere when the plastics are disposed of. However, there is a catch: with increasing demand for raw materials for bioplastic production, the area under cultivation may not be sufficient. As a result, natural vegetation is often converted to agricultural land and forests are cut down, which in turn releases large amounts of CO2. The suspicion that more bioplastics do not necessarily mean more climate protection has now been confirmed by researchers at the University of Bonn (Germany) in a new study. They found that the sustainability of plant-based bioplastics depends largely on the country of origin, its trade relationships and the raw material processed. The study has been published in the journal "Resources, Conservation & Recycling".

As in previous analyses, the scientists used a global, flexible and modular economic model developed at the University of Bonn to simulate the impact of a rising supply of bioplastics. The model is based on a world database (Global Trade Analysis Project). For their current study, the researchers modified the original model by disaggregating both conventional plastics and bioplastics, as well as additional crops such as maize and cassava. "This is crucial to better represent the bioplastics supply chain in major producing regions and assess their environmental impacts from a life cycle perspective," emphasizes agricultural engineer Dr. Neus Escobar, who conducted the study at the Institute for Food and Resource Economics (ILR) and the Center for Development Research (ZEF) at the University of Bonn and is now based at the International Institute for Applied Systems Analysis in Laxenburg (Austria).

In the current study, she and her colleague Dr. Wolfgang Britz considered the loss of natural vegetation on a global scale. They estimated, at the regional level, the land readily available for conversion to productive uses, along with the associated model parameters. In their previous publication, the Bonn scientists had already disaggregated the production of conventional plastics and bioplastics in Brazil, China, the EU and the U.S. - the countries that lead the way in bioplastics production. In their current study, they also included Thailand, which is home to carbon-rich forests. Experts expect the Asian country to become a leading global producer of biodegradable and biobased plastics in the near future. "All these changes in the model are necessary to estimate global spillovers of policies or technologies," says Dr. Wolfgang Britz, who worked with his team on the extension of the model to derive sustainability indicators that account for global land use change.

Factors such as country of origin and raw materials are decisive

The researchers simulated a total of 180 scenarios (36 scenarios per region) that varied according to the degree of bioplastics market penetration and other model parameters determining economywide responses. "We found that the carbon footprints of commercially available bioplastics are much larger than the values previously estimated in scientific literature and policy reports," says Neus Escobar.

The reason: in the long term, CO2 emissions from land-use change outweigh the greenhouse gas savings from substituting for fossil raw materials. The one exception is Thailand, where bioplastics save an average of two kilograms of CO2 per ton. This is mainly due to the relatively smaller simulated increase in bioplastics production there, which translates into minor adjustments in food prices and associated land cover changes. However, increasing production of bioplastics from cassava and sugarcane in Thailand to catch up with the other regions could result in the loss of carbon-rich ecosystems within the country.

None of the regions is clearly better positioned than another

The overall calculations show that none of the regions is clearly better positioned than another to become a hub for sustainable bioplastics production. The largest land footprints are estimated for Chinese bioplastics, while the European Union has the largest average carbon footprint: Bioplastics produced in the EU take an average of 232.5 years to offset global CO2 emissions. Bioplastics production in the U.S. causes the greatest land and carbon spillovers, which means that the production generates greater agricultural land expansion, deforestation and carbon emissions in the rest of the world than within the country. Bioplastics production in Thailand and Brazil comes at the cost of forest cover loss to a large extent, which can lead to additional impacts on biodiversity.
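The figure of 232.5 years is a carbon payback time: the one-off emissions from land-use change divided by the annual emissions saved by displacing fossil plastics. The following back-of-the-envelope sketch uses invented inputs purely to show the arithmetic; the study derives its values from an economywide simulation model.

```python
# Back-of-the-envelope carbon payback time for bioplastics, the logic behind
# figures like "232.5 years to offset". The input numbers are invented for
# illustration; the study derives them from an economywide simulation model.
luc_emissions = 46.5      # one-off CO2 from land-use change (Mt CO2)
annual_savings = 0.2      # yearly CO2 saved vs fossil plastics (Mt CO2/yr)

payback_years = luc_emissions / annual_savings
print(f"carbon payback time: {payback_years:.1f} years")  # -> 232.5 years
```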

"Our study shows that an expansion in bio-based production should be carefully assessed on a region-by-region case in order to understand potentially sustainability risks and trade-offs," says Neus Escobar. The authors emphasize that the proposed metrics can be used in the future to monitor the long-term sustainability of bioeconomic interventions globally. Among other things, the metrics could help identify where complementary policies are needed - for example, to prevent deforestation.

Working on future-relevant research topics

The study is thematically embedded in the Transdisciplinary Research Area (TRA) "Innovation and Technology for Sustainable Futures" at the University of Bonn. In six different TRAs, scientists from a wide range of faculties and disciplines come together to work on future-relevant research topics. Neus Escobar was a member of the Transdisciplinary Research Area during the study; Wolfgang Britz is a member of the "PhenoRob" Cluster of Excellence at the University of Bonn.

Credit: 
University of Bonn

New study shows pandemic's toll on jobs, businesses, and food security in poorer countries

image: Share of Households Experiencing Drops in Food Security

Image: 
Innovations for Poverty Action

Washington, D.C. -- The onset of the COVID-19 pandemic caused a sharp decline in living standards and rising food insecurity in developing countries across the globe, according to a new study by an international team of economists.

The study, published Feb. 5 in the journal Science Advances, provides an in-depth view of the health crisis's initial socioeconomic effects in low- and middle-income countries, using detailed micro data collected from tens of thousands of households across nine countries. The phone surveys were conducted from April through July 2020 with nationally and sub-nationally representative samples in Bangladesh, Burkina Faso, Colombia, Ghana, Kenya, Nepal, the Philippines, Rwanda, and Sierra Leone. Across the board, study participants reported drops in employment, income, and access to markets and services, translating into high levels of food insecurity. Many households reported being unable to meet basic nutritional needs.

"COVID-19 and its economic shock present a stark threat to residents of low- and middle-income countries -- where most of the world's population resides -- which lack the social safety nets that exist in rich countries," said economist Susan Athey, of Stanford University's Graduate School of Business. "The evidence we've collected show dire economic consequences, including rising food insecurity and falling income, which, if left unchecked, could thrust millions of vulnerable households into poverty."

Across the 16 surveys, the percentage of respondents reporting losses in income ranged from 8% in Kenya to 86% in Colombia; the median across surveys was a staggering 70%. The percentage reporting loss of employment ranged from 6% in Sierra Leone to 51% in Colombia, with a median of 29%.

"Painting a comprehensive picture of the economic impact of this global crisis requires the collection of harmonized data from all over the world," said Edward Miguel, the Oxfam Professor of Environmental and Resource Economics at the University of California, Berkeley, Director of the Center for Effective Global Action, and a co-author of the study. "Our work is an exciting example of fruitful collaboration among research teams from UC Berkeley, Northwestern, Innovations for Poverty Action, The Busara Center for Behavioral Economics in Kenya, Yale, and many others working in multiple countries simultaneously to improve our understanding of how COVID-19 has affected the living standards of people in low- and middle-income countries on three continents."

Significant percentages of respondents across the surveys reported being forced to miss meals or reduce portion sizes, including 48% of rural Kenyan households, 69% of landless, agricultural households in Bangladesh, and 87% of rural households in Sierra Leone -- the highest level of food insecurity. Poorer households generally reported higher rates of food insecurity, though rates were substantial even among the better off. The steep rise in food insecurity reported among children was particularly alarming given the potentially large negative long-run effects of under-nutrition on outcomes later in life, according to the study.

Survey results from Bangladesh and Nepal suggest that levels of food insecurity were far higher during the pandemic than during the same season in previous years.

In most countries, a large share of respondents reported reduced access to markets, consistent with lockdowns and other restrictions on mobility implemented between March and June 2020 to contain the spread of the virus. The amount of social support available to respondents from governments or non-governmental organizations varied widely across the surveys, but the high rates of food insecurity reported suggest that support was insufficient even when present, the researchers state.

The study shows that in addition to increasing food insecurity, the pandemic and accompanying containment measures have undermined several other aspects of household wellbeing. Schools in all sample countries were closed during most or all of the survey period. Respondents also reported reduced access to health services, including prenatal care and vaccinations. Combined, these factors could be particularly damaging to children in the long run, the researchers note.

"The pandemic's economic shock in these countries, where so many people depend on casual labor to feed their families, causes deprivations and adverse consequences in the long term, including excess mortality," said study co-author Ashish Shenoy, of the University of California, Davis. "Our findings underscore the importance of gathering survey data to understand the effects of the crisis and inform effective policy responses. We demonstrate the efficacy of large-scale phone surveys to provide this crucial data."

Current circumstances may call for social protection programs that prioritize addressing immediate poverty and under-nutrition before tackling deeper underlying causes, the researchers state. They suggest policymakers consider identifying poor households using mobile phones and satellite data and then providing them with mobile cash transfers. The researchers also recommend providing support for basic utilities, such as water and electricity, through subsidies and by removing penalties for unpaid bills. They note a fundamental link between containing COVID-19 and providing economic relief: households facing acute shortages may be less willing than others to follow social distancing rules when doing so would prevent them from meeting basic needs.

Credit: 
Innovations for Poverty Action

Drug 'breakthrough' gives longest-ever survival in nonresectable liver cancer patients

image: Digital Liver Cancer Summit 2021

Image: 
EASL

6 February: The IMbrave150 trial found median overall survival was 19.2 months in patients treated with atezolizumab plus bevacizumab (atezo+bev) vs 13.4 months for those treated with sorafenib alone, the current standard treatment (HR, 0.66 [95% CI, 0.52-0.85]; P=0.0009). Survival at 18 months was 52% with atezo+bev and 40% in patients treated with sorafenib.

All patients in the trial had nonresectable hepatocellular carcinoma (HCC) - the most common form of liver cancer - and had not previously been treated with systemic therapy. A total of 501 patients were treated in the multicentre, open-label, randomised controlled trial, and the new follow-up figures confirm the superiority of the atezo+bev combination over sorafenib in this group of patients with HCC.

Atezolizumab is an immune checkpoint inhibitor, which helps the immune system hunt down and destroy cancer. Bevacizumab is a targeted monoclonal antibody therapy that starves tumours of their blood supply by blocking vascular endothelial growth factor (VEGF); it also enhances the immune effects of atezolizumab.

The new data, presented today at the European Association for the Study of the Liver (EASL) Liver Cancer Summit 2021, follow the initial publication of trial data with 8.6 months of follow-up, which found survival at 12 months was 67.2% with atezo+bev, compared to 54.6% in those treated with sorafenib. This new post-hoc descriptive overall survival analysis included 12 months of additional follow-up beyond the primary analysis.
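Figures such as median overall survival and the hazard ratio are typically estimated from Kaplan-Meier curves and Cox proportional-hazards models. The following sketch uses the lifelines library on synthetic exponential event times loosely matched to the reported medians; it illustrates the calculations and does not reproduce the IMbrave150 data.

```python
# Sketch of how median overall survival and a hazard ratio are estimated,
# using the lifelines library on synthetic data. This does not reproduce
# the IMbrave150 dataset; exponential event times are an assumption.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(3)
n = 250
# Synthetic survival times (months), roughly matching the reported medians
# (median of an exponential = scale * ln 2).
t_atezo_bev = rng.exponential(19.2 / np.log(2), n)
t_sorafenib = rng.exponential(13.4 / np.log(2), n)

df = pd.DataFrame({
    "months": np.concatenate([t_atezo_bev, t_sorafenib]),
    "event": 1,                       # 1 = death observed (no censoring here)
    "atezo_bev": [1] * n + [0] * n,   # treatment arm indicator
})

kmf = KaplanMeierFitter()
kmf.fit(df.loc[df.atezo_bev == 1, "months"], df.loc[df.atezo_bev == 1, "event"])
print("median OS, atezo+bev arm:", kmf.median_survival_time_)

# Cox proportional-hazards model: exp(coef) is the hazard ratio for treatment.
cph = CoxPHFitter().fit(df, duration_col="months", event_col="event")
print("hazard ratio:", np.exp(cph.params_["atezo_bev"]))
```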

Prof. Richard Finn, lead author of the study, commented, "IMbrave150 showed consistent clinically meaningful treatment benefit and safety with an additional 12 months of follow-up. The combination provides the longest survival seen in a front-line Phase III study in advanced HCC, confirming atezo+bev as a standard of care for previously untreated, unresectable HCC."

"These are highly significant findings for the treatment of patients with HCC. Many thousands of patients worldwide could benefit from this treatment and it can be considered a major breakthrough - the first improvement in treatment for these types of cases in 13 years and a treatment long awaited by doctors."

The trial enrolled systemic treatment-naive patients with unresectable HCC, ≥1 measurable untreated lesion (RECIST 1.1), Child-Pugh class A liver function and ECOG PS 0/1. Patients were randomised 2:1 to atezo 1200 mg IV q3w + bev 15 mg/kg IV q3w or sorafenib 400 mg bid until unacceptable toxicity or loss of clinical benefit per investigator. Patients were required to have an upper endoscopy within 6 months of starting the study, to assess for high-risk varices.

Survival benefit with atezo+bev vs sorafenib was generally consistent across subgroups and with the primary analysis. The updated objective response rate (ORR; 29.8% per RECIST 1.1) with atezo+bev was in line with the primary analysis, with more patients achieving complete response (CR; 7.7%) than previously reported. Safety was consistent with the primary analysis, with no new signals identified.

"We now need to understand what is next in front-line liver cancer and how will we build on this data to further improve outcomes beyond the 19.2 months we described. Additionally, we need to evaluate the efficacy for this regimen in earlier stages of HCC."

Credit: 
Spink Health

Pandemic caused 'staggering' economic, human impact in developing countries, research says

Berkeley -- The onset of the COVID-19 pandemic last year led to a devastating loss of jobs and income across the global south, threatening hundreds of millions of people with hunger and lost savings and raising an array of risks for children, according to new research co-authored at the University of California, Berkeley.

The research, to be published Friday Feb. 5, 2021, in the journal Science Advances, found "staggering" income losses after the pandemic emerged last year, with a median 70% of households across nine countries in Africa, Asia and Latin America reporting financial losses. By April last year, roughly 50% or more of those surveyed in several countries were forced to eat smaller meals or skip meals altogether, a number that reached 87% for rural households in the West African country of Sierra Leone.

"In the early months of the pandemic, the economic downturn in low- and middle-income countries was almost certainly worse than any other recent global economic crisis that we know of, whether the Asian financial crisis of the late 1990s, the Great Recession that started in 2008, or the more recent Ebola crisis," said UC Berkeley economist Edward Miguel, a co-author of the study. "The economic costs were just severe, absolutely severe."

The pandemic has produced some hopeful innovations, including a partnership between the government of Togo in West Africa and UC Berkeley's Center for Effective Global Action (CEGA) on a system to provide relief payments via digital networks.

But such gains are, so far, isolated.

The new study -- the first of its kind globally -- reports that after two decades of growth in many low- and middle-income countries, the economic crisis resulting from the COVID-19 pandemic threatens profound long-term impact: Reduced childhood nutrition could have health consequences later in life. Closed schools may lead to delayed development for some students, while others may simply drop out. When families use their savings to eat, rather than invest in fertilizer or farm improvements, crop yields can decline.

"Such effects can slow economic development in a country or a region, which can lead to political instability, diminished growth or migration," said Miguel, a co-director at CEGA.

A troubling picture of life during the pandemic

The study was launched in spring 2020, as China, Europe and the U.S. led global efforts to check the spread of the virus through ambitious lockdowns of businesses, schools and transit. Three independent research teams, including CEGA, joined forces to conduct surveys in the countries where they already worked.

Between April and early July 2020, they connected with 30,000 households, including over 100,000 people, in nine countries with a combined population of 500 million: Burkina Faso, Ghana, Kenya, Rwanda and Sierra Leone in Africa; Bangladesh, Nepal and the Philippines in Asia; and Colombia in South America. The surveys were conducted by telephone.

Reports early in the pandemic suggested that developing countries might be less vulnerable because their populations are so much younger than those in Europe and North America.

But the research teams found that, within weeks after governments imposed lockdowns and other measures to control the virus's spread, the pandemic was having a pervasive economic impact:

Income fell broadly. In Colombia, 87% of respondents nationwide reported lost income in the early phase of the pandemic. Such losses were reported by more than 80% of people nationwide in Rwanda and Ghana.

People struggled to find food. In the Philippines, 77% of respondents nationwide said they faced difficulty purchasing food because stores were closed, transport was shut down or food supplies were inadequate. Similar reports came from 68% of Colombians and 64% of respondents in Sierra Leone; rates were similar for some communities within other countries.

Food insecurity rose sharply. While the impact was worst in rural Sierra Leone, other communities were hard hit: In Bangladesh, 69% of landless agricultural households reported that they were forced to eat less, along with 48% of households in rural Kenya.

Children faced increased risk. With schools closed, the risk of educational setbacks rose. Many respondents reported delaying health care, including prenatal care and vaccinations. Some communities reported rising levels of domestic violence.

"The combination of a lengthy period of undernutrition, closed schools, and limited health care may be particularly damaging in the long run for children from poorer households who do not have alternative resources," the authors wrote.

Miguel's recent research has focused on economic conditions for poor people in Kenya, and he said people there scrambled to cope with the crisis.

"People moved in with relatives," he said. "People moved back to their home areas in rural places where there was food. Other people were just relying on the generosity of friends and relatives and co-workers to get by. When you're living on only a couple of dollars a day, and you don't get that money, it's a desperate situation."

Wealthier countries are also gripped by crisis, but co-author Susan Athey, an economist at Stanford University's Graduate School of Business, said they're better able to cope.

"COVID-19 and its economic shock present a stark threat to residents of low- and middle-income countries -- where most of the world's population resides -- which lack the social safety nets that exist in rich countries," Athey said. "The evidence we've collected shows dire economic consequences ... which, if left unchecked, could thrust millions of vulnerable households into poverty."

A model of positive, high-impact international partnership

In fact, Miguel said, governments everywhere have struggled to address the health and economic dimensions of the pandemic. In both rich and poor nations, he said, governments have used the pandemic as a reason to crack down on political opponents.

But the crisis has also produced hopeful engagements. The CEGA initiative to support Togolese leaders in developing a system for digital relief payments could be a model for international partnerships.

Under that project, CEGA co-Director Joshua Blumenstock has worked closely with top government officials in Togo to develop an advanced data-driven system for identifying people in need and delivering financial aid. The system uses new computational technologies, with data from satellite imagery, mobile phones and traditional surveys to identify people or communities in economic distress.

CEGA and the GiveDirectly aid organization have just won a $1.2 million grant under the data.org Inclusive Growth and Recovery Challenge to allow further work on the project.

Already, "over 550,000 Togolese individuals have received cash transfers of roughly $20 a month," said Lauren Russell, CEGA director of operations. "The grant should allow for the project to be scaled and evaluated even further, with the hope that the methods might be well-suited for adoption by other low- and middle-income countries."

Global crises require global solutions

Still, Miguel said the disparities between rich and poor nations have been "disheartening." In North America and Europe, nations may be struggling with vaccination plans, but vaccines have barely arrived in most low-income countries, he said.

"We will not recover in the rich countries until the whole world gets the vaccine and until the crisis is dealt with globally," he said. "As long as there's active pandemic in parts of the world that's affecting travel and tourism and trade, our economy and our society is going to suffer. If we can spread the wealth in terms of pandemic relief assistance and vaccine distribution, we're all going to get out of this hole faster."

Credit: 
University of California - Berkeley

Fingerprint for the formation of nitrous oxide emissions

image: The 16 grassland monoliths come from the Kaserstattalm in the Tyrolean Stubaital - a site for long-term ecosystem research.

Image: 
Eliza Harris

Scientists led by Eliza Harris and Michael Bahn from the Institute of Ecology at the University of Innsbruck have succeeded in studying emissions of the greenhouse gas N2O under the influence of environmental impacts at an unprecedented level of detail. The study, now published in Science Advances, is thus also a starting point for models that could predict future trends in the greenhouse gas emission dynamics of ecosystems under global climate change.

Nitrous oxide (N2O) is a potent greenhouse gas whose atmospheric growth rate has accelerated over the past decade. The largest share of anthropogenic N2O emissions results from the fertilization of soils with nitrogen, which is converted into N2O via various abiotic and biological processes. As part of the FWF-funded project NitroTrace, a team of scientists led by Eliza Harris and Michael Bahn from the Functional Ecology research group at the University of Innsbruck has now been able to trace in detail the N2O production and consumption pathways that occur within the nitrogen cycle and ultimately lead to the emission of this greenhouse gas.

In an experimental setup at the University of Innsbruck, 16 intact grassland monoliths from the subalpine Long-Term Ecosystem Research (LTER) site Kaserstattalm in the Stubaital region of Tyrol were studied. The soil blocks were exposed to extreme drought and subsequent rewetting, conditions that reflect the climatic changes to which many regions across the globe, including the Alps, are increasingly exposed. "Our goal was to quantify the net effect of drought and rewetting on N2O formation processes and emissions, which is currently largely unexplored," says Eliza Harris.

Contrary to the researchers' expectations, denitrification - the breakdown of nitrate to N2O and molecular nitrogen (N2) by specialized microorganisms - was found to dominate N2O production in very dry soils. According to previous assumptions, this process takes place primarily in moist, oxygen-poor soils, which means more N2O can be released into the atmosphere during drought than expected. The researchers had instead expected nitrification, which produces nitrate, an important chemical compound for plants, to predominate in dry soils. "We assumed that if the soil was dry, there would be enough oxygen available for nitrification. After closer examination, we were able to detect drought-induced accumulations of nitrogen-containing organic matter on the surface of our soil samples and identify them as triggers for denitrification in dry soil. This suggests a strong role for the previously poorly understood chemodenitrification and codenitrification pathways, in which additional abiotic and biotic processes lead to the formation of N2O," says Eliza Harris of the surprising result. Overall, N2O emissions were greatest during rewetting after extreme drought.

The results provide researchers with unprecedented insights into the nitrogen cycle and the processes involved in the formation of the greenhouse gas N2O in response to environmental parameters. A better understanding of production and consumption reactions can help to find solutions to reduce greenhouse gas emissions, which have been increasing for decades.

Innovative analysis method

Crucial to the research success was the use of laser isotope spectroscopy, made possible through the FFG-funded project LTER-CWN. "Through this novel analytical technique, we can determine the isotopic composition of N2O. Thus, we get a kind of fingerprint for the formation process of the emitted N2O, which in turn helps us to understand its microbial formation process," says Eliza Harris, emphasizing the importance of the procedure. Molecular ecology analyses also helped the team determine which genes and microbes were involved in the nitrogen transformation, and spatial analysis techniques helped determine elemental composition and distribution in the soil. "We hope that by continuing to apply the combination of these methods in future similar research projects, we will gain further insights into feedback effects between climate change and the nitrogen cycle across different ecosystems and environments," says Eliza Harris. The researchers' long-term goal is to use models to predict ecosystem emission dynamics in the context of climate change.
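To illustrate how such an isotopic fingerprint is read: N2O is a linear molecule, and the difference in 15N enrichment between its central (alpha) and terminal (beta) nitrogen atoms, known as the site preference, differs between microbial production pathways. The sketch below uses approximate endmember values from the general isotope literature - an assumption for illustration, not values reported in this study - to estimate the share of nitrification versus denitrification in a sample:

```python
# Minimal sketch: interpreting an N2O "site preference" fingerprint.
# Endmember values below are approximate literature figures, NOT
# results from the Innsbruck study; real interpretation also corrects
# for partial N2O reduction to N2 and uses additional tracers.

SP_NITRIFICATION = 33.0    # permil, approximate literature endmember
SP_DENITRIFICATION = 0.0   # permil, approximate literature endmember

def site_preference(d15N_alpha: float, d15N_beta: float) -> float:
    """Site preference = delta15N(central N) - delta15N(terminal N)."""
    return d15N_alpha - d15N_beta

def fraction_nitrification(sp_sample: float) -> float:
    """Two-endmember mixing estimate of the nitrification share."""
    f = (sp_sample - SP_DENITRIFICATION) / (SP_NITRIFICATION - SP_DENITRIFICATION)
    return min(max(f, 0.0), 1.0)

sp = site_preference(d15N_alpha=5.0, d15N_beta=3.0)  # illustrative inputs
print(f"SP = {sp:.1f} permil -> ~{fraction_nitrification(sp):.0%} nitrification")
```

A low site preference, as in this illustrative sample, would point toward denitrification as the dominant source, in line with the pattern the study reports for dry soils.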

Credit: 
University of Innsbruck

Genes for face shape identified

image: An international research team found that a gene influencing lip shape in a Latin American population appears to have been inherited from the Denisovans, an extinct group of ancient humans who lived tens of thousands of years ago.

Image: 
UCL, Aix-Marseille University and The Open University research team

Genes that determine the shape of a person's facial profile have been discovered by a UCL-led research team.

The researchers identified 32 gene regions that influence facial features such as nose, lip, jaw and brow shape, nine of which were entirely new discoveries, while the others validated genes for which there was previously only limited evidence.

The analysis of data from more than 6,000 volunteers across Latin America was published today in Science Advances.

The international research team, led from UCL, Aix-Marseille University and The Open University, found that one of the genes appears to have been inherited from the Denisovans, an extinct group of ancient humans who lived tens of thousands of years ago.

The team found that the gene, TBX15, which contributes to lip shape, was linked with genetic data found in the Denisovan people, providing a clue to the gene's origin. The Denisovans lived in central Asia, and other studies suggest they interbred with modern humans, as some of their DNA lives on in Pacific Islanders and Indigenous people of the Americas.

Co-corresponding author Dr Kaustubh Adhikari (UCL Genetics, Evolution & Environment and The Open University) said: "The face shape genes we found may have been the product of evolution as ancient humans evolved to adapt to their environments. Possibly, the version of the gene determining lip shape that was present in the Denisovans could have helped in body fat distribution to make them better suited to the cold climates of Central Asia, and was passed on to modern humans when the two groups met and interbred."

Co-first author Dr Pierre Faux (Aix-Marseille University) said: "To our knowledge this is the first time that a version of a gene inherited from ancient humans is associated with a facial feature in modern humans. In this case, it was only possible because we moved beyond Eurocentric research; modern-day Europeans do not carry any DNA from the Denisovans, but Native Americans do."

Co-first author Betty Bonfante (Aix-Marseille University) added: "It is one of only a few studies looking for genes affecting the face in a non-European population, and the first one to focus on the profile only."

Only in the last two decades have researchers been able to analyse complex genetic data from thousands of people at once, since the mapping of the human genome enabled genome-wide association studies (GWAS) that find correlations between traits and genes. This study compared participants' genetic information with characteristics of their face shape, quantified with 59 measurements (distances, angles and ratios between set points) taken from photos of the participants' faces in profile.
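The statistical core of a genome-wide association study can be sketched compactly: for each genetic variant, a regression tests whether genotype dosage (0, 1 or 2 copies of an allele) predicts a trait, after adjusting for covariates such as age, sex and ancestry. The following is a minimal sketch on synthetic data; the study's actual pipeline, quality-control steps and data are not reproduced here:

```python
# Minimal GWAS sketch: per-variant linear regression of one facial
# measurement (e.g. a nose angle) on genotype dosage with covariates.
# Purely illustrative; real pipelines add quality control, kinship
# correction and careful multiple-testing handling.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_people, n_snps = 6000, 1000                 # sizes are illustrative
genotypes = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)
covariates = rng.normal(size=(n_people, 3))   # e.g. age, sex, ancestry PC
phenotype = rng.normal(size=n_people)         # one of the 59 measurements

# Residualize the phenotype on covariates once, then test each SNP.
design = np.column_stack([np.ones(n_people), covariates])
beta_cov, *_ = np.linalg.lstsq(design, phenotype, rcond=None)
resid = phenotype - design @ beta_cov

pvalues = np.array([
    stats.linregress(genotypes[:, j], resid).pvalue for j in range(n_snps)
])
hits = np.where(pvalues < 5e-8)[0]  # conventional genome-wide threshold
print(f"{len(hits)} variants pass genome-wide significance")
```

With purely random data, as here, essentially no variants should pass the threshold; in the real study, signals at loci such as TBX15 and VPS13B stood out against exactly this kind of genome-wide background.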

Co-corresponding author Professor Andres Ruiz-Linares (Fudan University, UCL Genetics, Evolution & Environment, and Aix-Marseille University) said: "Research like this can provide basic biomedical insights and help us understand how humans evolved."

The findings could help clarify the developmental processes that determine facial features, which would aid researchers studying genetic disorders that cause facial abnormalities.

The results also contribute to the understanding of the evolution of facial appearance in human and other species. One of the newly discovered genes found in this study is VPS13B, which influenced nose pointiness; the researchers also found that this gene affects nose structure in mice, indicating a broadly shared genetic basis among distantly related mammal species.

Credit: 
University College London

COVID-19: Schools urgently need guidelines on improving ventilation in classrooms

There is an urgent need for guidelines on how schools can use ventilation to reduce the risk of COVID-19 transmission in the classroom, according to doctors at Imperial College London and the headteacher of a secondary school in Pinner, Middlesex. In a commentary published by the Journal of the Royal Society of Medicine, the authors say that improving air quality in classroom spaces should be as important as following government advice regarding social distancing, mask-wearing and hand washing.

The authors point to lessons from the airline industry, where the risk of contracting COVID-19 on a flight is currently lower than from an office building or a classroom. Lead author Dr Kaveh Asanati, Honorary Clinical Senior Lecturer in occupational lung disease at the National Heart & Lung Institute, Imperial College London, said: "The multi-layer risk reduction strategy used in the aviation industry seems to have been working efficiently. The strategy includes testing passengers, the use of face coverings or masks, hygiene measures and, more importantly, maintaining clean air by circulating a mix of fresh air and recycled air through High-Efficiency Particulate Air (HEPA) filters."

Few school buildings have HEPA filtration, but a potential practical option for schools would, according to the authors, be the use of portable HEPA filtration units. They note that the US Centers for Disease Control and Prevention recommends that healthcare settings consider adding these units to improve air quality where permanent air-handling systems are not a feasible option during the COVID-19 pandemic. The authors go on to describe a study in a hospital room housing COVID-19 patients, in which researchers were able to detect SARS-CoV-2 in aerosols only when they used air samplers without a HEPA filter on the inlet tube.
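The benefit of a portable unit is commonly summarized as the extra air changes per hour (ACH) it provides: its clean air delivery rate divided by the room volume. A back-of-the-envelope sketch, with all numbers chosen for illustration rather than taken from the commentary:

```python
# Back-of-the-envelope sketch: extra air changes per hour (ACH) a
# portable HEPA unit adds to a classroom. All numbers are illustrative
# assumptions, not figures from the Journal of the Royal Society of
# Medicine commentary.

def added_ach(cadr_m3_per_h: float, room_volume_m3: float) -> float:
    """Added ACH = clean air delivery rate / room volume."""
    return cadr_m3_per_h / room_volume_m3

room_volume = 7.0 * 8.0 * 3.0  # hypothetical 7 m x 8 m room, 3 m ceiling
cadr = 400.0                   # hypothetical unit rated at 400 m3/h

print(f"Added ACH: {added_ach(cadr, room_volume):.1f}")
```

The point of the calculation is that units must be sized to the room: a single small unit in a large classroom adds little, which is one reason the authors call for proper feasibility work rather than ad hoc purchases.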

Dr Asanati said: "To keep schools open, there is an urgent need to implement more effective on-site mitigation strategies, with particular attention to ventilation and testing. In addition, it is essential that teachers and other school staff should be added to the priority list for vaccination."

The authors say a feasibility study of implementing better ventilation and filtration systems in schools is needed, along with pilot work and research involving indoor air quality and heating, ventilation and air conditioning (HVAC) experts. Until then, they write, keeping doors and windows open, as much as is reasonably practicable, seems to be the best way forward.

Credit: 
SAGE