Tech

Dealing with the global tsunami of mental health problems during and post COVID-19

In a special session addressing global mental health before, during and after the COVID-19 pandemic, held at the ESCMID Conference on Coronavirus Disease (ECCVID), Professor Vikram Patel (Harvard Medical School, USA) will present a new review of the impact of the COVID-19 pandemic on global mental health.

He will explain: "Mental health problems were already a leading cause of suffering and the most neglected health issue globally before the pandemic. The pandemic will, through worsening the social determinants of mental health, fuel a worsening of this crisis."

The pressures on mental health, which were already abundant before this global pandemic, are increasing at an alarming rate. Prof Patel will touch on some of these in his talk. "There are so many issues which affect large sections of the population, including worries about jobs and income security, social exclusion, school closures and working from home creating huge pressure on families," he says. "There are also disruptions to medical services and care, potential domestic violence situations, and the varying levels of fear people have of being infected by this new virus."

The pandemic threatens to reverse years of global development, including in the countries that can least afford to start going backwards. In August 2020, World Bank President David Malpass predicted that as many as 100 million people would be pushed back into extreme poverty. As a result of the global economic recession, the mental health tsunami is going to sweep through all countries, rich and poor. "The 2008 recession, which largely affected only the US, was followed by a wave of 'deaths of despair' in the USA, driven by suicide and substance use," explains Prof Patel. "Without huge levels of government support for both the mental health sector and a whole host of other sectors, we are tragically facing a repeat of this, but perhaps on a much greater scale."

He points out that before COVID-19 arrived, there was already a global mental health crisis. "The relative burden of mental and substance use disorders increased by nearly 50% in the past 25 years. These disorders now account for one in every ten years of lost health globally and suicide rates in young people are rising in many countries."

However, he concludes: "I believe the pandemic presents a historic opportunity to reimagine mental health care, by realising the science which demonstrates that we must reframe mental health beyond a narrow focus on 'diagnoses, doctors and drugs'. This implies that we need to greatly increase the emphasis on prevention through actions on social determinants in the first two decades of life, adopt a human rights approach to eliminate coercion and place the lived experience at the heart of all mental health care, and scale up the evidence demonstrating that non-specialist, community based providers can effectively deliver psychosocial interventions. Above all, the science emphasizes the need to embrace the diversity of experiences and interventions to address this crisis - that for most of us, will be the worst health crisis we are likely to see in our lifetimes."

Credit: 
European Society of Clinical Microbiology and Infectious Diseases

Chemists from RUDN University developed biodegradable antibacterial film for storing food

image: A team of chemists from RUDN University created an antibacterial coating for food products. The mixture consists of two components that are safe for human health and form a thin, non-toxic, and biodegradable film. The film has no color or flavor and can increase the shelf life of different products 2.5 to 8 times.

Image: 
RUDN University

A team of chemists from RUDN University created an antibacterial coating for food products. The mixture consists of two components that are safe for human health and form a thin, non-toxic, and biodegradable film. The film has no color or flavor and can increase the shelf life of different products 2.5 to 8 times. The results of the study were published in the Food Packaging and Shelf Life journal.

Food wraps are usually produced from chemical compounds based on synthetic polymers such as polyethylene and polypropylene. These materials are bad for the environment and take decades, if not thousands of years, to decompose. Naturally, a food coating should protect the food and contain no toxins that could contaminate it, but it is also important for it to decompose without a trace after use. A team of chemists from RUDN University created such a coating from polysaccharides--natural macromolecules that are the building blocks of living organisms. Polysaccharides have no negative effect on health and are biodegradable and non-toxic.

The antibacterial coatings suggested by the team are based on chitosan, a polysaccharide found in the carapaces of crabs and in lower fungi. Specifically, the chemists used two derivatives of chitosan: SC-Na, or chitosan succinyl sodium salt, and a compound of triazole, betaine, and chitosan (TBC). The latter possesses antibacterial properties comparable to those of modern antibiotics. According to the team, TBC nanoparticles embed into an SC-Na grid, or matrix, creating a thin uniform film. It is much stronger than either of its individual components and lets through less oxygen and water vapor. In the course of the experiments, the scientists confirmed that the film is most effective when the SC-Na to TBC ratio is 1 to 1.

"We managed to obtain non-toxic chitosan derivatives with extraordinary antibacterial properties almost similar to those of commercial antibiotics and suggested that they could be used to increase film durability and antibacterial characteristics. We based our coating on SC-Na, a salt with high film-forming ability. Moreover, it is not toxic and works as an antioxidant, increasing the shelf life of food products. By changing the TBC/SC-Na ratio, we developed multifunctional food coatings with improved antibacterial, protective, and mechanical properties," said Andreii Kritchenkov, a Candidate of Chemical Sciences, and a research assistant at the Department of Inorganic Chemistry, RUDN University.

To test their invention, the team wrapped several bananas in the film for 10 days. In the course of the experiment, the scientists measured the fruits' weight, vitamin C content, and level of carbon dioxide emission. After 10 days, these parameters were compared with those of a control group kept without the coating. The coated fruit turned out to have lost 3 times less weight and 8 times less vitamin C, and the frequency of their 'breathing' was 2.6 times lower (i.e., metabolic processes associated with CO2 emission slowed down).
Thanks to these properties, chitosan-based films can be used to store food products. After a film has been used, it will decompose without causing harm to the environment.

Credit: 
RUDN University

Dartmouth study offers new details on pediatric mental health boarding

A Dartmouth-led study, published in the journal Pediatrics, offers new details about pediatric mental health boarding in emergency departments across the country, a problem that has steadily increased in the last 10 years and been made worse by a shortage of psychiatric resources.

Boarding refers to the practice of admitting children and adolescents--who are in need of inpatient mental health treatment--to emergency departments or inpatient medical units while they wait for a psychiatric bed to become available in the hospital.

Behavioral and mental health disorders are the most common and costly chronic diseases that affect children and adolescents. Approximately one in six U.S. youths has a behavioral or mental health condition, and treatment costs for these disorders are estimated to exceed $13 billion annually. Yet, 50 to 70 percent of children who have treatable behavioral and mental health conditions don't receive care from behavioral and mental health professionals.

"Although mental health boarding is widely recognized as a major health system challenge, its processes, outcomes, and risk factors had not previously been systematically reviewed," explains co-author Fiona McEnany, MPH '19, a PhD student at Dartmouth's Guarini School of Graduate and Advanced Studies who contributed to the research project as part of her Master of Public Health program study at The Dartmouth Institute for Health Policy and Clinical Practice. "Our goal was to characterize the prevalence of pediatric mental health boarding and to identify factors among patients and hospitals that increase the likelihood of this care process."

To this end, the research team conducted a comprehensive review of 222 studies to describe frequencies, durations, processes, outcomes, and/or risk factors associated with youth mental health boarding in the U.S. Of the 11 studies that met their criteria for inclusion, the majority were retrospective analyses conducted at individual hospitals. Of these single-center studies, all were performed at children's hospitals or pediatric emergency departments in urban or suburban areas--with study sample sizes ranging from 27 to 44,328 patient participants.

The investigators found that among young patients needing inpatient psychiatric care, 23 to 58 percent were boarded in hospital emergency departments while 26 to 49 percent were boarded on inpatient medical units. Boarding durations ranged on average from five to 41 hours in emergency departments and two to three days in inpatient units. Key risk factors for children included being younger in age, having suicidal or homicidal thoughts, and seeking care at hospitals during non-summer months.

In sum, the research team found that pediatric mental health boarding is prevalent and understudied. "It's a vital issue in youth mental healthcare today, experienced by at least 40,000 to 66,000 youth admitted to hospitals each year," says The Dartmouth Institute's JoAnna Leyenaar, MD, PhD, MPH, an associate professor of pediatrics and of health policy and clinical practice at Dartmouth's Geisel School of Medicine, and a co-author on the study. "More research that represents a diversity of hospital types and geographical regions is needed, so that we can inform clinical interventions and healthcare policies to better support youth who board each day at hospitals across the country."

Credit: 
The Geisel School of Medicine at Dartmouth

NASA observations aid efforts to track California's wildfire smoke from space

image: On Aug. 31, MODIS detected several hotspots in the August Complex Fire in California, as well as several other actively burning areas to the north, west, and south.

Image: 
Credits: R. Kahn/K.J. Noyes/NASA Goddard/A. Nastan/JPL Caltech/J. Tackett/J-P Vernier/NASA Langley

Wildfires have been burning across the state of California for weeks - some of them becoming larger complexes as different fires merge. One of those was the August Complex Fire, which reportedly began as 37 distinct fires caused by lightning strikes in northern California on Aug. 17. That fire is still burning over a month later.

The August Complex Fire and others this fire season have been sending far-reaching plumes of wildfire smoke into the atmosphere that worsen air quality in California and beyond. Predicting where that smoke will travel and how bad the air will be downwind is a challenge, but Earth-observing satellites can help. Included among them are NASA's Terra and CALIPSO satellites, and the joint NASA-National Oceanic and Atmospheric Administration (NOAA) Suomi NPP satellite. Together, the instruments on these satellites provide glimpses at the smoke over time, which can help improve air quality predictions.

"The satellite instruments have the advantage of providing broad coverage and consistent measurement accuracy over time, as well as making their observations without any risk to the people taking the data," said Ralph Kahn, a senior research scientist with the Earth Sciences Division at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who studies aerosols. Kahn and other atmospheric scientists at NASA collect data about the fires from Earth-observing satellites used to improve models that predict how wildfire smoke will affect air quality downwind of the fires.

MISR: Assessing the Situation from Different Angles

One of the instruments on NASA's Terra satellite is the Multi-angle Imaging Spectroradiometer (MISR), which has nine different cameras pointing toward Earth at different angles. As Terra passed over the August Complex Fire on Aug. 31, MISR collected snapshots of the smoke plume from different angles.

Scientists look at those different perspectives to calculate the extent and height of the smoke plume downwind, as well as the height nearest the source of the fire, called the injection height. That information is essential for determining how far the smoke will travel.
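
The height retrieval rests on simple stereo geometry: a plume feature seen from two different view angles appears shifted between the images, and the size of that shift depends on its altitude. The sketch below illustrates only that geometric idea with a hypothetical parallax value and view angle; MISR's actual retrieval matches features across all nine cameras and also solves for wind advection between views.

```python
import math

def plume_height_from_parallax(parallax_km, view_angle_deg):
    """Toy stereo-height estimate: a feature at height h appears displaced by
    roughly h * tan(view_angle) in an oblique image relative to the nadir view,
    so h ~ parallax / tan(view_angle). Ignores wind drift between camera views,
    which the real MISR algorithm accounts for."""
    return parallax_km / math.tan(math.radians(view_angle_deg))

# Hypothetical example: a 2.8 km apparent shift in a camera viewing 60 degrees
# off nadir corresponds to a feature roughly 1.6 km above the surface.
print(round(plume_height_from_parallax(2.8, 60.0), 2))
```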

"Smoke tends to stay aloft longer, travel farther and have a larger environmental impact, perhaps far downwind, if it's injected higher into the atmosphere," said Kahn.

On Aug. 31, the highest parts of the plume from the August Complex Fire reached approximately 2.5 miles (4 kilometers) into the air - putting it above the boundary layer of the atmosphere, which is the layer of the atmosphere nearest to the Earth's surface. The fresh smoke plume extended at least 30 miles (45 kilometers) east of the burning area near Mendocino National Forest in northern California. Over the previous few days, smoke from this fire had already traveled more than 310 miles (500 kilometers) to the west and over 460 miles (750 kilometers) east of the source, crossing into Utah and out over the Pacific Ocean.

The MISR instrument also collected information about the amount, size, and brightness of the particles within the smoke plume based on how the particles scatter light at different angles and wavelengths. These data give researchers information about the characteristics of the wildfire smoke in order to predict how it will move and affect air quality. For example, the southern part of the smoke plume emitted by the August Complex Fire on Aug. 31 was made of mostly small, dark particles usually released when a fire is burning intensely. But as the plume moved downwind, the particles became larger and brighter, possibly because water or other gases emitted by the fires condensed on the smoke particles.

MODIS: A Snapshot of Wildfire Hotspots

Individual wildfires and large conflagrations of merged fires burning throughout the state - and the accumulated smoke they produce - make it difficult to see the actual flaming hotspots from space. But the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra satellite can see longer wavelengths of nonvisible light - the infrared radiation produced by the heat of actively burning wildfires. In other words, MODIS can sometimes see through smoke even when our eyes can't, by comparing the higher infrared radiation from hotspots with the lower radiation coming from the surrounding area.
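
That hotspot-versus-background comparison can be illustrated with a minimal contextual test on a grid of infrared brightness temperatures, as in the sketch below. The window size and temperature thresholds are illustrative assumptions; the operational MODIS fire product uses multiple spectral bands, cloud and water masks, and adaptive thresholds.

```python
import numpy as np

def flag_hotspots(bt_kelvin, window=5, delta=10.0, absolute=330.0):
    """Flag pixels whose mid-infrared brightness temperature (K) stands out
    from their local background.

    Toy contextual test only: a pixel is a candidate hotspot if it is either
    hotter than `absolute` K or exceeds the mean of its surrounding window by
    `delta` K."""
    bt = np.asarray(bt_kelvin, dtype=float)
    pad = window // 2
    padded = np.pad(bt, pad, mode="edge")
    hot = np.zeros(bt.shape, dtype=bool)
    for i in range(bt.shape[0]):
        for j in range(bt.shape[1]):
            background = padded[i:i + window, j:j + window]
            local_mean = (background.sum() - bt[i, j]) / (background.size - 1)
            hot[i, j] = bt[i, j] > absolute or bt[i, j] > local_mean + delta
    return hot

# Example: a 3-pixel hotspot embedded in a ~300 K background.
scene = np.full((8, 8), 300.0)
scene[3, 3:6] = [320.0, 345.0, 325.0]
print(np.argwhere(flag_hotspots(scene)))
```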

As it passes over the Western U.S., MODIS can see a swath about 1,430 miles (2,300 kilometers) wide - about the distance from central Utah to almost 70 miles into the Pacific Ocean - providing valuable context about what's going on with the fires and smoke over the Western U.S. MODIS pinpointed multiple clusters of fire hotspots in the August Complex Fire, which had consumed over 240,000 acres by Sept. 2.

"The fire extent is huge in this case, and the smoke plumes can travel hundreds or even thousands of kilometers," said Kahn. "The satellites provide not only context, but also information about the relationships between different fires." During its pass overhead on Aug. 31, MODIS captured the August Complex Fire as well as several other fires and larger complexes of fires burning to the north, south, and east. Seeing the relationships between the fires offers clues to which fires are likely to merge in subsequent days.

CALIPSO and Suomi NPP: Seeing the Extent of the Smoke

The smoke plumes from California's wildfires have engulfed many cities and towns throughout the state, turning the sky an apocalyptic shade of burnt orange. In other areas, the sky is a hazy gray, and flecks of ash float through the air. But in some regions of the West Coast, the sky looks relatively normal - even if there are smoke particles in the air - because there are too few smoke particles for our eyes to detect.

That's where NASA's CALIPSO satellite comes in. CALIPSO has a laser onboard that shoots bursts of laser light toward Earth. When that light hits something, such as particles in a wildfire smoke plume, it is reflected back to sensors on CALIPSO. Although the laser light is too weak to cause any sort of damage, the light reflected back to the satellite by smoke particles tells scientists a lot about the smoke even when the plume is too transparent for them to see with their eyes. As the plume from the August Complex Fire was carried west, CALIPSO detected smoke several days old descending from about 2.5 miles above land to within a mile of the ocean's surface as it crossed the California coastline.

CALIPSO can tell the difference between clouds and smoke, which can sometimes be hard to do by looking at a satellite image. Knowing where the smoke is in relation to clouds allows researchers to see the interactions between clouds and smoke, which can affect the characteristics and spread of the smoke. For example, sometimes clouds ingest and modify smoke particles, and can even remove them from the air when it rains. Other times, dark wildfire smoke particles can absorb sunlight, becoming warm and heating the atmosphere, which can cause clouds to evaporate.

NASA's CALIPSO satellite captures detailed data, but it has a narrow field of vision. The satellite observes along a two-dimensional vertical "curtain" that slices through the smoke plume as it passes overhead, collecting detailed measurements of the type and position of wildfire smoke aerosols in the atmosphere. Scientists then turn to three sensors aboard Suomi NPP, collectively called the Ozone Mapping and Profiler Suite (OMPS), for context. Those sensors get a broader but less detailed view of what's going on with the smoke particles in Earth's atmosphere, which allows scientists to figure out what CALIPSO is homing in on and make better extrapolations based on CALIPSO's data.

The instruments aboard satellites in NASA's Earth-observing fleet provide extensive data, unavailable from any other source, enabling researchers to gain a better understanding of wildfire smoke and how it affects air quality. In cases like the current wildfires across California, NASA's atmospheric scientists studying the fires collaborate with the NASA Earth Science Disasters program to share their findings with firefighters and public health officials. The NASA Disasters program partners with local and regional agencies on the ground, helping get the data from NASA's satellites into the hands of those who need it most.

"Our work is primarily helpful in improving the models that forecast air quality," said Kahn. "This is a team effort and when we can help, we certainly do."

Credit: 
NASA/Goddard Space Flight Center

FSU researchers help develop sustainable polymers

image: Rufina Alamo, Simon Ostrach Professor of Engineering and Distinguished Research Professor of Chemical and Biomedical Engineering at the FAMU-FSU College of Engineering.

Image: 
Mark Wallheiser/FAMU-FSU College of Engineering

Researchers at the FAMU-FSU College of Engineering have made new discoveries on the effects of temperature on sustainable polymers. Their findings may help the industry to produce plastics that are better for the environment.

"Plastics made from petroleum, a non-renewable resource, remain too long in our land and water when discarded," said Rufina Alamo, a professor in the Department of Chemical and Biomedical Engineering. "We are researching how sustainable polymers are heated and cooled so we may produce more 'environmentally friendly' plastics."

Alamo and former doctoral candidate Xiaoshi Zhang, now a postdoctoral research fellow at Penn State, recently published the work in a series of papers that focus on the crystallization of "green" polymers. The latest paper appears as the cover article in Macromolecules, a leading journal for polymer science.

"There is a worldwide motivation to transform how the largest volume of plastics are made," Alamo said. "Polymer chemists and physicists are working hard to produce substitute materials to end problematic plastic waste."

Determining the correct temperature for processing is key to producing better materials that will help scientists replace inexpensive polymers made from petroleum with economically viable, sustainable polymers.

"How the polymer is melted and cooled to make the desired shape is important," Alamo said. "We are trying to understand the intricacies of crystallization to further understand the transformation process."

The team is studying a type of polymer called "long-spaced polyacetals," which are used in plastics. Synthesized in a laboratory at the University of Konstanz in Germany, the long-spaced polyacetals Alamo's team used come from sustainable biomass. They contain a polyethylene backbone linked with acetal groups at precise equal distances. The structure combines the toughness of polyethylene with the hydrolytic degradability of the acetal group. This type of polymer is strong but breaks apart more easily with water than traditional polymers.

"What we discovered is these types of polymers crystalize in an unusual way when cooled after melting," Alamo said.

During the cooling process, the molecules of the melted plastic, which look like curly strands of spaghetti, disentangle to form crystals, and these crystals are responsible for the toughness of the final material. Alamo's group showed that polymer crystallization is controlled by molecular events that take place at the crystal growth front.

The researchers found that when cooled rapidly, these polyacetals become tough and crystalline, and the molecules self-assemble in a type of crystal termed "Form I." When cooled slowly, the material is also very crystalline, but the crystals formed are quite different and are dubbed "Form II." When cooled at intermediate temperatures, the material does not solidify at all. This phenomenon has never been observed in any other crystalline polymers, according to the researchers.

"For crystals to be formed, an energy barrier first needs to be surmounted," Alamo said. "At low temperatures, crystals are easily formed. At high temperatures, crystals are more stable, and at intermediate temperatures, the crystals compete to form, and the material can't solidify."

"This is a significant discovery because it is an important key to understanding how the plastics we use become solids," she said. "We want to provide the industry with the best transformation processes possible. We want sustainable plastics that don't warp or have difficulty solidifying."

The research may provide new ways of manufacturing plastics that will be more economical to produce and sustainable.

Credit: 
Florida State University

COVID-19 news from Annals of Internal Medicine

Below please find a summary and link(s) of new coronavirus-related content published today in Annals of Internal Medicine. The summary below is not intended to substitute for the full article as a source of information. A collection of coronavirus-related content is free to the public at http://go.annals.org/coronavirus.

More data needed before the National Institutes of Health COVID-19 Treatment Guidelines Panel can recommend for or against convalescent plasma for the treatment of COVID-19

In the United States, the efficacy and safety of convalescent plasma for treating COVID-19 is currently being tested in randomized placebo-controlled clinical trials. Based on earlier evidence, the treatment was granted Emergency Use Authorization (EUA) by the U.S. Food and Drug Administration (FDA), which facilitates the availability and unapproved uses of medical products during a public health emergency, such as the COVID-19 pandemic. Members of the National Institutes of Health COVID-19 Treatment Guidelines Panel provide their views regarding the use of convalescent plasma for treating COVID-19 and explain why the currently available data are insufficient for them to recommend for or against the treatment at this time. Their analysis is published in Annals of Internal Medicine. Read the full text: https://www.acpjournals.org/doi/10.7326/M20-6448.

Credit: 
American College of Physicians

Faced with shortages, researchers combine heat and humidity to disinfect N95 masks

As the COVID-19 pandemic swept around the world early this year, shortages of protective equipment such as N95 masks left healthcare workers little choice but to reuse the masks they had - increasing the risk of infection for both them and their patients.

Now, researchers at the Department of Energy's SLAC National Accelerator Laboratory, Stanford University and the University of Texas Medical Branch may have a solution: Using a combination of moderate heat and high relative humidity, the team was able to disinfect N95 mask materials without hampering their ability to filter out viruses.

What's more, it should not be too difficult to turn the new results into an automated system hospitals could use in short order - because the process is so simple, it might take just a few months to design and test a device.

"This is really an issue, so if you can find a way to recycle the masks a few dozen times, the shortage goes way down," said Stanford physicist Steven Chu, a senior author on the new paper. "You can imagine each doctor or nurse having their own personal collection of up to a dozen masks. The ability to decontaminate several of these masks while they are having a coffee break will lessen the chance that masks contaminated with COVID viruses would expose other patients."

The team reported their results September 25th in the journal ACS Nano.

Facing a shortage of the masks early this year, researchers considered a number of ways to disinfect them for reuse, including ultraviolet light, hydrogen peroxide vapors, autoclaves and chemical disinfectants. The problem is that many of those methods degrade N95 masks' filtering abilities, so that at most they could be reused a few times.

In the new study, Chu, University of Texas Medical Branch virologist Scott Weaver and Stanford/SLAC professors Yi Cui and Wah Chiu and colleagues focused their attention on a combination of heat and humidity to try to decontaminate masks.

Working at the World Reference Center for Emerging Viruses and Arboviruses, which has biosafety measures in place for working with the most contagious viruses, the team first mixed up batches of SARS-CoV-2 virus in liquids designed to mimic the fluids that might spray out of our mouths when we cough, sneeze, sing or simply breathe. They next sprayed droplets of the brew on a piece of meltblown fabric, a material used in most N95 masks, and let it dry.

Finally, they heated their samples at temperatures ranging from 25 to 95 degrees Celsius for up to 30 minutes with relative humidity up to 100 percent.

Higher humidity and heat substantially reduced the amount of virus the team could detect on the mask, although they had to be careful not to go too hot, which additional tests revealed could lower the material's ability to filter out virus-carrying droplets. The sweet spot appeared to be 85 degrees Celsius with 100 percent relative humidity - the team could find no trace of SARS-CoV-2 after cooking the masks under those conditions.

Additional results indicate masks could be decontaminated and reused upwards of 20 times and that the process works on at least two other viruses - a human coronavirus that causes the common cold and the chikungunya virus.

Weaver said that although the results are not especially surprising - researchers have known for a long time that heat and humidity are good ways to inactivate viruses - there hadn't been an urgent need for a detailed quantitative analysis of something like mask decontamination until now. The new data, he said, "provide some quantitative guidance for the future."

And even after the coronavirus pandemic is over, there are likely benefits, in part because of the method's application beyond SARS-CoV-2 to other viruses, and because of the economic and environmental benefits of reusing masks. "It's good all around," Cui said.

Credit: 
DOE/SLAC National Accelerator Laboratory

Machine learning takes on synthetic biology: algorithms can bioengineer cells for you

image: Berkeley Lab scientists Tijana Radivojevic (left) and Hector Garcia Martin working on mechanistic and statistical modeling, data visualizations, and metabolic maps at the Agile BioFoundry last year.

Image: 
Thor Swift/Berkeley Lab

If you've eaten vegan burgers that taste like meat or used synthetic collagen in your beauty routine - both products that are "grown" in the lab - then you've benefited from synthetic biology. It's a field rife with potential, as it allows scientists to design biological systems to specification, such as engineering a microbe to produce a cancer-fighting agent. Yet conventional methods of bioengineering are slow and laborious, with trial and error being the main approach.

Now scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a new tool that adapts machine learning algorithms to the needs of synthetic biology to guide development systematically. The innovation means scientists will not have to spend years developing a meticulous understanding of each part of a cell and what it does in order to manipulate it; instead, with a limited set of training data, the algorithms are able to predict how changes in a cell's DNA or biochemistry will affect its behavior, then make recommendations for the next engineering cycle along with probabilistic predictions for attaining the desired goal.

"The possibilities are revolutionary," said Hector Garcia Martin, a researcher in Berkeley Lab's Biological Systems and Engineering (BSE) Division who led the research. "Right now, bioengineering is a very slow process. It took 150 person-years to create the anti-malarial drug, artemisinin. If you're able to create new cells to specification in a couple weeks or months instead of years, you could really revolutionize what you can do with bioengineering."

Working with BSE data scientist Tijana Radivojevic and an international group of researchers, the team developed and demonstrated a patent-pending algorithm called the Automated Recommendation Tool (ART), described in a pair of papers recently published in the journal Nature Communications. Machine learning allows computers to make predictions after "learning" from substantial amounts of available "training" data.

In "ART: A machine learning Automated Recommendation Tool for synthetic biology," led by Radivojevic, the researchers presented the algorithm, which is tailored to the particularities of the synthetic biology field: small training data sets, the need to quantify uncertainty, and recursive cycles. The tool's capabilities were demonstrated with simulated and historical data from previous metabolic engineering projects, such as improving the production of renewable biofuels.

In "Combining mechanistic and machine learning models for predictive engineering and optimization of tryptophan metabolism," the team used ART to guide the metabolic engineering process to increase the production of tryptophan, an amino acid with various uses, by a species of yeast called Saccharomyces cerevisiae, or baker's yeast. The project was led by Jie Zhang and Soren Petersen of the Novo Nordisk Foundation Center for Biosustainability at the Technical University of Denmark, in collaboration with scientists at Berkeley Lab and Teselagen, a San Francisco-based startup company.

To conduct the experiment, they selected five genes, each controlled by different gene promoters and other mechanisms within the cell and representing, in total, nearly 8,000 potential combinations of biological pathways. The researchers in Denmark then obtained experimental data on 250 of those pathways, representing just 3% of all possible combinations, and those data were used to train the algorithm. In other words, ART learned what output (amino acid production) is associated with what input (gene expression).

Then, using statistical inference, the tool was able to extrapolate how each of the remaining 7,000-plus combinations would affect tryptophan production. The design it ultimately recommended increased tryptophan production by 106% over the state-of-the-art reference strain and by 17% over the best designs used for training the model.
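
The workflow described above - train a model on a small measured subset of designs, predict the rest, and recommend the most promising ones for the next engineering cycle - can be sketched with off-the-shelf tools. The snippet below is only an illustration of that loop on a made-up response function, not the patent-pending ART code, which additionally quantifies the uncertainty of its predictions.

```python
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestRegressor

# Toy version of the design-build-test-learn loop: five "genes", each with a
# handful of hypothetical expression levels, a surrogate model trained on a
# small measured subset, and a ranking of untested combinations.
rng = np.random.default_rng(0)
levels = [0, 1, 2, 3, 4, 5]
designs = np.array(list(product(levels, repeat=5)))  # 6**5 = 7776 combinations

def measured_production(x):
    """Stand-in for a wet-lab measurement (unknown to the model)."""
    return 10 + 3 * x[1] - 0.5 * (x[3] - 2) ** 2 + rng.normal(0, 0.5)

# "Measure" a small training subset, analogous to the 250 tested pathways.
train_idx = rng.choice(len(designs), size=250, replace=False)
X_train = designs[train_idx]
y_train = np.array([measured_production(x) for x in X_train])

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Predict production for every untested combination and recommend the best ones.
untested = np.setdiff1d(np.arange(len(designs)), train_idx)
predictions = model.predict(designs[untested])
best = untested[np.argsort(predictions)[::-1][:5]]
print("Recommended designs:\n", designs[best])
```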

"This is a clear demonstration that bioengineering led by machine learning is feasible, and disruptive if scalable. We did it for five genes, but we believe it could be done for the full genome," said Garcia Martin, who is a member of the Agile BioFoundry and also the Director of the Quantitative Metabolic Modeling team at the Joint BioEnergy Institute (JBEI), a DOE Bioenergy Research Center; both supported a portion of this work. "This is just the beginning. With this, we've shown that there's an alternative way of doing metabolic engineering. Algorithms can automatically perform the routine parts of research while you devote your time to the more creative parts of the scientific endeavor: deciding on the important questions, designing the experiments, and consolidating the obtained knowledge."

More data needed

The researchers say they were surprised by how little data was needed to obtain results. Yet to truly realize synthetic biology's potential, they say the algorithms will need to be trained with much more data. Garcia Martin describes synthetic biology as being only in its infancy - the equivalent of where the Industrial Revolution was in the 1790s. "It's only by investing in automation and high-throughput technologies that you'll be able to leverage the data needed to really revolutionize bioengineering," he said.

Radivojevic added: "We provided the methodology and a demonstration on a small dataset; potential applications might be revolutionary given access to large amounts of data."

The unique capabilities of national labs

Besides the dearth of experimental data, Garcia Martin says the other limitation is human capital - or machine learning experts. Given the explosion of data in our world today, many fields and companies are competing for a limited number of experts in machine learning and artificial intelligence.

Garcia Martin notes that knowledge of biology is not an absolute prerequisite, if surrounded by the team environment provided by the national labs. Radivojevic, for example, has a doctorate in applied mathematics and no background in biology. "In two years here, she was able to productively collaborate with our multidisciplinary team of biologists, engineers, and computer scientists and make a difference in the synthetic biology field," he said. "In the traditional ways of doing metabolic engineering, she would have had to spend five or six years just learning the needed biological knowledge before even starting her own independent experiments."

"The national labs provide the environment where specialization and standardization can prosper and combine in the large multidisciplinary teams that are their hallmark," Garcia Martin said.

Synthetic biology has the potential to make significant impacts in almost every sector: food, medicine, agriculture, climate, energy, and materials. The global synthetic biology market is currently estimated at around $4 billion and has been forecast to grow to more than $20 billion by 2025, according to various market reports.

"If we could automate metabolic engineering, we could strive for more audacious goals. We could engineer microbiomes for therapeutic or bioremediation purposes. We could engineer microbiomes in our gut to produce drugs to treat autism, for example, or microbiomes in the environment that convert waste to biofuels," Garcia Martin said. "The combination of machine learning and CRISPR-based gene editing enables much more efficient convergence to desired specifications."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Tree rings show scale of Arctic pollution is worse than previously thought

image: Widescale pollution has caused devastating forest decline east of Norilsk, Russia.

Image: 
Dr Alexander Kirdyanov

The largest-ever study of tree rings from Norilsk in the Russian Arctic has shown that the direct and indirect effects of industrial pollution in the region and beyond are far worse than previously thought.

An international team of researchers, led by the University of Cambridge, has combined ring width and wood chemistry measurements from living and dead trees with soil characteristics and computer modelling to show that the damage done by decades of nickel and copper mining has not only devastated local environments, but also affected the global carbon cycle.

The extent of damage done to the boreal forest, the largest land biome on Earth, can be seen in the annual growth rings of trees near Norilsk, where die-off has spread up to 100 kilometres. The results are reported in the journal Ecology Letters.

Norilsk, in northern Siberia, is the world's northernmost city with more than 100,000 people, and one of the most polluted places on Earth. Since the 1930s, intensive mining of the area's massive nickel, copper and palladium deposits, combined with few environmental regulations, has led to severe pollution levels. A massive oil spill in May 2020 has added to the extreme level of environmental damage in the area.

The high level of airborne emissions from the Norilsk industrial complex is not only responsible for the direct destruction of around 24,000 square kilometres of boreal forest since the 1960s; surviving trees across much of the high-northern latitudes are suffering as well. The high pollution levels cause declining tree growth, which in turn affects the amount of carbon that can be sequestered in the boreal forest.

However, while the link between pollution and forest health is well known, it could not previously explain the 'divergence problem' in dendrochronology, or the study of tree rings: a decoupling of tree ring width from rising air temperatures seen since the 1970s.

Using the largest-ever dataset of tree rings from both living and dead trees to reconstruct the history and intensity of Norilsk's forest dieback, the researchers have shown how the amount of pollution spewed into the atmosphere by mines and smelters is at least partially responsible for the phenomenon of 'Arctic dimming', providing new evidence to explain the divergence problem.

"Using the information stored in thousands of tree rings, we can see the effects of Norilsk's uncontrolled environmental disaster over the past nine decades," said Professor Ulf Büntgen from Cambridge's Department of Geography, who led the research. "While the problem of sulphur emissions and forest dieback has been successfully addressed in much of Europe, for Siberia, we haven't been able to see what the impact has been, largely due to a lack of long-term monitoring data."

The expansion of annually-resolved and absolutely dated tree ring width measurements compiled by the paper's first author Alexander Kirdyanov, along with new high-resolution measurements of wood and soil chemistry, allowed the researchers to quantify the extent of Norilsk's devastating ecosystem damage, which peaked in the 1960s.

"We can see that the trees near Norilsk started to die off massively in the 1960s due to rising pollution levels," said Büntgen. "Since atmospheric pollution in the Arctic accumulates due to large-scale circulation patterns, we expanded our study far beyond the direct effects of Norilsk's industrial sector and found that trees across the high-northern latitudes are suffering as well."

The researchers used a process-based forward model of boreal tree growth, with and without surface irradiance forcing as a proxy for pollutants, to show that Arctic dimming since the 1970s has substantially reduced tree growth.
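
The with-and-without comparison that such a forward model enables can be illustrated with a deliberately simplified growth model, shown below: a ring-width index driven by summer temperature, scaled by an assumed irradiance (dimming) factor after 1970. All of the numbers are hypothetical placeholders; the study itself used a process-based boreal tree growth model, not this toy.

```python
import numpy as np

def ring_width(temperature_c, irradiance_factor):
    """Toy forward model: annual ring-width index as a saturating function of
    summer temperature, scaled by available surface irradiance (1.0 = no dimming).
    Illustrates only the with/without-dimming comparison."""
    thermal_response = np.clip((temperature_c - 5.0) / 10.0, 0.0, 1.0)
    return thermal_response * irradiance_factor

years = np.arange(1950, 2001)
summer_t = 10.0 + 0.03 * (years - 1950)           # assumed gentle warming trend
dimming = np.where(years >= 1970, 0.85, 1.0)      # assumed 15% dimming after 1970

with_dimming = ring_width(summer_t, dimming)
without_dimming = ring_width(summer_t, 1.0)
deficit = np.mean(without_dimming[years >= 1970] - with_dimming[years >= 1970])
print("Mean post-1970 growth deficit:", round(float(deficit), 3))
```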

Arctic dimming is a phenomenon caused by increased particulates in the Earth's atmosphere, whether from pollution, dust or volcanic eruptions. The phenomenon partially blocks out sunlight, slowing the process of evaporation and interfering with the hydrological cycle.

Global warming would be expected to increase the rate of boreal tree growth, but the researchers found that as pollution levels peaked, the rate of tree growth in northern Siberia slowed. They found that atmospheric pollution diminished the trees' ability to turn sunlight into energy through photosynthesis, so they were not able to grow as quickly or as strongly as they would in areas with lower pollution levels.

"What surprised us is just how widespread the effects of industrial pollution are - the scale of the damage shows just how vulnerable and sensitive the boreal forest is," said Büntgen. "Given the ecological importance of this biome, the pollution levels across the high-northern latitudes could have an enormous impact on the entire global carbon cycle."

Credit: 
University of Cambridge

Pair of massive baby stars swaddled in salty water vapor

image: Different colors show the different distributions of dust particles (yellow), methyl cyanide (CH3CN, red), salt (NaCl, green), and hot water vapor (H2O, blue). Bottom insets are close-up views of each component. Dust and methyl cyanide are distributed widely around the binary, whereas salt and water vapor are concentrated in the disk around each protostar. In the wide-field image, the jets from one of the protostars, seen as several dots in the above image, are shown in light blue.

Image: 
ALMA (ESO/NAOJ/NRAO), Tanaka et al.

Using the Atacama Large Millimeter/submillimeter Array (ALMA), astronomers spotted a pair of massive baby stars growing in salty cosmic soup. Each star is shrouded by a gaseous disk which includes molecules of sodium chloride, commonly known as table salt, and heated water vapor. Analyzing the radio emissions from the salt and water, the team found that the disks are counter-rotating. This is only the second detection of salt around massive young stars, suggesting that salt is an excellent marker for exploring the immediate surroundings of giant baby stars.

There are stars of many different masses in the Universe. Smaller ones have only one-tenth the mass of the Sun, while larger ones have 10 times the mass of the Sun or more. Regardless of mass, all stars are formed in cosmic clouds of gas and dust. Astronomers have eagerly studied the origins of stars; however, the process of massive star formation is still veiled. This is because the formation sites of massive stars are located farther from Earth, and because massive baby stars are surrounded by massive clouds with complicated structures. These two facts prevent astronomers from obtaining clear views of massive young stars and their formation sites.

A team of astronomers led by Kei Tanaka at the National Astronomical Observatory of Japan utilized ALMA's power to investigate the environment where massive stars are forming. They observed the massive young binary IRAS 16547-4247. The team detected radio emissions from a wide variety of molecules. In particular, sodium chloride (NaCl) and hot water (H2O) were found in the immediate vicinity of each star, i.e., in the circumstellar disk. On the other hand, other molecules such as methyl cyanide (CH3CN), which has commonly been observed in previous studies of massive young stars, were detected further out but do not trace structures in the vicinity of the stars well.

"Sodium chloride is familiar to us as table salt, but it is not a common molecule in the Universe," says Tanaka. "This was only the second detection of sodium chloride around massive young stars. The first example was around Orion KL Source I, but that is such a peculiar source that we were not sure whether salt is suitable to see gas disks around massive stars. Our results confirmed that salt is actually a good marker. Since baby stars gain mass through disks, it is important to study the motion and characteristics of disks to understand how the baby stars grow."

Further investigation of the disks offers an interesting hint about the origin of the pair. "We found a tentative sign that the disks are rotating in opposite directions," explains Yichen Zhang, a researcher at RIKEN. If the stars are born as twins in a large common gaseous disk, then naturally the disks rotate in the same direction. "The counter-rotation of the disks may indicate that these two stars are not actual twins, but a pair of strangers which were formed in separated clouds and paired up later." Massive stars almost always have some companions, and thus it is pivotal to investigate the origin of massive binary systems. The team expects that further observation and analysis will provide more dependable information on the secrets of their birth.

The presence of heated water vapor and sodium chloride, which were released by the destruction of dust particles, suggests the hot and dynamic nature of disks around massive baby stars. Interestingly, investigations of meteorites indicate that the proto-Solar System disk also experienced high temperatures at which dust particles evaporated. Astronomers will be able to trace these molecules released from dust particles well by using the next generation Very Large Array [1], which is currently being planned. The team anticipates that they can even obtain clues to understanding the origin of our Solar System by studying hot disks with sodium chloride and hot water vapor.

The baby stars IRAS 16547-4247 are located 9500 light-years away in the constellation Scorpius. The total mass of the stars is estimated to be 25 times the mass of the Sun, surrounded by a gigantic cloud with the mass of 10,000 Suns.

[1] The next generation Very Large Array (ngVLA) is a project to construct a large set of radio telescopes in the United States, led by the U. S. National Radio Astronomy Observatory. The ngVLA is expected to make significant contributions to various research topics, including planet formation, interstellar chemistry, galaxy evolution, pulsars, and multi-messenger astronomy.

Credit: 
National Institutes of Natural Sciences

PLUS takes 3D ultrasound images of solids

image: High-resolution 3D imaging result of branched stress corrosion cracking.

Image: 
Yoshikazu Ohara, Tohoku University

A new system, developed by Tohoku University researchers in Japan in collaboration with Los Alamos National Laboratory in the US, takes 3D images that can detect defects in metallic structures. The approach was published in the journal Applied Physics Letters and could enhance safety in power plants and airplanes.

Yoshikazu Ohara and colleagues at Tohoku University use non-destructive techniques to study structures, and wanted to find a way to produce 3D images of structural defects. They developed a new technology, called the piezoelectric and laser ultrasonic system (PLUS), that combines the strengths of two different devices to produce high-resolution 3D images of defects in metallic structures.

"We believe that PLUS will pave the way for accurate evaluation of material strength, the identification of defects, and finding out how defects initially started to form," says Ohara.

Currently available 'ultrasonic phased arrays' are a powerful tool for imaging internal defects in solids, but only in two dimensions. These devices are made of a piezoelectric one-dimensional array transducer with a limited number of individual elements--up to 128. Electrical pulses in the piezoelectric elements are converted to a mechanical vibration that emits ultrasonic waves into the material under investigation. Ultrasonic waves are reflected back from internal defects and converted into electric signals that can be translated into a 2D image.

In PLUS, the waves generated in a material from a piezoelectric transducer with a single element are received by a laser Doppler vibrometer, which moves around the material's surface to get a good 2D scan of the area. As a result of this process, it receives the scattered and reflected waves at a much larger number of 'points' than those that can be received by a piezoelectric array transducer. The information received by the laser Doppler vibrometer is transmitted by an oscilloscope to a computer, where it is processed by an imaging algorithm and converted into a 3D image.
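
The reconstruction step described above can be sketched as a synthetic-aperture, delay-and-sum calculation: for every image point inside the solid, sum the recorded waveforms at the delays given by the transmit and receive travel times. The function below is only a minimal illustration of that idea under the stated assumptions (single transmitter, known wave speed, straight-ray paths); it is not necessarily the exact imaging algorithm used in the published work.

```python
import numpy as np

def delay_and_sum_3d(waveforms, rx_points, tx_point, voxels, c, fs):
    """Minimal delay-and-sum reconstruction in the spirit of PLUS.

    waveforms : (n_rx, n_samples) signals recorded at the scanned surface points
    rx_points : (n_rx, 3) receive positions from the laser Doppler vibrometer scan
    tx_point  : (3,) position of the single-element transmitter
    voxels    : (n_vox, 3) image points inside the solid
    c         : wave speed in the material, fs : sampling rate

    For each voxel, the transmit and receive path lengths give a time of flight;
    the recorded amplitudes at those delays are summed over all receive points.
    """
    n_rx, n_samples = waveforms.shape
    image = np.zeros(len(voxels))
    for v, voxel in enumerate(voxels):
        d_tx = np.linalg.norm(voxel - tx_point)
        d_rx = np.linalg.norm(rx_points - voxel, axis=1)
        samples = np.rint((d_tx + d_rx) / c * fs).astype(int)
        valid = samples < n_samples
        image[v] = waveforms[np.arange(n_rx)[valid], samples[valid]].sum()
    return image
```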

"Ultrasonic phased arrays, which are on the cutting-edge of ultrasonic inspection, can only provide 2D images because of their limited number of elements," says Ohara. "PLUS makes it possible to have thousands of elements as a result of incorporating the 2D scan of a laser Doppler vibrometer in place of a piezoelectric array transducer."

Although tested only on defects in metallic materials, Ohara says their technology can be applied to other materials, including concrete and rock, simply by changing the phased array transmitter to one that emits a different range of ultrasound frequencies.

One drawback is the long data acquisition and processing time, which takes several hours. However, this can be shortened by adopting a high-speed analog-to-digital converter in place of the oscilloscope, using a more sensitive laser Doppler vibrometer, utilizing different imaging algorithms, and employing a graphical processing unit.

Credit: 
Tohoku University

A clearer view of what makes glass rigid

image: A team of scientists led by the University of Tokyo uses computer simulations to study the rigidity of amorphous solids like glass

Image: 
Institute of Industrial Science, the University of Tokyo

Tokyo, Japan - Researchers led by The University of Tokyo employed a new computer model to simulate the networks of force-carrying particles that give amorphous solids their strength even though they lack long range order. This work may lead to new advances in high-strength glass, which can be used for cooking, industrial, and smartphone applications.

Amorphous solids such as glass--despite being brittle and having constituent particles that do not form ordered lattices--can possess surprising strength and rigidity. This is even more unexpected because amorphous systems also suffer from large anharmonic fluctuations. The secret is an internal network of force-bearing particles that spans the entire solid and lends strength to the system. This branching, dynamic network acts like a skeleton that prevents the material from yielding to stress, even though it makes up only a small fraction of the total particles. However, this network only forms after a "percolation transition," when the number of force-bearing particles exceeds a critical threshold. As the density of these particles increases, the probability that a percolating network extends from one end of the system to the other increases from zero to near certainty.

Now, scientists from the Institute of Industrial Science at The University of Tokyo have used computer simulations to carefully show the formation of these percolating networks as an amorphous material is cooled below its glass transition temperature. In these calculations, binary particle mixtures were modelled with finite-range repulsive potentials. The team found that the strength of amorphous materials is an emergent property caused by the self-organization of the disordered mechanical architecture.

"At zero temperature, a jammed system will show long-range correlations in stress due to its internal percolating network. This simulation showed that the same is true for glass even before it has completely cooled," first author Hua Tong says.

The force-bearing backbone can be identified by recognizing that particles in this network must be connected by at least two strong force bonds. Upon cooling, the number of force-bearing particles increases until a system-spanning network links together.
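
The two ingredients in that description - a backbone of particles holding at least two strong force bonds, and a test of whether that backbone connects opposite sides of the system - map naturally onto a union-find percolation check. The sketch below assumes a simple force threshold and plain particle/bond lists; it illustrates the percolation criterion, not the analysis code used in the study.

```python
from collections import defaultdict

def find(parent, x):
    # Path-compressed union-find lookup.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def percolates(bonds, forces, left_wall, right_wall, threshold):
    """Check whether the force-bearing backbone spans the system.

    bonds     : list of (i, j) particle pairs
    forces    : list of bond forces, same order as bonds
    left_wall, right_wall : sets of particle ids touching opposite boundaries
    threshold : minimum force for a bond to count as "strong"

    Only particles with at least two strong bonds belong to the backbone; the
    system percolates if a connected backbone cluster links the two walls.
    """
    strong = [(i, j) for (i, j), f in zip(bonds, forces) if f >= threshold]
    degree = defaultdict(int)
    for i, j in strong:
        degree[i] += 1
        degree[j] += 1
    backbone = {p for p, d in degree.items() if d >= 2}

    parent = {p: p for p in backbone}
    for i, j in strong:
        if i in backbone and j in backbone:
            parent[find(parent, i)] = find(parent, j)

    left_roots = {find(parent, p) for p in left_wall if p in backbone}
    right_roots = {find(parent, p) for p in right_wall if p in backbone}
    return bool(left_roots & right_roots)
```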

"Our findings may open up a way towards a better understanding of amorphous solids from a mechanical perspective," senior author Hajime Tanaka says. Since rigid, durable glass is highly prized for smartphones, tablets, and cookware, the work can find many practical uses.

Credit: 
Institute of Industrial Science, The University of Tokyo

An area of the brain where tumor cells shelter from chemotherapy in childhood leukaemia

The stroma of the choroid plexus is one of the locations within the central nervous system (CNS) that serves as a shelter for tumour cells, allowing them to elude chemotherapy and potentially cause subsequent relapses in childhood acute lymphoblastic leukaemia, according to research led by the Complutense University of Madrid (UCM).

The choroid plexus is a structure located in the ventricles of the brain and is responsible for the production of cerebrospinal fluid. Although leukaemia cells are primarily located in the bone marrow, they are also capable of spreading to other areas of the body and show a particular propensity to infiltrate the CNS.

"The fact that relapses continued to occur in the CNS despite prophylactic treatment led us to suspect that some cells might remain hidden in small, practically undetectable groups elsewhere, and could be responsible for subsequent relapses", explained Ángeles Vicente, a researcher in the Department of Cell Biology at the UCM School of Medicine.

Besides identifying the hiding place, the study published in the Journal of Pathology also reveals how leukaemia cells elude chemotherapy: by interacting with stroma cells in the choroid plexus, modifying the microenvironment to ensure their own survival.

The Niño Jesús Hospital in Madrid and the Autonomous University of Chihuahua (Mexico) were also involved in the research.

A breakthrough for more effective treatment

To carry out the study, the researchers infused leukaemic cells from patients into immunodeficient mice and then used immunofluorescence and electron microscopy to determine the cerebral location of metastatic tumour cells in the CNS, successfully identifying the site in the choroid plexus.

This in vivo technique was combined with in vitro assays to study the cell interactions that take place between leukaemia cells and choroid plexus stroma cells and their effects on chemoresistance.

About 15-20% of paediatric patients with acute lymphoblastic leukaemia are not cured, and relapses in the CNS are the main cause of morbidity and mortality from the disease in the paediatric population.

"Studies like ours could be essential to design more effective therapeutic strategies in the future aimed at preventing tumour cells from colonising niches in the CNS or eliminating the cells that have already established themselves in these sites. This would represent a breakthrough in treatment of the disease, reducing relapses and further increasing the chances of a cure", predicted Lidia Martínez Fernández de Sevilla, a postdoctoral researcher in the Department of Cell Biology and the first named author of the study.

The research focused on childhood acute lymphoblastic leukaemia because the group only works with paediatric samples, but experts believe it is feasible that leukaemia cells might use the same hiding places in adults --where 5% of relapses are related to the CNS-- as they do in children.

Credit: 
Universidad Complutense de Madrid

The surprising organization of avian brains

image: Similar to the cortex of mammals, the nerve cells in certain areas of the brain of birds are organized in vertical layers and horizontal columns.

Image: 
RUB, Biopsychology.

These findings refute 150-year-old assumptions. The team published its findings in the journal Science on 25 September 2020.

The largest brains

Birds and mammals have the largest brains relative to their body size. Apart from that, however, they have little in common - or so scientific opinion has held since the 19th century: mammalian brains have a neocortex, i.e. a cerebral cortex made up of six layers and arranged in columns perpendicular to these layers. Avian brains, on the other hand, look like clumps of grey cells.

"Considering the astonishing cognitive performance that birds can achieve, it seemed reasonable to suspect that their brains are more organised than expected," says Professor Onur Güntürkün, Head of the Biopsychology Research Unit at the RUB Faculty of Psychology. He and his former doctoral students Dr. Martin Stacho and Dr. Christina Herold proved this in several experiments.

Perfected technology facilitates new insights

In the first step, the researchers deployed a new method perfected by the Düsseldorf and Jülich teams: so-called 3D polarized light imaging, or 3D PLI for short, is capable of displaying the orientation of individual nerve fibres. To the researchers' surprise, an analysis of the brains of various birds revealed an organisation that is similar to that in the mammalian brain: here too, the fibres are arranged horizontally and vertically in the same way as in the neocortex.

In further experiments, the researchers used tiny crystals, which are absorbed by nerve cells in brain slices and transported to their smallest dendrites, to examine the interconnection of cells in the bird brain in detail. "Here, too, the structure was shown to consist of columns, in which signals are transmitted from top to bottom and vice versa, and long horizontal fibres," explains Onur Güntürkün. However, this structure is only found in the sensory areas of the avian brain. Other areas, such as associative areas, are organised in a different way.

Amazing cognitive performance

Some birds are capable of astonishing cognitive feats that rival those of highly developed mammals such as primates. For example, ravens recognise themselves in the mirror and plan for the future. They are also able to put themselves in the position of others, recognise causalities and draw conclusions. Pigeons can learn English spelling up to the level of six-year-old children.

Credit: 
Ruhr-University Bochum

Reusing tableware can reduce waste from online food deliveries

image: This is a typical Chinese meal that was ordered online and includes plastic tableware.

Image: 
Zhou et al

Lifestyles in China are changing rapidly, and ordering food online is one example. However, those billions of delivered meals produce an enormous amount of plastic waste, not only from packaging but also from food containers and cutlery; in one year, some 7.3 billion sets of single-use tableware accompany the food. Around one-third of the 553 kilotons of municipal solid waste generated each day comes from packaging. That is why a group of scientists analysed whether using paper alternatives or reusable tableware could reduce plastic waste and the associated life cycle emissions.

Alternatives

Ya Zhou (associate professor at Guangdong University of Technology) and Yuli Shan are the first authors of this paper. Yuli Shan, Dabo Guan (Professor at Tsinghua University) and Yanpeng Cai (Professor at Guangdong University of Technology) are the corresponding authors.

'We quantified the environmental impact and modelled different alternatives,' explains Shan. The alternatives to single-use plastic tableware were single-use paper tableware and reusable silicone tableware, cleaned either by the restaurants that cook the food or in a central cleaning facility.

Paper substitution

Paper may sound like a good alternative since it can be degraded, but single-use polyethylene-coated paper containers and bags actually increased emissions and total waste volume. 'For those areas without paper waste collection and recycling systems, paper substitution is not the optimal option for addressing the takeaway packaging waste dilemma,' says Zhou. Reusable silicone tableware reduced plastic waste by up to 92 per cent, and cut environmental emissions (carbon, sulfur and nitrogen dioxides, small particulate matter, dioxins, and chemical oxygen demand) and water consumption by more than two-thirds.
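
A quick back-of-the-envelope calculation shows what that 92 per cent reduction could mean at the scale of 7.3 billion single-use sets per year; the per-set plastic mass used below is a purely hypothetical figure chosen only to make the units concrete.

```python
# Back-of-the-envelope comparison using the figures quoted in the article.
# The per-set plastic mass is an assumed, illustrative value only.
single_use_sets_per_year = 7.3e9
assumed_grams_per_set = 50.0          # hypothetical mass of one single-use set
waste_reduction = 0.92                # "up to 92 per cent" with reusable silicone sets

baseline_tonnes = single_use_sets_per_year * assumed_grams_per_set / 1e6
reusable_tonnes = baseline_tonnes * (1 - waste_reduction)

print(f"Baseline plastic waste: ~{baseline_tonnes:,.0f} tonnes/year")
print(f"With reusable tableware: ~{reusable_tonnes:,.0f} tonnes/year")
```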

Getting an environmentally friendly and, at the same time, safe system of reusable tableware up and running requires some investments. 'A central cleaning facility would be best, also for health inspections to ensure safety, but this requires a system for collecting the used sets.' That is not easy but also not impossible: 'However, it does need a government effort to realize this.'

Zero waste

The Chinese government is taking steps to drastically reduce waste, as the Nature Food paper explains. A number of initiatives have sought new solutions for municipal solid waste management and plastic reduction, including a sorting implementation plan, a 'zero-waste city' pilot programme that started last year and a nationwide single-use plastic ban as from January 2020.

Would it not be easier to deliver the food without tableware? It may be possible in some cases, but not for all takeaway orders, according to Shan: 'Most meals are not eaten at home but in the classroom, during lunch breaks or at the office, when employees work late.' Zhou added that 'reusing tableware provides a potential solution to reduce waste and emissions from takeaway meals and a new strategy for promoting sustainable and "zero-waste" lifestyles'.

Credit: 
University of Groningen