Earth

Invasion by non-native insects expected to increase 36 percent worldwide by 2050

image: New research by an international team of scientists suggests that worldwide, invasion by non-native insects will increase 36 percent by 2050. Photograph shows an emerald ash borer, a non-native insect that has proven highly destructive in U.S. forests.

Image: 
Leah Bauer, USDA Forest Service

Research by an international team of scientists found that the steady, centuries-long increase in insect invasions globally is likely to continue. Using a new modeling approach that simulates non-native insect species numbers on each continent for different taxonomic groups, based on observed long-term historical trends, the scientists project that biological invasions will increase by 36 percent between 2005 and 2050.

Modeling suggests that Europe is likely to experience the strongest biological invasions, followed by Asia, North America and South America.

The study delivers a first baseline for assessing future developments of biological invasions, information that will support decision-making related to containing the spread of alien species. Andrew Liebhold, a USDA Forest Service research entomologist based in West Virginia and the study's only North American co-author, describes the research as an important tool for shifting from a reactive stance to a proactive stance in defending against biological invasion.

"For centuries, the element of surprise has worked in favor of invasive pests," Liebhold said. "Because we were not anticipating these insects, and not monitoring for them, many have been well established and causing damage by the time we did find them. This research gives nations the opportunity to play offense instead of defense by identifying where invasions are likely and what species are most likely to invade."

Credit: 
USDA Forest Service - Northern Research Station

Timing the life of antimatter particles may lead to better cancer treatment

image: Researchers at the University of Tokyo and National Institute of Radiological Sciences have designed a way to detect the absolute oxygen concentration in patients' bodies, which may lead to more effective cancer treatment. The results are published in Communications Physics.

Image: 
Taiga Yamaya, CC-BY

Experts in Japan have devised a simple way to glean more detailed information out of standard medical imaging scans. A research team made up of atomic physicists and nuclear medicine experts at the University of Tokyo and the National Institute of Radiological Sciences (NIRS) has designed a timer that can enable positron emission tomography (PET) scanners to detect the oxygen concentration of tissues throughout patients' bodies. This upgrade to PET scanners may lead to a future of better cancer treatment by quickly identifying parts of tumors with more aggressive cell growth.

"Patients' experience in this future PET scan will be the same as now. Medical teams' experience of conducting the scan will also be the same, just with more useful information at the end," said nuclear medicine physician Dr. Miwako Takahashi from the NIRS, a co-author of the research publication in Communication Physics.

"This was a quick project for us, and I think it should also become a very fast medical advance for real patients within the next decade. Medical device companies can apply this method very economically, I hope," said Assistant Professor Kengo Shibuya from the University of Tokyo Graduate School of Arts and Sciences, first author of the publication.

PET scans

The positrons that PET scans are named for are the positively charged antimatter counterparts of electrons. Because of their tiny size and extremely low mass, positrons pose no danger in medical applications. When a positron meets an electron, the pair annihilates and produces gamma rays, which are electromagnetic waves similar to X-rays but with shorter wavelengths.

Before a PET scan, a patient receives a small amount of very weakly radioactive liquid, often composed of modified sugar molecules, usually injected into the blood. The liquid circulates for a short period of time. Differences in blood flow or metabolism affect how the radioactivity is distributed. The patient then lies in a large, tube-shaped PET scanner. As the positrons released by the radioactive liquid annihilate and produce gamma rays, rings of gamma-ray detectors map the locations of gamma rays emitted from the patient's body.

Doctors already request PET scans when they need information about not just the structure, but also the metabolic function of tissues inside the body. Detecting oxygen concentration using the same PET scan would add another layer of useful information about the body's function.

Oxygen concentration measured in nanoseconds

The life of a positron follows one of two very short paths, both of which begin when the positron is "born" as it is released from the radioactive PET scan liquid. On the shorter path, the positron immediately collides with an electron and produces gamma rays. On the slightly longer path, the positron first binds with an electron to form a short-lived system called positronium, which then decays into gamma rays. Either way, the lifetime of a positron inside a human body is no longer than 20 nanoseconds, or one fifty-millionth of a second (1/50,000,000 second).

"The outcome is the same, but the lifetime is not. Our proposal is to distinguish the lifetimes of positrons using a PET scan with a timer so that we can map oxygen concentrations inside patients' bodies," said Shibuya.

Shibuya and his colleagues developed a life expectancy chart for positrons using a miniaturized PET scanner to time the formation and decay of positrons in liquids with known concentrations of oxygen.

The research team's new results reveal that when oxygen concentration is high, the shorter path is more likely. Researchers predict that their technique will be able to detect the absolute oxygen concentration in any tissue of a patient's body based on the lifetime of positrons during a PET scan.
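
As a rough illustration of how such a "life expectancy chart" could be used, the sketch below interpolates an oxygen concentration from a measured mean positron lifetime. The calibration numbers, oxygen units and function name are invented for this example; they are not values from the study.

```python
import numpy as np

# Hypothetical calibration table (the "life expectancy chart"): mean positron
# lifetime measured in reference liquids with known dissolved-oxygen levels.
# The study reports that higher oxygen makes the short, direct-annihilation
# path more likely, so the mean lifetime falls as oxygen rises. All numbers
# here are illustrative placeholders, not data from the paper.
calib_lifetime_ns = np.array([1.8, 2.1, 2.5, 3.0, 3.6])   # increasing lifetime
calib_oxygen = np.array([40.0, 25.0, 15.0, 8.0, 2.0])      # corresponding O2 level

def oxygen_from_lifetime(mean_lifetime_ns):
    """Estimate oxygen concentration by linear interpolation over the table."""
    return float(np.interp(mean_lifetime_ns, calib_lifetime_ns, calib_oxygen))

# Example: a tissue voxel whose positron lifetimes average 2.3 nanoseconds
print(oxygen_from_lifetime(2.3))  # falls between the 25 and 15 calibration points
```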

Detecting the lifetime of positrons is possible using the same gamma-ray detectors that PET scans already use. The research team predicts that the majority of work to transfer this research from the lab to the bedside will be on upgrading gamma-ray detectors and software so that the gamma-ray detectors can record not just location, but accurate time data as well.

"It should not be much of a cost increase for development of instruments," said Professor Taiga Yamaya, a co-author of the research publication and leader of the Imaging Physics Group at the NIRS.

Enhanced PET scans for more effective cancer treatment

Medical experts have long understood that low oxygen concentrations in tumors can impede cancer treatment for two reasons: First, a low oxygen level in a tumor is often caused by insufficient blood flow, which is more common in fast-growing, aggressive tumors that are harder to treat. Second, low oxygen levels make radiation less effective because the desired cancer cell-killing effects of radiation treatment are achieved in part by the radiation energy converting oxygen present in the cells into DNA-damaging free radicals.

Thus, detecting the concentration of oxygen in body tissues would inform medical experts how to more effectively attack tumors inside patients.

"We imagine targeting more intense radiation treatment to the aggressive, low-oxygen concentration areas of a tumor and targeting lower-intensity treatment to other areas of the same tumor to give patients better outcomes and less side effects," said Takahashi.

Shibuya says that the team of researchers was inspired to put into practice a theoretical model, published last year by researchers in Poland, about the ability of positrons to reveal oxygen concentration. The project went from concept to publication in just a few months even with COVID-19 pandemic-related restrictions.

Shibuya and colleagues are now aiming to expand their work to find any other medical details that may be revealed by the lifetime of a positron.

Credit: 
University of Tokyo

Why drugs sometimes cause receptor potentiation rather than inhibition

image: Using a dedicated application technique and electrophysiological measurements, the researchers rapidly activated the glutamate receptors. The picture shows a setup for patch-clamp electrophysiology.

Image: 
RUB, Marquard

Professor Andreas Reiner and Stefan Pollok from the junior research group Cellular Neurobiology at Ruhr-Universität Bochum (RUB) report on this unexpected finding, namely that supposedly inhibitory drugs can potentiate certain glutamate receptors, and on the underlying mechanisms in the journal PNAS of 30 September 2020.

Wanted: more precise drugs

Glutamate is the messenger substance that the brain uses to pass on excitatory signals. Receptors for this neurotransmitter are a promising target for drug development, as they are involved in many pathological processes. For example, they play a role in epilepsy, mental disorders, strokes or brain tumours. "In these cases, it may be beneficial to reduce the activity of glutamate receptors," explains Andreas Reiner. For this purpose, so-called antagonists have been developed, i.e. drugs that inhibit the activation of glutamate receptors. However, many of these antagonists inhibit all glutamate receptor subtypes, thus producing undesired adverse effects. To circumvent this problem, researchers are currently looking for drugs that only bind to certain receptor subtypes.

Measuring the effects of antagonists directly

In their current study, the researchers analysed the effects of such antagonists on selected receptor subtypes in more detail. For this purpose, they used cultured cells containing only individual subtypes or specific receptor combinations. Using a dedicated application technique and electrophysiological measurements, the researchers rapidly activated the glutamate receptors, similar to their activation at synapses in the brain, and measured the influence of the antagonists.

Potentiation instead of inhibition

"We made a surprising observation in the process," says Stefan Pollok. "For certain receptor combinations, we did indeed see a reduction in activation, as expected, but, at the same time, the natural inactivation process was reduced or even completely abolished." The result was a longer-lasting and overall stronger response than without the antagonist. Instead of the desired inhibition, the researchers observed a potentiating effect.

In subsequent experiments, the team identified the molecular mechanisms of this behaviour more precisely: the potentiating effect is observed when an antagonist binds to a receptor that consists of different subunits and acts on only some of those subunits. "Such so-called heteromeric receptors are, however, of great importance for signal transduction in the central nervous system," says Andreas Reiner. The findings are therefore significant for neuroscientists, who are increasingly using selective antagonists to decipher the function of the various receptor subtypes. On the other hand, the study might also have an impact on the development of new therapeutics. "We've gained new insights into how this fascinating class of receptors works," concludes Andreas Reiner. In the future, he also wants to investigate the effects of other glutamate receptor drugs.

Credit: 
Ruhr-University Bochum

Carb-eating bacteria under viral threat

image: Reconstructed microscopy image of a bacteriophage, which is a virus that attacks bacteria.

Image: 
(Purdue University and Seyet LLC)

Strictly speaking, humans cannot digest complex carbohydrates -- that's the job of bacteria in our large intestines. UC Riverside scientists have just discovered a new group of viruses that attack these bacteria.

The viruses, and the way they evade counterattack by their bacterial hosts, are described in a new Cell Reports paper.

Bacteroides can constitute up to 60% of all the bacteria living in a human's large intestine, and they're an important way that people get energy. Without them, we'd have a hard time digesting bread, beans, vegetables, or other favorite foods. Given their significance, it is surprising that scientists know so little about viruses that prey on Bacteroides.

"This is largely unexplored territory," said microbiologist Patrick Degnan, an assistant professor of microbiology and plant pathology, who led the research.

To find a virus that attacks Bacteroides, Degnan and his team analyzed a collection of bacterial genomes, where viruses can hide for numerous generations until something triggers them to replicate, attack and leave their host. This viral lifestyle is not without risk as over time mutations could occur that prevent the virus from escaping its host.

On analyzing the genome of Bacteroides vulgatus, Degnan's team found DNA belonging to a virus they named BV01. However, determining whether the virus is capable of escaping, or re-infecting its host, proved challenging.

"We tried every trick we could think of. Nothing in the laboratory worked until we worked with a germ-free mouse model," Degnan said. "Then, the virus jumped."

This was possible due to Degnan's collaboration with UCR colleague, co-author and fellow microbiologist Ansel Hsiao.

This result suggests conditions in mammalian guts act as a trigger for BV01 activity. The finding underscores the importance of both in vitro and in vivo experiments for understanding the biology of microbes.

Looking for more information about the indirect effects this bacterial virus might have on humans, Degnan's team determined that when BV01 infects a host cell, it disrupts how that cell normally behaves.

"Over 100 genes change how they get expressed after infection," Degnan said.

Two of the altered genes that stood out to the researchers are both responsible for deactivating bile acids, which are toxic to microbes. The authors speculate that while this possibly alters the sensitivity of the bacteria to bile acids, it also may influence the ability of the bacteria to be infected by other viruses.

"This virus can go in and change the metabolism of these bacteria in human guts that are so key for our own metabolism," Degnan said.

Though the full extent of BV01 infection is not yet known, scientists believe viruses that change the abundance and activity of gut bacteria contribute to human health and disease. One area for future studies will involve the effect of diet on BV01 and viruses like it, as certain foods can cause our bodies to release more bile.

Degnan also notes that BV01 is only one of a group of viruses his team identified that function in similar ways. The group, Salyersviridae, is named after famed microbiologist Abigail Salyers, whose work on intestinal bacteria furthered the science of antibiotic resistance.

Further research is planned to understand the biology of these viruses.

"It's been sitting in plain sight, but no one has characterized this important group of viruses that affect what's in our guts until now," Degnan said.

Credit: 
University of California - Riverside

Carbon-carbon covalent bonds far more flexible than presumed

image: The molecules synthesized and analysed by the research group. In green and blue is the flexible C-C single bond (Takuya Shimajiri, Takanori Suzuki, Yusuke Ishigaki, Angewandte Chemie International Edition, September 30, 2020).

Image: 
Takuya Shimajiri, Takanori Suzuki, Yusuke Ishigaki, Angewandte Chemie International Edition, September 30, 2020

A Hokkaido University research group has successfully demonstrated that carbon-carbon (C-C) covalent bonds expand and contract flexibly in response to light and heat. This unexpected flexibility of C-C bonds could confer new properties to organic compounds.

Rigid and robust, C-C covalent bonds are the most basic structure in organic and biological compounds. Understanding their nature is essential to improving our knowledge of chemical phenomena.

Usually, the C-C bond length is almost constant. The researchers, however, conducted this study on the premise that extremely elongated C-C bonds are weak, and so can expand or contract in response to external stimuli. The group designed and synthesized compounds that cyclize to form cage-like structures when exposed to light. They investigated how the structural transformation influences the length of C-C bonds at the compounds' cores.

The researchers found that the C-C single bonds at the core contract flexibly during photocyclization. They also found that the cyclization can be reversed by heating, and the C-C bonds expand as the compounds return to the original state. Using single crystals of the compounds as analogs made it possible for the researchers to directly observe their flexibility and easily elucidate their structure in detail.

This is the first time the process of expansion and contraction of C-C bonds has been directly observed. The scientists concluded that this is a new phenomenon, in which C-C bonds obtained flexibility when they were elongated to the limit, decreasing the bonding energy. Furthermore, they showed that the oxidation potential of the compound changed by more than 1 volt due to the reversible expansion and contraction of the extremely elongated C-C bond, suggesting a new property related to the bond's flexibility.

The researchers say that synthesizing compounds with even longer bonds may lead to more functions through unique responses or major changes in their properties. This challenging research, aimed at breaking the record for the length of the C-C single bond, plays a role in developing materials that can be activated/deactivated by a novel response mode.

The researchers are Takuya Shimajiri, Professor Takanori Suzuki and Assistant Professor Yusuke Ishigaki of Hokkaido University's Department of Chemistry. The results of their study were published in Angewandte Chemie International Edition on September 30, 2020. This work follows their study in 2018, in which the group synthesized an organic compound with a record C-C bond length of more than 0.18 nanometers, compared to the standard 0.154 nanometers.

Credit: 
Hokkaido University

Why writing by hand makes kids smarter

image: Typing, clicking and watching occupy an increasing number of hours in the average child's day. But brain research shows that writing by hand helps people remember better and learn more. The photo shows an EEG Geodesic Sensor Net with 256 evenly distributed sensors that was used to record EEG activity from the participant's scalp during the research.

Image: 
NTNU/Microsoft

Professor Audrey van der Meer at NTNU believes that national guidelines should be put into place to ensure that children receive at least a minimum of handwriting training.

Results from several studies have shown that both children and adults learn more and remember better when writing by hand.

Now another study confirms the same: choosing handwriting over keyboard use yields the best learning and memory.

Van der Meer and her colleagues have investigated this several times, first in 2017 and now in 2020.

In 2017, she examined the brain activity of 20 students. She has now published a study in which she examined brain activity in twelve young adults and twelve children.

This is the first time that children have participated in such a study.

Both studies were conducted using an EEG to track and record brain wave activity. The participants wore a hood with over 250 electrodes attached.

The brain produces electrical impulses when it is active. The electrodes are very sensitive and pick up the electrical activity that takes place in the brain.

Handwriting gives the brain more hooks to hang memories on

Each examination took 45 minutes per person, and the researchers received 500 data points per second.
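
For a sense of the data volume these figures imply, here is a quick back-of-the-envelope calculation. It assumes the 256-sensor net described in the image caption and that the 500 data points per second apply to each channel; the article does not spell out the latter.

```python
# Rough data-volume estimate for one 45-minute EEG session (assumptions above).
samples_per_second = 500
duration_seconds = 45 * 60
channels = 256

samples_per_channel = samples_per_second * duration_seconds  # 1,350,000
total_samples = samples_per_channel * channels                # 345,600,000
print(f"{samples_per_channel:,} samples per channel, {total_samples:,} in total")
```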

The results showed that the brain in both young adults and children is much more active when writing by hand than when typing on a keyboard.

"The use of pen and paper gives the brain more 'hooks' to hang your memories on. Writing by hand creates much more activity in the sensorimotor parts of the brain. A lot of senses are activated by pressing the pen on paper, seeing the letters you write and hearing the sound you make while writing. These sense experiences create contact between different parts of the brain and open the brain up for learning. We both learn better and remember better," says Van der Meer.

She believes that her own and others' studies emphasize the importance of children being challenged to draw and write at an early age, especially at school.

Today's digital reality is that typing, tapping and screen time are a big part of children's and adolescents' everyday lives.

A survey of 19 countries in the EU shows that Norwegian children and teens spend the most time online. The smartphone is a constant companion, followed closely by the PC and tablet.

https://www.lse.ac.uk/media-and-communications/assets/documents/research/eu-kids-online/reports/EU-Kids-Online-2020-10Feb2020.pdf

The survey shows that Norwegian children ages 9 to 16 spend almost four hours online every day, double the amount since 2010.

Kids' leisure time spent in front of a screen is now amplified by schools' increasing emphasis on digital learning.

Van der Meer thinks digital learning has many positive aspects, but urges handwriting training.

"Given the development of the last several years, we risk having one or more generations lose the ability to write by hand. Our research and that of others show that this would be a very unfortunate consequence" of increased digital activity, says Meer.

She believes that national guidelines should be put in place that ensure children receive at least a minimum of handwriting training.

"Some schools in Norway have become completely digital and skip handwriting training altogether. Finnish schools are even more digitized than in Norway. Very few schools offer any handwriting training at all," says Van der Meer.

In the debate about handwriting or keyboard use in school, some teachers believe that keyboards create less frustration for children. They point out that children can write longer texts earlier, and are more motivated to write because they experience greater mastery with a keyboard.

"Learning to write by hand is a bit slower process, but it's important for children to go through the tiring phase of learning to write by hand. The intricate hand movements and the shaping of letters are beneficial in several ways. If you use a keyboard, you use the same movement for each letter. Writing by hand requires control of your fine motor skills and senses. It's important to put the brain in a learning state as often as possible. I would use a keyboard to write an essay, but I'd take notes by hand during a lecture," says Van der Meer.

"The brain has evolved over thousands of years. It has evolved to be able to take action and navigate appropriate behaviour. In order for the brain to develop in the best possible way, we need to use it for what it's best at. We need to live an authentic life. We have to use all our senses, be outside, experience all kinds of weather and meet other people. If we don't challenge our brain, it can't reach its full potential. And that can impact school performance," says Van der Meer.

Credit: 
Norwegian University of Science and Technology

Internet gaming youth not more prone to psychiatric disorders

Many of our children play a lot of computer games. Some youth play so much and develop such big problems that a new diagnosis called Internet Gaming Disorder (IGD) has been proposed.

Symptoms of a gaming disorder include that it has an impact on school, work or friendships, that we continue to play even though we know it creates problems, that we are unable to stop or reduce the activity, that we lose interest in other activities, and that we lie about how much we play.

Previous findings show that excessive screen use among young children can lead to them becoming less able to recognize emotions. But some children also experience valuable mastery through gaming, and many find friendship and other social togetherness.

A research group at the Norwegian University of Science and Technology (NTNU) has looked at possible connections between children with symptoms of IGD and mental health problems. The results may reassure parents who might have let their children play more digital games during a period marked by the coronavirus pandemic and working from home.

"We've found no connection between IGD and psychiatric problems, other than that 10- and 12-year-olds who had more symptoms of gaming addiction developed fewer symptoms of anxiety two years later, when they were 12 and 14 years old," says Beate Wold Hygen.

She is a postdoctoral fellow at NTNU's Department of Psychology and the first author of a new article in the Journal of Child Psychology and Psychiatry.

Yes, children actually develop fewer symptoms of anxiety, not more.

This finding could be related to the social aspects of gaming, or to gaming acting as a distraction that causes the children to ruminate less than others.

No relationship was found in the opposite direction.

"We looked at anxiety, depression, ADHD and oppositional defiant disorder. But children who had more symptoms of these mental disorders were not more susceptible to gaming addiction," says Hygen.

"When psychiatric difficulties and IGD occur at the same time, which they do, they must be explained by other shared underlying factors," says Professor Lars Wichstrøm, who is a co-author of the work and leads the research project on which the study is based.

The researchers are not sure exactly what factors come into play, but genes that affect both the tendency to become addicted, including to internet gaming, and having other mental health problems may play a role.

The figures are based on interviews with 702 children from the Trondheim Early Secure Study. These are children who have been followed up with questionnaires, tests, in-depth interviews and observation every other year since they were four years old. Today they are 16-17 years old.

Credit: 
Norwegian University of Science and Technology

From San Diego to Italy, study suggests wisdom can protect against loneliness

Over the last few decades, there has been growing concern about loneliness across all ages, particularly in middle-aged and older adults. Loneliness, defined as feeling isolated or not having an adequate number of meaningful personal connections, is consistently associated with unhealthy aging and has been identified as a major risk factor for overall adverse health outcomes.

In a recent cross-cultural study, researchers at University of California San Diego School of Medicine and University of Rome La Sapienza examined middle-aged and older adults in San Diego and Cilento, Italy and found loneliness and wisdom had a strong negative correlation.

The study, published in the October 1, 2020 online edition of Aging and Mental Health, suggests wisdom may be a protective factor against loneliness.

"An important finding from our study was a significant inverse correlation between loneliness and wisdom. People with higher scores on a measure of wisdom were less lonely and vice versa," said Dilip V. Jeste, MD, lead investigator of the study, senior associate dean for the Center of Healthy Aging and Distinguished Professor of Psychiatry and Neurosciences at UC San Diego School of Medicine.

"Loneliness was consistently associated with poor general health, worse quality of sleep and less happiness, whereas the reverse was generally true for wisdom."

Using the UCLA Loneliness Scale and San Diego Wisdom Scale, the researchers examined four groups: adults age 50 to 65 and those older than age 90 from Cilento and from San Diego. The researchers found the inverse correlation between loneliness and wisdom in all four groups.
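
The analysis behind that statement is essentially a within-group correlation. The sketch below shows the idea with randomly generated stand-in scores; the scores, group sizes and the negative slope built into them are fabricated for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def group_correlation(n=100):
    # Stand-in wisdom and loneliness scores built to be inversely related
    wisdom = rng.normal(50, 10, n)
    loneliness = 80 - 0.5 * wisdom + rng.normal(0, 8, n)
    return np.corrcoef(wisdom, loneliness)[0, 1]  # Pearson r

for group in ["San Diego 50-65", "San Diego 90+", "Cilento 50-65", "Cilento 90+"]:
    print(group, round(group_correlation(), 2))  # negative r in every group
```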

"We translated the rating scales for loneliness and wisdom from English to Italian. It is remarkable that the findings related to these two traits were largely similar in two markedly different cultures -- a rural region of southern Italy and an urban/suburban county in the United States, both with different native languages and unique historical, educational and socioeconomic backgrounds," said Salvatore Di Somma, MD, PhD, lead Italian investigator and professor of emergency medicine at University of Rome La Sapienza.

The Cilento region in southwestern Italy is a relatively isolated, rural area believed to have a high concentration of individuals older than age 90. The present study was born out of the Cilento Initiative on Aging Outcomes (CIAO) study launched in 2016.

"Both loneliness and wisdom are personality traits. Most personality traits are partially inherited and partially determined by environment," said Jeste.

Wisdom has several components, such as empathy, compassion, self-reflection and emotional regulation. Researchers found that empathy and compassion had the strongest inverse correlation with loneliness. People who were more compassionate were less lonely.

"If we can increase someone's compassion, wisdom is likely to go up and loneliness is likely to go down," said David Brenner, MD, vice chancellor of UC San Diego Health Sciences. "At UC San Diego, we have considerable interest in enhancing empathy and compassion to reduce levels of stress and improve happiness and well-being."

Jeste said studies that examine how to decrease loneliness as people age will be critical for effective interventions and the future of health care.

"Routine assessment of loneliness with evidence-based, compassion-focused interventions for prevention and management of loneliness should become an integral part of clinical practice. So how do you increase compassion? Utilizing approaches like cognitive behavioral therapy or writing in a gratitude diary can help someone become more compassionate," he said.

Jeste noted that a limitation of this study was that it was cross-sectional. Only longitudinal studies can establish cause-and-effect relationships. Next steps will include testing an intervention to increase compassion for reducing loneliness.

Credit: 
University of California - San Diego

Researchers use satellite imaging to map groundwater use in California's Central Valley

image: New work from UC San Diego could be revolutionary for managing groundwater use in agricultural regions around the world, as groundwater monitoring and management have been notoriously difficult to carry out due to lack of reliable data.

Image: 
adamkaz

Researchers at the University of California San Diego report in a new study a way to improve groundwater monitoring by using a remote sensing technology (known as InSAR), in conjunction with climate and land cover data, to bridge gaps in the understanding of sustainable groundwater in California's San Joaquin Valley.

Their work could be revolutionary for managing groundwater use in agricultural regions around the world, as groundwater monitoring and management have been notoriously difficult to carry out due to lack of reliable data.

The satellite-based InSAR (interferometric synthetic aperture radar) is used to make high-resolution maps of land surface motion in space and time, including measurement of subsidence (or sinking). Subsidence can occur when large amounts of groundwater are removed from underground stores, called aquifers.

The study, published in the journal Environmental Research Letters, took advantage of the incredibly fine-scale resolution of InSAR to evaluate subsidence patterns according to crop type, revealing surprising results. For example, despite reports of high water consumption by fruit and nut crops in California, the crop types with the greatest rates of subsidence, and by association the greatest rates of groundwater use, were field crops such as corn and soy, followed by pasture crops like alfalfa, truck crops like tomatoes, and lastly, fruit and nut crops like almonds and grapes.

"Our initial hypothesis was that fruit and nut crops would be associated with some of the highest rates of subsidence, but we found the opposite," said study lead author, Morgan Levy, an assistant professor with a joint appointment with UC San Diego's Scripps Institution of Oceanography and School of Global Policy and Strategy.

Because displacement is a response to groundwater storage change in locations with varying geology, soils and vegetation, the interpretation of InSAR varies across locations, unlike satellite measurements of climate that have the same interpretation in any location. Therefore, InSAR must be combined with other sources of geophysical data to achieve location-specific insight into groundwater use.

By combining InSAR with other land surface datasets including land cover, potential evapotranspiration (a measure of plant water demand), and the location of surface water supply networks, UC San Diego researchers found that between 2015 and 2017, subsidence occurred at much higher rates in irrigated cultivated land compared to undeveloped land, and in dry surface water-limited years relative to wet years.

Over the study period, there was a median 272 millimeters (about 10.7 inches) of total cumulative subsidence for field crops (like corn and soy), and a dry water year subsidence rate of 131 millimeters (5 inches) per year. For fruit and nut crops (like almonds and grapes), there was a median 62 millimeters (2.5 inches) of total subsidence over the study period, and a dry water year subsidence rate of 31 millimeters (1 inch) per year.
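
As a quick check on those figures, the short sketch below converts the reported medians from millimeters to inches and compares the two crop classes; it uses only the numbers quoted above.

```python
# Unit conversion and comparison of the reported median subsidence values.
MM_PER_INCH = 25.4

subsidence_mm = {
    # crop class: (total cumulative subsidence, dry-year rate per year), in mm
    "field crops (corn, soy)": (272.0, 131.0),
    "fruit and nut crops (almonds, grapes)": (62.0, 31.0),
}

for crop, (total_mm, dry_rate_mm) in subsidence_mm.items():
    print(f"{crop}: total {total_mm / MM_PER_INCH:.1f} in, "
          f"dry-year rate {dry_rate_mm / MM_PER_INCH:.1f} in/yr")

ratio = (subsidence_mm["field crops (corn, soy)"][0]
         / subsidence_mm["fruit and nut crops (almonds, grapes)"][0])
print(f"Field crops subsided about {ratio:.1f} times more than fruit and nut crops")
```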

"The outcome might be explained by two things. First, on average fruits and nuts require less water physiologically, compared to field and pasture crops. Second, field and pasture crops tend to use irrigation methods that are less efficient and higher-volume than those used by fruit and nut crops," Levy said. "However, fruits and nuts may still consume greater total volumes of water because they occupy more land area, even if their groundwater use intensity is less."

Methods and findings from this research could be used to support the state's ongoing effort to prevent overdraft of groundwater aquifers. Groundwater is a critical resource both nationally and globally: In the U.S., groundwater is a source of drinking water for roughly half of the population, and constitutes the largest source of irrigation water for agriculture. Irrigation accounts for approximately 70 percent of total U.S. groundwater withdrawals, and California has the highest rates of groundwater pumping in the nation.

"Our findings indicate that in the Central Valley, the costs and benefits of transitions away from field crops and towards fruit and nut crops in recent years are more complex than typically assumed," Levy added. "Our results suggest the possibility that transitions to fruit and nut cultivation might be desirable, at least from a groundwater sustainability perspective, although more research is needed to confirm this."

Global potential to advance groundwater monitoring and management

California is an example of a semi-arid, irrigation-dependent climate for agriculture. Coordinated efforts by the UC San Diego team of climate scientists and geophysicists to link subsidence, groundwater and surface water use, and crop production data across comparable time and space scales have tremendous potential to advance groundwater monitoring and management in agricultural regions in other parts of the world, said the authors.

In the San Joaquin Valley during wet years, farmers may receive up to 100 percent of their surface water allocations, while in extremely dry years, they may receive none. When surface water supplies are unavailable, farmers mine groundwater. Thus, groundwater has become increasingly important under climate change, as California and many parts of the world have experienced surface water shortages. However, excessive pumping does occur, even in relatively wet years. And, aquifers can run out.

In 2014, California passed legislation mandating a gradual, locally led shift towards sustainable use of groundwater--the resource on which 85 percent of its population and much of its $50-billion agriculture industry rely. The data from InSAR can be critical to the state's efforts to perform effective monitoring and management in response to climate change.

While the legislation has encouraged local agencies to begin to use InSAR for documenting land subsidence, uses of InSAR for direct monitoring of groundwater use are early in their development. The UC San Diego research efforts provide an example of how water managers might use satellite data sources, including InSAR, to directly monitor local relationships between subsidence, groundwater pumping and crop portfolios.

"The promise of InSAR lies in our ability to combine it with other sources of geophysical and social data to answer water policy-relevant questions," Levy and co-authors wrote. "We provide a preview of the power of such a synthesis, demonstrating that spatial patterns of subsidence and their relationship to agricultural cultivation and associated water demand are clear and robust."

They concluded, "Our findings suggest that policy levers supporting sustainable groundwater management might benefit from consideration of the groundwater use intensity of crop selection, not only the difficult-to-define sustainability of groundwater extraction volumes over groundwater aquifer boundaries that remain uncertain and that are costly to delineate."

Credit: 
University of California - San Diego

The most sensitive and fastest graphene microwave bolometer

image: Schematics of the device, which consists of a graphene Josephson junction, which is integrated into a microwave circuit.

Image: 
harvard-icfo-mit-bbntechnologies-nims

Bolometers are devices that measure the power of incident electromagnetic radiation through the heating of materials whose electrical resistance depends on temperature. These instruments are among the most sensitive detectors so far used for infrared radiation detection and are key tools for applications that range from advanced thermal imaging, night vision and infrared spectroscopy to observational astronomy, to name a few.

Even though they have proven to be excellent sensors for this specific range of radiation, the challenge lies in attaining high sensitivity, fast response time and strong light absorption all at once, which is not always accomplished. Many studies have sought higher-sensitivity bolometers by reducing the size of the detector and thus increasing the thermal response, and in doing so researchers have found that graphene seems to be an excellent candidate.

If we focus on the infrared range, several experiments have demonstrated that if you take a sheet of graphene and place it in between two layers of superconducting material to create a Josephson junction, you can obtain a single-photon detector. At low temperatures, and in the absence of photons, a superconducting current flows through the device. When a single infrared photon passes through the detector, the heat it generates is enough to warm up the graphene, which alters the Josephson junction such that no superconducting current can flow. So you can actually detect the photons passing through the device by measuring the current. This works essentially because graphene has an almost negligible electronic heat capacity. Unlike materials such as water that retain heat, graphene lets a single low-energy photon heat the detector enough to block the superconducting current and then dissipate quickly, allowing the detector to reset rapidly and thus achieve very fast response times and high sensitivities.

Trying to take a step further and move to longer wavelengths, in a recent study published in Nature, a team of scientists that includes ICFO researcher Dmitri Efetov, together with colleagues from Harvard University, Raytheon BBN Technologies, MIT, and the National Institute for Materials Science, has developed a graphene-based bolometer that can detect microwave photons at extremely high sensitivities and with fast response times.

Just like with the infrared range, the team took a sheet of graphene and placed it in between two layers of superconducting material to create a Josephson junction. This time, they went an entirely new route and attached a microwave resonator to generate the microwave photons; by passing these photons through the device, they were able to reach unprecedented detection levels. In particular, they achieved an energy resolution equivalent to the energy of a single 32 GHz photon and detection readouts 100,000 times faster than those of the fastest nanowire bolometers constructed so far.
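
For scale, the energy of a single 32 GHz photon follows directly from E = hf. The calculation below uses standard physical constants; the result is not a number quoted in the paper.

```python
# Energy of one 32 GHz microwave photon, E = h * f
PLANCK_H = 6.62607015e-34        # joule-seconds (SI-defined value)
JOULES_PER_EV = 1.602176634e-19

frequency_hz = 32e9
energy_j = PLANCK_H * frequency_hz               # about 2.1e-23 J
energy_microev = energy_j / JOULES_PER_EV * 1e6  # about 132 micro-eV

print(f"{energy_j:.2e} J, roughly {energy_microev:.0f} micro-eV")
```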

The results achieved in this study mean a major breakthrough in the field of bolometers. Not only has graphene proven to be an ideal material for infrared sensing and imaging, but it has also proven to work at longer wavelengths, reaching the microwave range, where it likewise attains extremely high sensitivities and ultra-fast readout times.

As ICFO Professor Dmitri Efetov comments, "Such achievements were thought impossible with traditional materials, and graphene did the trick again. This opens entirely new avenues for quantum sensors for quantum computation and quantum communication."

Credit: 
ICFO-The Institute of Photonic Sciences

NASA confirms heavy rainfall, strengthening of Tropical Storm Marie

image: On Sept. 30 at 5:30 a.m. EDT (0930 UTC), NASA's IMERG estimated Tropical Storm Marie was generating as much as 30 to 40 mm (1.2 to 1.6 inches of rain/dark pink/red) around the center of circulation. Rainfall throughout most of the storm and in bands of thunderstorms west of the center, was occurring between 2 and 15 mm (0.08 to 0.6 inches/yellow and green colors) per hour. The rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite.

Image: 
NASA/NOAA/NRL

Tropical Storm Marie has formed in the Eastern Pacific Ocean, and NASA satellite data helped confirm the strengthening of the storm. In addition, using a NASA satellite rainfall product that incorporates data from satellites and observations, NASA estimated Marie's rainfall rates, which provided more clues about intensification.

Tropical Depression 18E formed on Sept. 29 by 5 p.m. EDT well southwest of the southwestern coast of Mexico. Twelve hours later the depression strengthened into a tropical storm and was named Marie.

Marie's Status on Sept. 30

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Marie was located near latitude 14.2 degrees north and longitude 113.8 degrees west. Marie is located about 655 miles (1,050 km) south-southwest of the southern tip of Baja California, Mexico and is moving toward the west near 16 mph (26 kph).

A westward to west-northwestward motion is expected through Friday. Maximum sustained winds have increased to near 65 mph (100 kph) with higher gusts. The estimated minimum central pressure is 997 millibars.

Estimating Marie's Rainfall Rates from Space

NASA's Integrated Multi-satellitE Retrievals for GPM or IMERG, which is a NASA satellite rainfall product, estimated on Sept. 30 at 5:30 a.m. EDT (0930 UTC) that Tropical Storm Marie was generating as much as 30 to 40 mm (1.2 to 1.6 inches) of rain around the center of circulation. That heavy rainfall near the center is suggestive of hot towering thunderstorms.

A "hot tower" is a tall cumulonimbus cloud that reaches at least to the top of the troposphere, the lowest layer of the atmosphere. It extends approximately 9 miles/14.5 km high in the tropics. These towers are called "hot" because they rise to such altitude due to the large amount of latent heat. Water vapor releases this latent heat as it condenses into liquid. Those towering thunderstorms have the potential for heavy rain. NASA research shows that a tropical cyclone with a hot tower in its eyewall was twice as likely to intensify within six or more hours, than a cyclone that lacks a hot tower.

Rainfall throughout most of the storm and in bands of thunderstorms west of the center was occurring at a rate of between 2 and 15 mm (0.08 to 0.6 inches) per hour.

At the U.S. Naval Research Laboratory in Washington, D.C., the IMERG rainfall data was overlaid on infrared imagery from NOAA's GOES-16 satellite to provide the full extent of the storm.

NASA satellite imagery has shown that Marie's structure has been gradually improving. The National Hurricane Center (NHC) noted that Marie's center is embedded beneath a central dense overcast feature, and the band of thunderstorms in the western quadrant of the storm has become more pronounced and continuous. In addition, a mid-level eye has begun to form, as observed in microwave satellite data.

What Does IMERG Do?

This near-real time rainfall estimate comes from NASA's IMERG, which combines observations from a fleet of satellites, in near-real time, to provide near-global estimates of precipitation every 30 minutes. By combining NASA precipitation estimates with other data sources, we can gain a greater understanding of major storms that affect our planet.

What IMERG does is "morph" high-quality satellite observations along the direction of the steering winds to deliver information about rain at times and places where such satellite overflights did not occur. Information morphing is particularly important over the majority of the world's surface that lacks ground-radar coverage. Basically, IMERG fills in the blanks between weather observation stations.
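
To make the morphing idea concrete, here is a deliberately simplified toy version: two precipitation snapshots are advected toward an intermediate time along a uniform displacement and then blended. The real IMERG algorithm is far more sophisticated; the function below is only an illustration, and its grids and displacement are made up.

```python
import numpy as np

def morph_precip(field_t0, field_t1, frac, shift_cells):
    """Toy 'morphing' between two precipitation snapshots.

    field_t0, field_t1 : 2-D rain grids at consecutive satellite overpasses
    frac               : fraction of the interval elapsed (0 to 1)
    shift_cells        : (rows, cols) displacement over the whole interval,
                         standing in for advection by the steering winds
    """
    dy, dx = shift_cells
    fwd = np.roll(field_t0, (round(frac * dy), round(frac * dx)), axis=(0, 1))
    bwd = np.roll(field_t1, (-round((1 - frac) * dy), -round((1 - frac) * dx)), axis=(0, 1))
    return (1 - frac) * fwd + frac * bwd  # weight by temporal distance

# Example: a rain cell seen at (1, 1) at t0 and at (3, 3) at t1 is placed
# halfway between the two positions for the mid-interval estimate.
a = np.zeros((5, 5)); a[1, 1] = 10.0
b = np.zeros((5, 5)); b[3, 3] = 10.0
print(morph_precip(a, b, 0.5, shift_cells=(2, 2)))
```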

How Other NASA Satellites Help Forecasters

Infrared and water vapor data from NASA's Aqua, Terra and NASA-NOAA's Suomi NPP satellite were used to help forecasters assess the environment where Marie was headed. Infrared imagery provides temperature information about cloud tops and sea surface environments. Colder cloud tops indicate stronger storms. Sea surface temperature data are also critical for forecasters because tropical cyclones require ocean temperatures of at least 26.6 degrees Celsius (80 degrees Fahrenheit) to maintain intensity. Warmer waters can help with tropical cyclone intensification, while cooler waters can weaken tropical cyclones.

Water vapor analysis of tropical cyclones tells forecasters how much potential a storm has to develop. Water vapor releases latent heat as it condenses into liquid. That liquid becomes clouds and thunderstorms that make up a tropical cyclone. Temperature is important when trying to understand how strong storms can be. The higher the cloud tops, the colder and the stronger the storms.

Marie's Forecast

NHC Hurricane Specialist Robbie Berg noted, "The stage appears set for Marie to rapidly intensify during the next couple of days. Water vapor imagery indicates that the easterly [wind] shear over the cyclone has continued to decrease and should be generally low for the next 3 days, and upper-level divergence will also be in place during that period to help ventilate the storm. The thermodynamics are also favorable for fast strengthening, highlighted by sea surface temperatures of 28-29 degrees Celsius and plenty of moisture in the surrounding environment. Due to these conditions, the NHC forecast explicitly shows rapid intensification during the next couple of days, with a peak intensity likely occurring sometime between 48 and 60 hours."

The National Hurricane Center expects rapid strengthening and Marie is expected to become a hurricane this evening or tonight. Marie could then become a major hurricane by late Thursday, Oct. 1.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Study: Greenland is on track to lose ice faster than in any century over 12,000 years

image: Infographic describing the study's findings.

Image: 
Bob Wilder / University at Buffalo

BUFFALO, N.Y. -- If human societies don't sharply curb emissions of greenhouse gases, Greenland's rate of ice loss this century is likely to greatly outpace that of any century over the past 12,000 years, a new study concludes.

The research will be published on Sept. 30 in the journal Nature. The study employs ice sheet modeling to understand the past, present and future of the Greenland Ice Sheet. Scientists used new, detailed reconstructions of ancient climate to drive the model, and validated the model against real-world measurements of the ice sheet's contemporary and ancient size.

The findings place the ice sheet's modern decline in historical context, highlighting just how extreme and unusual projected losses for the 21st century could be, researchers say.

"Basically, we've altered our planet so much that the rates of ice sheet melt this century are on pace to be greater than anything we've seen under natural variability of the ice sheet over the past 12,000 years. We'll blow that out of the water if we don't make severe reductions to greenhouse gas emissions," says Jason Briner, PhD, professor of geology in the University at Buffalo College of Arts and Sciences. Briner led the collaborative study, coordinating the work of scientists from multiple disciplines and institutions.

"If the world goes on a massive energy diet, in line with a scenario that the Intergovernmental Panel on Climate Change calls RCP2.6, our model predicts that the Greenland Ice Sheet's rate of mass loss this century will be only slightly higher than anything experienced in the past 12,000 years," Briner adds. "But, more worrisome, is that under a high-emissions RCP8.5 scenario -- the one the Greenland Ice Sheet is now following -- the rate of mass loss could be about four times the highest values experienced under natural climate variability over the past 12,000 years."

He and colleagues say the results reiterate the need for countries around the world to take action now to reduce emissions, slow the decline of ice sheets, and mitigate sea level rise. The research was largely funded by the U.S. National Science Foundation.

Combining ice sheet modeling with field work, real-life observations

The study brought together climate modelers, ice core scientists, remote sensing experts and paleoclimate researchers at UB, NASA's Jet Propulsion Laboratory (JPL), the University of Washington (UW), Columbia University's Lamont-Doherty Earth Observatory (LDEO), the University of California, Irvine (UCI) and other institutions.

This multidisciplinary team used a state-of-the-art ice sheet model to simulate changes to the southwestern sector of the Greenland Ice Sheet, starting from the beginning of the Holocene epoch some 12,000 years ago and extending forward 80 years to 2100.

Scientists tested the model's accuracy by comparing results of the model's simulations to historical evidence. The modeled results matched up well with data tied to actual measurements of the ice sheet made by satellites and aerial surveys in recent decades, and with field work identifying the ice sheet's ancient boundaries.

Though the project focused on southwestern Greenland, research shows that changes in the rates of ice loss there tend to correspond tightly with changes across the entire ice sheet.

"We relied on the same ice sheet model to simulate the past, the present and the future," says co-author Jessica Badgeley, a PhD student in the UW Department of Earth and Space Sciences. "Thus, our comparisons of the ice sheet mass change through these time periods are internally consistent, which makes for a robust comparison between past and projected ice sheet changes."

"We have significantly improved our understanding of how anomalous future Greenland change will be," says co-author Joshua Cuzzone, PhD, an assistant project scientist at UCI who completed much of his work on the study as a postdoctoral researcher at JPL and UCI. "This work represents a massive success for multidisciplinary science and collaboration, and represents a framework for future successful multidisciplinary work."

Cuzzone and other researchers at UCI and JPL led ice sheet modeling, leveraging the work of colleagues at UW, who used data from ice cores to create maps of temperatures and precipitation in the study region that were used to drive the ice sheet model simulations up to the year 1850. Previously published climate data was used to drive the simulations after that date.

UB and LDEO scientists partnered on field work that helped validate the model by identifying the ice sheet's boundaries in southwestern Greenland thousands of years ago.

"We built an extremely detailed geologic history of how the margin of the southwestern Greenland Ice Sheet moved through time by measuring beryllium-10 in boulders that sit on moraines," says co-author Nicolás Young, PhD, associate research professor at LDEO. "Moraines are large piles of debris that you can find on the landscape that mark the former edge of an ice sheet or glacier. A beryllium-10 measurement tells you how long that boulder and moraine have been sitting there, and therefore tells you when the ice sheet was at that exact spot and deposited that boulder.

"Amazingly, the model reproduced the geologic reconstruction really well. This gave us confidence that the ice sheet model was performing well and giving us meaningful results. You can model anything you want and your model will always spit out an answer, but we need some way to determine if the model is doing a good job."

A continuous timeline of changes to the Greenland Ice Sheet

The study makes an important contribution by creating a timeline of the past, present and future of the Greenland Ice Sheet, Briner says. The results are sobering.

"We have long timelines of temperature change, past to present to future, that show the influence of greenhouse gases on Earth's temperature," Briner says. "And now, for the first time, we have a long timeline of the impacts of that temperature -- in the form of Greenland Ice Sheet melt -- from the past to present to future. And what it shows is eye-opening."

"It is no secret that the Greenland Ice Sheet is in rough shape and is losing ice at an increasing rate," Young says. "But if someone wants to poke holes in this, they could simply ask, 'how do you know this isn't just part of the ice sheet's natural variability?' Well, what our study suggests is that the rate of ice loss for this century will exceed the rate of ice loss for any single century over the last 12,000 years. I think this is the first time that the current health of the Greenland Ice Sheet has been robustly placed into a long-term context."

Despite these sobering results, one vital takeaway from the model's future projections is that it's still possible for people and countries around the world to make an important difference by cutting emissions, Briner says. Models of the RCP2.6 and RCP8.5 scenarios yield very different results, with high-emission scenarios producing massive declines in the ice sheet's health, and significant sea level rise.

"Our findings are yet another wake-up call, especially for countries like the U.S.," Briner says. "Americans use more energy per person than any other nation in the world. Our nation has produced more of the CO2 that resides in the atmosphere today than any other country. Americans need to go on an energy diet. The most affluent Americans, who have the highest energy footprint, can afford to make lifestyle changes, fly less, install solar panels and drive an energy-efficient vehicle."

"This study shows that future ice loss is likely to be larger than anything that the ice sheet experienced in the Holocene -- unless we follow a low-carbon emission scenario in the future," Badgeley says.

Credit: 
University at Buffalo

Aortic valve replacement during COVID-19 pandemic

What The Study Did: The outcomes associated with deferred compared with expedited aortic valve replacement in patients with severe aortic stenosis during the COVID-19 pandemic are evaluated in this observational study.

Authors: Thomas Pilgrim, M.D. M.Sc., of the University of Bern in Switzerland, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.20402)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Novel role of microglia as modulators of neurons in the brain is discovered

Immune cells in the brain that act as scavengers to remove dying cells also play a potentially pivotal role in the regulation of behavior in both mice and humans, a research team from Mount Sinai has found. The newly identified function of the scavenger cells, known as microglia, to protect the brain from abnormal activation in health and disease has implications for treating behavioral abnormalities associated with neurodegenerative and inflammatory diseases in humans. The study was published September 30, 2020 in Nature.

"When we think about brain function, we typically think about how neurons control our thoughts and behavior," says Anne Schaefer, MD, PhD, Professor of Neuroscience, and Psychiatry, at the Icahn School of Medicine at Mount Sinai, and senior author of the study. "But the brain also contains large amounts of non-neuronal cells, including microglia, and our study puts a fresh spotlight on these cells as partners of neurons in the regulation of neuronal activity and behavior. We found that microglia can sense and respond to neuronal activation and provide negative feedback on excessive neuronal activity. This novel microglia-mediated mechanism of neuromodulation could play an important role in protecting the brain from disease."

Mount Sinai researchers identified the biochemical circuit that supports neuron-microglia communication. When neurons are active, they release a molecule called adenosine triphosphate (ATP). Microglia can sense extracellular ATP, and the compound draws them toward the active neurons. As the next step, the microglia break ATP down to generate adenosine, which then acts on adenosine receptors on the surface of active neurons to suppress their activity and prevent excessive activation.
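To make the logic of this negative feedback loop concrete, the toy simulation below tracks neuronal activity, extracellular ATP and adenosine using a few invented rate constants; it is a minimal sketch of the mechanism described above, not the authors' model.

    # Toy model of the microglia-mediated feedback described above.
    # All rate constants are invented for illustration only.
    def simulate(steps=2000, dt=0.01):
        activity, atp, adenosine = 1.0, 0.0, 0.0
        drive = 1.0  # constant excitatory input to the neurons
        for _ in range(steps):
            d_activity = drive - 0.5 * activity - 2.0 * adenosine * activity
            d_atp = 1.0 * activity - 1.5 * atp          # active neurons release ATP
            d_adenosine = 1.5 * atp - 0.8 * adenosine   # microglia convert ATP to adenosine
            activity += dt * d_activity
            atp += dt * d_atp
            adenosine += dt * d_adenosine
        return activity, atp, adenosine

    # Without the adenosine term, activity would settle at drive / 0.5 = 2.0;
    # with the feedback it settles near 0.54, mimicking suppression of overactivation.
    print(simulate())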

"In inflammatory conditions and neurodegenerative diseases like Alzheimer's, microglia become activated and lose their ability to sense ATP and to generate adenosine," says Ana Badimon, PhD, a former student in the Schaefer Lab and first author of the study.

"This suggested to us that behavioral alterations associated with disease may be mediated, in part, by changes in microglial-neuron communication," adds Dr. Schaefer, who is also Co-Director of the Center for Glial Biology at The Friedman Brain Institute at the Icahn School of Medicine.

Dr. Schaefer describes the identification of the biochemical circuit that enables microglial control of neuronal responses as a potential "paradigm shift" in our understanding of how innate immune cells in the brain can contribute to behavior. This observation is particularly important, she adds, given the fact that microglia, while residing in the brain, are uniquely equipped to also respond to signals generated in the peripheral body. Microglia can therefore act as an interface between peripheral body changes, like a viral infection, and the brain by communicating these signals to neurons to modulate behavioral responses.

By shedding valuable light on the interaction of neurons and microglia, the study carries a number of practical implications for further research. These range from new approaches to modulating normal behaviors by targeting microglia to potential treatments for behavioral abnormalities associated with neurodegenerative diseases.

"The future promise of our study also lies in the identification of novel signals like ATP that will allow microglia to modulate the function of highly diverse neurons, including neurons controlling sleep or metabolism," says Dr. Schaefer. "We believe our work has the potential to add to our knowledge about the mechanisms of neuromodulation."

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Science snapshots September 2020

image: Near-field scanning microscope image of nanocircuits "written" into a 2D device made of boron nitride and graphene.

Image: 
Alex Zettl/Berkeley Lab

An Innovative Pattern: Scientists Rewrite Rules for 2D Electronics

By Theresa Duque

A research team led by Alex Zettl, senior faculty scientist in Berkeley Lab's Materials Sciences Division and professor of physics at UC Berkeley, has developed a new technique for fabricating tiny circuits from ultrathin materials for next-generation electronics, such as rewritable, low-power memory circuits. Their findings were reported in the journal Nature Electronics.

Using the nanofabrication facility at the Molecular Foundry, the researchers prepared two different 2D devices known as van der Waals heterostructures: one by sandwiching graphene between two layers of boron nitride, and another by sandwiching molybdenum disulfide between boron nitride layers in the same way.

By applying a finely focused electron beam to the boron nitride "sandwiches," the researchers demonstrated that they can "write" nanoscale conducting channels, or nanocircuits, into the core "active" layer by controlling the intensity of the electron beam exposure while applying an appropriate back-gate electric field.

When written into the graphene or molybdenum disulfide layer, these nanocircuits allow high densities of electrons, or quasiparticles called holes, to accumulate and move through the semiconductor along narrow predetermined tracks at ultrahigh speeds with few collisions - like cars racing through a freeway within inches of each other without crashing or stalling.

The researchers also found that reapplying the electron beam with a special back-gate to the 2D materials can erase nanocircuits that have already been written - or write additional or different circuits in the same device, which suggests that the technique has great potential for next-generation reconfigurable 2D electronics.

Importantly, the researchers demonstrated that the material's conducting states and ultrahigh electronic mobility persist even after the electron beam and the back-gate have been removed. This finding is critical to many applications, including energy-efficient nonvolatile memory devices that do not require constant power to retain data, said lead author Wu Shi, a project scientist in Berkeley Lab's Materials Sciences Division and the Zettl Lab at UC Berkeley.

Synthetic Pathways Turn Plants into Biofactories for New Molecules

By Emily Scott

Plants can produce a wide range of molecules, many of which help them fight off harmful pests and pathogens. Biologists have harnessed this ability to produce many molecules important for human health -- aspirin and the antimalarial drug artemisinin, for example, are derived from plants.

Now, scientists at the Joint BioEnergy Institute (JBEI) are using synthetic biology to give plants the ability to create molecules never seen before in nature. New research led by Patrick Shih, director of Plant Biosystems Design at JBEI, and Beth Sattely of Stanford University describes success in swapping enzymes between plants to engineer new synthetic metabolic pathways. These pathways gave plants the ability to create new classes of chemical compounds, some of which have enhanced properties.

"This is a demonstration of how we can begin to start rewiring and redesigning plant metabolism to make molecules of interest for a range of applications," Shih said.

Engineering plants to make new molecules themselves provides a sustainable platform for producing a wide range of compounds. One of the compounds the researchers were able to create is comparable in effectiveness to commercially used pesticides, while others may have anti-cancer properties. The long-term goal is to engineer plants to be biofactories of molecules such as these, bypassing the need to externally spray pesticides or synthesize therapeutic molecules in a lab.

"That's the motivation for where we could go," Shih said. "We want to push the boundaries of plant metabolism to make compounds we've never seen before."

JBEI is a DOE Bioenergy Research Center supported by DOE's Office of Science.

Transforming Waste into Bio-Based Chemicals

By Emily Scott

Researchers at Berkeley Lab have transformed lignin, a waste product of the paper industry, into a precursor for a useful chemical with a wide range of potential applications.

Lignin is a complex material found in plant cell walls that is notoriously difficult to break down and turn into something useful. Typically, lignin is burned for energy, but scientists are focusing on ways to repurpose it.

In a recent study, researchers demonstrated their ability to convert lignin into a chemical compound that is a building block of bio-based ionic liquids. The research was a collaboration between the Advanced Biofuels and Bioproducts Process Development Unit, the Joint BioEnergy Institute (both established by the Department of Energy and based at Berkeley Lab), and Queens University of Charlotte.

Ionic liquids are powerful solvents/catalysts used in many important industrial processes, including the production of sustainable biofuels and biopolymers. However, traditional ionic liquids are petroleum-based and costly. Bio-based ionic liquids made with lignin, an inexpensive organic waste product, would be cheaper and more environmentally friendly.

"This research brings us one step closer to creating bio-based ionic liquids," said Ning Sun, the study's co-corresponding author. "Now we just need to optimize and scale up the technology."

According to Sun, bio-based ionic liquids also have a broad range of potential uses outside of industry. "We now have the platform to synthesize bio-based ionic liquids with different structures that have different applications, such as antivirals," Sun said.

This research was funded by DOE's Bioenergy Technologies Office through the Technology Commercialization Fund.

Providing New Technologies for Vaccine Development

By Lida Gifford

Vaccines, which help the body recognize infectious microorganisms and stage a stronger and faster response, are made up of proteins that are specific to each type of microorganism. In the case of a virus, viral proteins - or antigens - can sometimes be attached to a protein scaffold to help mimic the shape of the virus and elicit a stronger immune response. Using scaffolds to approximate the natural configuration of the antigen is an emerging approach to vaccine design.

A team of scientists led by David Baker at the University of Washington developed a method to design artificial proteins to serve as a framework for the viral antigens. Their study was published recently in the journal eLife. Berkeley Lab scientists collected data at the Advanced Light Source to visualize the atomic structure and determine the dynamics of the designed scaffolds.

"When bound, the scaffolds assume predicted geometries, which more closely approximate the virus shape and thereby maximize the immune response," said Banu Sankaran, a research scientist in the Molecular Biophysics and Integrated Bioimaging (MBIB) Division. "It was exciting to collaborate on this method to predictably design frameworks, which could lead to more effective vaccines, especially for viruses that we don't have a scaffold for."

The team also included Peter Zwart, an MBIB staff scientist; small-angle X-ray scattering data were collected at the SIBYLS beamline by MBIB's Kathryn Burnett and Greg Hura.

The Advanced Light Source is a DOE Office of Science user facility.

Credit: 
DOE/Lawrence Berkeley National Laboratory