Tech

A liquid crystal walks into an optical resonator: new Skoltech research helps model future optoelectronic devices

image: Photonic lab at Skoltech

Image: 
Timur Sabirov/Skoltech

Researchers at Skoltech and their colleagues have proposed a photonic device made of two optical resonators with liquid crystals inside them, studying the optical properties of a system that could be useful for future generations of optoelectronic and spinoptronic devices. The paper was published in the journal Physical Review B.

The simplest kind of optical resonator consists of two mirrors facing each other, "squeezing" light between them. When you stand between two mirrors, you see infinite copies of yourself; when a liquid crystal, the kind in your computer and smartphone screen, is placed into a much smaller and somewhat more complex resonator, interesting things tend to happen. Because the orientation of the liquid crystal molecules can be changed by applying a voltage, the researchers were able to control various characteristics of light propagation inside the resonator and, in a sense, use photons to simulate the operation of electronic devices that are widely used in everyday life.

"One of the main trends in physics now is the transition from electronic to photonic computing systems, since the latter are able to significantly increase the speed of processing and transmitting information, as well as to potentially significantly reduce energy consumption. That is why studies of various kinds of tunable photonic architectures mimicking the properties of electronic analogues attract great interest," says Pavel Kokhanchik, MSc student at Skoltech and the paper's first author.

Kokhanchik, Skoltech Professor Pavlos Lagoudakis and their colleagues decided to see what happens if two such optical resonators filled with liquid crystals are placed very close to each other, at a distance of several micrometers. The researchers expected to uncover new properties not found in a single liquid-crystal microcavity (resonator), which had recently been investigated in collaboration with colleagues from the University of Warsaw.

The resonators share the same "pool" of photons, which couples them, so they behave somewhat like two pendulums that, when placed in close proximity, synchronize to a common frequency. The team found that in this case light acquires new properties, studied in a field called topological physics. These properties can be fine-tuned, so the device expands the range of physical systems that can be imitated, both for fundamental studies and for practical use.

"Our work is just one small step in the huge field of research of photonic analogues of electronic solid-state systems. Fundamental research will certainly be followed by the compaction of these devices, their production on a chip on an industrial scale, and their integration into everyday devices, but at the moment this is a rather distant prospect," Pavel Kokhanchik notes.

The scientists plan to implement a double liquid crystal cavity experimentally to demonstrate the rich physics postulated in the paper. They will also continue researching similar double-microcavity systems, studying them in the strong light-matter coupling regime.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

LJI research leads to promising combination therapy for type 1 diabetes

image: A promising therapeutic candidate for type 1 diabetes combines anti-IL-21 antibody with the diabetes drug liraglutide.

Image: 
La Jolla Institute for Immunology

LA JOLLA, CA--Translational research led by scientists at La Jolla Institute for Immunology (LJI) has resulted in a promising combination therapeutic candidate for adults with recent-onset type 1 diabetes.

The combination therapy was recently tested in a randomized, double-blind, placebo-controlled, phase 2 trial run and funded by the pharmaceutical company Novo Nordisk. The results, published recently in The Lancet Diabetes & Endocrinology, point to a potential way to treat the autoimmune disease without leaving the body vulnerable to infectious disease.

The therapeutic candidate combines anti-IL-21 antibody with the diabetes drug liraglutide. This two-pronged approach is based on research findings from the lab of LJI Professor Matthias von Herrath, M.D., who also serves as vice president and senior medical officer, Global Chief Medical Office, at Novo Nordisk.

"This is the first large trial for a combination therapy, and the data suggest it has value for patients," says von Herrath. "The groundwork for choosing a combination therapy was laid through preclinical work at La Jolla Institute."

Type 1 diabetes is an autoimmune disease that occurs when the body's own T cells mistakenly target insulin-producing beta cells in the pancreas. When beta cells die, the body loses its ability to regulate glucose levels, which can eventually lead to severe organ damage and death.

One challenge in treating type 1 diabetes is that therapies targeting "system-wide" T cell responses also run the risk of hindering the immune system's ability to fight real threats, such as viruses and bacteria.

The von Herrath Laboratory at LJI is focused on uncovering the molecular triggers of type 1 diabetes. Their work has pointed to ways to modulate parts of the immune system without suppressing overall immune system function.

In 2012, the von Herrath Laboratory published a study in Immunity showing the importance of the interleukin (IL)-21 receptor in allowing harmful T cells into the pancreas. Follow-up studies showed that an anti-IL-21 antibody could interrupt that signal and potentially shield the pancreas from attack.

Importantly, because the anti-IL-21 antibody appears to only affect a group of T cells, von Herrath and his colleagues believed the antibody might help treat type 1 diabetes without dampening the overall immune system.

In 2017, Novo Nordisk published a pre-clinical study in the Journal of Autoimmunity showing the effects of a combination therapy that consisted of anti-IL-21 monoclonal antibody combined with the FDA-approved type 2 diabetes drug liraglutide. Liraglutide has been shown to protect beta cell function. The study, which included von Herrath as a co-author, showed that this combination could reverse type 1 diabetes in a mouse model.

For the new study, von Herrath and his collaborators tested the combination therapy in a randomized, parallel-group, placebo-controlled, double-dummy, double-blind, phase 2 trial. Compared with the placebo group, patients who received the 54-week course of treatment had higher levels of endogenous insulin secretion. No safety concerns were identified.

The researchers followed up with study participants for 26 weeks after the treatment ended and found that the effects diminished during that time. They also found no lasting adverse changes to the immune system. The researchers note that the combination therapy will need to be assessed for long-term safety and efficacy in a phase 3 clinical trial.

Going forward, von Herrath and his laboratory at LJI are focused on translating their findings on both type 1 diabetes and type 2 diabetes from the laboratory to the clinic.

Credit: 
La Jolla Institute for Immunology

Experts recommend shared patient-doctor decision-making prior to lung cancer screening

image: "In our view, CMS should continue to require, as well as pay for, shared decision-making, including associated tobacco counseling, for people being considered for annual lung cancer screening because having yearly CT screening is a consequential decision," said Daniel Reuland, MD, MPH, UNC Lineberger Comprehensive Cancer Center member. "Patients should understand the benefits, harms and costs involved, and their values and preferences should be considered.

Image: 
UNC Lineberger Comprehensive Cancer Center

CHAPEL HILL, NC -- In a viewpoint published in JAMA on March 9, 2021, a University of North Carolina Lineberger Comprehensive Cancer Center researcher and two other experts endorsed the Centers for Medicare & Medicaid Services' (CMS) requirement that a patient and their doctor engage in a shared discussion of benefits and harms before proceeding with a low-dose spiral computed tomography (LDCT) scan as a method for preventing lung cancer death. An accompanying evidence report detailed the benefits and harms of screening, suggesting that shared decision-making between a patient and their health care professional is crucial to ensuring screening is used optimally and with fully informed consent.

"In our view, CMS should continue to require, as well as pay for, shared decision-making, including associated tobacco counseling, for people being considered for annual lung cancer screening because having yearly CT screening is a consequential decision," said Daniel Reuland, MD, MPH, one of the review authors, a member of the UNC Lineberger Comprehensive Cancer Center, and a professor in the division of General Medicine and Clinical Epidemiology at UNC School of Medicine. "Patients should understand the benefits, harms and costs involved, and their values and preferences should be considered. Because the decision-making process can be time-consuming, we also think shared decision-making could be done by trained, non-physician staff."

The experts note that during the COVID-19 pandemic many people have effectively received medical advice through technology such as Skype and Zoom. Therefore, they recommend that CMS continue to pay for counseling delivered by telehealth. In addition, if the patient is a current smoker, they said a professional should counsel that quitting smoking is by far the most important thing the patient can do to stay healthy.

The United States Preventive Services Task Force, the main evaluator of evidence for preventive strategies, now recommends low-dose CT screening for people 50 to 80 years old with a 20 pack-year smoking history. The new recommendations expand the group of people eligible for screening compared with the initial 2013 recommendations and include more Black people, who research has shown have a higher risk of developing lung cancer at earlier ages and with less tobacco exposure. The viewpoint authors believe shared decision-making is more important than ever, as it can promote patient engagement, tobacco cessation and screening adherence, which in turn may lead to greater health equity.

As part of standard practice, Reuland, who is also a research fellow at UNC's Cecil G. Sheps Center for Health Services Research, believes that a physician should explicitly ask about the patient's informed values and preferences regarding key tradeoffs of screening and use that information to reach a decision that makes sense for the patient. He also advocates for the use of decision aids, which are tools that can be used to help inform patients by making the essential issues clear and easy to understand.

Reuland said good, shared decision-making is a process that takes time to establish, regardless of whether it is face-to-face or via telehealth. Studies and improvement efforts are underway to learn how to optimize shared decision-making. Indeed, Reuland and others at UNC have recently tested a video decision-making tool and found it increases understanding of the balance between risks and benefits in patients eligible for screening.

"It is important to note that shared decision-making is advisable for all patients considering initiation of annual lung cancer screening, based on ethical grounds, regardless of whether or not they have Medicare or Medicaid. To my knowledge, non-CMS third-party payers will reimburse for this counseling," Reuland said.

Credit: 
UNC Lineberger Comprehensive Cancer Center

Milk prebiotics are the cat's meow, Illinois research shows

image: Research from the University of Illinois identifies key milk oligosaccharides in dog and cat milk, and shows that a molecular mimic of these compounds in pet foods makes them highly palatable and digestible, and may shift the gut microbiome in positive ways.

Image: 
University of Illinois

URBANA, Ill. - If you haven't been the parent or caregiver of an infant in recent years, you'd be forgiven for missing the human milk oligosaccharide trend in infant formulas. These complex carbohydrate supplements mimic human breast milk and act like prebiotics, boosting beneficial microbes in babies' guts.

Milk oligosaccharides aren't just for humans, though; all mammals make them. And new University of Illinois research suggests milk oligosaccharides may be beneficial for cats and dogs when added to pet diets.

But before testing the compounds, scientists had to find them.

"When we first looked into this, there had only been one study on milk oligosaccharides in dogs, and none in domestic cats. The closest were really small studies on a single lion and a single clouded leopard," says Kelly Swanson, the Kraft Heinz Company Endowed Professor in Human Nutrition in the Department of Animal Sciences and the Division of Nutritional Sciences at Illinois.

"Our study was the first robust characterization of dog and cat milk oligosaccharides," he adds. "Our data not only provide a better understanding of how milk meets the nutritional needs of newborn kittens and puppies, but also how it helps promote gut immunity and establish a healthy gut microbial community early in life." That research appears in the journal PLoS ONE.

The foundational study identified three predominant oligosaccharide structures in canine milk: 3'-sialyllactose, 6'-sialyllactose, and 2'-fucosyllactose, the same compound showing up in many infant formulas today. Together, these three structures made up more than 90% of the total oligosaccharides in canine milk.

Feline milk was much more complex and balanced, with approximately 15 structures making up 90% of total oligosaccharides. Of these, difucosyllactose-N-hexaose b, 3'-sialyllactose, and lacto-N-neohexaose each represented more than 10%.

"Even though domestic dogs and cats both evolved as carnivores, they are metabolically distinct in many ways. Although pet cats still exist as true carnivores, pet dogs are omnivorous in nature," Swanson says. "These new milk oligosaccharide data highlight another interesting difference between the species, justifying further research to reveal their role in the nutritional and health status of newborn puppies and kittens."

Even before Swanson and his colleagues identified the oligosaccharides in cat and dog milk, the pet food industry was beginning to recognize the potential benefits of these compounds as supplements in pet foods. In 2019, Swiss biotech company Gnubiotics Sciences announced an animal milk oligosaccharide-like product known as GNU100, but it hadn't been tested in animals. Swanson's team took that on.

In two separate studies, both published in the Journal of Animal Science, Swanson and his colleagues determined the safety, palatability, and digestibility of GNU100 in dogs and cats.

First, in vitro laboratory tests in cell cultures showed no toxic effects or tendencies to cause cell mutation. There was no reason to expect toxicity, but the result satisfies one of the basic FDA requirements for inclusion of any new ingredient in pet foods.

Next, the researchers mixed GNU100 at 1% with a fat source and coated commercial dry diets for cats or dogs. As a control, fat-coated diets without GNU100 were also offered. When animals got to choose between the control and 1% bowls, they went crazy for the GNU100.

"In the cats, it was a huge preference. They ate nearly 18 times more food with GNU100 than the control food. We had just been hoping they wouldn't reject it. You know, cats can be pretty finicky," Swanson says. "When we got the data back it was like, wow, they really love that stuff! And the dogs did, too."

Swanson explains GNU100 is composed of a complex mixture of oligosaccharides and peptides, small protein-containing compounds that may make the food more appetizing to cats and dogs.

Finally, the researchers included GNU100 in experimental diets at 0%, 0.5%, 1%, and 1.5% and fed them to healthy adult dogs and cats for six months. During that time, they measured stool quality, blood metabolites, and nutrient digestibility, and evaluated changes in gut metabolites and the gut microbial community.

Overall, cats and dogs did well with GNU100, with no adverse health effects. And the researchers saw shifts in the gut microbiome toward more beneficial species and their metabolite profiles.

Aside from the palatability test, changes associated with GNU100 were as expected, showing intriguing trends in gut microbiota and gut metabolites that Gnubiotics plans to explore in future studies. Swanson thinks they would have seen bigger benefits in a more targeted study focusing on newborn cats and dogs, geriatrics, or pets with compromised immune systems.

"Theoretically, these products should stabilize and feed good bacteria in the gut as well as limit the growth of potentially undesirable bacteria. So if an animal is undergoing treatment for something with antibiotics or is in a high stress situation, having that product in the diet might keep the gut from destabilizing," Swanson says. "Another target group for these products might be young animals as a way to maintain beneficial bacteria in the gut as they wean off their mothers. We'd need to do more testing to see if the product holds up in those target groups, but at least we know now that it is safe and well tolerated."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

A little squid and its glowing bacteria yield new clues to symbiotic relationships

image: The Hawaiian bobtail squid (Euprymna scolopes) has a unique symbiotic relationship with the bioluminescent bacteria Vibrio fischeri, which colonize a special light organ in the squid's mantle.

Image: 
David Slater

The relationship between the Hawaiian bobtail squid and the bioluminescent bacteria living in its light organ has been studied for decades as a model of symbiosis. Now researchers have used a powerful chemical analysis tool to identify a small molecule produced by the bacteria that appears to play an important role in their colonization of the light organ.

The study, published March 9 in the journal mBio, adds a new wrinkle to scientists' understanding of the chemical signaling involved in this iconic symbiotic relationship. "It's exciting that there are still new things to discover, even in such a well-studied system," said corresponding author Laura Sanchez, associate professor of chemistry and biochemistry at UC Santa Cruz.

The Hawaiian bobtail squid is a small nocturnal squid, about the size of a thumb, that lives in shallow coastal waters, hiding in the sand during the day and coming out at night to hunt for small shrimp and other prey. The bioluminescent glow from its light organ is directed downward and adjusted to match the intensity of light from the moon and stars, eliminating the squid's shadow and masking its silhouette. This "counterillumination" strategy helps conceal the squid both from bottom-dwelling predators and from its own prey.

A juvenile bobtail squid is completely free of bacteria when it first hatches, but within hours its light organ becomes colonized by a very specific type of bacteria called Vibrio fischeri. The baby squid enters an environment teeming with thousands of kinds of bacteria and millions of bacterial cells per milliliter of seawater, of which only a tiny fraction are V. fischeri. Yet only those specially adapted bacteria are able to take up residence inside the light organ.

"It's a very elegant symbiosis," Sanchez said. "We already knew that not all strains of V. fischeri are the same--some are better colonizers than others--and we wanted to know if that's being determined by chemical signals."

Sanchez's lab uses a technique called imaging mass spectrometry, which allows researchers to directly visualize the spatial distribution of all kinds of molecules in a sample, such as a squid specimen or a bacterial colony. Most techniques for seeing where specific molecules are in a sample involve labeling the targeted molecules. But imaging mass spectrometry allows untargeted investigations--in other words, you don't have to know what you're looking for.

"It's very difficult to visualize the chemistry in an organism," Sanchez explained. "With imaging mass spectrometry, we are directly detecting the chemicals, and we know where in the sample they are."

The small molecule identified in this study is a type of diketopiperazine (DKP), a large family of cyclic dipeptides. This particular DKP--cyclo(D-histidyl-L-proline), or cHP-3--was directly detected in the light organs of the colonized squid. It was also produced more abundantly by strains of V. fischeri that showed increased biofilm formation, which correlates with colonization ability. And finally, supplementing bacterial cultures with cHP-3 led to a concentration-dependent increase in bioluminescence.

"We know that it is produced during the first few hours of colonization when the symbiosis gets established, and we also know that it influences bacterial luminescence, and bioluminescence and colonization are tied together," Sanchez said.

The results indicate that cHP-3 is an important chemical signal specific to this symbiosis, but the researchers have not yet determined exactly what its role is or the details of its interactions.

"We're working on that now. We don't know the mechanisms involved, but there's a lot more going on than we thought there was," Sanchez said. "The next steps for us are to find the gene cluster that produces it, and to find how widely used it is."

Credit: 
University of California - Santa Cruz

Study reveals new hope for men with common urinary issues

A new systematic review of evidence recommends the use of behavioral self-management treatments for common urinary issues experienced by upwards of 70 percent of older men. Common symptoms include trouble urinating, increased frequency and incontinence. These symptoms can have a substantial negative impact on sleep, social functioning and quality of life. Several guidelines recommend self-management techniques like health education, advice on fluid intake, and bladder retraining; however, in practice, self-management is often excluded from the menu of treatment options that include medication and surgery.

Researchers at Bond University's Institute for Evidence-Based Healthcare found that self-management interventions reduced the severity of lower urinary tract symptoms. The reduction in symptoms appeared similar in groups receiving medications versus self-management interventions. However, compared with drugs alone, individuals who had both drug and self-management intervention experienced a small but meaningful reduction in symptom severity after six weeks. The authors recommend further research to determine the optimal components and delivery methods for self-management interventions so that these strategies can become standard options for men with lower urinary tract symptoms.

Credit: 
American Academy of Family Physicians

Microscope allows ultrafast nanoscale manipulation while tracking energy dynamics

image: Basic concept of THz-field-driven scanning tunneling luminescence (THz-STL) spectroscopy. Luminescence from a localized plasmon can be induced by THz-field-driven inelastically tunneled electrons.

Image: 
Yokohama National University

Since the early 2010s, ultrafast probing of materials at atomic-level resolution has been enabled by terahertz scanning tunneling microscopes (THz-STM). But these devices can't detect the dissipation of energy that occurs during events such as the emission of a photon through the recombination of an electron-hole pair in a light-emitting diode (LED). A new technique now allows the tracking of just such energy dynamics alongside THz-STM, opening up new avenues of investigation for nanoscale science and technology.

Researchers in Japan have developed a microscopy technique that combines the ability to manipulate the motion of electrons on a femtosecond timescale and to detect a photon at sub-nanometer resolution. The new method offers a new platform for scientists to conduct experiments involving sensing and control of quantum systems, opening new doors for nanoscale science and the development of nanotechnologies.

The team, consisting of scientists at Yokohama National University and RIKEN, published details of the technique in the journal ACS Photonics on January 27th.

The scanning tunneling microscope (STM) was developed in 1981 as an instrument that produces images of surfaces at the atomic level. The technique depends upon the phenomenon of quantum tunneling, in which a particle "tunnels" through an otherwise impenetrable barrier. The surface that is being investigated by the microscope is sensed by a very fine and sharp conducting tip. When the tip approaches the surface, a voltage applied across the tip and the surface allows electrons to tunnel through the vacuum between them. The current produced by this tunneling in turn provides information about the object that can then be translated into a visual image.
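The atomic-scale sensitivity described above comes from the exponential dependence of the tunneling current on the tip-surface separation. In the textbook one-dimensional barrier approximation (a general result, not specific to this study),

$$ I \propto V e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2m\phi}}{\hbar}, $$

where $d$ is the tip-surface distance, $V$ the applied bias, $\phi$ the work function of the surface, and $m$ the electron mass. For a typical work function of a few electronvolts, a change in $d$ of about 0.1 nanometer changes the current by roughly an order of magnitude, which is why the technique resolves individual atoms.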

STM took a big leap forward in the early 2010s with the THz-STM technique, which uses an ultrafast electric-field pulse at the scanning probe tip of an STM to manipulate electrons at a timescale of under a picosecond (a trillionth of a second).

This is great for ultrafast probing of materials at atomic-level resolution, but it can't detect the dissipation of energy that occurs during quantum conversions. These include, for example, electron-photon conversions, which is what happens when an injected electron and hole recombine inside an LED's semiconductor material, releasing a photon. It would be very useful to combine the ultrafast atomic-level resolution of STM with the ability to track such energy-dissipation dynamics.

A technology that can indeed track such dynamics, called scanning tunneling luminescence spectroscopy (STL), measures photons converted by tunneling electrons and has been developed in parallel to THz-STM. STL offers abundant information on photon energy, intensity, polarization and the efficiency of its emission, triggered by electron tunneling.

"But THz-STM and STL had never been combined before in a single set-up," said Jun Takeda of Yokohama National University, who co-led the study. "So we put the two techniques together."

A lens was placed in such a way as to focus THz pulses onto the tip of the STM. Photons produced from these pulses were then collected using a second lens and directed to a photon detector, allowing the desired investigation of the energy dynamics of quantum conversions that occur during STM ultrafast probing of materials at the atomic level.

This revealed an ultrafast excitation of plasmons (surface electrons) at extremely high voltage.

"This excitation in turn could provide a unique new platform for experimentation and exploration of light-matter interactions in a 'plasmonic nanocavity,'" says Ikufumi Katayama, who also co-led the study. A plasmonic nanocavity is a nanometer-scale structure that traps light by means of these surface electrons.

The nanocavity method should allow investigation of energy dynamics resulting from electron tunneling in semiconductors, and in other molecular systems at the timescale of even a femtosecond--a quadrillionth of a second, or the amount of time it typically takes for molecular dynamics, the physical movement of individual atoms or molecules, to occur. This should allow greater sensing and control of quantum systems, providing novel insights and advances in nanoscale technology and science.

Credit: 
Yokohama National University

Immune cell implicated in development of lung disease following viral infection

image: A new study from Washington University School of Medicine in St. Louis implicates a type of immune cell -- called a dendritic cell -- in the development of chronic lung diseases that can follow after a respiratory viral infection. Shown is a stained section of mouse lung. Epithelial cells, which line the airway, are red. Dendritic cells are green. Any cell nuclei are blue.

Image: 
Holtzman Lab

Scientists at Washington University School of Medicine in St. Louis have implicated a type of immune cell in the development of chronic lung disease that sometimes is triggered following a respiratory viral infection. The evidence suggests that activation of this immune cell -- a type of guardian cell called a dendritic cell -- serves as an early switch that, when activated, sets in motion a chain of events that drives progressive lung diseases, including asthma and chronic obstructive pulmonary disease (COPD).

The new study, published in The Journal of Immunology, opens the door to potential preventive or therapeutic strategies for chronic lung disease. More immediately, measuring the levels of these dendritic cells in clinical samples from patients hospitalized with a viral infection, such as influenza or COVID-19, could help doctors identify which patients are at high risk of respiratory failure and death.

Studying mice with a respiratory viral infection that makes the animals prone to developing chronic lung disease, the researchers showed that these dendritic cells communicate with the lining of the airway in ways that cause the airway-lining cells to ramp up their growth and inflammatory signals. The cells grow beyond their normal boundaries and turn into cells that overproduce mucus and drive inflammation, which in turn causes cough and difficulty breathing.

"We're trying to understand how a viral infection that seems to be cleared by the body can nevertheless trigger chronic, progressive lung disease," said senior author Michael J. Holtzman, MD, the Selma and Herman Seldin Professor of Medicine. "Not everyone experiences this progression. We believe there's some switch that gets flipped, triggering the bad response. We're identifying that switch and ways to control it. This work tells us that this type of dendritic cell is sitting right at that switch point."

Holtzman's past work had implicated the lining of the airway -- where the viral infection takes hold -- as the likely trigger for this process.

"But this study suggests that the cascade starts even further upstream," said Holtzman, also director of the Division of Pulmonary and Critical Care Medicine. "Dendritic cells are telling the cells lining the airway what to do. There's more work to be done, but this data tells us that the dendritic cells play an important role in getting the airway-lining cells onto the wrong path."

Holtzman calls this dendritic cell a type of sentinel because its job is to detect an invading virus and trigger the body's initial immune response against the infection. The problem comes when the cell doesn't shut down properly after the threat has passed.

"Many people never develop chronic lung disease after a viral infection," Holtzman said. "But others have a genetic susceptibility to this type of disease. People who are susceptible to virus-triggered disease include patients with asthma, COPD, and viral infections such as COVID-19. It's really critical to look for ways to fix this disease response and prevent the problems that might occur after the virus has gone."

In the meantime, Holtzman said, high levels of these dendritic cells and their products in the lungs of hospitalized patients could serve as a warning to doctors that such patients are likely to develop severe disease and should be provided with respiratory interventions and other supportive therapies that are precisely tailored to their disease process.

"Similarly, if this process is not underway, the patient might be more likely to avoid these types of long-term problems," Holtzman said. "We're pursuing this line of research to help improve prediction of severe lung disease after infection and to provide companion therapies that could prevent this switch from being flipped or flip it back to reverse the disease."

Credit: 
Washington University School of Medicine

Integration analysis of m6A regulators and m6A-related genes in hepatocellular carcinoma

https://doi.org/10.15212/bioi-2021-0002

Announcing a new article publication for BIO Integration journal. In this article, the authors Jingdun Xie, Zhenhua Qi, Xiaolin Luo, Fang Yan, Wei Xing, Weian Zeng, Dongtai Chen and Qiang Li, from Sun Yat-sen University, Guangzhou, Guangdong, China, discuss an integration analysis of m6A regulators and m6A-related genes in hepatocellular carcinoma (HCC).

N6-Methyladenosine (m6A) RNA methylation of eukaryotic mRNA is involved in the progression of various tumors. This study comprehensively analyzed m6A regulators and m6A-related genes through an integrated bioinformatic analysis, including expression, clustering, protein-protein interaction, and prognosis, thus providing novel insights into the roles of m6A regulators and m6A-related genes in HCC.

The analysis identified 192 candidate m6A-related genes and three m6A regulators: YTHDF1, YTHDF2, and YTHDC1. The expression of these genes and regulators differed significantly across HCC stages. Based on Cox regression analysis, 19 of 98 prognostic m6A-related genes were selected to construct a risk score model. The 1- and 3-year areas under the curve (AUCs) among HCC patients were greater than 0.7. By analyzing mutation differences between the high- and low-risk score groups, the authors determined that TP53 had the highest mutation frequency in the high-risk HCC patient group, whereas titin (TTN) had the highest mutation frequency in the low-risk group.
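As a rough illustration of how a gene-expression risk score of this kind works, the sketch below combines hypothetical Cox coefficients with hypothetical expression values and evaluates discrimination with a rank-based AUC. The gene names, coefficients, and patient data are invented for the example; the actual 19-gene signature and its coefficients are given in the paper.

```python
# Sketch of a Cox-style prognostic risk score (hypothetical genes, coefficients,
# and patients -- NOT the 19-gene signature from the paper).

def risk_score(expression, coefficients):
    """Linear predictor: sum of (Cox coefficient * gene expression level)."""
    return sum(coefficients[g] * expression[g] for g in coefficients)

def auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) statistic: the probability that a
    randomly chosen positive case scores higher than a random negative case."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

coeffs = {"GENE_A": 0.8, "GENE_B": -0.5, "GENE_C": 0.3}    # hypothetical
patients = [                                                # (expression, died)
    ({"GENE_A": 2.1, "GENE_B": 0.4, "GENE_C": 1.0}, 1),
    ({"GENE_A": 0.5, "GENE_B": 1.8, "GENE_C": 0.2}, 0),
    ({"GENE_A": 1.9, "GENE_B": 0.2, "GENE_C": 0.8}, 1),
    ({"GENE_A": 0.3, "GENE_B": 1.5, "GENE_C": 0.4}, 0),
]
scores = [risk_score(x, coeffs) for x, _ in patients]
labels = [y for _, y in patients]
print(auc(scores, labels))   # → 1.0 (perfect separation in this toy data)
```

In the study itself, patients are split into high- and low-risk groups by this kind of score, and an AUC above 0.7, as reported for the 1- and 3-year outcomes, indicates useful prognostic discrimination.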

Credit: 
Compuscript Ltd

Sushi-like rolled 2D heterostructures may lead to new miniaturized electronics

image: Image of a heterotube diode: This device contains a MoS2 semiconductor shell (blue), over the insulator hBN shell (purple), over the carbon nanotube core (green) of the heteronanotube covered with gold electrodes (yellow).

Image: 
ELIZABETH FLORES-GOMEZ MURRAY/ PENN STATE

The recent synthesis of one-dimensional van der Waals heterostructures, a type of heterostructure made by layering two-dimensional materials that are one atom thick, may lead to new, miniaturized electronics that are currently not possible, according to a team of Penn State and University of Tokyo researchers.

Engineers commonly produce heterostructures to achieve new device properties that are not available in a single material. A van der Waals heterostructure is one made of 2D materials that are stacked directly on top of each other like Lego-blocks or a sandwich. The van der Waals force, which is an attractive force between uncharged molecules or atoms, holds the materials together.

According to Slava V. Rotkin, Penn State Frontier Professor of Engineering Science and Mechanics, the one-dimensional van der Waals heterostructure produced by the researchers is different from the van der Waals heterostructures engineers have produced thus far.

"It looks like a stack of 2D-layered materials that are rolled up in a perfect cylinder," Rotkin said. "In other words, if you roll up a sandwich, you keep all the good stuff in it where it should be and not moving around, but in this case you also make it a thin cylinder, very compact like a hot-dog or a long sushi roll. In this way, the 2D-materials still contact each other in a desired vertical heterostructure sequence while one needs not to worry about their lateral edges, all rolled up, which is a big deal for making super-small devices."

The team's research, published in ACS Nano, suggests that all 2D materials could be rolled into these one-dimensional heterostructure cylinders, known as hetero-nanotubes. The University of Tokyo researchers recently fabricated electrodes on a hetero-nanotube and demonstrated that it can work as an extremely small diode with high performance despite its size.

"Diodes are a major type of device used in optoelectronics -- they are in the core of photodetectors, solar cells, light emitting devices, etc.," Rotkin said. "In electronics, diodes are used in several specialized circuits; although the main element of electronics is a transistor, two diodes, connected back-to-back, may serve as a switch, too."

This opens a potential new class of materials for miniaturized electronics.

"It brings device technology of 2D materials to a new level, potentially enabling a new generation of both electronic and optoelectronic devices," Rotkin said.

Rotkin's contribution to the project was solving a particularly challenging task: confirming that the one-dimensional van der Waals heterostructure cylinder contained all the required material layers.

"Using the sandwich analogy again, we needed to know whether we had a shell of 'roast beef' along the entire length of a cylindrical sandwich or if there were regions where we have only 'bread' and 'lettuce' shells," Rotkin said. "Absence of a middle insulating layer would mean we failed in device synthesis. My method did explicitly show the middle shells were all there along the entire length of the device."

In regular, flat van der Waals heterostructures, confirming the existence or absence of a given layer is relatively easy because the structures are flat and have a large area. A researcher can use various types of microscopy to collect a strong signal from the large, flat areas, making the layers easily visible. When researchers roll them up, as in a one-dimensional van der Waals heterostructure, the result is a very thin wire-like cylinder that is hard to characterize because it gives off little signal and becomes practically invisible. In addition, to prove the existence of the insulating layer in the diode's semiconductor-insulator-semiconductor junction, one needs to resolve not just the outer shell of the hetero-nanotube but the middle one, which is completely shadowed by the outer shells of molybdenum sulfide semiconductor.

To solve this, Rotkin used a scattering scanning near-field optical microscope, part of the Materials Research Institute's 2D Crystal Consortium, which can "see" objects of nanoscale size and determine their materials' optical properties. He also developed a special method of data analysis, known as hyperspectral optical imaging with nanometer resolution, which can distinguish different materials and thus test the structure of the one-dimensional diode along its entire length.

According to Rotkin, this is the first demonstration of optical resolution of a hexagonal boron nitride (hBN) shell as a part of a hetero-nanotube. Much larger pure hBN nanotubes, consisting of many shells of hBN with no other types of material, were studied in the past with a similar microscope.

"However, imaging of those materials is quite different from what I have done before," Rotkin said. "The beneficial result is in the demonstration of our ability to measure the optical spectrum from the object, which is an inner shell of a wire that is just two nanometers thick. It's comparable to the difference between being able to see a wooden log and being able to recognize a graphite stick inside the pencil through the pencil walls."

Rotkin plans to expand his research to extend hyperspectral imaging to better resolve other materials, such as glass, various 2D materials, and protein tubules and viruses.

"It is a novel technique that will lead to, hopefully, future discoveries happening," Rotkin said.

Credit: 
Penn State

Chinese immigrants face "alarming" barriers to cancer screening, UCF study finds

image: Su-I Hou is a professor and interim chair of the University of Central Florida's Health Management & Informatics Department.

Image: 
University of Central Florida

Language difficulties and cultural barriers keep an "alarming" number of Chinese Americans from asking for cancer screenings that may protect their health, according to a new University of Central Florida study.

Su-I Hou, professor and interim chair of UCF's Health Management & Informatics Department, said her results show that physicians and members of the Chinese community need to improve their communication about the importance of cancer screenings. Her study was recently published in the Asian Pacific Journal of Cancer Prevention.

Hou surveyed 372 Chinese adults who attended churches providing services in Mandarin in the U.S. and Taiwan to assess their knowledge of cancer risks and whether they asked their primary care provider about cancer screenings. She found that Chinese adults seldom asked about screenings - even if they had family cancer history or had been a primary caregiver for someone with cancer. Chinese adults with family cancer history asked about cancer screenings only 32.2% of the time, and those without family cancer history asked only 21.5% of the time. Chinese cancer caregivers asked about cancer screenings 48.3% of the time. Non-caregivers asked only 23.4% of the time.

Hou said age was a factor in whether Chinese adults questioned their doctors about cancer screenings. Even though cancer risks increase with age and cancer screening communication is higher among the older age group, she said older Chinese participants may be the most reluctant to ask about screenings. She attributed this to traditional Chinese culture, in which questioning an expert can be regarded as disrespectful. She said younger Chinese adults may be more willing to question their healthcare provider - and thus are more likely to be their own healthcare advocates.

She said analysis of patient-provider communication was overall low and statistically similar between Chinese adults living in Taiwan and those who had immigrated to the United States.

"Chinese culture and language difficulties are deterrents to open dialogue between these patients and their providers," she said. "The Chinese culture teaches respect to authority and in our culture, the doctor is the authority. To question a physician is seen as a sign of disrespect."

The disconnect in patient-provider communication is especially alarming, she said, because cancer is the leading cause of death among Chinese Americans and that risk is growing as they adopt a more Western lifestyle, including unhealthy eating habits.

She centered her research on churches, noting that many new Chinese immigrants join a faith-based organization with services in Mandarin as a way to meet fellow Chinese and engage in an environment where their native language is spoken. She said such churches can help spread the word about the importance of cancer screenings and health advocacy and can help teach parishioners how to request screenings from their providers.

She said it's also incumbent upon providers to understand that their Chinese patients may be reluctant to be their own healthcare advocates. For that reason, she said, providers need to bring up the topic of cancer screening - even if their patients do not. Many of the survey participants said they did not ask about cancer screenings because their provider did not raise the subject.

Hou is part of UCF's Population Health Consortium, which includes interdisciplinary faculty from across the university who are dedicated to implementing best practices and eliminating health disparities.

"We all need to work together to recognize the barriers to providing quality healthcare to all," she said. "This study is another example of the importance of understanding cultural norms as we provide care."

Credit: 
University of Central Florida

Citizens and scientists release 28-year record of water quality in Buzzards Bay

image: Aerial view of The Knob and Quissett Harbor in Woods Hole, Mass., with greater Buzzards Bay in the distance. Buzzards Bay is approximately 28 miles long by 8 miles wide and is a popular destination for fishing, boating, and tourism.

Image: 
Fish Hawk Films

WOODS HOLE, Mass. -- A long-lasting, successful relationship between scientists at the MBL Ecosystems Center and the citizen-led Buzzards Bay Coalition has garnered a long-term record of water quality in the busy bay that lies west of Woods Hole. That record has already returned tremendous value and last week, it was published in Scientific Data, a Nature journal.

"We hope getting this data out will encourage scientists to use it to test new hypotheses and develop new insights into Bay health," said Rachel Jakuba, science director of the Buzzards Bay Coalition and lead author of the journal article.

Since 1992, a large and dedicated team of citizen volunteers, dubbed Baywatchers, has been collecting water samples from more than 200 sites along the coast of Buzzards Bay. The samples have been analyzed at the Marine Biological Laboratory (MBL) since 2008 under the direction of Chris Neill, a former MBL scientist who is now at Woodwell Climate Research Center, and MBL Senior Research Assistant Richard McHorney. The goal is to document the effects of nitrogen pollution in the Bay, including low oxygen levels that threaten marine life, in order to inform policies to improve Bay health.

"Baywatchers data directly influence policy by documenting impaired waters, making the public aware of long-term water quality trends, and importantly, documenting how water quality improves when communities upgrade water infrastructure, like fixing antiquated wastewater treatment plants," said Neill. "They also show the Bay's waters are warming rapidly."

The main sources of nitrogen pollution in the 430-square-mile Bay are private septic systems and underperforming wastewater treatment plants. Collaborations such as the MBL-Buzzards Bay Coalition's are essential to move science toward societal solutions.

"Scientists can provide information on the causes and consequences of excess nitrogen loading and suggest alternatives, while citizens groups can push for action and help bring together citizens, regulators, and policy makers to achieve a solution," said MBL Ecosystems Director Anne Giblin. Giblin and MBL Senior Scientist Ivan Valiela were among a group of scientists who helped the Coalition formulate and establish the Baywatchers program in the early 1990s.

Baywatchers data have been used to identify nearly 30 bodies of water around the Bay that do not meet federal standards under the Clean Water Act, evaluate wastewater discharge permits, support the development of targets for reduction of nitrogen pollution, and develop strategies for reaching those goals. And the Baywatchers program itself elevates public awareness and generates support for actions to control nutrient pollution and improve water quality.

"With a program like Baywatchers, every one of those citizen volunteers not only collects samples, they go out and talk to their friends about the nitrogen issue. That is a huge public education benefit. By making sure those volunteers are well educated in the scientific facts, you get this tremendous informal education program going," Giblin said.

Baywatchers is one of the largest and longest-running water quality monitoring programs in the country, and its dataset on water quality in Buzzards Bay keeps growing.

"Over the past 30 years, the Coalition has prioritized our commitment to comprehensive water quality monitoring above all else - placing sound science at the core of our work and successes in restoring and protecting the Bay. It is a function that continues to develop as we expand the density of our monitoring stations, parameters measured, methods for collection, and scientific collaborations. Making our entire dataset available through peer-reviewed publication is an important step and I'm indebted to the many scientists, citizens, and funders who got us to this milestone," said Mark Rasmussen, president of the Buzzards Bay Coalition.

Credit: 
Marine Biological Laboratory

Updates on the Baylor cranial gunshot wound prognosis score

image: Rates of mortality and good functional outcome (Glasgow Outcome Scale score of 4 or 5), segmented by Baylor score. From Yengo-Kahn et al.

Image: 
Copyright 2021 AANS.

CHARLOTTESVILLE, VA (MARCH 9, 2021). In 2014, the Journal of Neurosurgery published a paper by a group of researchers from Baylor College of Medicine in Houston, who developed a prognostic scoring system for use in patients who present to the emergency department with a gunshot wound to the head (GSWH).[1]

Today, we publish two papers by a group of researchers at Vanderbilt University Medical Center that extend our understanding of the Baylor GSWH scoring system and its application, externally validating it in a different group of patients presenting during a more recent time period in which better acute management techniques are available.

Background

The Baylor prognostic scoring system is a tool used when a patient with a GSWH presents at the emergency department to predict in-hospital survival and outcomes 6 months after injury. Baylor scores are determined using the following variables: the patient's age, neurological status (based on both the patient's pupillary response and Glasgow Coma Scale score at the time of hospital admission), and the trajectory of the bullet within the brain.

Baylor scores range from 0 to 5. One point each is assigned to age greater than 35 years, a Glasgow Coma Scale score of 3 or 4, and bilateral nonreactive pupils. Two points are added if the bullet trajectory passed through the posterior fossa or involved both hemispheres of the brain. Patients determined at hospital admission to have a Baylor score of 0 are more likely to have a good functional outcome (determined as a Glasgow Outcome Scale score of 4 or 5) 6 months after injury; patients with a Baylor score of 5 are unlikely to survive their hospital stay.
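The scoring rules described above are simple enough to sketch directly. The function below is an illustrative transcription of those rules, not clinical software, and the example inputs are hypothetical.

```python
# Sketch of the Baylor cranial gunshot wound prognosis score, transcribed from
# the scoring rules described above (illustrative only, not clinical software).

def baylor_score(age, gcs, bilateral_nonreactive_pupils,
                 posterior_fossa_or_bihemispheric):
    """Return the 0-5 Baylor score from the four admission variables."""
    score = 0
    if age > 35:                            # one point: age greater than 35
        score += 1
    if gcs in (3, 4):                       # one point: Glasgow Coma Scale 3 or 4
        score += 1
    if bilateral_nonreactive_pupils:        # one point: bilateral nonreactive pupils
        score += 1
    if posterior_fossa_or_bihemispheric:    # two points: trajectory through the
        score += 2                          # posterior fossa or both hemispheres
    return score

# A hypothetical worst-case presentation yields the maximum score of 5:
print(baylor_score(age=50, gcs=3, bilateral_nonreactive_pupils=True,
                   posterior_fossa_or_bihemispheric=True))   # → 5
```

A patient with a score of 0 on these four variables falls into the group most likely to attain a good functional outcome, while a score of 5 predicts in-hospital death.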

The initial paper on the Baylor scoring system involved 199 patients in Houston who were treated for a GSWH with penetration of the dura mater between 1990 and 2008. Although the system was validated internally at the time, since then there has been no published external validation in a separate group of patients.

Current Studies

Article 1. In the first of the new articles, "The value of simplicity: externally validating the Baylor cranial gunshot wound prognosis score", Aaron M. Yengo-Kahn, MD, and colleagues sought external validation of Baylor scores by reviewing the cases of all patients who had been admitted to Vanderbilt University Medical Center for a GSWH with dural penetration between January 1, 2009, and June 30, 2019. This group of 297 patients was "institutionally, geographically, and temporally distinct from the 1990-2008 cohort treated at Ben Taub General Hospital, which was used to develop the Baylor score."

The authors point out that many improvements in acute trauma management were made between the earlier Baylor group of patients and the current Vanderbilt group. Nevertheless, they hypothesized that the Baylor scoring system could still accurately predict in-hospital survival and functional outcome 6 months later.

Variables reviewed in the Vanderbilt cases included patients' vital signs, pupil responses, and Glasgow Coma Scale scores at the time of hospital admission; patients' demographics, medical histories, and laboratory and neuroimaging reports; as well as the intent of injury (accident, suicide, etc.).

Patients' Baylor scores were calculated using the method described above. The ability of the Baylor score to predict mortality and functional outcome was assessed by determining the receiver operating characteristic curve and the area under the curve (AUC).

Of the 297 patients in the Vanderbilt group, 205 patients (69%) died and 69 patients (23%) attained good functional outcome.

Among the Vanderbilt population, the authors state, "Overall, the Baylor score showed excellent discrimination of mortality (AUC = 0.88) and good functional outcome (AUC = 0.90)," despite the fact that Baylor scores 3-5 underestimated in-hospital mortality and scores 0, 1, and 2 underestimated good functional outcome at 6 months.

The authors discuss other GSWH prognostic tools, but those tools are more complex and may not cover both survival and long-term functional status. In a clinical setting, a simple tool such as the Baylor score, which relies on just four variables and predicts both survival and functional outcome, is enormously useful.

When asked about this article, Dr. Yengo-Kahn responded,

"The beauty of the Baylor score is the ease with which it can be applied. Frequently, trauma and neurologic surgeons face difficult decisions about how to convey the gravity of these patients' prognoses early in the hospital course. There is always the hope, especially, before the score was validated, that a patient will surprise us by "outperforming" their imaging- and exam-based prognosis, but this can be unproductive. This "false hope" may be at the cost of a substantial amount of resources and to the detriment of the patient (life prolonged in an undesirable state) and the family (standing witness to continued hospitalization, procedures, etc.), without a reasonable chance for survival. Now that the Baylor score has been validated, it may serve as a grounding force for family discussions and ensuring all treatment teams have a clear, objective, understanding of a patient's prognosis when considering additional treatments."

Article 2. The second paper, "Incorporating conditional survival into prognostication for gunshot wounds to the head," by Patrick D. Kelly, MD, MSCI, and colleagues, examines data from the same patient population, but instead of applying the Baylor score at the time of admission, the authors applied it 48 hours later, in patients who had survived "the acute phase" of treatment. At that time point, 129 of the 297 patients were alive.

The authors state, "conditional survival is defined as the probability of survival as a function of the amount of time a patient has already survived. This concept is often used to explain how prognostic factors evolve over time."

Forty-two (33%) of the 129 patients who survived the first 48 hours of hospitalization later died of their injuries; the mortality rate in this subgroup was less than half that in the entire patient population. Sixty-two patients (48%) in the subgroup attained good functional outcomes; this good outcome rate was more than twice that in the entire patient population.
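The conditional-survival idea can be checked directly against the figures reported above: the probability of surviving to a later time, given survival to an earlier one, is the ratio of the two survival probabilities.

```python
# Conditional survival sketch using the figures reported above:
# S(t2 | alive at t1) = S(t2) / S(t1)

total_patients = 297
alive_at_48h = 129        # survived the "acute phase"
died_after_48h = 42       # of those, died later of their injuries

s_48h = alive_at_48h / total_patients                  # ≈ 0.43 survived 48 hours
mortality_after_48h = died_after_48h / alive_at_48h    # mortality among survivors

print(round(mortality_after_48h, 2))   # → 0.33

# Overall in-hospital mortality was 205/297 ≈ 69%; conditioning on 48-hour
# survival cuts it to ~33% -- less than half, as the study reports.
```

This is why the authors argue for re-scoring at 48 hours: the same Baylor score implies a different outlook once a patient has already survived the acute phase.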

The 48-hour time point marked a "significant change in conditional survival and functional outcome"; patients in the study who survived past that time had significantly less severity of injury and a higher rate of neurosurgical interventions. Unfortunately, 18 to 25 days of hospitalization marked another change in patients with GSWHs--toward worse rates of good functional outcome.

According to the findings, "Among acute-phase survivors, the Baylor score accurately predicted mortality with an AUC of 0.7749 ... and good functional outcome with an AUC of 0.8146." Nevertheless, Baylor scores did overestimate the true mortality rates and underestimated the true rates of good functional outcome among acute-phase survivors in this study.

The authors believe that the Baylor scoring system can be a valuable tool for assessing GSWH patients at the 48-hour time point, bearing in mind the scores' tendency to overestimate mortality and underestimate good functional outcome. They point out the importance of re-evaluating prognosis in patients still living at the 48-hour point, because prognostic scores based on arrival at the emergency department may no longer be as accurate in that group.

When asked about this article, Mr. Pious D. Patel responded,

"Traditional prognostic scores are great at estimating outcome at a single point in time. The Baylor score, for example, does this at the point of arrival to the emergency department. This can be invaluable on the first day, when counseling family members on their loved one's chances for meaningful recovery. But what can we tell the family on day 2, day 3, or day 10 of the hospital stay? This study is a step forward to answering this question by showing that GSWH patients' expected outcomes change with continued survival in the hospital. It is therefore important to keep reassessing prognosis during the inpatient stay. After 48 hours in the hospital, we can say from this study that patients' odds of survival and of good long-term functional outcome are doubled when compared to the time point of arrival to the hospital. With more data and further study, we envision the creation of a prognostic score that can be adjusted with continued time in the hospital to help communicate this changing prognosis to the family."

Credit: 
Journal of Neurosurgery Publishing Group

'Wearable microgrid' uses the human body to sustainably power small gadgets

video: This shirt harvests and stores energy from the human body to power small electronics. UC San Diego nanoengineers call it a "wearable microgrid"--it combines energy from the wearer's sweat and movement to provide sustainable power for wearable devices.

Image: 
UC San Diego Jacobs School of Engineering

Nanoengineers at the University of California San Diego have developed a "wearable microgrid" that harvests and stores energy from the human body to power small electronics. It consists of three main parts: sweat-powered biofuel cells, motion-powered devices called triboelectric generators, and energy-storing supercapacitors. All parts are flexible, washable and can be screen printed onto clothing.

The technology, reported in a paper published Mar. 9 in Nature Communications, draws inspiration from community microgrids.

"We're applying the concept of the microgrid to create wearable systems that are powered sustainably, reliably and independently," said co-first author Lu Yin, a nanoengineering Ph.D. student at the UC San Diego Jacobs School of Engineering. "Just like a city microgrid integrates a variety of local, renewable power sources like wind and solar, a wearable microgrid integrates devices that locally harvest energy from different parts of the body, like sweat and movement, while containing energy storage."

The wearable microgrid is built from a combination of flexible electronic parts that were developed by the Nanobioelectronics team of UC San Diego nanoengineering professor Joseph Wang, who is the director of the Center for Wearable Sensors at UC San Diego and corresponding author on the current study. Each part is screen printed onto a shirt and placed in a way that optimizes the amount of energy collected.

Biofuel cells that harvest energy from sweat are located inside the shirt at the chest. Devices that convert energy from movement into electricity, called triboelectric generators, are positioned outside the shirt on the forearms and sides of the torso near the waist. They harvest energy from the swinging movement of the arms against the torso while walking or running. Supercapacitors outside the shirt on the chest temporarily store energy from both devices and then discharge it to power small electronics.

Harvesting energy from both movement and sweat enables the wearable microgrid to power devices quickly and continuously. The triboelectric generators provide power right away as soon as the user starts moving, before breaking a sweat. Once the user starts sweating, the biofuel cells start providing power and continue to do so after the user stops moving.

"When you add these two together, they make up for each other's shortcomings," Yin said. "They are complementary and synergistic to enable fast startup and continuous power." The entire system boots two times faster than having just the biofuel cells alone, and lasts three times longer than the triboelectric generators alone.

The wearable microgrid was tested on a subject during 30-minute sessions that consisted of 10 minutes of either exercising on a cycling machine or running, followed by 20 minutes of resting. The system was able to power either an LCD wristwatch or a small electrochromic display--a device that changes color in response to an applied voltage--throughout each 30-minute session.

Greater than the sum of its parts

The biofuel cells are equipped with enzymes that trigger a swapping of electrons between lactate and oxygen molecules in human sweat to generate electricity. Wang's team first reported these sweat-harvesting wearables in a paper published in 2013. Working with colleagues at the UC San Diego Center for Wearable Sensors, they later updated the technology to be stretchable and powerful enough to run small electronics.

The triboelectric generators are made of a negatively charged material, placed on the forearms, and a positively charged material, placed on the sides of the torso. As the arms swing against the torso while walking or running, the oppositely charged materials rub against each other and generate electricity.

Each wearable provides a different type of power. The biofuel cells provide continuous low voltage, while the triboelectric generators provide pulses of high voltage. In order for the system to power devices, these different voltages need to be combined and regulated into one stable voltage. That's where the supercapacitors come in; they act as a reservoir that temporarily stores the energy from both power sources and can discharge it as needed.

Yin compared the setup to a water supply system.

"Imagine the biofuel cells are like a slow flowing faucet and the triboelectric generators are like a hose that shoots out jets of water," he said. "The supercapacitors are the tank that they both feed into, and you can draw from that tank however you need to."

All of the parts are connected with flexible silver interconnections that are also printed on the shirt and insulated by waterproof coating. The performance of each part is not affected by repeated bending, folding and crumpling, or washing in water--as long as no detergent is used.

The main innovation of this work is not the wearable devices themselves, Yin said, but the systematic and efficient integration of all the devices.

"We're not just adding A and B together and calling it a system. We chose parts that all have compatible form factors (everything here is printable, flexible and stretchable); matching performance; and complementary functionality, meaning they are all useful for the same scenario (in this case, rigorous movement)," he said.

Other applications

This particular system is useful for athletics and other cases where the user is exercising. But this is just one example of how the wearable microgrid can be used. "We are not limiting ourselves to this design. We can adapt the system by selecting different types of energy harvesters for different scenarios," Yin said.

The researchers are working on other designs that can harvest energy while the user is sitting inside an office, for example, or moving slowly outside.

Credit: 
University of California - San Diego

Breaking waves and moisture transport drive extreme precipitation events

image: Despite being located in one of the driest regions of the world--the Atacama Desert in Northern Chile--the usually dry Copiapó River has flooded several times during the 19th and 20th centuries. Sedimentary deposits from the 2017 floods are shown here. The 2015 Atacama floods, analyzed in this paper, were among the worst ever recorded in the region, killing 178 people.

Image: 
Manu Abad via Imaggeo

MUNICH -- Around the world each year, extreme precipitation events cause catastrophic flooding that results in tragic loss of life and costly damage to infrastructure and property. However, a variety of different weather systems can cause these extreme events, so a detailed understanding of the atmospheric processes that lead to their formation is crucial.

Now, for the first time, a global analysis reveals that two intertwined atmospheric processes drive the formation of many large-scale extreme precipitation events around the world, particularly in dry subtropical regions where they can inflict catastrophic flooding, as occurred in March 2015 in the Atacama Desert.

Previous research on extreme precipitation events has mostly focused on wet regions, where cyclones are typically responsible for these events, whereas dry subtropical regions have been less studied. However, it is precisely these dry subtropical regions, including deserts, "where these mysterious events are least expected, but can cause devastating impacts," says Andries-Jan de Vries, an atmospheric scientist at ETH Zürich and the Max Planck Institute for Chemistry in Mainz, Germany, who authored the new study.

The results, published in the European Geosciences Union (EGU) journal Weather and Climate Dynamics, improve our understanding of atmospheric processes and weather systems that lead to extreme precipitation events. This, in turn, could help improve forecasts, perhaps leading to the development of early warning systems that could save lives.

The results could also improve our understanding of how these extreme events will respond to climate change. The intensity and frequency of these heavy rainfall events have been increasing in recent decades, and the trend is projected to continue under global warming.

Breaking Waves and Moisture Transports

This study highlights the role of two atmospheric processes in the formation of extreme precipitation events: the breaking of Rossby waves and intense moisture transport.

Rossby waves, also called planetary waves because they arise due to Earth's rotation, are waves occurring in the ocean and atmosphere that were first discovered in the 1930s by Carl Rossby. In the atmosphere, Rossby waves determine to a large extent the weather in midlatitude regions. Due to nonlinear processes, Rossby waves can amplify and eventually break (similar to ocean waves moving onshore).

Intense moisture transport refers to large masses of water vapor moving horizontally in the atmosphere. The process has been linked to extreme precipitation and flooding, often along the western coasts of continents. When the moisture transport appears as an elongated structure reaching lengths of several thousand kilometers, it is better known as an "atmospheric river."

"When Rossby waves amplify and break, cold-air masses intrude from high latitudes into lower latitudes, and vice versa," De Vries says. "This atmospheric process can drive intense moisture transport, destabilize the troposphere, and force air masses to ascend, which together favor the formation of extreme precipitation."

One key finding of the study is that the severity of the extreme precipitation is strongly influenced by the characteristics of the two atmospheric processes. "The stronger the wave breaking and the more intense the moisture transport, the larger the precipitation volumes," De Vries says.

Extreme Precipitation and Catastrophic Flooding

De Vries analyzed daily extreme precipitation events occurring around the world between 1979 and 2018. The analysis focused on larger-scale events and did not consider very local, short-duration heavy rainfall, which is typically caused by individual thunderstorms.

He found that Rossby wave breaking can explain more than 90 percent of extreme precipitation events over central North America and the Mediterranean. Over coastal zones, however, more than 95 percent of the extreme precipitation events were driven by intense moisture transport, which is consistent with the findings of previous studies on atmospheric rivers.
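The attribution percentages above can be illustrated with a toy calculation. The sketch below is purely hypothetical: the event records and the `attribution_fraction` helper are invented for illustration and do not reflect the study's actual data or methodology, which relied on four decades of reanalysis fields.

```python
# Hypothetical sketch: computing the fraction of extreme-precipitation
# events in which a given atmospheric driver was present.
# Event records here are invented for illustration only.

events = [
    # (region, Rossby wave breaking present, intense moisture transport present)
    ("Mediterranean", True, False),
    ("US West Coast", False, True),
    ("Atacama Desert", True, True),
    ("Central North America", True, False),
]

def attribution_fraction(events, has_driver):
    """Fraction of events for which the given driver predicate holds."""
    return sum(1 for e in events if has_driver(e)) / len(events)

wave = attribution_fraction(events, lambda e: e[1])          # wave breaking
moisture = attribution_fraction(events, lambda e: e[2])      # moisture transport
combined = attribution_fraction(events, lambda e: e[1] and e[2])

print(f"wave breaking: {wave:.0%}, "
      f"moisture transport: {moisture:.0%}, "
      f"combined: {combined:.0%}")
```

In the study itself, such fractions were computed per region, which is how wave breaking came to explain over 90 percent of events in some areas while moisture transport dominated along coasts.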

One of the most interesting findings was the discovery of locations where the two processes combine to drive the extreme events. "Importantly, the combined occurrence of these two atmospheric processes can explain up to 70 percent of extreme precipitation events in regions where one would expect them the least--the dry subtropics," De Vries says. "Breaking waves that reach from the midlatitudes unusually far towards the equator can draw moisture from the humid tropics into the dry subtropics, which feeds the heavy rainfall."

The study further demonstrated that the combined processes played a key role in 12 historic extreme precipitation events that resulted in catastrophic flooding, thousands of fatalities and injuries, billions of dollars in damage, and sustained socioeconomic impacts lasting well beyond the flooding event. These floods included the Natal, South Africa, floods of September 1987; the Alpine floods in October 2000; the Uttarakhand, India, floods in June 2013; the Colorado floods of September 2013; and the Atacama Desert floods in March 2015.

Credit: 
European Geosciences Union