Risk of extinction cascades from freshwater mussels to a bitterling fish

image: A freshwater fish distributed in Japan, South Korea, and China, listed as near threatened on the Japanese Red List.

Image: 
Hiroki Hata (Ehime University)

Bitterling fishes (subfamily Acheilognathinae) spawn in the gills of living freshwater mussels, depending obligately on the mussels for reproduction. On the Matsuyama Plain, Japan, populations of unionid mussels--Pronodularia japanensis, Nodularia douglasiae, and Sinanodonta lauta--have decreased rapidly over the past 30 years. Simultaneously, the population of a native bitterling fish, Tanakia lanceolata, which depends on the three unionids as a breeding substrate, has decreased. Furthermore, a congeneric bitterling, Tanakia limbata, has been artificially introduced, and hybridisation and genetic introgression occur between the two species. Here, we surveyed the reproduction of, and occurrence of hybridisation between, the native and invasive bitterling species. We collected mussels in which these bitterlings lay their eggs, kept them separately in aquaria, collected the eggs and larvae ejected from the mussels, and genotyped them using six microsatellite markers and mitochondrial cytochrome b sequences.

The introduced T. limbata was more abundant, had a longer breeding period, and produced more juveniles than the native T. lanceolata. Hybrids between the two species occurred frequently: in total, 101 of the 837 juveniles genotyped were hybrids. The density of P. japanensis was low, at most 0.42 individuals/m2. Nodularia douglasiae and S. lauta have nearly or entirely disappeared from these sites. Hybrid clutches of the Tanakia species occurred more frequently where the local density of P. japanensis was low. The mussels were apparently overused, being exploited simultaneously by three species of bitterlings.
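The genotyping counts reported above can be turned into a hybridisation rate with a line of arithmetic; a minimal check:

```python
# Counts from the study: 101 hybrids among 837 genotyped juveniles
hybrids = 101
juveniles_genotyped = 837
hybrid_fraction = hybrids / juveniles_genotyped
assert round(hybrid_fraction * 100) == 12  # roughly 12% of juveniles were hybrids
```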

The decline of freshwater unionid populations has heightened hybridisation between native and invasive bitterling fishes by increasing competition for a breeding substrate. We showed that the rapid decline of host mussel species and the introduction of an invasive congener have interacted to cause a rapid decline of the native bitterling fish. Habitat degradation and the introduction of invasive species thus interact to cause a cascade of extinctions among native species. In our study, an obligate parasite is threatened because its host species are disappearing, posing a serious risk of coextinction.

Credit: 
Ehime University

Convex to concave: More metasurface moiré results in wide-range lens

image: Schematic drawing of working principle

Image: 
Kentaro Iwami/ TUAT

The odd, wavy pattern that results from viewing certain phone or computer screens through polarized glasses has led researchers to take a step toward thinner, lighter-weight lenses. Called moiré, the pattern is made by laying one material with opaque and translucent parts at an angle over another material of similar contrast.

A team of researchers from Tokyo University of Agriculture and Technology (TUAT) in Japan has demonstrated that moiré metalenses--tiny, patterned lenses composed of artificial "meta" atoms--can tune focal length over a wider range than previously seen. The team published its results on November 23 in Optics Express, a journal of The Optical Society.

"Metalenses have attracted a lot of interest because they are so thin and lightweight, and could be used in ultra-compact imaging systems, like future smart phones, virtual reality goggles, drones or microbots," said paper author Kentaro Iwami, associate professor in the TUAT Department of Mechanical Systems Engineering.

The problem, Iwami said, is that to keep metalenses compact enough for the desired applications, they have had a limited focal tuning range. Focal length, measured in millimeters, determines the angle of view and the strength of magnification, and is dictated by the lens shape.

A convex lens, which has a positive focal length, brings light rays to a single point, while a concave lens, with a negative focal length, disperses the light rays. When combined in varifocal lenses, the result is a more complete, sharper image--but tuning the focal length from negative to positive in something as compact as a metalens is tricky, according to Iwami.

"We found that wide-focal length tuning from convex to concave can be achieved by rotational moiré metalenses," Iwami said.

The researchers developed metalenses with high-contrast artificial "meta" atoms composed of amorphous silicon octagonal pillars. When they overlaid one metalens over the other, creating the moiré pattern, and rotated them, they could use infrared light to tune the focal length of the lenses.
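In published moiré-lens designs, the combined phase of the two rotated patterns acts as a lens whose optical power grows roughly linearly with the mutual rotation angle, so the focal length goes as 1/θ and flips sign with the rotation direction. A minimal sketch of that relation (the constant k is a hypothetical design parameter, not a value from the TUAT paper):

```python
import math

def moire_focal_length_mm(theta_deg, k=1000.0):
    """Illustrative moire-lens relation f = k / theta: optical power is
    proportional to the mutual rotation angle. k (mm*deg) is a
    hypothetical design constant, not taken from the paper."""
    if theta_deg == 0:
        return math.inf  # no rotation: no net lensing
    return k / theta_deg

# Rotating one way yields a convex lens (positive focal length)...
assert moire_focal_length_mm(10.0) > 0
# ...and the opposite way a concave lens (negative focal length).
assert moire_focal_length_mm(-10.0) < 0
# Doubling the angle halves the focal length.
assert moire_focal_length_mm(20.0) == moire_focal_length_mm(10.0) / 2
```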

Next the researchers plan to demonstrate wide-focal length tuning at a visible wavelength, and improve the quality of the lens, with the ultimate goal of realizing an ultra-compact imaging system.

Credit: 
Tokyo University of Agriculture and Technology

Alert system shows potential for reducing deforestation, mitigating climate change

CORVALLIS, Ore. - Forest loss declined 18% in African nations where a new satellite-based program provides free alerts when it detects deforestation activities.

A research collaboration that included Jennifer Alix-Garcia of Oregon State University found that the Global Land Analysis and Discovery System, known as GLAD, resulted in carbon sequestration benefits worth hundreds of millions of dollars in GLAD's first two years.

Findings were published today in Nature Climate Change.

The premise of GLAD is simple: Subscribe to the system, launch a free web application, receive email alerts when the GLAD algorithm detects deforestation in progress, and then take action to save forests.

GLAD, launched in 2016, delivers alerts created by the University of Maryland's Global Land Analysis and Discovery lab based on high-resolution satellite imaging from NASA's Landsat Science program. The information is made available to subscribers via the interactive web application, Global Forest Watch.

"Before GLAD, government agencies and other groups in the business of deforestation prevention had to lean on reports from volunteers or forest rangers," Alix-Garcia said. "Obviously the people making those reports can't be everywhere, which is a massive limitation for finding out about deforestation activity in time to prevent it."

Changes in land use make a huge difference in how much carbon dioxide reaches the atmosphere and warms the planet, said Alix-Garcia, an economist in OSU's College of Agricultural Sciences.

"Reforestation is good, but avoiding deforestation is way better - almost 10 times better in some instances," she said. "That's part of why cost-effective reduction of deforestation ought to be part of the foundation of global climate change mitigation strategies."

Deforestation, Alix-Garcia adds, is a key factor behind the 40% increase in atmospheric carbon dioxide since the dawn of the industrial age, which in turn is contributing heavily to a warming planet. According to the National Oceanic and Atmospheric Administration, the global average atmospheric carbon dioxide concentration in 2018 was 407.4 parts per million, higher than at any time in at least 800,000 years.

The annual rate of increase in atmospheric CO2 over the past six decades is roughly 100 times faster than increases resulting from natural causes, such as those that happened following the last ice age more than 10,000 years ago, according to NOAA.

Alix-Garcia, study leader Fanny Moffette of the University of Wisconsin and collaborators at the University of Maryland and the World Resources Institute looked at deforestation in 22 nations in the tropics in South America, Africa and Asia between 2011 and 2018 - the last five years before GLAD and first two years after.

In Africa, the results were telling: Compared to the prior five years, the first two years of GLAD showed 18% less forest loss where forest protectors were subscribing to the system.

Using a concept known as the social cost of carbon - the marginal cost to society of each additional metric ton of greenhouse gas that reaches the atmosphere - researchers estimate the alert system was worth between $149 million and $696 million in Africa those two years.
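The valuation above follows directly from the social-cost-of-carbon definition: multiply the avoided emissions by the marginal cost per tonne. A minimal sketch with hypothetical inputs (the study's own tonnage and per-tonne figures are not given in this summary):

```python
def carbon_benefit_usd(avoided_tco2, scc_usd_per_tco2):
    """Social-cost-of-carbon valuation: each avoided tonne of CO2 is
    credited at the marginal damage cost per tonne."""
    return avoided_tco2 * scc_usd_per_tco2

# Hypothetical illustration: 5 million tonnes of avoided CO2 at $50/tonne
value = carbon_benefit_usd(5_000_000, 50)
assert value == 250_000_000  # $250 million, within the study's reported range
```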

There was no substantial change in deforestation in Asia or South America, but the possible explanations are numerous and suggest GLAD could make a greater difference in those places in years to come, the researchers say.

"We think that we see an effect mainly in Africa due to two main reasons," Moffette said. "One is because GLAD added more to efforts in Africa than on other continents, in the sense that there was already some evidence of countries using monitoring systems in countries like Indonesia and Peru. And Colombia and Venezuela, which are a large part of our sample, had significant political unrest during this period."

The GLAD program is still young and as more groups sign up to receive alerts and decide how to intervene in deforestation, the system's influence may grow, she added.

"Now that we know subscribers of alerts can have an effect on deforestation, there are ways in which our work can potentially improve the training the subscribers receive and support their efforts," Moffette said.

Credit: 
Oregon State University

Inflammation from ADT may cause fatigue in prostate cancer patients

TAMPA, Fla. -- Prostate cancer is one of the most common cancers among men in the U.S. For many patients, hormone therapy is a treatment option. This type of therapy, also called androgen deprivation therapy (ADT), reduces the level of testosterone and other androgens in the body. Lowering androgen levels can make prostate cancer cells grow more slowly or shrink tumors over time. However, patients receiving ADT often experience higher levels of fatigue, depression and cognitive impairment.

Moffitt Cancer Center researchers are investigating whether inflammation in the body, a side effect of ADT, contributes to these symptoms in prostate cancer patients. In a new study published in the journal Cancer, they pinpoint a specific inflammation marker that is associated with increased fatigue in this group of patients.

"This is the first study that we know of that examines the association between inflammation and symptoms of fatigue, depression or cognitive impairment in prostate cancer patients receiving ADT," said Heather Jim, Ph.D., corresponding author and co-leader of the Health Outcomes & Behavior Program at Moffitt. "Because the blocking of testosterone can increase inflammation in the body, we believe that inflammation may also be contributing to these symptoms."

For the study, the research team evaluated two groups of men: prostate cancer patients beginning ADT and a control group of healthy men the same age. The men were assessed at the start of the study and again at six and 12 months. Assessments included measures of fatigue, depression and other neuropsychological tests, plus a blood draw. The bloodwork checked for circulating markers of inflammation, specifically interleukin-1 receptor antagonist (IL-1RA), interleukin-6 (IL-6), soluble tumor necrosis factor receptor-2 (sTNF-R2) and C-reactive protein (CRP).

While the groups did not differ at baseline, researchers noticed a significant increase in fatigue and depressive symptoms in the ADT patients over the 12-month period. They also saw an increase in one inflammation marker, IL-6, in this group of patients.

"Interleukin-6 is a pro-inflammatory cytokine that is often associated with disruption of sleep and therefore fatigue," said Aasha Hoogland, Ph.D., lead study author and an applied research scientist in the Health Outcomes & Behavior Program at Moffitt. "Studies have shown testosterone can suppress the effects of IL-6, but ADT limits testosterone production in the body, which is why we may be seeing increased levels in this patient group."

The researchers say additional studies are needed to see if interventions, such as anti-inflammatory medications and exercise, can help alleviate fatigue and depressive symptoms in ADT patients.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

Supercapacitors challenge batteries

image: A graphene hybrid made from metal-organic frameworks (MOFs) and graphene acid makes an excellent positive electrode for supercapacitors, which thus achieve an energy density similar to that of nickel-metal hydride batteries.

Image: 
Prof. Dr. J. Kolleboyina / IITJ

A team working with Roland Fischer, Professor of Inorganic and Metal-Organic Chemistry at the Technical University of Munich (TUM), has developed a highly efficient supercapacitor. The basis of the energy storage device is a novel, powerful and sustainable graphene hybrid material with performance data comparable to currently utilized batteries.

Usually, energy storage is associated with batteries and accumulators that provide energy for electronic devices. However, so-called supercapacitors are increasingly being installed in laptops, cameras, cellphones and vehicles.

Unlike batteries, they can quickly store large amounts of energy and release it just as fast. If, for instance, a train brakes when entering the station, supercapacitors store the braking energy and provide it again when the train needs a lot of energy very quickly while starting up.

However, one problem with supercapacitors to date has been their low energy density. While lithium accumulators reach an energy density of up to 265 watt-hours per kilogram (Wh/kg), supercapacitors thus far have delivered only about a tenth of that.

Sustainable material provides high performance

The team working with TUM chemist Roland Fischer has now developed a novel, powerful and sustainable graphene hybrid material for supercapacitors. It serves as the positive electrode in the energy storage device. The researchers combined it with a proven negative electrode based on titanium and carbon.

The new energy storage device not only attains an energy density of up to 73 Wh/kg, roughly equivalent to that of a nickel-metal hydride battery, but also performs much better than most other supercapacitors, with a power density of 16 kW/kg. The secret of the new supercapacitor is the combination of different materials - hence, chemists refer to it as "asymmetrical."
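Those two figures together imply how long the device can sustain its peak power: dividing energy density by power density gives a full-discharge time of roughly 16 seconds, which is the supercapacitor regime rather than the battery regime. A quick check:

```python
def discharge_time_s(energy_wh_per_kg, power_w_per_kg):
    """Time to drain the stored energy at constant peak power:
    t = E / P, with watt-hours converted via 3600 s/h."""
    return energy_wh_per_kg / power_w_per_kg * 3600

# TUM device: 73 Wh/kg delivered at a power density of 16 kW/kg
t = discharge_time_s(73, 16_000)
assert 16.4 < t < 16.5  # about 16 seconds of full-power discharge
```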

Hybrid materials: Nature is the role model

The researchers are betting on a new strategy to overcome the performance limits of standard materials - they utilize hybrid materials. "Nature is full of highly complex, evolutionarily optimized hybrid materials - bones and teeth are examples. Their mechanical properties, such as hardness and elasticity, were optimized by nature through the combination of various materials," says Roland Fischer.

The research team transferred this abstract idea of combining basic materials to supercapacitors. As a basis, they built the novel positive electrode of the storage unit from chemically modified graphene combined with a nano-structured metal-organic framework, a so-called MOF.

Powerful and stable

Decisive for the performance of graphene hybrids are, on the one hand, a large specific surface area and controllable pore sizes and, on the other hand, high electrical conductivity. "The high performance capabilities of the material are based on the combination of the microporous MOFs with the conductive graphene acid," explains first author Jayaramulu Kolleboyina, a former guest scientist working with Roland Fischer.

A large surface area is important for good supercapacitors. It allows the material to collect a correspondingly large number of charge carriers - the basic principle of storing electrical energy.

Through skillful material design, the researchers achieved the feat of linking the graphene acid with the MOFs. The resulting hybrid MOFs have a very large inner surface area of up to 900 square meters per gram and perform very well as positive electrodes in a supercapacitor.

Long-term stability

However, that is not the only advantage of the new material. To achieve a chemically stable hybrid, one needs strong chemical bonds between the components. The bonds are apparently the same as those between amino acids in proteins, according to Fischer: "In fact, we have connected the graphene acid with a MOF-amino acid, which creates a type of peptide bond."

The stable connection between the nano-structured components has huge advantages in terms of long term stability: The more stable the bonds, the more charging and discharging cycles are possible without significant performance impairment.

For comparison: A classic lithium accumulator has a useful life of around 5,000 cycles. The new cell developed by the TUM researchers retains close to 90 percent capacity even after 10,000 cycles.
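Retaining close to 90 percent of capacity over 10,000 cycles corresponds to a tiny per-cycle fade. Assuming the fade is uniform across cycles (a simplification; real degradation curves are rarely exponential throughout), the per-cycle retention factor works out as follows:

```python
# If capacity after n cycles is r**n of the original, then
# r**10000 = 0.90 pins down the per-cycle retention r.
r = 0.90 ** (1 / 10_000)
loss_per_cycle_pct = (1 - r) * 100
# About 0.001% of capacity lost per cycle under this assumption
assert loss_per_cycle_pct < 0.0015
```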

International network of experts

Fischer emphasizes how important unfettered international cooperation, organized by the researchers themselves, was for the development of the new supercapacitor. Jayaramulu Kolleboyina, a guest scientist from India invited by the Alexander von Humboldt Foundation, built the team; he is now head of the chemistry department at the newly established Indian Institute of Technology in Jammu.

"Our team also networked with electro-chemistry and battery research experts in Barcelona as well as graphene derivate experts from the Czech Republic," reports Fischer. "Furthermore, we have integrated partners from the USA and Australia. This wonderful, international co-operation promises much for the future."

Credit: 
Technical University of Munich (TUM)

Pandas' popularity not protecting neighbors

image: The Asiatic black bear, pictured here in a camera trap image, finds that habitat geared toward giant pandas doesn't meet its needs

Image: 
Fang Wang, Michigan State University Center for Systems Integration and Sustainability

Forgive Asiatic black bears if they're not impressed with their popular giant panda neighbors.

For decades, conservationists have preached that panda popularity, and the resulting support for their habitat, automatically benefits other animals in the mountainous ranges. That logic extends across the world, as animals regarded as cute, noble or otherwise appealing drum up support to protect where they live.

Yet in Biological Conservation, scientists take a closer look at how other animals under the panda "umbrella" fare and find several species have every reason to be ticked at panda-centric policies.

"The popularity of giant pandas, as of the popularity of other beloved threatened animals across the world, has generated tremendous advances in protecting forests and other fragile habitats," said Jianguo "Jack" Liu, Michigan State University's Rachel Carson Chair in Sustainability and a paper author. "But this is an important reminder that it can't assume that what's good for a panda is automatically good for other species. Different species have specific needs and preferences."

The authors of "The hidden risk of using umbrella species as conservation surrogates: A spatio-temporal approach" used camera trap data collected throughout mountain ranges to get a clear understanding of what and how animals were using protected habitats.

What they discovered is that the pandas themselves are doing very well (the species in 2016 was declared "threatened" rather than "endangered" - a conservation point of pride), but three of the eight species focused upon in this study - the Asiatic black bear, the forest musk deer and the Chinese serow (a goat-like animal) - seem to have suffered significant habitat loss and/or degradation under panda-centric habitat management. Pandas are picky about where they live - needing lots of bamboo, a gentle slope and no contact with humans - and the managed habitats have largely delivered for them. Just not so much for the others.

Fang Wang, the paper's first author, noted that earlier efforts to track how a broader range of animals fared were handicapped by overlooking different habitat preferences, and so missed potentially different habitat trends among other animals. The authors suggested that the forests and shrublands at lower elevations, next to the habitats that best serve pandas, could better serve the bear and deer.

"China has made a tremendous achievement in establishing giant panda nature reserves, and now we're learning that one size does not fit all," said Wang, who with Liu and other authors is part of MSU's Center for Systems Integration and Sustainability. "China as well as other countries that face similar conservation challenges have the opportunity to move forward from rescuing single species to protecting animal communities and ecosystems."

Credit: 
Michigan State University

Single-cell analysis of metastatic gastric cancer finds diverse tumor cell populations associated with patient outcomes

image: Single-cell profiling of more than 45,000 cells from patients with peritoneal carcinomatosis, a specific form of metastatic gastric cancer, revealed extensive cellular heterogeneity and two distinct subtypes correlated with patient survival.

Image: 
MD Anderson Cancer Center

HOUSTON -- Researchers from The University of Texas MD Anderson Cancer Center who profiled more than 45,000 individual cells from patients with peritoneal carcinomatosis (PC), a specific form of metastatic gastric cancer, defined the extensive cellular heterogeneity and identified two distinct subtypes correlated with patient survival.

Based on their findings, published today in Nature Medicine, the researchers developed and validated a gene expression signature capable of predicting patient survival better than other clinical features. If validated in prospective studies, this tool may be useful in stratifying patients with gastric cancer and directing them for more effective treatment strategies.

"In order to better treat patients with PC, we first have to understand the populations of metastatic cells in the peritoneal cavity," said co-corresponding author Linghua Wang, M.D., Ph.D., assistant professor of Genomic Medicine. "This is the most detailed analysis of these cells performed to date. That is the power of single-cell analysis - we are able to look at every single cell and get a picture of the landscape."

Peritoneal carcinomatosis is a condition in which cancer cells infiltrate and invade the peritoneal, or abdominal, cavity, adhering to the stomach and other organs. This can occur with other gastrointestinal cancers but is most commonly seen in patients with advanced gastric cancers, with roughly 45% of patients diagnosed with PC at some point. The condition leads to significant fluid accumulation in the abdominal cavity, and patients have an overall survival of less than six months.

"PC represents a major unmet clinical need, as we don't have effective treatment options available for these patients," said co-corresponding author Jaffer Ajani, M.D., professor of Gastrointestinal Medical Oncology. "Based on our findings, we need to move toward profiling these cells in each patient in order to offer more tailored treatment options."

For this study, the researchers isolated PC cells from ascites fluid collected from 20 patients with advanced gastric cancer. Ten of the patients were long-term survivors who survived more than one year after PC diagnosis, and 10 patients were short-term survivors who survived less than six months after PC diagnosis.

After performing single-cell RNA sequencing to analyze gene expression, the researchers were able to build the first "map" of PC cells, which describes the variety of cell types present and their functional states. The variability of cancer cells present within a tumor is known as intratumor heterogeneity, and it can lead to treatment failure and recurrence as distinct subtypes of cancer cells will respond differently to a given therapy.

The gene expression information also enabled the researchers to determine the origins of the PC cells, known as tumor cell lineage. They discovered that, although these were all gastric cancer cells, some appeared to originate from cells of the stomach while others more closely resembled cells of the intestine.

"The intriguing aspect is that, by classifying tumor cells based on lineage compositions, we noted two groups of patients," Wang said. "The more gastric-like PC cells had an aggressive phenotype and were associated with shorter survival. However, the more intestine-like PC cells were less aggressive, and patients had longer survival."

Based on these findings, the researchers developed a gene signature which robustly predicted patient survival better than various clinical features. They validated the signature in a second cohort of patients with advanced gastric cancer and PC and four large-scale localized gastric cancer cohorts totaling more than 1,300 patients.

Going forward, the researchers hope to validate the signature in prospective studies and to perform further analyses of PC cells in more patients to identify regulatory mechanisms of tumor cell lineage plasticity and new therapeutic targets that can be exploited to provide better treatment options for patients.

"This is an important first step toward a better understanding of the single-cell biology of these cancer cells, but we have more work to do," Ajani said. "We foresee that understanding this heterogeneity could one day be used to guide clinical decision making that is most beneficial to each patient."

Credit: 
University of Texas M. D. Anderson Cancer Center

Public concern about violence, firearms, COVID-19 pandemic in California

What The Study Did: The findings of this survey study using data from California suggest the COVID-19 pandemic was associated with increases in self-reported worry about violence for oneself and others, increased firearm acquisition, and changes in firearm storage practices.

Authors: Nicole Kravitz-Wirtz, Ph.D., M.P.H., University of California Firearm Violence Research Center and Violence Prevention Research Program, Department of Emergency Medicine, University of California Davis School of Medicine in Sacramento, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.33484)

Editor's Note: The article includes funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

Super surfaces

image: Princeton University researchers have developed a device that focuses and directs terahertz waves for use in high-speed communications.

Image: 
Princeton University

Assembling tiny chips into unique programmable surfaces, Princeton researchers have created a key component toward unlocking a communications band that promises to dramatically increase the amount of data wireless systems can transmit.

The programmable surface, called a metasurface, allows engineers to control and focus transmissions in the terahertz band of the electromagnetic spectrum. Terahertz, a frequency range located between microwaves and infrared light, can transmit much more data than current, radio-based wireless systems. With fifth-generation (5G) communications systems offering speeds 10 to 100 times faster than the previous generation, demand for bandwidth is ever increasing. Facing the emergence of technologies such as self-driving cars and augmented reality applications, the terahertz band presents an opportunity for engineers seeking ways to increase data transmission rates.

To harness the expanded space in the terahertz band, engineers will need to overcome some challenges, and that is where the metasurface comes in. Unlike radio waves, which easily pass through obstructions such as walls, terahertz works best with a relatively clear line of sight for transmission. The metasurface device, with the ability to control and focus incoming terahertz waves, can beam the transmissions in any desired direction. This can not only enable dynamically reconfigurable wireless networks, but also open up new high-resolution sensing and imaging technologies for the next generation of robotics, cyber-physical systems and industrial automation. Because the metasurface is built using standard silicon chip elements, it is low-cost and can be mass produced for placement on buildings, street signs and other surfaces.

"A terahertz beam would be like a laser pointer, whereas today's radio wave transmitters are like light bulbs that send light everywhere. A programmable metasurface is one that produces any possible fields; it's the ultimate projector " said Kaushik Sengupta, an associate professor of electrical engineering at Princeton and a lead author of a new study in reporting the results. Sengupta, whose research focuses on integrated chip-scale systems across radio waves, terahertz to optical frequencies, said the metasurface's low production cost and its programmability means it could be "a major enhancer for communications and network capabilities."

In a study published Dec. 14 in Nature Electronics, researchers from Sengupta's Integrated Micro-systems Research Lab described the design of the metasurface, which features hundreds of programmable terahertz elements, each less than 100 micrometers (millionths of a meter) in diameter and a mere 3.4 micrometers tall, made of layers of copper and coupled with active electronics that collectively resonate with the structure. This allows adjustments to their geometry several billion times per second. These changes--which are programmable based on the desired application--split a single incoming terahertz beam into several dynamic, directable terahertz beams that can maintain line-of-sight with receivers.
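The sub-100-micrometer element size quoted above is consistent with sub-wavelength structuring at terahertz frequencies. Since this summary does not state the chip's operating frequency, 1 THz is used below purely as an illustrative value:

```python
def wavelength_um(freq_thz):
    """Free-space wavelength in micrometers for a frequency in THz."""
    c = 299_792_458  # speed of light, m/s
    return c / (freq_thz * 1e12) * 1e6

# At 1 THz (an assumed, illustrative frequency) the free-space
# wavelength is about 300 micrometers, so elements under 100
# micrometers across are comfortably sub-wavelength.
assert 299 < wavelength_um(1.0) < 301
```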

The Princeton researchers commissioned a silicon chip foundry to fabricate the metasurface as tiles onto standard silicon chips. In this way, the researchers showed that the programmable terahertz metasurface can be configured into low-cost, scalable arrays of tiles. "The tiles are like Lego blocks and are all programmable," said Sengupta. As a proof of concept, the Princeton researchers tested tile arrays measuring two-by-two with 576 such programmable elements and demonstrated beam control by projecting (invisible) terahertz holograms. These elements are scalable across larger arrays.

One possible way to incorporate these sorts of flat tiles into the built environment as next-generation communications network components would be to plaster them on walls as a sort of "smart surface" wallpaper, Sengupta said.

Daniel Mittleman, a professor of engineering at Brown University who was not involved in the study, commented that the research represents a significant step toward terahertz communications.

"This new work demonstrates a fascinating approach which, unlike most previous efforts, is scalable into the terahertz range," said Mittleman. "The key takeaway is that we are now getting a handle on practical methods for actively controlling the wave front, beam size, beam direction, and other features of terahertz beams."

Numerous other applications for the technology include gesture recognition and imaging, as well as in industrial automation and security. Another potential application is autonomous or self-driving cars. These vehicles require precise sensing and imaging to make sense of the external world in real-time, and ideally even faster than a human driver. Semi-autonomous cars increasingly sold today use 77 GHz radars to detect pedestrians and other vehicles for the purposes of adaptive cruise control and engaging automatic emergency braking. For full, driverless autonomy, though, cars would benefit by "seeing" the road and obstacles better with terahertz-band sensors and cameras, along with being able to communicate with other vehicles more rapidly.

Looking ahead, the programmable metasurfaces will need further development, Sengupta said, to be integrated as components in innovative, next-generation networks.

"There are so many things that people would like to do that are not possible with current wireless technology," said Sengupta. "With these new metasurfaces for terahertz frequencies, we're getting a lot closer to making those things happen."

Credit: 
Princeton University, Engineering School

New clues on why pregnancy may increase risk of organ transplant rejection

A research study at the University of Chicago has found that during pregnancy, while the T cell response to the fetus becomes tolerant to allow for a successful pregnancy, the antibody-producing arm of the immune system (known as the humoral response) becomes sensitized, creating memory B cells that can later contribute to the rejection of a transplanted organ.

The results help to clarify why it is that the immune system can tolerate a fetus during pregnancy, but later may be more likely to become sensitized to and reject an organ transplant. The study was published on January 4, 2021 in the Journal of Clinical Investigation.

The immune system is designed to respond to and protect against foreign invaders; it does this by recognizing molecules on foreign cells, known as antigens, and mounting an immune response that produces T cells to target and attack foreign cells directly, as well as memory B cells that produce antibodies to tag foreign cells for destruction by other blood cells.

In most cases, this system is extremely beneficial -- but in pregnancy, some adaptation is required to prevent the rejection of a fetus, which only shares half its genes with the mother and therefore presents foreign antigens to the mother's immune system.

This also has the paradoxical effect of increasing the risk of rejection of a transplanted organ (or allograft) after a person has given birth, particularly if the transplanted organ, such as a kidney, comes from the father of their children.

This new research was inspired by prior work showing that T cells become tolerized during pregnancy, meaning they don't respond to fetal antigens. "This was paradoxical to the transplant field, where we consider pregnancy a sensitizing event," said co-senior author Anita Chong, PhD, a professor of surgery at UChicago. "I wanted to know why it was that pregnancy resulted in sensitization to an allograft (transplanted organ) from the male partner, but enhanced tolerance to a fetus expressing the same antigens."

In the study, the investigators examined the immune response of female mice after receiving a transplanted heart from one of their offspring. By tracking both the T cell response and the humoral response, they could follow both arms of the immune response and study their effects on transplant rejection. They saw that the T cells did not react to the allograft, but the memory B cells did, producing antibodies against foreign antigens from the transplanted heart.

"Our assumption was that both arms of the immune system would be sensitized to the offspring-matched transplanted organ," said Chong, "But there's something about the fetus promoting T cell tolerance that is also preserved for the allograft. On the other hand, the antibodies that are produced to the fetus do not harm the fetus, but cause the rejection of the allograft."

Given the biology of pregnancy, the investigators say, these results make sense.

"Pregnancy cannot evolve to completely eliminate the humoral response because it's critical for a mother to be able to produce antibodies against infectious pathogens during pregnancy and breastfeeding; it's the only immunity a mother can pass to their child. So, the immune system is primed to make antibodies against anything foreign during this period, including those expressed by the fetus," said Chong. "As a result, the placenta has evolved ways to handle these antibodies in order to prevent fetus rejection in subsequent pregnancies."

These results are a promising start toward preventing transplant rejection in people who have been pregnant.

"There is potential for applying therapies that would eliminate memory B cells and antibodies that now make it more difficult for these women to accept a transplant," said co-senior author Maria-Luisa Alegre, MD/PhD, a professor of medicine at UChicago. "This would level the playing field for women with children. We could eliminate antibodies and B cells before transplantation and eliminate the problem, while T cell responses to antigens shared by the fetus and the transplant would already be spontaneously partially suppressed."

What is not yet clear is how the sensitized humoral response overrides the T cell tolerance to reject an allograft in people after pregnancy, or how the T cell tolerance might be induced in non-mothers in order to prevent rejection in other populations.

As part of their ongoing collaboration, Chong and Alegre hope to continue working on this puzzle. "One aspect of future research is to see if we can exploit this ability of pregnancy to tolerize T cells to have better acceptance not only in people who have been pregnant, but in everybody," said Alegre. "Outside of pregnancy, people can get sensitized prior to transplantation in different ways, from disease or environmental antigens, and it can be difficult to protect the transplant from cross-reactive memory T cells. Now we're looking at how pregnancy can tolerize these memory T cells that are otherwise difficult to immunosuppress with current drugs."

Credit: 
University of Chicago Medical Center

First glimpse of polarons forming in a promising next-gen energy material

image: As this animation shows, polaronic distortions start very small and rapidly expand outward in all directions to a diameter of about 5 billionths of a meter, which is about a 50-fold increase. This nudges about 10 layers of atoms slightly outward within a roughly spherical area over the course of tens of picoseconds, or trillionths of a second. These distortions were measured for the first time in lead hybrid perovskites with an X-ray free electron laser at SLAC National Accelerator Laboratory.

Image: 
Greg Stewart/SLAC National Accelerator Laboratory

Polarons are fleeting distortions in a material's atomic lattice that form around a moving electron in a few trillionths of a second, then quickly disappear. As ephemeral as they are, they affect a material's behavior, and may even be the reason that solar cells made with lead hybrid perovskites achieve extraordinarily high efficiencies in the lab.

Now scientists at the Department of Energy's SLAC National Accelerator Laboratory and Stanford University have used the lab's X-ray laser to watch and directly measure the formation of polarons for the first time. They reported their findings in Nature Materials today.

"These materials have taken the field of solar energy research by storm because of their high efficiencies and low cost, but people still argue about why they work," said Aaron Lindenberg, an investigator with the Stanford Institute for Materials and Energy Sciences (SIMES) at SLAC and associate professor at Stanford who led the research.

"The idea that polarons may be involved has been around for a number of years," he said. "But our experiments are the first to directly observe the formation of these local distortions, including their size, shape and how they evolve."

Exciting, complex and hard to understand

Perovskites are crystalline materials named after the mineral perovskite, which has a similar atomic structure. Scientists started to incorporate them into solar cells about a decade ago, and the efficiency of those cells at converting sunlight to energy has steadily increased, despite the fact that their perovskite components have a lot of defects that should inhibit the flow of current.

These materials are famously complex and hard to understand, Lindenberg said. While scientists find them exciting because they are both efficient and easy to make, raising the possibility that they could make solar cells cheaper than today's silicon cells, they are also highly unstable, break down when exposed to air and contain lead that has to be kept out of the environment.

Previous studies at SLAC have delved into the nature of perovskites with an "electron camera" or with X-ray beams. Among other things, they revealed that light whirls atoms around in perovskites, and they also measured the lifetimes of acoustic phonons - sound waves - that carry heat through the materials.

For this study, Lindenberg's team used the lab's Linac Coherent Light Source (LCLS), a powerful X-ray free-electron laser that can image materials in near-atomic detail and capture atomic motions occurring in millionths of a billionth of a second. They looked at single crystals of the material synthesized by Associate Professor Hemamala Karunadasa's group at Stanford.

They hit a small sample of the material with light from an optical laser and then used the X-ray laser to observe how the material responded over the course of tens of trillionths of a second.

Expanding bubbles of distortion

"When you put a charge into a material by hitting it with light, like what happens in a solar cell, electrons are liberated, and those free electrons start to move around the material," said Burak Guzelturk, a scientist at DOE's Argonne National Laboratory who was a postdoctoral researcher at Stanford at the time of the experiments.

"Soon they are surrounded and engulfed by a sort of bubble of local distortion - the polaron - that travels along with them," he said. "Some people have argued that this 'bubble' protects electrons from scattering off defects in the material, and helps explain why they travel so efficiently to the solar cell's contact to flow out as electricity."

The hybrid perovskite lattice structure is flexible and soft - like "a strange combination of a solid and a liquid at the same time," as Lindenberg puts it - and this is what allows polarons to form and grow.

Their observations revealed that polaronic distortions start very small - on the scale of a few angstroms, about the spacing between atoms in a solid - and rapidly expand outward in all directions to a diameter of about 5 billionths of a meter, which is about a 50-fold increase. This nudges about 10 layers of atoms slightly outward within a roughly spherical area over the course of tens of picoseconds, or trillionths of a second.

"This distortion is actually quite large, something we had not known before," Lindenberg said. "That's something totally unexpected."

He added, "While this experiment shows as directly as possible that these objects really do exist, it doesn't show how they contribute to the efficiency of a solar cell. There's still further work to be done to understand how these processes affect the properties of these materials."

Credit: 
DOE/SLAC National Accelerator Laboratory

Fluoride to the rescue?

Scientists have long been aware of the dangerous overuse of antibiotics and the increasing number of antibiotic-resistant microbes that have resulted. While over-prescription of antibiotics for medicinal use has unsettling implications for human health, so too does the increasing presence of antibiotics in the natural environment. The latter may stem from the improper disposal of medicines, but also from the biotechnology field, which has depended on antibiotics as a selection device in the lab.

"In biotech, we have for a long time relied on antibiotic and chemical selections to kill cells that we don't want to grow," said UC Santa Barbara chemical engineer Michelle O'Malley. "If we have a genetically engineered cell and want to get only that cell to grow among a population of cells, we give it an antibiotic resistance gene. The introduction of an antibiotic will kill all the cells that are not genetically engineered and allow only the ones we want -- the genetically modified organisms [GMOs] -- to survive. However, many organisms have evolved the means to get around our antibiotics, and they are a growing problem in both the biotech world and in the natural environment. The issue of antibiotic resistance is a grand challenge of our time, one that is only growing in its importance."

Further, GMOs come with a containment issue. "If that GMO were to get out of the lab and successfully replicate in the environment, you could not predict what traits it would introduce into the natural biological world," O'Malley explained. "With the advent of synthetic biology, there is increasingly a risk that things we're engineering in the lab could escape and proliferate into ecosystems where they don't belong."

Now, research conducted in O'Malley's lab and published in the journal Nature Communications describes a simple method to address both the overuse of antibiotics, as well as containment of GMOs. It calls for replacing antibiotics in the lab with fluoride.

O'Malley described fluoride as "a pretty benign chemical that is abundant in the world, including in groundwater." But, she notes, it is also toxic to microorganisms, which have evolved a gene that encodes a fluoride exporter that protects cells by removing fluoride encountered in the natural environment.

The paper describes a process developed by Justin Yoo, a former graduate student researcher in O'Malley's lab. It uses a common technique called homologous recombination to render non-functional the gene in a GMO that encodes a fluoride exporter, so the cell can no longer produce it. Such a cell would still thrive in the lab, where fluoride-free distilled water is normally used, but if it escaped into the natural environment, it would die as soon as it encountered fluoride, thus preventing propagation.

Prior to this research, Yoo was collaborating with the paper's co-author Susanna Seppala, a project scientist in O'Malley's lab, in an effort to use yeast to characterize fluoride transport proteins that Seppala had identified in anaerobic fungi. A first step in this project was for Yoo to remove native yeast fluoride transporters.

Shortly after generating the knockout yeast strain, Yoo attended a synthetic biology conference where he heard a talk on a novel biocontainment mechanism intended to prevent genetically modified E. coli bacteria from escaping lab environments. At that talk, he recalled, "I realized that the knockout yeast strain I had generated could potentially act as an effective biocontainment platform for yeast."

"Essentially, what Justin did was to create a series of DNA instructions you can give to cells that will enable them to survive when fluoride is around," O'Malley said. "Normally, if I wanted to select for a genetically engineered cell in the lab, I'd make a plasmid [a genetic structure in a cell, typically a small circular DNA strand, that can replicate independently of the chromosomes] that had an antibiotic resistance marker so that it would survive if an antibiotic was around. Justin is replacing that with the gene for these fluoride exporters."

The method, which O'Malley characterized as "low-hanging fruit -- Justin did all of these studies in about a month," also addresses a simple economic limitation to antibiotic-driven cell selection in biotechnology labs. Aside from fueling the rise of resistant strains of bacteria, she continued, "from a biotech standpoint, the process of creating antibiotic-resistant organisms is also pretty darn expensive. If you were going to run a ten-thousand-liter fermentation, and it may cost you thousands of dollars per fermentation to add some antibiotics, that's a crazy amount of money." Notably, using fluoride at a low concentration would cost only about four cents per liter.

Clearly, said Seppala, "we'd much rather use a chemical like fluoride that's relatively benign, abundant and cheap, and can be used to do the same thing that is achieved using a conventional antibiotic."

Yoo explained that the role of the fluoride transporters had only recently been elucidated, in 2013, when this project began. Emerging approaches to implementing biocontainment have focused on using biological parts that are foreign to the organism of interest, shifting focus toward what Yoo described as "brilliant, yet complex, systems," while perhaps diverting attention from this simpler approach.

Credit: 
University of California - Santa Barbara

Sweetened beverage sales bounced back quickly after Cook County tax repealed

image: Lisa Powell

Image: 
UIC

Following the repeal of the short-lived Cook County, Illinois Sweetened Beverage Tax, sales of sweetened beverages went right back to where they were before the tax went into place, according to a new study led by researchers at the University of Illinois Chicago. The study is published in JAMA Network Open.

The tax, which included both sugar-sweetened and artificially-sweetened diet beverages, was largely pitched as a way to reduce county budget deficits. The tax lasted just four months -- it went into effect on Aug. 2, 2017 and ended on Dec. 1, 2017.

"We know that the tax worked to bring down demand for sweetened beverages significantly while it was in place," said Lisa Powell, UIC distinguished professor and director of health policy and administration at the School of Public Health and lead author of the paper. "The repeal of the Cook County Sweetened Beverage Tax was a missed public health opportunity. If it had stayed in place, we could have seen a lasting reduction in consumption of sweetened beverages, which are linked to obesity, Type 2 diabetes and cardiovascular disease which, in turn, have recently been found to be associated with increased risk of severe illness from COVID-19."

Previous research by Powell and colleagues showed that while the tax was in effect, it worked to bring down volume sold of sweetened beverages by 27% in Cook County, with a net effect of 21% after taking into account cross-border shopping in response to the tax.

In the new study, Powell and Julien Leider, a senior research specialist at the UIC Institute for Health Research and Policy, compared the price and volume of sweetened beverages sold in Cook County in the two years before the tax, during the four months the tax was in place and in the eight months after the tax was repealed relative to St. Louis, Missouri, which did not have a similar tax.

Employing a different analytical method than they used in their previous research, Powell and Leider found that sweetened beverages increased in price by 1.13 cents per fluid ounce in Cook County while the tax was in place. After the repeal of the tax, the price dropped 1.19 cents per fluid ounce. They also found that the volume of sweetened beverages sold in Cook County dropped by about 26% under the tax and increased by about 30% after the tax was repealed. Ultimately, there was no net change in the volume of sweetened beverages sold pre-tax compared to after the tax was repealed.
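The percentages reported here can look contradictory at first glance (a 26% drop followed by a 30% rise, yet no net change), but they are consistent because the two changes apply to different baselines. A quick arithmetic check:

```python
# Why a ~26% drop followed by a ~30% rise is roughly a wash:
# the two percentage changes apply to different baselines.
baseline = 100.0                      # arbitrary pre-tax volume units
under_tax = baseline * (1 - 0.26)     # ~26% drop while the tax was in place
post_repeal = under_tax * (1 + 0.30)  # ~30% rebound after repeal
print(post_repeal)  # 96.2 -- within rounding of the pre-tax level
```

The rebound percentage is larger than the drop precisely because it is measured against the smaller, taxed volume, which is why the study finds no net change overall.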

"Volume of sweetened beverages sold in Cook County went right back to pre-tax levels following the repeal of the tax," Powell said.

Powell notes that the results suggest that the tax worked to bring down demand for sweetened beverages through price point alone and did not appear to change perceptions regarding the harms linked to consuming sugary beverages. Public messaging about the tax focused mostly on proceeds being used to address budgetary deficits rather than on public health.

"We don't know whether there would have been some lasting impact of the tax if public messaging had focused more on health benefits, but as it stands, we see that the substantial impact from the tax fully disappeared once it was repealed," Powell said.

Credit: 
University of Illinois Chicago

One in four doctors attacked, harassed on social media

CHICAGO --- While many physicians benefit from social media by networking with potential collaborators or interfacing with patients, a new study from Northwestern University and the University of Chicago found many physicians also report being sexually harassed and personally attacked on these platforms on the basis of their religion, race or medical recommendations.

Although the data were collected before the COVID-19 outbreak, the findings document the extent of online harassment before the pandemic, harassment that has only intensified since the spring, the study authors said.

"If anything, our data is likely an underestimate of the true extent of attacks and harassment post-pandemic since so many doctors started to advocate for public health measures during the pandemic and have been met with an increasingly polarized populace emboldened by leadership that devalues science and fact," said senior and corresponding author Dr. Vineet Arora, assistant dean for scholarship and discovery at the University of Chicago Pritzker School of Medicine.

The study was published Jan. 4 in JAMA Internal Medicine.

This is the first known study to describe physician experiences with online harassment. It found one in four physicians report being personally attacked on social media, including being barraged by negative reviews, receiving coordinated harassment and threats at work, and having their personal information shared publicly. Some attacks were particularly disturbing, such as threats of rape and death, the study authors said.

Women were disproportionately affected by personal attacks and sexual harassment, with one in six women physicians reporting being sexually harassed on social media.

"We worry this emotionally distressing environment will drive women physicians off social media, which has been well-documented as a helpful career-advancement tool," said first author Tricia Pendergrast, a second-year medical student at Northwestern University Feinberg School of Medicine. "Women in medicine are already less likely to hold leadership positions or be first or last authors of research, so disproportionately abstaining from a platform used for collaboration and networking due to sexual harassment and personal attacks should be a cause for concern."

Physicians should be supported online as trusted messengers, the study authors said. The study highlights the need for medical institutions to have a plan in place to respond to this type of online harassment so physicians' careers aren't negatively impacted long term.

"Doctors and other health care workers are already facing unprecedented stress and mental health challenges from their work," Arora said. "Any stress from being online will compound that and put them at risk especially as doctors are being asked to be more vocal on social media to promote vaccination and more."

To help defuse these types of attacks, Arora co-founded a coalition of physicians and health care professionals in Illinois, the Illinois Medical Professionals Action Collaborative Team (IMPACT4HC), which brings together healthcare workers to educate and advocate for evidence-based solutions on social media.

"It feels much easier to advocate on social media as part of a group," Arora said. "The nice thing is that on #medtwitter, you are not alone. There are many who will come to your aid. And together, we not only have a louder voice but we can support each other through this stressful time."

Credit: 
Northwestern University

Reawakened geyser does not foretell Yellowstone volcanic eruptions, study shows

image: A 2019 eruption of Steamboat Geyser in the Norris Geyser Basin of Yellowstone National Park. The geyser's first documented activity was in 1878, and it has turned off and on sporadically since, once going for 50 years without erupting. In 2018 it reactivated after a three-and-a-half-year hiatus, for reasons that are still unclear.

Image: 
UC Berkeley photo by Mara Reed

When Yellowstone National Park's Steamboat Geyser -- which shoots water higher than any active geyser in the world -- reawakened in 2018 after three and a half years of dormancy, some speculated that it was a harbinger of possible explosive volcanic eruptions within the surrounding geyser basin. These so-called hydrothermal explosions can hurl mud, sand and rocks into the air and release hot steam, endangering lives; such an explosion on White Island in New Zealand in December 2019 killed 22 people.

A new study by geoscientists who study geysers throws cold water on that idea, finding few indications of underground magma movement that would be a prerequisite to an eruption. The geysers sit just outside the nation's largest and most dynamic volcanic caldera, but no major eruptions have occurred in the past 70,000 years.

"Hydrothermal explosions -- basically hot water exploding because it comes into contact with hot rock -- are one of the biggest hazards in Yellowstone," said Michael Manga, professor of earth and planetary sciences at the University of California, Berkeley, and the study's senior author. "The reason that they are problematic is that they are very hard to predict; it is not clear if there are any precursors that would allow you to provide warning."

He and his team found that, while the ground around the geyser rose and seismicity increased somewhat before the geyser reactivated and the area currently is radiating slightly more heat into the atmosphere, no other dormant geysers in the basin have restarted, and the temperature of the groundwater propelling Steamboat's eruptions has not increased. Also, no sequence of Steamboat eruptions other than the one that started in 2018 occurred after periods of high seismic activity.

"We don't find any evidence that there is a big eruption coming. I think that is an important takeaway," he said.

The study will be published this week in Proceedings of the National Academy of Sciences.

Manga, who has studied geysers around the world and created some in his own laboratory, set out with his colleagues to answer three main questions about Steamboat Geyser: Why did it reawaken? Why is its period so variable, ranging from 3 to 17 days? And why does it spurt so high?

The team found answers to two of those questions. By comparing the column heights of 11 different geysers in the United States, Russia, Iceland and Chile with the estimated depth of the reservoir of water from which their eruptions come, they found that the deeper the reservoir, the higher the eruption jet. Steamboat Geyser, with a reservoir about 25 meters (82 feet) below ground, has the highest column -- up to 115 meters, or 377 feet -- while two geysers that Manga measured in Chile were among the lowest -- eruptions about a meter (3 feet) high from reservoirs 2 and 5 meters below ground.

"What you are really doing is you are filling a container, it reaches a critical point, you empty it and then you run out of fluid that can erupt until it refills again," he said. "The deeper you go, the higher the pressure. The higher the pressure, the higher the boiling temperature. And the hotter the water is, the more energy it has and the higher the geyser."
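Manga's chain of reasoning here (deeper reservoir → higher pressure → higher boiling temperature → more energetic eruption) can be made concrete with a back-of-the-envelope calculation. The sketch below is not from the study: it combines simple hydrostatic pressure with an Antoine-equation fit for water's vapor pressure, and that fit is calibrated for 1-100 °C, so the values above 100 °C are an extrapolation that shows the trend rather than precise numbers.

```python
import math

RHO = 1000.0      # water density, kg/m^3
G = 9.81          # gravitational acceleration, m/s^2
P_ATM = 101325.0  # sea-level pressure, Pa (Yellowstone's altitude lowers this a bit)

def boiling_point_c(depth_m):
    """Approximate boiling temperature (deg C) of water at a given depth,
    from hydrostatic pressure plus an Antoine-equation fit for water.
    The fit is calibrated for 1-100 C; values above that are a rough
    extrapolation -- good enough to show the trend, not for precision."""
    p_pa = P_ATM + RHO * G * depth_m  # pressure at the reservoir depth
    p_mmhg = p_pa / 133.322
    # Antoine equation for water, inverted to give T from vapor pressure
    return 1730.63 / (8.07131 - math.log10(p_mmhg)) - 233.426

print(round(boiling_point_c(0), 1))   # ~100.0 at the surface
print(round(boiling_point_c(2), 1))   # a shallow Chilean-geyser reservoir
print(round(boiling_point_c(25), 1))  # Steamboat's ~25 m reservoir: much hotter
```

Even this rough estimate shows water in a 25-meter-deep reservoir superheating well past 100 °C before it can boil, storing far more energy per kilogram than water in a 2-meter reservoir, which is the pattern the team saw across the 11 geysers they compared.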

To explore the reasons for Steamboat Geyser's variability, the team assembled records related to 109 eruptions going back to its reactivation in 2018. The records included weather and stream flow data, seismometer and ground deformation readings, and observations by geyser enthusiasts. They also looked at previous active and dormant periods of Steamboat and nine other Yellowstone geysers, and ground surface thermal emission data from the Norris Geyser Basin.

They concluded that variations in rainfall and snow melt were probably responsible for part of the variable period, and possibly for the variable period of other geysers as well. In the spring and early summer, with melting snow and rain, the underground water pressure pushes more water into the underground reservoir, providing more hot water to erupt more frequently. During winter, with less water, lower groundwater pressure refills the reservoir more slowly, leading to longer periods between eruptions. Because the water pushed into the reservoir comes from places even deeper than the reservoir, the water is decades or centuries old before it erupts back to the surface, he said.

In October, Manga's team members demonstrated the extreme impact water shortages and drought can have on geysers. They showed that Yellowstone's iconic Old Faithful Geyser stopped erupting entirely for about 100 years in the 13th and 14th centuries, based on radiocarbon dating of mineralized lodgepole pine trees that grew around the geyser during its dormancy. Normally the water is too alkaline and the temperature too high for trees to grow near active geysers. The dormancy period coincided with a lengthy warm, dry spell across the Western U.S. called the Medieval Climate Anomaly, which may have caused the disappearance of several Native American civilizations in the West.

"Climate change is going to affect geysers in the future," Manga said.

Manga and his team were unable to determine why Steamboat Geyser started up again on March 15, 2018, after three years and 193 days of inactivity, though the geyser is known for being far more variable than Old Faithful, which usually goes off about every 90 minutes. They could find no definitive evidence that new magma rising below the geyser caused its reactivation.

The reactivation may have to do with changes in the internal plumbing, he said. Geysers seem to require three ingredients: heat, water and rocks made of silica -- silicon dioxide. The hot water in geysers continually dissolves and redeposits silica; every time Steamboat Geyser erupts, it brings up about 200 kilograms, or 440 pounds, of dissolved silica. Some of this silica is deposited underground and may change the plumbing system underneath the geyser. Such changes could temporarily halt or reactivate eruptions if the pipe gets rerouted, he said.

Manga has experimented with geysers in his lab to understand why they erupt periodically, and at least in the lab, it appears to be caused by loops or side chambers in the pipe that trap bubbles of steam that slowly dribble out, heating the water column above until all the water can boil from the top down, explosively erupting in a column of water and steam.

Studies of water eruptions from geysers could give insight into the eruptions of hot rock from volcanoes, he said.

"What we asked are very simple questions and it is a little bit embarrassing that we can't answer them, because it means there are fundamental processes on Earth that we don't quite understand," Manga said. "One of the reasons we argue we need to study geysers is that if we can't understand and explain how a geyser erupts, our hope for doing the same thing for magma is much lower."

Credit: 
University of California - Berkeley