Tech

Study could explain tuberculosis bacteria paradox

image: Oleg Igoshin is a professor of bioengineering at Rice University and a senior investigator at Rice's Center for Theoretical Biological Physics.

Image: 
Jeff Fitlow/Rice University

HOUSTON -- (Feb. 22, 2021) -- Tuberculosis bacteria have evolved to remember stressful encounters and react quickly to future stress, according to a study by computational bioengineers at Rice University and infectious disease experts at Rutgers New Jersey Medical School (NJMS).

Published online in the open-access journal mSystems, the research identifies a genetic mechanism that allows the TB-causing bacterium, Mycobacterium tuberculosis, to respond to stress rapidly and in a manner that is "history-dependent," said corresponding author Oleg Igoshin, a professor of bioengineering at Rice.

Researchers have long suspected that the ability of TB bacteria to remain dormant, sometimes for decades, stems from their ability to behave based upon past experience.

Latent TB is an enormous global problem. While TB kills about 1.5 million people each year, the World Health Organization estimates that 2-3 billion people are infected with a dormant form of the TB bacterium.

"There's some sort of peace treaty between the immune system and bacteria," Igoshin said. "The bacteria don't grow, and the immune system doesn't kill them. But if people get immunocompromised due to malnutrition or AIDS, the bacteria can be reactivated."

One of the most likely candidates for a genetic switch that can toggle TB bacteria into a dormant state is a regulatory network that is activated by the stress caused by immune cell attacks. The network responds by activating several dozen genes the bacteria use to survive the stress. Based on a Rice computational model, Igoshin and his longtime Rutgers NJMS collaborator Maria Laura Gennaro and colleagues predicted just such a switch in 2010. According to the theory, the switch contained an ultrasensitive control mechanism that worked in combination with multiple feedback loops to allow hysteresis, or history-dependent behavior.

"The idea is that if we expose cells to intermediate values of stress, starting from their happy state, they don't have that much of a response," Igoshin explained. "But if you stress them enough to stop their growth, and then reduce the stress level back to an intermediate level, they remain stressed. And even if you fully remove the stress, the gene expression pathway stays active, maintaining a base level of activity in case the stress comes back."
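The history-dependent behavior Igoshin describes can be illustrated with a toy bistable switch: a single gene product x that activates its own expression, plus an external stress input s. This is a minimal sketch, not the actual 2010 Rice model; the equation and parameter values are hypothetical, chosen only so the system has two stable states.

```python
def settle(x, s, t=50.0, dt=0.01):
    """Integrate dx/dt = s + k*x^2/(K^2 + x^2) - x to steady state (Euler).

    s is the external stress input; the Hill term is hypothetical positive
    feedback that makes the switch bistable. k and K are illustrative values.
    """
    k, K = 1.2, 0.5
    for _ in range(int(t / dt)):
        x += dt * (s + k * x**2 / (K**2 + x**2) - x)
    return x

x = 0.0                     # "happy," unstressed starting state
x = settle(x, s=0.03)       # intermediate stress from below: little response
low = x
x = settle(x, s=0.20)       # strong stress: the switch flips to the high state
x = settle(x, s=0.03)       # back to intermediate stress: stays high
mid_after = x
x = settle(x, s=0.0)        # stress fully removed: activity still persists
print(low, mid_after, x)
```

Running the ramp up and back down shows the hysteresis in the quote: the same intermediate stress level leaves the system low if it arrives from the unstressed state, but high if the cells were first pushed past the switching threshold.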

In later experiments, Gennaro's team found no evidence of the predicted control mechanism in Mycobacterium smegmatis, a close relative of the TB bacterium. Since both organisms use the same regulatory network, it looked like the prediction was wrong. Finding out why took years of follow-up studies. Gennaro and Igoshin's teams found that the TB bacterium, unlike its noninfectious cousin, had the hysteresis control mechanism, but it didn't behave as expected.

"Hysteretic switches are known to be very slow, and this wasn't," Igoshin said. "There was hysteresis, a history-dependent response, to intermediate levels of stress. But when stress went from low to high or from high to low, the response was relatively fast. For this paper, we were trying to understand these somewhat contradictory results."

Igoshin and study co-author Satyajit Rao, a Rice doctoral student who graduated last year, revisited the 2010 model and considered how it might be modified to explain the paradox. Studies within the past decade had found a protein called DnaK played a role in activating the stress-response network. Based on what was known about DnaK, Igoshin and Rao added it to their model of the dormant-active switch.

"We didn't discover it, but we proposed a particular mechanism for it that could explain the rapid, history-dependent switching we'd observed," Igoshin said. "What happens is, when cells are stressed, their membranes get damaged, and they start accumulating unfolded proteins. Those unfolded proteins start competing for DnaK."

DnaK was known to play the role of chaperone in helping rid cells of unfolded proteins, but it plays an additional role in the stress-response network by keeping its sensor protein in an inactive state.

"When there are too many unfolded proteins, DnaK has to let go of the sensor protein, which is an activation input for our network," Igoshin said. "So once there are enough unfolded proteins to 'distract' DnaK, the organism responds to the stress."

Gennaro and co-author Pratik Datta conducted experiments at NJMS to confirm DnaK behaved as predicted. But Igoshin said it is not clear how the findings might impact TB treatment or control strategies. For example, the switch responds to short-term biochemical changes inside the cell, and it's unclear what connection, if any, it may have with long-term behaviors like TB latency, he said.

"The immediate first step is to really try and see whether this hysteresis is important during the infection," Igoshin said. "Is it just a peculiar thing we see in our experiments, or is it really important for patient outcomes? Given that it is not seen in the noninfectious cousin of the TB bacterium, it is tempting to speculate it is related to survival inside the host."

Credit: 
Rice University

SwRI scientists image a bright meteoroid explosion in Jupiter's atmosphere

image: SwRI scientists studied the area imaged by Juno's UVS instrument on April 10, 2020, and determined that a large meteoroid had exploded in a bright fireball in Jupiter's upper atmosphere. The UVS swath includes a segment of Jupiter's northern auroral oval, appearing purely in green, representing hydrogen emissions. In contrast, the bright spot (see enlargement) appears mostly yellow, indicating significant emissions at longer wavelengths.

Image: 
SwRI

SAN ANTONIO -- Feb. 22, 2021 -- From aboard the Juno spacecraft, a Southwest Research Institute-led instrument observing auroras serendipitously spotted a bright flash above Jupiter's clouds last spring. The Ultraviolet Spectrograph (UVS) team studied the data and determined that they had captured a bolide, an extremely bright meteoroid explosion in the gas giant's upper atmosphere.

"Jupiter undergoes a huge number of impacts per year, much more than the Earth, so impacts themselves are not rare," said SwRI's Dr. Rohini Giles, lead author of a paper outlining these findings in Geophysical Research Letters. "However, they are so short-lived that it is relatively unusual to see them. Only larger impacts can be seen from Earth, and you have to be lucky to be pointing a telescope at Jupiter at exactly the right time. In the last decade, amateur astronomers have managed to capture six impacts on Jupiter."

Since Juno arrived at Jupiter in 2016, UVS has been used to study the morphology, brightness and spectral characteristics of Jupiter's auroras as the spacecraft cartwheels close to the planet every 53 days. During the course of a 30-second spin, UVS observes a swath of the planet. The UVS instrument has occasionally observed short-lived, localized ultraviolet emissions outside of the auroral zone, including a singular event on April 10, 2020.

"This observation is from a tiny snapshot in time -- Juno is a spinning spacecraft, and our instrument observed that point on the planet for just 17 milliseconds, and we don't know what happened to the bright flash outside of that time frame," Giles said. "But we do know that we didn't see it on an earlier spin or a later spin, so it must have been pretty short-lived."

Previously, UVS had observed a set of eleven bright transient flashes that lasted 1 to 2 milliseconds. They were identified as Transient Luminous Events (TLEs), an upper atmospheric phenomenon triggered by lightning. The team initially thought this bright flash might be a TLE; however, it was different in two key ways. While it was also short-lived, it lasted at least 17 milliseconds, much longer than a TLE. It also had very different spectral characteristics. Spectra of TLEs and auroras feature emissions of molecular hydrogen, the main component of Jupiter's atmosphere. This bolide event had a smooth "blackbody" curve, which is what is expected from a meteor.

"The flash duration and spectral shape match up well with what we expect from an impact," Giles said. "This bright flash stood out in the data, as it had very different spectral characteristics than the UV emissions from Jupiter's auroras. From the UV spectrum, we can see that the emission came from a blackbody with a temperature of 9,600 Kelvin, located at an altitude of 140 miles above the planet's cloud tops. By looking at the brightness of the bright flash, we estimate that it was caused by an impactor with a mass of 550-3,300 pounds."
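The 9,600-Kelvin figure also explains why an ultraviolet instrument caught the flash: by Wien's displacement law, a blackbody at that temperature peaks in the UV. A quick back-of-the-envelope check (the temperature is from the quote above; the constant is the standard CODATA value):

```python
WIEN_B = 2.898e-3                  # Wien's displacement constant, in m*K
T = 9600.0                         # blackbody temperature fitted from the UV spectrum, in K
peak_nm = WIEN_B / T * 1e9         # peak emission wavelength, converted to nanometers
print(f"peak emission = {peak_nm:.0f} nm")
```

The result, roughly 302 nm, falls squarely in the ultraviolet band that UVS observes, consistent with the bolide standing out so clearly in the spectrograph data.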

Comet Shoemaker-Levy 9 was the largest observed Jupiter impactor. The comet broke apart in July 1992 and collided with Jupiter in July 1994; the impacts were closely observed by astronomers worldwide and by the Galileo spacecraft. An SwRI-led team detected impact-related X-ray emissions from Jupiter's northern hemisphere, and prominent scars from the impacts persisted for many months.

"Impacts from asteroids and comets can have a significant impact on the planet's stratospheric chemistry -- 15 years after the impact, comet Shoemaker-Levy 9 was still responsible for 95% of the stratospheric water on Jupiter," Giles said. "Continuing to observe impacts and estimating the overall impact rates is therefore an important element of understanding the planet's composition."

Credit: 
Southwest Research Institute

NASA's Swift helps tie neutrino to star-shredding black hole

image: The Zwicky Transient Facility captured this snapshot of tidal disruption event AT2019dsg, circled, on Oct. 19, 2019.

Image: 
ZTF/Caltech Optical Observatories

For only the second time, astronomers have linked an elusive particle called a high-energy neutrino to an object outside our galaxy. Using ground- and space-based facilities, including NASA's Neil Gehrels Swift Observatory, they traced the neutrino to a black hole tearing apart a star, a rare cataclysmic occurrence called a tidal disruption event.

"Astrophysicists have long theorized that tidal disruptions could produce high-energy neutrinos, but this is the first time we've actually been able to connect them with observational evidence," said Robert Stein, a doctoral student at the Deutsches Elektronen-Synchrotron (DESY) research center in Zeuthen, Germany, and Humboldt University in Berlin. "But it seems like this particular event, called AT2019dsg, didn't generate the neutrino when or how we expected. It's helping us better understand how these phenomena work."

The findings, led by Stein, were published in the Feb. 22 issue of Nature Astronomy and are available online.

Neutrinos are fundamental particles that far outnumber all the atoms in the universe but rarely interact with other matter. Astrophysicists are particularly interested in high-energy neutrinos, which have energies up to 1,000 times greater than those produced by the most powerful particle colliders on Earth. They think the most extreme events in the universe, like violent galactic outbursts, accelerate particles to nearly the speed of light. Those particles then collide with light or other particles to generate high-energy neutrinos. The first confirmed high-energy neutrino source, announced in 2018, was a type of active galaxy called a blazar.

Tidal disruption events occur when an unlucky star strays too close to a black hole. Gravitational forces create intense tides that break the star apart into a stream of gas. The trailing part of the stream escapes the system, while the leading part swings back around, surrounding the black hole with a disk of debris. In some cases, the black hole launches fast-moving particle jets. Scientists hypothesized that tidal disruptions would produce high-energy neutrinos within such particle jets. They also expected the events would produce neutrinos early in their evolution, at peak brightness, whatever the particles' production process.

AT2019dsg was discovered on April 9, 2019, by the Zwicky Transient Facility (ZTF), a robotic camera at Caltech's Palomar Observatory in Southern California. The event occurred over 690 million light-years away in a galaxy called 2MASX J20570298+1412165, located in the constellation Delphinus.

As part of a routine follow-up survey of tidal disruptions, Stein and his team requested visible, ultraviolet, and X-ray observations with Swift. They also took X-ray measurements using the European Space Agency's XMM-Newton satellite and radio measurements with facilities including the National Radio Astronomy Observatory's Karl G. Jansky Very Large Array in Socorro, New Mexico, and the South African Radio Astronomy Observatory's MeerKAT telescope.

Peak brightness came and went in May. No clear jet appeared. According to theoretical predictions, AT2019dsg was looking like a poor neutrino candidate.

Then, on Oct. 1, 2019, the National Science Foundation's IceCube Neutrino Observatory at the Amundsen-Scott South Pole Station in Antarctica detected a high-energy neutrino called IC191001A and backtracked along its trajectory to a location in the sky. About seven hours later, ZTF noted that this same patch of sky included AT2019dsg. Stein and his team think there is only one chance in 500 that the tidal disruption is not the neutrino's source. Because the detection came about five months after the event reached peak brightness, it raises questions about when and how these occurrences produce neutrinos.

"Tidal disruption events are incredibly rare phenomena, only occurring once every 10,000 to 100,000 years in a large galaxy like our own. Astronomers have only observed a few dozen at this point," said Swift Principal Investigator S. Bradley Cenko at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Multiwavelength measurements of each event help us learn more about them as a class, so AT2019dsg was of great interest even without an initial neutrino detection."

For example, tidal disruptions generate visible and UV light in the outer regions of their hot accretion disks. In AT2019dsg, these wavelengths plateaued shortly after they peaked. That was unusual because such plateaus typically appear only after a few years. The researchers suspect the galaxy's monster black hole, with a mass estimated at 30 million times the Sun's, could have forced the stellar debris to settle into a disk more quickly than it might have around a less massive black hole.

AT2019dsg is one of only a handful of known X-ray-emitting tidal disruptions. Scientists think the X-rays come from either the inner part of the accretion disk, close to the black hole, or from high-speed particle jets. The outburst's X-rays faded by an unprecedented 98% over 160 days. Stein's team doesn't see clear evidence indicating the presence of jets and instead suggests rapid cooling in the disk most likely explains the precipitous drop in X-rays.

Not everyone agrees with this analysis. Another explanation, proposed by DESY's Walter Winter and Cecilia Lunardini, a professor at Arizona State University in Tempe, holds that the emission came from a jet that was swiftly obscured by a cloud of debris. The researchers published their alternative interpretation in the same issue of Nature Astronomy.

Astronomers think radio emission in these phenomena comes from the black hole accelerating particles, either in jets or more moderate outflows. Stein's team thinks AT2019dsg falls into the latter category. The scientists also discovered that the radio emission continued steadily for months and did not fade along with the visible and UV light, as previously assumed.

The neutrino detection, combined with the multiwavelength measurements, prompted Stein and his colleagues to rethink how tidal disruptions might produce high-energy neutrinos.

The radio emission shows that particle acceleration happens even without clear, powerful jets and can operate well after peak UV and visible brightness. Stein and his colleagues suggest those accelerated particles could produce neutrinos in three distinct regions of the tidal disruption: in the outer disk through collisions with UV light, in the inner disk through collisions with X-rays, and in the moderate outflow of particles through collisions with other particles.

Stein's team suggests AT2019dsg's neutrino likely originated from the UV-bright outer part of the disk, based on the fact that the particle's energy was more than 10 times greater than can be achieved by particle colliders.

"We predicted that neutrinos and tidal disruptions could be related, and seeing that for the first time in the data is just very exciting," said co-author Sjoert van Velzen, an assistant professor at Leiden University in the Netherlands. "This is another example of the power of multimessenger astronomy, using a combination of light, particles, and space-time ripples to learn more about the cosmos. When I was a graduate student, it was often predicted this new era of astronomy was coming, but now to actually be part of it is very rewarding."

Credit: 
NASA/Goddard Space Flight Center

Politics and the brain: Attention perks up when politicians break with party lines

image: Political psychologist Ingrid Haas has studied how the brain reacts to politicians with positions that are incongruent with their political affiliation.

Image: 
Craig Chandler / University Communication, University of Nebraska-Lincoln

In a time of extreme political polarization, hearing that a political candidate has taken a stance inconsistent with their party might raise some questions for their constituents.

Why don't they agree with the party's position? Do we know for sure this is where they stand?

New research led by University of Nebraska-Lincoln political psychologist Ingrid Haas has shown that the human brain processes politically incongruent statements differently -- attention perks up -- and that the candidate's conviction about the stated position also plays a role.

In other words, a stronger neurological response occurs when, for example, a Republican takes a position favorable to new taxes, or a Democratic candidate adopts an opinion critical of environmental regulation -- but it may be easier for us to ignore these positions when we're not exactly sure where the candidate stands.

Using functional magnetic resonance imaging, or fMRI, at Nebraska's Center for Brain, Biology and Behavior, Haas and her collaborators, Melissa Baker of the University of California, Merced, and Frank Gonzalez of the University of Arizona, examined the insula and anterior cingulate cortex in 58 individuals -- both regions of the brain that are involved with cognitive function -- and found increased activity when the participants read statements incongruent with the candidate's stated party affiliation. The participants were also shown a slide stating how certain the candidates felt about the positions.

"The biggest takeaway is that people paid more attention to uncertainty when it was attached to the consistent information, and they were more likely to dismiss it when it was attached to the inconsistent information," Haas, associate professor of political science, said. "In these brain regions, the most activation was to incongruent trials that were certain.

"If you definitely know that the candidate is deviating from party lines, so to speak, that seemed to garner more response from our participants, whereas if there's a suspicion that they're deviating from party line, but it's attached to more uncertainty, we didn't see participants engaged in so much processing of that information."

Haas said these trials didn't examine what the voter might decide to do with this information, but that participants were paying more attention to incongruent statements overall.

"We didn't look at whether they're less likely to vote for the candidate, but what we show is increased neural activation associated with those trials," Haas said. "They are taking longer to process the information and taking longer to make a decision about how they feel about it. That does seem to indicate that it's garnering more attention."

The research raises a possible answer to the perennial question of why politicians are frequently less explicit in their opinions, or why they may flip-flop on a stated position.

"Our work points to a reason why politicians might deploy uncertainty in a strategic way," Haas said. "If a politician has a position that is definitely incongruent from the party's stated position, the idea is that rather than put that out there, given that people might grasp onto it and pay more attention to it, it might be strategic for them to mask their true positions instead."

The article, "Political uncertainty moderates neural evaluation of incongruent policy positions," was published Feb. 22 in a special issue of Philosophical Transactions B, "The political brain: neurocognitive and computational mechanisms."

Credit: 
University of Nebraska-Lincoln

Oncotarget: MEK is a promising target in the basal subtype of bladder cancer

image: MEK inhibitor response correlates with basal subtype. Average and standard deviation for DSS3 response to (A) Trametinib, (B) TAK-733, (C) Normalized MEK inhibitors, and (D) Average drug response, grouped by cell line subtype. Each point represents an individual cell line. Center line is average and brackets are standard deviation. Significance determined using Mann-Whitney test, *p < 0.05, or Kruskal-Wallis with Dunn test for multiple comparisons, ***p < 0.001.


Oncotarget recently published "MEK is a promising target in the basal subtype of bladder cancer" by Merrill et al., which reported that while many resources exist for the drug screening of bladder cancer cell lines in 2D culture, it is widely recognized that screening in 3D culture is more representative of in vivo response.

To address the need for 3D drug screening of bladder cancer cell lines, the authors screened 17 bladder cancer cell lines using a library of 652 investigational small-molecules and 3 clinically relevant drug combinations in 3D cell culture.

Their goal was to identify compounds and classes of compounds with efficacy in bladder cancer.

Utilizing established genomic and transcriptomic data for these bladder cancer cell lines, they correlated the genomic molecular parameters with drug response, to identify potentially novel groups of tumors that are vulnerable to specific drugs or classes of drugs.

Importantly, the Oncotarget authors demonstrate that MEK inhibitors are a promising targeted therapy for the basal subtype of bladder cancer, and their data indicate that drug screening of 3D cultures provides an important resource for hypothesis generation.


Dr. Matthew B. Soellner and Dr. Sofia D. Merajver from The University of Michigan said, "Bladder cancer is the most frequent cancer of the urinary system in the United States with nearly 82,000 new cases each year and 18,000 deaths, affecting men more often, in a 3:1 ratio."

Bladder cancer can be divided broadly into non-muscle-invasive bladder cancer and muscle-invasive bladder cancer.

The Genomics of Drug Sensitivity in Cancer represents one of the largest screening efforts by number of drugs, testing 19 bladder cancer cell lines against 518 drugs.

Indeed, screening in 3D using ultra-low attachment plates is ideal for bladder cancer cell culture, and this method has been utilized in seminal studies for screening patient-derived organoids to predict patient response to drug treatments.

Therefore, there is a utility in screening bladder cancer cell lines in large drug screens in 3D cultures to identify novel therapeutic options for future testing in PDOs and, ultimately, patients.

Then, utilizing established genomic and transcriptomic data for these bladder cancer cell lines, including prioritized mutations, copy number variants, and RNA-based molecular subtyping, they correlated these molecular parameters with drug response, identifying potentially novel groups of tumors that are vulnerable to specific drugs or classes of drugs.

The Soellner/Merajver Research Team concluded in their Oncotarget Research Paper, "this work is a valuable resource for the identification of experimental therapeutics in bladder cancer, having screened 652 investigational therapeutics and 3 drug combinations in 17 bladder cancer cell lines, using a 3D cell culture format. As next steps, we pose that this work be used to further test additional therapeutic options for patients with bladder cancer. Moreover, this work highlights a need for biomarkers of drug response, beyond mutational data. Lastly, using these methods, we identify MEK inhibitors as a promising therapeutic in the basal bladder cancer subtype. Important future work will investigate the specific molecular features of the basal subtype that make these cells more sensitive to MEK inhibition, and if this MEK sensitivity signature is applicable to other cancer subtypes."


DOI - https://doi.org/10.18632/oncotarget.27767

Full text - https://www.oncotarget.com/article/27767/text/

Correspondence to - Matthew B. Soellner - soellner@umich.edu and Sofia D. Merajver - smerajve@med.umich.edu

Keywords -
bladder cancer,
drug screen,
3D culture,
basal bladder cancer,
MEK inhibition

About Oncotarget

Oncotarget is a biweekly, peer-reviewed, open access biomedical journal covering research on all aspects of oncology.

To learn more about Oncotarget, please visit https://www.oncotarget.com or connect with:

SoundCloud - https://soundcloud.com/oncotarget
Facebook - https://www.facebook.com/Oncotarget/
Twitter - https://twitter.com/oncotarget
LinkedIn - https://www.linkedin.com/company/oncotarget
Pinterest - https://www.pinterest.com/oncotarget/
Reddit - https://www.reddit.com/user/Oncotarget/

Oncotarget is published by Impact Journals, LLC. To learn more, please visit http://www.ImpactJournals.com or connect with @ImpactJrnls

Journal

Oncotarget

DOI

10.18632/oncotarget.27767

Credit: 
Impact Journals LLC

Traditional hydrologic models may misidentify snow as rain, new citizen science data shows

image: From a backcountry area near Lake Tahoe, Desert Research Institute scientist Monica Arienzo collects field data from her smartphone for the Tahoe Rain or Snow project. January 2021.

Image: 
Desert Research Institute

Reno, Nev. (Feb. 22, 2021) -- Normally, we think of the freezing point of water as 32°F -- but in the world of weather forecasting and hydrologic prediction, that isn't always the case. In the Lake Tahoe region of the Sierra Nevada, the shift from snow to rain during winter storms may actually occur at temperatures closer to 39.5°F, according to new research from the Desert Research Institute (DRI), Lynker Technologies, and citizen scientists from the Tahoe Rain or Snow project.

The new paper, published this month in Frontiers in Earth Science, used data collected by 200 volunteer weather spotters to identify the temperature cutoff between rain and snow in winter storms that occurred during the 2020 season. Their results have implications for the accuracy of water resources management, weather forecasting, and more.

"Scientists use a temperature threshold to determine where and when a storm will transition from rain to snow, but if that threshold is off, it can affect our predictions of flooding, snow accumulation, and even avalanche formation," said Keith Jennings, Ph.D., Water Resources Scientist at Lynker Technologies and one of the lead authors on the study.

Previous studies have found that commonly used thresholds are particularly problematic in the Sierra Nevada, where a significant proportion of winter precipitation falls near 32°F. When the temperature is near freezing, weather forecasts and hydrologic models have difficulty correctly predicting whether it will be raining or snowing.

Tahoe Rain or Snow was launched in 2019 to take on the challenge of enhancing the prediction of snow accumulation and rainfall that may lead to flooding by making real-time observations of winter weather. The team is composed of two scientists, one education specialist, and about 200 volunteer weather spotters from the Lake Tahoe and western slope regions of the Sierra Nevada and Truckee Meadows.

"Tahoe Rain or Snow harnesses the power of hundreds of local volunteers. The real-time observations that they share with scientists add an incredible amount of value to the study of hydrology and clarify crucial gaps left by weather models," said Meghan Collins, MS, Education Program Manager for DRI and another lead author on the paper.

In 2020, these citizen scientists submitted over 1,000 timestamped, geotagged observations of precipitation phase through the Citizen Science Tahoe mobile phone app. Ground-based observations submitted by the Tahoe Rain or Snow team in 2020 showed that a much warmer temperature threshold of 39.5°F for splitting precipitation into rain and snow may be more accurate for our mountain region. In contrast, a 32°F rain-snow temperature threshold would have vastly overpredicted rainfall, leading to pronounced underestimates of snow accumulation. Such model errors can lead to issues in water resources management, travel planning, and avalanche risk prediction.
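The practical stakes of the threshold choice can be sketched in a few lines. This is an illustration only: the classification rule is the simple air-temperature cutoff described above, and the observation temperatures are invented, not the project's actual crowdsourced data.

```python
def phase(temp_f, threshold_f):
    """Classify precipitation as snow at or below the threshold, rain above it."""
    return "snow" if temp_f <= threshold_f else "rain"

# hypothetical near-freezing air temperatures (°F) reported during a storm
obs_f = [31.0, 33.5, 35.2, 37.8, 39.0, 41.0]

snow_at_32 = sum(phase(t, 32.0) == "snow" for t in obs_f)
snow_at_39_5 = sum(phase(t, 39.5) == "snow" for t in obs_f)
print(snow_at_32, snow_at_39_5)
```

With a 32°F cutoff only one of the six observations counts as snow; with a 39.5°F cutoff, five do. For a storm hovering near freezing, the choice of threshold flips most of the precipitation from "rain" to "snow," which is exactly the kind of error that propagates into flood and snowpack predictions.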

"Tahoe Rain or Snow citizen scientists across our region open a door to improve our understanding of winter storms," said Monica Arienzo, Ph.D., Assistant Research Professor of Hydrology at DRI and another lead author on the paper. "Growing our team of volunteer scientists is important given that climate change is causing the proportion of precipitation falling as snow to decrease, and they help enhance the predictions of precipitation that we rely on in the Sierra Nevada and Truckee Meadows."

Tahoe Rain or Snow is continuing in 2021. To join, text WINTER to 877-909-0798. You will find out how to download the Citizen Science Tahoe app and receive alerts as to good times to send weather observations. Tahoe Rain or Snow particularly needs observations from sparsely populated, remote, or backcountry areas of the Sierra Nevada.

Credit: 
Desert Research Institute

Stem cells provide hope for dwindling wildlife populations

A paper recently published in the scientific journal Stem Cells and Development shares an important advancement in conservation -- one that may make the difference between survival and extinction for wildlife species that have been reduced to very small population sizes. Using fibroblast cells that have been preserved in San Diego Zoo Global's Frozen Zoo®, scientists have been able to generate induced pluripotent stem cells of northern and southern white rhinoceroses. This important breakthrough is the first step in a complex process for generating gametes from deceased and non-reproductive individuals of these two subspecies.

"For the northern white rhino, which is functionally extinct, the only hope for survival may be in the creation of gametes from cells that were frozen in our labs decades ago," said Marisa Korody, Ph.D., lead author of the study and northern white rhino genetics scientist for San Diego Zoo Global. "What we have just done is taken the first step towards being able to bring this subspecies back to life."

San Diego Zoo Global has fibroblast cell lines from 12 individual northern white rhinos preserved in its Frozen Zoo®. It is thought that the genetic diversity in those frozen samples may be enough to bring the species back from the brink of extinction using this new advancement. However, scientists point out that this emerging technology can only be used if the living cells from critically endangered wildlife species are saved for the future.

"When Dr. (Kurt) Benirschke created the Frozen Zoo more than 40 years ago, we did not know how important it might be for the future," said Oliver Ryder, Ph.D., director of Conservation Genetics at San Diego Zoo Global. "As species are placed increasingly at risk, we now recognize that what we have may be the key to saving these species. However, we will not be able to offer this technology unless we have saved cell lines the way we did with the northern white rhino."

Credit: 
San Diego Zoo Wildlife Alliance

Stroke of luck: Scientists discover target for stroke therapy in blood-brain barrier

image: Scientists from Japan and the United States have identified a new mechanism of blood-brain barrier degradation in the post-stroke brain, involving acrolein-induced modifications of proheparanase. This discovery could lead to the production of newer and more effective drugs for stroke-related disorders.

Image: 
Toubibe from Pixabay

Stroke is a leading cause of death and diminished quality of life in Japan and around the world. Researchers have long been working to identify drug-accessible and effective therapeutic targets for this debilitating condition. One such region of interest for drug targets is the blood-brain barrier (BBB).

The BBB is a structure located around the brain that prevents the entry of unnecessary circulating cells and biomolecules. The blood vessels of the BBB are coated with a distinct, protective layer of sugars, called the endothelial glycocalyx, which helps keep these substances out. However, in the event of a stroke, which results in the blockage or severance of blood vessels in the brain, studies have shown that this glycocalyx, and in turn the integrity of the BBB, becomes compromised. In addition, damage to the blood vessels leads to neuronal death and the build-up of toxic byproducts like acrolein.

A group of researchers from Japan and the United States wanted to explore how the degradation of the glycocalyx takes place during an ischemic stroke. Junior Associate Professor Kyohei Higashi from Tokyo University of Science, one of the researchers, explains the motivation behind the research: "When brain tissue becomes necrotic due to ischemia, the function of the BBB is disrupted and immune cells infiltrate the brain, exacerbating inflammation, but the details of this process are still unclear." For the first time, as detailed in the study published in the Journal of Biological Chemistry, the group of scientists, led by Dr. Higashi, has identified a possible mechanism that links acrolein accumulation to glycocalyx modifications, resulting in damage to the BBB. The team, which also comprised Naoshi Dohmae and Takehiro Suzuki from the RIKEN Center for Sustainable Resource Science, Toshihiko Toida from Chiba University, Kazuei Igarashi from Amine Pharma Research Institute, Robert J. Linhardt from Rensselaer Polytechnic Institute, and Tomomi Furihata from Tokyo University of Pharmacy and Life Sciences, used mouse models of stroke as well as in vitro ("in the lab") experiments with cerebral capillary endothelial cells to accurately study the mechanisms behind the breakdown of the BBB.

The researchers first found that levels of the major sugars in the glycocalyx, heparan sulfate and chondroitin sulfate, decreased in the 'hyperacute phase' after a stroke. They also found increased activity of glycocalyx-degrading enzymes such as hyaluronidase 1 and heparanase. Upon further in vitro investigation using cell lines, they found that acrolein exposure led to the activation of the precursor of heparanase (proHPSE). Specifically, acrolein modified specific amino acid residues on proHPSE, activating it. They concluded that this mechanism likely leads to the degradation of the glycocalyx and the subsequent disruption of the BBB.

The team's discovery is critical, as the acrolein-modified proHPSE could be a novel and potentially effective drug target for post-stroke inflammation. As Dr. Higashi, who is also the corresponding author of the study, speculates, "Because proHPSE, but not HPSE, localizes outside cells by binding with heparan sulfate proteoglycans, acrolein-modified proHPSE represents a promising target to protect the endothelial glycocalyx."

Indeed, further investigation of this mechanism may lead to therapies that are more effective in tackling stroke-related illnesses.

Credit: 
Tokyo University of Science

Polymer film protects from electromagnetic radiation, signal interference

image: A polymer film filled with quasi-1D TaSe3 nanowires.

Image: 
Zahra Barani/UC Riverside

As electronic devices saturate all corners of public and personal life, engineers are scrambling to find lightweight, mechanically stable, flexible, and easily manufactured materials that can shield humans from excessive electromagnetic radiation as well as prevent electronic devices from interfering with each other.

In a breakthrough report published in Advanced Materials--the top journal in the field-- engineers at the University of California, Riverside describe a flexible film using a quasi-one-dimensional nanomaterial filler that combines excellent electromagnetic shielding with ease of manufacture.

"These novel films are promising for high-frequency communication technologies, which require electromagnetic interference shielding films that are flexible, lightweight, corrosion resistant, inexpensive, and electrically insulating," said senior author Alexander A. Balandin, a distinguished professor of electrical and computer engineering at UC Riverside's Marlan and Rosemary Bourns College of Engineering. "They couple strongly to high-frequency radiofrequency radiation while remaining electrically insulating in direct current measurements."

Electromagnetic interference, or EMI, occurs when signals from different electronic devices cross each other, affecting performance. The signal from a cell phone or laptop WiFi, or even a kitchen blender, might cause static to appear on a TV screen, for example. Likewise, airlines instruct passengers to turn off cellphones during landing and takeoff because their signals can disrupt navigation signals.

Engineers long ago learned that any electrical device could possibly influence the functioning of a nearby device and developed materials to shield electronics from interfering signals. But now that electronic devices have become ubiquitous, small, wirelessly connected, and critical to innumerable essential services, the opportunities for and risks of EMI-caused malfunctions have proliferated, and conventional EMI shielding materials are often insufficient. More electronic devices mean humans are also exposed to greater electromagnetic radiation than in the past. New shielding materials will be needed for the next generation of electronics.

Balandin led a team that developed the scalable synthesis of composites with unusual fillers--chemically exfoliated bundles of quasi-one-dimensional van der Waals materials. The composites demonstrated exceptional EMI shielding in the gigahertz and sub-terahertz frequency ranges, important for current and future communication technologies, while remaining electrically insulating.

Graphene is the most famous van der Waals material. It is two-dimensional because it is a plane of strongly bound atoms. Many planes of graphene, weakly coupled by van der Waals forces, make up a bulk graphite crystal. For many years, research was focused specifically on two-dimensional layered van der Waals materials, which exfoliate into planes of atoms.

One-dimensional van der Waals materials consist of strongly bound atomic chains, rather than planes, that are weakly bound to one another by van der Waals forces. Such materials exfoliate into needle-like "one-dimensional" structures rather than two-dimensional planes. The Balandin group conducted pioneering studies of one-dimensional metals, demonstrating their unusual properties. In the new paper, the group reports a chemical process that could be scaled up for mass production of these one-dimensional materials.

Doctoral student Zahra Barani and Fariborz Kargar, a research professor and project scientist with Balandin's Phonon Optimized Engineered Materials, or POEM Center, synthesized the unique composites by chemically treating TaSe3, a transition metal trichalcogenide with a quasi-one-dimensional crystal structure, causing it to shed needle-like, quasi-1D van der Waals nanowires with extremely large aspect ratios of up to ~10^6--up to a million times longer than they are thick. In previous research, the group discovered that bundles of quasi-1D TaSe3 atomic threads can support high current densities.

"There was no standard recipe for exfoliation of these materials. I did many trial and error experiments, while checking the cleavage energy and other important parameters to exfoliate them with high yield. I knew that the key is to get bundles with as high aspect ratio as I can, since EM waves couple with longer and thinner strands better. That required optical microscopy and scanning electron microscopy characterization after each exfoliation step," first-author Barani said.

The researchers filled a matrix made from a special polymer with bundles of the exfoliated TaSe3 to produce a thin, black film. The synthesized composite films, while remaining electrically insulating, demonstrated exceptional performance in blocking electromagnetic waves. The polymer composites with low loadings of the fillers were especially effective.

"The electromagnetic shielding effectiveness of composites is correlated with the aspect ratio of the fillers. The higher the aspect ratio, the lower the filler concentration needed to provide significant EM shielding," Kargar said. "This is beneficial, since by lowering the filler content one would take advantage of inherent properties of polymers such as light weight and flexibility. In this regard, I can say this class of materials are exceptional once they are exfoliated properly, controlling the thickness and length."

"In the end, I got them right, prepared a composite and measured the EMI properties. The results were amazing: no electric conductivity but more than 99.99% of EMI shielding for micrometer thick films," Barani added.

The quasi-1D van der Waals metallic fillers can be produced inexpensively and in large quantities. Balandin said that research on atomic bundles of quasi-1D van der Waals materials as individual conductors, and composites with such materials is just beginning.

"I am sure we will soon see a lot of progress with quasi-1D van der Waals materials, as happened with quasi-2D materials," he said.

Credit: 
University of California - Riverside

Texas A&M-UTMB team identifies potential drug to treat SARS-CoV-2

A federally approved heart medication shows significant effectiveness in interfering with SARS-CoV-2 entry into the human cell host, according to a new study by a research team from Texas A&M University and The University of Texas Medical Branch (UTMB).

The medication bepridil, which goes by the trade name Vascor, is currently approved by the U.S. Food and Drug Administration (FDA) to treat angina, a heart condition.

The team's leaders are College of Science professor Wenshe Ray Liu, professor and holder of the Gradipore Chair in the Department of Chemistry at Texas A&M, and Chien-Te Kent Tseng, professor and director of the SARS/MERS/COVID-19 Laboratory at UTMB. Liu also holds joint faculty positions in Texas A&M's colleges of medicine and agriculture and life sciences.

"Only one medication is currently available, Remdesivir, to provide limited benefits to COVID-19 patients, and the virus may easily evade it," Liu said. "Finding alternative medicines is imperative. Our team screened more than 30 FDA/European Medicines Agency approved drugs for their ability to inhibit SARS-COV-2's entry into human cells. The study found bepridil to offer the most potential for treatment of COVID-19. As a result, we are advocating for the serious consideration of using bepridil in clinical tests related to SARS-CoV-2."

The Texas A&M-UTMB study is now available at the website of the peer-reviewed Proceedings of the National Academy of Sciences of the United States of America (PNAS) and is scheduled for print publication on March 9.

The team, which includes six other researchers from Texas A&M and four from UTMB, now plans to advance their work to animal models with a potential for clinical trials.

Credit: 
Texas A&M University

How outdoor pollution affects indoor air quality

image: Minute-by-minute outdoor and indoor air particulate matter measurements during an August 2018 wildfire event.

Image: 
Daniel Mendoza.

Just when you thought you could head indoors to be safe from the air pollution that plagues the Salt Lake Valley, new research shows that elevated air pollution events, like horror movie villains, claw their way into indoor spaces. The research, conducted in conjunction with the Utah Division of Facilities Construction and Management, is published in Science of the Total Environment.

In a long-term study in a Salt Lake-area building, researchers found that the amount of air pollution that comes indoors depends on the type of outdoor pollution. Wildfires, fireworks and wintertime inversions all affect indoor air to different degrees, says Daniel Mendoza, a research assistant professor in the Department of Atmospheric Sciences and visiting assistant professor in the Department of City & Metropolitan Planning. The study is unique, Mendoza says, combining a long-term indoor air quality monitoring project with paired outdoor measurements and research-grade instruments.

"We all know about the inversions," Mendoza says. "We all know how large of a problem wildfires are. But do we really know what happens when we're inside?"

The setup

Mendoza, who also holds appointments as an adjunct assistant professor in the Pulmonary Division at the School of Medicine and as a senior scientist at the NEXUS Institute, and his colleagues set up their air monitoring equipment at the Unified State Laboratories in Taylorsville, Utah. They placed three sensors to measure airborne concentrations of particulate matter: One on the roof to measure outdoor air, one in the air handling room--where the outdoor air comes in--and one in an office. The building uses a 100% outside air filtration system; this is not typical for most commercial buildings, which usually use some amount of recirculated air.

The sensors stayed in place from April 2018 to May 2019, just over a year. In the Salt Lake Valley, a year's air quality events include fireworks-laden holidays on Independence Day and Pioneer Day (July 24), smoke from wildfires throughout the West that settles in the bowl-like valley and wintertime inversions in which the whole valley's emissions are trapped in a pool of cold air.

Through it all, the team's sensors kept watch. Amid the expected events, however, a private fireworks show took place on Aug. 17, 2018, within five miles of the study building, providing an unexpected research opportunity. More on that later.

Inversions


Minute-by-minute outdoor and indoor air particulate matter measurements during a December 2018 inversion.

During a wintertime inversion event in December, as the Air Quality Index outdoors reached orange and red levels, the indoor air quality reached yellow levels and stayed there until the inversion cleared. In all, the pollution levels inside were about 30% of what they were outside.

That's not surprising, Mendoza says. During inversions, only around 20% of the air pollution is what's called primary pollution--the particulate matter that comes directly from combustion exhaust. The rest is secondary--formed as gases undergo chemical reactions under specific meteorological conditions and combine to form solid particulates. As soon as the air comes indoors, those meteorological conditions change.

"That changes the chemical environment for these particles and they actually dissociate," Mendoza says. "That's what we're suspecting is happening when these particles come into the building and that's why we don't observe them."

Wildfires


Minute-by-minute outdoor and indoor air particulate matter measurements during an August 2018 wildfire event.

In late August 2018, when three active wildfires were burning in California, indoor air pollution rose to about 78% of outside pollution levels.

"For nearly 48 hours," the researchers wrote, "indoor air quality reached levels considered problematic for health compromised populations and nearly reached levels considered unsafe for all populations."

It's important to note, though, that thanks to the building's air handling system, the air is still safer inside than outside.

The reason for the higher infiltration of particulate matter, Mendoza says, is that smoke particles are stable and don't break down in different temperature and humidity conditions.

"We see those particles travel straight through the system," Mendoza says, "because there's no specific filtration that blocks out these particles. Smoke particles can also be smaller in size; that's why they're so dangerous for us."

Fireworks


Minute-by-minute outdoor and indoor air particulate matter measurements during the 2018 Independence Day holiday.

Utah has two major fireworks holidays: July 4 and July 24 (Pioneer Day). But the researchers happened to catch a signal from a private fireworks event just a few weeks before the wildfire smoke event, providing an opportunity to see how fireworks shows, both large and small, affected indoor air quality.

The smoke from fireworks is somewhere between inversion pollution and wildfire smoke. It contains primary smoke particles as well as gases that can combine to produce secondary particulates, some of which come from the chemicals used to produce fireworks' bright colors.

On the night of July 4, 2018, air quality sharply deteriorated once fireworks shows began and stayed in the red range, with spikes into the purple "very unhealthy" range, for about three hours. Indoor air quality reached orange levels, registering about 30% of the outdoor air pollution.

"It was only after 8 a.m. on July 5 that indoor air quality returned to pre-fireworks levels," the researchers write.


Minute-by-minute outdoor and indoor air particulate matter measurements during a private fireworks event.

The private fireworks show on August 17 lasted only 30 minutes, and although the scope was much smaller, the smoke was still enough to raise the indoor air quality index to orange for several minutes.

"Even a 'small' fireworks show did have a marked impact on indoor air quality," Mendoza says. That matters to people with respiratory challenges who can see large-scale, poor air quality events like inversions and fireworks holidays coming--but who might find private fireworks shows an unpleasant surprise.

The commercial building that the researchers studied is a somewhat controlled environment. Learning about indoor air quality in homes will be a greater challenge. "You have kids coming in with mud or with dirt on their feet, you have vacuuming and cooking. So that's going to be our next step," Mendoza says. As many people are spending more time at home due to the COVID-19 pandemic, the research will hopefully help clarify what actions people can take to improve their indoor air quality.

"There is a lot of opportunity to reduce the pollutants that reach occupants in buildings, both commercial and residential," says Sarah Boll, assistant director of the Utah Division of Facilities Construction and Management. "To me, that is the great part of this work--with more research it can point the way to protecting people indoors."

Credit: 
University of Utah

Researchers learn that pregnant women pass along protective COVID antibodies to their babies

Antibodies that guard against COVID-19 can transfer from mothers to babies while in the womb, according to a new study from Weill Cornell Medicine and NewYork-Presbyterian researchers published in the American Journal of Obstetrics and Gynecology.

This discovery, published Jan. 22, adds to growing evidence that suggests that pregnant women who generate protective antibodies after contracting the coronavirus often convey some of that natural immunity to their fetuses. The findings also lend support to the idea that vaccinating mothers-to-be may also have benefits for their newborns.

"Since we can now say that the antibodies pregnant women make against COVID-19 have been shown to be passed down to their babies, we suspect that there's a good chance they could pass down the antibodies the body makes after being vaccinated as well," said Dr. Yawei Jenny Yang, an assistant professor of pathology and laboratory medicine at Weill Cornell Medicine and the study's senior author.

Dr. Yang and her team analyzed blood samples from 88 women who gave birth at NewYork-Presbyterian/Weill Cornell Medical Center between March and May 2020, a time when New York City was the global epicenter of the pandemic. All of the women had COVID-19 antibodies in their blood, indicating that they had contracted the virus at some point even though 58 percent of those women had no symptoms. Furthermore, while antibodies were detected in both symptomatic and asymptomatic women, the researchers observed that the concentration of antibodies was significantly higher in symptomatic women. They also found that the general pattern of antibody response was similar to the response seen in other patients, confirming that pregnant women have the same kind of immune response to the virus as the larger patient population--something that hadn't previously been known for sure, since a woman's immune system changes throughout pregnancy.

In addition, the vast majority of the babies born to these women--78 percent--had detectable antibodies in their umbilical cord blood. There was no evidence that any of the infants had been directly infected with the virus and all were COVID negative at the time of birth, further indicating that the antibodies had crossed the placenta--the organ that provides oxygen and nutrients to a growing baby during pregnancy--into the fetal bloodstream. Newborns with symptomatic mothers also had higher antibody levels than those whose mothers had no COVID symptoms.

This data implies that pregnant women could pass along vaccine-generated antibodies in the same way, potentially shielding both mother and child from future infection. However, it is not yet known exactly how protective these antibodies might be, or how long that protection might last. Dr. Laura Riley, chair of the Department of Obstetrics and Gynecology at Weill Cornell Medicine, obstetrician and gynecologist-in-chief at NewYork-Presbyterian/Weill Cornell and one of the study's co-authors, is still advising pregnant patients who decide to get vaccinated to continue to follow current safety guidelines to prevent the spread of the disease. Dr. Riley, Dr. Yang and their colleagues are leading follow-up investigations that are currently enrolling pregnant women who receive the vaccine, as well as vaccinated mothers who are breastfeeding, to assess the antibody response in those groups after vaccination. That information could help guide maternal vaccination strategies moving forward.

"The $1 million question is: Will the group of women who are now being vaccinated get the same type of protection? We don't know that yet," Dr. Riley said. "Getting those answers is going to be really important."

Credit: 
Weill Cornell Medicine

Study: Effects of past ice ages more widespread than previously thought

image: The extent of frost cracking in modern North America and during the last glacial maximum

Image: 
Jill A. Marshall

FAYETTEVILLE, Ark. - Cold temperatures, prevalent during glacial periods, had a significant impact on past and modern unglaciated landscapes across much of North America, according to a recent study by University of Arkansas geologist Jill A. Marshall.

Marshall, assistant professor of geosciences, is the first author of the study, published in the journal Geophysical Research Letters.

The findings help shape understanding of the earth's "Critical Zone," the relatively thin layer of the planet that extends from where vegetation meets the atmosphere to the lowermost extent of weathered bedrock. "Climate and ecosystems determine how quickly bedrock weathers, how soil is produced, how sediment moves on land and in rivers and other factors that shape the landscape," the authors wrote.

In cold lands, such as Alaska today, frost can crack or weather rock that is at or near the surface of the earth, making it more porous and turning solid rock into sediment. By applying a frost-weathering model to North America paleoclimate simulations tracking temperatures during the Last Glacial Maximum approximately 21,000 years ago, Marshall and her team determined that a large swath of North America, from Oregon to Georgia and as far south as Texas and Arkansas, was likely affected by such periglacial processes.

While permafrost landscapes like the modern Arctic experience frozen ground for two years or more, periglacial landscapes, though not permanently frozen, experience below-freezing temperatures for much of the year. Though the evidence of past periglacial processes is easily hidden by vegetation and/or erased by subsequent geological processes, the team's results suggest that frost weathering (and by extension other periglacial processes) covered an area about 3.5 times larger than the mapped extent of permafrost during the Last Glacial Maximum. This predicted influence of past cold climates on belowground weathering may significantly affect modern landscape attributes that we depend on, such as soil thickness and water storage.
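Frost-weathering models of this kind typically flag terrain whose temperatures spend time in a "frost-cracking window" a few degrees below freezing, where ice growth in rock pores is most damaging. The window bounds and monthly-mean test below are illustrative assumptions, not the parameters of Marshall's actual model:

```python
# Hypothetical sketch: flag a grid cell as frost-cracking-active if
# any monthly mean temperature falls inside an assumed frost-cracking
# window. The bounds are illustrative, not the paper's values.
FCW_LOW, FCW_HIGH = -8.0, -3.0  # deg C, assumed window

def frost_active(monthly_temps_c):
    """True if any monthly mean temperature lies in the window."""
    return any(FCW_LOW <= t <= FCW_HIGH for t in monthly_temps_c)

# A cell with mild winters vs. one with periglacial winters:
mild = [2, 4, 9, 14, 19, 24, 27, 26, 21, 14, 8, 3]
cold = [-12, -9, -5, 1, 8, 14, 17, 16, 10, 3, -4, -10]
print(frost_active(mild), frost_active(cold))  # False True
```

Applied cell by cell to Last Glacial Maximum paleotemperature grids, a mask like this is how a swath of affected terrain much larger than the mapped permafrost extent could emerge.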

"Based on the widespread occurrence of glacial-period frost weathering over meter-scale depths, we suggest that past cold climates have had a significant impact on modern landscapes, both through lingering impact on subsurface pathways for water and thus chemical weathering, and the rock damage that contributes to the rate at which rock disaggregates into sediment and potential instability due to non-steady rates of hillslope and river processes," the paper states.

Credit: 
University of Arkansas

New sensor paves way to low-cost sensitive methane measurements

image: Researchers have developed a new sensor that uses an interband cascade light emitting device (ICLED) and could allow practical and low-cost detection of low concentrations of methane.

Image: 
Sameer Khan

WASHINGTON -- Researchers have developed a new sensor that could allow practical and low-cost detection of low concentrations of methane gas. Measuring methane emissions and leaks is important to a variety of industries because the gas contributes to global warming and air pollution.

"Agricultural and waste industries emit significant amounts of methane," said Mark Zondlo, leader of the Princeton University research team that developed the sensor. "Detecting methane leaks is also critical to the oil and gas industry for both environmental and economic reasons because natural gas is mainly composed of methane."

In The Optical Society (OSA) journal Optics Express, researchers from Princeton University and the U.S. Naval Research Laboratory demonstrate their new gas sensor, which uses an interband cascade light emitting device (ICLED) to detect methane concentrations as low as 0.1 parts per million. ICLEDs are a new type of higher-power LED that emits light at mid-infrared (IR) wavelengths, which can be used to measure many chemicals.

"We hope that this research will eventually open the door to low-cost, accurate and sensitive methane measurements," said Nathan Li, first author of the paper. "These sensors could be used to better understand methane emissions from livestock and dairy farms and to enable more accurate and pervasive monitoring of the climate crisis."

Building a less expensive sensor
Laser-based sensors are currently the gold standard for methane detection, but they cost between USD 10,000 and 100,000 each. A sensor network that detects leaks across a landfill, petrochemical facility, wastewater treatment plant or farm would be prohibitively expensive to implement using laser-based sensors.

Although methane sensing has been demonstrated with mid-IR LEDs, performance has been limited by the low light intensities generated by available devices. To substantially improve the sensitivity and develop a practical system for monitoring methane, the researchers used a new ICLED developed by Jerry Meyer's team at the U.S. Naval Research Laboratory.

"The ICLEDs we developed emit roughly ten times more power than commercially available mid-IR LEDs had generated, and could potentially be mass-produced," said Meyer. "This could enable ICLED-based sensors that cost less than USD 100 per sensor."

To detect methane, the new sensor compares infrared light transmitted through clean, methane-free air with light transmitted through air that contains methane. To boost sensitivity, the researchers sent the infrared light from the high-power ICLED through a 1-meter-long hollow-core fiber containing an air sample. The inside of the fiber is coated with silver, which causes the light to reflect off its surfaces as it travels down the fiber to the photodetector at the other end. This allows the light to interact with additional methane molecules in the air, resulting in higher absorption of the light.
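This with/without comparison follows the Beer-Lambert law, in which absorption grows with concentration and with optical path length--the reason the 1-meter fiber helps. A minimal sketch; the absorption coefficient below is an illustrative placeholder, not a value from the paper:

```python
import math

def transmitted_fraction(alpha_per_m_ppm: float, conc_ppm: float,
                         path_m: float) -> float:
    """Beer-Lambert law: I/I0 = exp(-alpha * C * L), where alpha is an
    effective absorption coefficient, C the methane concentration, and
    L the optical path length."""
    return math.exp(-alpha_per_m_ppm * conc_ppm * path_m)

# Illustrative effective absorption coefficient (per metre per ppm);
# the real value depends on the ICLED wavelength band and methane's
# line strengths.
ALPHA = 4e-4

for L in (0.1, 1.0):  # short path vs. the 1-meter hollow-core fiber
    frac = transmitted_fraction(ALPHA, 100.0, L)
    print(f"path {L} m: transmitted fraction {frac:.4f}")
```

A longer path dims the transmitted light more for the same concentration, so weaker concentrations produce a measurable dip, which is the design logic behind both multipass mirror cells and the hollow-core fiber used here.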

"Mirrors are commonly used to bounce light back and forth multiple times to increase sensor sensitivity but can be bulky and require precise alignment," said Li. "Hollow core fibers are compact, require low volumes of sample gas and are mechanically flexible."

Measuring up to laser-based sensors
To test the new sensor, the researchers flowed known concentrations of methane into the hollow core fiber and compared the infrared transmission of the samples with state-of-the-art laser-based sensors. The ICLED sensor was able to detect concentrations as low as 0.1 parts per million while showing excellent agreement with both calibrated standards and the laser-based sensor.

"This level of precision is sufficient to monitor emissions near sources of methane pollution," said Li. "An array of these sensors could be installed to measure methane emissions at large facilities, allowing operators to affordably and quickly detect leaks and mitigate them."

The researchers plan to improve the design of the sensor to make it practical for long-term field measurements by investigating ways to increase the mechanical stability of the hollow-core fiber. They will also study how extreme weather conditions and changes in ambient humidity and temperature might affect the system. Because most greenhouse gases, and many other chemicals, can be identified by using mid-IR light, the methane sensor could also be adapted to detect other important gases.

Credit: 
Optica

'Walking' molecule superstructures could help create neurons for regenerative medicine

image: Confocal image of superstructure bundle

Image: 
Stupp Lab / Northwestern University

Imagine if surgeons could transplant healthy neurons into patients living with neurodegenerative diseases or brain and spinal cord injuries. And imagine if they could "grow" these neurons in the laboratory from a patient's own cells using a synthetic, highly bioactive material that is suitable for 3D printing.

By discovering a new printable biomaterial that can mimic properties of brain tissue, Northwestern University researchers are now closer to developing a platform capable of treating these conditions using regenerative medicine.

A key ingredient of the discovery is the ability to control the self-assembly processes of molecules within the material, enabling the researchers to modify the structure and functions of the systems from the nanoscale up to the scale of visible features. The laboratory of Samuel I. Stupp published a 2018 paper in the journal Science showing that materials can be designed with highly dynamic molecules programmed to migrate over long distances and self-organize into larger, "superstructured" bundles of nanofibers.

Now, a research group led by Stupp has demonstrated that these superstructures can enhance neuron growth, an important finding that could have implications for cell transplantation strategies for neurodegenerative diseases such as Parkinson's and Alzheimer's disease, as well as spinal cord injury.

"This is the first example where we've been able to take the phenomenon of molecular reshuffling we reported in 2018 and harness it for an application in regenerative medicine," said Stupp, the lead author on the study and the director of Northwestern's Simpson Querrey Institute. "We can also use constructs of the new biomaterial to help discover therapies and understand pathologies."

A pioneer of supramolecular self-assembly, Stupp is also the Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering and holds appointments in the Weinberg College of Arts and Sciences, the McCormick School of Engineering and the Feinberg School of Medicine.

The paper was published today (Feb. 22) in the journal Advanced Science.

Walking molecules and 3D printing

The new material is created by mixing two liquids that quickly become rigid. The rigidity arises from interactions known in chemistry as host-guest complexes, which mimic key-lock interactions among proteins, and from the concentration of these interactions in micron-scale regions through the long-range migration of "walking molecules."

The agile molecules cover a distance thousands of times larger than themselves in order to band together into large superstructures. At the microscopic scale, this migration causes a transformation in structure from what looks like an uncooked chunk of ramen noodles into ropelike bundles.

"Typical biomaterials used in medicine like polymer hydrogels don't have the capabilities to allow molecules to self-assemble and move around within these assemblies," said Tristan Clemons, a research associate in the Stupp lab and co-first author of the paper with Alexandra Edelbrock, a former graduate student in the group. "This phenomenon is unique to the systems we have developed here."

Furthermore, as the dynamic molecules move to form superstructures, large pores open that allow cells to penetrate and interact with bioactive signals that can be integrated into the biomaterials.

Interestingly, the mechanical forces of 3D printing disrupt the host-guest interactions in the superstructures and cause the material to flow, but it can rapidly solidify into any macroscopic shape because the interactions are restored spontaneously by self-assembly. This also enables the 3D printing of structures with distinct layers that harbor different types of neural cells in order to study their interactions.

Signaling neuronal growth

The superstructure and bioactive properties of the material could have vast implications for tissue regeneration. Neurons are stimulated by a protein in the central nervous system known as brain-derived neurotrophic factor (BDNF), which helps neurons survive by promoting synaptic connections and making them more plastic. BDNF could be a valuable therapy for patients with neurodegenerative diseases and spinal cord injuries, but the protein degrades quickly in the body and is expensive to produce.

One of the molecules in the new material integrates a mimic of this protein that activates its receptor, known as TrkB, and the team found that neurons actively penetrate the large pores and populate the new biomaterial when the mimetic signal is present. This could also create an environment in which neurons differentiated from patient-derived stem cells mature before transplantation.

Now that the team has demonstrated a proof of concept with neurons, Stupp believes the platform could be extended to other areas of regenerative medicine by applying different chemical sequences to the material. Simple chemical changes in the biomaterials would allow them to provide signals for a wide range of tissues.

"Cartilage and heart tissue are very difficult to regenerate after injury or heart attacks, and the platform could be used to prepare these tissues in vitro from patient-derived cells," Stupp said. "These tissues could then be transplanted to help restore lost functions. Beyond these interventions, the materials could be used to build organoids to discover therapies or even directly implanted into tissues for regeneration since they are biodegradable."

Credit: 
Northwestern University