
ITMO University scientists develop a tool for wireless charging of multiple devices

Researchers from the Faculty of Physics and Engineering managed to achieve simultaneous power transfer at various frequencies with the help of a metasurface. The development makes it possible to simultaneously charge devices from different manufacturers that use different power transfer standards. The paper was published in Applied Physics Letters.

When we need to borrow a charger for our device, we often find that different manufacturers use different charger connectors. Wireless charging isn't a solution either: companies use different power transfer systems that operate at different frequencies.

"There are various wireless power transfer standards with different frequencies, so you can't just use a charger by any manufacturer," says Polina Kapitanova, a researcher at ITMO University's Department of Physics and Engineering. "For example, Huawei uses one wireless power transfer frequency for mobile phones and another for smart glasses, so you can't charge these devices with the same charger."

This fragmentation is inefficient. Many researchers are working on wireless charging surfaces that can power several devices at once. One such team is based at ITMO University's Department of Physics and Engineering.

"What we propose is a brand-new metasurface that can be used as a transmitter in the wireless power transfer system that would allow users to charge several devices at once," says Polina Kapitanova. "This surface can be used at one frequency or at several."

The designed metasurface is made of conductors arranged in a special pattern and connected by capacitors that tune the structure to the necessary frequency. Such a system can cover quite a large area, so it could be built into a table or a nightstand that functions as a big charger.
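As a rough illustration of how capacitors set the operating frequency of such a conductor-and-capacitor structure (this is a generic LC-resonance estimate with hypothetical component values, not the authors' actual design), the resonant frequency follows f = 1 / (2π√(LC)):

```python
import math

def resonant_frequency_hz(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of a simple LC circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical values: a 1-uH conductor loop tuned with different capacitors.
for c in (1e-9, 4e-9):  # 1 nF and 4 nF (illustrative only)
    f = resonant_frequency_hz(1e-6, c)
    print(f"C = {c * 1e9:.0f} nF -> f = {f / 1e6:.2f} MHz")
```

Swapping the capacitor value retunes the same conductor layout to a different frequency, which is the basic knob a multi-standard charging surface has to offer.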

"As it turned out, this structure has unique properties, including reverse frequency dispersion that can be efficiently applied in wireless power transfer," explains Polina Kapitanova. "This structure has several modes (resonant frequencies) that have a uniform magnetic field. It allows us to transfer energy wirelessly. At the same time, the electric field is hidden at the edges of the structure, at the capacitors, and it's safer for users that way."

This concept is a part of a promising smart table project by scientists from ITMO University's Department of Physics and Engineering. They created a prototype of the metasurface and studied its properties at different frequencies.

"In this paper, we present a demo version: we place several receiving resonators loaded with light-emitting diodes, each with a different working frequency, on the metasurface," says Mingzhao Song, a researcher at the Department of Physics and Engineering. "The diodes light up regardless of the position and orientation of the receivers, which means that energy gets transferred."

Next, the scientists need to quantify how much the electric field is reduced in order to make the charger safer and faster.

Credit: 
ITMO University

Novel PROTAC enhances its intracellular accumulation and protein knockdown

Cancer therapies sometimes involve drugs that mediate the breakdown of specific intracellular proteins that participate in cancer formation and proliferation. Proteolysis-targeting chimeras, or PROTACs, are a promising type of protein-degrader molecules, but their effectiveness has been challenged by their limited ability to accumulate inside cells.

In this study published in the journal Nature Communications, a team led by researchers at Baylor College of Medicine developed an improved type of PROTAC with enhanced intracellular accumulation that functions not only as a degrader but also as an inhibitor of the target protein. This discovery led to a recently funded research grant from the National Cancer Institute (NCI) scored in the top one percentile. The researchers hope that their work can be used to develop optimal PROTACs for clinical applications in the future.

"PROTACs are molecules made of two parts: one binds to the target protein and the other links to the enzyme ubiquitin ligase. The two parts are joined by a chemical linker," said corresponding author Dr. Jin Wang, associate professor of pharmacology and chemical biology and of molecular and cellular biology at Baylor.

Once a PROTAC binds to its target protein and the ubiquitin ligase, the ligase attaches ubiquitin groups to the surface of the target protein, tagging it for degradation by the proteasome inside the cell.

Wang and his colleagues were interested in improving PROTAC's ability to degrade target proteins inside cells. Drawing from their years of expertise in the fields of organic chemistry and chemical biology, they experimented with different chemical binders to determine how they affected PROTAC's efficacy. Serendipitously, they discovered that a specific type of chemistry can enhance PROTAC's intracellular accumulation.

Making an improved PROTAC

The researchers worked with a well-known target of PROTACs, the enzyme Bruton's tyrosine kinase (BTK). They used different types of binders (reversible noncovalent, reversible covalent, and irreversible covalent) to construct PROTACs that targeted BTK.

"When we compared the different constructs in their ability to degrade BTK, we were excited to discover that the cyano-acrylamide-based reversible covalent chemical binder significantly enhanced the intracellular accumulation and target engagement of PROTACs, much better than the others," said co-first author Dr. Wen-Hao Guo, a postdoctoral associate in the Wang lab.

"Furthermore, we developed RC-1, a reversible covalent BTK PROTAC that was effective as both a BTK inhibitor and degrader," said co-first author Dr. Xiaoli Qi, assistant professor in pharmacology and chemical biology at Baylor. "This represents a novel mechanism of action for PROTACs. Our work suggests the possibility that this strategy to improve PROTACs can be applied to target other molecules."

Credit: 
Baylor College of Medicine

NASA sees typhoon Bavi from one million miles away

image: Typhoon Bavi was moving through the Yellow Sea on Aug. 25, 2020, when an image of it was captured from 1 million miles away. This full-disk image of the Earth was taken by NASA's EPIC camera aboard NOAA's DSCOVR satellite.

Image: 
NASA/NOAA

Typhoon Bavi is a large storm moving through the Yellow Sea. A NASA camera captured an image of the Northwestern Pacific Ocean that showed Bavi headed north.

NASA's Earth Polychromatic Imaging Camera (EPIC), a four-megapixel CCD camera and telescope aboard NOAA's DSCOVR satellite in orbit 1 million miles from Earth, captured a full-disk image of the Northwestern Pacific Ocean side of the globe. Typhoon Bavi was moving through the Yellow Sea on Aug. 25, 2020, when the image was captured.

EPIC maintains a constant view of the fully illuminated Earth as it rotates, providing scientific observations of ozone, vegetation, cloud height and aerosols in the atmosphere.  DSCOVR is a partnership between NASA, NOAA and the U.S. Air Force with the primary objective of maintaining the nation's real-time solar wind monitoring capabilities, which are critical to the accuracy and lead time of space weather alerts and forecasts from NOAA.

On Aug. 26 at 4 a.m. EDT (0900 UTC), Typhoon Bavi was located near latitude 32.4 degrees north and longitude 124.5 degrees east, about 169 nautical miles east-southeast of Shanghai, China. Bavi had maximum sustained winds near 100 knots (115 mph/185 kph) and was moving to the north-northwest.
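The parenthetical wind speeds above are straightforward unit conversions from knots. A minimal sketch of the arithmetic (1 knot = 1.15078 statute mph and exactly 1.852 km/h):

```python
def knots_to_mph(knots: float) -> float:
    """Convert knots to statute miles per hour (1 kt = 1.15078 mph)."""
    return knots * 1.15078

def knots_to_kph(knots: float) -> float:
    """Convert knots to kilometers per hour (1 kt = 1.852 km/h, exact)."""
    return knots * 1.852

# Bavi's reported peak winds of 100 knots:
print(round(knots_to_mph(100)))  # 115
print(round(knots_to_kph(100)))  # 185
```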

Bavi is moving north, and the Joint Typhoon Warning Center noted the storm has reached peak intensity. The storm is expected to weaken and become extratropical as it makes landfall in western North Korea and northeastern China.

Credit: 
NASA/Goddard Space Flight Center

Nanodots made of photovoltaic material support waveguide modes

image: New spectroscopic technique for studying nanostructures demonstrates that stibnite nanodots can act as high-optical-quality waveguides and are promising candidates as photoswitchable materials for future applications

Image: 
Zhan et al., doi: 10.1117/1.AP.2.4.046004

Antimony sulfide, or stibnite (Sb2S3), has been investigated intensively in recent years as a promising material for nontoxic, environmentally friendly solar cells. It is now possible to fabricate thin photovoltaic films from an ink containing nanoparticles of stibnite, and to nanopattern those films into 2D and 3D structures of virtually any shape. Such simple, cost-effective production methods fulfill prerequisites for reliable, widespread use.

Since stibnite is an effective semiconductor (i.e., it has a high absorption coefficient and carrier mobility), its nanostructure holds promise as a photoswitchable material for all-optical signal processing and computing. Petra Groß, researcher at the Institute for Physics at University of Oldenburg explains, "Illumination with near-infrared light, with wavelengths for which stibnite is largely transparent, can result in an ultrafast change of its refractive index. This means that a surface patterned with stibnite nanoparticles could enable optical properties such as reflection or color appearance to be switched by an infrared light pulse."

If stibnite nanostructures are to be used in switchable nanodevices, high optical quality is essential. A recent study published in Advanced Photonics investigated the optical properties of stibnite nanostructures. The study demonstrated that stibnite nanodots can act as high-optical-quality waveguides. This finding, together with the easy 2D and 3D structuring capabilities and interesting optical properties, indicates strong potential for stibnite nanostructures as switchable materials for future applications.

Stibnite nanodots

The lead author of the study, Jinxin Zhan, is currently a doctoral student in the Near-Field Photonics Laboratory of Professor Christoph Lienau at University of Oldenburg. Zhan explains that electron microscope images of stibnite indicate a rather uneven surface. Collaborating with researchers at University of Konstanz, Zhan and her team aimed to estimate the optical properties of the stibnite nanostructure by investigating stibnite nanodots (400-nm diameter) atop a stibnite surface.

Zhan says, "Such an optical inspection is difficult. The size of the nanostructures is usually smaller than the wavelength of visible light, such that spectroscopic measurements are typically performed only on ensembles of several nanostructures."

Nanoparticle focus

To achieve the difficult optical inspection, Zhan and her team developed a novel kind of near-field spectroscopy that allows optical study of single nanoparticles. It is based on scattering-type scanning near-field optical microscopy (s-SNOM), where a gold probe with a sharp tip of about 10-nm radius of curvature is brought close to the nanostructure's surface and scanned across it. The light scattered away from the structure by the tip is collected by a detector.

Zhan notes, "Usually, there is a large amount of background light present, which we suppress by modulating the tip-sample distance and by mixing the scattered light with a broadband reference laser. A monochromator equipped with a fast line camera enables us to measure complete spectra at every position while raster scanning." The spectral bandwidth is 200 nm, and the spatial resolution is about 20 nm, such that the team can study the optical properties, or spectrally resolved intensity profiles, within individual nanodots.

The resulting maps of the stibnite nanoparticles revealed that they act as high-refractive-index dielectric waveguides, despite their irregular surface apparent in structural studies. Zhan explains further, "With our new method, we see mode profiles across the nanodots that are very similar to the mode profiles of guided waves in optical glass fibers. A calculation shows that a cylindrical waveguide of stibnite with 400-nm diameter should support four modes. A calculated superposition of these four lowest-order modes matches our experimental observation very well. These modes are supported over the whole 200-nm bandwidth of our near-field spectroscopy measurement."
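A mode count like the one Zhan describes can be reproduced in rough form with a textbook step-index waveguide estimate: compute the normalized frequency V and count the vector modes whose cutoffs lie below it. The numbers below (core index ~2.7, a 900-nm probe wavelength, air cladding) are illustrative assumptions, not values from the paper:

```python
import math

def v_number(core_radius_m: float, wavelength_m: float,
             n_core: float, n_clad: float) -> float:
    """Normalized frequency V of a step-index cylindrical waveguide."""
    return (2 * math.pi * core_radius_m / wavelength_m) * math.sqrt(n_core**2 - n_clad**2)

def vector_mode_count(v: float) -> int:
    """Count guided vector modes using standard step-index cutoffs.

    HE11 has no cutoff; TE01, TM01, and HE21 cut off near V = 2.405
    (first zero of J0); the next group cuts off near V = 3.832.
    """
    cutoffs = [0.0, 2.405, 2.405, 2.405, 3.832, 3.832]
    return sum(1 for c in cutoffs if v > c)

# Hypothetical parameters: 400-nm-diameter dot, lambda = 900 nm, n ~ 2.7 vs. air.
v = v_number(200e-9, 900e-9, 2.7, 1.0)
print(f"V = {v:.2f}, guided modes: {vector_mode_count(v)}")
```

With these assumed values, V comes out near 3.5, which sits between the first and second cutoff groups and yields four guided modes, consistent with the four lowest-order modes mentioned in the quote.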

Lienau noted that this novel technique offers a totally new way of "seeing" minute amounts of nanomaterials and opens the door towards studying the dynamics of their optical excitations on ultrafast time scales. He says, "The spectroscopic technique developed by Jinxin Zhan and Petra Groß is exceptionally promising. Already now, the team has demonstrated local light scattering spectroscopy with deep subwavelength resolution and high sensitivity. We are confident that we will be able to further improve the spatial resolution to the few-nanometer range quickly."

Credit: 
SPIE--International Society for Optics and Photonics

Army scientists take new spin on quantum research

Army researchers discovered a way to further enhance quantum systems to provide Soldiers with more reliable and secure capabilities on the battlefield.

Specifically, this research informs how future quantum networks will be designed to deal with the effects of noise and decoherence, the loss of information from a quantum system into its environment.

As one of the U.S. Army's priority research areas in its Modernization Strategy, quantum research will help transform the service into a multi-domain force by 2035 and deliver on its enduring responsibility as part of the joint force providing for the defense of the United States.

"Quantum networking, and quantum information science as a whole, will potentially lead to unsurpassed capabilities in computation, communication and sensing," said Dr. Brian Kirby, researcher at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "Example applications of Army interest include secure secret sharing, distributed network sensing and efficient decision making."

This research effort considers how dispersion, a very common effect found in optical systems, impacts quantum states of three or more particles of light.

Dispersion is an effect where a pulse of light spreads out in time as it is transmitted through a medium, such as an optical fiber. This effect can destroy time correlations in communication systems, which can result in reduced data rates or the introduction of errors.

To understand this, Kirby said, consider the situation where two light pulses are created simultaneously and the goal is to send them to two different detectors so that they arrive at the same time. If each light pulse goes through a different dispersive medium, such as two different fiber-optic paths, then each pulse will be spread in time, ultimately making the arrival time of the pulses less correlated.
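The classical spreading described above can be quantified with the standard formula for a Gaussian pulse propagating under group-velocity dispersion, T(z) = T0·sqrt(1 + (β2·z / T0²)²). A minimal sketch with illustrative parameters (typical telecom-fiber dispersion, not values from the study):

```python
import math

def broadened_width(t0_s: float, beta2_s2_per_m: float, length_m: float) -> float:
    """Gaussian pulse width after group-velocity dispersion:
    T(z) = T0 * sqrt(1 + (beta2 * z / T0**2)**2)."""
    return t0_s * math.sqrt(1.0 + (beta2_s2_per_m * length_m / t0_s**2) ** 2)

# Illustrative: a 1-ps pulse over 10 km of standard fiber (beta2 ~ -21.7 ps^2/km).
t0 = 1e-12
beta2 = -2.17e-26  # s^2/m, i.e., -21.7 ps^2/km
z = 10e3
print(f"width grows from 1.00 ps to {broadened_width(t0, beta2, z) * 1e12:.1f} ps")
```

With these numbers the pulse broadens by more than two orders of magnitude, which is why uncompensated dispersion so quickly washes out arrival-time correlations between two such pulses.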

"Amazingly, it was shown that the situation is different in quantum mechanics," Kirby said. "In quantum mechanics, it is possible to describe the behavior of individual particles of light, called photons. Here, it was shown by research team member Professor James Franson from the University of Maryland, Baltimore County, that quantum mechanics allows for certain situations where the dispersion on each photon can actually cancel out so that the arrival times remain correlated."

The key to this is something called entanglement, a strong correlation between quantum systems, which is not possible in classical physics, Kirby said.

In this new work, "Nonlocal Dispersion Cancellation for Three or More Photons," published in the peer-reviewed journal Physical Review A, the researchers extend the analysis to systems of three or more entangled photons and identify in what scenarios quantum systems outperform classical ones. This is unique from similar research as it considers the effects of noise on entangled systems beyond two qubits, which is where the primary focus has been.

"This informs how future quantum networks will be designed to deal with the effects of noise and decoherence, in this case, dispersion specifically," Kirby said.

Additionally, based on the success of Franson's initial work on two-photon systems, it was reasonable to assume that dispersion on one part of a quantum system could always be cancelled out with the proper application of dispersion on another part of the system.

"Our work clarifies that perfect compensation is not, in general, possible when you move to entangled systems of three or more photons," Kirby said. "Therefore, dispersion mitigation in future quantum networks may need to take place in each communication channel independently."

Further, Kirby said, this work is valuable for quantum communications because it allows for increased data rates.

"Precise timing is required to correlate detection events at different nodes of a network," Kirby said. "Conventionally, the reduction in time correlations between quantum systems due to dispersion would necessitate the use of larger timing windows between transmissions to avoid confusing sequential signals."

Since Kirby and his colleagues' new work describes how to limit the uncertainty in joint detection times across a network, it will allow subsequent transmissions to follow in quicker succession.

The next step for this research is to determine if these results can be readily verified in an experimental setting.

Credit: 
U.S. Army Research Laboratory

Transplanted brown-fat-like cells hold promise for obesity and diabetes

video: Scientists at Joslin Diabetes Center have delivered a proof of concept for a novel cell-based therapy against this dangerous condition.

Image: 
Joslin Diabetes Center

BOSTON - (August 26, 2020) - Obesity is the main cause of type 2 diabetes and related chronic illnesses that together will kill more people around the globe this year than the Covid-19 coronavirus. Scientists at Joslin Diabetes Center have delivered a proof of concept for a novel cell-based therapy against this dangerous condition.

The potential therapy for obesity would transplant HUMBLE (human brown-like) fat cells, human white fat cells that have been genetically modified to become similar to heat-generating brown fat cells, says Yu-Hua Tseng, PhD, a Senior Investigator in Joslin's Section on Integrative Physiology and Metabolism.

Brown fat cells burn energy instead of storing energy as white fat cells do, says Tseng, senior author on a paper about the work in Science Translational Medicine. In the process, brown fat can lower excessive levels of glucose and lipids in the blood that are linked to metabolic diseases such as diabetes.

However, people who are overweight or obese tend to have less of this beneficial brown fat--a barrier that HUMBLE cells are designed to overcome, Tseng says.

She and her colleagues created the cells from human white fat cells in a progenitor stage (not yet fully developed into their final fat form). The investigators used a variant of the CRISPR-Cas9 genome editing system to boost expression of a gene called UCP1, which triggers white fat cell progenitors to develop into brown fat-like cells.

Transplanted into mice lacking an immune system, the HUMBLE progenitor cells developed into cells that functioned very much like the mice's own brown fat cells, says Tseng, who is also a professor of medicine at Harvard Medical School.

Her team compared transplants of these cells versus the original white fat cells in mice that were put on a high-fat diet. Mice given the HUMBLE transplants displayed much greater insulin sensitivity and ability to clear glucose from the blood (two key factors that are impaired in type 2 diabetes).

Additionally, the mice receiving HUMBLE transplants put on less weight than mice with transplanted white fat cells, remaining in the same range as animals that received brown fat cells.

Perhaps surprisingly, the Joslin scientists demonstrated that these benefits were mostly due to signals from the transplanted cells to endogenous (existing) brown fat cells in the mice. "Cells in different tissues communicate with each other," Tseng says. "In this case, we found that our transplanted HUMBLE cells secrete a molecule called nitric oxide, which is carried by red blood cells to the endogenous brown cells and activates those cells."

If the HUMBLE technique continues to prove out in pre-clinical research, it might eventually be possible to generate this type of cell for individual patients, Tseng suggests. Such a procedure would remove a tiny amount of a patient's white fat cells, isolate the progenitor cells, modify those cells to boost expression of UCP1, and then return the resulting HUMBLE cells to the patient.

However, that individualized approach would be complicated and expensive, so the Tseng lab is pursuing two alternative routes that may be more practical for clinical use.

One alternative is to use cells that are not personalized but instead are encapsulated via biomaterials that protect the cells from rejection by a patient's immune system. (Joslin researchers and their collaborators have long studied such materials for cell transplants for type 1 diabetes.) The other option is gene therapies that directly express the UCP1 gene in white fat progenitor cells in the body, so that those cells acquire HUMBLE-like properties.

Tseng emphasizes that this research is moving ahead despite the Covid-19 pandemic, which puts people with diabetes at much higher risk of serious outcomes if they are infected.

"Employing cell-based or gene therapies to treat obesity or type 2 diabetes used to be science fiction," she says. "Now scientific advances, such as CRISPR gene-editing technologies, will help us to improve the metabolism, the body weight, the quality of life and the overall health of people with obesity and diabetes."

Credit: 
Joslin Diabetes Center

Gunshot injuries in California drop, but percentage of fatal firearm injuries goes up

UC Davis researchers hope other states can use their study efforts as a basis for improving firearm injury prevention.

“We found that the number of nonfatal firearm injuries in California decreased over an 11-year period, primarily due to a drop in firearm assaults,” said Sarabeth Spitzer, lead author and a UC Davis research intern at the time of the study. “However, the lethality of those and other firearm injuries did not go down. In fact, it went up.”

The new study is online today in JAMA Network Open.

Nonfatal firearm injuries pose a significant health burden with great social and economic costs. For individuals who survive firearm injuries, the long-term physical and psychological effects can be devastating, requiring years of medical care and rehabilitation. Until now, relatively little has been known about the distribution and types of nonfatal firearm injuries in the U.S.

The study used data from California’s Office of Statewide Health Planning and Development (OSHPD) for individuals treated in emergency departments (EDs) or discharged from hospitals, and data from the Centers for Disease Control and Prevention (CDC) WISQARS database for fatal firearm injuries. Data covered the period between Jan. 1, 2005, and Dec. 31, 2015.

The study found that over the 11-year study period, there were approximately 81,000 ED visits and hospitalizations due to nonfatal firearm injuries, and that the overall rate of nonfatal firearm injuries decreased by 38%. While this decrease was driven primarily by a substantial drop in firearm assaults, self-inflicted and unintentional injuries remained stable. The study identified that around 70% of the nonfatal injuries were from assaults, 24% were unintentional, 2% were self-inflicted and 5% were undetermined.

To capture the lethality of firearm injuries, the researchers calculated two versions of the case fatality ratio (CFR), the proportion of injuries that are fatal. The overall CFR is for all people who sustain a firearm injury, whether or not those people receive medical care (most people who die from gunshot wounds do so at the scene of the shooting). The clinical CFR is for people who survive to reach the hospital and receive medical care. California’s overall CFR for firearm injuries increased from 27.6% in 2005 to 32.2% in 2015, while the clinical CFR remained stable, at about 8%. 
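The two CFR definitions above differ only in which deaths enter the numerator and denominator. A minimal sketch of the arithmetic, using hypothetical counts chosen to land near the article's reported ballparks (these are not the study's actual data):

```python
def case_fatality_ratio(deaths: int, nonfatal: int) -> float:
    """Fraction of injuries that are fatal: deaths / (deaths + nonfatal)."""
    return deaths / (deaths + nonfatal)

# Hypothetical counts for one year:
deaths_total = 3000        # all firearm deaths, including deaths at the scene
nonfatal_total = 7000      # ED visits + hospitalizations for nonfatal injuries
deaths_in_hospital = 600   # deaths among those who survived to reach medical care

overall_cfr = case_fatality_ratio(deaths_total, nonfatal_total)
clinical_cfr = case_fatality_ratio(deaths_in_hospital, nonfatal_total)
print(f"overall CFR: {overall_cfr:.1%}, clinical CFR: {clinical_cfr:.1%}")
```

Because most firearm deaths occur at the scene, the overall CFR (here 30%) can rise even while the clinical CFR (here about 7.9%) stays flat, which is exactly the pattern the study reports.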

“The increase in the overall CFR may be due to a rise in the proportion of self-inflicted firearm injuries, which are usually deadlier than the other types of firearm injuries,” said Spitzer, currently a general surgery resident at Brigham and Women's Hospital.

Nonfatal firearm injuries by race and income

The study also found significant differences in income and source of payment for medical care by cause of injury. Those with assault-related injuries were more likely to be in the lowest income quartile (30%) and to rely on self-pay (33%) or government coverage (37%).

Consistent with previous studies, it found that assaultive injuries were concentrated among young Black (33%) and Hispanic (46%) individuals from urban, lower income areas. In fact, Black men had an assault-related injury rate four times that of Hispanic men, the next highest racial group in terms of such injuries. The average age of individuals with assault-related injuries was 27 years.

Individuals with self-inflicted injuries tended to be older (an average of 42 years), and these injuries were concentrated among white (62%) individuals in higher-income areas. Self-inflicted gunshot wounds are usually more severe than other gunshot injuries and require longer hospital stays.

The mean age of all individuals with firearm injuries was 27.5 years, and around 90% of them were men.

Nonfatal injury rates vary widely among California counties

The researchers mapped the county-level rates of nonfatal injury and found that incidents vary considerably among California counties.

In 2015, San Joaquin County had the highest nonfatal injury rate, 39.7 per 100,000 people, compared to Sonoma County’s 3.6 injuries per 100,000, the lowest in California. Around 48% of California’s 58 counties experienced a decrease in the rate of nonfatal firearm injury during the study period. The counties with the largest relative decreases in firearm injuries were Sonoma (73.8%) and Los Angeles (58.2%). Counties with rate increases tended to be in Northern California.

The study also found that urban counties had higher rates of firearm injury than rural counties.

Policy implications

There are two possible general explanations for the increase in the fatality rate: either the treatment of firearm injuries has not substantially improved over that period, or treatment did get better but the injuries got worse. While there is no indication that injury severity increased, there is a need for improved, timely treatment of firearm injuries.

One possible approach is for first responders to adopt the “scoop and run” strategy, currently implemented in Philadelphia, to get the injured to a health care facility as fast as possible. Timely transport of the injured to the hospital is key in saving lives.

“This study advances our understanding of the incidence, distribution and lethality of firearm injuries in California,” said Garen Wintemute, emergency physician and director of the UC Davis Violence Prevention Research Program (VPRP) and the UC Firearm Violence Research Center (UCFC). “We hope other states will use this as a model to evaluate the burden of nonfatal firearm injuries as a basis for improved prevention efforts.”

Wintemute also noted evidence of successful efforts, such as the major decrease in injuries in Los Angeles County.

In addition to Spitzer, study authors were Veronica A. Pear, Christopher McCort and Garen Wintemute from University of California Firearm Violence Research Center and Violence Prevention Research Program at University of California, Davis.

This research was supported by the University of California Firearm Violence Research Center with funds from the State of California. Additional support came from the California Wellness Foundation (2014-255), the Heising-Simons Foundation (2017-0447) and the University of California, Davis, Violence Prevention Research Program.

Article: Spitzer, Pear, McCort and Wintemute. Incidence, distribution, and lethality of firearm injuries in California from 2005 to 2015. JAMA Network Open. DOI: 10.1001/jamanetworkopen.2020.14736

Journal

JAMA Network Open

DOI

10.1001/jamanetworkopen.2020.14736

Credit: 
University of California - Davis Health

Tethering together type 2 diabetes drugs increases efficacy of combination therapy

DURHAM, N.C. - Biomedical engineers at Duke University have shown that the efficacy of a two-pronged type 2 diabetes treatment increases when the drugs are linked by a heat-sensitive tether rather than simply administered concurrently. The combination molecule is formed by an elastin-like polypeptide (ELP) linker that forms a gel-like depot when injected under the skin, which slowly dissolves and releases the active drug over time.

This novel approach features the commonly prescribed type 2 diabetes drug glucagon-like peptide-1 (GLP-1) and the compelling drug candidate fibroblast growth factor 21 (FGF21), which together produce tight glycemic control and potent weight reduction in diabetic mice. Coupled with the slow-release function of the ELP, the effects last longer than one week with a single injection.

Because GLP-1, a short peptide, and FGF21, a large folded protein, are such different compounds, these findings suggest that this approach to combination drug design could be applied to disease therapies beyond diabetes.

The results appear online on August 26 in the journal Science Advances.

"In the burgeoning field of multi-functioning single-molecule diabetes drug design, researchers primarily unite drugs that are similar in size, structure and function," said Caslin Gilroy, a postdoctoral scholar at the University of California, Berkeley, who led the project while completing her PhD in biomedical engineering at Duke. "Being able to combine such structurally distinct drugs into a single molecule while maintaining the bioactivity and stability of each is a big technological achievement."

Type 2 diabetes is a progressive disease where body tissues become resistant to the effects of insulin, which regulates the movement of sugar from the bloodstream into cells. When this carefully tuned system breaks down, blood sugar levels remain toxically elevated and a host of serious complications can follow. While many treatment options exist, a single drug is rarely able to treat an advanced case. Conventional medications lose their potency over time and frequently cause weight gain, which itself can promote insulin resistance and exacerbate the disease.

A growing class of drugs is based on GLP-1, a naturally occurring peptide released from the intestines after a meal. GLP-1 therapy enhances the release of insulin from the pancreas while promoting weight loss. However, the high doses of GLP-1 that are sometimes necessary to maintain healthy blood sugar levels have been shown to cause gastrointestinal distress. Researchers are exploring combination therapies that strategically pair GLP-1 with additional drugs to maximize glucose control, minimize side effects and augment weight loss.

While most drug combinations incorporate small peptides from the same family as GLP-1, Gilroy and Ashutosh Chilkoti, the Alan L. Kaganov Distinguished Professor of Biomedical Engineering at Duke, chose to work with FGF21. A metabolic hormone, FGF21 regulates insulin sensitivity, energy expenditure and fat metabolism within body tissues.

"FGF21 functions through a different mechanism than GLP-1, and we hypothesized that the two drugs would complement each other nicely," said Gilroy. "GLP-1 increases insulin secretion by the pancreas, while FGF21 enhances the body's response to the insulin. GLP-1 reduces food intake, while FGF21 helps burn more calories."

But rather than simply injecting diabetic mice with both drugs at the same time, the researchers decided to link GLP-1 and FGF21 together into a single molecule. This approach to combination therapy has several advantages. A single molecule is more predictable in how it will disperse through the body, act on its target tissues and eventually be cleared. A single drug is also beneficial for the prescribing physician and patient, as it reduces the medication burden and simplifies the treatment regimen. And the FDA approval process for a single drug is more straightforward than for a drug mixture.

GLP-1 and FGF21, however, are both peptide-based drugs, heavily reliant on shape and surface features to function. Tethering the two without interfering with either is easier said than done.

To form one drug out of two, the researchers turned to the elastin-like polypeptide (ELP)--a specialty of the Chilkoti research group. ELPs are chains of repetitive peptide sequences that are highly disordered in nature. This disorder provides flexibility, giving drugs fused at each end of the ELP the room to do their respective jobs. The modularity of ELPs also makes them highly tunable, allowing for the design of the best delivery system possible.

Peptide-based drugs suffer from two notable disadvantages: they have a short half-life, due to rapid clearance from the body, and they must be administered by needle. An ELP-based delivery platform, however, addresses both of these issues.

"Linking the drugs to an ELP allows us to design a compound that is liquid at room temperature but forms a gel-like depot upon injection," said Gilroy. "The depot dissolves over the course of at least a week, slowly and regularly releasing drug to the system over time."

Chilkoti already has two Phase II clinical trials underway using ELPs as slow-release delivery systems. One trial aims to treat pulmonary arterial hypertension, while the second involves a potential therapy for COVID-19.

In the study, after verifying that GLP-1 and FGF21 retain their respective functions and potencies when linked together by an ELP, Gilroy and Chilkoti tested their multi-functioning, slow-release molecule in a mouse model of diabetes.

The results show that levels of drug circulating in the system remained steady while blood sugar levels were brought down to a healthy level and maintained there for up to 10 days following a single dose. Mice treated with the GLP-1/FGF21 combination drug recovered from a glucose challenge better than mice treated with either drug alone, and were the only test group to lose weight during the trial.

The drug combination also worked better when GLP-1 and FGF21 were tethered together than when they were delivered as a mixture of individual drugs. The researchers think that linking the two ensures they always act in concert, allowing their mechanisms of action to synergize.

"We had speculated that we may see synergy when we combined GLP-1 and FGF-21 because they have different modes of action," said Chilkoti. "That was really just a hope at the outset of this project, and we were more than pleasantly surprised when Caslin showed that combining these drugs into a single molecule clearly showed a synergistic therapeutic effect compared to a mixture of the two drugs. The data is so compelling that we believe it's ready for a company to pursue this strategy commercially. Duke's Office of Licensing and Ventures is currently looking to license it."

Credit: 
Duke University

Native desert bighorn sheep in ecologically intact areas are less vulnerable to climate change

image: A bighorn ram perched on a cliff in Grand Canyon National Park.

Image: 
Tyler Creech

CORVALLIS, Ore. - In the American Southwest, native desert bighorn sheep populations found in landscapes with minimal human disturbance, including several national parks, are less likely to be vulnerable to climate change, according to a new study led by Oregon State University.

The study, published in the journal Frontiers in Ecology and Evolution, is one of the largest genetic studies conducted on desert bighorn sheep. The researchers used genetic information from more than 1,600 individuals in 62 populations in and around 10 National Park Service units in four states - Arizona, California, Nevada and Utah. Park service units include parks as well as other administrative units, such as reserves and recreation areas.

The researchers found that the least vulnerable bighorn populations are primarily in and around Death Valley National Park and Grand Canyon National Park. The results suggest that protecting these landscapes should be a priority for native bighorn conservation, said lead author Tyler Creech, an OSU graduate now at the Center for Large Landscape Conservation in Bozeman, Montana.

Meanwhile, the researchers determined that the populations with the highest overall vulnerability are primarily located outside of national park units in the southern Mojave Desert and in southeastern Utah.

In the study, the researchers analyzed the genetic structure and diversity of bighorn sheep populations and how connected they are to other populations, both genetically and geographically, and used that information to infer their vulnerability to a changing climate.

"We used DNA samples from bighorn sheep to tell us how genetically diverse populations are," Creech said. "The populations that are less genetically diverse and less connected to their neighbors are more likely to be negatively impacted by climate change."

"Genetic diversity allows populations to adapt to new environments," said study co-author Clint Epps, a wildlife biologist and associate professor in the Department of Fisheries and Wildlife in OSU's College of Agricultural Sciences. "This study highlights the important role our national park units can play in keeping these populations up as the climate changes."

The researchers primarily used fecal pellet samples to obtain DNA from up to 85 individual bighorn in each population, and combined genetic datasets from multiple projects covering different portions of the study area, dating back to 2000. After the samples were processed and genotyped, they grouped the individuals into populations based on the locations where they were sampled, then quantified the isolation and genetics of each population.

They also considered how exposure to harsher climatic conditions within bighorn sheep habitat "patches" could influence populations' vulnerability. Desert bighorn sheep live in some of the hottest and driest landscapes in the U.S., and climate modeling shows those areas could get hotter and drier.

To assess climate change exposure, they used an index known as "forward climate velocity," which indicates the speed at which species must migrate to maintain constant climate conditions. They considered two greenhouse gas emissions scenarios for the 2050s developed by the Intergovernmental Panel on Climate Change, one that models moderate emissions and the other that models high emissions.
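As a rough illustration of how such an index works, forward climate velocity is commonly estimated as the temporal trend in a climate variable divided by its local spatial gradient. The function, grid, and trend values below are hypothetical and for illustration only; the study's own computation may differ.

```python
import numpy as np

def climate_velocity(trend_per_year, grid, cell_km):
    """Illustrative climate-velocity estimate (km/yr).

    trend_per_year: warming rate at each cell (deg C/yr)
    grid: current temperature surface (deg C)
    cell_km: grid spacing in km
    """
    # Spatial gradient of temperature, in deg C per km
    dy, dx = np.gradient(grid, cell_km)
    slope = np.hypot(dx, dy)
    # Avoid division by zero over climatically flat terrain
    slope = np.maximum(slope, 1e-6)
    # Speed a population must move to keep its climate constant
    return trend_per_year / slope

# Toy 3x3 temperature surface, warming at 0.03 deg C per year
temps = np.array([[20.0, 20.5, 21.0],
                  [20.2, 20.7, 21.2],
                  [20.4, 20.9, 21.4]])
v = climate_velocity(0.03, temps, cell_km=1.0)
```

A steeper local temperature gradient (e.g. mountainous terrain) yields a lower velocity, since a short move suffices to track the shifting climate; flat deserts yield high velocities, consistent with the study's focus on hot, dry landscapes.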

"We believe this approach was suitable for assessing relative exposure of desert bighorn populations across a large geographic range because although temperature and precipitation are known to influence fitness of desert bighorn sheep, the specific climatic conditions to which bighorn are most sensitive are not fully understood and may vary geographically," Creech said.

Credit: 
Oregon State University

Researchers pursue 'hidden pathology' to explain fatigue in multiple sclerosis

Up to 60 percent of patients with multiple sclerosis (MS) report that fatigue is the disease's most debilitating symptom. And yet, fatigue remains one of MS's mysteries -- despite its prevalence and significance, the root cause of the symptom remains unclear. In a study published in Neurology Neuroimmunology & Neuroinflammation, investigators from Brigham and Women's Hospital used positron emission tomography (PET) imaging to look for immune cells in the brain that may become erroneously activated in MS, leading to fatigue. The team describes a potential link to brain inflammation that may help explain the connection between MS and fatigue.

"Fatigue correlates poorly with the conventional markers of multiple sclerosis -- the brain lesions we see using magnetic resonance imaging (MRI) don't associate well with fatigue," said corresponding author Tarun Singhal, MD, a neurologist and nuclear medicine physician in the Department of Neurology and director of the PET Imaging Program in Neurologic Diseases at the Ann Romney Center for Neurologic Diseases at Brigham and Women's Hospital. "So we went searching for a hidden pathology; something that has gone undetected until now in the context of fatigue in MS."

Singhal and colleagues used a second-generation radioligand known as [F-18]PBR06 to conduct PET imaging. Singhal describes this tracer as a "radiolabel detective" that can snoop for clues. Once injected, the tracer travels to the brain, binds to abnormally activated immune cells called microglia (and to some extent, additionally, to other immune and support cells called astrocytes) and emits gamma rays that can be picked up by a scanner.

The team performed PET scans on 12 MS patients and 10 healthy controls, finding strong correlations between MS patients' self-reported fatigue scores and activation of immune cells in very specific regions of the brain. These regions included the substantia nigra -- Latin for "black substance." The substantia nigra is the site where dopamine is produced (dopaminergic neurons appear darker on pathology, giving the region its name). Dopamine plays many roles in the body and is required for stimulating attention and wakefulness patterns in the brain. Several additional areas of the brain also correlated significantly with fatigue scores, but there was no association between fatigue scores and brain atrophy or lesion load in MS patients.

The researchers note that given the study's small sample size, additional study is needed to validate their findings.

"We detected a widespread network of very specific regions whose inflammation correlates with fatigue scores and all have implications for contributions to fatigue," said Singhal. "We are now pursuing further study to confirm our findings in a larger sample size and are looking at interactions between neurochemistry and neuroinflammation."

Credit: 
Brigham and Women's Hospital

Majority of groundwater stores resilient to climate change

image: Groundwater fed irrigation of Boro rice during the dry season in the Ganges-Brahmaputra Basin.

Image: 
Mohammad Shamsudduha/Richard Taylor

Fewer of the world's large aquifers are depleting than previously estimated, according to a new study by the University of Sussex and UCL.

Groundwater, the world's largest distributed store of freshwater, plays a critical role in supplying water for irrigation, drinking and industry, and sustaining vital ecosystems.

Previous global studies of changes in groundwater storage, estimated using data from the GRACE (Gravity Recovery and Climate Experiment) satellite mission and global models, have concluded that intensifying human water withdrawals in the majority of the world's large aquifer systems are causing a sustained reduction in groundwater storage, depleting groundwater resources.

Yet this new study, published in Earth System Dynamics, reveals that depletion is not as widespread as reported, and that replenishment of groundwater storage depends upon extreme rainfall that is increasing under global climate change.

Lead author, Dr Mohammad Shamsudduha, Lecturer in Physical Geography and a member of the Sussex Sustainability Research Programme at the University of Sussex, said: "The cloud of climate change has a silver lining for groundwater resources as it favours greater replenishment from episodic, extreme rainfalls in some aquifers located around the world mainly in dry environments. This new analysis provides a benchmark alongside conventional, ground-based monitoring of groundwater levels to assess changes in water storage in aquifers over time. This information is essential to inform sustainable management of groundwater resources."

This new study updates and extends previous analyses, accounting for strong seasonality in groundwater storage in the analysis of trends. It shows that only a minority (five) of the world's 37 large aquifers are undergoing depletion that requires further attention for better management.

Co-author, Professor of Hydrogeology, Richard Taylor from UCL Geography, said: "The findings do not deny that groundwater depletion is occurring in many parts of the world but that the scale of this depletion, frequently associated with irrigation in drylands, is more localised than past studies have suggested and often occurs below a large (~100 000 km2) 'footprint' of mass changes tracked by a pair of GRACE satellites."

For the majority, trends are non-linear and irregular, exhibiting considerable variability in volume over time. The study shows further that variability in groundwater storage in drylands is influenced positively and episodically by years of extreme (>90th percentile) precipitation.

For example, in the Great Artesian Basin of Australia, extreme seasonal rainfall over two successive summers in 2010 and 2011 increased groundwater storage there by ~90 km3, more than ten times the UK's total annual freshwater withdrawals. In the Canning Basin of Australia, by contrast, groundwater is being depleted at a rate of 4.4 km3 per year, a decline associated with groundwater use in the extraction of iron ore.

To avoid continued depletion of aquifers, the study promotes sustainable groundwater withdrawals through augmented replenishments from extreme rainfall and 'managed aquifer recharge' practices.

Credit: 
University of Sussex

New method to track ultrafast change of magnetic state

image: As this illustration shows, the researchers were able to measure the magnetization dynamics in the iron nanofilm caused by ultrafast electronic and acoustic processes.

Image: 
Bielefeld University/W. Zhang

An international team of physicists from Bielefeld University, Uppsala University, the University of Strasbourg, the University of Shanghai for Science and Technology, the Max Planck Institute for Polymer Research, ETH Zurich, and the Free University Berlin has developed a precise method to measure the ultrafast change of a magnetic state in materials. The researchers do this by observing the emission of terahertz radiation that necessarily accompanies such a magnetization change. Their study, titled 'Ultrafast terahertz magnetometry', was published today (25.08.2020) in Nature Communications.

Magnetic memories are not only acquiring higher and higher capacity as magnetic bits shrink; they are also getting faster. In principle, a magnetic bit can be 'flipped'--that is, it can change its state from 'one' to 'zero' or vice versa--on an extremely fast timescale of less than one picosecond. One picosecond is one millionth of one millionth of a second. This could allow magnetic memories to operate at terahertz switching frequencies, corresponding to extremely high terabit-per-second (Tbit/s) data rates.
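The quoted rates follow from simple unit arithmetic, sketched below with illustrative numbers (not taken from the paper): a bit that flips in one picosecond corresponds to a terahertz switching frequency, and one flip per picosecond is 10^12 bits per second, i.e. 1 Tbit/s per bit line.

```python
# Illustrative back-of-the-envelope arithmetic (not from the paper)
switch_time_s = 1e-12                   # one picosecond, in seconds
frequency_hz = 1.0 / switch_time_s      # 1 / 1 ps = 1e12 Hz = 1 THz
data_rate_tbit_s = frequency_hz / 1e12  # one bit per cycle -> 1 Tbit/s
```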

'The actual challenge is to be able to detect such a magnetization change quickly and sensitively enough,' explains Dr Dmitry Turchinovich, professor of physics at Bielefeld University and the leader of this study. 'The existing methods of ultrafast magnetometry all suffer from significant drawbacks such as, for example, operation only under ultrahigh vacuum conditions, the inability to measure on encapsulated materials, and so on. Our idea was to use the basic principle of electrodynamics. This states that a change in the magnetization of a material must result in the emission of electromagnetic radiation containing the full information on this magnetization change. If the magnetization in a material changes on a picosecond timescale, then the emitted radiation will belong to the terahertz frequency range. The problem is that this radiation, known as "magnetic dipole emission", is very weak and can be easily obscured by light emission of other origins.'

Wentao Zhang, a PhD student in the lab of Professor Dmitry Turchinovich, and the first author of the published paper says: 'It took us time, but finally we succeeded in isolating precisely this magnetic dipole terahertz emission that allowed us to reliably reconstruct the ultrafast magnetization dynamics in our samples: encapsulated iron nanofilms.'

In their experiments, the researchers sent very short pulses of laser light onto the iron nanofilms, causing them to demagnetize very quickly. At the same time, they were collecting the terahertz light emitted during such a demagnetization process. The analysis of this terahertz emission yielded the precise temporal evolution of a magnetic state in the iron film.

'Once our analysis was finished, we realized that we actually saw far more than what we had expected,' continues Dmitry Turchinovich. 'It has already been known for some time that iron can demagnetize very quickly when illuminated by laser light. But what we also saw was a reasonably small, but a very clear additional signal in magnetization dynamics. This got us all very excited. This signal came from the demagnetization in iron--actually driven by the propagation of a very fast pulse of sound through our sample. Where did this sound come from? Very easy: when the iron film absorbed the laser light, it not only demagnetized, it also became hot. As we know, most materials expand when they get hot--and this expansion of the iron nanofilm launched a pulse of terahertz ultrasound within our sample structure. This sound pulse was bouncing back and forth between the sample boundaries, internal and external, like the echo between the walls of a big hall. And each time this echo passed through the iron nanofilm, the pressure of sound moved the iron atoms a little bit, and this further weakened the magnetism in the material.' This effect has never been observed before on such an ultrafast timescale.

'We are very happy that we could see this acoustically-driven ultrafast magnetization signal so clearly, and that it was relatively strong. It was amazing that detecting it with THz radiation, which has a sub-mm wavelength, worked so well, because the expansion in the iron film is only tens of femtometres, which is ten orders of magnitude smaller,' says Dr Peter M. Oppeneer, a professor of physics at Uppsala University, who led the theoretical part of this study.
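The "ten orders of magnitude" comparison can be checked with representative numbers (the specific values below are assumptions for illustration, not figures from the paper): terahertz radiation has a sub-millimetre wavelength of roughly 0.1 mm, while the film expansion is on the order of tens of femtometres.

```python
import math

# Representative values, assumed for illustration (not from the paper)
thz_wavelength_m = 1e-4    # ~0.1 mm, a typical sub-mm terahertz wavelength
film_expansion_m = 1e-14   # ten femtometres (1 fm = 1e-15 m)

# Ratio of length scales, expressed in orders of magnitude
orders_of_magnitude = math.log10(thz_wavelength_m / film_expansion_m)
```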

Dr. Pablo Maldonado, a colleague of Peter M. Oppeneer who performed the numerical calculations that were crucial for explaining the observations in this work, adds: 'What I find extremely exciting is an almost perfect match between the experimental data and our first-principles theoretical calculations. This confirms that our experimental method of ultrafast terahertz magnetometry is indeed very accurate and also sensitive enough, because we were able to distinguish clearly between the ultrafast magnetic signals of different origins: electronic and acoustic.'

The remaining co-authors of this publication have dedicated it to the memory of their colleague and a pioneer in the field of ultrafast magnetism, Dr. Eric Beaurepaire from the University of Strasbourg. He was one of the originators of this study, but passed away during its final stages.

Credit: 
Bielefeld University

NASA's Terra satellite catches the demise of post-tropical cyclone Marco

image: On Aug. 25 at 12:30 a.m. EDT (0430 UTC), the MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on post-tropical cyclone Marco that showed a small area of storms where cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius).

Image: 
NASA/NRL

NASA's Terra satellite passed over the Gulf of Mexico early on Aug. 25 and found a very small area of convection from post-tropical cyclone Marco, northeast of its center. All watches and warnings have been dropped as the storm continues to weaken toward dissipation.

Visible imagery and surface observations indicated that Marco made landfall around 7 p.m. EDT on Aug. 24 near the mouth of the Mississippi River. The center continued to move west and moved offshore and south of Louisiana by Aug. 25.

NASA's Terra Satellite Reveals Effects of Wind Shear 

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

On Aug. 25 at 12:30 a.m. EDT (0430 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite observed Marco in infrared light and found a small area of storms where cloud top temperatures were as cold as minus 50 degrees Fahrenheit (minus 45.5 Celsius) over the western Florida Panhandle and coastal Alabama. Those storms were being pushed northeast of Marco's center by southwesterly wind shear. Satellite imagery also showed the low-level circulation center as a swirl of clouds south of Louisiana, over the Gulf of Mexico.

In the Aug. 25 Marco discussion at 5 a.m. EDT, NHC Senior Hurricane Specialist Stacy Stewart noted, "Marco has been devoid of any significant convection for at least 12 hours. [NOAA's] ASCAT scatterometer surface wind data around 0239Z (10:39 p.m. EDT on Aug. 24) suggested that Marco might have degenerated into a north-to-south elongated trough (elongated area of low pressure). Based on this information, Marco has been downgraded to a post-tropical remnant low [pressure area]."

About Wind Shear  

The shape of a tropical cyclone provides forecasters with an idea of its organization and strength. When outside winds batter a storm, it can change the storm's shape and push much of the associated clouds and rain to one side of it. That is what wind shear does.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the other vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels.

Marco's Final Status

At 5 a.m. EDT (0900 UTC) on Aug. 25, NOAA's National Hurricane Center (NHC) reported the center of Post-Tropical Cyclone Marco was located near latitude 28.8 degrees north and longitude 91.2 degrees west. That is about 60 miles (100 km) south of Morgan City, La., and 110 miles (175 km) south-southeast of Lafayette, La. The post-tropical cyclone was moving toward the west near 10 mph (17 kph), and this general motion is expected to continue for the next day or so. Maximum sustained winds were near 30 mph (45 kph) with higher gusts. The estimated minimum central pressure was 1008 millibars.

Marco Nears its End

Brisk southwesterly vertical wind shear of 30 knots is forecast to increase to near 35 knots in 24 hours, which should prevent the redevelopment of deep convection near the center. On the forecast track, Marco should continue moving westward just offshore the coast of Louisiana until the system dissipates.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

Credit: 
NASA/Goddard Space Flight Center

To be or not to be in the ER, that is the question

During the process of cellular protein synthesis, mistakes can happen. Sometimes, proteins end up misfolded: they do not take on the specific 3-D structure required for proper function. Misshaped secreted and transmembrane proteins usually trigger safety mechanisms that dispose of them by shuttling them from their place of synthesis, the endoplasmic reticulum (ER), to the cytosol, where they are degraded by a cellular structure called the proteasome.

"Many proteins in the cell have sugar (glycan) chemical groups attached to them. Research has suggested that when these glycosylated proteins are misfolded, removing the N-glycan groups might be one of the steps required for their destruction," said Dr. Hamed Jafar-Nejad, associate professor of molecular and human genetics at Baylor College of Medicine and lead scientist of the study.

For many years, scientists have thought that once misfolded glycoproteins exit the ER and enter the cytosol, the enzyme N-glycanase 1 (NGLY1) removes their N-glycan groups, thereby facilitating the disposal of the misfolded glycoproteins by the proteasome.

"We found that for a critical signaling protein called BMP4, removal of N-glycans by NGLY1 does not occur after the misfolded molecules have been transferred into the cytosol, but is instead required for the transfer itself to be accomplished," said first author Dr. Antonio Galeone, who was a postdoctoral fellow in Dr. Jafar-Nejad's lab during the development of this project. He is currently in the Department of Biosciences at the University of Milan.

An intriguing finding

In humans, loss-of-function mutations in NGLY1 cause a multisystem developmental disorder called NGLY1 deficiency. Jafar-Nejad, Galeone and their colleagues from other institutions work with fruit fly and mouse models to investigate how NGLY1 mutations lead to developmental defects in various organs, hoping to find ways to treat this rare condition.

Previous findings from the Jafar-Nejad group had shown that the fruit fly equivalent of human NGLY1 is required to promote bone morphogenetic protein (BMP) signaling in a specific developmental context. However, how this actually happened at a molecular level remained a mystery. Moreover, whether mammalian NGLY1 plays a role in BMP signaling was not known.

In the current study, the researchers discovered that NGLY1 promotes the activity of one of the BMP pathway ligands called BMP4, both in fruit flies and mammals, by removing the N-glycan groups from misfolded BMP4 proteins.

But how does the removal and degradation of misfolded BMP4 from the ER contribute to signaling by the properly folded BMP4 molecules that remain in the ER and are later secreted?

The researchers expected that the elimination of misfolded BMP4 proteins would happen as had been suggested for other proteins: defective BMP4 molecules would move into the cytosol, where NGLY1 would remove the N-glycan groups, followed by proteasomal degradation.

Unexpectedly, they found that the N-glycan groups were removed before the defective BMP4 was fully moved from the ER into the cytosol. If the N-glycan groups were not removed, the defective BMP4 molecules did not transfer into the cytosol and accumulated in the ER.

Novel regulation of BMP4 signaling

Intrigued by these findings, Jafar-Nejad, Galeone and their colleagues reviewed the scientific literature and found previous work showing that NGLY1 is not exclusively a free cytosolic enzyme. Specifically, biochemical experiments had suggested that a small fraction of NGLY1 associates with the ER, although the functional significance of this association was not known.

The researchers showed that when misfolded BMP4 forms in the ER, NGLY1 is recruited to the ER through interaction with another protein called VCP.

Using laboratory-made NGLY1 mutations that impair NGLY1's ability to bind VCP and be recruited to the ER without affecting its ability to remove N-glycan groups, the researchers showed that a perfectly functional NGLY1 that cannot be recruited to the ER cannot remove N-glycan groups from misfolded BMP4 molecules. This leads to the accumulation of misfolded BMP4 molecules in the ER and induction of ER stress, both of which may contribute to disease.

"Importantly, pharmacological inhibition of proteasomal function resulted in accumulation of de-glycosylated BMP4 in the cell, but did not impair BMP4 signaling, strongly suggesting that the critical function of NGLY1 in BMP4 signaling is to help remove misfolded BMP4 molecules from the ER," Galeone said. "Once these molecules are in the cytosol, they do not inhibit normal BMP4 signaling anymore, whether they are degraded by the proteasome or not."

These and other experiments led the researchers to propose that in normal conditions, accumulation of misfolded BMP4 in the ER triggers recruitment of NGLY1 to the ER. The ER-associated NGLY1 removes the N-glycan groups from the misfolded BMP4 molecules, promoting their transfer into the cytosol. This in turn allows properly folded BMP4 molecules to traffic from the ER to the extracellular space, where they will conduct their function.

A better understanding of NGLY1 deficiency

Before this study, NFE2L1 was the only biologically relevant, direct target of NGLY1 that had been identified in animals. NFE2L1 is critical in the activation of proteasomal gene expression and can only function when its N-glycans are removed by NGLY1.

The researchers' findings identify a new critical target of NGLY1 and indicate a division of labor in NGLY1 function: only NGLY1 molecules recruited to the ER can remove glycan groups from BMP4, whereas cytosolic NGLY1 molecules can remove glycans from NFE2L1 without being recruited to the ER.

"This suggests that loss of NGLY1 not only leads to the accumulation of misfolded proteins in the cytosol, but can also result in the accumulation of other not-yet-identified NGLY1 targets in the ER," Jafar-Nejad said.

The study also suggests that, in addition to mutations that inactivate NGLY1, mutations that affect NGLY1's ability to be recruited to the ER might also cause some of the characteristics of NGLY1 deficiency observed in human patients.

Therefore, mutations that abolish NGLY1's binding to VCP, but spare its enzymatic activity, might cause a yet-to-be-determined subset of the characteristics of NGLY1 deficiency observed in human patients.

"Identification of a new direct target of NGLY1 with broad roles in mammalian biology may help explain how NGLY1 deficiency affects multiple organs in human patients, and potentially guide the discovery of therapeutic approaches," Galeone said.

BMP4 not only plays critical roles in animal development but also is implicated in certain cancers, such as ovarian and esophageal malignancies. Discovering important pathways involved in rare conditions such as NGLY1 deficiency also can benefit research on common diseases in which those pathways are involved.

Credit: 
Baylor College of Medicine

Single-cell RNA sequencing sheds new light on cancer cells' varied response to chemotherapy

Chemotherapy works by attacking rapidly dividing cells within the body. But small pockets of cancer cells can withstand its assault, allowing the cancer eventually to return.

Gaining a better understanding of why some cancer cells survive while others die is critical for making chemotherapy more effective, says Jun Hee Lee, Ph.D., a cancer researcher at the University of Michigan Rogel Cancer Center.

Using a technique called single-cell RNA sequencing, a research team from U-M was able to show for the first time how individual cells within a single population of cancer cells respond differently to the DNA damage caused by chemotherapy. The responses, they found, fall into three groups, each activating genes that control cell death, cell division, or stress response, according to findings published in Cell Reports.

"Collectively, we observed that cells with different fates actually had completely distinct sets of activated genes and that these different 'transcriptomic landscapes' dictate the fates of cells after DNA damage from chemotherapy," says Lee, co-senior author of the study and an associate professor of molecular and integrative physiology at Michigan Medicine.

While DNA contains the complete instruction manual for the cell, sequences that are transcribed into RNA tell the story of which genes are switched on or off at a given time -- that is, which sets of individual instructions are being acted upon. The transcriptome is the complete set of these RNA sequences within a given cell.

Applying single-cell techniques

Among scientists, single-cell analysis is frequently compared to a fruit smoothie, Lee notes. Many types of studies measure characteristics or responses across a group of cells -- a mixture of individual players that contribute to a greater whole, like fruit in a smoothie. This can provide useful information, but can also obscure differences between and among the individual contributors -- the cell-level equivalent of the strawberries and blueberries and bananas in the smoothie. Single-cell techniques allow those individual differences to be teased out.
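The smoothie analogy can be made concrete with a toy calculation. In this sketch, the per-cell expression values are invented for illustration: a bulk measurement reports only the blended average, which corresponds to no actual cell, while the per-cell view reveals two distinct subpopulations.

```python
import statistics

# Hypothetical per-cell expression values for one stress-response gene,
# drawn from two subpopulations: "low" responders and "high" responders.
low_responders = [0.9, 1.1, 1.0, 0.8, 1.2]
high_responders = [9.8, 10.2, 10.1, 9.9, 10.0]
all_cells = low_responders + high_responders

# A bulk ("smoothie") measurement blends everything into one number ...
bulk_mean = statistics.mean(all_cells)
print(f"bulk mean: {bulk_mean:.2f}")  # 5.50 -- a level no single cell shows

# ... while a single-cell view keeps the individual values apart.
for value in sorted(all_cells):
    print(f"cell expression: {value:.1f}")
```

The bulk mean of 5.5 sits between the two groups and would suggest a uniform intermediate response; only the per-cell values expose the bimodal structure.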

The study analyzed more than 10,000 cells from three colon cancer cell lines. The cells were exposed to the chemotherapy agent fluorouracil, which is commonly used against colon cancer and other types of cancer. Some of the observations were replicated with additional techniques and different chemotherapy drugs.

"Previously, the scientific consensus was that DNA damage leads to a fairly uniform transcriptional response, which leads to different cell fates in a passive way, based on the given levels of gene expression in the cell," Lee says. "In contrast, we found that different DNA damage response genes were often upregulated only in the subset of cells that all share a particular cell fate."
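The idea that distinct activated gene sets mark distinct cell fates can be sketched as a simple classifier. This is not the study's actual pipeline; the gene names, program groupings, and expression values below are invented for illustration. Each cell is assigned to whichever of the three fate programs (death, division, stress response) shows the highest average upregulation.

```python
# Hypothetical gene programs; real analyses use curated gene sets.
PROGRAMS = {
    "cell death": ["geneA", "geneB"],
    "cell division": ["geneC", "geneD"],
    "stress response": ["geneE", "geneF"],
}

def classify_cell(expression):
    """Return the fate program whose genes have the highest mean expression."""
    scores = {
        fate: sum(expression[g] for g in genes) / len(genes)
        for fate, genes in PROGRAMS.items()
    }
    return max(scores, key=scores.get)

# One toy cell with strong upregulation of the stress-response genes.
cell = {"geneA": 0.2, "geneB": 0.1, "geneC": 0.3,
        "geneD": 0.2, "geneE": 4.1, "geneF": 3.9}
print(classify_cell(cell))  # stress response
```

The key point mirrors the quote above: the cell is labeled by which gene set is switched on, not by graded levels of a single shared response.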

The group is conducting ongoing research to understand what factors cause some cells to have one fate and others a different one.

"If we learn that there's a certain sub-population of cells with specific characteristics that allow them to survive chemotherapy when other cells die, then scientists might look for ways to target those cells specifically," Lee says.

Making data available to other researchers

The research team, which was co-led by Hyun Min Kang, Ph.D., an associate professor of biostatistics at the School of Public Health, is also making their data available online for other researchers.

"For instance, other scientists can examine how individual genes are expressed across single cells before and after chemotherapy treatments, and how the specific gene expression is correlated with the chemotherapy dose, or with the expression of other genes," Kang says. "The online tool can also be used by researchers to test new hypotheses and generate new data -- and therefore has the potential to accelerate future research on DNA damage responses."

Credit: 
Michigan Medicine - University of Michigan