Tech

CNIO and CNIC find clues to clarify why cohesin has a role in cancer and cardiac development

image: Transverse sections of wild-type (WT) and STAG2 knockout (KO) murine embryos at day 9.5 of gestation, stained with antibodies against STAG2 (top) or with hematoxylin-eosin (H&E, bottom), reveal the smaller size and malformation of hearts that develop in the absence of cohesin-STAG2.

Image: 
CNIO

Ana Losada, from the Spanish National Cancer Research Centre (CNIO), is the scientist who identified cohesin, a protein complex essential for cell division, in vertebrates. She has studied cohesin since that discovery, made in 1998 at the Cold Spring Harbor Laboratory in New York. "It is fascinating," she says. "Now we know that cohesin plays a role in several types of cancer, and a large number of researchers worldwide have become greatly interested in this protein complex."

This week, Losada, Head of the Chromosome Dynamics Group at CNIO, publishes her team's new findings on the role of cohesin in the journal Cell Reports. Co-authors are Paco Real, Head of the Epithelial Carcinogenesis Group at CNIO, and Miguel Manzanares, Head of the Functional Genomics Research Group at the Spanish National Centre for Cardiovascular Research (CNIC), currently at the Severo Ochoa Centre for Molecular Biology (CBMSO). Andrés Hidalgo, Head of the Imaging the Cardiovascular Inflammation and the Immune Response Group at CNIC, has also participated in the study.

Cohesin has a role in chromosome segregation, forming a ring that "embraces" DNA. The classical image of a dividing cell shows the chromatids of duplicated chromosomes held together at the centromere forming an X. Cohesin holds the sister chromatids together in a bond that is dissolved at the end of the cell division process and in this way ensures their equal distribution to daughter cells.

Cohesin was first discovered in budding yeast, a single-cell microorganism. Losada identified it in an amphibian, the African clawed frog (Xenopus laevis), whose cohesin is similar to that of humans.

The discovery opened a new field of research that has since attracted the interest of research teams worldwide, and there is still a lot to study. Cohesin "has been evolutionarily conserved in very different species for millions of years, which means it has a very important role," Losada says.

Soon afterwards, other functions of cohesin were discovered. In addition to holding the two chromosome copies together until the daughter cells separate at the end of the cell division process, cohesin "participates in genome architecture, in DNA spatial organisation," Losada adds. Cohesin participates in the appropriate folding of chromosomes, which is essential for the information encoded in the genome to be read at the right moment and in the right cell types.

Approximately five years ago, mutations in genes encoding cohesin were identified in several types of cancer. In bladder cancer, for instance, cohesin mutations are very common, as found by the CNIO groups headed by Paco Real and Núria Malats. Furthermore, mutations in genes encoding cohesin and its regulators are associated with rare diseases known as cohesinopathies. These findings have attracted the interest of more research groups worldwide.

Losada no longer studies frogs but mice, the main animal model at CNIO. Her work sheds light on the mechanisms of cohesin function and on what happens when cohesin does not work properly.

"Actually, cohesin is a protein complex made of multiple subunits," Losada explains. "Cells have two versions of this complex; one of them carries the STAG1 subunit, and the other, the STAG2 subunit. Although they are similar, they are not identical, so we want to understand why they are there, what they do and what the consequences are if the cell loses one or the other."

Essential in embryos, nonessential in adults

The answers to these curiosity-driven questions will most probably help scientists understand the biology of cancer and other diseases.

In the new article published in Cell Reports, Losada, Real and Manzanares report the generation of a mouse model in which they were able to inactivate the STAG2 subunit at different developmental stages, including adulthood. They found that STAG2 is largely nonessential in adult animals: its inactivation in all tissues did not cause tumours, but it was detrimental to the animals' health.

In embryos, however, STAG2 is essential: embryos lacking it die before birth, their overall development is delayed, and they show severe defects in cardiac development.

The finding may seem just a small puzzle piece, but it will probably help scientists understand the functioning of cells and organisms better. "Our results provide substantial evidence of the specific functions performed by the different subunits of the cohesin complex in different cells and tissues and how their malfunction is associated with disease," the authors of the article add.

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

How airplanes counteract St. Elmo's Fire during thunderstorms

CAMBRIDGE, MA -- At the height of a thunderstorm, the tips of cell towers, telephone poles, and other tall, electrically conductive structures can spontaneously emit a flash of blue light. This electric glow, known as a corona discharge, is produced when the air surrounding a conductive object is briefly ionized by an electrically charged environment.

For centuries, sailors observed corona discharges at the tips of ship masts during storms at sea. They named the phenomenon St. Elmo's fire, after the patron saint of sailors.

Scientists have found that a corona discharge can strengthen in windy conditions, glowing more brightly as the wind further electrifies the air. This wind-induced intensification has been observed mostly in electrically grounded structures, such as trees and towers. Now aerospace engineers at MIT have found that wind has an opposite effect on ungrounded objects, such as airplanes and some wind turbine blades.

In some of the last experiments performed in MIT's Wright Brothers Wind Tunnel before it was dismantled in 2019, the researchers exposed an electrically ungrounded model of an airplane wing to increasingly strong wind gusts. They found that the stronger the wind, the weaker the corona discharge, and the dimmer the glow that was produced.

The team's results were first published in the Journal of Geophysical Research: Atmospheres on July 28. The study's lead author is Carmen Guerra-Garcia, an assistant professor of aeronautics and astronautics at MIT. Her co-authors at MIT are Ngoc Cuong Nguyen, a senior research scientist; Theodore Mouratidis, a graduate student; and Manuel Martinez-Sanchez, a post-tenure professor of aeronautics and astronautics.

Electric friction

Within a storm cloud, friction can build up to produce extra electrons, creating an electric field that can reach all the way to the ground. If that field is strong enough, it can break apart surrounding air molecules, turning neutral air into a charged gas, or plasma. This process most often occurs around sharp, conductive objects such as cell towers and wing tips, because pointed structures tend to concentrate the electric field, pulling electrons from surrounding air molecules toward the point and leaving behind a veil of positively charged plasma immediately around the sharp object.

Once a plasma has formed, the molecules within it can begin to glow via the process of corona discharge, where excess electrons in the electric field ping-pong against the molecules, knocking them into excited states. In order to come down from those excited states, the molecules emit a photon of energy, at a wavelength that, for oxygen and nitrogen, corresponds to the characteristic blueish glow of St. Elmo's fire.

In previous laboratory experiments, scientists found that this glow, and the energy of a corona discharge, can strengthen in the presence of wind. A strong gust can essentially blow away the positively charged ions that were locally shielding the electric field and reducing its effect -- making it easier for electrons to trigger a stronger, brighter glow.

These experiments were mostly carried out with electrically grounded structures, and the MIT team wondered whether wind would have the same strengthening effect on a corona discharge that was produced around a sharp, ungrounded object, such as an airplane wing.

To test this idea, they fabricated a simple wing structure out of wood and wrapped it in foil to make it electrically conductive. Rather than try to produce an ambient electric field similar to what would be generated in a thunderstorm, the team studied an alternative configuration in which the corona discharge was generated on a metal wire running parallel to the length of the wing, with a small high-voltage power source connected between the wire and the wing. They fastened the wing to a pedestal made from an insulating material that, because of its nonconductive nature, left the wing itself electrically suspended, or ungrounded.

The team placed the entire setup in MIT's Wright Brothers Wind Tunnel and subjected it to wind speeds of up to 50 meters per second, while also varying the voltage applied to the wire. During these tests, they measured the electrical charge building up on the wing and the corona current, and used an ultraviolet-sensitive camera to observe the brightness of the corona discharge on the wire.

In the end, they found that the strength of the corona discharge and its resulting brightness decreased as the wind increased -- a surprising and opposite effect from what scientists have seen for wind acting on grounded structures.

Pulled against the wind

The team developed numerical simulations to try to explain the effect, and found that, for ungrounded structures, the process is largely similar to what happens with grounded objects -- but with something extra.

In both cases, the wind blows away the positive ions generated by the corona, leaving behind a stronger field in the surrounding air. Ungrounded structures, however, being electrically isolated, become more negatively charged as a result, which weakens the positive corona discharge. The amount of negative charge that the wing retains is set by the competing effects of positive ions blown away by the wind and ions attracted and pulled back by the wing's negative excursion. This secondary effect, the researchers found, acts to weaken the local electric field as well as the corona discharge's electric glow.
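This competition lends itself to a toy calculation. The sketch below integrates a minimal charge balance -- ion current swept off by the wind versus a return current that grows with the wing's accumulated negative charge -- to find the equilibrium charge at different wind speeds. Every parameter value is hypothetical, chosen only for illustration; this is not the MIT team's simulation.

```python
# Toy charge balance for a positive corona on an ungrounded wing (every value
# hypothetical, chosen only for illustration; this is not the MIT simulation):
#   dQ/dt = -(ion current swept off by the wind) + (return current ~ |Q|)
def equilibrium_charge(wind_speed, i_ion0=50e-6, k_wind=0.02, k_return=0.25):
    """wind_speed in m/s; i_ion0 in A; k_wind per (m/s); k_return in 1/s."""
    q, dt = 0.0, 0.01
    for _ in range(3000):                       # 30 s of toy dynamics
        swept = i_ion0 * k_wind * wind_speed    # positive ions lost downwind
        q += (-swept + k_return * abs(q)) * dt  # wing charges negative
    return q

for v in (0, 10, 30, 50):
    print(f"wind {v:2d} m/s -> equilibrium wing charge {equilibrium_charge(v) * 1e6:7.1f} uC")
```

The stronger the wind, the more negative the equilibrium charge, mirroring the weakened positive corona described above.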

"The corona discharge is the first stage of lightning in general," Guerra-Garcia says. "How corona discharge behaves is important and kind of sets the stage for what could happen next in terms of electrification."

In flight, aircraft such as planes and helicopters inherently produce wind as they move through the air, and a glow corona system like the one tested in the wind tunnel could be used to control the electrical charge of the vehicle. In prior work, Guerra-Garcia and her colleagues showed that if a plane could be negatively charged in a controlled fashion, its risk of being struck by lightning could be reduced. The new results show that an aircraft in flight can be charged to negative values using a controlled positive corona discharge.

"The exciting thing about this study is that, while trying to demonstrate that the electrical charge of an aircraft can be controlled using a corona discharge, we actually discovered that classical theories of corona discharge in wind do not apply for airborne platforms, which are electrically isolated from their environment," Guerra-Garcia says. "Electrical breakdown occurring in aircraft really presents some unique features that do not allow the direct extrapolation from ground studies."

Credit: 
Massachusetts Institute of Technology

'Insect apocalypse' may not be happening in US

image: Scientists have been warning about an 'insect apocalypse' in recent years, noting sharp declines in specific areas -- particularly in Europe. A new study shows these warnings may have been exaggerated and are not representative of what's happening to insects on a larger scale.

Image: 
UGA

Scientists have been warning about an "insect apocalypse" in recent years, noting sharp declines in specific areas -- particularly in Europe. A new study shows these warnings may have been exaggerated and are not representative of what's happening to insects on a larger scale.

University of Georgia professor of agroecology Bill Snyder sought to find out whether the so-called "insect apocalypse" is really going to happen and, if so, whether it has already begun. Some scientists say it might be only 30 years before all insects are extinct, so this is an important and timely question for agriculture and conservation.

Snyder and a team of researchers from UGA, Hendrix College and the U.S. Department of Agriculture used more than 5,300 data points for insects and other arthropods -- collected over four to 36 years at monitoring sites representing 68 natural and managed areas -- to search for evidence of declines across the United States.

Some groups and sites showed increases or decreases in abundance and diversity, but many remained unchanged, yielding net abundance and biodiversity trends generally indistinguishable from zero. This lack of overall increase or decline was consistent across arthropod feeding groups and was similar for heavily disturbed versus relatively natural sites. These results were recently published in Nature Ecology & Evolution.
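As a rough illustration of what "net trends indistinguishable from zero" means in practice, the sketch below fits a log-linear trend to each of a batch of simulated abundance time series and summarizes the distribution of slopes. It is a schematic stand-in for the authors' analysis, not their code, and the data are fake.

```python
# Schematic stand-in for the trend analysis described above (simulated data,
# not the authors' code): fit a log-linear trend to each abundance series and
# look at the distribution of slopes across series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1990, 2015)

def trend(counts):
    """Least-squares slope of log10(count + 1) against year."""
    return np.polyfit(years, np.log10(counts + 1.0), 1)[0]

# Stand-in for the ~5,300 LTER time series; Poisson noise, no built-in trend.
slopes = np.array([trend(rng.poisson(100, years.size)) for _ in range(5300)])

print(f"mean slope {slopes.mean():+.5f} log10-units/yr (sd {slopes.std():.5f}): "
      "about as many series go up as down, so the net trend is ~ 0")
```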

Local observations

The idea for the study started last year with a cross-country road trip for Snyder from Washington state to his new home in Georgia.

"I had the same observation a lot of people had. We had our drive across the country -- you don't see as many insects squished on your car or windshield."

When he got to his home in Bishop, Georgia, it seemed like a different story.

"I noticed the lights outside were full of insects, as many as I remember as a kid," he said. "People have this notion -- there seems like [there are] fewer insects -- but what is the evidence?"

There is some alarming evidence that European honey bees have problems, but Snyder was curious if insects everywhere are in decline. "We depend on insects for so many things," he said. "If insects disappear it would be really, really bad. Maybe the end of human existence."

He was discussing the topic with another biologist and friend, Matthew Moran at Hendrix College, and they recalled the U.S. National Science Foundation's Long-Term Ecological Research (LTER) program, established in 1980, whose network of 25 monitoring locations spans the country's major ecoregions.

Ecological sampling

The NSF's LTER data is publicly available, but until now it had not been gathered into a single dataset and examined for evidence of broad-scale density and biodiversity change through time.

Arthropod data sampled by the team included grasshoppers in the Konza Prairie in Kansas; ground arthropods in the Sevilleta desert/grassland in New Mexico; mosquito larvae in Baltimore, Maryland; macroinvertebrates and crayfish in North Temperate Lakes in Wisconsin; aphids in the Midwestern U.S.; crab burrows in Georgia coastal ecosystems; ticks in Harvard Forest in Massachusetts; caterpillars in Hubbard Brook in New Hampshire; arthropods in Phoenix, Arizona; and stream insects in the Arctic in Alaska.

The team compared the samples with human footprint index data, which combines multiple factors such as insecticides, light pollution and built environments, to see if there were any overall trends.
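A minimal sketch of that comparison step (simulated data and hypothetical index values, not the study's analysis): regress the per-site abundance trends on the human footprint index and check for a relationship.

```python
# Companion sketch to the trend screen above (simulated data, hypothetical
# index values): regress each site's abundance trend on the human footprint
# index and test for an overall driver.
import numpy as np

rng = np.random.default_rng(1)
footprint = rng.uniform(0, 50, size=68)        # one index value per site
site_slopes = rng.normal(0.0, 0.01, size=68)   # per-site trends ~ 0, as found

beta = np.polyfit(footprint, site_slopes, 1)[0]
r = np.corrcoef(footprint, site_slopes)[0, 1]
print(f"trend vs footprint: beta = {beta:+.6f}, r = {r:+.3f} (no clear relation)")
```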

"No matter what factor we looked at, nothing could explain the trends in a satisfactory way," said Michael Crossley, a postdoctoral researcher in the UGA department of entomology and lead author of the study. "We just took all the data and, when you look, there are as many things going up as going down. Even when we broke it out in functional groups there wasn't really a clear story like predators are decreasing or herbivores are increasing."

"This is an implication for conservation and one for scientists, who have been calling for more data due to under-sampling in certain areas or certain insects. We took this opportunity to use this wealth of data that hasn't been used yet," explained Crossley, an agricultural entomologist who uses molecular and geospatial tools to understand pest ecology and evolution and to improve management outcomes. "There's got to be even more data sets that we don't even know about. We want to continue to canvass to get a better idea about what's going on."

Good and bad news

The answer to Snyder's broad question -- "Are there overall declines?" -- is no, according to the study. "But we're not going to ignore small changes," Snyder said. "It's worthwhile to differentiate between the two issues."

Particular insect species that we rely on for the key ecosystem services of pollination, natural pest control and decomposition remain unambiguously in decline in North America, the authors note.

In Europe, where studies have found dramatic insect declines, there may be a bigger, longer-term impact on insects than in the U.S., which has a lower population density, according to Snyder.

"It's not the worst thing in the world to take a deep breath," suggested Snyder. "There's been a lot of environmental policies and changes. A lot of the insecticides used in agriculture now are narrow-acting. Some of those effects look like they may be working."

When it comes to conservation, there's always room for everyone to pitch in and do their part.

"It's hard to tell when you're a single homeowner if you're having an effect when you plant more flowers in your garden," he said. "Maybe some of these things we're doing are starting to have a beneficial impact. This could be a bit of a hopeful message that things that people are doing to protect bees, butterflies and other insects are actually working."

Credit: 
University of Georgia

Physicists cast doubt on neutrino theory

image: UC associate professor Alexandre Sousa holds a neutrino model that changes color when tossed in the air to demonstrate how neutrinos change "flavor."

Image: 
Joseph Fuqua II/UC Creative + Brand

University of Cincinnati physicists, as part of an international research team, are raising doubts about the existence of an exotic subatomic particle that failed to show up in twin experiments.

UC College of Arts and Sciences associate professor Alexandre Sousa and assistant professor Adam Aurisano took part in an experiment at the Fermi National Accelerator Laboratory in search of sterile neutrinos, a suspected fourth "flavor" of neutrino that would join the ranks of muon, tau and electron neutrinos as elementary particles that make up the known universe.

Finding a fourth type of neutrino would be huge, Sousa said. It would redefine our understanding of elementary particles and their interactions in what's known as the Standard Model.

Researchers in two experiments called Daya Bay and MINOS+ collaborated on complementary projects in an intense effort to find sterile neutrinos using some of the world's most advanced and precise tools.

"We apparently don't see any evidence for them," Aurisano said.

The study was published in the journal Physical Review Letters and was featured in Physics Magazine, published by the American Physical Society.

"It's an important result for particle physics," Sousa said. "It provides an almost definitive answer to a question that has been lingering for over 20 years."

The research builds on previous studies that offered tantalizing possibilities for finding sterile neutrinos. But the new results suggest sterile neutrinos might not have been responsible for the anomalies researchers previously observed, Aurisano said.

"Our results are incompatible with the sterile neutrino interpretation of the anomalies," he said. "So these experiments remove a possibility - the leading possibility - that oscillations into sterile neutrinos solely explain these anomalies."

Neutrinos are tiny, so tiny they can't be broken down into something smaller. They are so small that they pass through virtually everything - mountains, lead vaults, you - by the trillions every second at virtually the speed of light. They are generated by the nuclear fusion reactions powering the sun, radioactive decays in nuclear reactors or in the Earth's crust, and in particle accelerator labs, among other sources.

And as they travel, they often transition from one type (tau, electron, muon) to another or back.

But theorists have suggested there might be a fourth neutrino that interacts only with gravity, making it far harder to detect than the other three, which also interact with matter through the weak nuclear force.

The Daya Bay experiment is composed of eight detectors arrayed around six nuclear reactors outside Hong Kong. MINOS+ uses a particle accelerator in Illinois to shoot a beam of neutrinos 456 miles through the Earth's crust to detectors waiting in Minnesota.
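For orientation, disappearance searches like these rest on the textbook two-flavor oscillation probability; a sterile neutrino would add an extra disappearance term with its own mass splitting and mixing angle. The snippet below evaluates the standard formula with illustrative parameters (the values are not the experiments' fit results).

```python
# Textbook two-flavor disappearance probability (for orientation only; the
# collaborations fit far more complete models). Units: dm2 in eV^2, L in km,
# E in GeV; the constant 1.267 absorbs hbar and c.
import math

def survival(L_km, E_GeV, dm2_eV2, sin2_2theta):
    """P(nu_mu -> nu_mu) = 1 - sin^2(2*theta) * sin^2(1.267 * dm2 * L / E)."""
    return 1.0 - sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

L, E = 735.0, 3.0   # MINOS+ baseline (456 miles ~ 735 km), illustrative energy
print(survival(L, E, dm2_eV2=2.4e-3, sin2_2theta=0.95))  # standard oscillation
print(survival(L, E, dm2_eV2=1.0, sin2_2theta=0.10))     # hypothetical sterile term
```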

"We would all have been absolutely thrilled to find evidence for sterile neutrinos, but the data we have collected so far do not support any kind of sterile neutrino oscillation," said Pedro Ochoa-Ricoux, associate professor at the University of California, Irvine.

Researchers expected to see muon neutrinos seemingly vanish into thin air when they transitioned into sterile neutrinos. But that's not what happened.

"We expected to see muon neutrinos oscillating to sterile neutrinos and disappear," Aurisano said.

Despite the findings, Aurisano said he thinks sterile neutrinos do exist, at least in some form.

"I think sterile neutrinos are more likely than not to exist at high energies. At the very beginning of the universe, you'd expect there would be sterile neutrinos," he said. "Without them, it's hard to explain aspects of neutrino mass."

But Aurisano is skeptical about the light sterile neutrinos that many theorists expected to find in the experiments.

"Our experiment disfavors light or lower-mass sterile neutrinos," he said.

Sousa said some of his research was cut short by the global COVID-19 pandemic, when Fermilab shut down accelerator operations months earlier than expected. But researchers continued to use massive supercomputers to examine data from the experiments, even while working from home during the quarantine.

"It's one of the blessings of high energy physics," Aurisano said. "Fermilab has all the data online and the computing infrastructure is spread out around the world. So as long as you have the internet you can access all the data and all the computational facilities to do the analyses."

Still, Aurisano said it takes some adjusting to work from home.

"It was easier when I had dedicated hours at the office. It's a challenge sometimes to work from home," he said.

Credit: 
University of Cincinnati

COMBAT study preliminary results show response of 32% in treatment of pancreatic tumors

SCOTTSDALE, Ariz. -- Aug 11, 2020 -- Working with an international team of researchers, HonorHealth Research Institute and the Translational Genomics Research Institute (TGen), an affiliate of City of Hope, were instrumental in one of the first clinical trials showing how pancreatic cancer patients can benefit from immunotherapy, according to a four-year study published in a premier scientific journal, Nature Medicine.

The "COMBAT trial" (NCT02826486) is a prospective, open label, phase IIa clinical trial for patients with metastatic pancreatic cancer, meaning their cancer had spread to other parts of the body. Patients were given pembrolizumab, an immune therapy drug, in combination with BL-8040, an agent that makes the tumor microenvironment more receptive to immune therapy.

The study was conducted in Arizona at the HonorHealth Research Institute and at 30 other locations in the U.S. and across the globe, including Spain, Israel and South Korea.

The two-part clinical trial began in September 2016:

Cohort 1, a group of 37 patients whose cancer had already progressed on other therapies, was treated with pembrolizumab and BL-8040. Importantly, this combination therapy appeared to make pancreatic cancer "hot," meaning it could work in tandem with the body's own immune system. Previous studies have shown pancreatic tumors to be "cold," meaning immune therapies like pembrolizumab were not able to act on the cancer.

Preliminary results of Cohort 2 were reported in the manuscript on a group of 22 patients (out of approximately 40 patients in total expected in the cohort), who had previously received one line of chemotherapy. These patients received pembrolizumab and BL-8040, as well as chemotherapy drugs 5-fluorouracil and nano-liposomal irinotecan.

"The percentage of meaningful tumor shrinkage was 32% in Cohort 2, which is double what is available for individuals with pancreatic cancer with traditional chemotherapy. While the study is small, these preliminary results are encouraging and there is hope that we will be able to do larger trials to see if the response to therapy is high and if it is better in comparison to traditional treatment," said Erkut Borazanci, M.D., M.S., a medical oncologist and physician-investigator at HonorHealth Research Institute, a clinical associate professor at TGen, and one of the paper's authors.

The clinical trial is currently in a follow-up phase.

Pancreatic cancer is an aggressive disease that carries a high mortality rate. It is the third-leading cause of cancer death in the U.S., following lung and colorectal cancers. As of 2020, the five-year survival rate for pancreatic cancer is 10%, up from 6% in 2014.

Next steps for this research would be to compare the COMBAT combination therapy in future studies with other treatment options, such as 5-fluorouracil, leucovorin and nano-liposomal irinotecan.

COMBAT derives its name from letters in one of the study's descriptions: Combination of BL-8040 and Pembrolizumab in Patients with Metastatic Pancreatic Cancer.

Credit: 
The Translational Genomics Research Institute

NASA finds Mekkhala coming apart after landfall in Southeastern China

image: NASA-NOAA's Suomi NPP satellite provided forecasters with a visible image of Tropical Storm Mekkhala as it was making landfall in southeastern China.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA-NOAA's Suomi NPP satellite provided forecasters with a visible image of former Typhoon Mekkhala shortly after it made landfall in southeastern China. Wind shear had torn the storm apart.

Mekkhala made landfall in Fujian Province, southeastern China, bringing strong winds and torrential rain. According to the China Meteorological Administration, the storm came ashore in coastal areas of Zhangpu County at around 7:30 a.m. local time on Aug. 11 (7:30 p.m. EDT on Aug. 10). It had dropped at least 170 mm (6.7 inches) of rainfall on Zhangpu County by midday on Aug. 11.

At 11 p.m. EDT on Aug. 10 (0300 UTC, Aug. 11) the Joint Typhoon Warning Center (JTWC) issued their final bulletin on Mekkhala. At that time, the storm was centered near latitude 24.1 degrees north and longitude 117.7 degrees east, about 216 nautical miles west-southwest of Taipei, Taiwan. Mekkhala's maximum sustained winds were near 70 knots (81 mph/129 kph) at landfall. It continued to move to the north-northwest.

Wind Shear Tearing at Mekkhala

The shape of a tropical cyclone provides forecasters with an idea of its organization and strength. When outside winds batter a storm, they can change its shape and push much of the associated clouds and rain to one side of it -- which is what wind shear does.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds, and each level needs to be stacked on top of the others vertically for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder, weakening the rotation by pushing it apart at different levels.
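As a concrete example, forecasters often quantify this as bulk shear: the magnitude of the vector wind difference between an upper and a lower level. The sketch below computes it for hypothetical winds (a standard diagnostic, not data from Mekkhala).

```python
# Bulk vertical wind shear, a standard forecasting diagnostic (illustrative
# winds, not data from Mekkhala): the magnitude of the vector wind difference
# between an upper level (e.g., 200 hPa) and a lower level (e.g., 850 hPa).
import math

def bulk_shear(u_hi, v_hi, u_lo, v_lo):
    """Magnitude of the vector wind difference, in the units of the inputs."""
    return math.hypot(u_hi - u_lo, v_hi - v_lo)

# Strong upper-level westerlies over weak low-level easterlies: high shear,
# hostile to a tropical cyclone.
print(f"{bulk_shear(35.0, 5.0, -5.0, 0.0):.0f} kt of deep-layer shear")
```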

Suomi NPP's Satellite View

On Aug. 11, the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard NASA-NOAA's Suomi NPP satellite showed a somewhat shapeless storm. Satellite imagery showed that Typhoon Mekkhala was quickly becoming sheared, with the outer low-level bands of thunderstorms on the north side of the system exposed, but with deep convection and strong thunderstorms remaining in place over and south of the center.

The JTWC expects Mekkhala to dissipate within the next day or two.

Credit: 
NASA/Goddard Space Flight Center

New approach for calculating radiation dosimetry allows for individualized therapy

image: (A) Initial coarse alignment of full image volume for serial 4-, 24-, and 96-h quantitative SPECT images. (B) Output of focused image registration based on contour-defined mask (dashed region of serial maximum-intensity projections in A). Three-color maximum-intensity projection is used to verify alignment for serial quantitative SPECT images, with 3-dimensional contour region highlighted. Black indicates good alignment whereas color-fringing shows areas with spatial offset. (C) Three-phase exponential curve is generated and parameters are used for subsequent population analysis.

Image: 
P. Jackson, M. Hofman et al., Peter MacCallum Cancer Centre, Melbourne, Australia

Reston, VA--Researchers have developed a simplified process that could enhance personalization of cancer therapy based on a single nuclear medicine scan. The novel method uses a follow-up single photon emission computed tomography/computed tomography (SPECT/CT) scan to obtain reliable radiation dose estimates to tumors and at-risk organs. The study is published in The Journal of Nuclear Medicine.

"In most radionuclide therapies, a fixed amount of radionuclide is administered, which does not take into account individual patient variability in tumor and normal tissue uptake, and hence absorbed dose," said John Violet, BSc, MBBS, MRCP, FRCR, PhD, FRANZCR, radiation oncologist at the Peter MacCallum Cancer Centre in Melbourne, Australia. "'Dosimetry-led' administration, on the other hand, where administered activity is adjusted based upon predictable tumor and normal tissue absorbed doses, has the potential to greatly improve therapeutic outcomes. As dosimetry requires multiple SPECT/CT scans post-therapy, however, it is rarely performed outside clinical trials."

Rather than rely on multiple SPECT/CT scans to calculate dosimetry, researchers sought to develop a dose-response estimation from a single post-treatment scan. The study presented a simplified dosimetry method and applied it to the assessment of 177Lu-PSMA-617 therapy for metastatic prostate cancer. 177Lu-PSMA-617 therapy is known to be a highly active therapy for PSMA-expressing metastatic prostate cancers, but currently there is no standardization of how treatment is given.

In the study, 29 patients receiving 177Lu-PSMA-617 were imaged with SPECT/CT at four, 24 and 96 hours to characterize tracer pharmacokinetics in tumors and in normal tissues of interest. To evaluate serial measurements, three-dimensional volumes of interest were defined on the first image volume and a method was developed to automatically align each contoured subregion and assess pharmacokinetics. This data was used to develop a population PSMA clearance model for tumor and healthy organs.

By applying a generic pharmacokinetic model for PSMA retention, it is possible to estimate radiation absorbed dose with a single image measurement. The range of added uncertainty can be appreciated based on the variability in population curves when normalized to the single time point. Tumor dose estimates were most accurate using delayed scanning at times beyond 72 hours. Dose to healthy tissues was best characterized by scanning patients in the first two days of treatment because of the larger degree of tracer clearance in this early phase.
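To make the idea concrete, here is a minimal single-time-point dosimetry sketch under simplifying assumptions (a mono-exponential population clearance and a fixed S-value; all numbers are hypothetical, and this is not the authors' pipeline): anchor the population curve to the one measured activity, integrate it, and convert the time-integrated activity to absorbed dose.

```python
# Minimal single-time-point dosimetry sketch (illustrative assumptions, not
# the authors' pipeline): assume a mono-exponential population clearance with
# effective half-life T_eff, anchor it to one measured activity, integrate to
# get the time-integrated activity, and convert to dose with an S-value.
import math

def dose_from_single_scan(A_meas_MBq, t_meas_h, T_eff_h, S_mGy_per_MBq_h):
    """Absorbed dose assuming A(t) = A0 * exp(-lambda_eff * t)."""
    lam = math.log(2) / T_eff_h
    A0 = A_meas_MBq * math.exp(lam * t_meas_h)  # back-extrapolate to t = 0
    tia = A0 / lam                              # time-integrated activity, MBq*h
    return tia * S_mGy_per_MBq_h

# Hypothetical numbers: 50 MBq seen in a tumour at 96 h (the delayed time point
# favoured above), T_eff = 60 h, S-value = 0.02 mGy per MBq*h.
print(f"{dose_from_single_scan(50.0, 96.0, 60.0, 0.02):.0f} mGy")
```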

"This work shows for the first time that following Lu-PSMA administration, it is possible to determine a good estimate of activity and absorbed dose with just a single measurement," noted Violet. "Simplification of dosimetry methods to make them a routine application would greatly assist in developing radiobiological models specific to Lu-PSMA, and other radionuclide therapies, allowing optimization of therapy for individual patients that are maximally effective and safe."

Improved understanding of radiation dose-response to radionuclide therapy may provide a rationale for moving away from fixed activity prescriptions. This would allow clinicians to tailor treatments based on an individual patient's physiology so successive treatment cycles could be adjusted to yield improved treatment outcomes. "The routine application of radiation dosimetry would be an important step in achieving the goal of 'dosimetry-led' treatment, providing significant improvements in both the safety and efficacy of treatment," explained Violet. "The adoption of routine dosimetry across molecular imaging and nuclear medicine in general would be a major advance in the field for both the clinical application of established therapies and the optimization of novel treatments."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

Stack and twist: physicists accelerate the hunt for revolutionary new materials

Scientists at the University of Bath have taken an important step towards understanding the interaction between layers of atomically thin materials arranged in stacks. They hope their research will speed up the discovery of new, artificial materials, leading to the design of electronic components that are far tinier and more efficient than anything known today.

Smaller is always better in the world of electronic circuitry, but there's a limit to how far you can shrink a silicon component without it overheating and falling apart, and we're close to reaching it. The researchers are investigating a group of atomically thin materials that can be assembled into stacks. The properties of any final material depend both on the choice of raw materials and on the angle at which one layer is arranged on top of another.

Dr Marcin Mucha-Kruczynski who led the research from the Department of Physics, said: "We've found a way to determine how strongly atoms in different layers of a stack are coupled to each other, and we've demonstrated the application of our idea to a structure made of graphene layers."

The Bath research, published in Nature Communications, is based on earlier work on graphene - a crystal characterised by thin sheets of carbon atoms arranged in a honeycomb design. In 2018, scientists at the Massachusetts Institute of Technology (MIT) found that when two layers of graphene are stacked and then twisted relative to each other by the 'magic' angle of 1.1°, they produce a material with superconducting properties. This was the first time scientists had created a superconducting material made purely from carbon. However, these properties disappeared with the smallest change of angle between the two layers of graphene.

Since the MIT discovery, scientists around the world have been attempting to apply this 'stacking and twisting' phenomenon to other ultra-thin materials, placing together two or more atomically different structures in the hope of forming entirely new materials with special qualities.

"In nature, you can't find materials where each atomic layer is different," said Dr Mucha-Kruczynski. "What's more, two materials can normally only be put together in one specific fashion because chemical bonds need to form between layers. But for materials like graphene, only the chemical bonds between atoms on the same plane are strong. The forces between planes - known as van der Waals interactions - are weak, and this allows for layers of material to be twisted with respect to each other."

The challenge for scientists now is to make the process of discovering new, layered materials as efficient as possible. By finding a formula that allows them to predict the outcome when two or more materials are stacked, they will be able to streamline their research enormously.

It is in this area that Dr Mucha-Kruczynski and his collaborators at the University of Oxford, Peking University and ELETTRA Synchrotron in Italy expect to make a difference.

"The number of combinations of materials and the number of angles at which they can be twisted is too large to try out in the lab, so what we can predict is important," said Dr Mucha-Kruczynski.

The researchers have shown that the interaction between two layers can be determined by studying a three-layer structure where two layers are assembled as you might find in nature, while the third is twisted. They used angle-resolved photoemission spectroscopy - a process in which powerful light ejects electrons from the sample so that the energy and momentum from the electrons can be measured, thus providing insight into properties of the material - to determine how strongly two carbon atoms at a given distance from each other are coupled. They have also demonstrated that their result can be used to predict properties of other stacks made of the same layers, even if the twists between layers are different.
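A generic two-level model shows why such a measurement pins down the coupling. If two layer-localized bands would cross at some energy, an interlayer coupling t_perp turns the crossing into an avoided crossing of width 2*t_perp, which photoemission can measure directly. The sketch below is a textbook toy model, not the Bath group's calculation.

```python
# Generic two-level picture of interlayer coupling (a textbook toy model, not
# the Bath group's calculation): two layer-localized bands that would be
# degenerate at a crossing hybridize through a coupling t_perp, and ARPES sees
# an avoided crossing of width 2 * t_perp.
import numpy as np

def split_bands(E1, E2, t_perp):
    """Eigenvalues of the 2x2 Hamiltonian [[E1, t_perp], [t_perp, E2]]."""
    return np.linalg.eigvalsh(np.array([[E1, t_perp], [t_perp, E2]]))

lo, hi = split_bands(E1=0.0, E2=0.0, t_perp=0.3)   # degenerate point; eV scale
print(f"measured splitting {hi - lo:.2f} eV -> t_perp = {(hi - lo) / 2:.2f} eV")
```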

The list of known atomically thin materials like graphene is growing all the time. It already includes dozens of entries displaying a vast range of properties, from insulation to superconductivity, transparency to optical activity, brittleness to flexibility. The latest discovery provides a method for experimentally determining the interaction between layers of any of these materials. This is essential for predicting the properties of more complicated stacks and for the efficient design of new devices.

Dr Mucha-Kruczynski believes it could be 10 years before new stacked and twisted materials find a practical, everyday application. "It took a decade for graphene to move from the laboratory to something useful in the usual sense, so with a hint of optimism, I expect a similar timeline to apply to new materials," he said.

Building on the results of his latest study, Dr Mucha-Kruczynski and his team are now focusing on twisted stacks made from layers of transition metal dichalcogenides (a large group of materials featuring two very different types of atoms - a metal and a chalcogen, such as sulphur). Some of these stacks have shown fascinating electronic behaviour which the scientists are not yet able to explain.

"Because we're dealing with two radically different materials, studying these stacks is complicated," explained Dr Mucha-Kruczynski. "However, we're hopeful that in time we'll be able to predict the properties of various stacks, and design new multifunctional materials."

Credit: 
University of Bath

Oscillatory optics: Nonlinear, multi-mode waveguide for flip-flopping (yet stable) azimuthons

image: Azimuthon transition between dipole and hexapole, from Zhang et al, doi 10.1117/1.AP.2.4.046002.

Image: 
Zhang et al.

The optical vortex plays an increasingly important role in optical information processing. As an information carrier, it improves the capacity of channels and offers an independent aspect for analysis--different from polarization, intensity, phase, and path. A new degree of freedom for encoding and encrypting optical information may be provided via nonlinear optics, using vortex beams known as azimuthons, which carry an orbital angular momentum and can now be made to exhibit a mutual conversion pattern known as Rabi oscillation.

A quantum effect named after 1944 Nobel Prize in Physics recipient Isidor Rabi, Rabi oscillation denotes a periodic motion--a kind of flopping--between two different energy levels in the presence of an oscillatory driving field. The effect is also investigated in photonics, where the oscillatory driving field can be mimicked by a slight periodic longitudinal modulation of the refractive index change of the medium. Until now, Rabi oscillations have mostly been studied in linear systems, but the most interesting phenomena tend to pop up in the nonlinear realm.

Azimuthons rotate steadily as they propagate, but they are generally unstable and typically disintegrate along the way. A solution to that instability problem is found in nonlinearity, as proposed by a research team led by Yiqi Zhang from Xi'an Jiaotong University, who recently demonstrated weakly nonlinear waveguides that guarantee the stable propagation of azimuthons. Their report is published in Advanced Photonics.

Nonlinear, multimodal waveguide: from rotation to oscillation

The team determined specific requirements for a weak nonlinearity: (1) both the linear and nonlinear induced index changes are small compared to the ambient refractive index, and (2) the induced nonlinear index change is much smaller than the linear one. Their theoretical investigations demonstrated that the depth of the induced potential is tightly related to the transverse size of the waveguide. This indicated that multimode optical fibers could be used to obtain a deep potential energy well that would also allow for shallow modulation.

With these elements in mind, Zhang and colleagues developed a weakly nonlinear waveguide that is explicitly multimode. They paired various modes with different modal distributions. By introducing a π-phase shift and an amplitude modulation to one mode, and then superposing the other, they obtained an azimuthally modulated vortex. Thanks to the presence of nonlinearity and a weak longitudinally periodic modulation to the potential, the azimuthally modulated vortex rotates with a fixed angular velocity during propagation and exhibits Rabi oscillation (flopping) between the two modes.

The authors noted that the spatial symmetry of the propagating beam will change periodically in the process. According to coupled-mode theory, Rabi oscillations are mainly affected by the longitudinal modulation strength and the spatial symmetry of the superposed azimuthons. Without longitudinal modulation, the azimuthon rotates with a constant velocity and its profile is preserved. But, with slight longitudinal modulation, mutual conversion of different azimuthons can be observed during propagation: the Rabi oscillations.
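In coupled-mode form, this flopping is easy to reproduce. The sketch below integrates the resonant two-mode equations, in which a weak modulation couples the two azimuthons with a strength kappa, so the power flops between them as cos^2(kappa*z/2) and sin^2(kappa*z/2). The parameter values are illustrative, not taken from the paper.

```python
# Resonant coupled-mode sketch of Rabi flopping between two waveguide modes
# (generic coupled-mode theory; kappa is an illustrative coupling set by the
# weak longitudinal modulation, not a value from the paper). Analytically the
# mode powers evolve as |a|^2 = cos^2(kappa z / 2), |b|^2 = sin^2(kappa z / 2).
import numpy as np

kappa = 0.1                                    # coupling per unit length
z = np.linspace(0.0, 2 * np.pi / kappa, 4000)  # one full flopping cycle
dz = z[1] - z[0]

a = np.zeros(z.size, dtype=complex); a[0] = 1.0  # start in mode 1 (e.g., dipole)
b = np.zeros(z.size, dtype=complex)              # mode 2 (e.g., hexapole) empty

for n in range(z.size - 1):                    # Euler steps of the mode equations
    a[n + 1] = a[n] + 1j * (kappa / 2) * b[n] * dz
    b[n + 1] = b[n] + 1j * (kappa / 2) * a[n] * dz

print(f"max power in mode 2: {np.abs(b).max() ** 2:.3f} (full transfer would be 1)")
```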

Senior author Yiqi Zhang, associate professor at the School of Electronic Science and Engineering of Xi'an Jiaotong University, remarks, "Since our model supports higher order azimuthons with higher topological charges, there are many choices to select the orbital angular momentum carried by the light beams. However, the Rabi oscillation can only happen between azimuthons with certain profiles. We now have some ideas for how to overcome that limitation, so we'll continue our investigations."

Varieties of vortices

Beyond its promise for enhancing optical information processing, the demonstration of Rabi oscillations of azimuthons contributes to a better understanding of the rotational dynamics of vortices and the inherent role of nonlinearity in regulating those dynamics. Intrigued by spatial field manipulation in nonlinear optical systems, the authors note that their findings are important for potential advances in photonics, for example in helical waveguides or topological insulators. More broadly, their insights may contribute to improved understanding of such everyday phenomena as cyclones or tornados, or vortices forming in the wake of an airplane.

Credit: 
SPIE--International Society for Optics and Photonics

Giant photothermoelectric effect in silicon nanoribbon photodetectors

image: a, The structure diagram, the calculated hole temperature distribution and the photoelectric response of the photothermoelectric-effect photodetector based on silicon nanoribbons. The thickness of the silicon nanomaterial (black part) is 80 nm, and the two ends are gold electrodes (yellow part). The incident laser power density is 16 W/cm². The hole temperature reaches a maximum value of 670 K at the center of the laser spot. b, The simulated dependence of the open-circuit voltage on incident laser intensity for different carrier-lattice interaction times and doping concentrations. c, The measured dependence of the open-circuit voltage on the incident laser intensity. The saturation of the open-circuit voltage with increasing laser intensity can be explained by the saturation of the carrier temperature. The carrier-lattice interaction time in the model is fitted to be 160 ps.

Image: 
by Wei Dai, Weikang Liu, Jian Yang, Chao Xu, Alessandro Alabastri, Chang Liu, Peter Nordlander, Zhiqiang Guan, and Hongxing Xu

Photoelectric conversion is an efficient and green way of converting energy, with important applications in light-energy harvesting and optical information devices. It is reported that up to 40% of the energy loss in single-junction solar cells is due to thermal loss from hot carriers and the low light absorption of the device. Harnessing hot-carrier energy in nanomaterials is therefore key to further improving photoelectric conversion efficiency. The photothermoelectric effect is a photoelectric response driven by a gradient in the carrier temperature, which can differ from the lattice temperature. This effect has been reported in low-dimensional nanomaterials such as carbon nanotubes, III-V semiconductor nanowires, graphene and black phosphorus. Unfortunately, because of the low light absorption of low-dimensional nanomaterials and the lack of high-quality, reliable fabrication processes, the photothermoelectric effect has remained difficult to apply in practice.

In a new paper published in Light: Science & Applications, a team of scientists led by Academician Hongxing Xu and Associate Professor Zhiqiang Guan from the School of Physics and Technology, Wuhan University, China, and co-workers have observed a giant photothermoelectric effect in earth-abundant, CMOS-compatible silicon nanomaterials. The photoelectric response is 3-4 orders of magnitude higher than previously reported, benefiting from an optimization based on a photo-thermal-electric multiphysics model incorporating a carrier-lattice two-temperature description. The work provides an important route toward practical applications of the photothermoelectric effect, improving photoelectric conversion efficiency by harvesting hot-carrier energy.

The photothermoelectric effect is a photoelectric conversion mechanism in which the temperature gradient of photogenerated hot carriers drives their directional movement, generating an open-circuit voltage or short-circuit current. The effect usually occurs in low-dimensional nanomaterials because suppressed phonon scattering limits the energy exchange between hot carriers and phonons, allowing the carrier temperature to differ from the lattice temperature. Previous studies of hot carriers in silicon focused mainly on the transient carrier temperatures generated by short-pulse lasers; the photothermoelectric effect based on a steady-state hot-carrier temperature distribution has rarely been reported. The scientists summarized the reasons for their successful observation of a giant photothermoelectric effect in silicon nanomaterials:

"The successful observation of photothermoelectric effect depends on ohmic electrode contact, appropriate doping concentration and size-limited carrier-phonon interaction. Ohmic electrode contact avoids the interference of the photovoltaic effect. The low doping concentration and the carrier-phonon interaction time up to 160 ps result in a stable temperature difference of 300K at the hot and cold end of the device and an open circuit photovoltage response of 105 V/W under 633 nm laser irradiation. The successful modeling of experimental results and the optimization of photothermoelectric effect by the photo-thermal-electric multi-physics model with the carrier-lattice two-temperature model are also the key to success."

"It is very interesting to study the generation, relaxation and transportation of photogenerated carriers in nanomaterials. How to utilize the energy of hot carriers is a great challenge. Although the reported photothermoelectric effect in this paper faces the problems of slow speed and low energy filling ratio, we hope the further in-depth study of the photothermoelectric effect and the synergistic effect of photothermoelectric effect with other photoelectric conversion mechanisms such as photovoltaic effect and with the plasmon-enhanced light absorption, will finally break the limit of photoelectric conversion efficiency by using hot carrier energy." they added.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

Gluten in wheat: What has changed during 120 years of breeding?

image: From the archive, the researchers selected five leading wheat varieties for each decade of the 120 years examined. In order to generate comparable samples, they cultivated the different varieties in 2015, 2016 and 2017 under the same geographical and climatic conditions.

Image: 
Katharina Scherf / Leibniz-LSB@TUM

In recent years, the number of people affected by coeliac disease, wheat allergy or gluten or wheat sensitivity has risen sharply. But why is this the case? Could it be that modern wheat varieties contain more immunoreactive protein than in the past? Results from a study by the Leibniz-Institute for Food Systems Biology at the Technical University of Munich and the Leibniz Institute of Plant Genetics and Crop Plant Research are helping to answer this question.

Wheat grains contain about 70 percent starch. Their protein content is usually 10 to 12 percent, and gluten accounts for the lion's share of that protein, at around 75 to 80 percent. Gluten is a complex mixture of different protein molecules, which can be roughly divided into two subgroups: "gliadins" and "glutenins".

It has long been known that wheat proteins can trigger disorders such as coeliac disease or wheat allergy, which affect approximately 1 percent and 0.5 percent of the adult population worldwide, respectively. In addition, non-celiac gluten sensitivity (NCGS) is becoming increasingly important in the Western world.

"Many people fear that modern wheat varieties contain more immunoreactive proteins than in the past and that this is the cause of the increased incidence of wheat-related disorders," says Darina Pronin from the Leibniz-Institute for Food Systems Biology, who was significantly involved in the study as part of her doctoral thesis. Where gluten is concerned, the protein group of gliadins in particular is suspected of causing undesired immune reactions, explains the food chemist.

60 wheat varieties from the period 1891 - 2010 analyzed

But how big are the differences between old and new wheat varieties really? In order to help clarify this, Katharina Scherf and her team at the Leibniz-Institute for Food Systems Biology investigated the protein content of 60 preferred wheat varieties from the period between 1891 and 2010. This was made possible by the extensive seed archive of the Leibniz Institute of Plant Genetics and Crop Plant Research. From the archive, the researchers selected five leading wheat varieties for each decade of the 120 years examined. In order to generate comparable samples, they cultivated the different varieties in 2015, 2016 and 2017 under the same geographical and climatic conditions.

The team's analyses show that, overall, modern wheat varieties contain slightly less protein than old ones. The gluten content, in contrast, has remained constant over the last 120 years, although the composition of the gluten has changed slightly: while the proportion of the critically viewed gliadins fell by around 18 percent, the proportion of glutenins rose by around 25 percent. The researchers also observed that higher precipitation in the year of the harvest was accompanied by a higher gluten content in the samples.
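A quick worked example shows what those shifts do to the balance between the two subgroups. The 75:25 gliadin:glutenin starting split is an assumption for illustration, within the typical range quoted above.

```python
# Worked arithmetic on the reported shifts. The 75:25 gliadin:glutenin split
# is an assumed baseline for illustration (within the range quoted above).
gliadin0, glutenin0 = 75.0, 25.0        # % of gluten, assumed starting point
gliadin1 = gliadin0 * (1 - 0.18)        # proportion of gliadins fell ~18 %
glutenin1 = glutenin0 * (1 + 0.25)      # proportion of glutenins rose ~25 %

print(f"gliadin:glutenin ratio {gliadin0 / glutenin0:.1f} -> "
      f"{gliadin1 / glutenin1:.1f}")    # roughly 3.0 -> 2.0
```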

Environmental conditions are more relevant than the variety

"Surprisingly, environmental conditions such as precipitation had an even greater influence on protein composition than changes caused by breeding. In addition, at least on the protein level, we have not found any evidence that the immunoreactive potential of wheat has changed as a result of the cultivation factors," explains Katharina Scherf, who is now continuing her research as a professor at the Karlsruhe Institute of Technology (KIT). However, Scherf also points out that not all protein types contained in wheat have been investigated with regard to their physiological effects. Therefore, there is still a lot of research to be done.

Credit: 
Leibniz-Institut für Lebensmittel-Systembiologie an der TU München

Car passengers can reduce pollution risk by closing windows and changing route

Drivers and passengers can inhale significantly lower levels of air pollution by setting their vehicle's ventilation systems more effectively and taking a 'cleaner' route to their destination, a new study reveals.

Road transport emissions are a major source of urban air pollution - nitrogen oxides (NOx) and particulate matter (PM) come from vehicle exhausts, while non-exhaust emissions such as brake dust, tyre wear and road dust add to PM.

Outdoor air pollution is estimated to contribute to 40,000 deaths in Britain annually and an estimated 7 million deaths globally - linked to diseases ranging from lung cancer to stroke and respiratory infection.

Scientists at the University of Birmingham have found that - if vehicle ventilation is set correctly - drivers and passengers are exposed to up to 49% less PM2.5 and 34% less nitrogen dioxide (NO2) than the on-road levels. They have published their findings in the journal Atmospheric Environment.

Lead author Dr. Vasileios Matthaios, from the University of Birmingham, commented: "Exposure to air pollution within the vehicle depends upon both the ventilation setting and the type of route. The lowest exposure to particles and gases is when the windows are closed with recirculation and air conditioning switched on.

"Drivers and passengers inhale more air pollution when traveling on urban roads, followed by ring-roads and sub-urban roads. However, because concentrations inside a vehicle are lower and occupants are not as active, they inhale less air pollution than people cycling or walking on the same routes."

Researchers explored within-vehicle levels of NO2 and PM2.5 under different vehicle ventilation settings and driving routes during real-world driving experiments around the city of Birmingham.

Four vehicles were driven on a consistent route encompassing three contrasting road types, with simultaneous measurement of within-vehicle and ambient levels of particulate matter (PM10, PM2.5, PM1), ultrafine particle number (UFP), lung-deposited surface area (LDSA), nitric oxide (NO) and nitrogen dioxide (NO2).

"Our findings show that vehicle passengers can modify their exposure and inhalation dose through ventilation setting and route choice - this may have significant health impacts upon the most exposed groups such as professional drivers," Dr. Vasileios Matthaios added.

Increasing urbanisation, together with growth in vehicle ownership and passenger journeys, has contributed to growth in traffic-related ambient air pollution.

Researchers noted that related health issues depend on an individual's exposure to air pollution and on that individual's vulnerability to a given dose. This, in turn, depends on route selection, time of day, transport type, respiration rate and, in the case of vehicles, the ventilation options and the efficacy and type of cabin filters.
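The simplest way to see how these factors combine is a dose model of the form dose = concentration x respiration rate x time. The sketch below applies it with hypothetical numbers, using the 49% ventilation-related reduction reported above; the concentrations and respiration rates are illustrative, not the study's measurements.

```python
# Simple inhaled-dose model of the kind implied above (illustrative values,
# not the study's measurements): dose = concentration x respiration rate x time.
def inhaled_dose(conc_ug_m3, resp_m3_per_h, hours):
    """Inhaled PM2.5 mass in micrograms."""
    return conc_ug_m3 * resp_m3_per_h * hours

on_road, commute_h = 20.0, 0.5          # hypothetical on-road PM2.5, trip length

windows_open = inhaled_dose(on_road, 0.6, commute_h)
recirc_ac = inhaled_dose(on_road * (1 - 0.49), 0.6, commute_h)  # 49 % lower
cyclist = inhaled_dose(on_road, 2.0, commute_h)  # higher respiration rate

print(f"windows open: {windows_open:.1f} ug, recirculation + AC: {recirc_ac:.1f} ug, "
      f"cyclist on the same route: {cyclist:.1f} ug")
```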

As part of the MMAP-VEX project 'Measuring and Modelling Air Pollution Within Vehicles - Implications for daily EXposure and Human Health', University of Birmingham researchers will further investigate other aspects that affect within vehicle air pollution under real world driving conditions, including:

Testing different types of cabin filters, such as activated carbon and standard pollen cabin filters, to determine their filtration efficacy;

The influence of indoor air purifiers on within-vehicle exposure reduction; and

Passenger exposure variation at different driving locations, for example when waiting at traffic lights, at roundabouts and in tunnels.

Credit: 
University of Birmingham

Investigating a thermal challenge for MOFs

image: An illustration of metal organic frameworks (MOFs).

Image: 
Christopher E. Wilmer / University of Pittsburgh

To the naked eye, metal organic frameworks (MOFs) look a little like sand. But if you zoom in, you will see that each grain looks and acts more like a sponge--and serves a similar purpose. MOFs are used to adsorb and hold gases, which is useful for filtering toxic gases out of the air or for storing fuel for natural gas- or hydrogen gas-powered engines.

New research led by an interdisciplinary team across six universities examines heat transfer in MOFs and the role it plays when MOFs are used for storing fuel. Corresponding author Christopher Wilmer, William Kepler Whiteford Faculty Fellow and assistant professor of chemical and petroleum engineering at the University of Pittsburgh's Swanson School of Engineering, coauthored the work with researchers at Carnegie Mellon University, the University of Virginia, Old Dominion University, Northwestern University, and the Karlsruhe Institute of Technology in Karlsruhe, Germany. The findings were recently published in Nature Communications.

"One of the challenges with using MOFs for fuel tanks in cars is that you have to be able to fill up in a few minutes or less," explains Wilmer. "Unfortunately, when you quickly fill these MOF-based tanks with hydrogen or natural gas they get very hot. It's not so much a risk of explosion--though there is one--but the fact that they can't store much gas when they're hot. The whole premise of using them to store a lot of gaseous fuel only works at room temperature. For other industrial applications you face a similar problem - whenever gases are loaded quickly the MOFs become hot and no longer work effectively."

In other words, for MOFs to be useful for these applications, they would need to be kept cool. This research looked at thermal transport in MOFs, to explore how quickly they can shed excess heat, and the group found some surprising results.

"When you take these porous materials, which to begin with are thermally insulating, and you fill them with gas, it appears that they become even more insulating. This is surprising because usually, empty pockets like those in insulation or double-paned windows provide good thermal insulation," explains Wilmer. "By taking porous materials and filling them, thereby removing those gaps, you would expect the thermal transport to improve, making it more thermally conductive. The opposite happens; they become more insulating."

To reach their conclusion, the researchers conducted two parallel experiments using two different methods and MOFs synthesized in two different labs. Both groups observed the same trend: the MOFs become more insulating when filled with adsorbates. Their experimental findings were also validated by atomistic simulations at Pitt in collaboration with Carnegie Mellon University.

"Our work indicates potential challenges ahead for the use of MOFs outside of research labs, but that is a necessary step in the process," says Alan McGaughey, professor of mechanical engineering at Carnegie Mellon. "As these materials advance toward broad, real-world usage, researchers will need to continue investigating once-overlooked properties of these materials, like thermal transport, and find the best way to use them to fit our needs."

Credit: 
University of Pittsburgh

Modelling parasitic worm metabolism suggests strategy for developing new drugs against infection

Scientists have revealed a way to eradicate parasitic worms by stopping them from using alternative metabolic pathways provided by bacteria that live within them, according to new findings published today in eLife.

The study has identified three potential drugs that are active against the parasitic worm Brugia malayi (B. malayi), a leading cause of disability in the developing world.

The latest figures, from 2015, suggest that an estimated 40 million people worldwide have lymphatic filariasis (elephantiasis) caused by worms such as B. malayi, with an estimated one billion people at risk. Current prevention and treatment efforts rely on a small selection of drugs, but these have limited effectiveness and must be taken for 15 years, and there is an emerging threat of drug resistance.

"One alternative strategy for preventing lymphatic filariasis has been to use traditional antibiotics to target bacteria that live within most filarial worms," explains lead author David Curran, Research Associate at the Hospital for Sick Children (SickKids) in Toronto, Canada. "These bacteria, from the genus Wolbachia, are specific to each worm and are known to be essential for the worms to survive and reproduce."

While targeting the Wolbachia bacteria with antibiotics is a viable strategy, Curran adds that long treatment times and the unsuitability of these antibiotics for pregnant women and children prevent their widespread use, so there remains an urgent need to identify novel targets for treatment. In this study, he and his colleagues looked at targeting both the worm and the bacteria by identifying the essential biological processes that the bacteria provide and the worm depends on.

To do this, they built a model of all the metabolic pathways that take place in the worm and in its resident bacteria. They then systematically changed different components of the model, such as oxygen levels, glucose levels, and which enzymes were activated, to see the effects on the worm's growth. Their final model included 1,266 metabolic reactions involving 1,252 metabolites and 1,011 enzymes linked to 625 genes.

To cope with the different nutrient conditions, the worm adapted its use of different metabolic pathways - including those provided by the Wolbachia bacteria - throughout the different stages of its lifecycle. To see which of the metabolic reactions were critical for survival and reproduction, the team removed candidate pathways from the model one at a time. They identified 129 reactions whose removal slowed growth to less than 50% of the baseline level; of these, 50 were metabolic processes provided by the Wolbachia bacteria.
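Stripped to its essentials, this kind of screen works like a flux-balance knockout scan: maximise a growth objective subject to steady-state mass balance, then re-solve with each reaction in turn forced to zero and flag the reactions whose loss cuts growth below 50% of baseline. The Python toy below shows the mechanics on an invented four-reaction network (with a stand-in "Wolbachia" pathway); it is a minimal sketch, not the authors' 1,266-reaction model.

```python
# Toy flux-balance knockout screen. The network is invented for illustration;
# it is not the B. malayi/Wolbachia model described in the paper.
import numpy as np
from scipy.optimize import linprog

# Columns: R1 uptake->A, R2 A->B (worm enzyme), R3 A->B (Wolbachia enzyme),
# R4 B->biomass. Rows: steady-state mass balance for metabolites A and B.
S = np.array([[ 1, -1, -1,  0],
              [ 0,  1,  1, -1]])
bounds = [(0, 10), (0, 4), (0, 10), (0, None)]  # flux capacities (assumed)
c = [0, 0, 0, -1]  # maximise biomass flux v4 (linprog minimises, hence -1)

def max_growth(bnds):
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bnds)
    return -res.fun  # optimal biomass flux

baseline = max_growth(bounds)
names = ["R1 uptake", "R2 worm", "R3 Wolbachia", "R4 biomass"]
for i, name in enumerate(names):
    knockout = list(bounds)
    knockout[i] = (0, 0)  # force the deleted reaction to carry zero flux
    growth = max_growth(knockout)
    verdict = "growth-limiting" if growth < 0.5 * baseline else "dispensable"
    print(f"{name:13s}: growth {growth:4.1f} / {baseline:.1f} -> {verdict}")
```

In this toy, deleting the stand-in Wolbachia pathway drops growth to 40% of baseline and gets flagged, while deleting the redundant worm pathway does not - the same logic by which the authors singled out the 50 bacteria-supplied reactions.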

Having identified these essential metabolic reactions, the team searched for drugs that could block crucial molecules involved in activating these reactions, using databases of existing drugs and their targets. They identified three drugs: fosmidomycin, an antibiotic and potential antimalarial drug; MDL-29951, a treatment being tested for epilepsy and diabetes; and tenofovir, which is approved for treating hepatitis B and HIV. These drugs reduced the numbers of Wolbachia bacteria per worm by 53%, 24% and 30%, respectively.

"We also found that two of the drugs, fosmidomycin and tenofovir, reduced the worm's reproductive ability," explains co-senior author Elodie Ghedin, previously Professor of Biology and Professor of Epidemiology at New York University, and now Senior Investigator at the National Institutes of Health, Maryland, US. "Fosmidomycin also appeared to affect movement in the worms."

"All three of the drugs tested appear to act against adult B. malayi worms by affecting the metabolism of the worms themselves or their resident bacteria," concludes co-senior author John Parkinson, Senior Scientist, Molecular Medicine program, SickKids, and Associate Professor, Biochemistry & Molecular and Medical Genetics, University of Toronto. "This validates our model as a realistic construction of the metabolic processes in these debilitating parasites, and suggests that its use may yield further therapeutic targets with more research."

Credit: 
eLife

NASA finds Jangmi now an Extra-Tropical Storm

image: NASA's Aqua satellite provided forecasters with a visible image of extra-tropical storm Jangmi in the Sea of Japan on Aug. 11, 2020.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Aqua satellite obtained a visible image of Tropical Storm Jangmi after it transitioned into an extra-tropical storm.

The Joint Typhoon Warning Center (JTWC) posted its final bulletin on Tropical Storm Jangmi on Aug. 10 at 11 a.m. EDT (1500 UTC). At that time, it was located near latitude 26.9 degrees north and longitude 130.4 degrees east, about 139 miles northeast of Chinhae, South Korea. Jangmi was speeding to the north-northeast at 29 knots (about 33 mph) and had maximum sustained winds of 35 knots (40 mph).

On Aug. 11, Jangmi had moved into the Sea of Japan and had become extra-tropical.

What an Extra-tropical Storm Means

When a storm becomes extra-tropical, it means that a tropical cyclone has lost its "tropical" characteristics. The National Hurricane Center defines "extra-tropical" as a transition that implies both poleward displacement of the cyclone (meaning it moves toward the north or south pole) and the conversion of the cyclone's primary energy source from the release of latent heat of condensation to baroclinic processes (the temperature contrast between warm and cold air masses). It is important to note that cyclones can become extra-tropical and still retain winds of hurricane or tropical storm force.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite captured a visible image of extra-tropical storm Jangmi in the Sea of Japan, near the Korea Strait. The Korea Strait is located between South Korea and Japan, connecting the East China Sea, the Yellow Sea and the Sea of Japan. The MODIS image showed that the storm had become somewhat elongated. Satellite imagery was created using NASA's Worldview product at NASA's Goddard Space Flight Center in Greenbelt, Md.

On Aug. 11, the Japan Meteorological Agency posted advisories for prefectures along the Sea of Japan, as Jangmi is forecast to move through the area in a north-northeasterly direction and weaken. For updated warnings and watches, visit: https://www.jma.go.jp/en/warn/.

About NASA's Worldview and Aqua Satellite

NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now."
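For readers who want to pull the same imagery programmatically, Worldview is built on NASA's Global Imagery Browse Services (GIBS), which serve the layers through a standard WMTS interface. The snippet below is a minimal sketch that downloads a single low-resolution true-colour MODIS/Aqua tile; the endpoint pattern and layer name follow GIBS's public WMTS REST convention, while the date and tile indices are example values chosen here.

```python
# Fetch one true-colour MODIS/Aqua tile from NASA GIBS, the tile service
# behind Worldview. The URL follows GIBS's WMTS REST pattern:
#   .../{layer}/default/{date}/{tile matrix set}/{zoom}/{row}/{col}.jpg
# The date and tile indices are example values chosen for illustration.
import requests

url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
       "MODIS_Aqua_CorrectedReflectance_TrueColor/default/"
       "2020-08-11/250m/0/0/0.jpg")

response = requests.get(url, timeout=30)
response.raise_for_status()
with open("aqua_tile.jpg", "wb") as f:
    f.write(response.content)
print(f"Saved {len(response.content)} bytes to aqua_tile.jpg")
```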

NASA's Aqua satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center