An itch to scratch: NCATS, NIDCR scientists identify potential approach to chronic problem

image: Nerves that stimulate skin are grouped in structures next to the spinal cord. Here, nerves in such a structure -- called a dorsal root ganglion -- that are involved in detecting an itch are labeled green. Nerves involved in sensing pain, temperature and other stimuli are shown in magenta.

Image: 
Hans Juergen Solinski, National Institute of Dental and Craniofacial Research

Chronic itch goes beyond being just a simple annoyance; it can greatly affect a person's quality of life. While scientists have some clues to its causes, effective therapies have been elusive.

Now, using a technique called quantitative high-throughput screening to sort through more than 86,000 compounds at the same time, researchers at NCATS and the National Institute of Dental and Craniofacial Research (NIDCR) report a new strategy that may eventually help alleviate chronic itch. They've shown that blocking a receptor, or docking station, found on the surface of both mouse and human spinal cord neurons could be key.

Several years ago, Mark Hoon, Ph.D., and his colleagues at NIDCR found a receptor, Npr1, on mouse spinal cord neurons for a protein associated with itch. The protein fit into Npr1 like a key into a lock, helping turn on the itch sensation. Npr1 appeared to be a potential target for drugs to halt itch.

Hoon contacted NCATS scientist James Inglese, Ph.D., and his team for help in identifying compounds that could block Npr1 activity. The researchers developed a series of assays, or tests, and used robots to screen compounds in human cells, finding approximately 1,400 molecules worth examining more closely. They then developed additional assays to narrow the list to 15 compounds. They showed that a subset of these compounds could stop both the human and mouse versions of the receptor from working. A follow-up study in mice showed that blocking the receptor reduced scratching.

Next, the scientists will examine more candidate compounds and determine how they block Npr1. They hope the findings will help them choose which compounds to study further and chemically modify as potential anti-itch drugs. Hoon, Inglese and their colleagues reported the results online July 10 in Science Translational Medicine.

"This is a proof-of-concept study and an important application of what NCATS does," Inglese said. "We wanted to show that by pharmacologically blocking the target receptor, the approach could be successful in finding a drug to treat chronic itch. Because it can take a long time to develop an ideal compound, the rationale behind the approach needs to be well vetted."

Credit: 
NIH/National Center for Advancing Translational Sciences (NCATS)

New developments with Chinese satellites over the past decade

image: Seventeen Chinese self-developed FengYun (FY) meteorological satellites have been launched, which are widely applied in weather analysis, numerical weather forecasting and climate prediction, as well as environment and disaster monitoring. Currently, 7 satellites are in operation.

Image: 
National Satellite Meteorological Center of the China Meteorological Administration

To date, 17 Chinese self-developed FengYun (FY) meteorological satellites have been launched, which are widely applied in weather analysis, numerical weather forecasting and climate prediction, as well as environment and disaster monitoring. Currently, seven satellites are in operation.

"The FY series satellite program has gone through four main stages," according to Dr. Peng Zheng, Deputy Director at the National Satellite Meteorological Center of the China Meteorological Administration, and the first author of a recently published review (https://link.springer.com/article/10.1007/s00376-019-8215-x) . (Note: When the article was submitted for review in late 2018, eight FY satellites were in operation. One of them retired in March 2019.)

"The first stage primarily focused on research and development (R&D) of satellite technology. FY-1A operated for 39 days and FY-1B for 158 days. Meanwhile, FY-2A operated for about six months and FY-2B for about eight months.

"In the second stage, the R&D satellites were transformed into operational ones. Since FY-1C in 1999 and FY-2C in 2004, FY satellites have been stable in orbit and capable of supporting continuous measurements in an operational manner.

"In the third stage, the first-generation satellites were replaced by second-generation satellites. During the past decade, the new-generation FY polar-orbiting and GEO satellites, FY-3A in 2008 and FY-4A in 2016, have been in operation. Advanced instruments capable of multiple types of measurements have been mounted on the platforms of the new-generation FY satellites, including multiband optical imaging, atmospheric sounding, microwave imaging, hyperspectral trace gas detection, and full-band radiation budget measurement. The new epoch for comprehensive earth observations has begun.

"The latest and current stage is focused on the accuracy and precision of satellite measurements. High performance in image navigation and radiometric calibration is essential to support various quantitative data applications, such as quantitative remote sensing and satellite data assimilation."

Dr. Zhang and his team--a group of researchers from the National Satellite Meteorological Center of the China Meteorological Administration--have had their summary of Chinese meteorological satellites published in a special issue (https://link.springer.com/journal/376/36/9/page/1) of Advances in Atmospheric Sciences on the National Report (2011-2018) to the International Union of Geodesy and Geophysics (IUGG) Centennial by the China National Committee for IAMAS.

In this review paper, they report the latest progress, major achievements and future plans of Chinese meteorological satellites; in particular, they address improvements in core data processing techniques, including image navigation, radiometric calibration and validation.

China has become one of the few countries that maintain both polar-orbiting and geostationary meteorological satellites operationally. With the associated open data policy and stable, accurate measurements, the FY satellites are becoming an important component of the space-based global observing system. FY satellite data delivery services support direct broadcasting users, CMACAST users, and web portal users. Web portal users can obtain the data through an FTP push service, FTP pull service, or manual service. Users can access the data online (http://satellite.nsmc.org.cn/portalsite/default.aspx) after a quick and free-of-charge registration process.
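As an illustration of the FTP pull route mentioned above, a minimal Python sketch is shown below. The host name, credentials, directory and file name are placeholders, not real NSMC endpoints; registered portal users would substitute the values provided with their own account.

```python
# Hypothetical sketch of an "FTP pull" download of one FY data file.
# All connection details below are placeholders for illustration only.
from ftplib import FTP

FTP_HOST = "ftp.example-nsmc-portal.cn"   # placeholder host
USERNAME = "registered_user"              # from portal registration
PASSWORD = "password"

def pull_fy_file(remote_dir: str, filename: str) -> None:
    """Download a single satellite data file via FTP pull."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(user=USERNAME, passwd=PASSWORD)
        ftp.cwd(remote_dir)
        with open(filename, "wb") as fh:
            ftp.retrbinary(f"RETR {filename}", fh.write)

if __name__ == "__main__":
    pull_fy_file("/FY4A/AGRI/L1", "example_granule.hdf")  # placeholder path
```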

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Mystery behind striped barley solved

image: Picture of an albostrians barley.

Image: 
Josef Bergstein / IPK Gatersleben

Plants with green leaves and stems are a common sight and one of the most natural things on earth. But consider that this colouring is produced by small chlorophyll-filled organelles called chloroplasts, distributed within plant cells, where they use their green pigment to convert solar energy into chemical energy, and the green colouration no longer seems such a trivial thing. Because of their fundamental role in plant biology, chloroplasts and their ability to colour plants have long been the focus of intense research. In particular, genetically impaired chloroplasts, which no longer or only partially produce pigments, are used to identify genes and to understand the molecular mechanisms within plant cells.

One such plant, which displays the effect of mutated chloroplasts, is Albostrians barley. Instead of growing green leaves, this grass is patterned with green-white stripes, an effect called variegation. Even though scientists have long used albostrians mutants to investigate plant cellular mechanisms, the gene underlying this variegation remained unknown until recently. A group of scientists from the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK) in Gatersleben, together with researchers from the Humboldt University Berlin and KWS LOCHOW GmbH, has now identified the underlying gene, HvAST, which causes the albostrians phenotype, providing novel insights into chloroplast development.

Chloroplasts are green, chlorophyll-filled plastids found in plant cells. These organelles play a pivotal role for life on earth, as they perform photosynthesis, enabling plants to develop and grow. Chloroplast biogenesis is the process by which chloroplasts mature in plant cells from so-called pro-plastids. This process can be affected by external factors, such as temperature, and relies heavily on the synergistic expression of proteins encoded in the nuclear and plastid genomes of the plant. As the plastid genome encodes only a fraction of the proteins required for chloroplast biogenesis, the nuclear genome delivers the vast majority. Consequently, mutations in nuclear genes can easily result in defective chloroplasts.

Whilst functioning chloroplasts are of the highest importance in nature, impaired chloroplasts are of great value in research. Mutants that lead to aberrations in the colouration of plants can be used as genetic tools to identify genes involved in chloroplast biogenesis and to understand related molecular processes within plants. Mutations that lead to variegation, the appearance of differently coloured (white, yellow, green) areas on plants, are of particular interest to plant researchers. This phenotype is caused by the presence of both normal and defective chloroplasts in different sectors of the same plant tissue, and scientists can exploit this in research, for example when investigating communication between cell organelles or when examining the molecular mechanisms behind variegation itself.

Albostrians barley, with its green-white striping, is a well-known example of a plant mutant displaying variegation. As a model plant, it has helped broaden the field of chloroplast biology during previous research work. However, the clarification of the mechanism leading to the albostrians-specific phenotype of variegation had previously been hindered by the fact that the causal underlying gene was unknown. Scientists from various research groups of the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK Gatersleben) in cooperation with researchers from the Humboldt University Berlin and KWS LOCHOW GmbH have now identified the responsible gene.

Utilising positional cloning, the ALBOSTRIANS gene was identified as the CCT-domain gene HvAST. The scientists validated the function of the gene in two ways: first by inducing a knock-out through chemical mutagenesis and detecting the responsible mutated gene through TILLING, and then by inducing mutations through RNA-guided Cas9 endonuclease-mediated precise gene editing. Further, HvAST was found to be a homolog of the CCT-motif transcription factor gene AtCIA2 of the plant Arabidopsis thaliana. However, while AtCIA2 is reported to be involved in the expression of nuclear genes and thus plays a key role in chloroplast biogenesis, the researchers surprisingly found that the CCT-domain-containing protein HvAST localised to plastids in barley, with no clear evidence of nuclear localisation. Nevertheless, HvAST presumably has an important function in plastid ribosome formation during early embryo development and consequently in chloroplast development.

"Since the early 1950s, scientists have studied the variations of pigmentation, as this phenomenon allows insights into important gene functions and regulations and therefore into the basics of the genetics of living organisms. With the present work, we have succeeded in identifying one of the key genes involved in this process", says Prof. Dr. Nils Stein from the IPK and corresponding author of the team. Dr. Viktor Korzun leading researcher at KWS SAAT SE, also involved in the study continues: "The new insight into the role of this CCT-domain-containing protein, and more importantly, the identification of the gene underlying the "albostrians" phenotype can now be followed up by new in-depth investigation of the mechanisms of the barley mutant, and is likely to foster new research in the area of leaf variegation and chloroplast development."

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Nanotechnology delivers hepatitis B vaccine

Brazilian and European researchers have demonstrated exactly how a nanotechnology-based compound delivers an oral vaccine against hepatitis B to the immune system. When the silica particles and the antigen combine, despite their difference in size, the resulting compound reaches the intestine without being destroyed by the acidity of the digestive system.

A compound of nanostructured SBA-15 silica and HBsAg, the hepatitis B surface antigen, was submitted to different types of X-ray imaging in European laboratories.

The nanostructured silica was developed by researchers at the University of São Paulo's Physics Institute (IF-USP) in Brazil. The antigen was created by the Butantan Institute, which is also in São Paulo.

The study was supported by São Paulo Research Foundation - FAPESP and European research funders. The results are published in Scientific Reports.

The aim of the study was to understand how the 22-nanometer antigen binds to silica nanotubes with a diameter of approximately 10 nanometers and a honeycomb-like structure. One nanometer (1 nm) is a billionth of a meter.

Studies carried out at USP determined the dimensions of both the antigen and the silica nanotubes using small-angle X-ray scattering (SAXS), dynamic light scattering (DLS), and transmission electron microscopy.

"Despite the size difference, tests [in animals] produced an excellent immune response to the oral vaccine - as good as the injectable form or better," said Márcia Fantini, Full Professor at IF-USP.

X-ray and neutron imaging was coordinated by Heloisa Bordalo, a Brazilian researcher at the University of Copenhagen's Niels Bohr Institute in Denmark. In collaboration with other researchers in Denmark as well as colleagues in France, Germany, Sweden and Switzerland, Bordalo submitted the compound to small-angle X-ray scattering (SAXS), among other techniques.

The three-dimensional images obtained by these techniques showed that although the antigen did not enter the nanotubes, it was retained in 50 nm macropores between the nanotubes. This protected it from the acidity of the digestive system.

The images also enabled the researchers to determine the ideal proportion of silica and HBsAg so that the antigen did not agglomerate, which would hinder the dispersion of the active principle in the patient's intestine.

"The oral and intranasal routes are natural modes of vaccine administration. Nature is the best vaccination agent. However, a vaccine that contains a protein, as in this case, is destroyed by high acidity and its own proteases in passing through the stomach, so it doesn't reach the immune system, particularly the small intestine," said Osvaldo Augusto Sant'Anna, Scientific Leader at Butantan Institute and responsible for development of the HbsAg antigen.

Before proceeding to clinical trials, the team will test polymers that can be used to coat the entire structure and increase the medication's resistance to the acidic environment of the stomach. In animal trials, the formulation proved to be as effective as the injected vaccine, if not more so, in delivering the antigen to the intestine, where the immune system can detect it and produce antibodies against the virus.

According to the World Health Organization (WHO), approximately 257 million people currently live with hepatitis B worldwide.

Polyvaccine

Through a project supported by FAPESP, the group led by Sant'Anna, Fantini and Bordalo is now developing new antigens to add to the compound. The idea is to have at least a triple vaccine by adding other antigens against diphtheria and tetanus.

However, the formulation may evolve to become a polyvaccine that also immunizes people against whooping cough, poliomyelitis and Haemophilus influenzae type B (Hib), the bacterium that causes meningitis and pneumonia, among other diseases.

The antigens must combat the diseases without interfering with each other. "There have been very interesting results with diphtheria, and we're now going to test it for tetanus, initially in injectable form," Sant'Anna said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Area of brain linked to spatial awareness and planning also plays role in decision making

New research by neuroscientists at the University of Chicago shows that the posterior parietal cortex (PPC), an area of the brain often associated with planning movements and spatial awareness, also plays a crucial role in making decisions about images in the field of view.

"Traditionally this part of the brain has been thought to be involved in controlling spatial attention and planning actions. There has been less attention paid to how much of a role this brain area plays in processing the visual stimuli themselves," said David Freedman, PhD, professor of neuroscience at UChicago and the senior author of the study, published this week in Science. "Here we were able to show that it plays an important role in making sense of the things we see, perhaps even more so than its role in planning your next action or directing your attention."

Freedman and Yang Zhou, PhD, a postdoctoral researcher, trained monkeys to play a simple computer game in which they reported their decisions about different types of images displayed on a computer monitor by moving their eyes toward a designated target. For example, if the animals were shown a pattern of dots moving up and to the left, they were supposed to move their eyes toward a green spot. If the dots were moving in the opposite direction, they were supposed to move their eyes toward a red spot.

For the new study, the researchers tested whether a specific region of the PPC called the lateral intraparietal area (LIP) was directly involved in guiding these decisions. They gave the animals a drug which temporarily halted neural activity in the LIP area, then they had the monkeys perform the same tasks. While the drug was active, the monkeys' decisions about the visual patterns they viewed were impaired; once the drug wore off, their decisions returned to normal.

The researchers also recorded activity in the same pool of neurons once the drug had worn off and found that activity in that area of the brain was indeed strongly correlated with the same kinds of decisions which had been impaired during the experiments.

Deeper understanding of how the brain interprets things we see

The findings provide new context to help understand why a 2016 study by another group in Nature reported that deactivating parts of LIP seemed not to have any impact on decision making. That study only examined LIP's role in motor planning--such as the decision about whether to look leftwards or rightwards. In contrast, the current study shows that LIP is more involved in making sense of the visual images that the subjects are viewing, rather than deciding which actions they should take next.

"All the neuronal data we examined in our past experiments gave us the impression that this area of the brain was involved in processing the meaning of visual images during decision making," Freedman said. "Now we find that indeed, when we temporarily shut the activity down in that part of the brain it really does affect the sensory parts of decisions."

Freedman says the new study provides an opportunity for neuroscientists to rethink the brain mechanisms involved in decision-making, visual categorization, and sensory and motor processing. The work could also lead to a deeper understanding of how the brain interprets the things we see in order to make decisions. Understanding this process in detail will be critical for developing new treatments for brain-based diseases and disorders which affect decision making.

"These results show that the brain's parietal cortex is an important hub for guiding decisions, so now we're even more motivated to move ahead and try to work out the details of neural circuits in this part of the brain that actually carry out these cognitive functions," he said.

Credit: 
University of Chicago Medical Center

New vaccine strategy boosts T-cell therapy

A promising new way to treat some types of cancer is to program the patient's own T cells to destroy the cancerous cells. This approach, termed CAR-T cell therapy, is now used to combat some types of leukemia, but so far it has not worked well against solid tumors such as lung or breast tumors.

MIT researchers have now devised a way to super-charge this therapy so that it could be used as a weapon against nearly any type of cancer. The research team developed a vaccine that dramatically boosts the antitumor T cell population and allows the cells to vigorously invade solid tumors.

In a study of mice, the researchers found that they could completely eliminate solid tumors in 60 percent of the animals that were given T-cell therapy along with the booster vaccination. Engineered T cells on their own had almost no effect.

"By adding the vaccine, a CAR-T cell treatment which had no impact on survival can be amplified to give a complete response in more than half of the animals," says Darrell Irvine, who is the Underwood-Prescott Professor with appointments in Biological Engineering and Materials Science and Engineering, an associate director of MIT's Koch Institute for Integrative Cancer Research, and the senior author of the study.

Leyuan Ma, an MIT postdoc, is the lead author of the study, which appears in the July 11 online edition of Science.

Targeting tumors

So far, the FDA has approved two types of CAR-T cell therapy, both used to treat leukemia. In those cases, T cells removed from the patient's blood are programmed to target a protein, or antigen, found on the surface of B cells. (The "CAR" in CAR-T cell therapy stands for "chimeric antigen receptor.")

Scientists believe one reason this approach hasn't worked well for solid tumors is that tumors usually generate an immunosuppressive environment that disarms the T cells before they can reach their target. The MIT team decided to try to overcome this by giving a vaccine that would go to the lymph nodes, which host huge populations of immune cells, and stimulate the CAR-T cells there.

"Our hypothesis was that if you boosted those T cells through their CAR receptor in the lymph node, they would receive the right set of priming cues to make them more functional so they'd be resistant to shutdown and would still function when they got into the tumor," Irvine says.

To create such a vaccine, the MIT team used a trick they had discovered several years ago. They found that they could deliver vaccines more effectively to the lymph nodes by linking them to a fatty molecule called a lipid tail. This lipid tail binds to albumin, a protein found in the bloodstream, allowing the vaccine to hitch a ride directly to the lymph nodes.

In addition to the lipid tail, the vaccine contains an antigen that stimulates the CAR-T cells once they reach the lymph nodes. This antigen could be either the same tumor antigen targeted by the T cells, or an arbitrary molecule chosen by the researchers. For the latter case, the CAR-T cells have to be re-engineered so that they can be activated by both the tumor antigen and the arbitrary antigen.

In tests in mice, the researchers showed that either of these vaccines dramatically enhanced the T-cell response. When mice were given about 50,000 CAR-T cells but no vaccine, the CAR-T cells were nearly undetectable in the animals' bloodstream. In contrast, when the booster vaccine was given the day after the T-cell infusion, and again a week later, CAR-T cells expanded until they made up 65 percent of the animals' total T cell population two weeks after treatment.

This huge boost in the CAR-T cell population translated to complete elimination of glioblastoma, breast, and melanoma tumors in many of the mice. CAR-T cells given without the vaccine had no effect on tumors, while CAR-T cells given with the vaccine eliminated tumors in 60 percent of the mice.

Long-term memory

This technique also holds promise for preventing tumor recurrence, Irvine says. About 75 days after the initial treatment, the researchers injected tumor cells identical to those that formed the original tumor, and these cells were cleared by the immune system. About 50 days after that, the researchers injected slightly different tumor cells, which did not express the antigen that the original CAR-T cells targeted; the mice could also eliminate those tumor cells.

This suggests that once the CAR-T cells begin destroying tumors, the immune system is able to detect additional tumor antigens and generate populations of "memory" T cells that also target those proteins.

"If we take the animals that appear to be cured and we rechallenge them with tumor cells, they will reject all of them," Irvine says. "That is another exciting aspect of this strategy. You need to have T cells attacking many different antigens to succeed, because if you have a CAR-T cell that sees only one antigen, then the tumor only has to mutate that one antigen to escape immune attack. If the therapy induces new T-cell priming, this kind of escape mechanism becomes much more difficult."

While most of the study was done in mice, the researchers showed that human cells coated with CAR antigens also stimulated human CAR-T cells, suggesting that the same approach could work in human patients. The technology has been licensed to a company called Elicio Therapeutics, which is seeking to test it with CAR-T cell therapies that are already in development.

"There's really no barrier to doing this in patients pretty soon, because if we take a CAR-T cell and make an arbitrary peptide ligand for it, then we don't have to change the CAR-T cells," Irvine says. "I'm hopeful that one way or another this can get tested in patients in the next one to two years."

Credit: 
Massachusetts Institute of Technology

REM sleep silences the siren of the brain

Upset by something unpleasant? We have all been there. Fortunately, it also passes. A new day, a new beginning. At least: if you have restful REM sleep. Researchers at the Netherlands Institute for Neuroscience discovered why you will be better able to bear tomorrow what you are distressed about today. And why that can go wrong.

Siren of the brain

Something frightening or unpleasant does not go unnoticed. In our brain, the so-called limbic circuit of cells and connections immediately becomes active. First and foremost, such experiences activate the amygdala. This nucleus of brain cells located deep in the brain can be regarded as the siren of the brain: attention! For the brain to function properly, the siren must also be switched off again. For this, restful REM sleep, the part of sleep with the most vivid dreams, turns out to be essential.

Good sleepers

The researchers placed their participants in an MRI scanner in the evening and presented a specific odor while they made them feel upset. The brain scans showed how the amygdala became active. The participants then spent the night in the sleep lab, where the activity of their sleeping brain was measured with EEG and the specific odor was presented again on occasion. The next morning, the researchers tried to upset their volunteers again, in exactly the same way as the night before. But now they did not succeed so well. Brain circuits had adapted overnight; the siren of the brain no longer went off. The amygdala responded much less, especially in those who had had a lot of restful REM sleep and were meanwhile exposed to the specific odor.

Restless sleepers

However, among the participants were also people with restless REM sleep. Things went surprisingly differently for them. Brain circuits had not adapted well overnight: the siren of the brain continued to sound the next morning. And while the nocturnal exposure to the odor helped people with restful REM sleep adapt, the same exposure only made things worse for people with restless REM sleep.

Neuronal connections weaken and strengthen

During sleep, 'memory traces' of experiences from the past day are spontaneously played back, like a movie. Among all remnants of the day, a specific memory trace can be activated by presenting the same odor as the one that was present during the experience while awake. Meanwhile, memory traces are adjusted during sleep: some connections between brain cells are strengthened, others are weakened. Restless REM sleep disturbs these nocturnal adjustments, which are essential for recovery and adaptation to distress.

Transdiagnostic importance

The findings were published on 11 July in the leading journal Current Biology. The finding can be of great importance for about two-thirds of all people with a mental disorder, as both restless REM sleep and a hyperactive amygdala are the hallmarks of post-traumatic stress disorder (PTSD), anxiety disorders, depression and insomnia. People with PTSD carry their traumatic experience to the next day: people with an anxiety disorder take their greatest fear with them, people with depression their despair, and people with chronic insomnia their tension. Authors Rick Wassing, Frans Schalkwijk and Eus van Someren predict that treatment of restless REM sleep could transdiagnostically help to process emotional memories overnight and give them a better place in the brain.

Credit: 
Netherlands Institute for Neuroscience - KNAW

The best of both worlds: how to solve real problems on modern quantum computers

image: Photo shows Dr. Alexeev with a model of an IBM Q quantum computer.

Image: 
Argonne National Laboratory

In recent years, quantum devices have become available that enable researchers — for the first time — to use real quantum hardware to begin to solve scientific problems. However, in the near term, the number and quality of qubits (the basic unit of quantum information) for quantum computers are expected to remain limited, making it difficult to use these machines for practical applications.

A hybrid quantum and classical approach may be the answer to tackling this problem with existing quantum hardware. Researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and Los Alamos National Laboratory, along with researchers at Clemson University and Fujitsu Laboratories of America, have developed hybrid algorithms to run on quantum machines and have demonstrated them for practical applications using IBM quantum computers (see below for description of Argonne's role in the IBM Q Hub at Oak Ridge National Laboratory [ORNL]) and a D-Wave quantum computer.

“This approach will enable researchers to use near-term quantum computers to solve applications that support the DOE mission. For example, it can be applied to find community structures in metabolic networks or a microbiome.” — Yuri Alexeev, principal project specialist, Computational Science division

The team’s work is presented in an article entitled “A Hybrid Approach for Solving Optimization Problems on Small Quantum Computers” that appears in the June 2019 issue of the Institute of Electrical and Electronics Engineers (IEEE) Computer Magazine. 

Concerns about qubit connectivity, high noise levels, the effort required to correct errors, and the scalability of quantum hardware have limited researchers’ ability to deliver the solutions that future quantum computing promises.

The hybrid algorithms that the team developed employ the best features and capabilities of both classical and quantum computers to address these limitations. For example, classical computers have large memories capable of storing huge datasets — a challenge for quantum devices that have only a small number of qubits. On the other hand, quantum algorithms perform better for certain problems than classical algorithms.

To distinguish between the types of computation performed on two completely different types of hardware, the team referred to the classical and quantum stages of the hybrid algorithms as running on central processing units (CPUs) for classical computers and quantum processing units (QPUs) for quantum computers.

The team seized on graph partitioning and clustering as examples of practical and important optimization problems that can already be solved using quantum computers: a small graph problem can be solved directly on a QPU, while larger graph problems require hybrid quantum-classical approaches.

As a problem became too large to run directly on quantum computers, the researchers used decomposition methods to break the problem down into smaller pieces that the QPU could manage — an idea they borrowed from high-performance computing and classical numerical methods.

All the pieces were then assembled into a final solution on the CPU, which not only found better parameters, but also identified the best sub-problem size to solve on a quantum computer.
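To make that division of labor concrete, here is a minimal Python sketch of the workflow. It is an illustration only, not the team's code: the "QPU" below is a brute-force stand-in for a small quantum solver (such as an annealer or a QAOA circuit), and the chunking scheme, function names, and size limit are assumptions made for the example.

```python
# Illustrative hybrid CPU/"QPU" graph partitioning: the CPU decomposes a graph
# into small subproblems, a stand-in QPU solves each one, and the CPU merges
# the sub-solutions into a global assignment.
import itertools
import random

def qpu_solve(nodes, edges):
    """Stand-in for a QPU call: exhaustive search for the best balanced
    2-way split of a small node set, minimizing the number of cut edges."""
    nodes = list(nodes)
    best_cut, best_assign = None, None
    for bits in itertools.product((0, 1), repeat=len(nodes)):
        if abs(sum(bits) - (len(nodes) - sum(bits))) > 1:
            continue  # keep the two parts (nearly) equal in size
        assign = dict(zip(nodes, bits))
        cut = sum(1 for u, v in edges if assign[u] != assign[v])
        if best_cut is None or cut < best_cut:
            best_cut, best_assign = cut, assign
    return best_assign

def hybrid_partition(nodes, edges, qpu_limit=12):
    """CPU side: break a graph that is too large for the 'QPU' into chunks of
    at most qpu_limit nodes, solve each chunk, and merge the sub-solutions."""
    nodes = list(nodes)
    assignment = {}
    for start in range(0, len(nodes), qpu_limit):
        chunk = set(nodes[start:start + qpu_limit])
        sub_edges = [(u, v) for u, v in edges if u in chunk and v in chunk]
        assignment.update(qpu_solve(chunk, sub_edges))
    return assignment

if __name__ == "__main__":
    random.seed(1)
    all_nodes = list(range(30))
    all_edges = [(u, v) for u in all_nodes for v in all_nodes
                 if u < v and random.random() < 0.15]
    labels = hybrid_partition(all_nodes, all_edges)
    cut_size = sum(1 for u, v in all_edges if labels[u] != labels[v])
    print(f"{len(all_edges)} edges; cut size after hybrid partitioning: {cut_size}")
```

The sketch deliberately omits the refinement steps described in the article, such as tuning parameters and choosing the best sub-problem size on the CPU; it shows only the basic decompose-solve-reassemble loop.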

Such hybrid approaches are not a silver bullet; they do not allow for quantum speedup because using decomposition schemes limits speed as the size of the problem increases. In the next 10 years, though, expected improvements in qubits (quality, count, and connectivity), error correction, and quantum algorithms will decrease runtime and enable more advanced computation.

“In the meantime,” according to Yuri Alexeev, principal project specialist in the Computational Science division, “this approach will enable researchers to use near-term quantum computers to solve applications that support the DOE mission. For example, it can be applied to find community structures in metabolic networks or a microbiome.”

Credit: 
DOE/Argonne National Laboratory

New sensor could shake up earthquake response efforts

image: A new sensor developed at Lawrence Berkeley National Laboratory combines laser beams with a position sensitive detector to directly measure drift between building stories, an essential part of assessing earthquake damages in a building and deeming them safe to reoccupy.

Image: 
Diana Swantek/Berkeley Lab

Last week's massive southern California earthquakes shut down Ridgecrest Regional Hospital throughout the July 4 holiday weekend while the tiny town of Ridgecrest assessed the damages. A new optical sensor developed at Lawrence Berkeley National Laboratory (Berkeley Lab) could speed up the time it takes to evaluate whether critical buildings like these are safe to occupy shortly after a major earthquake.

The technology - which autonomously captures and transmits data depicting the relative displacement between two adjacent stories of a shaking building - is able to provide reliable information about building damage immediately following an earthquake, and could expedite efforts to safely assess, repair, and reoccupy buildings post-quake.

Scientists and engineers at Berkeley Lab, Lawrence Livermore National Laboratory, and the University of Nevada-Reno began working to design an optical method of measuring interstory drift within buildings in 2015. After four years of extensive peer-reviewed research and simulative testing at the University of Nevada's Earthquake Engineering Laboratory, the Discrete Diode Position Sensor (DDPS) will be deployed for the first time this summer in a multi-story building at Berkeley Lab - which sits adjacent to the Hayward Fault, considered one of the most dangerous faults in the United States.

"Until now, there's been no way to accurately and directly measure drift between building stories, which is a key parameter for assessing earthquake demand in a building," said David McCallen, a senior scientist in the Energy Geosciences Division at Berkeley Lab and faculty member at the University of Nevada, who leads the research collaboration.

The debut of DDPS comes as governments at every level make post-earthquake building inspection and reoccupation a central focus of response planning, and as the highly anticipated next generation of remote connectivity - 5G - becomes reality for rapid data transmission. "We are excited that this sensor technology is now ready for field trials, at a time when post-earthquake response strategies have evolved to prioritize safe, continued building functionality and re-occupancy in addition to 'life safety,'" McCallen said.

Optics makes a difference in monitoring seismic structural health

Measuring building interstory drift has been a factor in assessing buildings for post-earthquake damage for some time, yet finding a reliable method to do so has been fraught with challenges. Traditionally, engineers mounted strong motion earthquake accelerometers at select elevations to secure data on the back-and-forth and side-to-side force imposed on a shaking building. But processing the acceleration data from these instruments to obtain building drift displacements is very challenging due to the frequency limitations of the sensors, especially when buildings have sustained permanent displacements associated with damage. Even more difficult is receiving data quickly enough to inform decision-making on continuity of operations and occupant safety. In addition, because typical building accelerometer-based instrumentation can be quite costly, systems tend to be very sparse with accelerometers on relatively few buildings.

DDPS leverages a promising new alternative for directly measuring building interstory drift that combines laser beams with optical sensors. The technique centers on projecting laser light across a story height and sensing the position at which the light strikes a detector located on the adjacent building floor, thereby directly measuring structural drift. The tool developed at Berkeley Lab relies on a laser source and a position-sensitive detector. Making use of a geometric array of small, inexpensive light-sensitive photodiodes, the sensor can instantly track the position of an impinging laser beam.

"Previous generations of DDPS were quite a bit larger than the system we are now able to deploy," says McCallen. "Based on design advancements and lessons learned, the sensor is a quarter of the size of our original sensor design, but features 92 diodes staggered in a rectangular array so that the laser beam is always on one or more diodes."

So far, DDPS has held up through three rounds of rigorous experimental shake-table testing.

"The rigorous testing the DDPS has undergone indicates how the drift displacements measured on the three testbeds compared to representative drifts that could be achieved on an actual full-scale building undergoing strong shaking from an earthquake," McCallen said.

Why DDPS is smart for cities

The most populous town affected by the earthquakes in southern California earlier this month was Ridgecrest itself, a city of 29,000 that sits at the epicenter of the magnitude 7.1 earthquake that took place on July 5. Even though this is a small population center, the building damage estimates are still in the $100-million range.

If an earthquake of that magnitude were to hit Los Angeles, 150 miles to the south of tiny Ridgecrest, or San Francisco, nearly 400 miles north, hundreds to thousands of buildings would be at risk of damage. In that scenario, the ability to measure and display key interstory drift information immediately after an earthquake would provide critical new data for making informed decisions on building occupancy - giving first responders information to help guide their efforts to evacuate a building, and municipalities the potential to maintain functional use of important facilities such as hospitals.

In addition, understanding a building's drift profile would allow a quick determination of building damage potential, letting building inspectors know where to look for potential damage. This will be an important capability in moving beyond time-consuming and challenging manual inspections of hundreds of buildings after the next major urban earthquake.

McCallen noted, "The major earthquakes that struck in southern California this past week serve as a reminder of the risks associated with seismic activity across many regions of the United States. These events put an exclamation point on the need for continued societal focus on earthquake readiness and resilience, including an ability to provide the sensors and data analysis that can rapidly measure infrastructure health and inform the most effective response after the next major quake."

This research was funded by the U.S. Department of Energy's (DOE) Nuclear Safety Research and Development (NSR&D) Program managed by the Office of Nuclear Safety within the DOE Office of Environment, Health, Safety and Security. An objective of the NSR&D program is to establish an enduring Departmental commitment and capability to utilize NSR&D in preventing and/or reducing high consequence-low probability hazards and risks posed by DOE and NNSA nuclear facilities, operations, nuclear explosives, and environmental restoration activities.

Credit: 
DOE/Lawrence Berkeley National Laboratory

How procrastinators and doers differ genetically

image: The Bochum-based research team: Professor Onur Güntürkün, Caroline Schlüter, associate professor Dr. Sebastian Ocklenburg, Dr. Marlies Pinnow and Dr. Erhan Genç (from left).

Image: 
RUB, Kramer

Some people tend to postpone actions. In women, this trait is associated with a genetic predisposition towards a higher level of dopamine in the brain. This is what researchers from Ruhr-Universität Bochum and the Technical University of Dresden discovered using genetic analyses and questionnaires. They were unable to identify this correlation in men. "The neurotransmitter dopamine has repeatedly been associated with increased cognitive flexibility in the past," says Dr. Erhan Genç from the Bochum Department of Biopsychology. "This is not fundamentally bad but is often accompanied by increased distractibility."

Erhan Genç reports on the results together with Caroline Schlüter, Dr. Marlies Pinnow, Professor Onur Güntürkün, Professor Christian Beste and associate professor (PD) Dr. Sebastian Ocklenburg in the journal Social Cognitive and Affective Neuroscience on 3 July 2019.

Only for women

The research group investigated the genotype of 278 men and women. They were particularly interested in what is known as the tyrosine hydroxylase gene (TH gene). Depending on the expression of the gene, people's brains contain differing amounts of neurotransmitters from the catecholamine family, to which the neurotransmitter dopamine belongs. The team also used a questionnaire to record how well the participants were able to control their actions. Women with poorer action control had a genetic predisposition towards higher dopamine levels.

Dopamine and action control

Whether someone tends to postpone tasks or tackle them directly depends on the individual's ability to maintain a specific intention to act without being distracted by interfering factors. Dopamine could be crucial here. In previous studies, the neurotransmitter has not only been associated with increased cognitive flexibility but also seems to make it easier for information to enter the working memory.

"We assume that this makes it more difficult to maintain a distinct intention to act," says doctoral candidate Caroline Schlüter. "Women with a higher dopamine level as a result of their genotype may tend to postpone actions because they are more distracted by environmental and other factors."

More susceptible to genetic differences?

Previous studies have revealed gender-specific differences between the expression of the TH gene and behaviour. "The relationship is not yet understood fully, but the female sex hormone oestrogen seems to play a role," explains Erhan Genç. Oestrogen indirectly influences dopamine production in the brain and increases the number of certain neurons that respond to signals from the dopamine system. "Women may therefore be more susceptible to genetic differences in dopamine levels due to oestrogen, which, in turn, is reflected in behaviour," says the biopsychologist.

Outlook

In future studies, the research team intends to investigate to what extent oestrogen levels actually influence the relationship between the TH gene and action control. "This would require taking a closer look at the menstrual cycle and the associated fluctuations in the participants' oestrogen levels," explains Caroline Schlüter.

In addition to dopamine, the TH gene also influences norepinephrine, another important neurotransmitter from the catecholamine family. The researchers aim to examine the role that these two neurotransmitters play in action control in further studies.

Credit: 
Ruhr-University Bochum

Ammonia from agriculture influences cloud formation over Asia

image: With measurement flights during the Asian monsoon, satellite observations, and laboratory analyses, researchers solved the puzzle of the Asian tropopause aerosol layer.

Image: 
Dr. Erik Kretschmer

The Asian tropopause aerosol layer (ATAL) is located at a height of 12 to 18 kilometers above the Middle East and Asia. This accumulation of aerosols in the Asian monsoon was first discovered in 2011. Its composition and effect, however, have been unknown so far. A European consortium of scientists has now found that this layer consists of crystalline ammonium nitrate. In the AIDA cloud chamber, climate researchers of Karlsruhe Institute of Technology (KIT) demonstrated how this substance forms in the upper troposphere. The results are reported in Nature Geoscience.

Using a smart combination of remote sensing, in situ measurements, meteorological model calculations, specific laboratory measurements, and detailed numerical simulations, the team studied the distribution and composition of aerosols in the ATAL. Aerosols are tiny suspended particles from a variety of natural and anthropogenic sources. In the atmosphere, aerosols act as condensation nuclei to which water vapor attaches, forming cloud droplets. For the first time, a research aircraft flew through the upper levels of the Asian monsoon to study key processes of global importance. The different methods and instruments complemented each other to verify the measured results. Scientists from KIT, Forschungszentrum Jülich (FZJ), Johannes Gutenberg University and the Max Planck Institute for Chemistry, both in Mainz, the Alfred Wegener Institute, the University of Wuppertal, the Laboratoire de Météorologie Dynamique, Paris, and the Istituto di Scienze dell'Atmosfera e del Clima, Rome, took part.

"Surprisingly, we detected crystalline ammonium nitrate as a main constituent in large parts of the ATAL," says Dr. Michael Höpfner from the Atmospheric Trace Gases and Remote Sensing Division of KIT's Institute of Meteorology and Climate Research (IMK-ASF). The unexpected results measured, among others, by the GLORIA instrument of KIT and Forschungszentrum Jülich were then confirmed by climate researchers at KIT's AIDA cloud chamber: "Our experiments revealed that, contrary to the prevailing opinion, liquid ammonium nitrate droplets crystallize to solid particles at minus 50 degrees in the presence of small, mainly sulfur-containing pollutions. These solid particles continue to exist even under temperature and humidity conditions of the upper troposphere," says Dr. Robert Wagner from the Atmospheric Aerosol Research Division of KIT's Institute of Meteorology and Climate Research. With satellite observations, the researchers indeed found large amounts of ammonium nitrate aerosols above Asia. These observations reach back into the year 1997 when the ATAL was not yet supposed to exist.

"With this, we have solved the long-standing puzzle of the composition of ATAL," says Michael Höpfner. So far, it has been considered highly improbable that this aerosol exists at such high altitudes, because the precursory ammonia gas is washed out of the atmosphere very quickly by rain. "But we detected unparalleled ammonia concentrations during the Asian monsoon: the values are up to fifty times higher than in previous measurements," Höpfner adds. This ammonium mainly originates from agriculture, in particular from lifestock farming and fertilization. The highest ammonia emissions are currently found in Asia. During the monsoon, polluted air masses are transported from the land surface to heights of up to 18 km. Here, ammonia reacts to ammonium nitrate, an aerosol that influences both the formation and properties of clouds.

"It is now for the first time that our data prove that ammonium nitrate aerosols are omnipresent in the upper troposphere during the Asian monsoon," Höpfner says. These results are relevant in particular to the interactions of clouds and aerosols, which represent one of the biggest uncertainties in climate modeling. Moreover, the findings prove that ammonia emitted on the ground has a big influence on the processes in the upper troposphere and potentially on the Asian climate.

Tracking Down Ammonia: The GLORIA Measurement Instrument and AIDA Cloud Chamber

The aircraft campaign was part of the StratoClim project, in which 37 scientific organizations from eleven European countries, the USA, Bangladesh, India, and Nepal collaborate under the direction of the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research. The high-altitude aircraft M55-Geophysika carried 25 specially developed instruments to heights above 20 km, about twice the altitude usually reached by airplanes. A major instrument on board Geophysika was the infrared spectrometer GLORIA (Gimballed Limb Observer for Radiance Imaging of the Atmosphere), which measures the height distribution of a variety of trace gases along the flight path. Measurements during the flights concentrated mainly on ammonia, as it is heavily involved in the formation of aerosol particles. GLORIA is presently the only instrument that can measure ammonia at these heights.

Based on the data measured by the satellite instrument MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) of KIT's IMK-ASF, which recorded the height distribution of more than 30 trace gases between 2002 and 2012, the scientists for the first time obtained the global distributions of ammonia and ammonium nitrate simultaneously. For their studies, they also used the AIDA (Aerosol Interactions and Dynamics in the Atmosphere) facility on KIT's Campus North. It is the only facility worldwide where aerosol and climate processes can be studied under atmospheric conditions. In the facility, all temperature and pressure conditions of the lower and middle atmosphere can be simulated.

Credit: 
Karlsruher Institut für Technologie (KIT)

How DNA outside cells can be targeted to prevent the spread of cancer

Cell-free DNA (cfDNA) is DNA found in trace amounts in blood, which has escaped degradation by enzymes. Scientists from Tokyo University of Science, led by Prof Ryushin Mizuta, have now discovered exactly how cfDNA is generated. They also talk about the applications of DNase1L3--the enzyme mainly responsible for generating cfDNA--as a novel molecule to prevent the spread of tumors. Prof Mizuta says, "The results of this study are an important step toward one phase of an exciting new era of genomic medicine."

In 1994, a mutation in a well-known cancer-associated gene, RAS, was found in cfDNA from the blood of cancer patients. This sparked interest in the potential use of cfDNA as a diagnostic marker for tumors. Fetal cfDNA in the blood of pregnant women had already gained popularity as a tool for prenatal screening. In this day and age, given the multitude of advances in genomics, genetic analysis using cfDNA could revolutionize the era of precision medicine or "genomic medicine." This basically means that one can get medication tailor-made according to one's genetic makeup.

However, until now, exactly what gives rise to cfDNA was a question that was left unanswered. Is it derived from cells that undergo programmed death in the body (apoptosis), or is it derived from cells dying by injury or inflammation (necrosis)? What are the DNA-degrading enzymes (termed "endonucleases") involved? Is there more to cfDNA than meets the eye? The study group led by Prof Mizuta, of the Research Institute of Biomedical Sciences at Tokyo University of Science, has now answered all of these questions.

Prior to this study, these scientists had already discovered an endonuclease, DNase1L3 (also called DNase γ), and found that it causes cellular DNA fragmentation during necrosis: when a cell membrane is abruptly broken, DNase1L3 in the blood stream rapidly degrades the cellular DNA into single nucleosomes (the basic units of DNA packaging). They had also found that this DNase1L3 plays second fiddle to caspase-activated DNase (CAD; the main degrading enzyme in apoptosis) during apoptosis: CAD degrades the initial fully packaged DNA called "chromatin" and the apoptotic cells are scavenged by specialized "eating" cells in the immune system, called macrophages. However, when some cells escape this scavenging process, they flow into the bloodstream and undergo "secondary" necrosis, after which DNase1L3 breaks down the DNA into nucleosomes.

Now, in this particular study, the researchers used genetically manipulated mice as study models to pinpoint the enzymes responsible for generating cfDNA. They induced both apoptosis and necrosis in normal mice, mice deficient for CAD, mice deficient for DNase1L3, and CAD + DNase1L3-double-deficient mice. Through a technique called electrophoresis, the scientists observed that blood from DNase1L3-deficient mice had much lower concentrations of cfDNA than blood from CAD-deficient mice and normal mice, in both the apoptosis- and necrosis-induced groups. Interestingly, blood from CAD + DNase1L3-double-deficient mice did not show any traces of cfDNA at all. The scientists thus concluded that during apoptosis, DNase1L3 is crucial as a "backup" enzyme for CAD in degrading condensed chromatin into fragments (single nucleosomes), thus giving rise to cfDNA. And in necrosis, DNase1L3 is absolutely essential for generating cfDNA.

The researchers also checked the activity of DNase1L3 and DNase1 (another DNA-degrading enzyme) in blood and found that apoptosis and necrosis increased the activity of both DNase1L3 and DNase1. However, even when no cfDNA was observed in CAD + DNase1L3-double-deficient mice, DNase1 activity was observed. This proved that DNase1 is not essential for cfDNA generation.

The researchers then shed some light on the physiological/medical importance of DNase1L3. Prof Mizuta says, "Because this enzyme is produced mainly by macrophages, there could be a correlation between DNase1L3 activity and inflammation."

After infection or injury, a group of specialized immune cells called neutrophils release small sticky fibers of chromatin, which is undegraded dead-cell DNA. These fibers are called neutrophil extracellular traps (NETs). Although NETs can stop harmful bacteria from spreading in the bloodstream, NET release can sometimes become uncontrolled; this could cause clotting or embolism (lodging of the clot inside a blood vessel), a potentially fatal condition. Prof Mizuta states that DNase1L3 can degrade NETs into cfDNA and thus be used to treat thrombosis caused by NETs.

NETs are also known to be the "seeding soil" for tumors. Tumor cells released in blood might latch onto NETs and grow on them and spread to other organs. For this, Prof Mizuta says, "Because DNase1L3 degrades NETs and generates cfDNA, we speculate that DNase1L3 treatment may also be useful to prevent tumor metastasis. We are now conducting experiments to test this speculation."

That said, can more research on cell-free DNA make human life cancer-free? Only time will tell...

Credit: 
Tokyo University of Science

Alternating currents cause Jupiter's aurora

An international team of researchers has succeeded in measuring the current system responsible for Jupiter's aurora. Using data transmitted to Earth by NASA's Juno spacecraft, they showed that the direct currents were much weaker than expected and that alternating currents must therefore play a special role. On Earth, on the other hand, a direct current system creates its aurora. Jupiter's electric current system is kept going in particular by large centrifugal forces, which hurl ionized sulfur dioxide gas from the gas giant's moon Io through the magnetosphere.

Professor Dr Joachim Saur from the Institute of Geophysics and Meteorology at the University of Cologne was involved in the project. The article 'Birkeland currents in Jupiter's magnetosphere observed by the polar-orbiting Juno spacecraft' is published in the current issue of Nature Astronomy.

Jupiter, the largest planet in the solar system, has the brightest aurora, with a radiant power of 100 terawatts (100,000,000,000 kilowatts, or one hundred billion kW); about 100,000 power plants would be needed to produce this much power. Like those on Earth, Jupiter's aurora form two huge oval rings around the poles. They are driven by a gigantic system of electric currents that connects the polar light region with Jupiter's magnetosphere. The magnetosphere is the region around a planet that is influenced by its magnetic field. Most of the electric current flows along Jupiter's magnetic field lines; such field-aligned currents are also known as Birkeland currents.
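
As a rough check on the 100,000-plants figure (assuming a typical large power plant output of about 1 gigawatt, a value not stated in the release):

\[ \frac{100\ \mathrm{TW}}{1\ \mathrm{GW\ per\ plant}} = \frac{10^{14}\ \mathrm{W}}{10^{9}\ \mathrm{W}} = 10^{5} = 100{,}000\ \mathrm{plants}. \]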

NASA's Juno spacecraft has been in a polar orbit around Jupiter since July 2016. Its goal is to better understand the interior and aurora of Jupiter. Juno has now measured for the first time the direct current system responsible for Jupiter's aurora. For this purpose, the scientists measured the magnetic field environment of Jupiter with high precision in order to derive the electric currents. The total current is approximately 50 million amperes, a value clearly below the theoretically expected one. The reason for this deviation is small-scale, turbulent alternating currents (also referred to as Alfvénic currents), which have so far received little attention. 'These observations, combined with other Juno spacecraft measurements, show that alternating currents play a much greater role in generating Jupiter's aurora than the direct current system,' Joachim Saur said. He has been doing research on these turbulent alternating currents for 15 years and has long stressed their importance. Jupiter's aurora differ from those on Earth, which are essentially generated by direct currents. The Earth's northern lights are about a thousand times weaker because the Earth is smaller than Jupiter, has a weaker magnetic field and rotates more slowly.
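
Deriving currents from magnetic field measurements rests on a standard piece of physics; the following is a general sketch of the principle, not the paper's specific data-reduction procedure. By Ampère's law, the current threading a closed path flown by the spacecraft is fixed by the magnetic perturbation integrated along that path,

\[ I_{\mathrm{enc}} = \frac{1}{\mu_{0}} \oint \mathbf{B} \cdot \mathrm{d}\boldsymbol{\ell}, \]

so precise magnetometer data collected along Juno's polar passes can be converted into estimates of the field-aligned (Birkeland) currents.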

'Jupiter's electric current systems are driven by the enormous centrifugal forces in Jupiter's rapidly rotating magnetosphere,' Saur remarked. The volcanically active Jupiter moon Io produces one ton of sulfur dioxide gas per second, which becomes ionized and feeds Jupiter's magnetosphere. 'Because of Jupiter's fast rotation - a day on Jupiter lasts only ten hours - the centrifugal forces move the ionized gas in Jupiter's magnetic field, which generates the electric currents,' the geophysicist concludes.
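
A rough, order-of-magnitude illustration (using round numbers not given in the release: a 10-hour rotation period and Io's orbital distance of roughly 422,000 km) shows why these centrifugal forces dominate. Plasma forced to corotate with Jupiter's magnetic field at Io's orbit experiences a centrifugal acceleration of about

\[ a_{\mathrm{cf}} = \omega^{2} r \approx \left(\frac{2\pi}{10\ \mathrm{h}}\right)^{2} \times 4.2\times10^{8}\ \mathrm{m} \approx 13\ \mathrm{m\,s^{-2}}, \]

roughly 18 times Jupiter's gravitational pull at that distance (about 0.7 m/s²), so the ionized gas is flung outward through the magnetosphere.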

Credit: 
University of Cologne

Scientists discover a novel perception mechanism regulating important plant processes

An international research team has revealed a novel mechanism for the perception of endogenous peptides by a plant receptor. The discovery of this activation mechanism sets a new paradigm for how plants react to internal and external cues. The study 'Mechanisms of RALF peptide perception by a heterotypic receptor complex' was published today in the journal Nature.

Just as humans produce peptide hormones such as insulin, plants produce peptide hormones that orchestrate internal processes and responses, including growth, development, and immunity. One of them is RALF23, which belongs to the large RALF family of plant peptides. Notably, the study revealed a novel recognition mechanism for RALF23 peptide signals by plant receptors. Since RALF peptides play major roles in multiple important plant processes, these findings will shape our understanding of how several additional important receptors control fundamental plant processes.

Previous work by the group of Professor Dr Cyril Zipfel at The Sainsbury Laboratory (Norwich, UK) and now at the University of Zürich (Zürich, Switzerland) had identified that RALF23 regulates plant innate immunity. Using a combination of genetics, biochemistry and structural biology, a close collaboration between this group and the group of Professor Dr Jijie Chai at the Innovation Center for Structural Biology and the Joint Center for Life Sciences of Tsinghua and Peking Universities (Beijing, China) and at the University of Cologne (Cologne, Germany) has now identified the molecular basis for RALF23 perception. This work further involved collaborators from the Gregor Mendel Institute (Vienna, Austria).

Professor Jijie Chai said: 'We were excited about the results when we saw that RALF23 needs two distinct types of proteins - a receptor kinase (FERONIA) and an unrelated membrane-associated protein - to be recognized. The way these three proteins form an impressive perception complex might apply to other plant receptors that recognize peptide hormones.'

Professor Cyril Zipfel added: 'FERONIA is a plant receptor that was actually identified at the University of Zürich over a decade ago by my colleague Professor Ueli Grossniklaus for its important role in reproduction, but has since been shown to play key roles in multiple plant processes. Now that we understand the molecular basis of how FERONIA can perceive RALF peptides, it will help characterize how this unique receptor controls several aspects of plants' life.'

Credit: 
University of Cologne

Why sex becomes less satisfying with age

CLEVELAND, Ohio (July 10, 2019)--The number of women regularly having sex declines with age, and the number of women enjoying sex postmenopause is even lower. Although these facts are not surprising, the causes of these declines remain poorly understood, in part because previous research has focused largely on biological factors. A new UK study, however, identifies psychosocial contributors. Study results are published online today in Menopause, the journal of The North American Menopause Society (NAMS).

It's hard to pick up a women's magazine or ob/gyn journal anymore without reading an article about how and why a woman's libido and level of sexual satisfaction decline during and after menopause. Substantial research has been conducted into biological reasons such as hot flashes, sleep disruption, vaginal dryness, and painful intercourse. Much less is known about the effect of various psychosocial changes that are common postmenopause. These include body image concerns, self-confidence and perceived desirability, stress, mood changes, and relationship issues.

Of the research that has been conducted regarding psychological influences, most has focused on quantitative results. A study of nearly 4,500 postmenopausal women involved in the UK Collaborative Trial of Ovarian Cancer Screening (UKCTOCS), however, looked at free-text data to better understand why women felt a certain way and the depth of those feelings.

Among other things, the UKCTOCS sexual activity data showed that, at baseline, before the start of annual screening, approximately half of the women were sexually active. A decrease in all aspects of sexual activity was observed over time: sexual activity was less frequent, not as pleasurable, and more uncomfortable. The primary reason for absence of sexual activity was the lack of a partner, mainly because of widowhood.

Other commonly cited reasons for decreased activity included (in rank order) a partner's medical condition, a partner's sexual dysfunction, the woman's own physical health problems, menopause-related symptoms, and prescribed medication. Contributing most often to low libido were relationship problems, logistics, and perceptions of aging. Only 3% of participants described positive sexual experiences, and only 6% sought medical help for sexual problems.

Study results appear in the article "Sexual functioning in 4,418 postmenopausal women participating in UKCTOCS: a qualitative free-text analysis."

"Sexual health challenges are common in women as they age, and partner factors play a prominent role in women's sexual activity and satisfaction, including the lack of a partner, sexual dysfunction of a partner, poor physical health of a partner, and relationship issues," says Dr. Stephanie Faubion, NAMS medical director. "In addition, menopause-related problems such as vaginal dryness and pain with sex have been identified as problems affecting sexual function, yet few women seek treatment for these issues, despite the availability of effective therapies."

Credit: 
The Menopause Society