Culture

Researchers quantify carbon changes in Sierra Nevada meadow soils

image: A Sierra Nevada meadow that sequestered large amounts of soil carbon, part of a study to quantify carbon changes.

Image: 
Photo by C.C. Reed, University of Nevada, Reno.

RENO, Nev. - Meadows in the Sierra Nevada mountains are critical components of watersheds. In addition to supplying water to over 25 million people in California and Nevada, meadows contain large quantities of carbon belowground. While it has been known for some time that meadows have large quantities of soil carbon, whether meadow soils are gaining or losing carbon has remained unclear.

A new study led by researchers in the College of Agriculture, Biotechnology & Natural Resources at the University of Nevada, Reno, has demonstrated for the first time that meadows throughout the region are both gaining and losing carbon at high rates. Capture and storage of carbon in soil is a natural way to reduce carbon dioxide levels in the atmosphere and combat climate change. However, human activities can disrupt natural processes and lead to the loss of soil carbon to the atmosphere. These results suggest that meadow management may either contribute to climate change or mitigate the harmful effects of increasing atmospheric carbon dioxide.

The research was conducted in partnership with the University of California Merced, as well as numerous restoration practitioners and conservation organizations in more than a dozen meadows throughout the Sierra Nevada mountains. The study is aimed at arming restoration practitioners with information to help make good management decisions.

"Meadows are known for their lush, diverse vegetation supported by soils that stay wet into the summer," explained University of Nevada, Reno, doctoral candidate Cody Reed, who led the study. "However, a long history of human activity in many meadows throughout the Sierra Nevada has resulted in drier soils and the replacement of wetland vegetation with sparse grasses and shrubs."

In research published this week in the scientific journal Ecosystems, Reed and her coauthors, including Associate Professors Benjamin Sullivan and Paul Verburg and Professor Emeritus Sherman Swanson from the University, revealed that meadows with wetland plant communities and dense root mats were large net carbon sinks during the year measured, meaning they removed carbon from the atmosphere. In fact, per acre, the amount of carbon captured in these meadows was similar to rates measured in tropical rainforests. On the other hand, meadows with more bare ground and plant communities associated with drier soil released large amounts of carbon from the soil to the atmosphere.

In the long term, such changes to the large soil carbon stocks in meadows could add up. And unlike in forests, where most carbon is sequestered in wood aboveground, the change in carbon in meadows is belowground. This means meadow soil carbon is less vulnerable to disturbances such as wildfire and may persist in the ecosystem longer than aboveground carbon. At the same time, soil carbon provides other important benefits besides taking carbon out of the atmosphere.

"Worldwide, soils may contain up to four times as much carbon as the atmosphere," Reed explained. "Soil carbon in meadows also helps improve water quality and quantity, as well as soil fertility to support diverse plant communities important for wildlife and grazing."

The research will likely help meadow restoration practitioners identify meadows in need of conservation to maintain soil carbon gains and meadows in need of restoration to prevent additional losses of soil carbon to the atmosphere. The researchers estimate that three acres of surrounding forest are required to offset the amount of carbon lost by one acre of degraded meadow. On the other hand, one acre of meadow may sequester as much carbon as six acres of forest.
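The acreage comparisons come down to simple ratios of per-acre carbon fluxes. The minimal sketch below shows that arithmetic; the flux values are hypothetical placeholders chosen only to mirror the reported three-to-one and six-to-one ratios, not numbers from the study.

```python
# Hypothetical per-acre annual carbon fluxes (arbitrary units), chosen only
# to reproduce the ratios quoted in the press release.
FOREST_UPTAKE = 1.0          # carbon gained per acre of forest
DEGRADED_MEADOW_LOSS = -3.0  # carbon lost per acre of degraded meadow
INTACT_MEADOW_UPTAKE = 6.0   # carbon gained per acre of intact meadow

# Forest acres needed to offset one acre of degraded meadow:
offset_acres = abs(DEGRADED_MEADOW_LOSS) / FOREST_UPTAKE
print(f"{offset_acres:.0f} acres of forest offset 1 acre of degraded meadow")

# Forest acres matched by one acre of intact meadow:
equivalent = INTACT_MEADOW_UPTAKE / FOREST_UPTAKE
print(f"1 acre of intact meadow sequesters as much as {equivalent:.0f} acres of forest")
```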

"Our research shows meadows may be some of the best bang for the buck in terms of carbon management in the region," Sullivan said. "My hope is that soil carbon sequestration can be integrated with other objectives to achieve management strategies that improve ecosystem functions in meadows."

Sullivan said he and his colleagues in the College's Department of Natural Resources & Environmental Science and Experiment Station plan to conduct future research to quantify the impacts of meadow restoration on soil carbon.

Credit: 
University of Nevada, Reno

Cleveland Clinic-led trial shows drug effective in 96% of patients with recurrent pericarditis

Cleveland: Cleveland Clinic researchers leading a global clinical trial have found that rilonacept, an FDA-approved drug for other inflammatory diseases, resolved acute pericarditis episodes and reduced the risk of pericarditis recurrence. The study was published today in the New England Journal of Medicine and presented at the American Heart Association's Scientific Sessions.

Pericarditis is an inflammation of the pericardium, which is two thin layers of tissue that surround the heart and help it function. A common symptom is severe sharp chest pain, which is caused by the inflamed layers rubbing against the heart. Pericarditis can be acute, recurrent or chronic, and often occurs after a viral infection or cardiac surgery. Recurrent pericarditis usually occurs 4-6 weeks after the first episode of acute pericarditis and often causes debilitating chest pain, physical limitations, hospitalizations and decreased quality of life.

Currently, there are no FDA-approved therapies for pericarditis. Anti-inflammatories and steroids, often with harsh side effects, are used to treat the condition. The Rhapsody global Phase 3 clinical trial studied 61 patients with recurrent pericarditis, who were randomized to rilonacept or placebo. After 16 weeks of treatment, 81% of patients on once-weekly rilonacept reported no or minimal pericarditis symptoms, versus 25% on the placebo. The drug not only resolved active episodes after the first dose, it also decreased recurrences by 96%.

All patients in the study who had been taking corticosteroids successfully tapered off them and transitioned to rilonacept.

"Recurring pericarditis is painful and can be debilitating to those who suffer from it," said Allan Klein M.D., director of the Center for the Diagnosis and Treatment of Pericardial Diseases at Cleveland Clinic, co-principal investigator of the study, and a paid member of Kiniksa Pharmaceuticals' scientific advisory board. "The Rhapsody trial was highly successful, essentially stopping this disease, and provides a new hope for these patients."

Of the 25 safety events in the study, 23 occurred in the placebo group, while only two occurred in the rilonacept group.

Credit: 
Cleveland Clinic

X-ray study explores potential of hepatitis C drugs to treat COVID-19

image: The heart-shaped SARS-CoV-2 main protease enables the virus to reproduce by cutting long chains of proteins that activate the replication process. Experiments show existing drugs used to treat hepatitis C may have potential to treat COVID-19 by stopping the "heart" of the virus.

Image: 
Michelle Lehman, Jill Hemman/ORNL, U.S. Dept. of Energy

Experiments led by researchers at the Department of Energy's Oak Ridge National Laboratory have determined that several hepatitis C drugs can inhibit the SARS-CoV-2 main protease, a crucial protein enzyme that enables the novel coronavirus to reproduce.

Inhibiting, or blocking, this protease from functioning is vital to stopping the virus from spreading in patients with COVID-19. The study, published in the journal Structure, is part of efforts to quickly develop pharmaceutical treatments for COVID-19 by repurposing existing drugs known to effectively treat other viral diseases.

"Currently, there are no inhibitors approved by the Food and Drug Administration that target the SARS-CoV-2 main protease," said ORNL lead author Daniel Kneller. "What we found is that hepatitis C drugs bind to and inhibit the coronavirus protease. This is an important first step in determining whether these drugs should be considered as potential repurposing candidates to treat COVID-19."

The SARS-CoV-2 coronavirus spreads by expressing long chains of polyproteins that must be cut by the main protease to become functional proteins, making the protease an important drug target for researchers and drug developers.

In the study, the team looked at several well-known drug molecules as potential repurposing candidates, including leupeptin, a naturally occurring protease inhibitor, and three FDA-approved hepatitis C protease inhibitors: telaprevir, narlaprevir and boceprevir.

The team performed room temperature X-ray measurements to build a three-dimensional map that revealed how the atoms were arranged and where chemical bonds formed between the protease and the drug inhibitor molecules.

The experiments yielded promising results for certain hepatitis C drugs in their ability to bind and inhibit the SARS-CoV-2 main protease -- particularly boceprevir and narlaprevir. Leupeptin exhibited a low binding affinity and was ruled out as a viable candidate.

To better understand how well, or how tightly, the inhibitors bind to the protease, the team used in vitro enzyme kinetics, a technique that enables researchers to study the protease and the inhibitor in a test tube to measure the inhibitor's binding affinity, or compatibility, with the protease. The higher the binding affinity, the more effective the inhibitor is at blocking the protease from functioning.
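As a rough illustration of what binding affinity means in kinetic terms, the sketch below evaluates the textbook competitive-inhibition form of the Michaelis-Menten equation. All constants are hypothetical placeholders, not values measured in the study.

```python
def fractional_activity(substrate, km, inhibitor, ki):
    """Rate with a competitive inhibitor, relative to the uninhibited rate.
    A lower Ki means a higher binding affinity, and therefore stronger
    inhibition at the same inhibitor concentration."""
    v_inhibited = substrate / (km * (1 + inhibitor / ki) + substrate)
    v_uninhibited = substrate / (km + substrate)
    return v_inhibited / v_uninhibited

# Hypothetical concentrations and constants (micromolar):
S, KM, I = 50.0, 20.0, 10.0
for name, ki in [("tight binder", 0.5), ("weak binder", 50.0)]:
    print(f"{name}: {fractional_activity(S, KM, I, ki):.0%} of uninhibited activity")
```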

"What we're doing is laying the molecular foundation for these potential drug repurposing inhibitors by revealing their mode of action," said ORNL corresponding author Andrey Kovalevsky. "We show on a molecular level how they bind, where they bind, and what they're doing to the enzyme shape. And, with in vitro kinetics, we also know how well they bind. Each piece of information gets us one step closer to realizing how to stop the virus."

The study also sheds light on a peculiar behavior of the protease: its ability to change or adapt its shape according to the size and structure of the inhibitor molecule it binds. Pockets within the protease where a drug molecule would attach are highly malleable, or flexible, and can open or close to an extent that depends on the size of the drug molecule.

Before the paper was published, the researchers made their data publicly available to inform and assist the scientific and medical communities. More research, including clinical trials, is necessary to validate the drugs' efficacy and safety as a COVID-19 treatment.

"The research suggests that hepatitis C inhibitors are worth thinking about as potential repurposing candidates. Immediately releasing our data allows the scientific community to start looking at the interactions between these inhibitors and the protease," said ORNL corresponding author Leighton Coates. "You can't design a drug without knowing how it works on a molecular level, and the data we're providing is exactly what developers need to design stronger, more tightly binding drugs for more effective treatments."

Credit: 
DOE/Oak Ridge National Laboratory

Solitary bees are born with a functional internal clock - unlike honeybees

image: Female of the solitary red mason bee inside the Locomotor Activity Monitor system for registering activity rhythms

Image: 
The authors

Social insects like honeybees and hornets evolved from solitary bees and wasps, respectively. A common trait of many social insects is age-specific behavior: when they emerge from the pupa, workers typically specialize in around-the-clock tasks inside the darkness of the nest, starting with brood care. But they gradually shift towards more cyclic tasks away from the center of the nest as they get older -- culminating in foraging outside, exclusively in daylight, towards the end of their life. Here, researchers find evidence that this shift from around-the-clock to rhythmic tasks, which does not occur in solitary insects, seems to be driven by a slower maturation of the endogenous (i.e. internal) "circadian" clock in social honeybees than in solitary bees.

They find that in the solitary red mason bee Osmia bicornis, where females of every age forage to provide food for offspring, females and males emerge with a mature, fully functional circadian clock, as shown by their 24-h movement rhythm and the activity of the brain cells that produce the "pacemaker" protein Pigment-Dispersing Factor (PDF).

"Our results indicate that the maturation of the circadian clock is delayed in social honeybees as compared to solitary mason bees. We predict that rapid maturation of the circadian clock is the ancestral condition and will be found throughout the solitary bees and wasps, which all need to forage and perform brood care throughout their lifespan. In contrast, a delay in maturation may have evolved secondarily in social species, to enable age-related behavioral shifts from acyclic brood care to daily foraging," says Dr Katharina Beer, a postdoctoral scientist at the Department of Animal Ecology and Tropical Biology of the Julius Maximilian Universitaet Wuerzburg, Germany. Her study, done with Professor Charlotte Helfrich-Foerster, the Chair of Neurobiology and Genetics at Julius Maximilian University, is published in the open-access journal Frontiers in Cell and Developmental Biology.

The red mason bee is an important pollinator for agriculture, occurring across Europe, North Africa, and the Near East. Unlike in social insects, there is no infertile worker caste: females overwinter in their natal nest to emerge in the spring, mate, and build new nests in hollow stems. Females forage on many different plant species for pollen and nectar, which they store inside a row of sealed cells, laying a single egg per cell.

Beer and Helfrich-Foerster collected newly emerged bees -- 40 workers (females) from isolated brood combs of honeybees (Apis mellifera), and 56 females and 31 males from isolated pupal cocoons of O. bicornis -- and placed them inside separate tubes in a Locomotor Activity Monitor system, an apparatus for registering the activity of individual bees. An infrared beam crosses the center of each tube and is interrupted whenever the bee moves. By registering the activity pattern, in darkness and at constant temperature, around the clock for 3 to 45 days after emergence, the researchers could derive the rhythm of each bee.
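How a free-running rhythm is extracted from raw beam-crossing counts is not spelled out above. One generic approach, offered only as an illustration of how such an analysis can work (not the authors' actual pipeline), is to fold the activity series at candidate periods and score each fold by how strongly the per-phase means vary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated beam-crossing counts, one value per hour for 10 days, with a
# built-in 24-h rhythm plus noise:
hours = np.arange(240)
activity = 5 + 4 * np.sin(2 * np.pi * hours / 24) + rng.poisson(1, hours.size)

def fold_power(counts, period):
    """Fold the series at the given period (in hours) and return the variance
    of the per-phase means; a true rhythm yields a peak at its period."""
    phase = np.arange(counts.size) % period
    means = np.array([counts[phase == p].mean() for p in range(period)])
    return means.var()

periods = range(16, 33)
best = max(periods, key=lambda p: fold_power(activity, p))
print(f"Strongest periodicity near {best} h")  # expect ~24
```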

The results showed that none of the honeybees displayed a spontaneous ca. 24-h rhythm immediately after emergence from the pupa -- regardless of the degree of contact with the natal colony -- while 88% of O. bicornis females and males did. The 12% of O. bicornis that did not died young. In time, however, at an age of at least two days, each honeybee also developed a pronounced ca. 24-h rhythm. Beer and Helfrich-Foerster conclude that solitary O. bicornis bees, but not social honeybees, emerge with a functional endogenous circadian clock.

What is the neural basis for this difference? To answer this question, the researchers used immunohistochemistry to compare the maturation of the neurons that synthesize PDF in the two species. In the brain of insects, a cluster of specialized cells called the lateral ventral neurons functions as a circadian pacemaker, secreting bursts of PDF, a so-called neuromodulator that affects the activity of the central nervous system. By staining dissected, fixed brains with two antibodies -- the first recognizing PDF and the second, labeled with a fluorescent tag, binding to the first -- Beer and Helfrich-Foerster could count the PDF-producing neurons in each brain hemisphere at different ages, from just before emergence to four weeks after. They show that these neurons steadily increase in number with age in honeybees, but not in O. bicornis.

The authors conclude that the circadian pacemaker is fully mature in newly emerged O. bicornis, but needs further development to become fully active in honeybees, explaining why young honeybees don't yet show a circadian rhythm. This delay in maturation is likely an evolutionary adaptation to sociality, in which young bees need to engage in around-the-clock care of their immature siblings.

"It will be most interesting to perform similar studies on other social insects such as ants, which have various forms of social behavior: some display age-related behavior like in honeybees, wile others don't. If we find similar differences in the maturation of the endogenous clock related to the different social structures of these ant species, this will strongly support our hypothesis," concludes Helfrich-Foerster.

Credit: 
Frontiers

Making the best decision: Math shows diverse thinkers equal better results

image: Bhargav Karamched is an assistant professor of mathematics at Florida State University.

Image: 
Courtesy of Bhargav Karamched

TALLAHASSEE, Fla. - Whether it is ants forming a trail or individuals crossing the street, the exchange of information is key in making everyday decisions. But new Florida State University research shows that the group decision-making process may work best when members process information a bit differently.

Bhargav Karamched, assistant professor of mathematics, and a team of researchers published a new study today that tackles how groups make decisions and the dynamics that make for fast and accurate decision making. He found that networks consisting of both impulsive and deliberate individuals made, on average, quicker and better decisions than a group of homogeneous thinkers.

"In groups with impulsive and deliberate individuals, the first decision is made quickly by an impulsive individual who needs little evidence to make a choice," Karamched said. "But, even when wrong, this fast decision can reveal the correct options to everyone else. This is not the case in homogenous groups."

The paper is published in Physical Review Letters.

Researchers noted in the paper that the exchange of information is crucial in a variety of biological and social functions. But Karamched said although information sharing in networks has been studied quite a bit, very little work has been done on how individuals in a network should integrate information from their peers with their own private evidence accumulation. Most of the studies, both theoretical and experimental, have focused on how isolated individuals optimally gather evidence to make a choice.

"This work was motivated by that," Karamched said. "How should individuals optimally accumulate evidence they see for themselves with evidence they obtain from their peers to make the best possible decisions?"

Krešimir Josić, Moores Professor of Mathematics, Biology and Biochemistry at the University of Houston and senior author of the study, noted that the process works best when individuals in a group make the most of their varied backgrounds to collect the necessary materials and knowledge to make a final decision.

"Collective social decision making is valuable if all individuals have access to different types of information," Josi? said.

Karamched used mathematical modeling to reach his conclusion but said there is plenty of room for follow-up research.

Karamched said that his model assumes the evidence accrued by one individual is independent of the evidence collected by another member of the group. If a group of individuals is trying to make a decision based on information that is available to everyone, additional modeling would need to account for how correlations in the information affect collective decision-making.

"For example, to choose between voting Republican or Democrat in an election, the information available to everyone is common and not specifically made for one individual," he said. "Including correlations will require developing novel techniques to analyze models we develop."

Credit: 
Florida State University

Discovery of protein's 'Achilles heel' paves way for novel class of anti-HIV drugs

image: Panel A shows the normal distribution of CD4 (green) and MHC-I (HLA, stained red) in T-lymphocytes. Panel B shows the change in distribution of these two proteins in a T-lymphocyte also expressing the HIV protein Nef

Image: 
Estela A. Pereira & Luis L. P. da Silva / FMRP-USP

The discovery of a potential “Achilles heel” in Nef, the protein that is crucial to HIV virulence and its capacity to trigger AIDS, paves the way for the development of a new class of drugs against the virus. Researchers have succeeded in demonstrating a structure that binds this protein to another called AP-2 and regulates endocytosis, the process by which cells take in substances such as nutrients or pathogens by engulfing them in a vesicle.

As a result, it is increasingly clear how Nef manages to subvert human cells’ defense mechanisms, enabling HIV to replicate and bringing the symptoms of AIDS closer.

In an article published in Nature Structural & Molecular Biology (NSMB), PhD candidate Yunan Januário and Professor Luis Lamberti Pinto da Silva at the University of São Paulo’s Ribeirão Preto Medical School (FMRP-USP) in Brazil comment on the importance of recent discoveries about this protein.

“Researchers have succeeded in obtaining the three-dimensional structure formed by Nef-CD4 and AP-2, showing the contact surfaces and making possible other studies designed to yield a molecule that occupies this space so as to prevent the effects of the protein from advancing. This ‘photograph’ can serve as the basis for the development of other anti-HIV therapies,” Silva told Agência FAPESP.

With more than 15 years of experience in the field, Silva was invited by NSMB to comment on a study described in the article “Structural basis of CD4 downregulation by HIV-1 Nef”, published in the same issue of the journal, with Yonghwa Kwon, a researcher at the University of Massachusetts (USA), as first author.

“Notably, the study by Kwon et al. provides the structural basis for the cooperative assembly of the AP-2–Nef–CD4 tripartite complex and shows that Nef directly bridges CD4 and AP-2. The study also reveals a structural relationship between Nef-mediated downregulation of CD4 and Nef’s antagonism of major histocompatibility complex I (MHC-I)”, the two Brazilians write in their NSMB article.

According to Silva, Nef was known to be crucial to the advance of HIV as AIDS progresses. In addition, the protein can continue to be produced in the bodies of patients who are undergoing treatment or whose viral load remains too small to be detected. “This has been linked to co-morbidities of the infection. Nef is important, and no drugs exist to target it,” he said.

Since the discovery that AIDS is caused by HIV almost 40 years ago, scientists have learned a great deal about the intricate mechanisms by which the virus attacks the immune system, invading and controlling cells, but they have not yet found a way to block Nef directly. They know Nef is a multifunctional protein with a complex mechanism. It is not part of the structure of HIV, but modifies cells to allow the virus to replicate.

Several classes of antiretroviral drugs block different stages of the HIV multiplication cycle, reduce viral load, and even halt the development of the disease. The most common antiretrovirals include reverse transcriptase inhibitors (nucleoside and non-nucleoside), which disrupt the virus’s use of reverse transcriptase to copy its genetic material and multiply inside defense cells. They also include protease and integrase inhibitors, which block the maturation of new viral particles and the integration of viral genetic material into the host genome, respectively, and entry inhibitors, which stop the virus from invading defense cells by blocking receptors such as CCR5.

All these drugs have led to remarkable progress in terms of reducing AIDS morbidity and mortality, but viral resistance and the drugs’ side effects require a non-stop search for new ways of combating the virus.

Some 38 million people were living with the virus worldwide in 2019, including 1.8 million children aged 14 or younger, according to the Joint United Nations Program on HIV and AIDS (UNAIDS), which also says 67% had access to antiretroviral therapy. In Brazil, the Health Ministry’s statistics show about 866,000 people were living with HIV in 2019.

Other binding pathways

Silva and his research group at FMRP-USP published a study early this year showing how Nef uses another cellular protein, AP-1G2, and the link between the two pathways. The study was supported by FAPESP.

The group analyzed the effects of Nef on host cell endomembranes, describing the mechanisms by which it uses AP-1G2 to send CD4 to lysosomes (cell organelles containing enzymes that break down proteins and other molecules). This removes the receptor from the cell surface and helps virus particles escape from cells to spread the infection.

CD4 is the receptor used by HIV to invade cells. If it remains on the cell surface, viral release is blocked: hence its removal by Nef from infected cells. “We pointed to the common point between these two pathways,” Silva said. “In order to send CD4 and MHC-I to lysosomes, Nef hijacks a third cell protein common to both pathways. These findings can help other groups show exactly how Nef interacts with the third protein to identify a novel target, just as was done with AP-2.”

Silva and his group are currently working on another study, also supported by FAPESP. The goal is to discover other targets of Nef. “Several ingredients are available. Now someone has to come along to bring it all together and obtain this molecule capable of inhibiting Nef,” he said.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

US agricultural water use declining for most crops and livestock production

URBANA, Ill. - Climate change and a growing world population require efficient use of natural resources. Water is a crucial component in food production, and water management strategies are needed to support worldwide changes in food consumption and dietary patterns.

Agricultural production and food manufacturing account for a third of water usage in the U.S. Water use fluctuates with weather patterns but is also affected by shifts in production technology, supply-chain linkages, and domestic and foreign consumer demand.

A comprehensive University of Illinois study looked at water withdrawals in U.S. agriculture and food production from 1995 to 2010. The main trend was a decline in water use, driven by a combination of factors.

"Overall, the use of water for irrigation decreased by 8.3% over this period," says Sandy Dall'erba, regional economist at U of I and co-author on the study.

"However, one needs to identify the drivers of water use by crop as they differ from one commodity to the next, so water-saving strategies for one crop may not be relevant for another one," Dall'erba explains. "For instance, water use in cereal grains, fruits, and vegetables is mostly driven by the efficiency of the irrigation system, domestic per-capita income, and sales to the food processing industry. If irrigation is more efficient, water demand decreases. When demand for fruits and vegetables decreased in 2005-2010 during the financial crisis, so did demand for water."

Oilseed crops, on the other hand, have experienced a 98% increase in water demand over the period. The change is primarily driven by international supply-chain linkages: foreign companies, mostly in China, have purchased large amounts of U.S. oilseed crops for further processing.

"There has also been a shift in consumer demand from red meat to white meat in the U.S. People consume less beef and more chicken, which require 3.5 times less water per pound of production. Those trends in consumption and taste have helped the U.S. reduce water use for livestock by 14%," Dall'erba says.

Dall'erba and co-author Andre Avelino performed a structural decomposition analysis, looking at 18 factors that drive U.S. water withdrawals across eight crops, six livestock categories, and 11 food manufacturing industries.

Based on data from Exiobase, a global supply-chain database, their analysis included water that's embedded in production at all stages of the domestic and international supply chain, from crops and livestock to processed food production - highlighting the interconnectedness of global agribusiness.
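A structural decomposition attributes the change in total water use to changes in each driver while the other drivers are held at average levels. The two-factor sketch below is a deliberately simplified caricature of the paper's 18-factor analysis, with hypothetical numbers.

```python
# Two-factor structural decomposition: water use = intensity x output.
# The base-year and end-year values below are hypothetical.
w0, x0 = 10.0, 100.0  # base year: gallons per unit of output, units of output
w1, x1 = 8.0, 110.0   # end year: irrigation got more efficient, output grew

total_change = w1 * x1 - w0 * x0
# Midpoint (polar) weights make the two effects sum exactly to the total:
intensity_effect = (w1 - w0) * (x0 + x1) / 2  # driven by efficiency gains
output_effect = (x1 - x0) * (w0 + w1) / 2     # driven by demand growth

print(f"total change: {total_change:+.0f}")
print(f"  from intensity: {intensity_effect:+.0f}, from output: {output_effect:+.0f}")
```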

For example, crops produced in the U.S. may rely on fertilizers produced in a different country. Similarly, soybeans produced in the U.S. could be used for food processing in China, or to feed livestock in Europe.

The current U.S.-China trade war is likely to affect these supply-chain linkages, as Chinese imports of oilseeds shift to South America and Europe. The U.S. exported less soybean and pork to China over the last two years; therefore, less water was embedded in those exports. However, the next few years under a new U.S. administration may see an improvement in these relationships, Dall'erba notes.

The COVID-19 pandemic is also likely affecting water usage. Unemployment and economic crises have always impacted consumer demand, and international trade has sharply declined since the pandemic began. The 2008 recession resulted in decreased water usage and similar effects are expected in the current crisis, Dall'erba states.

Traditionally, scientists measuring the amount of water associated with production and the supply chain rely on a worldwide data set called the Water Footprint Network (WFN), which is based on a crop water-use model. However, Dall'erba and Avelino used data from the U.S. Geological Survey (USGS), which are based on observations rather than physical models.

"With the USGS data we find a decrease in the amount of water used, while the WFN data indicated a small increase. The difference is not large, but it's still a big deal, because you would find the wrong trend and reach misleading conclusions if you use the wrong data set," Dall'erba explains.

"This is important information for researchers. If you're in a situation where you have access to data based on observations rather than crop models, you should use those official data, especially because water saving policies are based on this dataset," he notes.

The questions addressed in this research are extremely relevant for any country that is heavy on agricultural production, Dall'erba says. "Namely, how can we feed the 10 billion people we expect to be at the global level by 2080, considering that we cannot necessarily expand the amount of land that's going to be used? And, given climate change, there is quite a lot of uncertainty with respect to the availability of water needed to grow crops and feed livestock in the years to come."

Water management strategies may include farm-level efforts such as increasing efficiency of the irrigation system, switching crops, and growing genetically modified crops.

Other measures may include policies aimed at affecting consumer behavior such as increasing taxes on water-intensive products and supporting ecolabeling, Dall'erba suggests.

Ecolabeling would require food manufacturing companies to report the amounts of water, carbon dioxide emissions, and labor associated with production. That could help consumers make informed choices and potentially shift consumption to less water-intensive products, he concludes.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Trifarotene in moderate acne: No study data for the assessment of the added benefit

Trifarotene is a drug for the external treatment of acne vulgaris of the face and trunk. It is suitable for affected people with many comedones, papules and pustules, i.e. with moderate acne for which systemic therapy is not yet an option. The Federal Joint Committee (G-BA) has now commissioned the German Institute for Quality and Efficiency in Health Care (IQWiG) to investigate whether the drug offers these patients, aged twelve years and older, an added benefit in comparison with combination therapy of adapalene and benzoyl peroxide or of clindamycin and benzoyl peroxide, which is also applied externally.

As the drug manufacturer's dossier did not contain any study data suitable for a direct or indirect comparison with this appropriate comparator therapy, it was concluded that an added benefit of trifarotene is not proven.

Placebo-controlled approval studies

The manufacturer cited two randomized controlled trials from the approval process in which trifarotene was compared with placebo, so they cannot be used for comparison with an established treatment alternative. In addition, the treatment phases of 12 weeks were very short. For a chronic condition like acne vulgaris, a minimum of 24 weeks is needed for a benefit assessment. The dossier contained supplementary information on a 1-year study, but this study had no control arm.

Enough affected people for good comparative studies

According to the manufacturer, there are about one to two million people with moderate acne vulgaris in Germany alone. Katharina Biester from IQWiG's Drug Assessment Department is critical of the fact that, even in such a widespread therapeutic indication, the approval studies were again only carried out in comparison with placebo: "There is a declared political will in Europe to generate evidence for comparisons between drugs. One wonders, therefore, why development programmes for new drugs are not aimed at answering both the questions of the approval and those of the benefit assessment."

G-BA decides on the extent of added benefit

The dossier assessment is part of the early benefit assessment according to the Act on the Reform of the Market for Medicinal Products (AMNOG) supervised by the G-BA. After publication of the dossier assessment, the G-BA conducts a commenting procedure and makes a final decision on the extent of the added benefit.

Credit: 
Institute for Quality and Efficiency in Health Care

Teeth grinding and facial pain increase due to coronavirus stress and anxiety

The stress and anxiety experienced by the general population during Israel's first lockdown brought about a significant rise in orofacial and jaw pain, as well as jaw-clenching in the daytime and teeth-grinding at night, according to a new study from Tel Aviv University (TAU).

The research also found that women suffered from these symptoms more than men, and that 35- to 55-year-olds suffered most.

"We believe that our findings reflect the distress felt by the middle generation, who were cooped up at home with young children, without the usual help from grandparents, while also worrying about their elderly parents, facing financial problems and often required to work from home under trying conditions," the researchers say.

The study was led by Dr. Alona Emodi-Perlman and Prof. Ilana Eli of TAU's Goldschleger School of Dental Medicine at TAU's Sackler Faculty of Medicine. The paper was published in the Journal of Clinical Medicine on October 12, 2020.

The study was based on questionnaires that assessed the presence and possible worsening of these symptoms in the general population during the first COVID-19 lockdown, a period of national emergency and heightened anxiety. The questionnaire was answered by a total of 1,800 respondents in Israel and Poland.

During Israel's first lockdown, the general population exhibited a considerable rise in orofacial pain, as well as jaw-clenching in the daytime and teeth-grinding at night - physical symptoms often caused by stress and anxiety. The prevalence of orofacial pain rose from about 35% pre-pandemic to 47%; the prevalence of jaw-clenching in the daytime rose from about 17% to 32%; and teeth-grinding at night rose from about 10% to 36%. People who had suffered from these symptoms before the pandemic exhibited a rise of about 15% in their severity.

Altogether a rise of 10%-25% was recorded in these symptoms, which often reflect emotional stress.

In addition, comparing findings in Israel to results in Poland, the researchers found that the probability of temporomandibular disorders (TMD) and bruxism was much higher among respondents in Poland.

Credit: 
American Friends of Tel Aviv University

Omega-3s did not reduce cardiac events in recent heart attack survivors

DALLAS, Nov. 15, 2020 -- A daily dose of omega-3 fatty acids did not reduce the risk of cardiac events, including secondary heart attack, stroke, bypass surgery or death, among elderly people who had survived a recent heart attack, according to late-breaking research presented today at the American Heart Association's Scientific Sessions 2020. The virtual meeting is Friday, November 13 - Tuesday, November 17, 2020, and is a premier global exchange of the latest scientific advancements, research and evidence-based clinical practice updates in cardiovascular science for health care worldwide. The manuscript of this study is simultaneously published today in Circulation, journal of the American Heart Association.

Some studies have found that a high intake of fish-oil-derived omega-3 fatty acids is associated with reduced risk of cardiovascular events. However, some recent studies have shown the therapy to have no effect on overall cardiovascular health. The OMEMI Trial (OMega-3 fatty acids in Elderly patients with Myocardial Infarction) was conducted in Norway and investigated whether adding 1.8 grams of omega-3 fatty acids to standard treatment after a heart attack in elderly patients prevented new cardiac events.

"This study focused on a particularly vulnerable patient group, in this case elderly patients with recent cardiovascular disease and a high load of risk factors, where the effects of preventive measures are usually the most prominent," said Are A. Kalstad, M.D., a principal investigator of the study, and a researcher at the Center for Clinical Research at Oslo University Hospital in Oslo, Norway. "The fact that no indication of any impact from the omega-3 fatty acids were found in this group, along with the results of other recent neutral trials, suggests that omega-3 supplements are ineffective for cardiovascular prevention."

The study included more than 1,000 patients (ages 70 to 82, mostly white, 29% women) who had all been hospitalized due to a heart attack within the previous two months. Most participants (97%) were taking cholesterol-lowering medications (statins), and most (86%) were on two blood thinner medications. Researchers randomly assigned half of the participants to receive an omega-3 fatty acids oral supplement daily and half to receive a placebo capsule of corn oil daily.

After 2 years of follow-up, researchers analyzed the number of cardiac events in each group, including heart attacks, strokes, revascularization (bypass surgery or angioplasty), hospitalizations for heart failure or deaths. The data indicates no difference in the rate of cardiac events between the two groups - about 20% for both groups.

Additionally, the researchers found no statistical differences between the omega-3 group and the patients taking the placebo when they examined subgroups of patients divided by age, gender, diabetes status, kidney function and triglyceride levels. Furthermore, a secondary endpoint of the study was new atrial fibrillation, which occurred in almost twice as many participants in the omega-3 group as in the placebo group (7.2% vs. 4.0%, respectively), although the numbers were too small to reach statistical significance.

Credit: 
American Heart Association

From the inside out - how the brain forms sensory memories

image: In the top-most layer of the neocortex, memory-related information is relayed by synapses from the thalamus (orange), which are in turn controlled by local 'gatekeeper' neurons (blue).

Image: 
MPI for Brain Research

The brain encodes information collected by our senses. However, to perceive our environment and to constructively interact with it, these sensory signals need to be interpreted in the context of our previous experiences and current aims. In the latest issue of Science, a team of scientists led by Dr. Johannes Letzkus, Research Group Leader at the Max Planck Institute for Brain Research, has identified a key source of this experience-dependent top-down information.

The neocortex is the largest area of the human brain. It has expanded and differentiated enormously during mammalian evolution, and is thought to mediate many of the capacities that distinguish humans from their closest relatives. Moreover, dysfunctions of this area also play a central role in many psychiatric disorders. All higher cognitive functions of the neocortex are enabled by bringing together two distinct streams of information: a 'bottom-up' stream carrying signals from the surrounding environment, and a 'top-down' stream that transmits internally-generated information encoding our previous experiences and current aims.

"Decades of investigation have elucidated how sensory inputs from the environment are processed. However, our knowledge of internally-generated information is still in its infancy. This is one of the biggest gaps in our understanding of higher brain functions like sensory perception," says Letzkus. This motivated the team to search for the sources of these top-down signals. "Previous work by us and many other scientists had suggested that the top-most layer of neocortex is likely a key site that receives inputs carrying top-down information. Taking this as a starting point allowed us to identify a region of the thalamus - a brain area embedded deep within the forebrain - as a key candidate source of such internal information."

Motivated by these observations Dr. M. Belén Pardi, the first author of the study and postdoctoral researcher in the Letzkus lab, devised an innovative approach that enabled her to measure the responses of single thalamic synapses in mouse neocortex before and after a learning paradigm. "The results were very clear," Pardi remembers. "Whereas neutral stimuli without relevance were encoded by small and transient responses in this pathway, learning strongly boosted their activity and made the signals both faster and more sustained over time." This suggests that the thalamic synapses in neocortex encode the previous experience of the animal. "We were really convinced that this is the case when we compared the strength of the acquired memory with the change in thalamic activity: This revealed a strong positive correlation, indicating that inputs from the thalamus prominently encode the learned behavioral relevance of stimuli," says Letzkus.

But is this mechanism selective for these top-down memory-related signals? Sensory stimuli can be relevant because of what we have learned to associate with them, but also merely due to their physical properties. For instance, the louder sounds are, the more readily they recruit attention in both humans and animals. However, this is a low-level function that has little to do with previous experience. "Intriguingly, we found very different, indeed opposite, encoding mechanisms for this bottom-up form of relevance," says Pardi.

Given their central importance, the scientists speculated that the way these signals are received in the neocortex must be tightly regulated. Pardi and co-workers addressed this in further experiments, combined with computational modeling in collaboration with the laboratory of Dr. Henning Sprekeler and his team at Technische Universität Berlin. The results indeed identified a previously unknown mechanism that can finely tune the information along this pathway, identifying a specialized type of neuron in the top-most layer of neocortex as a dynamic gatekeeper of these top-down signals.

"These results reveal the thalamic inputs to sensory neocortex as a key source of information about the past experiences that have been associated with sensory stimuli. Such top-down signals are perturbed in a number of brain disorders like autism and schizophrenia, and our hope is that the present findings will also enable a deeper understanding of the maladaptive changes that underlie these severe conditions," concludes Letzkus.

Credit: 
Max-Planck-Gesellschaft

Identification of the SARS-CoV-2 virus features causing COVID-19 using primate model

image: Macaques were exposed to a high titre of SARS-CoV-2 via combined transmission routes. In their pulmonary lesions, acute interstitial pneumonia with endotheliitis (mononuclear and PMN leukocyte infiltrates within the intima along the lumen of many vessels, vessel oedema, and focal haemorrhage) was observed.

Image: 
Korea Research Institute of Bioscience and Biotechnology (KRIBB)

Features of the SARS-CoV-2 virus causing COVID-19, which could be useful for developing vaccines and treatment strategies, were identified using a nonhuman primate model developed at the Korea Research Institute of Bioscience and Biotechnology (KRIBB).

The work was initiated in February this year by the research team led by Dr. Jung Joo Hong at the KRIBB National Primate Research Center, and resulted in the successful development of a nonhuman primate model of COVID-19 infection, the fourth model reported worldwide, following China, the Netherlands, and the US. The results of the study were part of a larger research project aiming to identify key features of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus causing COVID-19, and to test the efficacies of COVID-19 vaccines and treatments using the primate model.

The primate study investigated vascular abnormalities caused by the infection, the reasons COVID-19 can be fatal, particularly in immunocompromised patients, the sites where SARS-CoV-2 multiplies inside the body, and the time course of the infection.

The research team showed, for the first time, that SARS-CoV-2 caused vascular inflammation and that the endotheliitis persisted even 3 days after the infection. Further, they confirmed immunosuppression, of the kind typically observed in patients with immunodeficiency, during the first 2 days after infection, when the viral load increased precipitously.

This study was featured on the cover of the Journal of Infectious Diseases, a world-class academic journal in the field of infectious diseases. The issue's online edition became available on August 3, 2020, and the article will be printed in the November 15 issue.

The research team observed that the virus multiplied rapidly in the upper and lower respiratory tracts of the experimental primates in the first 2 days after the viral infection. Subsequently, the viral load decreased quickly, and viral activity was no longer detected 7 days after the infection.

These findings are expected to provide novel insights regarding the diagnostic challenge of false positive tests, i.e., a positive result on the reverse transcriptase polymerase chain reaction (RT-PCR) test for an asymptomatic person.

Credit: 
National Research Council of Science & Technology

Ticagrelor was not superior to clopidogrel to reduce heart attack risk during angioplasty

DALLAS, Nov. 14, 2020 -- The use of the more potent antiplatelet medication ticagrelor was not superior to clopidogrel in the reduction of the rate of heart attack or severe complications among people undergoing an elective procedure to open a blocked artery, according to late-breaking research presented today at the American Heart Association's Scientific Sessions 2020. The virtual meeting is Friday, November 13-Tuesday, November 17, 2020, and is a premier global exchange of the latest scientific advancements, research and evidence-based clinical practice updates in cardiovascular science for health care worldwide. The manuscript of this study is simultaneously published today in Lancet.

The randomized, international, multi-center ALPHEUS trial enrolled 1,833 adults (average age 67, mostly male) at 49 centers in two European countries (France and the Czech Republic). The participants had heart disease and elected to undergo percutaneous coronary intervention (also known as PCI or angioplasty) to open a narrow or blocked blood vessel supplying blood to the heart.

The trial examined whether ticagrelor, a more potent antiplatelet medication than the standard of care (clopidogrel), could reduce the rate of heart attack and complications around the time of the procedure for patients considered high-risk due to at least one additional underlying condition, such as renal insufficiency, diabetes, overweight, prior acute coronary syndrome or heart failure, or because of a high-risk procedure (multiple stents needed, bifurcation stenting, left main, etc.).

"While ticagrelor has been shown in previous studies to be beneficial in reducing blood clots among people who experienced a sudden blockage in a coronary artery, it has never been evaluated in patients undergoing elective percutaneous coronary intervention with more stable heart conditions," said Johanne Silvain, M.D., Ph.D., lead study author, and an interventional cardiologist and director of the Coronary Intensive Care Unit at the Institut de Cardiologie of the Hospital Pitié-Salpêtrière in Paris.

Percutaneous coronary intervention (PCI) is a common, non-surgical procedure that uses an inflatable balloon to open a blocked heart artery. Artery-clogging plaque is pushed against the blood vessel wall to restore blood flow through the artery. A small mesh tube called a stent is sometimes placed in the artery to keep it open.

The risk of severe complications, including stent clots, stroke and death, during percutaneous coronary intervention is less than 1% (the rate was 0.5% in the ALPHEUS trial). However, some patients experience heart attacks provoked by the PCI procedure itself.

In this study, a control group of patients received clopidogrel plus aspirin before PCI and for 30 days after the procedure. The experimental group received ticagrelor plus aspirin before PCI and for 30 days after.

The study, which was conducted over more than three years, found:

The rate of periprocedural heart attack and severe complications, including stent clots, stroke and death, was similar between the control group and the experimental group.

There was no increase in excessive bleeding 48 hours after PCI among the patients treated with ticagrelor compared to the control group.

"These results show there is no scientific data to support a change in the standard of care," Silvain said. "The use of ticagrelor should be reserved for treating patients who experienced a sudden blockage in a coronary artery (acute coronary syndrome), not those who have stable heart conditions and elect PCI."

Study limitations include that "it was an open-label trial, and myocardial infarction and injury were used as the hard clinical endpoints, which are rare in elective PCI. Patients under chronic clopidogrel therapy were included, and all troponin assays were allowed, to reflect real life, but this may have brought heterogeneity," he explained.

Credit: 
American Heart Association

Rivaroxaban may be as effective as warfarin for bioprosthetic mitral valves, AF

DALLAS, Nov. 14, 2020 -- The primary results from the RIVER Trial, Rivaroxaban for Valvular Heart disease and Atrial Fibrillation, show that rivaroxaban is comparable to warfarin, which is currently the standard of care anticoagulant prescribed for patients with bioprosthetic mitral valves, according to late-breaking research presented today at the American Heart Association's Scientific Sessions 2020. The meeting is virtual, Friday, November 13-Tuesday, November 17, 2020, and is a premier global exchange of the latest scientific advancements, research and evidence-based clinical practice updates in cardiovascular science for health care worldwide.

Researchers say clinicians have used direct oral anticoagulants (DOACs) such as rivaroxaban off-label for patients with bioprosthetic mitral valves who have atrial fibrillation or flutter.

The RIVER trial followed, for 12 months, 1,005 patients from 49 sites in Brazil who had a bioprosthetic mitral valve and atrial fibrillation or flutter. Patients were randomized to rivaroxaban 20 mg once daily or the vitamin K antagonist anticoagulant warfarin (dose-adjusted to an international normalized ratio, or INR, between 2.0 and 3.0). The primary endpoint was a composite of death, major cardiovascular events (stroke, transient ischemic attack, systemic embolism, valve thrombosis or hospitalization for heart failure) or major bleeding over 12 months.

"This is the largest trial designed to evaluate the safety and efficacy of direct oral anticoagulants in patients with bioprosthetic mitral valves and atrial fibrillation or flutter. Earlier trials of direct oral anticoagulants vs. warfarin for atrial fibrillation or flutter together included fewer than 200 patients with bioprosthetic mitral valves," said lead study author Otavio Berwanger, M.D., Ph. D., a cardiologist and epidemiologist, and the director of the Research Institute Hcor, Heart Hospital (Hospital de Coracao) in Sao Paulo, Brazil.

Patients who received rivaroxaban had an average of almost one year (347.5 days) free from the primary endpoint, similar to those treated with warfarin (340.1 days).

"Additionally, our confidence interval likely excluded an effect size larger than 1.4 days free from events favoring warfarin, clearly demonstrating the non-inferiority effect of rivaroxaban in this clinical setting," Berwanger said. "There was a similarly nonsignificant numeric trend favoring rivaroxaban for most secondary endpoints, however, the incidence of stroke was lower in the rivaroxaban than in the warfarin group. Importantly, safety events such as valve thrombosis, major/non-major clinically relevant and total bleeding were not statistically different between the rivaroxaban and warfarin groups."

A subgroup analysis of the 18.8% of RIVER patients whose bioprosthetic mitral valve had been implanted within the prior three months showed a mean of 35.1 additional days free of the primary outcome for rivaroxaban compared to patients treated with warfarin.

"The results from the RIVER Trial are consistent with previous research, including ROCKET and other pivotal trials of DOACs, and can inform clinical practice for patients with bioprosthetic mitral valves," Berwanger said. "For a subgroup of patients with a mitral valve replacement within the last three months, rivaroxaban was statistically and clinically superior to warfarin."

The main study limitations are the open-label design, and the fact that the findings cannot be extrapolated to patients with a bioprosthetic valve in the aortic position, or to patients with mitral stenosis or mechanical valves. Trials in these populations are ongoing.

Credit: 
American Heart Association

New insights can foster development of natural and safer fungicides

image: Isolating a fungal sample from a colony

Image: 
Sylvain Dubey

The agricultural industry relies heavily on chemical fungicides to protect crops. Many of these products have a detrimental effect on human and animal health. As a result, some of the most effective fungicides have been banned in a number of regions, especially in Europe.

"Despite efforts to develop natural protection, very few efficient solutions have been commercialized," explained Sylvain Dubey, a plant pathologist based in Switzerland. "And there are still no effective environmentally friendly methods of combatting the fungal pathogens that infect large groups of crops."

Dubey and colleagues were inspired by earlier research in which they showed that compounds with antifungal activities can be found in organisms ranging from prokaryotes to eukaryotes such as plants, including oil mixtures containing terpenes, phenols and alcohols, as well as glycosides, alkaloids, and the recently discovered glucosinolate derivatives. These latter compounds are primarily present in the Brassicaceae family.

"Because of these results, we investigated the fungitoxic activity of natural isothiocyanate derivatives of glucosinolates as well as semi-synthetic derivatives and their synergy."

In a recent study published in PhytoFrontiers journal, they confirmed that 13 natural and semi-synthetic glucosinolate derivatives are efficient fungicides alone or when used in combination against widespread genetically distant species of fungal plant pathogens. Combinations of these compounds showed strong synergistic fungitoxic effects.

"Interestingly, physico-chemical characteristics of fungitoxic glucosinolate derivatives differ from those showing no activities or from those that are known for their insecticidal or insect attractive properties," explained Dubey. "Indeed, fungitoxic compounds exhibit significantly higher bioaccumulation potential and are more lipophilic than compounds with no fungitoxic activities. It is therefore possible to anticipate the potential function of glucosinolate derivatives based on their physico-chemical properties and the differences in the way compounds will interact with the surrounding environment and hence organisms."

Dubey and colleagues hope that their findings will pave the way for parties looking to develop environmentally friendly methods of combating fungal pathogens. Their research mirrors other research that has shown that biological compounds can be used to replace environmentally damaging pesticides. For more information, read "Isothiocyanate Derivatives of Glucosinolates as Efficient Natural Fungicides" published in PhytoFrontiers journal.

Credit: 
American Phytopathological Society