
Among effective antihypertensive drugs, less popular choice is slightly safer

NEW YORK, NY (July 26, 2021)--Two types of drugs that are recommended as a first treatment for patients with high blood pressure were found equally effective in improving cardiovascular outcomes, but the more popular type causes slightly more side effects, finds a multinational observational study led by researchers at Columbia University Vagelos College of Physicians and Surgeons.

The study, which analyzed claims and electronic health data from millions of patients worldwide, is the largest to compare the safety and efficacy of angiotensin-converting enzyme (ACE) inhibitors and angiotensin receptor blockers (ARBs), two commonly prescribed antihypertensive drugs.

"Physicians in the United States and Europe overwhelmingly prescribe ACE inhibitors, simply because the drugs have been around longer and tend to be less expensive than ARBs," says George Hripcsak, MD, the Vivian Beaumont Allen Professor and chair of biomedical informatics at Columbia University Vagelos College of Physicians and Surgeons and senior author of the study.

"But our study shows that ARBs are associated with fewer side effects than ACE inhibitors. The study focused on first-time users of these drugs. If you're just starting drug therapy for hypertension, you might consider trying an ARB first. If you're already taking an ACE inhibitor and you're not having any side effects, there is nothing that we found that would indicate a need for a change."

The study was published online in Hypertension.

Narrowing Down Choices

Once a physician decides to prescribe medication to control a patient's high blood pressure, the next decision--which one to choose--is complicated.

"U.S. and European hypertension guidelines list 30 medications from five different drug classes as possible choices, yet there are very few head-to-head studies to help physicians determine which ones are better," Hripcsak says. "In our research, we are trying to fill in this information gap with real-world observational data."

ACE inhibitors and ARBs are among the choices, and they have a similar mechanism of action. Both reduce the risk of stroke and heart attacks, though it's known that ACE inhibitors are associated with increased risk of cough and angioedema (severe swelling in the face and airways).

"We wanted to see if there were any surprises--were both drug classes equally effective, and were ARBs producing any unexpected side effects when used in the real world?" Hripcsak says. "We're unlikely to see head-to-head clinical trials comparing the two since we are reasonably sure that both are effective."

Electronic Health Records Provide Answer

The researchers instead turned to large databases to answer their questions. They analyzed insurance claims and electronic health records from approximately 3 million patients in Europe, Korea, and the United States who were starting antihypertensive treatment with either an ACE inhibitor or an ARB.

Data from electronic health records and insurance claims are challenging to use in research. They can be inaccurate, incomplete, and contain information that biases the results. So the researchers employed a variety of cutting-edge mathematical techniques developed by the Observational Health Data Sciences and Informatics (OHDSI) collaborative network to dramatically reduce bias and balance the two treatment groups as if they had been enrolled in a prospective study.
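As a rough illustration of one technique commonly used to balance treatment groups in observational data, the sketch below applies inverse probability of treatment weighting (IPTW) to a synthetic cohort. The data, the one-covariate propensity model, and all numbers are invented for illustration; this is not the study's actual OHDSI pipeline.

```python
# Illustrative IPTW sketch on made-up data (not the study's methods):
# fit a propensity model, weight patients by the inverse probability of
# the treatment they received, and check that covariate balance improves.
import math
import random

random.seed(0)

# Synthetic cohort: older patients are more likely to receive drug A.
n = 2000
age = [random.gauss(60, 10) for _ in range(n)]
x = [(a - 60) / 10 for a in age]                      # standardized age
treat = [1 if random.random() < 1 / (1 + math.exp(-2 * xi)) else 0
         for xi in x]

def smd(cov, t, w=None):
    """Standardized mean difference of a covariate between groups."""
    w = w or [1.0] * len(cov)
    def group(val):
        idx = [i for i in range(len(cov)) if t[i] == val]
        tot = sum(w[i] for i in idx)
        m = sum(w[i] * cov[i] for i in idx) / tot
        v = sum(w[i] * (cov[i] - m) ** 2 for i in idx) / tot
        return m, v
    m1, v1 = group(1)
    m0, v0 = group(0)
    return abs(m1 - m0) / math.sqrt((v1 + v0) / 2)

# Fit a one-covariate logistic propensity model by gradient ascent.
b0 = b1 = 0.0
for _ in range(300):
    g0 = g1 = 0.0
    for xi, ti in zip(x, treat):
        p = 1 / (1 + math.exp(-(b0 + b1 * xi)))
        g0 += ti - p
        g1 += (ti - p) * xi
    b0 += g0 / n
    b1 += g1 / n

# Inverse-probability weights, truncated to limit extreme values.
weights = []
for xi, ti in zip(x, treat):
    p = 1 / (1 + math.exp(-(b0 + b1 * xi)))
    weights.append(min(1 / p if ti == 1 else 1 / (1 - p), 10.0))

before, after = smd(x, treat), smd(x, treat, weights)
print(f"age imbalance before: {before:.2f}, after weighting: {after:.2f}")
```

After weighting, the age imbalance between the two synthetic treatment groups shrinks substantially, which is the sense in which such methods make observational groups resemble a prospectively randomized cohort.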

Using this approach, the researchers tracked four cardiovascular outcomes--heart attack, heart failure, stroke, and sudden cardiac death--and 51 adverse events in patients after they started antihypertensive treatment.

The researchers found that the vast majority of patients--2.3 million--were prescribed an ACE inhibitor. There were no significant differences between the two drug classes in reducing the risk of major cardiovascular complications in people with hypertension. Patients taking ACE inhibitors had a higher risk of cough and angioedema, and the study also found they had a slightly higher risk of pancreatitis and gastrointestinal bleeding.

"Our study largely confirmed that both antihypertensive drug classes are similarly effective, though ARBs may be a little safer than ACE inhibitors," Hripcsak says. "This provides that extra bit of evidence that may make physicians feel more comfortable about prescribing ARBs versus ACE inhibitors when initiating monotherapy for patients with hypertension. And it shows that large-scale observational studies such as this can offer important insight in choosing among different treatment options in the absence of large randomized clinical trials."

Credit: 
Columbia University Irving Medical Center

New research identifies cancer types with little survival improvement in adolescents and young adults

Survival rates for adolescents and young adults diagnosed with cancer have varied considerably depending on cancer type. A new study indicates that survival for multiple cancer types in such patients has improved in recent years, but some patients diagnosed with common cancer types still show limited survival improvements. The results are published by Wiley early online in CANCER, a peer-reviewed journal of the American Cancer Society.

For the study, investigators at the National Cancer Institute analyzed survival trends related to cancers with the highest mortality rates in adolescents and young adults. Relying on information from the Surveillance, Epidemiology, and End Results (SEER) cancer registry and the National Center for Health Statistics, the team focused on incidence, mortality, and survival rates for nine cancer types from 1975 to 2016. By examining survival rates over time among adolescents and young adults with the most lethal forms of cancer, they were able to identify those cancers with the greatest need for future research.

The investigators uncovered significant improvements in 5-year survival for young patients with brain and other nervous system tumors, colon and rectum cancer, lung and bronchus cancer, acute myelogenous leukemia, and Non-Hodgkin lymphoma. Limited or no improvement in survival was found for those with female breast cancer, cervical cancer, ovarian cancer, and bone/joint sarcomas.

For female breast cancer: 5-year relative survival increased from 1985 through 2007, while mortality declined from 1986 to 2012 and has increased since 2012.

For cervical cancer: 5-year relative survival remained steady from 1975 through 2011. There was a declining incidence rate and a flat mortality rate after 2005.

For ovarian cancer: 5-year relative survival rose slightly for the whole period. Incidence and mortality rates dropped between 1993 and 1996.

For bone/joint sarcomas: 5-year relative survival increased from 1975 to 1989 before leveling off.

"As the SEER cancer data collection expands over time, including more years of diagnosis, we are able to piece together a larger part of the cancer survival story for the adolescent and young adult population in the United States," said lead author Denise Riedel Lewis, PhD, MPH. "These results will help refocus our research efforts on adolescent and young adult cancer survivorship."

Credit: 
Wiley

'Feel good' brain messenger can be willfully controlled, new study reveals

image: UC San Diego researchers and their colleagues have discovered that spontaneous impulses of dopamine, the neurological messenger known as the brain's "feel good" chemical, occur in the brains of mice.

Image: 
Julia Kuhl

From the thrill of hearing an ice cream truck approaching to the spikes of pleasure while sipping a fine wine, the neurological messenger known as dopamine has been popularly described as the brain's "feel good" chemical related to reward and pleasure.

A ubiquitous neurotransmitter that carries signals between brain cells, dopamine, among its many functions, is involved in multiple aspects of cognitive processing. The chemical messenger has been extensively studied from the perspective of external cues, or "deterministic" signals. Instead, University of California San Diego researchers recently set out to investigate less understood aspects related to spontaneous impulses of dopamine. Their results, published July 23 in the journal Current Biology, have shown that mice can willfully manipulate these random dopamine pulses.

Research led by UC San Diego graduate student Conrad Foo found that the neocortex in mice is flooded with unpredictable impulses of dopamine that occur approximately once per minute, rather than only when the animals are presented with pleasurable or reward-based stimuli.

Working with colleagues at UC San Diego (Department of Physics and Section of Neurobiology) and the Icahn School of Medicine at Mount Sinai in New York, Foo investigated whether mice are in fact aware that these impulses--documented in the lab through molecular and optical imaging techniques--are actually occurring. The researchers devised a feedback scheme in which mice on a treadmill received a reward if they showed they were able to control the impromptu dopamine signals. The data revealed that mice were not only aware of these dopamine impulses but also learned to anticipate and volitionally act upon a portion of them.

"Critically, mice learned to reliably elicit (dopamine) impulses prior to receiving a reward," the researchers note in the paper. "These effects reversed when the reward was removed. We posit that spontaneous dopamine impulses may serve as a salient cognitive event in behavioral planning."

The researchers say the study opens a new dimension in the study of dopamine and brain dynamics. They now intend to extend this research to explore if and how unpredictable dopamine events drive foraging, which is essential to seeking sustenance, finding a mate and, as a social behavior, colonizing new home bases.

"We further conjecture that an animal's sense of spontaneous dopamine impulses may motivate it to search and forage in the absence of known reward-predictive stimuli," the researchers noted.

In their efforts to control dopamine, the researchers clarified that dopamine appears to invigorate, rather than initiate, motor behavior.

"This started as a serendipitous finding by a talented, and curious, graduate student with intellectual support from a wonderful group of colleagues," said study senior co-author David Kleinfeld, a professor in the Department of Physics (Division of Physical Sciences) and Section of Neurobiology (Division of Biological Sciences). "As an unanticipated result, we spent many long days expanding on the original study and of course performing control experiments to verify the claims. These led to the current conclusions."

Credit: 
University of California - San Diego

New understanding of cell stability with potential to improve immune cell therapies

Research in mice, published today in Science Immunology by researchers at the Babraham Institute, UK and VIB-KU Leuven, Belgium, provides two solutions with potential to overcome a key clinical limitation of immune cell therapies. Regulatory T cells have potential in treating autoimmunity and inflammatory diseases yet they can switch from a protective to damaging function. By identifying the unstable regulatory T cells, and understanding how they can be purged from a cell population, the authors highlight a path forward for regulatory T cell transfer therapy.

Cell therapy is based on purifying cells from a patient, growing them up in cell culture to improve their properties, and then reinfusing them into the patient. Professor Adrian Liston, Immunology group leader at the Babraham Institute, explained their therapeutic potential: "The leading use of cell therapy is to improve T cells so that they can attack and kill a patient's cancer, however the incredible versatility of the immune system means that, in principle, we could treat almost any immune disorder with the right cell type. Regulatory T cells are particularly promising, with their ability to shut down autoimmune disease, inflammatory disease and transplantation rejection. A key limitation in their clinical use, however, comes from the instability of regulatory T cells - we just can't use them in cell therapy until we can ensure that they stay protective".

T cells come in a large variety of types, each with unique functions in our immune system. "While most T cells are inflammatory, ready to attack pathogens or infected cells, regulatory T cells are potent anti-inflammatory mediators", Professor Susan Schlenner, University of Leuven, explains. "Unfortunately this cell type is not entirely stable, and sometimes regulatory T cells convert into inflammatory cells, called effector T cells. Crucially, the converted cells inherit both inflammatory behaviour and the ability to identify our own cells, and so pose a significant risk of damage to the system they are meant to protect."

The first key finding of this research shows that once regulatory T cells switch to becoming inflammatory, they are resistant to returning to their useful former state. Therefore, scientists need to find a way to remove the risky cells from any therapeutic cell populations, leaving behind the stable regulatory T cells.

By comparing stable and unstable cells the researchers identified molecular markers that indicate which cells are at risk of switching from regulatory to inflammatory. These markers can be used to purify cell populations before they are used as a treatment.

In addition to this method of cell purification, the researchers found that exposing regulatory T cells to a destabilising environment purges the unstable cells from the mixture. Under these conditions, the unstable cells are triggered to convert into inflammatory cells, allowing the researchers to purify the stable cells that are left. "The work needs to be translated into human cell therapies, but it suggests that we might be best off treating the cells mean", says Professor Adrian Liston. "Currently, cell culture conditions for cell therapy aim to keep all the cells in optimal conditions, which may actually be masking the unstable cells. By treating the cultures rougher, we may be able to identify and eliminate the unstable cells and create a safer mix of cells for therapeutic transfer."

Dr Steffie Junius, lead author on the paper who undertook the research as a PhD student at the University of Leuven, commented: "The next stage in the research is to take the lessons learned in mice and translate them into optimal protocols for patients. I hope that our research contributes to the improved design and allows the development of effective regulatory T cell therapy."

Establishing a thorough process to improve cell population stability in mice helps to lay the groundwork for improved immune cell therapies in humans, although the methods described in this work would require validation in humans before they were used in cell therapy trials. Dr Timothy Newton, CEO of Reflection Therapeutics, a Babraham Research Campus-based company designing cell therapies against neuro-inflammation, who was not involved in this study, commented on the translational potential of the study: "This research makes a significant impact on regulatory T cell therapeutic development by characterising unstable subsets of regulatory T cells that are likely to lose their desirable therapeutic qualities and become pro-inflammatory. The successful identification of these cells is of great importance when designing manufacturing strategies required to turn potential T cell therapeutics into practical treatments for patients of a wide range of inflammatory disorders."

Credit: 
Babraham Institute

Chemotherapy can induce mutations that lead to pediatric leukemia relapse

image: From left: Samuel Brady, Ph.D., and Jinghui Zhang, Ph.D., chair, both of Computational Biology, contributed to research that provides the first direct genomic and experimental evidence in pediatric cancer that drug-resistant mutations can be induced by chemotherapy.

Image: 
St. Jude Children's Research Hospital

Chemotherapy has helped make acute lymphoblastic leukemia (ALL) one of the most survivable childhood cancers. Now, researchers working in the U.S., Germany and China have shown how chemotherapy drugs called thiopurines can lead to mutations that set patients up for relapse. The work appears today in the journal Nature Cancer.

The research provides the first direct genomic and experimental evidence in pediatric cancer that drug-resistant mutations can be induced by chemotherapy and are not always present at diagnosis.

"The findings offer a paradigm shift in understanding how drug resistance develops," said Jinghui Zhang, Ph.D., Department of Computational Biology chair at St. Jude Children's Research Hospital. "The results also suggest possible treatment strategies for ALL patients who relapse, including screening to identify those who should avoid additional thiopurine treatment."

Zhang is co-corresponding author of the study with Bin-Bing Zhou, Ph.D., of Shanghai Children's Medical Center; and Renate Kirschner-Schwabe, M.D., of Charite-Universitaetsmedizin Berlin.

The roots of relapse

While 94% of St. Jude patients with ALL become five-year survivors, relapse remains the leading cause of death worldwide for children and adolescents with ALL.

This study involved ALL samples collected from relapsed pediatric ALL patients in the U.S., China and Germany. Researchers analyzed more than 1,000 samples collected from the patients at different times in treatment, including samples from 181 patients collected at diagnosis, remission and relapse.

Co-first author Samuel Brady, Ph.D., of St. Jude Computational Biology, identified a mutational signature that helped decipher the process. Mutational signatures reflect the history of genetic changes in cells.

Brady and his colleagues linked increased thiopurine-induced mutations to genes such as MSH2 that become mutated in leukemia. The mutations inactivated a DNA repair process called mismatch repair and rendered ALL resistant to thiopurines. The combination fueled a 10-fold increase in ALL mutations, including an alteration in the tumor suppressor gene TP53. The mutation, TP53 R248Q, promoted resistance to multiple chemotherapy drugs, including vincristine, daunorubicin and cytarabine.

Working in two cell lines in the laboratory, Zhou and his colleagues replicated the thiopurine-induced TP53 mutations and chemotherapy resistance. The research provided the first direct genomic and experimental evidence of chemotherapy-induced drug resistance mutations. "This study not only changes our ALL treatment considerations, but also opens the door to study mechanistically how defective repair generates drug-resistant mutations," Zhou said.

Chemotherapy's role in relapse

Researchers estimate that treatment-induced mutations play a role in 25% of pediatric ALL relapses. Eight percent of patients in this study had evidence of the thiopurine-associated mismatch-repair signature.

"In the future, it may be possible to monitor bone marrow during treatment as a way to detect these mutational signatures early enough to help identify at-risk patients who may be candidates for emerging therapies like CAR-T cells," Zhang said. But the researchers stressed that the benefits of thiopurine treatment outweigh the risks, noting that most patients are unaffected by thiopurine-induced mutations.

Credit: 
St. Jude Children's Research Hospital

Water resources: Defusing conflict, promoting cooperation

image: Mega-dam on the Omo River: Gibe III (2016).

Image: 
Mimi Abebayehu/Wikimedia Commons

Rivers are lifelines for many countries. They create valuable ecosystems, provide drinking water for people and raw water for agriculture and industry. In the Global South in particular, there is strong competition for access to freshwater resources. The increasing use of hydropower has recently intensified this competition further.

Take Ethiopia, for example: when the country began filling the mega-dam Gibe III on the Omo River in 2015, downstream users saw a drop in water volumes. Natural flooding declined, reducing the volume of fertile mud washed onto the floodplain. The level of Kenya's Lake Turkana, into which the Omo flows, fell temporarily by two metres, resulting in significant consequences for people and agriculture.

Addressing the nexus

The network of interactions between water, energy, food and ecosystems - referred to by experts as the "water-energy-food (WEF) nexus" - often leads to wide-ranging disputes in the catchment areas of transboundary rivers. Large-scale infrastructure construction projects such as dams and irrigation schemes have caused political tensions between neighbouring states at various points in the past.

An international research team led by ETH Zurich has now developed a strategic toolkit that can help to defuse such conflicts over water use, through an objective analysis of stakeholders' interests. In the EU's Horizon 2020 project DAFNE, 14 research partners from Europe and Africa worked together to find approaches to a more equitable management of water resources.

"We wanted to show how it is possible to sustainably manage the nexus between water, energy, food and ecosystems, even in large and transboundary river basins with a wide range of users," says Paolo Burlando, Professor of Hydrology and Water Resources Management at ETH Zurich.

Integrating and balancing different interests

While it is now recognised that watershed planning should take a holistic approach that respects the needs of all stakeholders, multidimensional decision-making problems with significant numbers of stakeholders make it difficult to negotiate generally accepted solutions.

"Conventional planning tools are usually overwhelmed by challenges such as these," explains Burlando, who has led the DAFNE consortium for the past four years. This is why the project team developed a novel method to map and quantify trade-offs in the WEF nexus.

The approach is based on the principles of the participatory and integrated planning and management of water resources, which focuses on the role and interests of stakeholders. The DAFNE methodology is designed to engage stakeholders and find compromises and synergies in a joint approach. "The key is to find solutions that benefit everyone, take the environment into account and also make economic sense," explains Burlando.

Enabling dialogue through models

DAFNE uses state-of-the-art modelling techniques and digital solutions to enable participatory planning. A strategic decision tool allows the social, economic and environmental consequences of interventions to be assessed in a quantitative approach, enabling users to identify viable development pathways. Stakeholder-selected pathways are simulated in detail using a hydrological model driven by high-resolution climate scenarios, in order to accurately analyse the impact on the respective water resources. Additional sub-models can be used to model other aspects of the nexus. Finally, a visualisation tool helps to illustrate interrelationships and assess problems from various user perspectives.
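The core idea behind quantifying trade-offs between competing water uses can be sketched with a toy Pareto filter: score candidate plans on several objectives and keep only those that no other plan beats on every objective. The plan names and scores below are hypothetical and are not DAFNE's actual model.

```python
# Toy trade-off analysis (illustrative only, not the DAFNE tool):
# keep the Pareto-efficient water-allocation plans.
plans = {
    "max_power": {"hydropower": 9.0, "env_flow": 2.0},
    "balanced":  {"hydropower": 6.5, "env_flow": 6.0},
    "max_env":   {"hydropower": 3.0, "env_flow": 9.0},
    "poor":      {"hydropower": 4.0, "env_flow": 4.0},  # beaten by "balanced"
}

def dominates(a, b):
    """True if plan a is at least as good as b everywhere, better somewhere."""
    return (all(a[k] >= b[k] for k in a)
            and any(a[k] > b[k] for k in a))

pareto = {name for name, score in plans.items()
          if not any(dominates(other, score)
                     for oname, other in plans.items() if oname != name)}
print(sorted(pareto))  # "poor" drops out; the rest are genuine trade-offs
```

The surviving plans represent the kind of "viable development pathways" among which stakeholders can then negotiate, since choosing between them requires weighing one objective against another rather than spotting an outright better option.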

"The models aim to facilitate continuous negotiation between stakeholders - which is a key element of the DAFNE approach," says Senior Scientist Scott Sinclair, who co-developed the modelling approach.

Case studies with local stakeholders

The DAFNE project focused on two large river basins in East and Southern Africa--the Omo-Turkana and the Zambezi--where the researchers tested their methodology in two case studies. In both, real stakeholders were involved in developing the DAFNE approach: the researchers worked with them to test alternative operating modes for the power plants and irrigation schemes and to design more sustainable use scenarios for their catchment areas. The stakeholders exchanged their different perspectives in simulated negotiations to illustrate the process.

In the Omo-Turkana basin, the scientists also used their methodology in a retrospective analysis of the controversial two-year filling phase of the Gibe III mega-dam in Ethiopia. "We observed that the negative impact on downstream neighbours was exacerbated by a prolonged drought," reports Burlando. The DAFNE consortium partners from Politecnico di Milano, together with Burlando and Sinclair, were able to show in a study published in Nature Communications that such problems can be reduced by combining DAFNE tools with seasonal drought forecasts and flexibly adapting the filling regime to hydroclimatic conditions.

Dams on the advance worldwide

The results of the study are highly topical: Ethiopia is currently building another mega-dam in the Omo-Turkana catchment area, and filling the Grand Ethiopian Renaissance Dam on the Blue Nile. Worldwide, around 500 dam projects are being planned in regions affected by climate feedbacks through teleconnections. Growing populations and increasing prosperity will continue to boost demand for energy, food and water. The researchers hope that the DAFNE methodology will one day become a reference.

"We designed the modelling tools to be transferable to other regions with competing water needs," says Burlando. Follow-up projects are already under way to apply and further develop the technology in several river basins worldwide.

Credit: 
ETH Zurich

Child mental health services lacking in high-income countries: SFU study finds

Most children with a mental health disorder are not receiving services to address their needs--according to a new study from researchers at Simon Fraser University's Children's Health Policy Centre. Their research was published this week in the journal Evidence-Based Mental Health.

Researchers found that of the one in eight children (12.7 per cent) who experience a mental disorder, less than half (44.2 per cent) receive any services for these conditions.

"We have illuminated an invisible crisis in children's mental health and unacceptable service shortfalls in high-income countries -- including in Canada -- to a degree that violates children's rights," says study author Charlotte Waddell, an SFU health sciences professor and centre director.

"Many countries will need to substantially increase, and protect, their children's mental health budgets. This is particularly urgent given documented increases in children's mental health needs since COVID-19--needs which are predicted to continue."

Using systematic review methods, the researchers examined 14 prevalence surveys conducted in 11 high-income countries that included a total of 61,545 children aged four to 18 years. Eight of the 14 studies also assessed service contacts. The 14 surveys were conducted between 2003 and 2020 in Canada as well as the US, Australia, Chile, Denmark, Great Britain, Israel, Lithuania, Norway, South Korea and Taiwan.

Researchers note that mental health service provision lags behind services available to treat physical conditions in most of these countries. "We would not find it acceptable to treat only 44 per cent of children who had cancer or diabetes or infectious diseases," says Waddell.

The costs of not providing adequate childhood mental healthcare are also high. Mental health disorders typically begin in childhood and adolescence and if not prevented or treated early, they significantly interfere with wellbeing and development--with the impact extending across the lifespan.

This study found that the most common childhood mental disorders were anxiety (5.2 per cent), attention-deficit/hyperactivity disorder (ADHD) (3.7 per cent), oppositional defiant disorder (e.g., argumentative behaviour) (3.3 per cent), substance use disorder (e.g., problematic use of alcohol or cannabis) (2.3 per cent), conduct disorder (1.3 per cent) and depression (1.3 per cent).

Crucially, Waddell says effective treatments are well known for all of these disorders, as are effective prevention programs, "so we know how to help children."

Credit: 
Simon Fraser University

Research identifies potential role of 'junk DNA' sequence in aging, cancer

image: Jiyue Zhu (second from left) talks to members of his research team inside his laboratory on the WSU Health Sciences Spokane campus, including Ken Porter (far left), Sean Mcgranaghan (center), Fan Zhang (second from right), and Jinlong Zhang (far right).

Image: 
Photo by Cori Kogan, WSU Health Sciences Spokane

The human body is essentially made up of trillions of living cells. It ages as its cells age, which happens when those cells eventually stop replicating and dividing. Scientists have long known that genes influence how cells age and how long humans live, but how that works exactly remains unclear. Findings from a new study led by researchers at Washington State University have solved a small piece of that puzzle, bringing scientists one step closer to solving the mystery of aging.

A research team headed by Jiyue Zhu, a professor in the College of Pharmacy and Pharmaceutical Sciences, recently identified a DNA region known as VNTR2-1 that appears to drive the activity of the telomerase gene, which has been shown to prevent aging in certain types of cells. The study was published in the journal Proceedings of the National Academy of Sciences (PNAS).

The telomerase gene controls the activity of the telomerase enzyme, which helps produce telomeres, the caps at the end of each strand of DNA that protect the chromosomes within our cells. In normal cells, the length of telomeres gets a little bit shorter every time cells duplicate their DNA before they divide. When telomeres get too short, cells can no longer reproduce, causing them to age and die. However, in certain cell types--including reproductive cells and cancer cells--the activity of the telomerase gene ensures that telomeres are reset to the same length when DNA is copied. This is essentially what restarts the aging clock in new offspring but is also the reason why cancer cells can continue to multiply and form tumors.
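The replication limit described above can be illustrated with a toy model (ours, not the study's): telomeres lose a fixed length with each division until a critical threshold halts replication, while a telomerase-active cell resets its telomeres and keeps dividing. All lengths and rates below are invented round numbers.

```python
# Toy model of telomere shortening (illustrative numbers, not measured data).
def divisions_until_senescence(telomere_bp, loss_per_division=100,
                               critical_bp=5000, telomerase=False,
                               max_divisions=1000):
    """Count cell divisions until telomeres hit the critical length."""
    divisions = 0
    while telomere_bp > critical_bp and divisions < max_divisions:
        divisions += 1
        if telomerase:
            telomere_bp = 10_000          # telomerase resets the caps
        else:
            telomere_bp -= loss_per_division

    return divisions

normal = divisions_until_senescence(10_000)
immortal = divisions_until_senescence(10_000, telomerase=True)
print(normal)    # 50: the cell line ages and stops dividing
print(immortal)  # 1000: hits the simulation cap, i.e. divides indefinitely
```

In this caricature, the normal cell stops after a finite number of divisions (an analogue of the Hayflick limit), while the telomerase-active cell never reaches the threshold, mirroring why reproductive and cancer cells can keep multiplying.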

Knowing how the telomerase gene is regulated and activated and why it is only active in certain types of cells could someday be the key to understanding how humans age, as well as how to stop the spread of cancer. That is why Zhu has focused the past 20 years of his career as a scientist solely on the study of this gene.

Zhu said that his team's latest finding that VNTR2-1 helps to drive the activity of the telomerase gene is especially notable because of the type of DNA sequence it represents.

"Almost 50% of our genome consists of repetitive DNA that does not code for protein," Zhu said. "These DNA sequences tend to be considered as 'junk DNA' or dark matter in our genome, and they are difficult to study. Our study describes that one of those units actually has a function in that it enhances the activity of the telomerase gene."

Their finding is based on a series of experiments that found that deleting the DNA sequence from cancer cells--both in a human cell line and in mice--caused telomeres to shorten, cells to age, and tumors to stop growing. Subsequently, they conducted a study that looked at the length of the sequence in DNA samples taken from Caucasian and African American centenarians and control participants in the Georgia Centenarian Study, a study that followed a group of people aged 100 or above between 1988 and 2008. The researchers found that the length of the sequence ranged from as short as 53 repeats--or copies--of the DNA to as long as 160 repeats.

"It varies a lot, and our study actually shows that the telomerase gene is more active in people with a longer sequence," Zhu said.

Since very short sequences were found only in African American participants, they looked more closely at that group and found that there were relatively few centenarians with a short VNTR2-1 sequence as compared to control participants. However, Zhu said it was worth noting that having a shorter sequence does not necessarily mean your lifespan will be shorter, because it means the telomerase gene is less active and your telomere length may be shorter, which could make you less likely to develop cancer.

"Our findings are telling us that this VNTR2-1 sequence contributes to the genetic diversity of how we age and how we get cancer," Zhu said. "We know that oncogenes--or cancer genes--and tumor suppressor genes don't account for all the reasons why we get cancer. Our research shows that the picture is a lot more complicated than a mutation of an oncogene and makes a strong case for expanding our research to look more closely at this so-called junk DNA."

Zhu noted that since African Americans have been in the United States for generations, many of them have Caucasian ancestors from whom they may have inherited some of this sequence. So as a next step, he and his team hope to be able to study the sequence in an African population.

Credit: 
Washington State University

Blushing plants reveal when fungi are growing in their roots

image: Betalain coloured roots

Image: 
Temur Yunusov and Alfonso Timoneda

Almost all crop plants form associations with a particular type of fungi - called arbuscular mycorrhiza fungi - in the soil, which greatly expand their root surface area. This mutually beneficial interaction boosts the plant's ability to take up nutrients that are vital for growth.

The more nutrients plants obtain naturally, the less artificial fertilisers are needed. Understanding this natural process, as the first step towards potentially enhancing it, is an ongoing research challenge. Progress is likely to pay huge dividends for agricultural productivity.

In a study published in the journal PLOS Biology, researchers used the bright red pigments of beetroot - called betalains - to visually track soil fungi as they colonised plant roots in a living plant.

"We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact. We previously had no idea about what happened because there was no way to visualise it in a living plant without the use of elaborate microscopy," said Dr Sebastian Schornack, a researcher at the University of Cambridge's Sainsbury Laboratory and joint senior author of the paper.

To achieve their results, the researchers engineered two model plant species - a legume and a tobacco plant - so that they would produce the highly visible betalain pigments when arbuscular mycorrhiza fungi were present in their roots. This involved combining the control regions of two genes activated by mycorrhizal fungi with genes that synthesise red-coloured betalain pigments.

The plants were then grown in a transparent structure so that the root system was visible, and images of the roots could be taken with a flatbed scanner without disturbing the plants.

Using their technique, the researchers could select red pigmented parts of the root system to observe the fungus more closely as it entered individual plant cells and formed elaborate tree-like structures - called arbuscules - which grow inside the plant's roots. Arbuscules take up nutrients from the soil that would otherwise be beyond the reach of the plant.

Other methods exist to visualise this process, but these involve digging up and killing the plant and the use of chemicals or expensive microscopy. This work makes it possible for the first time to watch by eye and with simple imaging how symbiotic fungi start colonising living plant roots, and inhabit parts of the plant root system over time.

"This is an exciting new tool to visualise this, and other, important plant processes. Beetroot pigments are a distinctive colour, so they're very easy to see. They also have the advantage of being natural plant pigments, so they are well tolerated by plants," said Dr Sam Brockington, a researcher in the University of Cambridge's Department of Plant Sciences, and joint senior author of the paper.

Mycorrhiza fungi are attracting growing interest in agriculture. This new technique provides the ability to 'track and trace' the presence of symbiotic fungi in soils from different sources and locations. The researchers say this will enable the selection of fungi that colonise plants fastest and provide the biggest benefits in agricultural scenarios.

Understanding and exploiting the dynamics of plant root system colonisation by fungi has potential to enhance future crop production in an environmentally sustainable way. If plants can take up more nutrients naturally, this will reduce the need for artificial fertilisers - saving money and reducing associated water pollution.

Credit: 
University of Cambridge

China's carbon-monitoring satellite reports net global land carbon flux of six gigatons

image: The first global carbon flux map derived from TanSat observations.

Image: 
Dongxu Yang

About six gigatons -- roughly 12 times the mass of all living humans -- of carbon appears to be emitted over land every year, according to data from the Chinese Global Carbon Dioxide Monitoring Scientific Experimental Satellite (TanSat).
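That mass comparison is easy to sanity-check. The sketch below uses assumed figures, not numbers from the study: a world population of roughly 7.8 billion (circa 2021) and an average body mass of about 62 kg.

```python
# Back-of-envelope check of the "roughly 12 times the mass of all living
# humans" comparison. Population and average body mass are assumptions,
# not values from the TanSat study.
population = 7.8e9        # people (assumed, circa 2021)
avg_mass_kg = 62.0        # average body mass in kg (assumed)
kg_per_gigaton = 1e12     # 1 gigaton = 10^12 kg

human_biomass_gt = population * avg_mass_kg / kg_per_gigaton
ratio = 6.0 / human_biomass_gt   # six gigatons of carbon vs. human biomass

print(f"human biomass = {human_biomass_gt:.2f} Gt; ratio = {ratio:.0f}x")
# → human biomass = 0.48 Gt; ratio = 12x
```

With those assumptions, human biomass comes to just under half a gigaton, so six gigatons of carbon is indeed about 12 times that mass.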

Using measurements of carbon dioxide concentrations in dry air collected from May 2017 to April 2018, researchers developed the first TanSat-based global carbon flux dataset and map. They published their results in Advances in Atmospheric Sciences.

The map was developed by applying TanSat's satellite observations to models of how greenhouse gases are exchanged among Earth's atmosphere, land, water and living organisms. More than a hundred gigatons of carbon are exchanged in these processes, but rising emissions have resulted in net carbon added to the atmosphere -- now about six gigatons a year -- a serious issue that contributes to climate change, according to Dongxu Yang, first author and a researcher in the Institute of Atmospheric Physics at the Chinese Academy of Sciences (IAP CAS).

"In this paper, we introduce the first implementation of TanSat carbon dioxide data on carbon flux estimations," Yang said. "We also demonstrate that China's first carbon-monitoring satellite can investigate the distribution of carbon flux across the globe."

While satellite measurements are not as accurate as ground-based measurements, said co-author Jing Wang, a researcher at IAP CAS, they offer continuous global coverage and supply information that a limited and uneven network of surface monitoring stations cannot. For example, a monitoring station in a city may report very different observations than a station in a remote village, especially if the two sit in drastically different climates.

"The sparseness and spatial inhomogeneity of the existing ground-based network limits our ability to infer consistent global- and regional-scale carbon sources and sinks," said co-author Liang Feng, researcher with the National Centre for Earth Observation at the University of Edinburgh. "To improve observation coverage, tailor-made satellites, for example TanSat, have been developed to provide accurate atmospheric greenhouse gas measurements."

The data from these satellites -- TanSat, Japan's GOSAT and the United States' OCO-2 -- along with future missions will be used to independently verify national emission inventories across the globe. According to Yang, this process will be overseen by the United Nations Framework Convention on Climate Change and will begin in 2023, in support of the Paris Agreement. TanSat's measurements generally agree with data from the other satellites.

"This verification method will be helpful to better understand carbon emissions in real time, and to help ensure transparency across the inventories," said co-author Yi Liu, researcher in IAP CAS.

The process will be bolstered by the next generation of satellites, known as TanSat-2, which is currently in the design phase. The goal, Yang said, will be to obtain measurements that help elucidate the carbon budget from the global scale down to individual cities.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Novel imaging agent identifies biomarker for iron-targeted cancer therapies

image: LIP expansion is detectable in an orthotopic glioma model with 18F-TRX. A. 18F-TRX PET/CT data showing radiotracer uptake in a U87 MG tumor (arrow) implanted within the right hemisphere of a mouse brain. The image was acquired at 90 min post injection. B. Quantification of 18F-TRX uptake using region-of-interest analysis of the PET data from mice bearing U87 MG tumors (n = 3). Tumor uptake was compared to uninvolved normal white matter in the contralateral region of the brain. C. Digital autoradiography showing the distribution of the radiotracer within a coronal section of the mouse brain. The tissue was stained with H&E and merged with the pseudocolor image of the autoradiography.

Image: 
Image created by Evans, Renslo et al. University of California San Francisco.

Reston, VA--A new radiotracer that detects iron in cancer cells has proven effective, opening the door for the advancement of iron-targeted therapies for cancer patients. The radiotracer, 18F-TRX, can be used to measure iron concentration in tumors, which can help predict whether or not the cancer will respond to treatment. This research was published in the July issue of the Journal of Nuclear Medicine.

All cancer cells have an insatiable appetite for iron, which provides the energy they need to multiply. As a result, tumors have higher levels of iron than normal tissues. Recent advances in chemistry have led scientists to take advantage of this altered state, targeting the expanded cytosolic 'labile' iron pool (LIP) of the cancer cell to develop new treatments.

A clear method to measure LIP in tumors must be established to advance clinical trials for LIP-targeted therapies. "LIP levels in patient tumors have never been quantified," noted Adam R. Renslo, PhD, professor in the department of pharmaceutical chemistry at the University of California, San Francisco. "Iron rapidly oxidizes once its cellular environment is disrupted, so it can't be quantified reliably from tumor biopsies. A biomarker for LIP could help determine which tumors have the highest LIP levels and might be especially vulnerable to LIP-targeted therapies."

To address this unmet need, researchers imaged 10 tissue graft models of glioma and renal cell carcinoma with 18F-TRX PET to measure LIP. Tumor avidity and sensitivity to the radiotracer were assessed. An animal study was also conducted to estimate effective human dosimetry.

18F-TRX showed a wide range of tumor accumulation, successfully distinguishing LIP levels among tumors and determining those that might be most likely to respond to LIP-targeted therapies. Pretreatment 18F-TRX uptake in tumors was also found to predict sensitivity to therapy. The estimated effective dose for adults was comparable to those of other 18F-based imaging agents.

"Iron dysregulation occurs in many human disorders, including neurodegenerative and cardiovascular diseases, and inflammation," said Michael J. Evans, associate professor in residence in the department of radiology and biomedical imaging at the University of California, San Francisco. "Applying 18F-TRX in the respective patient populations to define the extent of LIP expansion in affected tissues will be an important milestone toward understanding the therapeutic potential of LIP-targeted therapies beyond oncology."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

The impact of climate change on Kenya's Tana river basin

Many species within Kenya's Tana River Basin will be unable to survive if global temperatures continue to rise as they are on track to do - according to new research from the University of East Anglia.

A new study published in the journal PLOS ONE today outlines how remaining within the goals of the Paris Agreement would save many species.

The research also identifies places that could be restored to better protect biodiversity and contribute towards global ecosystem restoration targets.

Researcher Rhosanna Jenkins carried out the study as part of her PhD at UEA's School of Environmental Sciences.

She said: "This research shows how many species within Kenya's Tana River Basin will be unable to survive if global temperatures continue to rise as they are on track to do.

"But remaining within the goals of the Paris Agreement, which aims to keep global warming well below 2°C, ideally at 1.5°C, would save many species. This is because large areas of the basin act as refugia from climate change."

"With higher warming levels, not only are the refuges lost but also the potential for restoration becomes more limited.

"The United Nations declared the 2020s as the 'Decade on Ecosystem Restoration'. Our results show the importance of considering climate change within these restoration efforts.

"With higher levels of warming, many of the species you are trying to restore will no longer be able to survive in the places they were originally found.

"Strong commitments from global leaders ahead of the COP climate change summit in Glasgow are needed to stand any chance of avoiding the loss of species - which for the Tana River Basin is clearly indicated by this work."

Credit: 
University of East Anglia

What's riskier for young soccer players, practice or game time?

For young soccer players, participating in repetitive technical training activities involving heading during practice may result in more total head impacts, but playing in scrimmages or actual soccer games may result in greater-magnitude head impacts. That's according to a small, preliminary study released today that will be presented at the American Academy of Neurology's Sports Concussion Conference, July 30-31, 2021.

"Headers are a fundamental component to the sport of soccer. Therefore, it is important to understand differences in header frequency and magnitude across practice and game settings," said study author Jillian Urban, PhD, MPH, of Wake Forest School of Medicine in Winston-Salem, N.C. "Practices are more amenable to change than games. Therefore, understanding how we can restructure practice to reduce head impact exposure while teaching fundamental skills needed to safely play the sport is critical to improving head impact safety in the sport."

The study followed eight soccer players, ages 14 and 15, for two seasons. Players wore a custom-fitted mouthpiece sensor during all practices and games. Researchers recorded all on-field activities with a time-synchronized camera and identified each time head contact was made.

Head impact exposure was quantified in terms of peak head motion and impacts per player per hour, or impact rate. The amount of time an athlete was exposed to each activity was also evaluated. Researchers then compared impact rates across activity types; rates ranged from 0.5 to 13.7 head impacts per player hour.
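The exposure metric described above is a simple normalization: impacts divided by player-hours of activity. A minimal sketch, with illustrative numbers rather than the study's raw data:

```python
def impact_rate(total_impacts: int, n_players: int, hours: float) -> float:
    """Head impacts per player per hour of exposure."""
    return total_impacts / (n_players * hours)

# Illustrative example (not study data): 22 recorded head impacts
# across 8 players during a 2-hour practice session.
rate = impact_rate(22, 8, 2.0)
print(f"{rate:.2f} impacts per player-hour")
# → 1.38 impacts per player-hour
```

Normalizing by player-hours lets a short drill with frequent headers be compared fairly against a long scrimmage with few.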

Researchers saw a similar number of player-to-player contacts happening during technical drills, team interaction and game play. Technical training activities like heading the ball and practicing ball control and dribbling were associated with an average impact rate of 13.7 head impacts per player hour. Team interaction activities such as small-sided games in practice were associated with an average impact rate of 0.5 head impacts per player hour, slightly lower than the 1.3 head impacts per player hour observed during games.

Researchers also looked at average rotational head motion, which ranged from 500 radians per second squared (rad/s2) to 1,560 rad/s2, with higher numbers signifying greater magnitude head impacts. Technical training was associated with an average magnitude of 550 rad/s2, while team interaction and games were associated with an average rotational head motion of 910 rad/s2 and 1,490 rad/s2, respectively.

"If the goal is to reduce the number of head impacts a young soccer player may get on the field, our findings suggest the best way may be to target technical training drills and how they are distributed within a season," said Urban. "However, if the goal is to reduce the likelihood of players sustaining head impacts of greater magnitude, then the best bet may be to look at factors associated with high-magnitude head impacts that can occur during scrimmages and games."

A limitation of the study is the small number of players involved.

Credit: 
American Academy of Neurology

Featured articles from the journal CHEST®, July 2021

image: Visual abstract from the original research "Pulmonary Function and Radiologic Features in Survivors of Critical COVID-19: A 3-Month Prospective Cohort"

Image: 
González Gutiérrez J., et al, Journal CHEST®

Glenview, Ill. - Published monthly, the journal CHEST® features peer-reviewed, cutting-edge original research in chest medicine: Pulmonary, critical care, sleep medicine and related disciplines. Journal topics include asthma, chest infections, COPD, critical care, diffuse lung disease, education and clinical practice, pulmonology and cardiology, sleep, and thoracic oncology.

The July issue of CHEST includes 85 articles: clinically relevant original research, reviews, case series, commentary and more. Each month, the journal also offers complementary web and multimedia activities, including visual abstracts, to expand the reach of its most interesting, timely and relevant research.

"We have a lot of excellent content included in the July issue of CHEST," says Editor in Chief of the journal, Peter Mazzone, MD, MPH, FCCP. "I want to thank all of our contributors for their time, efforts, and extensive research that we are proud to share. This month, in particular, I want to celebrate the anniversary of our Humanities in Chest Medicine that was first launched in July 2020. It has quickly become a favorite for our readers and continues to embrace and celebrate the importance of the human element that accompanies work in pulmonary medicine."

Included in the July 2021 issue:

Chest infections

Following a year of COVID and the rollout of vaccines, "Fast Development of High-Quality Vaccines in a Pandemic" describes how vaccines are developed at "warp speed" without compromising on the science, quality or safety.

Asthma

Original research, "Hormone Replacement Therapy and Development of New Asthma," examines the association between hormone replacement therapy in menopause and new development of asthma.

Critical care

Focusing on patients who survived critical COVID-19, the original research "Pulmonary Function and Radiologic Features in Survivors of Critical COVID-19: A 3-Month Prospective Cohort" reveals that persistent impairment is likely and concludes that pulmonary evaluation at three months after discharge is advised.

Sleep

Original research, "Randomized Controlled Trial of Solriamfetol for Excessive Daytime Sleepiness in Obstructive Sleep Apnea: An Analysis of Subgroups Adherent or Nonadherent to Obstructive Sleep Apnea Treatment," finds benefits to taking solriamfetol to improve excessive daytime sleepiness scores in patients with obstructive sleep apnea.

Credit: 
American College of Chest Physicians

Advantages of intranasal vaccination against SARS-CoV-2

image: Fran Lund

Image: 
UAB

BIRMINGHAM, Ala. - There are many reasons that an intranasal vaccine against the SARS-CoV-2 virus would be helpful in the fight against COVID-19 infections, University of Alabama at Birmingham immunologists Fran Lund, Ph.D., and Troy Randall, Ph.D., write in a viewpoint article in the journal Science.

That route of vaccination adds two layers of protection over intramuscular shots because it produces (1) immunoglobulin A and resident memory B and T cells in the respiratory mucosa that form an effective barrier to infection at those sites, and (2) cross-reactive resident memory B and T cells that can respond earlier than other immune cells if a viral variant does start an infection.

"Given the respiratory tropism of the virus, it seems surprising that only seven of the nearly 100 SARS-CoV-2 vaccines currently in clinical trials are delivered intranasally," Lund and Randall said. "Advantages of intranasal vaccines include needle-free administration, delivery of antigen to the site of infection, and the elicitation of mucosal immunity in the respiratory tract."

Their viewpoint article goes on to detail the individual advantages and challenges of each of the seven intranasal vaccine candidates. Six are virus-based, including three different adenovirus vectors and one candidate each based on live-attenuated influenza virus, live-attenuated respiratory syncytial virus and live-attenuated SARS-CoV-2. The seventh candidate is an inert protein subunit.

Among the drawbacks of using viruses that people may have encountered before is negative interference from anti-vector antibodies, which can impair vaccine delivery. And because of the risk of reversion, however minimal, the live-attenuated SARS-CoV-2 candidate would likely be contraindicated for infants, people over 49 and immunocompromised persons.

"Notably absent from the list of intranasal vaccines are those formulated as lipid-encapsulated mRNA," Lund and Randall said, listing some of the challenges and adverse side effects that accompany that approach.

"Ultimately, the goal of vaccination is to elicit long-lived protective immunity," the UAB researchers concluded. Comparing the benefits and disadvantages of intranasal vaccination against intramuscular vaccinations, they suggest that perhaps effective vaccination need not be restricted to a single route.

"The ideal vaccination strategy," the immunologists concluded, "may use an intramuscular vaccine to elicit a long-lived systemic immunoglobulin G response and a broad repertoire of central memory B and T cells, followed by an intranasal booster that recruits memory B and T cells to the nasal passages and further guides their differentiation toward mucosal protection, including immunoglobulin A secretion and tissue-resident memory cells in the respiratory tract."

Credit: 
University of Alabama at Birmingham