Culture

Blood vessels communicate with sensory neurons to decide their fate

video: This time-lapse video shows 12 hours of zebrafish embryo development, with blood vessels labelled in magenta and sensory neurons in green, allowing the interaction between these two cell types to be observed.

Image: 
Laura Taberner, Pompeu Fabra University.

Researchers at Pompeu Fabra University have shown for the first time that blood vessels communicate with neurons in the peripheral nervous system, regulating their proliferation and differentiation. The study is published today in the journal Cell Reports and was conducted using zebrafish as a model. It was led by Berta Alsina, principal investigator of the Morphogenesis and Cell Signaling in Sensory Systems group, and involved Laura Taberner and Aitor Bañón.

Using real-time videos, the researchers discovered that both the neurons and the blood vessel cells extend dynamic protrusions in order to 'talk' to each other. These protrusions, called signalling filopodia or cytonemes, carry a receptor or ligand at their tip that allows them to send signals. Cytonemes were discovered only very recently and constitute a highly precise signalling mechanism, in both space and time.

"It was known that vessel cells and stem cells in the brain communicate, but this is the first time such communication has been witnessed through cytonemes in the peripheral nervous system", Berta Alsina explains. "By using high-resolution spatiotemporal visualization techniques in vivo we have seen them in real time, and they might also be present in the brain", she adds.

This communication keeps some neuronal precursors in quiescence, i.e., dormant, so that they constitute a reservoir of stem cells. If an injury occurs later in adulthood, these quiescent cells can be activated to replace damaged neurons.

Laura Taberner, first author of the study, explains that "if all neuronal precursors proliferated and differentiated we would not have this reservoir and there would not be the opportunity for regeneration. In the auditory and vestibular system, which is what we are studying, cases of deafness or vertigo may arise".

The study also concludes that the precursors initially reside in a hypoxic environment, i.e., one low in oxygen, which keeps them proliferating. When blood vessels connect to each other during development, they begin to deliver oxygen and the environment becomes normoxic. The researchers found that oxygen is the vessels' second signal; in this case, rather than regulating quiescence, oxygen regulates the differentiation of neuronal precursors into neurons.

This study shows that during the development of the peripheral nervous system, the formation of new neurons and the maintenance of stem cells depend heavily on signals from blood vessels. Neurons receive signals from all the surrounding cells that make up the environment in which they reside, and the vessels are part of this niche. "This new knowledge will help understand the connection between hearing loss and cardiovascular diseases, as well as improve protocols for in vitro differentiation of neurons for regenerative therapies", Taberner adds.

Credit: 
Universitat Pompeu Fabra - Barcelona

A "feeling" for dementia?

A research team led by the German Center for Neurodegenerative Diseases (DZNE) concludes that personal perception can be an important indicator for the early detection of Alzheimer's disease. In a new study involving 449 older adults, published in Neurology®, the medical journal of the American Academy of Neurology, the scientists report that individuals with subjectively felt memory problems also exhibited on average measurable cognitive deficits that were associated with abnormalities in the spinal fluid. Early diagnosis and therapy development could benefit from these findings.

When memory deteriorates according to one's own perception, but mental performance - following objective criteria - is still within the normal range, this is referred to as "subjective cognitive decline" (SCD). "People with SCD have an increased risk of developing dementia in the long term. However, little is known about the mechanisms underlying subjective memory problems," said Prof. Michael Wagner, head of a research group at the DZNE and a senior psychologist at the memory clinic of the University Hospital of Bonn. "The effects are subtle and previous studies have included relatively small groups of people, which makes statistically reliable assessments difficult. Therefore, we have now examined the largest sample of individuals to our knowledge."

A nationwide study

A network of German universities and university hospitals was involved in the investigations, which were coordinated by the DZNE. A total of 449 women and men - their average age was about 70 years - participated in the study. Of this group, 240 individuals were included via memory clinics of the participating university hospitals. These persons had consulted the clinics for diagnostic clarification of persistent subjective cognitive complaints, usually after a doctor's referral. However, in the usual tests they were assessed as cognitively normal. It was thus determined that they had SCD. The other 209 study participants were classified as cognitively healthy based on interviews and the same cognitive testing. They had decided to participate in the study following newspaper advertisements.

"We were able to show that those people who turned to a memory clinic because of SCD had measurable, albeit only mild cognitive deficits," explained Dr. Steffen Wolfsgruber, lead author of the current publication. The findings are based on extensive testing, refined data analysis and the relatively large number of people examined. "This significantly improved measurement sensitivity. Thus, we found that study participants considered to be healthy generally scored better in mental performance than memory clinic patients with SCD. These differences are hardly detectable with standard methods of analysis and in small groups of people. Especially not on an individual level. In any case, you need a large data set."

An extensive test series

The women and men who took part in the study underwent various tests of their mental abilities. In addition to memory performance, the focus was also on attention capacity and the ability to concentrate in various situations. Among other things, language skills and the ability to recognize and correctly name objects were also tested.

In addition, the cerebrospinal fluid of 180 study subjects - 104 of them with SCD - was analyzed. This liquid is present in the brain and the spinal cord. Levels of specific proteins were measured, namely of "amyloid-beta peptides" and "tau proteins". "These biomarker data allow conclusions on potential nerve damage and mechanisms associated with Alzheimer's disease," said Wolfsgruber.

"We found that our study subjects with SCD had mild cognitive deficits on average and that these deficits were associated with proteins that indicate early Alzheimer's disease. Therefore, we assume that both the subjective complaints and the minimal objective cognitive deficits are due to Alzheimer's processes. That's not something that can be taken for granted, because there are many reasons for memory problems," said Michael Wagner, who led the current study. "It is important to stress that these individuals had visited a memory clinic because of their complaints, or had been referred to one. Therefore, these findings cannot be generalised, because many elderly people suffer from temporary subjective memory disorders without having early Alzheimer's disease."

Early treatment

The now published results are based on data from the so-called DELCODE study of DZNE, which investigates the early phase of Alzheimer's disease - the time period before marked symptoms manifest. Within the framework of DELCODE, the cognitive development of a total of about 1000 participants is monitored over several years. "It will then become clear who is actually developing dementia and how well the risk of dementia can be estimated in advance by means of SCD. Data on this are still being collected and evaluated," said Wagner. "In any case, our current results support the concept that SCD can contribute to detecting Alzheimer's disease at an early stage. However, SCD can certainly only provide a part of the larger picture that is necessary for diagnosis. One will also have to consider biomarkers."

The current findings could also help in the development of novel treatments. "Current therapies against Alzheimer's start too late. Then the brain is already severely damaged. A better understanding of SCD could create the basis for an earlier treatment. In order to test therapies that are intended to have an effect in the early stages of Alzheimer's, it is necessary to identify people at increased disease risk. For this, SCD could be an important criterion," said Wagner.

Credit: 
DZNE - German Center for Neurodegenerative Diseases

A simple laboratory test can aid in early recognition of COVID-19 in patients

CHICAGO--July 16, 2020--A rapid laboratory test, the eosinophil count, readily obtained from a routine complete blood cell count (CBC) can aid in the early recognition of COVID-19 in patients, as well as provide prognostic information, according to new research in The Journal of the American Osteopathic Association.

Current testing, which relies on diagnosis of COVID-19 by nasopharyngeal swab PCR assay, remains unreliable due to variable turnaround time and a high false-negative rate.

"We found that the absence of eosinophils on presentation can aid in early diagnosis, and in general, a persistent low count correlated with a poor prognosis for the patient," said Muhammad M. Zaman, MD, an infectious disease specialist affiliated with Coney Island Hospital in Brooklyn. "Review of the eosinophil count can be a useful tool in deciding whether to promptly isolate someone and initiate specific therapies while waiting for confirmatory test results."

In the study, eosinopenia (a low eosinophil count) correlated with a diagnosis of COVID-19, and its persistence correlated with high disease severity and low rates of recovery.

"The trend of the eosinophil count has been known to correlate with viral infections, but we did not know the correlation was so significant in the case of COVID-19," says Dr. Zaman.

Actionable clinical information

Coney Island Hospital, part of the New York City Health and Hospitals system, serves a diverse population in Brooklyn, New York. The site experienced a sharp increase in COVID-19 cases during March and April 2020, when the study data were collected.

Researchers compared the eosinophil results of routine CBC from the first 50 admitted COVID-19-positive patients with the eosinophil results of 50 patients with confirmed influenza infection at the time of presentation to the emergency department at Coney Island Hospital in Brooklyn.

Of the patients with COVID-19, 60% had zero eosinophils at presentation, compared to 16% of influenza patients. An additional 28% of COVID-19 patients had zero eosinophils within 48 hours of admission, thus a total of 88% had zero eosinophils during hospitalization.

"In COVID-19, a disease that has substantial symptom overlap with influenza, eosinopenia could help to distinguish which patients likely have COVID-19," said Dr. Zaman.

A total of 23 of the 50 patients in the COVID-19 group (46%) passed away. Of the patients who presented with eosinopenia, 18 of the 21 who died (86%) remained eosinopenic, compared with 13 of the 26 who survived (50%).
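As a quick sanity check, the percentages reported above can be reproduced directly from the raw counts (the counts are taken from the study as summarized here; this is an illustrative sketch, not code from the study itself):

```python
# Counts from the Coney Island Hospital COVID-19 cohort, as reported above.
covid_total = 50
zero_eos_presentation = 30   # 60% of 50 had zero eosinophils at presentation
zero_eos_within_48h = 14     # an additional 28% within 48 hours of admission

# Share with zero eosinophils at some point during hospitalization.
zero_any = (zero_eos_presentation + zero_eos_within_48h) / covid_total
print(f"{zero_any:.0%}")  # -> 88%

# Persistent eosinopenia among patients who presented with eosinopenia:
# 18 of 21 deceased patients versus 13 of 26 survivors.
print(f"deceased: {18 / 21:.0%}, survivors: {13 / 26:.0%}")  # -> 86% vs 50%
```

The arithmetic confirms that the 88% figure is simply the sum of the presentation and 48-hour groups, and that the mortality contrast (86% vs. 50% persistent eosinopenia) follows from the stated counts.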

"As you can see from the data, continued low counts of eosinophils trended with mortality rates," said Dr. Zaman. "Patients whose eosinophil count rose tended to have better disease outcomes."

Existing testing is challenging

The clinical diagnosis of COVID-19 is confirmed by laboratory testing with a reverse-transcription polymerase chain reaction (RT-PCR) assay, which remains a challenge due to limited test availability, variable turnaround time, and low sensitivity. In many hospitals, test results may take days to return.

"If a patient comes in on the first day of symptom onset, typically day five after exposure, the false-negative rate is as high as 38%," says Dr. Zaman. "This means a high number of potentially infectious patients are being misinformed of their true diagnosis--and a simple blood test could dramatically lower that number and the subsequent spread of disease."

Credit: 
American Osteopathic Association

Wonders of animal migration: How sea turtles find small, isolated islands

video: Animated tracks of 35 green turtle migrations tracked from nesting beaches in the Chagos Archipelago (Indian Ocean). The year of migration is indicated by colours; red = 2012, black = 2015, orange = 2017 and blue = 2018. Stars (n=33) indicate migration endpoints and incomplete migrations (n=2) are denoted by black crosses. Animation timing has been adjusted so that turtles from all years depart nesting beaches at the same time and migration duration in days is indicated by the counter in the lower right hand corner of the video frame. Of 35 equipped turtles, 33 were tracked all the way to their foraging grounds. The animation highlights the often circuitous routes of individual turtles.

Image: 
Nicole Esteban, Swansea University

One of Charles Darwin's long-standing questions on how turtles find their way to islands has been answered thanks to a pioneering study by scientists.

A team from Deakin University, the University of Pisa and Swansea University equipped 35 green sea turtles with satellite tags; 33 were tracked all the way to their foraging grounds, yielding unique records of green turtles migrating long distances across the Indian Ocean to small oceanic islands.

The study provides some of the best evidence to date that migrating sea turtles can reorient in the open ocean, but only at a crude level rather than at fine scales.

Seven turtles travelled only a few tens of kilometres to foraging sites on the Great Chagos Bank, six travelled over 4,000km to mainland Africa, one to Madagascar, while another two turtles ventured north to the Maldives.

Most of the turtles tracked (17) migrated westward to distant foraging sites in the Western Indian Ocean that were associated with small islands.

It shows that the turtles can travel several hundred kilometres off the direct routes to their goal before reorienting, often in the open ocean.

In 1873, Charles Darwin marvelled at the ability of sea turtles to find the isolated islands where they nest. However, how sea turtles, and other groups such as seals and whales, navigate during long migrations remains an open question.

Answering this question with free-living individuals has been difficult, because the thousands of sea turtles tracked so far have mostly migrated to mainland coasts, where the navigational challenges are easiest.

The study also showed that turtles frequently struggled to find small islands, overshooting, and/or searching for the island in the final stages of migration.

These satellite tracking results support the suggestion, from previous laboratory work, that turtles use a crude true navigation system in the open ocean, possibly using the world's geomagnetic field.

Swansea University's Dr Nicole Esteban, a co-author of the study, said:

"We were surprised that green turtles sometimes overshot their ultimate destination by several hundred kilometres and then searched the ocean for their target. Our research shows evidence that turtles have a crude map sense with open ocean reorientation."

Credit: 
Swansea University

Breeding new rice varieties will help farmers in Asia

image: Short-duration rice breeding trial at the IRRI, Los Banos, the Philippines.

Image: 
Courtesy of Dr. Phyo L.P. Won

When smallholder farmers throughout South and Southeast Asia were interviewed, one of the top needs they mentioned was the development of shorter-duration rice varieties, taking only 100 days from sowing to harvest. Some farmers want more time to prepare for the next season's crop, whereas others are concerned about irrigation water running out during the dry season. In countries such as the Philippines, shorter-duration varieties offer the further benefit of reducing the risk of adverse weather (e.g., typhoons) affecting the crop.

In an article recently published in Crop Science, a diverse group of researchers from the International Rice Research Institute (IRRI), comprising agronomists, physiologists and breeders, reports on advanced high-yielding, earlier-maturing lines for tropical Asia, developed from IRRI's irrigated rice breeding pipeline.

The research team identified key agronomic traits for high yield in short-duration rice (SDR) that will assist future breeding efforts. The team found that low source-to-sink ratio was the major yield constraint of SDR and suggested that breeding should aim to enhance source capacity during grain filling. Importantly, some new SDR breeding lines yielded 11-38% higher than the most popular short-duration variety. Taken together, these findings indicate enormous potential for developing improved short-duration rice varieties in the future.

Credit: 
American Society of Agronomy

New evidence for a dynamic metallocofactor during nitrogen gas reduction

image: A sideview of the M-clusters in the two dimers of the catalytic component of Mo-nitrogenase. The key residues interacting with the M-clusters and the bound dinitrogen ligands are indicated as sticks. The two M-clusters are superimposed with the Fo-Fc omit maps of the dinitrogen ligands (mint-blue mesh). Color code of atoms: Fe, orange; S, yellow; O, red; N, blue.

Image: 
UCI School of Biological Sciences

A key mystery about the gas comprising most of our atmosphere is closer to being solved following a discovery by University of California, Irvine biologists. Their findings are the first step in understanding the biological mechanism for breaking down nitrogen gas. Besides yielding groundbreaking knowledge, the information holds promise for developing environmentally friendly and cheaper ways to make products such as fertilizer and fuel. The team's research has just been published in the journal Science.

Activation of nitrogen gas (N2), which makes up 78% of the atmosphere, has long stymied scientists. "The strong triple bond between the nitrogen atoms in N2 makes this compound difficult to break apart, and thus nearly inert," said Molecular Biology and Biochemistry Chancellor's Professor Markus Ribbe. "Researchers have worked for decades to fully understand how nature can activate nitrogen gas and break it down for biological purposes."

Now, however, the teams of Professors Yilin Hu and Markus Ribbe, from the Department of Molecular Biology and Biochemistry, have discovered how the enzyme nitrogenase binds N2 as the initial step towards its activation. X-ray crystallographic analysis showed that the three sulfur sites at the "belt region" of the FeMo cofactor in the enzyme's active site are labile during catalysis: during the binding and reduction of N2, the cofactor in one subunit of the enzyme has one, and that in the other subunit two, of the three belt sulfur atoms replaced by distinct nitrogen species. These findings were entirely unexpected and shed light on the sparsely understood mechanism of N2 reduction, pointing to a key role of belt sulfur displacement in proper nitrogenase function.

"We are optimistic that with further research, we will be able to demonstrate how this entire mechanism works," said Professor Hu.

In addition to revealing important scientific insights, the discovery could ultimately transform manufacturing. Because this natural process is poorly understood, industries turn nitrogen gas into commercial products through other methods that take an environmental toll. For example, the most common procedure for breaking down N2 to produce ammonia fertilizer for agriculture, called the Haber-Bosch process, relies on very high heat and pressure.

Professor Ribbe said: "Once we understand how nature activates nitrogen gas under ambient conditions, it opens the way for developing manufacturing processes that use less energy and are also cheaper." In addition to fertilizer, the discovery could have implications for alternative fuel production. Professors Hu and Ribbe, who have focused much of their research on nitrogenase over the years, have already discovered that the enzyme can convert carbon dioxide and carbon monoxide into hydrocarbons, which are major components in carbon fuels.

Credit: 
University of California - Irvine

Breakthrough blood test detects positive COVID-19 result in 20 minutes

image: Dr Simon Corrie with COVID-19 positive and negative blood samples.

Image: 
Monash University

World-first research by Monash University in Australia has detected positive COVID-19 cases from blood samples in about 20 minutes, identifying whether someone has recently contracted the virus.

In a discovery that could advance the worldwide effort to limit the community spread of COVID-19 through robust contact tracing, researchers were able to identify recent COVID-19 cases using 25 microlitres of plasma from blood samples.

The research team, led by BioPRIA and Monash University's Chemical Engineering Department, including researchers from the ARC Centre of Excellence in Convergent BioNano Science and Technology (CBNS), developed a simple agglutination assay - an analysis to determine the presence and amount of a substance in blood - to detect the presence of antibodies raised in response to the SARS-CoV-2 infection.

Positive COVID-19 cases caused an agglutination or a clustering of red blood cells, which was easily identifiable to the naked eye. Researchers were able to retrieve positive or negative readings in about 20 minutes.

While the current swab / PCR tests are used to identify people who are currently positive with COVID-19, the agglutination assay can determine whether someone had been recently infected once the infection is resolved - and could potentially be used to detect antibodies raised in response to vaccination to aid clinical trials.

Using a simple lab setup, this discovery could see medical practitioners across the world testing up to 200 blood samples an hour. At some hospitals with high-grade diagnostic machines, more than 700 blood samples could be tested hourly - about 16,800 each day.
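The daily figure quoted above follows from simple arithmetic on the hourly rate (a sketch; the assumption of continuous round-the-clock operation is ours, not stated in the release):

```python
# Throughput check for the reported testing rates.
lab_per_hour = 200      # simple lab setup
machine_per_hour = 700  # high-grade hospital diagnostic machines
hours = 24              # assumes continuous, round-the-clock operation

print(machine_per_hour * hours)  # -> 16800, matching "about 16,800 each day"
```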

Study findings could help high-risk countries with population screening, case identification, contact tracing, confirming vaccine efficacy during clinical trials, and vaccine distribution.

This world-first research was published today (Friday, 17 July 2020) in the prestigious journal ACS Sensors.

A patent for the innovation has been filed and researchers are seeking commercial and government support to upscale production.

Dr Simon Corrie, Professor Gil Garnier and Professor Mark Banaszak Holl (BioPRIA and Chemical Engineering, Monash University), and Associate Professor Timothy Scott (BioPRIA, Chemical Engineering and Materials Science and Engineering, Monash University) led the study, with initial funding provided by the Chemical Engineering Department and the Monash Centre to Impact Anti-microbial Resistance.

Dr Corrie, Senior Lecturer in Chemical Engineering at Monash University and Chief Investigator in the CBNS, said the findings were exciting for governments and health care teams across the world in the race to stop the spread of COVID-19. He said the test has the potential to be scaled up immediately for serological testing.

"Detection of antibodies in patient plasma or serum involves pipetting a mixture of reagent red blood cells (RRBCs) and antibody-containing serum/plasma onto a gel card containing separation media, incubating the card for 5-15 minutes, and using a centrifuge to separate agglutinated cells from free cells," Dr Corrie said.

"This simple assay, based on commonly used blood typing infrastructure and already manufactured at scale, can be rolled out rapidly across Australia and beyond. This test can be used in any lab that has blood typing infrastructure, which is extremely common across the world."

Researchers collaborated with clinicians at Monash Health to collect blood samples from people recently infected with COVID-19, as well as samples from healthy individuals sourced before the pandemic emerged.

Tests on 10 clinical blood samples involved incubating patient plasma or serum with red blood cells previously coated with short peptides representing pieces of the SARS-CoV-2 virus.

If the patient sample contained antibodies against SARS-CoV-2, these antibodies would bind to peptides and result in aggregation of the red blood cells. Researchers then used gel cards to separate aggregated cells from free cells, in order to see a line of aggregated cells indicating a positive response. In negative samples, no aggregates in the gel cards were observed.

"We found that by producing bioconjugates of anti-D-IgG and peptides from SARS-CoV-2 spike protein, and immobilising these to RRBCs, selective agglutination in gel cards was observed in the plasma collected from patients recently infected with SARS-CoV-2 in comparison to healthy plasma and negative controls," Professor Gil Garnier, Director of BioPRIA, said.

"Importantly, negative control reactions involving either SARS-CoV-2-negative samples, or RRBCs and SARS-CoV-2-positive samples without bioconjugates, all revealed no agglutination behaviour."

Professor Banaszak Holl, Head of Chemical Engineering at Monash University, commended the work of talented PhD students in BioPRIA and Chemical Engineering who paused their projects to help deliver this game changing COVID-19 test.

"This simple, rapid, and easily scalable approach has immediate application in SARS-CoV-2 serological testing, and is a useful platform for assay development beyond the COVID-19 pandemic. We are indebted to the work of our PhD students in bringing this to life," Professor Banaszak Holl said.

"Funding is required in order to perform full clinical evaluation across many samples and sites. With commercial support, we can begin to manufacture and roll out this assay to the communities that need it. This can take as little as six months depending on the support we receive."

COVID-19 has caused a worldwide viral pandemic, contributing to nearly 600,000 deaths and more than 13.9 million cases reported internationally (figures dated 17 July 2020).

To download a copy of the research, please visit https://doi.org/10.1021/acssensors.0c01050 . To watch a video of this research in action, please visit https://www.youtube.com/watch?v=9WBQUC43u9Q&feature=youtu.be.

Credit: 
Monash University

Study: Five-year review of all Alzheimer's drugs in development shows reason for optimism

image: Dr. Jeffrey Cummings

Image: 
(Lonnie Timmons III/UNLV Photo Services)

Dr. Jeffrey L. Cummings, UNLV research professor and a leading expert on Alzheimer’s disease clinical trials, led a five-year review of all Alzheimer’s drugs in the development pipeline. He says today there is more hope than ever that we'll one day solve Alzheimer’s. 
 
The paper, “Alzheimer's disease drug development pipeline: 2020,” was published this week in the journal Alzheimer's & Dementia: Translational Research & Clinical Interventions.
 
The first time his team published an analysis of experimental drugs in 2014, they quantified a 99 percent failure rate of all therapies in the pipeline. It was the first analysis of its kind and has proven enormously popular with roughly 67,000 downloads and 260 citations. 
 
Currently, there are 121 unique therapies in 136 clinical trials in the pipeline. We recently met with Cummings from the UNLV School of Integrated Health Sciences to understand what has changed in the world of Alzheimer’s drug development over the last five years.

Tell us about this five-year review. What is its primary purpose?

We want to continue to analyze how the ecosystem and process of drug development is working over time, to accelerate getting treatments to patients. I have been involved with clinical trials for the past 20 years, and there have been very few studies showing how the process of drug development works. Getting a treatment to patients is a complicated process involving many entities along the way, including labs, biotech firms, pharmaceutical companies, the federal government, and the marketplace. Scientists, funders and, ultimately, patients want to understand what potential therapies are being developed.

What are the key takeaways over the past five years of the drug development pipeline? 

We have yet to have a successful drug move across the finish line since our original publication in 2014. It is striking that there has been an increase in the number of repurposed agents compared to five years ago. Several drugs approved to treat other conditions are now being tested to see if they work for Alzheimer’s disease. There are several benefits because the drug’s properties and safety issues are already known. 
 
Another thing we learned is that patient participation is critical and that recruitment is too slow, challenging, and expensive. A typical drug trial lasts 18 months, yet the patient recruitment period may take two years, longer than the trial itself takes to show efficacy. The company funding the trial has to pay expenses for the entire duration, which can mean $30 million to $50 million or more per trial.

Does this mean there aren’t any new drugs in the pipeline? 

There are new drugs but not as many as we need to advance toward our goal of meaningful therapy for patients. The expense of developing new drugs is exorbitant. It can be $400 million for an entire development program. Currently, there are an estimated 100 million Americans suffering from at least one brain disorder, including Alzheimer’s, costing the health care system nearly $790 billion annually. Investment in new and innovative approaches is needed from public and private funders to help us solve these problems for millions of people. 
 
Thankfully, the Alzheimer’s Association has stepped in to fill part of the gap. Many funding initiatives — specifically their International Research Grants Program and Part the Cloud — were created to fund much needed novel therapies.

How has the recent discovery of new biomarkers impacted the pipeline?

This is an exciting development that offers hope for successful Alzheimer's treatments. In the past two years, several new biomarkers [a measurement, like a blood test, that reveals what's happening in the body] for Alzheimer’s have been developed and some have been approved for use by the FDA, which means we can offer a more precise clinical trial process. We can now use biomarkers to better define our patient populations. It is something new, precise and powerful.   

Recently, you proposed a new scoring method for the pipeline to determine readiness to move from one phase to the next. What inspired you to do so and what are you hoping to achieve? 

We know there is a 99 percent failure rate of drugs in the pipeline. Translational scoring addresses issues in the testing process with rigorous criteria that can consistently be applied to all the therapies in development. The result is a semi-quantitative rating that shows the flaws in a drug development program early on. This is especially critical information for a funding agency to compare treatments to each other and to understand which drug is least risky. 

What is your team at UNLV focusing on?

The recently established Chambers-Grundy Center for Transformative Neuroscience in the Department of Brain Health is focusing on analyzing clinical trial methods to see which strategies, targets, and biomarkers are succeeding, and how we can use these lessons to get better drugs to our patients faster. For me, this is enormously exciting.

How optimistic does this make you about conquering Alzheimer’s?

We’ve never seen more promise in the pipeline than there is today. The very recent discovery of relevant biomarkers allows us to develop drugs with a precision we’ve never had before. We are going to solve this problem.

What can real people do to help advance the science of Alzheimer’s treatments?

We are working with our patients and families to solve the brain disease they have. We can have success only if they participate in the clinical trials. There is a critical alliance between the scientists, a patient, and their family to accelerate drug development. We need our “citizen scientists” because they are contributing in such an important way to a future without Alzheimer’s disease. 

As a researcher yourself, where do you learn about others’ work?

The upcoming Alzheimer’s Association International Conference is the major information sharing opportunity among Alzheimer's disease researchers across the world each year. It’s an enormously important conference where we share and learn about each other’s work. I am presenting at four sessions — virtually this year given the pandemic — and am looking forward to hearing ideas from new researchers to senior ones like myself. Since it is virtual, it is offered at no cost and open to everyone. 

DOI

10.1002/trc2.12050

Credit: 
University of Nevada, Las Vegas

Researchers discover 2 paths of aging and new insights on promoting healthspan

video: Right, movies of representative mode 1 (top) and mode 2 (bottom) aging cells (encircled) are played sequentially. Green and red fluorescence reflect nucleolar and mitochondrial integrity in the cells during aging. Increase in green fluorescence represents loss of nucleolar integrity whereas decrease in red fluorescence represents loss of mitochondrial integrity. Left, real-time quantification of fluorescence plotted within a 3D aging space, in which z-axis represents the percentage of lifetime. After the aging trajectories of the two representative cells were quantified and plotted, trajectories of a population of isogenic wild type cells were plotted in the space (mode 1: red; mode 2: blue).

Image: 
Hao Lab, UC San Diego

Molecular biologists and bioengineers at the University of California San Diego have unraveled key mechanisms behind the mysteries of aging. They isolated two distinct paths that cells travel during aging and engineered a new way to genetically program these processes to extend lifespan.

The research is described July 17 in the journal Science.

Our lifespans as humans are determined by the aging of our individual cells. To understand whether different cells age at the same rate and by the same cause, the researchers studied aging in the budding yeast Saccharomyces cerevisiae, a tractable model for investigating mechanisms of aging, including the aging paths of skin and stem cells.

The scientists discovered that cells of the same genetic material and within the same environment can age in strikingly distinct ways, their fates unfolding through different molecular and cellular trajectories. Using microfluidics, computer modeling and other techniques, they found that about half of the cells age through a gradual decline in the stability of the nucleolus, a region of nuclear DNA where key components of protein-producing "factories" are synthesized. In contrast, the other half age due to dysfunction of their mitochondria, the energy production units of cells.

The cells embark upon either the nucleolar or the mitochondrial path early in life and follow that "aging route" through decline and death over their entire lifespan. At the heart of this control, the researchers found a master circuit that guides the aging processes.

"To understand how cells make these decisions, we identified the molecular processes underlying each aging route and the connections among them, revealing a molecular circuit that controls cell aging, analogous to electric circuits that control home appliances," said Nan Hao, senior author of the study and an associate professor in the Section of Molecular Biology, Division of Biological Sciences.
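The circuit analogy in the quote above can be pictured with a generic "toggle switch" motif from textbook systems biology: two components that mutually repress each other settle into one of two stable states depending on small initial biases. This is an illustrative sketch only, not the actual circuit identified in the study:

```python
# Generic mutual-repression ("toggle switch") motif, often used to
# illustrate how a cell commits to one of two fates. Illustrative
# only; this is not the circuit reported in the Science paper.
def simulate(x, y, steps=2000, dt=0.01, k=2.0, n=4):
    """Euler-integrate two mutually repressing components x and y."""
    for _ in range(steps):
        dx = k / (1 + y**n) - x  # y represses production of x
        dy = k / (1 + x**n) - y  # x represses production of y
        x, y = x + dx * dt, y + dy * dt
    return x, y

# A small initial bias decides which component dominates (the "fate").
print(simulate(1.0, 0.1))  # x wins: ends high-x, low-y
print(simulate(0.1, 1.0))  # y wins: ends low-x, high-y
```

Because each component suppresses the other, an early lead is self-reinforcing, which is one simple way a single circuit can commit cells to one of two mutually exclusive trajectories.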

Having developed a new model of the aging landscape, Hao and his coauthors found they could manipulate and ultimately optimize the aging process. Computer simulations helped the researchers reprogram the master molecular circuit by modifying its DNA, allowing them to genetically create a novel aging route that features a dramatically extended lifespan.

"Our study raises the possibility of rationally designing gene or chemical-based therapies to reprogram how human cells age, with a goal of effectively delaying human aging and extending human healthspan," said Hao.

The researchers will now test their new model in more complex cells and organisms and eventually in human cells to seek similar aging routes. They also plan to test chemical techniques and evaluate how combinations of therapeutics and drug "cocktails" might guide pathways to longevity.

"Much of the work featured in this paper benefits from a strong interdisciplinary team that was assembled," said Biological Sciences Professor of Molecular Biology Lorraine Pillus, one of the study's coauthors. "One great aspect of the team is that we not only do the modeling but we then do the experimentation to determine whether the model is correct or not. These iterative processes are critical for the work that we are doing."

Credit: 
University of California - San Diego

Oral herpes rates are falling in children

Two-thirds of children and young people reach their first sexual activity unexposed to herpes but risk catching it in adulthood, say researchers

Fewer people are being exposed to herpes simplex type 1 - also known as oral herpes - in their childhood and the prevalence amongst the population in Europe is falling by 1% per year, suggests research published in the journal BMJ Global Health.

The prevalence of the virus, which often manifests itself with cold sores, appears to be declining in younger people but it could be increasingly likely to be transmitted sexually.

Herpes simplex type 1 (HSV-1) is mainly transmitted by oral-to-oral contact during childhood, causing oral herpes, but it can also cause genital herpes. The other form of the virus (HSV-2) is sexually transmitted and causes genital herpes.

Both forms of the virus are lifelong and the World Health Organization estimates there are 3.7 billion people under age 50 (67%) who have HSV-1 infection globally and 491 million people aged 15-49 (13%) worldwide with HSV-2 infection.

Previous research data focused on North America and Europe has suggested that there is a decrease in acquisition of HSV-1 in childhood, a decline in its population prevalence in youth, and an increase in genital herpes cases that are caused by HSV-1.

A team of researchers from Weill Cornell Medicine-Qatar of Cornell University set out to examine the epidemiology of HSV-1 in Europe.

They systematically reviewed HSV-1 related publications, conducted various meta-analyses, assessed pooled prevalence rates in populations, and estimated pooled proportions of HSV-1 viral detection in clinically diagnosed genital ulcer disease and in genital herpes.

Their analysis gathered information from 142 suitable previous publications.

From these publications, they extracted 179 overall population prevalence measures, four overall proportions of HSV-1 in genital ulcer disease, and 64 overall proportions of HSV-1 in genital herpes.

The results showed that more than two-thirds (67.4%) of the population in Europe tested positive for HSV-1, which is far lower than the historical level of universal infection in childhood in other parts of the world, such as Africa. Around 32.5% of children and 74.4% of adults were infected in Europe.

Prevalence in the population increased steadily with age, being lowest in those aged below 20 years and highest in those aged over 50 years.

Population prevalence in Europe was declining by 1% per year, and the contribution of HSV-1 to genital herpes was rising, also by 1% per year.
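As a rough illustration of the reported trend, a 1% annual decline compounds as follows. Whether the decline is relative or absolute is not specified above, so treating it as multiplicative is an assumption made here for the sketch:

```python
# Illustrative projection of a 1%-per-year decline in HSV-1
# seroprevalence, starting from the 67.4% pooled European estimate
# and assuming the decline is multiplicative (relative).
prevalence = 67.4  # % of the European population testing positive
for year in range(10):
    prevalence *= 0.99
print(round(prevalence, 1))  # → 61.0 after 10 years
```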

As many as two-thirds of European children were reaching their first sexual activity unexposed to this infection and were at risk of acquiring the virus sexually in adulthood, said the researchers.

They speculated that reasons for falling prevalence rates of HSV-1 could include a general decrease in both family size and school crowding, as well as improved hygiene and living conditions.

The results also showed that half of first episode genital herpes cases in Europe were already due to HSV-1, as opposed to HSV-2 infection.

The authors acknowledged that their systematic review had some limitations, primarily the unavailability of data for 25 of 53 European countries, and had comparatively less data for genital ulcer disease and genital herpes than population prevalence.

Nevertheless, these limitations did not appear to have posed a barrier to the interpretation of the results of the study, they said.

They conclude: "HSV-1 epidemiology in Europe is in transition and shifting away from its historical pattern of oral acquisition in childhood.

"HSV-1 transition in Europe is leading to more heterogeneous and variable transmission by age and geography, and an increasing role for HSV-1 in genital herpes and as a sexually transmitted disease.

"The findings highlight the importance of disease surveillance and monitoring of HSV-1 seroprevalence and genital herpes aetiology, and strengthen the case for an HSV-1 vaccine to limit transmission."

Credit: 
BMJ Group

Difference between cystatin C- and creatinine-based eGFRs contains clinical information

Over one in four adults over the age of 65 may have chronic kidney disease (CKD), defined as reduced kidney function. While serum creatinine remains the most practical biomarker for estimating kidney function because it is part of standard metabolic panels, other biomarkers such as cystatin C may be more informative; unlike creatinine, cystatin C is not influenced by muscle mass or function. As people age, muscle mass declines, which can bias creatinine-based estimates of kidney function and also contribute to frailty and falls. Dr. Potok and colleagues examined the difference in estimated kidney function based on cystatin C and serum creatinine in 9,029 adults with a mean age of 68 ± 9 years enrolled in the Systolic Blood Pressure Intervention Trial. The difference between kidney function estimated with the two markers (cystatin C-based minus creatinine-based kidney function) was associated with lower odds of frailty, falls, and cardiovascular disease events. These findings underscore that muscle mass and function influence serum creatinine values but not serum cystatin C values. The investigators state that the difference between cystatin C- and creatinine-based kidney function could be used to predict adverse outcomes in older adults.

Credit: 
National Kidney Foundation

Megaphages harbor mini-Cas proteins ideal for gene editing

video: Postdoc Patrick Pausch and doctoral student Basem Al-Shayed discuss a new gene-editing protein, CasΦ, which was discovered in a virus that attacks bacteria. Because it is very small and compact, the novel Cas protein should be easier to deliver to cells by a viral vector to alter plants or cure disease.

Image: 
UC Berkeley video by Roxanne Makasdjian

The DNA-cutting proteins central to CRISPR-Cas9 and related gene-editing tools originally came from bacteria, but a newfound variety of Cas proteins apparently evolved in viruses that infect bacteria.

The new Cas proteins were found in the largest known bacteria-infecting viruses, called bacteriophages, and are the most compact working Cas variants yet discovered -- half the size of today's workhorse, Cas9.

Smaller and more compact Cas proteins are easier to ferry into cells to do genome editing, since they can be packed into small delivery vehicles, including one of the most popular: a deactivated virus called adeno-associated virus (AAV). Hypercompact Cas proteins also leave space inside AAV for additional cargo.

As one of the smallest Cas proteins known to date, the newly discovered CasΦ (Cas-phi) has advantages over current genome-editing tools when they must be delivered into cells to manipulate crop genes or cure human disease.

"Adenoviruses are the perfect Trojan horse for delivering gene editors: You can easily program the viruses to reach almost any part in the body," said Patrick Pausch, a postdoctoral fellow at the University of California, Berkeley, and in UC Berkeley's Innovative Genomics Institute (IGI), a joint UC Berkeley/UCSF research group devoted to discovering and studying novel tools for gene editing in agriculture and human diseases. "But you can only pack a really small Cas9 into such a virus to deliver it. If you would have other CRISPR-Cas systems that are really compact, compared to Cas9, that gives you enough space for additional elements: different proteins fused to the Cas protein, DNA repair templates or other factors that regulate the Cas protein and control the gene editing outcome."

Apparently these "megaphages" use the CasΦ protein -- the Greek letter Φ, or phi, is shorthand for bacteriophage -- to trick bacteria into fighting off rival viruses rather than the megaphage itself.

"The thing that actually made me interested in studying this protein specifically is that all the known CRISPR-Cas systems were originally discovered in bacteria and Archaea to fend off viruses, but this was the only time where a completely new type of CRISPR-Cas system was first found, and so far only found, in viral genomes," said Basem Al-Shayeb, a doctoral student in the IGI. "That made us think about what could be different about this protein, and with that came a lot of interesting properties that we then found in the lab."

Among these properties: CasΦ evolved to be streamlined, combining several functions in one protein, so that it can dispense with half the protein segments of Cas9. It is as selective in targeting specific regions of DNA as the original bacterial Cas9 enzyme, and just as efficient, and it works in bacterial, animal and plant cells, making it a promising, broadly applicable gene editor.

"This study shows that this virus-encoded CRISPR-Cas protein is actually very good at what it does, but it is a lot smaller, about half the size of Cas9," said IGI executive director Jennifer Doudna, a UC Berkeley professor of molecular and cell biology and of chemistry and a Howard Hughes Medical Institute investigator. "That matters, because it might make it a lot easier to deliver it into cells than what we are finding with Cas9. When we think about how CRISPR will be applied in the future, that is really one of the most important bottlenecks to the field right now: delivery. We think this very tiny virus-encoded CRISPR-Cas system may be one way to break through that barrier."

Pausch and Al-Shayeb are first authors of a paper describing CasΦ that will appear this week in the journal Science.

Biggiephages carry their own Cas proteins

The CasΦ protein was first discovered last year by Al-Shayeb in the laboratory of Jill Banfield, a UC Berkeley professor of earth and planetary science and of environmental science, policy and management. The megaphages containing CasΦ were part of a group the researchers dubbed Biggiephage and were found in a variety of environments, from vernal pools and water-saturated forest floors to cow manure lagoons.

"We use metagenomic sequencing to discover the Bacteria, Archaea and viruses in many different environments and then explore their gene inventories to understand how the organisms function independently and in combination within their communities," Banfield said. "CRISPR-Cas systems on phage are a particularly interesting aspect of the interplay between viruses and their hosts."

While metagenomics allowed the researchers to isolate the gene coding for CasΦ, its sequence told them only that it was a Cas protein in the Type V family, though evolutionarily distant from other Type V Cas proteins, such as Cas12a, CasX (Cas12e) and Cas14. They had no idea whether it was functional as an immune system against foreign DNA. The current study showed that, similar to Cas9, CasΦ targets and cleaves foreign genomes in bacterial cells, as well as double-stranded DNA in human embryonic kidney cells and cells of the plant Arabidopsis thaliana. It also can target a broader range of DNA sequences than can Cas9.

The ability of CasΦ to cut double-stranded DNA is a big plus. All other compact Cas proteins preferentially cut single-stranded DNA. So, while they may fit neatly into compact delivery systems like AAV, they are much less useful when editing DNA, which is double-stranded, inside cells.

As was the case after Cas9's gene-editing prowess was first recognized in 2012, there is a lot of room for optimizing CasΦ for gene editing and discovering the best rules for designing guide RNAs to target specific genes, Pausch said.

Credit: 
University of California - Berkeley

Phantom-limb pain reduced through brain power

image: During training, MEG signals were recorded while patients watched the images of the phantom hand that was controlled by real-time decoding of the recorded MEG signals.

Image: 
Osaka University

Osaka, Japan — Phantom-limb pain is as mysterious as the name implies. The vast majority of amputees experience "phantom-limb" sensations that make them feel their missing limb is still part of their body. The cause is still unknown, and in 50% to 80% of cases the sensations are painful. With no established treatments or medication, phantom-limb pain can have a large impact on amputees' quality of life and recovery.

Although the cause is unknown, one theory is that it happens when areas of the brain that used to control the amputated limb remain strongly connected to the mental image of the limb. To weaken this connection, one idea is to train the brain regions that control the intact limb to also control the phantom limb. Takufumi Yanagisawa and his team at Osaka University hypothesized that the key to accomplishing this was to do it unconsciously.

"It is very difficult to intentionally activate the part of your brain that controls your right hand without actually thinking about moving that hand," explains Yanagisawa. "Instead, we designed a system in which the patients did not even know they were using those parts of their brains."

In order to train the brains of patients with phantom-hand pain, the group used a brain-computer-interface. First they recorded brain activity when patients opened and closed their intact hands and used the pattern of brain activity as a template. Then they continuously recorded brain activity related to the intact hands, but asked the patients to try to control a virtual hand with their phantom hand. For half the experiments, this training was real; the recorded brain activity was decoded based on the template and the image of the opening/closing virtual hand was adjusted accordingly. For the other half, the images of the virtual hand were randomly adjusted with no connection to brain activity. All patients thought they were actually controlling the virtual hand. Patients trained for about 30 min/day for 3 days, and after each session they rated the intensity of their phantom-limb pain.
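The decoding step described above can be pictured with a minimal template-matching sketch: activity recorded during known movements is averaged into a template, and new activity is classified by its correlation with each template. The study's actual MEG decoder is more sophisticated; `make_template` and `decode` here are hypothetical helpers for illustration only:

```python
import numpy as np

def make_template(trials):
    """Average the activity vectors recorded during a known movement."""
    return np.mean(trials, axis=0)

def decode(activity, template_open, template_close):
    """Classify new activity by its correlation with each template."""
    r_open = np.corrcoef(activity, template_open)[0, 1]
    r_close = np.corrcoef(activity, template_close)[0, 1]
    return "open" if r_open > r_close else "close"

# Synthetic example: two fake recordings per movement type.
open_trials = np.array([[1.0, 0.0, 1.0, 0.0], [0.9, 0.1, 1.1, 0.0]])
close_trials = np.array([[0.0, 1.0, 0.0, 1.0], [0.1, 0.9, 0.0, 1.1]])
t_open, t_close = make_template(open_trials), make_template(close_trials)
print(decode(np.array([1.0, 0.0, 1.0, 0.1]), t_open, t_close))  # → open
```

In the sham condition described above, the virtual hand would simply ignore the decoder's output, which is why patients could not tell real from sham training.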

The team found that pain was reduced by 30% even on the first day of training and the effect lasted up to five days after training was complete. Importantly, only patients who received real training reported less phantom pain. They also found that after training, the mental image of the phantom hand was weakened in the brain regions that once controlled the amputated hand.

"These findings are promising," says Yanagisawa, "especially given that alternatives like mirror training require a month of training to have the same effect. However, in order for this treatment to become truly practical, the cost must be reduced."

Credit: 
Osaka University

New study provides evidence for decades-old theory to explain the odd behaviors of water

image: Fig A (left): Using two distinct computer simulations of water (top and bottom panels), researchers detected swings in density characteristic of supercooled water oscillating between two liquid phases that differ by density. Fig B (right): The simulations revealed a critical point between the two liquid phases, which are of different densities due to the presence of an extra water molecule in the high-density liquid.

Image: 
Reprinted with permission from PG Debenedetti et al, Science Vol 369 Issue 6501, DOI:10.1126/science.abb9796

Water, so ordinary and so essential to life, acts in ways that are quite puzzling to scientists. For example, why is ice less dense than water, floating rather than sinking the way other liquids do when they freeze?

Now a new study provides strong evidence for a controversial theory that at very cold temperatures water can exist in two distinct liquid forms, one being less dense and more structured than the other.

Researchers at Princeton University and Sapienza University of Rome conducted computer simulations of water molecules to discover the critical point at which one liquid phase transforms into the other. The study was published this week in the journal Science.

"The presence of the critical point provides a very simple explanation for water's oddities," said Princeton's Dean for Research Pablo Debenedetti, the Class of 1950 Professor in Engineering and Applied Science, and professor of chemical and biological engineering. "The finding of the critical point is equivalent to finding a good, simple explanation for the many things that make water odd, especially at low temperatures."

Water's oddities include that as water cools, it expands rather than contracting, which is why frozen water is less dense than liquid water. Water also becomes more squeezable -- or compressible -- at lower temperatures. There are also at least 17 ways in which its molecules can arrange when frozen.

A critical point is a unique value of temperature and pressure at which two phases of matter become indistinguishable, and it occurs just prior to matter transforming from one phase into the other.

Water's oddities are easily explained by the presence of a critical point, Debenedetti said. The presence of a critical point is felt on the properties of the substance quite far away from the critical point itself. At the critical point, the compressibility and other thermodynamic measures of how the molecules behave, such as the heat capacity, are infinite.

Using two different computational methods and two highly realistic computer models of water, the team identified the liquid-liquid critical point as lying in a range of about 190 to 170 kelvins (about -117 to -153 degrees Fahrenheit) at about 2,000 times the atmospheric pressure at sea level.
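For readers who want to check the unit conversion, the reported range follows from the standard kelvin-to-Fahrenheit formula:

```python
def kelvin_to_fahrenheit(k):
    """Convert a temperature from kelvins to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

for k in (190, 170):
    print(f"{k} K = {kelvin_to_fahrenheit(k):.1f} °F")
# → 190 K = -117.7 °F
# → 170 K = -153.7 °F
```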

The detection of the critical point is a satisfying step for researchers involved in the decades-old quest to determine the underlying physical explanation for water's unusual properties. Several decades ago, physicists theorized that cooling water to temperatures below its freezing point while maintaining it as a liquid -- a "supercooled" state that occurs in high-altitude clouds -- would expose water's two unique liquid forms at sufficiently high pressures.

To test the theory, researchers turned to computer simulations. Experiments with real-life water molecules have not so far provided unambiguous evidence of a critical point, in part due to the tendency for supercooled water to rapidly freeze into ice.

Francesco Sciortino, a professor of physics at the Sapienza University of Rome, conducted one of the first such modeling studies while a postdoctoral researcher in 1992. That study, published in the journal Nature, was the first to suggest the existence of a critical point between the two liquid forms.

The new finding is extremely satisfying for Sciortino, who is also a co-author of the new study in Science. The new study used today's much faster and more powerful research computers and newer and more accurate models of water. Even with today's powerful research computers, the simulations took roughly 1.5 years of computation time.

"You can imagine the joy when we started to see the critical fluctuations exactly behaving the way they were supposed to," Sciortino said. "Now I can sleep well, because after 25 years, my original idea has been confirmed."

In the case of the two liquid forms of water, the two phases coexist in uneasy equilibrium at temperatures below freezing and at sufficiently high pressures. As the temperature dips, the two liquid phases engage in a tug of war until one wins out and the entire liquid becomes low-density.

In the simulations performed by postdoctoral researcher Gül Zerze at Princeton and Sciortino in Rome, as they brought down the temperature well below freezing into the supercooled range, the density of water fluctuated wildly just as predicted.

Some of the odd behaviors of water are likely to be behind water's life-giving properties, Zerze said. "The fluid of life is water, but we still don't know exactly why water is not replaceable by another liquid. We think the reason has to do with the abnormal behavior of water. Other liquids don't show those behaviors, so this must be linked to water as the liquid of life."

The two phases of water occur because the water molecule's shape can lead to two ways of packing together. In the lower density liquid, four molecules cluster around a central fifth molecule in a geometric shape called a tetrahedron. In the higher density liquid, a sixth molecule squeezes in, which has the effect of increasing the local density.

The team detected the critical point in two different computer models of water. For each model, the researchers subjected the water molecules to two different computational approaches to looking for the critical point. Both approaches yielded the finding of a critical point.

Peter Poole, a professor of physics at St. Francis Xavier University in Canada, who was a graduate student when he collaborated with Sciortino on the 1992 Nature paper, said the result was satisfying. "It's very comforting to have this new result," he said. "It's been a long and sometimes lonely wait since 1992 to see another unambiguous case of a liquid-liquid phase transition in a realistic water model."

C. Austen Angell, Regents Professor at Arizona State University, is one of the pioneers of experiments in the 1970s on the nature of supercooled water. "No doubt that this is a heroic effort in the simulation of water physics with a very interesting, and welcome, conclusion," said Angell, who was not involved in the present study, in an email. "As an experimentalist with access to equilibrium (long-term) physical measurements on real water, I had always felt 'safe' from preemption by computer simulators. But the data presented in the new paper shows that this is no longer true."

Credit: 
Princeton University

The Lancet Public Health: Speed of testing is most critical factor in the success of contact tracing strategies to slow COVID-19 transmission

* Even if all contacts are successfully traced, a delay of three days or more between symptom onset and testing will not reduce onward transmission of the virus sufficiently to control further spread, according to modelling study

* In the best-case scenario, with zero delays and at least 80% of contacts traced, the R number is reduced from 1.2 to around 0.8, and 80% of onward transmission per person diagnosed could be prevented

* For conventional contact tracing to work, test results need to be delivered within a day of an individual developing symptoms

* Mobile apps can speed up contact tracing and keep R below 1, even if only 20% of the population use them

Speed of contact tracing strategies is essential to slowing COVID-19 transmission, according to a mathematical modelling study in The Lancet Public Health journal which models the effectiveness of conventional and app-based strategies on community transmission of the virus.

If COVID-19 testing is delayed by three days or more after a person develops symptoms, even the most efficient contact tracing strategy cannot reduce onward transmission of the virus.

The researchers say improving access to COVID-19 testing, combined with digital tools that minimise tracing delays, will be key to the success of a contact tracing approach to reducing spread of the virus.

Professor Mirjam Kretzschmar, one of the lead authors of the study, from the University of Utrecht, the Netherlands, said: "This study reinforces findings from other modelling studies, showing that contact tracing can be an effective intervention to prevent spread of the SARS-CoV-2 virus, but only if the proportion of contacts traced is high and the process is fast. Our study builds on this to show, in detail, what role each step in the process plays in making this approach successful. This will help policy makers understand where best to prioritise resources to maximise the chances of success. For example, we found that mobile apps can speed up the process of tracking down people who are potentially infected, but if testing is delayed by three days or more even these technologies can't stop transmission of the virus." [1]

Contact tracing involves tracking down all of the people who have been in contact with an infected individual so they can be isolated to prevent further spread of the virus. This approach is an established public health measure recommended by the World Health Organisation as a potential exit strategy to enable the alleviation of COVID-19 lockdown measures.

Conventional contact tracing methods involve a public health professional contacting the infected person and asking them to recall everyone they have been in contact with over a defined period before the onset of symptoms. Several countries have introduced mobile apps to speed up this process, by automatically alerting people who have been in proximity to the infected person using data from their mobile device.

To be successful, contact tracing measures must keep the rate of transmission of the virus, known as the reproduction number or R number, below 1. This means that, on average, each infected person must pass the virus on to fewer than one other individual.

In the new study, the researchers used a mathematical model that reflects the various steps and delays in the contact tracing process. This enabled them to quantify how such delays affect the R number and the fraction of onward transmission cases that can be prevented for each diagnosed person.

The model assumes that around 40% of virus transmission occurs before a person develops symptoms. In the absence of any strategies to mitigate the spread of the virus, each infected person will transmit the virus to an average of 2.5 people. Introducing physical distancing alone, assuming that close contacts are reduced by 40% and casual contacts by 70%, will reduce the reproduction number to 1.2.
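The distancing figures above can be reproduced with a simple weighted average, under one added assumption not stated in the article: that close contacts account for about 60% of transmission, a value chosen here because it reproduces the reported R of 1.2.

```python
# Baseline reproduction number with no interventions (from the model).
R0 = 2.5

# Physical-distancing assumptions stated in the article:
# close contacts reduced by 40%, casual contacts by 70%.
close_reduction, casual_reduction = 0.40, 0.70

# The split of transmission between close and casual contacts is not
# given in the article; 60% close is an illustrative assumption.
close_share = 0.60

R_distancing = R0 * (close_share * (1 - close_reduction)
                     + (1 - close_share) * (1 - casual_reduction))
print(round(R_distancing, 2))  # → 1.2
```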

In the best case scenario, the model predicts that contact tracing could reduce the number of people a person with COVID-19 passes the virus on to from 1.2 to 0.8. For this to work, at least 80% of people who are eligible must be tested, there must be no delays in testing after the onset of symptoms and at least 80% of contacts must be identified on the same day as the test results are received.

If testing is delayed by two days, keeping the R number below 1 would require contacts to be traced within a day and at least 80% of contacts must be identified, the model predicts.

The model assumes that conventional contact tracing takes a minimum of three days and is less efficient at tracking down contacts than mobile app technologies, which are assumed to be instantaneous.

The findings predict that conventional contact tracing will only work to keep the R number below 1 if people with COVID-19 receive a positive test result on the same day they develop symptoms of the virus.

Contact tracing based on mobile app technology can accommodate a delay in testing of up to 2 days and keep the R number below 1, as long as at least 80% of contacts are tracked down. In this case, the number of people infected from those contacts would be reduced by half.

Once testing is delayed by three days or more, even a perfect system that traces 100% of contacts with no delays cannot bring the R number below 1, according to the model.

Overall, the study found that reducing the time between a person developing symptoms and receiving a positive test result is the most important factor for improving contact tracing effectiveness.
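The importance of the testing delay can be seen with a toy infectiousness profile. The 40% presymptomatic share comes from the model's assumptions; the day-by-day split below is made up here purely for illustration and is not the authors' profile:

```python
# Toy infectiousness profile by day relative to symptom onset.
# 40% of transmission is presymptomatic (days -2 and -1), as in the
# model; the exact daily weights are assumptions for illustration.
profile = {-2: 0.15, -1: 0.25, 0: 0.25, 1: 0.15, 2: 0.10, 3: 0.06, 4: 0.04}

def fraction_prevented(isolation_delay):
    """Share of a case's onward transmission averted if they isolate
    `isolation_delay` days after symptom onset."""
    return sum(w for day, w in profile.items() if day >= isolation_delay)

print(round(fraction_prevented(0), 2))  # 0.6  (isolation at symptom onset)
print(round(fraction_prevented(3), 2))  # 0.1  (three-day delay)
```

Under this assumed profile, a three-day delay leaves only a small fraction of transmission preventable by isolation, mirroring the study's finding that long testing delays make even perfect tracing insufficient.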

Professor Marc Bonten, one of the lead authors of the study, from the University of Utrecht, the Netherlands, said: "In our model, minimising testing delays had the largest impact on reducing transmission of the virus and testing infrastructure is therefore the most critical factor for the success of a contact tracing system. This means that as many infectious people as possible need to be tested, and policymakers might consider lowering the eligibility threshold for access to testing. This will lead to a large proportion of negative test results, however, and future studies should focus on identifying the optimal balance between the proportion of negative tests and the effectiveness of contact tracing." [1]

The authors note that their model does not take into account the age structure of the population. This might influence the proportion of asymptomatic cases, as these are more common in younger people and children, and might also influence mobile app usage. The model also does not account for infections acquired in hospitals and other healthcare settings, such as care homes.

Writing in a linked Comment Article, Professor Louise Ivers and Daniel J. Weitzner, who were not involved in the study, highlight four crucial questions that remain to be investigated. Firstly, an assessment is needed of how well smartphones measure proximity. Secondly, a better understanding is needed of how mobile apps will integrate with overall contact tracing programmes. The Comment authors also call for further research into what factors will encourage users to trust the privacy and security properties of mobile apps. Finally, they highlight the potential for conventional and digital contact tracing strategies to perpetuate health disparities, and call for further evaluation to prevent this.

Professor Ivers, of Harvard Medical School, and Daniel J. Weitzner, from the Massachusetts Institute of Technology, USA, said: "As contact tracing remains a crucial component of the COVID-19 response, mobile apps offer promise, especially when considering the speed and scale required for tracing to be effective--as highlighted in Kretzschmar and colleagues' study. However, understanding the potential impact of apps as part of a comprehensive integrated approach requires more evaluation of their use in real life and multidisciplinary engagement of technologists, epidemiologists, public health experts, and the public."

Credit: 
The Lancet