
More and safer heart transplants could become possible with new heart box

A donated heart can now be transported and preserved for longer than previously possible. The new method, built around a specially designed heart box, was used for a transplant for the first time as early as the summer of 2017. It has now been evaluated in a first clinical study, and the results are published in Nature Communications.

The results from the study show clear signs of improvement when the new method is used.

"Although it is too early to draw far-reaching conclusions, we are obviously hopeful. If our results continue to be positive, the method could make a big difference for patients needing a heart transplant", says Johan Nilsson, cardiothoracic surgeon at Skåne University Hospital and professor at Lund University in Sweden.

The method involves preserving and transporting the donated heart in a specially designed heart box, where it is supplied with important substances in an oxygenated, blood-mixed solution. One of the advantages of the box is that it enables a longer preservation time, which in turn could mean that the donated heart could be transported farther than is currently possible.

Six of the patients in the study received heart transplants in which the heart was preserved and transported with the new method. Twenty-five patients made up the control group, and these patients received transplants using the traditional preservation method.

"None of the six patients who received their new heart with the new method have suffered complications; however, we have seen complications in some of the patients in the control group", says Johan Nilsson.

Examples of complications are heart failure, acute rejection or death of the patient.

The hope is that in the future, hearts can be preserved in the specially designed box for up to 12 hours. A few years ago, tests were carried out with pig hearts that, using the new method, survived a total of 24 hours outside the body. However, the current study tested a maximum of four and a half hours - in comparison, current methods have a four-hour time span.

"Based on the results we have produced so far, we cannot claim with certainty that we are able to preserve a heart for 12 hours; however, our study shows that the new method enables longer preservation than is possible today. If, in our continued research, we can establish that this is possible, it would mean entirely new opportunities for transplants. It would, for example, be possible to fly hearts between countries, which we currently cannot do", says Johan Nilsson.

"The plan now is that we will shortly test the new method with a group of 33 patients. In addition to the six patients who participated in the study, we have also used the method in an additional nine transplants. And it still looks promising", he concludes.

Credit: 
Lund University

Low physical function increases the risk of bone loss in older hip fracture patients

image: Older hip fracture patients with low physical function and lower muscle mass may be at risk for greater bone loss during the first post-fracture year.

Image: 
University of Jyväskylä.

Low physical function and low muscle mass after hip fracture increased the risk for accelerated bone deterioration in older hip fracture patients. Acknowledgement of the risk factors is important for bone health and overall recovery.

"Substantial decrements in physical function, muscle and bone strength occur after hip fracture, which markedly increase the risk for a subsequent fracture," says Tuuli Suominen, a PhD student at the Gerontology Research Center, Faculty of Sport and Health Sciences at the University of Jyväskylä.

Part of the bone loss is presumably caused by disuse, but the contributing factors have not been well characterized.

"A strong relationship exists between bone and muscle," adds Suominen. "A low level of physical function may prevent effective loading of the bones and could be related to reduced bone-loading physical activity. Moreover, in older, often frail and undernourished hip fracture patients, higher muscle mass may also reflect better resources to cope with a prolonged catabolic state and the hip fracture-related stresses."

Associations of physical function and muscle mass with accelerated post-hip fracture bone loss were examined in a study by the Gerontology Research Center and the Faculty of Sport and Health Sciences at the University of Jyväskylä, Finland. A total of 81 independently living men and women over the age of 60 who had been operated on for a hip fracture participated in the study.

Tibial bone properties were examined by computed tomography at baseline (on average 10 weeks after fracture) and after 12 months. Physical function at baseline was measured with perceived difficulty in walking outdoors and with the Short Physical Performance Battery (SPPB), which includes habitual walking speed, chair rise and standing balance tests. Lean body mass (LBM), assessed with bioelectrical impedance, was used as a measure of muscle mass.
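For reference, the SPPB combines its three subtests, each scored from 0 to 4, into a 0-12 total, with higher scores indicating better lower-extremity function. A minimal sketch of that scoring convention (general to the instrument, not specific to this study's data):

```python
def sppb_total(balance: int, gait_speed: int, chair_rise: int) -> int:
    """Sum the three SPPB subtest scores (each 0-4) into a 0-12 total.

    Higher totals indicate better lower-extremity function; totals of 9
    or below are often used to flag mobility limitation (a common
    convention, not a threshold taken from this study).
    """
    for score in (balance, gait_speed, chair_rise):
        if not 0 <= score <= 4:
            raise ValueError("each SPPB subtest is scored 0-4")
    return balance + gait_speed + chair_rise

print(sppb_total(4, 3, 2))  # 9
```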

Tibial bone density and strength declined during the year after the fracture, in both the fractured and the non-fractured leg. A lower SPPB score, difficulty in walking outdoors and lower lean body mass predicted a greater decline in bone density in both legs. A lower SPPB score and difficulty in walking outdoors were also associated with a greater decline in bone strength in both legs.

Older hip fracture patients with low physical function and lower muscle mass may be at risk for greater bone loss during the first post-fracture year. Acknowledgement of the risk factors could assist in developing interventions and care to promote bone health and overall recovery. Attention should be paid to physical function, muscle mass preservation and fall prevention before as well as after fracture occurrence.

The study was part of a larger research program aiming at promoting mobility recovery after hip fracture. The study was funded by the Ministry of Education and Culture and the Social Insurance Institution of Finland.

Credit: 
University of Jyväskylä - Jyväskylän yliopisto

AI reduces 'communication gap' for nonverbal people by as much as half

Researchers have used artificial intelligence to reduce the 'communication gap' for nonverbal people with motor disabilities who rely on computers to converse with others.

The team, from the University of Cambridge and the University of Dundee, developed a new context-aware method that reduces this communication gap by eliminating between 50% and 96% of the keystrokes the person has to type to communicate.

The system is specifically tailored for nonverbal people and uses a range of context 'clues' - such as the user's location, the time of day or the identity of the user's speaking partner - to assist in suggesting the sentences that are most relevant for the user.

Nonverbal people with motor disabilities often use a computer with speech output to communicate with others. However, even without a physical disability that affects the typing process, these communication aids are too slow and error prone for meaningful conversation: typical typing rates are between five and 20 words per minute, while a typical speaking rate is in the range of 100 to 140 words per minute.

"This difference in communication rates is referred to as the communication gap," said Professor Per Ola Kristensson from Cambridge's Department of Engineering, the study's lead author. "The gap is typically between 80 and 135 words per minute and affects the quality of everyday interactions for people who rely on computers to communicate."

The method developed by Kristensson and his colleagues uses artificial intelligence to allow a user to quickly retrieve sentences they have typed in the past. Prior research has shown that people who rely on speech synthesis, just like everyone else, tend to reuse many of the same phrases and sentences in everyday conversation. However, retrieving these phrases and sentences is a time-consuming process for users of existing speech synthesis technologies, further slowing down the flow of conversation.

In the new system, as the person is typing, the system uses information retrieval algorithms to automatically retrieve the most relevant previous sentences based on the text typed and the context of the conversation the person is involved in. Context includes information about the conversation such as the location, the time of day, and automatic identification of the speaking partner's face. The other speaker is identified using a computer vision algorithm trained to recognise human faces from a front-mounted camera.

The system was developed using design engineering methods typically used for jet engines or medical devices. The researchers first identified the critical functions of the system, such as the word auto-complete function and the sentence retrieval function. After these functions had been identified, the researchers simulated a nonverbal person typing a large set of sentences from a sentence set representative of the type of text a nonverbal person would like to communicate.

This analysis allowed the researchers to understand the best method for retrieving sentences and the impact of a range of parameters on performance, such as the accuracy of word auto-complete and the impact of using many context tags. For example, the analysis revealed that only two reasonably accurate context tags are required to provide the majority of the gain. Word auto-complete makes a positive contribution but is not essential for realising the majority of the gain. The sentences are retrieved using information retrieval algorithms, similar to web search. Context tags are added to the words the user types to form a query.
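The retrieval step described above can be sketched in miniature. This is an illustrative toy, not the authors' implementation: the stored sentences, the context tags and the overlap scoring are all invented, standing in for the real information retrieval algorithms and context recognisers:

```python
from collections import Counter

# Previously spoken sentences, indexed alongside context tags
# (location, time of day). All entries here are invented examples.
history = [
    ("could i have a coffee please", {"cafe", "morning"}),
    ("the meeting is at ten",        {"office", "morning"}),
    ("thanks for visiting me",       {"home", "evening"}),
]

def score(typed, context, sentence, tags):
    """Bag-of-words overlap between the query (typed words plus current
    context tags) and a stored sentence plus its tags."""
    query = Counter(typed.split()) + Counter(context)
    doc = Counter(sentence.split()) + Counter(tags)
    return sum(min(query[w], doc[w]) for w in query)

def retrieve(typed, context, k=1):
    """Return the k stored sentences that best match the typed prefix
    and the current context."""
    ranked = sorted(history, key=lambda entry: -score(typed, context, *entry))
    return [sentence for sentence, _ in ranked[:k]]

# Typing just "could" in a cafe in the morning surfaces the full sentence:
print(retrieve("could", {"cafe", "morning"}))
```

In this toy, the context tags are simply appended to the typed words to form the query, mirroring the sentence "Context tags are added to the words the user types to form a query" above; the real system ranks with proper information retrieval algorithms rather than raw word overlap.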

The study is the first to integrate context-aware information retrieval with speech-generating devices for people with motor disabilities, demonstrating how context-sensitive artificial intelligence can improve the lives of people with motor disabilities.

"This method gives us hope for more innovative AI-infused systems to help people with motor disabilities to communicate in the future," said Kristensson. "We've shown it's possible to reduce the opportunity cost of not doing innovative research with AI-infused user interfaces that challenge traditional user interface design mantra and processes."

Credit: 
University of Cambridge

3D X-ray reveals secrets from inside bones

An international research team has used new X-ray techniques to describe how the architecture of healthy human bones is built up. The team has uncovered a hitherto unknown structure in healthy bones.

Human bone is a remarkable biological material. Bone tissue is highly specialised, with a structure optimised for specific functions in the body. Healthy bones are strong, have a high load-bearing capacity, and are hard to break.

The internal structure of bones is of great international interest to researchers, and a better understanding of the fundamental biomaterial structures would make it possible to prevent various bone diseases. It could also facilitate the development of completely new materials, with unprecedented properties. However, the structure of the bones is simply too complex for us to be able to come close to imitating it.

An international team of researchers from Aarhus University in Denmark, the European Synchrotron Radiation Facility (ESRF) in France, Chalmers University of Technology in Sweden and the Paul Scherrer Institute in Switzerland has now uncovered a previously unknown substructure in bone tissue by means of a new X-ray technique. The discovery and the technique open up new approaches to studying the underlying architecture of bone tissue and to building a better understanding of biomaterials.

The study is presented in the scientific journal Science Advances.

3D image of the crystals in bones

Cutting into a bone reveals that the inner architecture of healthy bone tissue is built from essentially two components: collagen fibrils, which are primarily made up of protein and form a microscopic, thread-like structure, woven together with nanocrystals of calcium-containing minerals. The collagen fibrils account for much of the load-bearing capacity behind bone's mechanical properties.

Together, the two components form a twisted hierarchical structure that combines the fibrils' ability to withstand stretching and bending with the hardness and resilience of the nanocrystals. It is this twisted structure that gives bones their mechanical properties, and which researchers have been trying to understand for many years.

"The challenge until now has been that we have no method to demonstrate the orientation of the nanocrystals in the bone tissue," explains Associate Professor Henrik Birkedal from iNANO and the Department of Chemistry at Aarhus University.

The international team has succeeded in finding the solution by improving the X-ray technique known as tensor tomography, and by creating an accurate 3D map of the crystals in the tissue.

"In recent years, significant technological and scientific progress has made this new method possible. By means of more powerful synchrotron radiation, it is possible to improve the method, and to challenge the previous assumption about bone tissue," explains Manfred Burghammer from the research facility ID13 at the ESRF, who has been the research director of the project together with Henrik Birkedal.

The improved method makes it possible to see how the nanocrystals are actually located in the structure. This has already revealed a disparity with previous knowledge about bones that has been built up through many years of research. The bone structure is not uniformly structured as previously assumed, because there are deviations in the orientation of the nanocrystals.

"Frankly, we were a little shocked to find the deviation from the models," says Henrik Birkedal. "It's been a really cross-disciplinary, international collaboration with participants from physics, chemistry and health sciences, and we were all pleasantly surprised by the discovery."

New knowledge with unknown significance

The new 3D images surprised the research group, because they conflict with fundamental theories that bones are built up in a predominantly uniform hierarchical structure.

"Admittedly, it's too early to give an unambiguous explanation of what hides behind the deviation we have demonstrated, but it has given science a new method of looking into the underlying structure of bones," says Tilman Grünewald from the ESRF.

The discovery potentially calls into question a number of fundamental models of bone tissue and of the mechanical properties of bones that have been used, among other things, to describe the process of bone formation.

"Bones and other biomaterials, like seashells, have mechanical and structural characteristics that are closely linked to their structure. The better we understand this, the closer we can get to being able to imitate nature's building methods, for example. Our study has given us a new tool to reveal a few more of the secrets of nature, and this work is now underway," says Henrik Birkedal.

Credit: 
Aarhus University

Multi-ethnic study suggests vitamin K may offer protective health benefits in older age

BOSTON (June 15, 2020)-- A new, multi-ethnic study found older adults with low vitamin K levels were more likely to die within 13 years compared to those whose vitamin K levels were adequate. The results suggest vitamin K, a nutrient found in leafy greens and vegetable oils, may have protective health benefits as we age, according to the researchers.

The meta-analysis, involving nearly 4,000 Americans aged 54-76, one-third of whom were non-white, was led by researchers at the Jean Mayer USDA Human Nutrition Research Center on Aging at Tufts University (USDA HNRCA) and Tufts Medical Center and is published in The American Journal of Clinical Nutrition.

The research team categorized participants according to their vitamin K blood levels. They then compared risk of heart disease and risk of death across the categories over approximately 13 years of follow-up.

The results showed no significant associations between vitamin K levels and heart disease. However, the people with the lowest vitamin K levels had a 19 percent higher risk of death, compared to those with vitamin K levels that reflected adequate vitamin K intake.
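As a toy illustration of the categorise-and-compare design described above (all counts below are invented, chosen only so the crude ratio lands near the reported 19 percent; the actual study used survival analysis adjusted for covariates, not raw death rates):

```python
# Invented counts for a crude two-category comparison; the real analysis
# modelled time to death with covariate adjustment.
groups = {
    "low":      {"n": 1000, "deaths": 250},
    "adequate": {"n": 3000, "deaths": 630},
}

def death_rate(group):
    """Crude proportion of participants in the group who died."""
    return group["deaths"] / group["n"]

risk_ratio = death_rate(groups["low"]) / death_rate(groups["adequate"])
print(f"crude risk ratio: {risk_ratio:.2f}")  # 1.19 with these invented counts
```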

Vitamin K is a nutrient that is important for maintaining healthy blood vessels. It is found in leafy greens, such as lettuce, kale and spinach, and in some vegetable oils, especially soybean and canola.

"The possibility that vitamin K is linked to heart disease and mortality is based on our knowledge about proteins in vascular tissue that require vitamin K to function. These proteins help prevent calcium from building up in artery walls, and without enough vitamin K, they are less functional," said first author Kyla Shea.

Shea is a scientist on the HNRCA's vitamin K team, long renowned for its work on the role of vitamin K in the prevention of chronic disease. Sarah Booth, a co-author on the study and director of the USDA HNRCA, developed the methodology for measuring vitamin K in blood. Her research team measured the vitamin K levels in the study participants and continues to generate data about vitamin K status in population and clinic-based studies.

"Similar to when a rubber band dries out and loses its elasticity, when veins and arteries are calcified, blood pumps less efficiently, causing a variety of complications. That is why measuring risk of death, in a study such as this, may better capture the spectrum of events associated with worsening vascular health," said last author Daniel Weiner, M.D., nephrologist at Tufts Medical Center, whose research includes vascular disease in people with impaired kidney function.

While this study adds to existing evidence that vitamin K may have protective health benefits, it cannot establish a causal relationship between low vitamin K levels and risk of death because it is observational. Additional studies are also needed to clarify why circulating vitamin K was associated with risk for death but not heart disease.

Methodology

The study is a meta-analysis, which combined data from participants in three ongoing studies: the Health, Aging, and Body Composition Study, the Multi-Ethnic Study of Atherosclerosis, and the Framingham Heart Study (Offspring Cohort). Vitamin K levels for participants in all three studies were measured after fasting, with the same test, and processed at the same laboratory (the vitamin K laboratory at the USDA HNRCA), minimizing the potential for laboratory-based variation. The test showed levels of circulating phylloquinone, the compound known as vitamin K1.

Participants on the blood thinner warfarin were excluded because vitamin K counteracts the anti-clotting effects of warfarin. All participants were free of heart disease at baseline and had vitamin K levels measured during a single medical exam that was part of each study's regular protocol.

The statistical analysis adjusted for age, gender, race, ethnicity, BMI, triglycerides, cholesterol levels, smoking status, and use of medications for diabetes or high blood pressure.

There are some limitations to the study, including that circulating phylloquinone was measured from a single blood draw, rather than from repeated blood tests over time. Higher circulating phylloquinone may reflect an overall healthier diet and lifestyle. Lastly, there were fewer heart disease events compared to total deaths, which may have limited researchers' ability to detect statistically significant risk of heart disease.

Credit: 
Tufts University, Health Sciences Campus

The first intuitive programming language for quantum computers

image: Computer scientists at ETH Zurich have developed the first quantum programming language that enables solving complex computations elegantly, simply and safely.

Image: 
ETH Zurich

Programming quantum computers is becoming easier: computer scientists at ETH Zurich have designed the first programming language that can be used to program quantum computers as simply, reliably and safely as classical computers. "Programming quantum computers is still a challenge for researchers," says Martin Vechev, computer science professor in ETH's Secure, Reliable and Intelligent Systems Lab (SRI), "which is why I'm so excited that we can now continue ETH Zurich's tradition in the development of quantum computers and programming languages."

He adds: "Our quantum programming language Silq allows programmers to utilize the potential of quantum computers better than with existing languages, because the code is more compact, faster, more intuitive and easier to understand for programmers." This week, Vechev will introduce Silq to other experts in the field at PLDI 2020, a conference for programming languages. To facilitate discussion, adoption and further development, he and his team have also released Silq on its own website (silq.ethz.ch).

Quantum computing has been seeing increased attention over the last decade, since these computers, which function according to the principles of quantum physics, have enormous potential. Today, most researchers believe that these computers will one day be able to solve certain problems faster than classical computers, since to perform their calculations they use entangled quantum states in which various bits of information overlap at a certain point in time. This means that in the future, quantum computers will be able to efficiently solve problems which classical computers cannot solve within a reasonable timeframe.

This quantum supremacy has yet to be proven conclusively. However, some significant technical advances have been achieved recently. In late summer 2019, a quantum computer succeeded in solving a problem - albeit a very specific one - more quickly than the fastest classical computer.

For certain "quantum algorithms", i.e. computational strategies, it is also known that they are faster than classical algorithms, which do not exploit the potential of quantum computers. To date, however, these algorithms still cannot be calculated on existing quantum hardware because quantum computers are currently still too error-prone.

Expressing the programmer's intent

Utilizing the potential of quantum computation not only requires the latest technology, but also a quantum programming language to describe quantum algorithms. In principle, an algorithm is a "recipe" for solving a problem; a programming language describes the algorithm so that a computer can perform the necessary calculations.

Today, quantum programming languages are tied closely to specific hardware; in other words, they describe precisely the behaviour of the underlying circuits. For programmers, these "hardware description languages" are cumbersome and error-prone, since the individual programming instructions must be extremely detailed and thus explicitly describe the minutiae needed to implement quantum algorithms.

This is where Vechev and his group come in with their development of Silq. "Silq is the first quantum programming language that is not designed primarily around the construction and functionality of the hardware, but on the mindset of the programmers when they want to solve a problem - without requiring them to understand every detail of the computer architecture and implementation," says Benjamin Bichsel, a doctoral student in Vechev's group who is supervising the development of Silq.

Computer scientists refer to computer languages that abstract from the technical details of the specific type of computer as high-level programming languages. Silq is the very first high-level programming language for quantum computers. High-level programming languages are more expressive, meaning that they can describe even complex tasks and algorithms with less code. This makes them more comprehensible and easier to use for programmers. They can also be used with different computer architectures.

Eliminating errors through automatic uncomputation

The greatest innovation and simplification that Silq brings to quantum programming languages concerns a source of errors that has plagued quantum programming until now. A computer calculates a task in several intermediate steps, which creates intermediate results or temporary values.

To free up memory, classical computers automatically erase these values. Computer scientists refer to this as "garbage collection", since the superfluous temporary values are disposed of.

In the case of quantum computers, this disposal is trickier due to quantum entanglement: the previously calculated values can interact with the current ones, interfering with the correct calculation. Accordingly, cleaning up such temporary values on quantum computers requires a more advanced technique of so-called uncomputation.

"Silq is the first programming language that automatically identifies and erases values that are no longer needed," explains Bichsel. The computer scientists achieved this by applying their knowledge of classical programming languages: their automatic uncomputation method uses only programming commands that are free of any special quantum operations - they are "qfree", as Vechev and Bichsel say.
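Why discarding temporary values is trickier on a quantum computer can be seen in a tiny linear-algebra simulation (plain NumPy, no quantum hardware or Silq involved): throwing away an entangled temporary qubit destroys the data qubit's superposition, while uncomputing the temporary first leaves it intact:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # control: data qubit,
                 [0, 1, 0, 0],                  # target: temp qubit
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def data_qubit_state(state):
    """Density matrix of the data qubit after discarding the temp qubit
    (a partial trace over the temp qubit)."""
    psi = state.reshape(2, 2)                   # axes: (data, temp)
    return psi @ psi.conj().T

# Put the data qubit in superposition, temp qubit starts in |0>.
state = np.kron(H, I2) @ np.array([1, 0, 0, 0], dtype=complex)
state = CNOT @ state                 # compute a temp value -> entangled state

rho_discard = data_qubit_state(state)    # discard temp WITHOUT uncomputing:
print(np.round(rho_discard.real, 3))     # maximally mixed -- superposition destroyed

state = CNOT @ state                     # uncompute the temp value first...
rho_clean = data_qubit_state(state)      # ...then discarding it is harmless
print(np.round(rho_clean.real, 3))       # pure superposition survives
```

The second CNOT plays the role of the uncomputation that Silq inserts automatically: it returns the temporary qubit to |0>, disentangling it from the data so it can be safely dropped.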

"Silq is a major breakthrough in terms of optimising the programming of quantum computers; it is not the final phase of development," says Vechev. There are still many open questions, but because Silq is easier to understand, Vechev and Bichsel hope to stimulate both the further development of quantum programming languages and the theory and development of new quantum algorithms.

"Our team of four has made the breakthrough after two years of work thanks to the combination of different expertise in language design, quantum physics and implementation. If other research and development teams embrace our innovations, it will be a great success," says Bichsel.

Credit: 
ETH Zurich

Maternal transmission of COVID-19 to baby during pregnancy is uncommon, study finds

Transmission of COVID-19 from mother to baby during pregnancy is uncommon, and the rate of infection is no greater when the baby is born vaginally, breastfed or allowed contact with the mother, according to a new study.

The research also found that babies who did test positive for COVID-19 were mostly asymptomatic.

The findings are published in BJOG: An International Journal of Obstetrics and Gynaecology.

Many early reports in the literature on COVID-19 in pregnancy suggested that in order to reduce the risk of transmission of COVID-19 from mother to baby, it was safer to have a caesarean, to isolate the baby from the mother at birth and to formula feed, but there was very little evidence to support these guidelines.

To conclusively look at the risks associated with COVID-19 and pregnancy, experts from the School of Medicine at the University of Nottingham have undertaken a systematic review of 49 studies looking into this much talked about topic.

The studies reviewed included 666 neonates (newborn babies) and 655 women (as some women delivered twins). Of the women who delivered their babies vaginally, only eight out of 292 (2.7%) had a baby which tested positive for COVID-19.

Of the 364 women who had a caesarean, 20 (5.3%) of those had a baby which tested positive for COVID-19.

These findings show that neonatal COVID-19 infection is uncommon, and also commonly asymptomatic in those babies who are affected.

The data also showed the infection rates to be no higher when the baby was born vaginally, breastfed or allowed contact with the mother immediately after birth.

The systematic review was an international effort carried out by Dr Kate Walker, Clinical Associate Professor in Obstetrics, and Jim Thornton, Professor of Obstetrics and Gynaecology, from the University of Nottingham, as well as experts at Dalhousie University, Canada and Monash University, Clayton, Australia, and University College Cork, Cork University Maternity Hospital, Ireland.

Dr Walker said: "There has been a lot of concern around whether pregnant women should be concerned for the health of their babies if they contract COVID-19.

"We wanted to look at the outcome for babies whose mothers contracted the virus and see if the route of birth, method of infant feeding and mother/baby interaction increased the risk of babies contracting the virus. From our results, we are satisfied that the chance of newborn infection with COVID-19 is low.

"We would also stress that a vaginal birth and breast feeding are safe for mothers who find themselves in these circumstances."

Dr Jeannette Comeau, a Paediatric Infectious Diseases Physician at Dalhousie University, said: "I am happy to see that the data continues to be reassuring, supporting keeping the mother/infant pair together after birth, underlining that while occasional postnatal infant infection is detected, clinical course tends to be mild. From the cases of infection in the newborn we do not have confirmatory evidence that this infection was acquired in the womb or during birth."

Credit: 
University of Nottingham

Diluting blood plasma rejuvenates tissue, reverses aging in mice

image: A new study by University of California, Berkeley, found that older mice grew significantly more new muscle fibers, shown as pink "donut" shapes, after undergoing a procedure that effectively diluted the proteins in their blood plasma (bottom) than they did before they underwent the procedure (top). The research team is currently finalizing clinical trials to determine if a modified plasma exchange in humans could be used to treat age-associated diseases and improve the overall health of older people.

Image: 
Image courtesy Irina Conboy

Berkeley -- In 2005, University of California, Berkeley, researchers made the surprising discovery that making conjoined twins out of young and old mice -- such that they share blood and organs -- can rejuvenate tissues and reverse the signs of aging in the old mice. The finding sparked a flurry of research into whether a youngster's blood might contain special proteins or molecules that could serve as a "fountain of youth" for mice and humans alike.

But a new study by the same team shows that similar age-reversing effects can be achieved by simply diluting the blood plasma of old mice -- no young blood needed.

In the study, the team found that replacing half of the blood plasma of old mice with a mixture of saline and albumin -- where the albumin simply replaces protein that was lost when the original blood plasma was removed -- has the same or stronger rejuvenation effects on the brain, liver and muscle than pairing with young mice or young blood exchange. Performing the same procedure on young mice had no detrimental effects on their health.

This discovery shifts the dominant model of rejuvenation away from young blood and toward the benefits of removing age-elevated, and potentially harmful, factors in old blood.

"There are two main interpretations of our original experiments: The first is that, in the mouse joining experiments, rejuvenation was due to young blood and young proteins or factors that become diminished with aging, but an equally possible alternative is that, with age, you have an elevation of certain proteins in the blood that become detrimental, and these were removed or neutralized by the young partners," said Irina Conboy, a professor of bioengineering at UC Berkeley who is the first author of the 2005 mouse-joining paper and senior author of the new study. "As our science shows, the second interpretation turns out to be correct. Young blood or factors are not needed for the rejuvenating effect; dilution of old blood is sufficient."

In humans, the composition of blood plasma can be altered in a clinical procedure called therapeutic plasma exchange, or plasmapheresis, which is currently FDA-approved in the U.S. for treating a variety of autoimmune diseases. The research team is currently finalizing clinical trials to determine if a modified plasma exchange in humans could be used to improve the overall health of older people and to treat age-associated diseases that include muscle wasting, neuro-degeneration, Type 2 diabetes and immune deregulation.

"I think it will take some time for people to really give up the idea that young plasma contains rejuvenation molecules, or silver bullets, for aging," said Dobri Kiprov, a medical director of Apheresis Care Group and a co-author of the paper. "I hope our results open the door for further research into using plasma exchange -- not just for aging, but also for immunomodulation."

The study appears online in the journal Aging.

A molecular 'reset' button

In the early 2000s, Conboy and her husband and research partner Michael Conboy, a senior researcher and lecturer in the Department of Bioengineering at UC Berkeley and co-author of the new study, had a hunch that our body's ability to regenerate damaged tissue remains with us into old age in the form of stem cells, but that somehow these cells get turned off through changes in our biochemistry as we age.

"We had the idea that aging might be really more dynamic than people think," Conboy said. "We thought that it could be caused by transient and very reversible declines in regeneration, such that, even if somebody is very old, the capacity to build new tissues in organs could be restored to young levels by basically replacing the broken cells and tissues with healthy ones, and that this capacity is regulated through specific chemicals which change with age in ways that become counterproductive."

After the Conboys published their groundbreaking 2005 work, showing that surgically conjoining an old mouse and a young mouse reversed many signs of aging in the older mouse, many researchers seized on the idea that specific proteins in young blood could be the key to unlocking the body's latent regeneration abilities.

However, in the original report, and in a more recent study, when blood was exchanged between young and old animals without physically joining them, young animals showed signs of aging. These results indicated that young blood circulating through young veins could not compete with old blood.

As a result, the Conboys pursued the idea that a buildup of certain proteins with age is the main inhibitor of tissue maintenance and repair, and that diluting these proteins with blood exchange could also be the mechanism behind the original results. If true, this would suggest an alternative, safer path to successful clinical intervention: Instead of adding proteins from young blood, which could do harm to a patient, the dilution of age-elevated proteins could be therapeutic, while also allowing for the increase of young proteins by removing factors that could suppress them.

To test this hypothesis, the Conboys and their colleagues came up with the idea of performing "neutral" blood exchange. Instead of exchanging the blood of a mouse with that of a younger or an older animal, they would simply dilute the blood plasma by swapping out part of the animal's blood plasma with a solution containing plasma's most basic ingredients: saline and a protein called albumin. The albumin included in the solution simply replenished this abundant protein, which is needed for overall biophysical and biochemical blood health and was lost when half the plasma was removed.
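The arithmetic behind a neutral exchange is simple dilution: replacing a fraction of the plasma with a factor-free saline-albumin solution reduces the concentration of any plasma-borne factor by that fraction. A minimal sketch (the starting concentration and the factor itself are hypothetical, not values from the study):

```python
def dilute(concentration, fraction_replaced):
    """Concentration of a plasma-borne factor after replacing a fraction
    of the plasma with a factor-free saline/albumin solution."""
    return concentration * (1 - fraction_replaced)

# Hypothetical age-elevated factor at 100 arbitrary units per mL.
c0 = 100.0
c1 = dilute(c0, 0.5)   # one exchange replacing half the plasma
print(c1)              # 50.0
```

A second identical exchange would halve the concentration again, which is why repeated exchanges can keep lowering age-elevated factors without adding anything from young blood.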

"We thought, 'What if we had some neutral age blood, some blood that was not young or not old?'" said Michael Conboy. "We'll do the exchange with that, and see if it still improves the old animal. That would mean that by diluting the bad stuff in the old blood, it made the animal better. And if the young animal got worse, then that would mean that diluting the good stuff in the young animal made the young animal worse."

After finding that the neutral blood exchange significantly improved the health of old mice, the team conducted a proteomic analysis of the blood plasma of the animals to find out how the proteins in their blood changed following the procedure. The researchers performed a similar analysis on blood plasma from humans who had undergone therapeutic plasma exchange.

They found that the plasma exchange process acts almost like a molecular reset button, lowering the concentrations of a number of pro-inflammatory proteins that become elevated with age, while allowing more beneficial proteins, like those that promote vascularization, to rebound in large numbers.

"A few of these proteins are of particular interest, and in the future, we may look at them as additional therapeutic and drug candidates," Conboy said. "But I would warn against silver bullets. It is very unlikely that aging could be reversed by changes in any one protein. In our experiment, we found that we can do one procedure that is relatively simple and FDA-approved, yet it simultaneously changed levels of numerous proteins in the right direction."

Therapeutic plasma exchange in humans lasts about two to three hours and comes with no or mild side effects, said Kiprov, who uses the procedure in his clinical practice. The research team is about to conduct clinical trials to better understand how therapeutic blood exchange might best be applied to treating human ailments of aging.

Credit: 
University of California - Berkeley

Black and female principal candidates more likely to experience delayed and denied promotions

WASHINGTON, June 15--Black and female assistant principals are systematically delayed and denied promotion to principal, compared to their White or male counterparts, despite having equivalent qualifications and more experience on average, according to a new study. The findings were published in June in AERA Open, a peer-reviewed, open access journal of the American Educational Research Association.

* VIDEO: Watch study coauthors Sarah Guthery and Lauren Bailes discuss study findings and implications (https://www.youtube.com/watch?v=5O4EzZSZBsI)

For their study, authors Lauren Bailes at the University of Delaware and Sarah Guthery at Texas A&M University-Commerce assessed the probability of and time to promotion for 4,689 assistant principals in Texas from 2001 to 2017, using data from the Texas Education Agency. The authors identified assistant principals serving in their first year and analyzed their progress to promotion, if it occurred. While principal promotion processes vary by district, assistant principals in the study had earned a master's degree and acquired a principal's license, which are the minimal credentials needed to qualify for promotion to principal in Texas.

Bailes and Guthery found that after holding education, experience, school level, and school location constant, Black assistant principals were 18 percent less likely to be promoted than White candidates who were equally qualified. When the Black candidates were promoted, their average time to promotion was 5.27 years, while the average wait time for their White peers was 4.67 years, leaving a 0.6-year gap attributable to race.

The authors found a difference in promotion by gender when they looked specifically at high school principalships. While women comprised half of high school assistant principals--and nearly two-thirds of all assistant principals--in Texas, women were 5 to 7 percent less likely to be promoted into high school principalships than men. As women gained more years of experience as assistant principals, their likelihood of promotion, in fact, decreased relative to their male peers. Women who did become high school principals waited longer, spending 5.62 years as an assistant principal versus 4.94 years for men, leaving a 0.68-year gender gap.
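The reported race and gender gaps follow directly from the average wait times. As a quick check of the figures above:

```python
# Average years spent as an assistant principal before promotion,
# taken from the study's reported figures.
black_wait, white_wait = 5.27, 4.67   # race comparison
women_wait, men_wait = 5.62, 4.94     # gender comparison (high schools)

race_gap = round(black_wait - white_wait, 2)
gender_gap = round(women_wait - men_wait, 2)
print(race_gap, gender_gap)  # 0.6 0.68
```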

"Even though more diversity in the teacher and principal workforce has been shown to improve teacher retention and student outcomes, our findings indicate that there are still systematic race- and gender-based inequities within the profession," said Guthery, an assistant professor of education at Texas A&M University-Commerce. "This is despite a teacher corps that is overwhelmingly female and becoming more racially diverse."

While prior research has identified gaps in promotions at the top levels of education leadership, such as principals and superintendents, Bailes and Guthery identified inequities much earlier in the education leadership pipeline by focusing on time to and probability of promotions once an individual has self-selected into the leadership track.

The authors found that women and Black educators had more years of experience even before becoming assistant principals. Men who became high school assistant principals had 1.25 years less experience on average than women who entered high school assistant principalships. In elementary and middle schools, the gender gap was even larger, widening to 1.62 years.

"At every point of promotion, the pool of candidates is whiter and more male, especially compared to the teacher workforce," said Guthery. "We find that diversity exists in the pipeline, but the pipeline tends to squeeze out women and Blacks much earlier than studies of school leadership usually capture."

Bailes and Guthery also examined the differences between women's promotions across elementary, middle, and high schools to identify the ways in which women are promoted within education careers. They found that even when women worked as assistant principals in high schools for a longer time and had more career experience than their male counterparts, they were more likely to be promoted to principal in elementary schools than in high schools. This had implications for their future opportunities in higher levels of leadership, according to the authors.

"Because a high school principalship is so often viewed as requisite for district leadership, women who lead elementary schools are less likely to be tapped for superintendencies and other district leadership positions," said Bailes, an assistant professor at the University of Delaware.

The authors note that considering the enormous influence that principals exert on teachers and students, the systematic non-promotion of Black principal candidates imposes consequences for Black teachers and students throughout the entire school system.

"Because principals and district leaders are more likely to identify educators of their own race for promotion, the underrepresentation of minority groups is likely to ripple throughout schools and districts," said Bailes. "Prior research also shows that hiring more Black principals can help close the achievement gaps between White and non-White students nationally."

According to the authors, the patterns of disparities in leadership identified in their study suggest that state and district policymakers should consider establishing metrics of success within their school systems that rate equity in promotion for equivalently qualified individuals who aspire to school leadership.

"Administrators, such as principals and district leaders, need to identify and actively nurture diversity in all levels of leadership," Bailes said. "It is crucial that districts monitor inequities in their promotion practices."

Credit: 
American Educational Research Association

A raft that won't save you

image: Schematic of protein activation and lipid membrane remodeling. A density change of active receptors drives the system towards an active configuration in which the lipid bilayer shrinks around active domains, thus causing cell membrane deformation and thickening in the form of lipid rafts.

Image: 
Carotenuto, Lunghi et al.

PITTSBURGH (June 15, 2020) -- A cell's membrane acts as a natural shield, a fence around the cell that protects and contains it. It mediates processes that let nutrients through and let waste out, and it acts as a physical barrier to the entry of toxic substances and pathogens, like the viruses SARS-CoV-1 and SARS-CoV-2, the latter of which causes COVID-19.

Such pathogens, however, employ clever strategies to trick and penetrate the cell, thereby replicating themselves and infecting the human body. The virus deceives the membrane by displaying anti-receptors to which the cell's receptors normally bind, tricking the receptors into believing that what's landing is a safe, affine ligand. This process activates and grows thickened zones along the cell membrane, or "lipid rafts," which are more likely to permit the virus to alter the membrane and gain entry into the cell.

New interdisciplinary research published in the Journal of the Mechanics and Physics of Solids sheds light on how and why the cell membrane forms and grows lipid rafts triggered by ligand-receptor activity. The work could lead to new strategies and innovative approaches to prevent or fight the action of the virus through the integration of biomedical and engineering knowledge.

"Although lipid rafts' influence on a cell's response to external agents has been deeply investigated, the physical components of what takes place during ligand-binding has not yet been fully understood," said Luca Deseri, research professor at the University of Pittsburgh's Swanson School of Engineering in the Mechanical Engineering and Materials Science Department, full professor and head of the graduate school in Engineering at DICAM-University of Trento in Italy, and corresponding author on the paper. "Our team used an interdisciplinary approach to better understand why active receptors tend to cluster on lipid rafts. More importantly, we confirm and predict the formation of ligand-receptor complexes."

Through studies of how mechanical forces and biochemical interactions affect the cell membrane, this research sheds light on the way localized thickening across cell membranes is triggered by the formation of the ligand-receptor complex. The researchers concluded that the formation of ligand-receptor complexes could not take place in thinner zones of the cell membrane; the thickening of the cell membrane provides the necessary force relief to allow for configurational changes of the receptors, which then become more prone to ligand binding.

Understanding the way viruses use lipid rafts to alter the cell membrane could lead to new approaches to treat and prevent viruses, like the one that causes COVID-19, from spreading in the body.

Credit: 
University of Pittsburgh

When board members get involved, corporate tax burden goes down

New research finds that corporate tax-planning practices improve when a company's board takes an active interest in them - and better planning results in both less tax uncertainty and a lower tax burden.

"We wanted to see what happens when board members take an active role in risk oversight for a company's tax-planning efforts - and we found that it makes a substantial difference to a company's bottom line," says Nathan Goldman, co-author of a paper on the work and an assistant professor of accounting in North Carolina State University's Poole College of Management.

"For example, risk oversight of tax planning in an international business context might include the board playing an active role in determining the location of a new subsidiary and helping to consider the tax implications of the new location - as well as any related non-tax risks."

For this study, researchers evaluated the activities of 665 publicly traded companies to determine the extent of each board's involvement in risk management. The researchers also evaluated financial reporting of income taxes to gain insights into each company's tax-planning practices.

"We estimate that companies with the highest level of risk oversight have 31% lower tax uncertainty and 13.2% lower tax burden, as compared to the companies with the lowest level of risk oversight," Goldman says.
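To make those percentages concrete, here is a hedged illustration with invented dollar figures (only the 31% and 13.2% reductions come from the study):

```python
# Hypothetical firm at the lowest level of board risk oversight.
base_tax_burden = 10_000_000   # annual taxes paid, USD (invented figure)
base_uncertainty = 2_000_000   # reserve for uncertain tax positions (invented)

# Apply the study's estimated reductions for the highest level of oversight.
high_oversight_tax = base_tax_burden * (1 - 0.132)
high_oversight_uncertainty = base_uncertainty * (1 - 0.31)

print(round(high_oversight_tax))          # 8680000
print(round(high_oversight_uncertainty))  # 1380000
```

On these assumed numbers, moving from the lowest to the highest level of risk oversight would be worth about $1.32 million a year in taxes alone.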

Tax uncertainty is the risk that the IRS or another taxing authority will overturn a company's tax position, resulting in the company having to pay back the money it saved from the tax positions - as well as penalties.

"A good tax-planning strategy is one that results in substantial tax savings and does not subject the firm to other risks or unexpected future tax liabilities," Goldman says.

"In the context of international tax planning, a good, low-risk decision might be to open a new manufacturing facility in a low-tax-rate jurisdiction - such as Ireland. This strategy would allocate more income to the low-tax country, but should not create a risky tax position that would be overturned by the IRS.

"In contrast, a bad, high-risk decision would be to just set up a shell company and make journal entries to artificially allocate income without a substantial business purpose. This strategy may result in the firm being shamed in the news for shifting income abroad, resulting in reputational harm. It could also result in the IRS reallocating income to the United States, resulting in higher tax payments, plus interest and penalties.

"Ultimately, our study suggests that companies are more likely to make high-risk decisions when the board is not involved," Goldman says. "And more likely to make decisions that balance risk and reward when the board is involved. So there are excellent reasons for board members to include tax planning in their enterprise risk management efforts."

Credit: 
North Carolina State University

Why pulsars shine bright: A half-century-old mystery solved

image: The simulated density distribution of electron-positron plasma near the surface of a neutron star (shown in gray at the bottom of the plot). Redder regions represent a higher density of electron-positron pairs.

Image: 
A. Philippov et al./Physical Review Letters 2020

When Jocelyn Bell first observed the emissions of a pulsar in 1967, the rhythmic pulses of radio waves so confounded astronomers that they considered whether the light could be signals sent by an alien civilization.

Pulsars act like stellar lighthouses, shooting beams of radio waves from their magnetic poles. For more than a half-century, the cause of those beams has confounded scientists. Now a team of researchers suspects that they’ve finally identified the mechanism responsible. The discovery could aid projects that rely on the timing of pulsar emissions, such as studies of gravitational waves.

The researchers’ proposal starts with the pulsar’s strong electric fields, which tear electrons from the star’s surface and accelerate them to extreme energies. The accelerated electrons eventually begin emitting high-energy gamma rays. These gamma rays, when absorbed by the pulsar’s ultra-strong magnetic field, produce a deluge of additional electrons and their antimatter counterparts, positrons.

The newborn charged particles dampen the electric fields, causing them to oscillate. The wobbling electric fields in the presence of the pulsar’s powerful magnetic fields then result in electromagnetic waves that escape into space. Using plasma simulations, the researchers found that these electromagnetic waves match radio waves observed from pulsars.

“The process is a lot like lightning,” says study lead author Alexander Philippov, an associate research scientist at the Flatiron Institute’s Center for Computational Astrophysics in New York City. “Out of nowhere, you have a powerful discharge producing a cloud of electrons and positrons, and then, as an afterglow, there are electromagnetic waves.”

Philippov and collaborators Andrey Timokhin of the University of Zielona Góra in Poland and Anatoly Spitkovsky of Princeton University present their findings June 15 in Physical Review Letters.

Pulsars are neutron stars, the dense and highly magnetized remains of collapsed stars. Unlike other neutron stars, pulsars spin at dizzying speeds, with some rotating more than 700 times each second. That spinning generates powerful electric fields.

At a pulsar’s two magnetic poles, continuous beams of radio waves blast into space. These radio emissions are special in that they are coherent, meaning the particles creating them move in lockstep with one another. As the pulsar rotates, the beams sweep in circles across the sky. From Earth, pulsars appear to blink as the beams move in and out of our line of sight. The timing of these blinks is so precise that they rival the accuracy of atomic clocks.

For decades, astronomers pondered the origins of these beams but failed to produce a viable explanation. Philippov, Timokhin and Spitkovsky took a fresh approach to the problem by creating 2D simulations of the plasma surrounding a pulsar’s magnetic poles (previous simulations were only 1D, which can’t show electromagnetic waves).

Their simulations replicate how a pulsar’s electric fields accelerate charged particles. That acceleration produces high-energy photons that interact with the pulsar’s intense magnetic field to produce electron-positron pairs, which are then accelerated by the electric fields and create even more photons. This runaway process ultimately fills the region with electron-positron pairs.
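The runaway pair production described here is, at its core, geometric growth: each generation of pairs radiates photons that convert into more pairs. A toy sketch of that multiplication (the multiplicity and generation count are invented, not values from the simulations):

```python
def cascade_pairs(seed_pairs, multiplicity, generations):
    """Toy geometric model of a pair cascade: each generation of
    electron-positron pairs spawns `multiplicity` new pairs."""
    pairs = seed_pairs
    for _ in range(generations):
        pairs *= multiplicity
    return pairs

# One seed pair, tenfold multiplication per generation, five generations.
print(cascade_pairs(1, 10, 5))  # 100000
```

This exponential blow-up is why the region fills with pairs fast enough to screen the accelerating electric field.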

In the simulations, the electron-positron pairs create their own electric fields that oppose and dampen the initial electric field. Eventually, the original electric field becomes so weak that it reaches zero and begins oscillating between negative and positive values. That oscillating electric field, if not exactly aligned to the pulsar’s strong magnetic field, produces electromagnetic radiation.

The researchers plan to scale up their simulations to get closer to the real-world physics of a pulsar and further probe how the process works. Philippov hopes that their work will ultimately improve research that relies on precisely observing the timing of pulsar emissions reaching Earth. Gravitational wave astronomers, for instance, measure tiny fluctuations in pulsar timing to detect gravitational waves stretching and compressing the fabric of space-time.

“If you understand how the emission itself is produced, there’s a hope that we can also produce a model of the errors in the pulsar clock that can be used to improve pulsar timing arrays,” Philippov says. Additionally, such a deeper understanding could help resolve the mysterious source of periodic bursts of radio waves, known as fast radio bursts, that emanate from neutron stars, he says.

Journal

Physical Review Letters

DOI

10.1103/PhysRevLett.124.245101

Credit: 
Simons Foundation

Tuberculosis spread from animals to humans may be greater than previously thought

image: The World Health Organization aims to reduce the incidence of tuberculosis by 90% by 2035.

Image: 
Vivek Kapur, Penn State

UNIVERSITY PARK, Pa. -- The number of human tuberculosis (TB) cases that are due to transmission from animals, as opposed to human-to-human transmission, may be much higher than previously estimated, according to an international team of researchers. The results could have implications for epidemiological studies and public health interventions.

"Tuberculosis kills 1.4 million people every year, making it the most deadly disease arising from a single infectious agent," said Vivek Kapur, professor of microbiology and infectious diseases and Huck Distinguished Chair in Global Health, Penn State. "India has the largest burden of human tuberculosis globally, with more than 2.6 million cases and 400,000 deaths reported in 2019. Additionally, the cattle population in India exceeds 300 million, and nearly 22 million of these were estimated to be infected with TB in 2017."

Kapur noted that the World Health Organization, World Organisation for Animal Health and Food and Agriculture Organization of the United Nations define zoonotic TB as human infection with Mycobacterium bovis, a member of the Mycobacterium tuberculosis complex (MTBC).

To evaluate the use of M. bovis as a proxy for zoonotic tuberculosis and to investigate the potential role of other MTBC subspecies, Kapur and his colleagues analyzed 940 bacterial samples -- both pulmonary (from lung fluid or tissue) and extrapulmonary (from tissues other than the lungs) -- collected from patients who were visiting a large reference hospital for TB in southern India. The researchers used PCR to speciate M. tuberculosis complex organisms and then sequenced all the non-M. tuberculosis samples. Next, they compared the sequences to 715 sequences from cattle and humans that had previously been collected in south Asia and submitted to public databases.

"Surprisingly, we did not find any evidence for the presence of M. bovis in any of the samples," said Sreenidhi Srinivasan, postdoctoral scholar in the Huck Institutes of the Life Sciences. "Instead, we found that seven of the patient samples contained M. orygis. Six of these came from patients with extrapulmonary TB."

They describe their findings in a paper published June 1 in The Lancet Microbe.

As expected, most of the remainder of the sequences from the patients belonged to M. tuberculosis -- the TB bacterium that is generally thought to be transmitted only among humans.

"Our findings suggest that M. bovis might be uncommon in India, and that its detection may not be an adequate proxy for zoonotic TB infection in humans," said Srinivasan. "These data indicate that members of the TB complex other than M. bovis might be more prevalent in livestock in India."

Kapur added that the operational definition of zoonotic TB should be broadened to include other MTBC subspecies capable of causing human disease.

"By 2035, the World Health Organization is aiming to reduce the incidence of tuberculosis by 90% as a part of its End TB Strategy," he said. "The increasing evidence supporting M. orygis endemicity in south Asia and the identification of M. tuberculosis in cattle highlight the importance of using a One Health approach, involving multisectoral collaboration across the veterinary and clinical sectors, to meet the WHO's goal in India."

Credit: 
Penn State

Using Jenga to explain lithium-ion batteries

Tower block games such as Jenga can be used to explain to schoolchildren how lithium-ion batteries work, meeting an educational need to better understand a power source that has become vital to everyday life.

While lithium-ion batteries are abundant in so many of our electronic devices, from smartphones to electric vehicles, the resources available to teach children how they work and why they are important are limited.

A team in the University of Birmingham's School of Chemistry has devised an educational tool which uses the tower block game Jenga to explain the processes at work inside battery cells and the electrochemistry behind them. Their method is published in the Journal of Chemical Education.

A rechargeable Li-ion battery consists of an oxide electrode and a graphite electrode. These are commonly built in layers separated by an electrolyte. When the battery is charged, lithium ions move from the oxide to the graphite electrode via the electrolyte. Current collectors, onto which the electrodes are coated, allow electrons to move via an external circuit, providing power.

By using the layers of blocks, children can get a sense of how the battery is constructed and how the different components interact with each other. The battery Jenga can show battery operation and key characteristics. The intercalation chemistry - the insertion of lithium ions between layers - involved in charging and discharging this type of battery can be easily visualised. By removing a few blank blocks in the graphite electrode (these blocks represent empty space between the layers of graphite), a student can move the Li-ion blocks from the oxide electrode to the graphite electrode. The reverse process occurs on discharge.
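The block-moving steps can also be sketched as a toy program (an illustration of the analogy, not part of the published activity):

```python
def charge(oxide, graphite, n=1):
    """Move n lithium blocks from the oxide to the graphite electrode,
    as happens during charging."""
    for _ in range(n):
        if oxide:
            graphite.append(oxide.pop())

def discharge(oxide, graphite, n=1):
    """Move n lithium blocks back from graphite to oxide (discharge)."""
    for _ in range(n):
        if graphite:
            oxide.append(graphite.pop())

# A fully discharged cell: all the lithium sits in the oxide electrode.
oxide = ["Li"] * 4
graphite = []

charge(oxide, graphite, n=4)
print(len(oxide), len(graphite))   # 0 4

discharge(oxide, graphite, n=2)
print(len(oxide), len(graphite))   # 2 2
```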

The simplicity of this demonstration provides a basis for explaining complex chemistry and redox reactions. The importance of charging rate for safety in different applications can also be shown when students remove the lithium-ion blocks from the oxide electrode at varying rates: faster charging invariably leads to the Jenga structure collapsing.

The tower block game can also demonstrate how the performance of the battery reduces over continued use by showing how the blocks become slightly displaced as the lithium blocks are removed and reinserted.

Researcher Elizabeth Driscoll explains: "Hands-on demonstrations are known to be a useful way of supporting learning - teachers often use lemons or potatoes to explain conventional non-rechargeable batteries, for example. But we know that electrochemistry is a tricky area for teachers, which often leads to misconceptions among students. We wanted to design a hands-on activity that would help address this and explain this rechargeable-type."

By introducing tower block sets with strong contrasting colours and different textures, the team were also able to devise teaching tools that would be more inclusive for students who are blind or partially-sighted.

The activities have been trialled with multiple visiting schools over the past year, including the Royal Society of Chemistry's Top of the Bench demonstration lecture, with positive feedback from both teachers and students. The sets have also made an appearance at public events at museums, from the ThinkTank science museum in Birmingham to the Manchester Science Museum and the Royal Institution in London.

The next step for the team will be to enable the activity to be widely accessible to more students and provide support for educators in these topics. Funding from the Faraday Institution and the Royal Society of Chemistry has already enabled 100 small Jenga sets to be supplied to a Birmingham secondary school. Tactile classroom sets will also be provided to New College Worcester and Bolton Sensory Support service. Educators interested in producing their own sets can access full instructions via the open access paper in the Journal of Chemical Education.

Credit: 
University of Birmingham

Combination drug treatments for COVID-19 show promise in cell culture tests

Six months into the COVID-19 pandemic, more than 7.4 million people have been infected, and more than 410,000 have died. As yet, there is no treatment or vaccine for the disease.

Now, a team of researchers from Norway and Estonia have looked at different possible treatment options -- and found both good and bad news.

The good news is that the team identified six existing safe-in-humans broad-spectrum antivirals that worked against the virus in laboratory tests. Two of the six, when combined, showed an even stronger effect in infected cell cultures.

"This is exciting new data from the work we did," said Magnar Bjørås, a professor in the Norwegian University of Science and Technology's (NTNU) Department of Clinical and Molecular Medicine, and one of the paper's co-authors.

The bad news is that another, non-drug treatment -- the use of antibody-laden plasma from recovered patients to treat the severely ill -- may only work if the donor has recently recovered from COVID-19.

"This means if you collect blood from patients who have recovered from COVID-19 after 2 months from diagnosis of the disease, and transfuse their plasma/serum to severely sick patients, it may not help," said Svein Arne Nordbø, an associate professor at the university's Department of Clinical and Molecular Medicine and an MD at Department of Medical Microbiology at St. Olavs Hospital in Trondheim, and another of the paper's authors.

The study has been published in the journal Viruses.

The research team developed a cell culture that they could use to grow SARS-CoV-2, the name of the coronavirus that causes COVID-19. The culture allowed them to actually test the efficacy of the different drugs in the laboratory.

They determined that a cell type called Vero-E6 was best suited to propagate the coronavirus, and were able to screen 136 drugs using the cell culture.

The screening identified six existing drugs that had some effect, and several combinations of drugs that acted synergistically, the researchers said. The six drugs were nelfinavir, salinomycin, amodiaquine, obatoclax, emetine and homoharringtonine, said Denis Kainov, an associate professor at the university's Department of Clinical and Molecular Medicine, and senior author of the article.

A combination of nelfinavir and amodiaquine "exhibited the highest synergy," he said.
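Synergy in this context means the combination inhibits the virus more than the two drugs' individual effects would predict. The article doesn't specify which scoring model the team used, so here is a minimal sketch of one common metric, Bliss independence, with entirely made-up inhibition values for illustration:

```python
# Bliss independence synergy check.
# All inhibition values below are invented for illustration,
# NOT data from the study.

def bliss_expected(fa: float, fb: float) -> float:
    """Expected fractional inhibition of a two-drug combination
    if the drugs act independently (Bliss independence model)."""
    return fa + fb - fa * fb

def bliss_excess(fa: float, fb: float, fab: float) -> float:
    """Observed minus expected inhibition.
    A positive excess suggests the drugs act synergistically."""
    return fab - bliss_expected(fa, fb)

# Hypothetical single-drug inhibitions (fraction of viral growth blocked)
drug_a = 0.40   # e.g., drug A alone at some fixed dose
drug_b = 0.30   # e.g., drug B alone at some fixed dose
combo = 0.75    # observed inhibition when both are given together

print(f"expected: {bliss_expected(drug_a, drug_b):.2f}")
print(f"excess:   {bliss_excess(drug_a, drug_b, combo):.2f}")
```

With these made-up numbers the expected independent effect is 0.58, so the observed 0.75 leaves a positive excess of 0.17, which under this model would be read as synergy. Real screens repeat this across dose matrices rather than a single dose pair.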

This last finding was encouraging enough that the researchers hope that others will follow up and start testing the drug combinations in patients.

"This orally available drug combination -- nelfinavir-amodiaquine -- inhibits the virus infection in cell cultures," Kainov said. "It should be tested further in pre-clinical studies and clinical trials now."

The researchers also wanted to look more closely at the efficacy of using blood plasma from recovered patients to treat people with COVID-19.

The Vero-E6 cell line enabled them to develop a "neutralizing antibody" test, which they could use to determine the strength of antibodies from the blood of recovered patients.

The neutralizing antibody test works much like its name suggests.

The researchers took blood plasma from recovered patients and added it to the cell cultures containing the live virus. That allowed them to see how effectively the antibodies in the plasma neutralized or killed the virus that was growing in the cell culture. Researchers call the plasma from recovered patients "convalescent serum."

"Convalescent serum from patients containing antibodies against the virus has been used for treatment of different viral diseases over the last decades with some success, when vaccines or antivirals are not available," Nordbø said. "If used for treatment, it is essential that the convalescent serum contains enough antibodies that are capable of inactivating or killing the virus."

But Nordbø points out that the only way to know if the convalescent serum is strong enough is by adding dilutions of it to a live virus strain and testing the mixtures on cell lines that can propagate the virus, as the researchers did.
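The dilution procedure Nordbø describes yields an endpoint titer: the highest dilution at which the serum still neutralizes the virus. A minimal sketch of that readout, using invented dilution series and neutralization fractions (not the study's data), might look like this:

```python
# Endpoint neutralization titer from a serial-dilution assay.
# Dilutions and neutralization fractions are invented for illustration.

def neutralization_titer(results, cutoff=0.5):
    """Return the highest reciprocal serum dilution whose fraction of
    virus neutralized is >= cutoff, or None if no dilution qualifies."""
    passing = [dilution for dilution, frac in results.items() if frac >= cutoff]
    return max(passing) if passing else None

# Reciprocal dilutions (1:20, 1:40, ...) -> fraction of virus neutralized
recent_serum = {20: 0.99, 40: 0.95, 80: 0.80, 160: 0.55, 320: 0.20}
late_serum = {20: 0.45, 40: 0.30, 80: 0.10, 160: 0.05, 320: 0.01}

print(neutralization_titer(recent_serum))  # a usable titer
print(neutralization_titer(late_serum))    # None: too few antibodies
```

In this sketch the "recent" serum still neutralizes at a 1:160 dilution, while the "late" serum never reaches the 50% cutoff, mirroring the pattern the researchers report for sera collected long after recovery.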

Ordinary antibody tests may not reflect the ability of the convalescent serum to actually kill or neutralize the virus, he said. That means the neutralization tests are still the most specific.

The neutralizing antibody tests allowed the researchers to test convalescent sera from a number of recovered patients. They were able to see that some recovered patients produced very few antibodies, a finding that has been confirmed by other research.

They also were able to see that the more recent the recovery from COVID-19, the more effective the serum was. Two months after a patient had been diagnosed, their serum didn't have enough antibodies to combat the virus in the cell culture.

"The conclusion so far is that clinicians need to collect plasma for treatment purposes as soon as patients recover from COVID-19," Nordbø said, because the amounts of antibodies decline with time.

However, this finding is not contrary to the notion of lasting immunity. If the patient were exposed to the virus a second time, the cells of the immune system would most likely be prepared to increase the production of antibodies again, said Mona Høysæter Fenstad, a researcher at the Department of Immunology and Transfusion Medicine at St. Olavs Hospital, and another co-author.

The fact that the researchers had been able to diagnose and isolate the virus from Trøndelag patients gave them the chance to identify the origin and evolution of the viral strains. This was achieved with the help of a new nanotechnology-based COVID-19 test, spearheaded by Bjørås, that has been adopted by the Norwegian government and could potentially be exported for use in other countries.

By determining the genetic make-up of the strains, the researchers were able to compare the strains to those registered in an online resource and figure out where the different strains originated.

"We determined that the SARS-CoV-2 strains isolated in Trondheim had originated from China, Denmark, the USA and Canada," said Aleksandr Ianevski, the first author of the paper and a PhD candidate in the university's Department of Clinical and Molecular Medicine.

That raises the question of whether Norway's travel restrictions, enacted on March 12, should have been introduced earlier to prevent the import of the virus to the country, the researchers said.

But seeing how strains move across the globe offers potentially helpful insights into the virus and its transmission, Ianevski said.

"Monitoring pathogen epidemiology and the evolution of the virus helps with our epidemiological understanding of the disease and may improve outbreak response," he said.

Kainov and Ianevski had previously gone through the academic literature to identify what are called "safe-in-man" broad spectrum antivirals (abbreviated BSAAs). These are drugs that are known to inhibit human viruses that belong to two or more viral families, and have passed the first phase of clinical trials.

That database of the drugs was published in the International Journal of Infectious Diseases and is accessible at https://drugvirus.info/. The authors also identified 46 BSAAs that could potentially act against the SARS-CoV-2 virus including remdesivir and favipiravir, which are currently being studied in different clinical trials across the globe.

The advantage of these drugs is that if they are shown to inhibit the coronavirus in the lab, they can be given to patients without first having to be tested for safety.

They would still require clinical trials to see how well they actually work in the human body and what kind of doses are needed for them to control or kill the virus.

Credit: 
Norwegian University of Science and Technology