Culture

Showing robots how to drive a car...in just a few easy lessons

Imagine if robots could learn from watching demonstrations: you could show a domestic robot how to do routine chores or set a dinner table. In the workplace, you could train robots like new employees, showing them how to perform many duties. On the road, your self-driving car could learn how to drive safely by watching you drive around your neighborhood.

Making progress on that vision, USC researchers have designed a system that lets robots autonomously learn complicated tasks from a very small number of demonstrations--even imperfect ones. The paper, titled "Learning from Demonstrations Using Signal Temporal Logic," was presented at the Conference on Robot Learning (CoRL) on Nov. 18.

The researchers' system works by evaluating the quality of each demonstration, so it learns from the mistakes it sees as well as the successes. While current state-of-the-art methods need at least 100 demonstrations to nail a specific task, this new method allows robots to learn from only a handful. It also allows robots to learn more intuitively, the way humans learn from each other -- you watch someone execute a task, even imperfectly, then try yourself. A demonstration doesn't have to be "perfect" for humans to glean knowledge from watching each other.

"Many machine learning and reinforcement learning systems require large amounts of data and hundreds of demonstrations--you need a human to demonstrate over and over again, which is not feasible," said lead author Aniruddh Puranic, a Ph.D. student in computer science at the USC Viterbi School of Engineering.

"Also, most people don't have programming knowledge to explicitly state what the robot needs to do, and a human cannot possibly demonstrate everything that a robot needs to know. What if the robot encounters something it hasn't seen before? This is a key challenge."

Learning from demonstrations

Learning from demonstrations is an increasingly popular approach to obtaining effective robot control policies -- which govern the robot's movements -- for complex tasks. But it is susceptible to imperfections in demonstrations and also raises safety concerns, as robots may learn unsafe or undesirable actions.

Also, not all demonstrations are equal: some are a better indicator of desired behavior than others, and the quality of a demonstration often depends on the expertise of the user providing it.

To address these issues, the researchers integrated "signal temporal logic," or STL, to evaluate the quality of demonstrations and automatically rank them to create inherent rewards.

In other words, even if some parts of a demonstration do not make sense based on the logic requirements, the robot can still learn from the imperfect demonstration. In a way, the system comes to its own conclusion about the accuracy or success of each demonstration.

"Let's say robots learn from different types of demonstrations -- it could be a hands-on demonstration, videos, or simulations -- if I do something that is very unsafe, standard approaches will do one of two things: either, they will completely disregard it, or even worse, the robot will learn the wrong thing," said co-author Stefanos Nikolaidis, a USC Viterbi assistant professor of computer science.

"In contrast, in a very intelligent way, this work uses some common-sense reasoning in the form of logic to understand which parts of the demonstration are good and which parts are not. In essence, this is exactly what humans do as well."

Take, for example, a driving demonstration where someone skips a stop sign. This would be ranked lower by the system than a demonstration of a good driver. But, if during this demonstration, the driver does something intelligent -- for instance, applies their brakes to avoid a crash -- the robot will still learn from this smart action.
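The ranking idea can be sketched in a few lines. In STL, a specification is evaluated not as a plain true/false but as a real-valued "robustness" score: positive when a signal satisfies the requirement, negative when it violates it, with the magnitude indicating by how much. The sketch below is our own minimal illustration, not the authors' code; the speed-limit requirement and the traces are hypothetical:

```python
# Hypothetical sketch: rank demonstrations by STL-style robustness.
# Robustness here is the worst-case margin by which a speed trace
# satisfies "always stay below the speed limit".

def always_below(signal, threshold):
    """Robustness of 'signal always stays below threshold':
    the worst-case margin over the whole trace."""
    return min(threshold - x for x in signal)

def rank_demonstrations(demos, speed_limit=30.0):
    """Sort speed traces from most to least robustly compliant."""
    scored = [(always_below(d, speed_limit), d) for d in demos]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

# Three demonstration speed traces: careful, borderline, and unsafe.
demos = [
    [10.0, 20.0, 25.0],   # stays well under the limit (robustness 5.0)
    [28.0, 29.5, 27.0],   # barely satisfies it (robustness 0.5)
    [20.0, 35.0, 25.0],   # briefly exceeds it (robustness -5.0)
]
ranked = rank_demonstrations(demos)  # careful trace first, violating trace last
```

Even the violating trace keeps a score rather than being discarded, which mirrors how the system can still extract signal from imperfect demonstrations.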

Adapting to human preferences

Signal temporal logic is an expressive mathematical symbolic language that enables robots to reason about current and future outcomes. While previous research in this area has used "linear temporal logic," STL is preferable in this case, said Jyo Deshmukh, a former Toyota engineer and USC Viterbi assistant professor of computer science.

"When we go into the world of cyber physical systems, like robots and self-driving cars, where time is crucial, linear temporal logic becomes a bit cumbersome, because it reasons about sequences of true/false values for variables, while STL allows reasoning about physical signals."

Puranic, who is advised by Deshmukh, came up with the idea after taking a hands-on robotics class with Nikolaidis, who has been working on developing robots to learn from YouTube videos. The trio decided to test it out. All three said they were surprised by the extent of the system's success and the professors both credit Puranic for his hard work.

"Compared to a state-of-the-art algorithm, being used extensively in many robotics applications, you see an order of magnitude difference in how many demonstrations are required," said Nikolaidis.

The system was tested using a Minecraft-style game simulator, but the researchers said the system could also learn from driving simulators and eventually even videos. Next, the researchers hope to try it out on real robots. They said this approach is well suited for applications where maps are known beforehand but there are dynamic obstacles in the map: robots in household environments, warehouses or even space exploration rovers.

"If we want robots to be good teammates and help people, first they need to learn and adapt to human preference very efficiently," said Nikolaidis. "Our method provides that."

"I'm excited to integrate this approach into robotic systems to help them efficiently learn from demonstrations, but also effectively help human teammates in a collaborative task."

Credit: 
University of Southern California

Maralixibat reduces debilitating itching in children with Alagille syndrome

Houston - (Nov 18, 2020) - On behalf of the Childhood Liver Disease Research Network (ChiLDReN), Texas Children's Hospital and Baylor College of Medicine researchers report that prolonged treatment with maralixibat resulted in clinically meaningful improvements in debilitating itching (pruritus) and related quality-of-life outcomes in children with Alagille syndrome. This syndrome is a rare genetic systemic disorder in which problems with bile flow can cause significant liver injury and potential liver failure necessitating liver transplantation. The novel pharmacological approach addresses a major unmet therapeutic need to control severe itching in pediatric patients with Alagille syndrome.

This was the first preliminary presentation of the data from the recently concluded multi-institutional clinical trial called IMAGINE II (ClinicalTrials.gov Identifier: NCT02117713). The results, presented as a late-breaker abstract, were shared by Dr. Benjamin Shneider, chief of Pediatric Gastroenterology, Hepatology and Nutrition at Texas Children's Hospital and professor of Pediatrics at Baylor College of Medicine, as an oral presentation on Nov. 15 at the Liver Meeting Digital Experience™, the virtual annual conference organized by the American Association for the Study of Liver Diseases (AASLD). The study was conducted under the auspices of the NIDDK-funded Childhood Liver Disease Research Network (ChiLDReN) in the context of a cooperative research and development agreement with Mirum Pharmaceuticals.

Severe cholestasis is common in children with Alagille Syndrome and can appear in infancy. In this condition, the flow of bile (fluid that helps digest fats) is reduced, presumably due to impaired development of bile ducts through which bile flows. Accumulation of bile in the liver leads to progressive damage in this critical organ. This genetic disorder can also lead to problems in the heart, kidneys, blood vessels, skeleton and others.

"Poor bile flow, which impacts a significant proportion of children with Alagille syndrome, manifests in part as debilitating itching. The urge to scratch is so strong and unrelenting that these young patients and their families are unable to sleep at night. Constant scratching can lead to bleeding and scarring of large swaths of skin, significant fatigue due to sleep deprivation and psychosocial effects, adversely impacting the quality of life," Shneider, who also holds the George Peterkin Endowed Chair at Baylor College of Medicine, said.

Standard and non-standard anti-itch medicines that are presently available have limited efficacy in children with Alagille syndrome who have moderate to severe pruritus. Currently, the only effective treatment options beyond available medications involve complex invasive surgery: either diverting the bile flow away from the liver (the biliary diversion procedure) or liver transplantation. Even so, these are not perfect solutions. While transplantation may address the itching, Alagille is a systemic disorder affecting several organs, so liver transplantation will not improve the other health problems confronting these patients, and the procedure carries its own risks.

The IMAGINE II clinical trial was designed specifically to address the problem of pruritus in Alagille patients. It was a follow-up to a previous study called ITCH, a 13-week randomized placebo-controlled trial, which was also led by Shneider and conducted under the auspices of the Childhood Liver Disease Research Network. The ITCH trial showed that maralixibat was safe and generally well tolerated and, more importantly, provided the first evidence of the short-term potential efficacy of this drug in treating pruritus in Alagille patients. At the conclusion of that study, all enrolled participants, irrespective of whether they received the drug or placebo, were offered the opportunity to roll over to the longer IMAGINE II trial. Since some patients had already received a short course of maralixibat in the previous ITCH trial, the results presented in this initial report are a combined analysis of the data from both trials.

The ITCH and IMAGINE II trials were based on the hypothesis that, since maralixibat inhibits the sodium-dependent bile acid transporter protein that facilitates the enterohepatic circulation of bile acids, it may pharmacologically mimic the beneficial effects of biliary diversion surgery. Maralixibat was granted Breakthrough Therapy Designation by the Food and Drug Administration (FDA) for Alagille syndrome in 2019.

Since itching is difficult to quantify, the researchers used two different qualitative scales to measure its severity: ItchRO, a novel electronic diary based on caregiver observations recorded twice a day, and a four-point clinician scratch scale (CSS) assessed by clinical investigators. In addition, they evaluated health-related quality-of-life outcomes using the PedsQL and a multidimensional fatigue scale. A "clinically meaningful" response was defined as a one-point reduction on the ItchRO and CSS scales and a five-point increase on the PedsQL.
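The response definitions are simple thresholds, and can be made concrete in a short sketch. This is our own illustration, not the trial's analysis code; the function name and the example scores are hypothetical:

```python
# Illustrative sketch of the reported response definitions (not the
# trial's analysis code; function name and example scores are made up).

def meaningful_response(baseline, final, scale):
    """True if the change from baseline meets the threshold for the scale."""
    if scale in ("ItchRO", "CSS"):   # itch scales: lower is better
        return baseline - final >= 1.0
    if scale == "PedsQL":            # quality-of-life scale: higher is better
        return final - baseline >= 5.0
    raise ValueError(f"unknown scale: {scale}")

# Hypothetical patient: CSS score drops from 3 to 1, PedsQL rises from 60 to 66.
responded_itch = meaningful_response(3, 1, "CSS")       # True: 2-point drop
responded_qol = meaningful_response(60, 66, "PedsQL")   # True: 6-point gain
```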

"At the end of >96 weeks of treatment, 90% and 80% of the participants who were able to receive maralixibat to that time point had clinically meaningful responses on the ItchRO and CSS scales, respectively. We were particularly excited to see 25% of participants show a dramatic four-point reduction in itching on the CSS scale. They went from severe itching (described as "cutaneous self-mutilation") to no features of scratching, which, as one can imagine, dramatically improves the quality of life for patients and their families. Although further complex analyses of these data and others are still needed to determine the impact of maralixibat on other aspects of the liver disease in Alagille syndrome, finding a drug that effectively treats pruritus is truly a game-changer for children and families who struggle with Alagille syndrome," Shneider concluded.

Credit: 
Baylor College of Medicine

Long-acting antipsychotic therapy plus cognitive training show promise for schizophrenia

image: Dr. Keith Nuechterlein

Image: 
UCLA Health

FINDINGS

UCLA scientists and colleagues found that the use of long-acting antipsychotic medication, combined with cognitive training in group settings, led to improved cognition and increased productivity.

Researchers say patients using a combination of long-acting antipsychotic medication and a multipronged cognitive remediation program that taught memory and problem-solving skills had significant improvements in work and school function.

BACKGROUND

Schizophrenia is a serious mental illness that affects how a person thinks, feels and behaves. People with schizophrenia may appear to have lost touch with reality, which can cause distress for family and friends and lead to permanent disability. Treatments delivered in a sustained manner can help people with schizophrenia engage in school or work, achieve independence and enjoy personal relationships.

METHODS

During a 12-month randomized controlled trial, 60 patients from the UCLA Aftercare Program who recently experienced a first episode of schizophrenia were randomized to oral or long-acting injectable antipsychotic medication and to either cognitive remediation or healthy behavior training. Cognitive remediation involved training in attention, memory, and problem-solving skills to help navigate complex, life-like situations. The healthy behavior training focused on nutrition, stress management, and exercise, with equal treatment time. All patients were provided supported employment and education to encourage return to work or school.

IMPACT

Systematic cognitive training, when combined with consistent antipsychotic medication adherence, achieved in this case through the use of a long-acting medication, can significantly improve cognitive deficits in the initial period of schizophrenia. These therapies also showed a separate, significant impact on work and school functioning.

Credit: 
University of California - Los Angeles Health Sciences

Three reasons why COVID-19 can cause silent hypoxia

Scientists are still solving the many puzzling aspects of how the novel coronavirus attacks the lungs and other parts of the body. One of the biggest and most life-threatening mysteries is how the virus causes "silent hypoxia," a condition in which oxygen levels in the body are abnormally low, which can irreparably damage vital organs if it goes undetected for too long. Now, thanks to computer models and comparisons with real patient data, Boston University biomedical engineers and collaborators from the University of Vermont have begun to crack the mystery.

Despite experiencing dangerously low levels of oxygen, many people with severe cases of COVID-19 show no symptoms of shortness of breath or difficulty breathing. Hypoxia's ability to quietly inflict damage is why it has been dubbed "silent." In coronavirus patients, it's thought that the infection first damages the lungs, rendering parts of them incapable of functioning properly. Those tissues lose oxygen and stop working, no longer infusing the bloodstream with oxygen, causing silent hypoxia. But exactly how that domino effect occurs has not been clear until now.

"We didn't know [how this] was physiologically possible," says Bela Suki, a BU College of Engineering professor of biomedical engineering and of materials science and engineering and one of the authors of the study. Some coronavirus patients have experienced what some experts have described as levels of blood oxygen that are "incompatible with life." Disturbingly, Suki says, many of these patients showed little to no signs of abnormalities when they underwent lung scans.

To help get to the bottom of what causes silent hypoxia, BU biomedical engineers used computer modeling to test out three different scenarios that help explain how and why the lungs stop providing oxygen to the bloodstream. Their research, which has been published in Nature Communications, reveals that silent hypoxia is likely caused by a combination of biological mechanisms that may occur simultaneously in the lungs of COVID-19 patients, according to biomedical engineer Jacob Herrmann, a research postdoctoral associate in Suki's lab and the lead author of the new study.

Normally, the lungs perform the life-sustaining duty of gas exchange, providing oxygen to every cell in the body as we breathe in and ridding us of carbon dioxide each time we exhale. Healthy lungs keep the blood oxygenated at a level between 95 and 100 percent--if it dips below 92 percent, it's a cause for concern and a doctor might decide to intervene with supplemental oxygen. (Early in the coronavirus pandemic, when clinicians first started sounding the alarm about silent hypoxia, oximeters flew off store shelves as many people, worried that they or their family members might have to recover from milder cases of coronavirus at home, wanted to be able to monitor their blood oxygen levels.)

The researchers first looked at how COVID-19 impacts the lungs' ability to regulate where blood is directed. Normally, if areas of the lung aren't gathering much oxygen due to damage from infection, the blood vessels will constrict in those areas. This is actually a good thing that our lungs have evolved to do, because it forces blood to instead flow through lung tissue replete with oxygen, which is then circulated throughout the rest of the body.

But according to Herrmann, preliminary clinical data have suggested that the lungs of some COVID-19 patients had lost the ability to restrict blood flow to already damaged tissue and, on the contrary, were potentially opening up those blood vessels even more--something that is hard to see or measure on a CT scan.

Using a computational lung model, Herrmann, Suki, and their team tested that theory, revealing that for blood oxygen levels to drop to the levels observed in COVID-19 patients, blood flow would indeed have to be much higher than normal in areas of the lungs that can no longer gather oxygen--contributing to low levels of oxygen throughout the entire body, they say.
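The flow-weighted mixing behind that result can be shown with a toy two-compartment calculation. This is our own sketch, not the team's computational lung model, and the saturation values are assumed purely for illustration:

```python
# Toy two-compartment illustration (not the authors' lung model; the
# saturation values are assumed). Arterial blood is a flow-weighted mix
# of blood from healthy regions (well oxygenated) and damaged regions
# (poorly oxygenated).

def mixed_saturation(flow_to_damaged, sat_healthy=0.98, sat_damaged=0.75):
    """Overall saturation when a fraction of blood flow passes through
    damaged, poorly oxygenating tissue."""
    return (1.0 - flow_to_damaged) * sat_healthy + flow_to_damaged * sat_damaged

# Normal regulation: vessels in damaged regions constrict, little flow goes there.
normal = mixed_saturation(0.05)    # stays in the healthy 95-100% range

# Regulation lost: far more blood routed through damaged regions.
impaired = mixed_saturation(0.40)  # falls below the 92% concern threshold
```

Even this crude mix shows why routing more blood through non-oxygenating tissue drags whole-body oxygen levels down, consistent with the model's conclusion.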

Next, they looked at how blood clotting may impact blood flow in different regions of the lung. When the lining of blood vessels get inflamed from COVID-19 infection, tiny blood clots too small to be seen on medical scans can form inside the lungs. They found, using computer modeling of the lungs, that this could incite silent hypoxia, but alone it is likely not enough to cause oxygen levels to drop as low as the levels seen in patient data.

Last, the researchers used their computer model to find out if COVID-19 interferes with the normal ratio of air-to-blood flow that the lungs need to function normally. This type of mismatched air-to-blood flow ratio is something that happens in many respiratory illnesses, such as with asthma patients, Suki says, and it can be a possible contributor to the severe, silent hypoxia that has been observed in COVID-19 patients. Their models suggest that for this to be a cause of silent hypoxia, the mismatch must be happening in parts of the lung that don't appear injured or abnormal on lung scans.

Altogether, their findings suggest that a combination of all three factors is likely responsible for the severe cases of low oxygen in some COVID-19 patients. By having a better understanding of these underlying mechanisms, and how the combinations could vary from patient to patient, clinicians can make more informed choices about treating patients using measures like ventilation and supplemental oxygen. A number of interventions are currently being studied, including a low-tech intervention called prone positioning that flips patients over onto their stomachs, allowing the back part of the lungs to pull in more oxygen and evening out the mismatched air-to-blood ratio.

"Different people respond to this virus so differently," says Suki. For clinicians, he says it's critical to understand all the possible reasons why a patient's blood oxygen might be low, so that they can decide on the proper form of treatment, including medications that could help constrict blood vessels, bust blood clots, or correct a mismatched air-to-blood flow ratio.

Credit: 
Boston University

UCF researchers identify features that could make someone a virus super-spreader

ORLANDO, Nov. 19, 2020 - New research from the University of Central Florida has identified physiological features that could make people super-spreaders of viruses such as COVID-19.

In a study appearing this month in the journal Physics of Fluids, researchers in UCF's Department of Mechanical and Aerospace Engineering used computer-generated models to numerically simulate sneezes in different types of people and determine associations between people's physiological features and how far their sneeze droplets travel and linger in the air.

They found that people's features, like a stopped-up nose or a full set of teeth, could increase their potential to spread viruses by affecting how far droplets travel when they sneeze.

According to the U.S. Centers for Disease Control and Prevention, the main way people are infected by the virus that causes COVID-19 is through exposure to respiratory droplets, such as from sneezes and coughs that are carrying infectious virus.

Knowing more about factors affecting how far these droplets travel can inform efforts to control their spread, says Michael Kinzel, an assistant professor in UCF's Department of Mechanical and Aerospace Engineering and study co-author.

"This is the first study that aims to understand the underlying 'why' of how far sneezes travel," Kinzel says. "We show that the human body has influencers, such as a complex duct system associated with the nasal flow that actually disrupts the jet from your mouth and prevents it from dispersing droplets far distances."

For instance, when people have a clear nose, such as from blowing it into a tissue, the speed and distance sneeze droplets travel decrease, according to the study.

This is because a clear nose provides a path in addition to the mouth for the sneeze to exit. But when a person's nose is congested, the area through which the sneeze can exit is restricted, causing the droplets expelled from the mouth to increase in velocity.

Similarly, teeth also restrict the sneeze's exit area and cause droplets to increase in velocity.

"Teeth create a narrowing effect in the jet that makes it stronger and more turbulent," Kinzel says. "They actually appear to drive transmission. So, if you see someone without teeth, you can actually expect a weaker jet from the sneeze from them."
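The narrowing effect follows from conservation of mass for an incompressible jet: for a fixed volumetric flow rate, exit velocity scales inversely with exit area. A back-of-the-envelope sketch (our illustration, not the study's 3D simulation; the flow rate and areas are made-up numbers):

```python
# Back-of-the-envelope sketch (not the study's simulation; the numbers
# are invented). For an incompressible jet, volumetric flow is conserved:
# A1 * v1 = A2 * v2, so shrinking the exit area raises the exit velocity.

def exit_velocity(flow_rate_cm3_s, exit_area_cm2):
    """Jet velocity (cm/s) for a volumetric flow through a given exit area."""
    return flow_rate_cm3_s / exit_area_cm2

open_exit = exit_velocity(1000.0, 4.0)        # unobstructed mouth opening
restricted_exit = exit_velocity(1000.0, 2.0)  # teeth/congestion halve the area
# Halving the exit area doubles the jet velocity.
```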

To perform the study, the researchers used 3D modeling and numerical simulations to recreate four mouth and nose types: a person with teeth and a clear nose; a person with no teeth and a clear nose; a person with no teeth and a congested nose; and a person with teeth and a congested nose.

When they simulated sneezes in the different models, they found that the spray distance of droplets expelled when a person has a congested nose and a full set of teeth is about 60 percent greater than when they do not.

The results indicate that when someone keeps their nose clear, such as by blowing it into a tissue, they could be reducing the distance their germs travel.

The researchers also simulated three types of saliva: thin, medium and thick.

They found that thinner saliva resulted in sneezes composed of smaller droplets, which created a spray and stayed in the air longer than medium and thick saliva.

For instance, three seconds after a sneeze, when thick saliva was reaching the ground and thus diminishing its threat, the thinner saliva was still floating in the air as a potential disease transmitter.

The work ties back to the researchers' project to create a COVID-19 cough drop that would give people thicker saliva to reduce the distance droplets from a sneeze or cough would travel, and thus decrease disease-transmission likelihood.

The findings yield novel insight into variability of exposure distance and indicate how physiological factors affect transmissibility rates, says Kareem Ahmed, an associate professor in UCF's Department of Mechanical and Aerospace Engineering and study co-author.

"The results show exposure levels are highly dependent on the fluid dynamics that can vary depending on several human features," Ahmed says. "Such features may be underlying factors driving superspreading events in the COVID-19 pandemic."

The researchers say they hope to move the work toward clinical studies next to compare their simulation findings with those from real people from varied backgrounds.

Study co-authors were Douglas Fontes, a postdoctoral researcher with the Florida Space Institute and the study's lead author, and Jonathan Reyes, a postdoctoral researcher in UCF's Department of Mechanical and Aerospace Engineering.

Fontes says to advance the findings of the study, the research team wants to investigate the interactions between gas flow, mucus film and tissue structures within the upper respiratory tract during respiratory events.

"Numerical models and experimental techniques should work side by side to provide accurate predictions of the primary breakup inside the upper respiratory tract during those events," he says.

"This research potentially will provide information for more accurate safety measures and solutions to reduce pathogen transmission, giving better conditions to deal with the usual diseases or with pandemics in the future," he says.

Credit: 
University of Central Florida

Tel Aviv University study finds hyperbaric oxygen treatments reverse aging process

A new study from Tel Aviv University (TAU) and the Shamir Medical Center in Israel indicates that hyperbaric oxygen treatments (HBOT) in healthy aging adults can stop the aging of blood cells and reverse the aging process. In the biological sense, the adults' blood cells actually grow younger as the treatments progress.

The researchers found that a unique protocol of treatments with high-pressure oxygen in a pressure chamber can reverse two major processes associated with aging and its illnesses: the shortening of telomeres (protective regions located at both ends of every chromosome) and the accumulation of old and malfunctioning cells in the body. Focusing on immune cells containing DNA obtained from the participants' blood, the study found a lengthening of the telomeres by up to 38%, as well as a decrease of up to 37% in the presence of senescent cells.

The study was led by Professor Shai Efrati of the Sackler School of Medicine and the Sagol School of Neuroscience at TAU and Founder and Director of the Sagol Center of Hyperbaric Medicine at the Shamir Medical Center; and Dr. Amir Hadanny, Chief Medical Research Officer of the Sagol Center for Hyperbaric Medicine and Research at the Shamir Medical Center. The clinical trial was conducted as part of a comprehensive Israeli research program that targets aging as a reversible condition.

The paper was published in Aging on November 18, 2020.

"For many years our team has been engaged in hyperbaric research and therapy - treatments based on protocols of exposure to high-pressure oxygen at various concentrations inside a pressure chamber," Professor Efrati explains. "Our achievements over the years included the improvement of brain functions damaged by age, stroke or brain injury.

"In the current study we wished to examine the impact of HBOT on healthy and independent aging adults, and to discover whether such treatments can slow down, stop or even reverse the normal aging process at the cellular level."

The researchers exposed 35 healthy individuals aged 64 or over to a series of 60 hyperbaric sessions over a period of 90 days. Each participant provided blood samples before, during and at the end of the treatments as well as some time after the series of treatments concluded. The researchers then analyzed various immune cells in the blood and compared the results.

The findings indicated that the treatments actually reversed the aging process in two of its major aspects: The telomeres at the ends of the chromosomes grew longer instead of shorter, at a rate of 20%-38% for the different cell types; and the percentage of senescent cells in the overall cell population was reduced significantly - by 11%-37% depending on cell type.

"Today telomere shortening is considered the 'Holy Grail' of the biology of aging," Professor Efrati says. "Researchers around the world are trying to develop pharmacological and environmental interventions that enable telomere elongation. Our HBOT protocol was able to achieve this, proving that the aging process can in fact be reversed at the basic cellular-molecular level."

"Until now, interventions such as lifestyle modifications and intense exercise were shown to have some inhibiting effect on telomere shortening," Dr. Hadanny adds. "But in our study, only three months of HBOT were able to elongate telomeres at rates far beyond any currently available interventions or lifestyle modifications. With this pioneering study, we have opened a door for further research on the cellular impact of HBOT and its potential for reversing the aging process."

Credit: 
American Friends of Tel Aviv University

Analysis of the relations between Spanish civil society organizations and science

Researchers at UPF have analyzed the relationship between civil society organizations and the Spanish science and technology system. The study, published in Public Understanding of Science, was conducted by Carolina Llorente and Gema Revuelta of the Science, Communication and Society Studies Centre (CCS-UPF) and Mar Carrió of the Health Sciences Educational Research Group (GRECS).

In recent decades, various movements have emerged promoting the inclusion of society in the research process in order to build more socially relevant science. This new model of scientific production is becoming established in Europe and increasingly worldwide. "Often, social participation does not take place individually, but through civil society organizations, so our study, for the first time in Spain, explores the interactions between these organizations and science", Carolina Llorente explains. "Understanding the perspectives of these organizations is useful for proposing effective tools to help strengthen relations between science and society", she adds. The concept of organized civil society includes non-profit organizations in which citizens are generally involved on a voluntary basis: patient and consumer associations, organizations working for the environment or animal rights, humanitarian associations, groups of minorities, etc.

The analysis was based on semi-structured interviews with managers of 31 Spanish organizations, selected taking into account their characteristics and distribution across the country. In Spain, there are three so-called unique organizations, La ONCE, the Red Cross and Cáritas, which account for over 60% of the country's volunteers. There are also groups of organizations (federations), but the bulk of volunteers belong to small, decentralized associations that are highly active at the local level but generally have few financial resources.

The study results show that a large number of organizations are not involved in science and technology or, in some cases, are not aware of their involvement. The most common type of collaboration is to act as research study subjects, for example when members of social associations are interviewed. Such is the case of investigations that examine the role of certain minorities (e.g., religious or linguistic) and contact organizations dealing with such matters to interview their members. To a lesser extent, organizations, usually patient organizations, fund research through calls to tender, prizes and awards, or they carry out research within the organization. In some cases, organizations also participate as advisors or in training that targets researchers, transferring their sectoral knowledge to the academic environment.

Regarding hindrances to participation, interviewees agree that the main one is a lack of financial resources and personnel. But they also mention a lack of mutual knowledge: scientists do not know what the organizations are doing, and the organizations are not aware of what they can contribute or do not know how they can engage in scientific production.

In the words of Mar Carrió: "as a strategy to improve ties, we believe there is a need to encourage researchers to know how to integrate into the organizations and vice versa, for these groups to gain greater knowledge of how science works".

As for the ideal relationship that the associations wish to have with the science and technology system, civil society organizations generally appear to be unaware of their own potential and of what they can contribute to research. Nevertheless, the results indicate that these associations are willing to engage in scientific production, for example by proposing that they be consulted from the outset to help shape research.

"In order to promote relations between science and society, there is a need to strengthen alliances between these two worlds. This could be done through better communication between academia and civil society organizations and, therefore, researchers require solid training in this field", Gema Revuelta affirms. "But, we also have to open channels that allow formal, stable relations between institutions and align the research goals with the expectations of society", she concludes.

Credit: 
Universitat Pompeu Fabra - Barcelona

How rotavirus causes severe gastrointestinal disease

Rotavirus is a major cause of diarrhea and vomiting, especially in children, resulting in approximately 128,000 deaths annually. The virus triggers the disease by infecting enterocyte cells in the small intestine, but only a fraction of the susceptible cells harbors the virus. In the mid-90s, scientists proposed that this small portion of infected cells promotes severe disease by sending out signals that disrupt the normal function of neighboring uninfected cells, but the nature of the signal has remained a mystery.

In the current study published in the journal Science, a team led by researchers at Baylor College of Medicine discovered that rotavirus-infected cells release a signaling molecule, identified as adenosine diphosphate (ADP), which binds to its cellular receptor P2Y1 on neighboring cells. Activation of P2Y1 by ADP produces signals called intercellular calcium waves in these uninfected cells. Disrupting ADP binding to its receptor reduced the severity of diarrhea in a mouse model of the disease, suggesting that targeting P2Y1 may be an effective strategy to control viral diarrhea in human populations.

"In our previous studies using fluorescent calcium sensors and time-lapse imaging, we discovered that rotavirus-infected cells display aberrant calcium signals that we can visualize as bright pulses of intercellular calcium waves that radiate from the infected cells," said corresponding author Dr. Joseph Hyser, assistant professor of virology and microbiology and member of the Alkek Center for Metagenomic and Microbiome Research at Baylor. "Calcium signaling was known to be associated with various aspects of rotavirus infection and our work revealed the dynamic nature of the alterations induced by rotavirus."

In the current study, the researchers conducted the experiments using a lower dose of the virus and noticed that it was not just the virus-infected cells that showed dynamic calcium signaling, but also the adjacent uninfected cells surrounding the infected cells produced pulses of calcium waves that were coordinated with those of the infected cells. This observation suggested that the infected cells could be triggering intercellular calcium waves in the uninfected cells.

The researchers connected their observation to a concept proposed in the mid-90s suggesting that rotavirus-infected cells send signals to neighboring uninfected cells that disrupt their function, promoting diarrhea and vomiting.

"Our videos of live fluorescent microscopy showing intercellular calcium wave signaling in both rotavirus-infected and uninfected cells provided an unprecedented means to investigate the nature of the proposed signal, which had not been identified," said first author Dr. Alexandra L. Chang-Graham, an M.D./Ph.D. student in the Medical Scientist Training Program who completed her Ph.D. thesis working in the Hyser lab.

Finding the signal

Chang-Graham, Hyser and their colleagues worked with three laboratory models to identify the signal that triggers intercellular calcium waves in uninfected cells: a monkey kidney cell line commonly used to study rotavirus; human intestinal enteroids, a cultivation system that recapitulates many of the characteristics of the human infection; and a neonatal mouse model of rotavirus infection and diarrhea.
Their studies showed that suspected triggers of calcium waves, such as prostaglandin E2 and nitric oxide, did not elicit a calcium response. Then they tested ATP and ADP, known mediators of calcium signaling that had not been previously associated with rotavirus infection. They found that rotavirus-infected cells triggered intercellular calcium waves by releasing ADP that binds to its receptor, P2Y1, on uninfected neighboring cells. Knocking out the P2Y1 gene, which prevents ADP from signaling, reduced intercellular calcium waves.

"Across the three model systems we consistently found evidence that rotavirus-infected cells signal uninfected cells with ADP and that this contributes to the severity of the disease," Chang-Graham said. "We consider it paradigm shifting that the actual signal, ADP, was not even on the radar before."

ADP signaling is involved in triggering severe rotavirus symptoms

Further studies revealed previously unknown roles of ADP in rotavirus infection and replication, highlighting ADP as an important trigger of the multiple factors involved in the severe diarrhea and vomiting caused by rotavirus. For instance, the researchers found evidence that ADP signaling increases rotavirus infection, the expression of the inflammatory cytokine IL1-alpha, and the secretion of serotonin, an inducer of diarrhea. ADP signaling also increases the expression of enzymes that produce prostaglandin and nitric oxide, potentially causing the increases in those compounds observed in rotavirus infection. Preventing ADP signaling and intercellular calcium waves reduced the production of these compounds.

"Finally, we determined that inhibiting the P2Y1 receptor reduced the severity of rotavirus-induced diarrhea in a mouse model," Chang-Graham said. "Using intercellular calcium waves, rotavirus amplifies its ability to cause disease beyond the cells it directly infects. This is the first virus identified to activate ADP-mediated intercellular calcium waves. This may be a strategy that other viruses also use to cause disease in their hosts."

"Our findings add a new and very potent signaling pathway into the causative mechanisms of rotavirus diarrhea," Hyser said. "In terms of treatment, this is exciting because currently some drugs targeting P2Y1 are undergoing preclinical testing as anticlotting drugs. It's possible that such drugs could be repurposed, if proven to be safe for children, to be used to treat diarrhea caused by rotavirus infection."

Credit: 
Baylor College of Medicine

Predicting preterm births

Predicting preterm birth can be difficult, especially for women who have not given birth. It has long been known that the best predictor of preterm birth is someone who has had a prior preterm birth; however, this information is helpful only in second and subsequent pregnancies. For women in their first pregnancy, it is a challenge for obstetricians and midwives to advise them on their risks. To address this issue, researchers at Baylor College of Medicine and Texas Children's Hospital studied how family history can predict preterm birth. Their findings were published in the American Journal of Obstetrics & Gynecology.

"This is a retrospective study of prospective data," said Dr. Kjersti Aagaard, professor of obstetrics and gynecology at Baylor and Texas Children's Hospital. "We developed a biobank and data repository called PeriBank where we consistently asked our pregnant patients a set of questions about their familial history. We were able to take that detailed data and determine if that specific woman's family history did or did not predict her delivering preterm."

Once familial information was gathered, the research team quantified estimates of preterm birth risk based on a history of preterm birth in the pregnant patient herself, her sister(s), her mother, her grandmothers, and her aunts and great-aunts. Their findings covered scenarios for women who have previously given birth (multiparous) as well as women who have never given birth (nulliparous). If a nulliparous woman was herself born preterm, her relative risk of delivering preterm was 1.75-fold higher. If her sister delivered preterm, her relative risk was 2.25-fold higher. If her grandmother or aunt delivered preterm, there was no significant increase in risk. If a multiparous mother with no prior preterm births was herself born preterm, her risk was 1.84-fold higher. However, if her sister, grandmother or aunt delivered preterm, there was no significant increase.
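The fold-increases above are relative risks: the incidence of preterm birth among women with a given family history divided by the incidence among women without it. A minimal sketch of that calculation, using made-up counts purely for illustration (not the study's data):

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk: incidence in the exposed group divided by
    incidence in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical counts for illustration only: 35 preterm deliveries among
# 200 women who were themselves born preterm, versus 100 among 1,000
# women who were not. 0.175 / 0.10 gives a 1.75-fold relative risk.
rr = relative_risk(35, 200, 100, 1000)
print(f"Relative risk: {rr:.2f}")
```

A ratio near 1.0 would indicate no association; confidence intervals (not shown here) determine whether an increase like this is statistically significant.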

"We've managed over the years to collect data from a very large population of pregnant women that reflect Houston. There was considerable diversity by race, ethnicity, culture and socioeconomic status. This was a key strength of our study. With this breadth and depth of data reflective of the diversity of Houston, we were able to ask some good questions, which gave us really important information about 'heritability' of risk," Aagaard said.

The research team showed that preterm births cannot be fully attributed to genetics, Aagaard said. Family members may share DNA or genetic code, but the same generation of family members are more likely to share social determinants or have experienced systemic racism and bias. This was best demonstrated by their finding that a history of preterm birth in the pregnant woman or her sister was significantly associated with preterm birth, while a grandmother or aunt was not. These same-generation predictors are generally thought to reflect more about common environmental or social exposures (or a combination of limited genetics plus common exposures) than genetic linkages.

"We know that for the majority of women who deliver a baby preterm, we cannot say that the cause of that preterm birth was in whole or in part genetics. Rather, this study provides subtle but important clues that it is more likely the shared familial background and its exposures that render risk," Aagaard said. "We hope others will similarly be mindful of those subtle characteristics when looking at heritability and risk. We remain committed to finding the underlying true causal and driving factors. In the meantime, we provide for the first time some reliable risk estimates for first-time moms based on their own and their family's history of preterm birth."

Credit: 
Baylor College of Medicine

College students are less food insecure than non-students

image: College students in a University of Illinois dining hall.

Image: 
L. Brian Stauffer, University of Illinois

URBANA, Ill. - College students are significantly less likely to be food insecure than non-students in the same age group, according to a new study from the University of Illinois.

"College hunger" has been widely reported in the media, and several studies found very high food insecurity rates among college students, sometimes up to 50 or 60%. "That did not make sense to those of us doing research on food insecurity, so I wanted to check those findings," says Craig Gundersen, agricultural economist at U of I.

Gundersen, who conducts research on food insecurity measures, notes many of those studies used small, non-representative samples with low response rates. To provide more accurate results, he analyzed data from the 2014 to 2018 Current Population Survey (CPS), a national survey covering about 50,000 households. CPS is the official data source for food insecurity measures in the U.S.

His findings were clear.

"No matter how you look at it, college students have far lower rates of food insecurity than both non-college students of similar ages and the general population," he says.

According to his study, 9.9% of full-time college students ages 18 to 25 are food insecure, compared to 16.8% for non-students of the same age group. In the general population, about 12.5% were food insecure during the study's time frame.

Gundersen found the same trend for 26- to 30-year-olds (where just 7% are full-time students) though the gap is smaller than for the younger age group.

Only for part-time students, especially those in the 26- to 30-year-old group, are food insecurity rates almost equivalent to those of non-students.

Because CPS collects data per household, parents may be responding for adult children living at home. To account for this, Gundersen compared data from young people living on their own and those living with parents. The overall trend was similar, but the difference was even starker: 9.1% of students versus 18.4% of non-students were food insecure according to parent responses.

The pattern holds true across demographic groups, except for disabled students, where the food insecurity rate is closer to that of non-students.

This doesn't mean college hunger is a myth. Gundersen emphasizes one in 10 college students is still food insecure.

However, the problem is much more serious for non-students in the same age groups, and that has implications for food aid relief and intervention policies.

"The main conclusion from this study is that full-time college students have food insecurity rates that are far below those of non-college students of similar ages, and quite a bit below those of the general population," he concludes. "Therefore, in thinking about who we should be especially concerned about with respect to policy and other interventions, it is those who are not in college, in the age group from 18 to 25, rather than college students."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

A meta-analysis of major complications between traditional pacemakers and leadless pacemakers

In a new publication from Cardiovascular Innovations and Applications; DOI https://doi.org/10.15212/CVIA.2019.0596, Diyu Cui, Yimeng Liao, Jianlin Du and Yunqing Chen from The Second Affiliated Hospital of Chongqing Medical University, Chongqing, China consider major complications between traditional pacemakers and leadless pacemakers.

Leadless pacemakers, which are increasingly used in clinical practice, have several advantages compared with traditional pacemakers in avoiding pocket- and lead-related complications. However, the clinical effect of leadless pacemakers remains controversial.

The authors' meta-analysis appears to favour leadless pacemakers over traditional pacemakers with regard to major complications, indicating that leadless pacemakers have potential for future clinical application. However, the use of leadless pacemakers remains controversial, and more randomized controlled studies are warranted to explore their safety and practicality.

Credit: 
Compuscript Ltd

More than 1.1 million deaths among Medicare recipients due to high cost of drugs

WASHINGTON, DC and SAN DIEGO, CA - Nov. 19, 2020 - More than 1.1 million Medicare patients could die over the next decade because they cannot afford to pay for their prescription medications, according to a new study released today by the West Health Policy Center, a nonprofit and nonpartisan policy research group and Xcenda, the research arm of the drug distributor AmerisourceBergen.

If current drug pricing trends continue, researchers estimate that cost-related non-adherence to drug therapy will result in the premature deaths of 112,000 beneficiaries a year, making it a leading cause of death in the U.S., ahead of diabetes, influenza, pneumonia, and kidney disease. Millions more will suffer worsening health conditions and run up medical expenses that will cost Medicare an additional $177.4 billion by 2030, or roughly $18 billion a year over the next 10 years.

Researchers also modeled what would happen if Medicare was allowed to bring down drug prices for its beneficiaries through direct negotiation with drug companies, as described in H.R. 3, the Elijah E. Cummings Lower Drug Costs Now Act, passed by the U.S. House of Representatives last year. They found Medicare negotiation could result in 94,000 fewer deaths annually. Additionally, the model found that the policy would reduce Medicare spending by $475.9 billion by 2030.

"One of the biggest contributors to poor health, hospital admissions, higher healthcare costs and preventable death is patients failing to take their medications as prescribed," said Timothy Lash, President, West Health Policy Center. "Cost-related nonadherence is a significant and growing issue that is a direct result of runaway drug prices and a failure to implement policies and regulations that make drugs more affordable."

The price of prescription medications has skyrocketed in recent years. Between 2007 and 2018, list prices for branded pharmaceutical products increased by 159%, with few signs of slowing. According to the Centers for Medicare & Medicaid Services (CMS), spending on prescription drugs will grow faster than any other major medical good or service over the next several years.

Under Medicare, beneficiaries must pay 25% of the cost of generic and brand-name medications. For many people with multiple chronic conditions, this could add up to thousands of dollars a year in out-of-pocket costs.

"The costs of doing nothing about high drug prices are too high especially when policy changes such as allowing Medicare to negotiate drug prices would result in saving millions of lives and billions of dollars," said Sean Dickson, Director of Health Policy at West Health Policy Center and Chair of the Council for Informed Drug Spending Analysis (CIDSA), an independent group of experts on drug spending from leading academic institutions.

For the study, researchers developed a 10-year model representative of the majority of Medicare beneficiaries with chronic conditions. The model allows users to estimate how different levels of price reductions would lower the number of premature deaths and decrease Medicare spending on a sliding scale. The interactive tool and the complete technical report can be found on the CIDSA website.

Credit: 
West Health Institute

Afro-Caribbean patients with severe kidney disease at greater risk of hospitalisation from COVID-19

Afro-Caribbean people with end stage kidney disease (ESKD) are more likely to be hospitalised with COVID-19 than other ethnicities, a study has found.

Research published today in Nephrology by King's College London and Guy's and St Thomas' NHS Foundation Trust investigated the link between ethnicity and COVID-19 outcomes in people with ESKD. The researchers found that patients of Afro-Caribbean ethnicity have a four-fold increased risk of being hospitalised with COVID-19 when compared with kidney transplant patients attending for routine care.

The study examined 39 people with diabetes related ESKD hospitalised with COVID-19 at Guy's and St Thomas' between March and April 2020. Of the hospitalised cohort, 73% of patients with a kidney transplant and 54% of haemodialysis patients were of Afro-Caribbean ethnicity. By comparison in patients attending hospital for routine care 18% of kidney transplant patients and 42% of haemodialysis patients are of Afro-Caribbean ethnicity.

The study concluded that ESKD is associated with a high mortality rate of 36% among severely ill patients hospitalised with COVID-19.

First author of the study, Dr Antonella Corcillo from the School of Cardiovascular Medicine and Sciences at King's College London said: "There is very little information on the clinical features and outcomes in patients with ESKD admitted with COVID-19 in the UK. Patients with ESKD are at high risk of severe COVID-19 and often have a poor prognosis. The mortality rate of patients hospitalised, as they had very severe COVID-19, was high at 36% and similar to other recent studies internationally. We observed a disproportionately high prevalence of people of Afro-Caribbean ethnicity being hospitalised. We also saw that low blood glucose levels (hypoglycaemia) were common during hospitalisation in this high-risk population and that adjustment of diabetes treatment was frequently required."

Senior author, Dr Janaka Karalliedde from King's College London, said: "People with ESKD are a high-risk group and are vulnerable to severe COVID-19 infection. When we compared the prevalence of Afro-Caribbean ethnicity in kidney transplant patients admitted with severe COVID-19 to the population of transplant patients attending for routine care at our hospital we observed a more than a four-fold increase in patients of Afro-Caribbean origin being admitted with COVID-19. Further studies and research are urgently required to understand and explain this observation of disproportionate risk in patients of Afro-Caribbean origin. Our data also confirm that the management of diabetes in the setting of severe COVID-19 infection is very challenging and reinforce the importance of integrated multidisciplinary care and teamwork for patients with diabetes hospitalized with COVID-19."

Limitations of this study include its relatively small sample size and that, as it is a cross-sectional study, it is unable to identify causal relationships between ethnicity and severe COVID-19 outcomes.

Credit: 
King's College London

Truffle munching wallabies shed new light on forest conservation

image: It is critical to understand the role of animals in forest ecosystem health.

Image: 
Todd F Elliott

Feeding truffles to wallabies may sound like a madcap whim of the jet-setting elite, but it may give researchers clues to preserving remnant forest systems.

Dr Melissa Danks from Edith Cowan University in Western Australia led an investigation into how swamp wallabies spread truffle spores around the environment. Results demonstrate the importance of these animals to the survival of the forest.

"There are thousands of truffle species in Australia and they play a critical role in helping our trees and woody plants to survive," she said.

"Truffles live in a mutually beneficial relationship with these plants, helping them to take up water and nutrients and to defend against disease.

"Unlike mushrooms where spores are dispersed through wind and water from their caps, truffles are found underground with the spores inside an enclosed ball - they need to be eaten by an animal to move their spores."

Dr Danks and colleagues at the University of New England investigated the role of swamp wallabies in dispersing these spores.

"Wallabies are browsing animals that will munch on ferns and leaves as well as a wide array of mushrooms and truffles," she said.

"This has helped them to be more resilient to changes in the environment than smaller mammals with specialist diets like potoroos.

"We were interested in finding out whether swamp wallabies have become increasingly important in truffle dispersal with the loss of these other mammals."

Conservation by poo tracking

The team fed truffles to wallabies and timed how long it would take for the spores to appear in the animals' poo. Most spores appeared within 51 hours, with some taking up to three days.

Armed with this information, the researchers attached temporary GPS trackers to wallabies to map how far they move over a three-day period.

Results showed the wallabies could move hundreds of metres, and occasionally more than 1200 metres, from the original truffle source before the spores appeared in their poo, making them very effective at dispersing truffles around the forest.
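The method combines two measurements: how long spores take to pass through the gut (mostly within 51 hours) and how far a GPS-tracked wallaby moves in that window. A minimal sketch of the idea, using made-up GPS fixes (the function name and coordinates are illustrative, not from the study):

```python
import math

def displacement_within_window(fixes, passage_hours):
    """Greatest straight-line distance (metres) from the starting fix
    reached within the gut-passage window. Each fix is a tuple of
    (hours_elapsed, x_metres, y_metres) on a local metre grid."""
    x0, y0 = fixes[0][1], fixes[0][2]
    return max(
        math.hypot(x - x0, y - y0)
        for t, x, y in fixes
        if t <= passage_hours
    )

# Hypothetical GPS fixes for illustration: (hours, x metres, y metres).
# Only fixes within the 51-hour passage window count toward dispersal.
fixes = [(0, 0, 0), (12, 300, 400), (30, 900, 0), (48, 100, 100), (60, 1500, 0)]
print(displacement_within_window(fixes, 51))
```

Applied across many tracked animals, the distribution of such displacements gives an estimate of how far spores are likely to travel before being deposited.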

Dr Danks said this research had wide ranging conservation implications for Australian forests.

"As forest systems become more fragmented and increasingly under pressure, understanding spore dispersal systems is really key to forest survival," Dr Danks said.

"Many of our bushland plants have a partnership with truffles for survival and so it is really critical to understand the role of animals in dispersing these truffle spores.

"Our research on swamp wallabies has demonstrated a simple method to predict how far an animal disperses fungal spores in a variety of landscapes."

Credit: 
Edith Cowan University

Tau protein changes in Alzheimer's disease correlate with dementia stage

Research into Alzheimer's disease has long focused on understanding the role of two key proteins, beta amyloid and the tau protein. Found in tangles in patients' brain tissue, a pathological form of the tau protein contributes to propagating the disease in the brain.

In new research from their joint laboratory, Judith Steen, PhD, and Hanno Steen, PhD, show for the first time that this pathological tau protein changes its form over time, which could mean it will take multiple drugs to target it effectively.

For years, pharmaceutical companies largely focused, with very limited success, on developing Alzheimer's drugs against beta amyloid. More recently, drug discovery has shifted to drugs against tau. The new finding may help drugs now in development against tau avoid the same fate, because it shows that the protein may present one set of targets at early stages of the disease and different targets at later stages.

"The tau protein in Alzheimer's disease looks different at every stage," says lead investigator Judith Steen. "We discovered that tau undergoes a series of chemical modifications in a stepwise process that correlates with disease severity. This suggests that we need to have different diagnostics and treatments for every stage of disease."

Published in a paper in Cell, this research represents years of work in the Steen lab funded by the Tau Consortium, whose mission is to accelerate the discovery of new treatments for Alzheimer's and other diseases related to "tauopathies" -- abnormalities of the tau protein.

Tau changes according to dementia stage

The Steen lab team looked at tau aggregates in tissues of two areas of the human brain, the frontal gyrus and the angular gyrus, from 49 patients with AD and 42 age-matched individuals without known Alzheimer's or dementia. They found that the chemistry of the tau protein changed in Alzheimer's patients. The tau had several modifications not found on normal tau, known as post-translational chemical modifications or PTMs. The specific chemically modified forms of tau correlated with dementia stage.

Stepwise chemical modifications

As we age, tau tangles accumulate in our brains, even if we don't develop dementia. "But these aggregates are not as abundant and don't look like the tau aggregates in patients who have severe Alzheimer's disease," says Steen. She believes that tau undergoes chemical changes after it's first made in the body. The team identified 95 PTMs of the tau protein; about one-third of them had not been described before.

"Chemical processing by a variety of enzymes introduces modifications to the original tau protein," says Steen. Examples included addition of phosphate, methyl, acetyl, and ubiquitin groups, and others. The first step of disease appears to start with adding phosphate, with other changes following in a step-wise process. "We were able to see precisely what those modifications were, the extent of the changes, and we also could map it to a precise area of the tau protein," Steen says.

More than one drug likely needed

The team's discovery has immediate implications for the future of Alzheimer's treatment.

"It is likely that to successfully target tau protein with an antibody or small molecule, you may need more than one to clear it," explains Steen. "Early intervention may need different therapeutics compared with late stage Alzheimer's because of the distinct PTM profiles associated with each stage of disease."

The team is hopeful that exploring some of these crucial chemical modifications may help explain how Alzheimer's develops and progresses, as well as further reveal the chemistry of the tau protein in its earliest stages.

FLEXITau technology platform

Key to the study was use of a novel mass spectrometry technique developed within the lab to measure proteins. Called FLEXITau, and first described in a paper in 2016, it can sequence and quantify tau's makeup with unprecedented accuracy. FLEXITau utilizes biochemistry, mass spectrometry and data analysis to provide information on chemical modifications of tau in brain tissue.

The Steen lab leveraged the platform to study brain tissues collected by NIH-funded brain banks from patients with neurodegenerative diseases, using it to analyze thousands of proteins from these tissues.

The FLEXITau approach provides a deeper understanding of tau pathology that the team believes may lead to novel therapeutics. It may also be useful for developing better diagnostic tools to identify biomarkers of Alzheimer's and other tau-associated diseases.

Credit: 
Boston Children's Hospital