Culture

International team discovers new species of flying reptiles

image: Three new species of toothed pterosaurs have been discovered

Image: 
Nizar Ibrahim, University of Detroit Mercy

A community of flying reptiles that inhabited the Sahara 100 million years ago has been discovered by a University of Portsmouth palaeontologist and an international team of scientists.

Professor David Martill from Portsmouth and researchers from the United States and Morocco identified the three new species of toothed pterosaurs.

The pterosaurs were part of an ancient river ecosystem in Africa that was full of life, including fish, crocodiles, turtles and several predatory dinosaurs.

The research was led by Megan Jacobs from Baylor University, Texas, who worked alongside Professor Martill and Nizar Ibrahim from the University of Detroit Mercy.

The new fossils, published in the journal Cretaceous Research, are helping to uncover the very poorly known evolutionary history of Africa during the time of the dinosaurs. The new finds show that African pterosaurs were quite similar to those found on other continents.

These pterosaurs soared above a world dominated by predators, including crocodile-like hunters and carnivorous dinosaurs. Interestingly, herbivores such as sauropods and ornithischian dinosaurs are rare. Many of the predators, including the toothed pterosaurs, preyed on a superabundance of fish.

Professor Martill said: "We are in a golden age for discovering pterodactyls. This year alone we have discovered three new species and we are only into March."

The new pterosaurs, identified by the researchers from chunks of jaws and teeth found in the middle Cretaceous Kem Kem beds of Morocco, had wingspans of around three to four metres. These aerial fishers snatched up their prey while on the wing, using a murderous-looking set of large spike-like teeth that formed a highly effective tooth grab. Large pterosaurs such as these would have been able to forage over vast distances, similar to present-day birds such as condors and albatrosses.

"These new finds provide an important window into the world of African pterosaurs," said Ibrahim, assistant professor of Biology at Detroit Mercy. "We know so much more about pterosaurs from places like Europe and Asia, so describing new specimens from Africa is always very exciting."

One of the new species belongs to Anhanguera, a genus previously known only from Brazil. Another belongs to Ornithocheirus, which had until now only been found in England and Central Asia.

Credit: 
University of Portsmouth

Fleeing Nazis shaped Austrian politics for generations after World War II

A new study in The Economic Journal, published by Oxford University Press, suggests that migrating extremists can shape political developments in their destination regions for generations. Regions in Austria that witnessed an influx of Nazis fleeing the Soviets after World War II are significantly more right-leaning than other parts of the country. There were no such regional differences in far-right values before World War II.

There is a long history of ideological radicals moving abroad to spread their political views: from the anarchist Mikhail Bakunin and the revolutionary Che Guevara to jihadist fighters returning to their home countries from the Islamic State. Governments fear that these immigrants bring political turmoil and often react with travel bans or harsh surveillance. Beyond anecdotal evidence, however, researchers have not yet identified the effects of migrating extremists on the spread of actual political beliefs.

The researchers use the Allied occupation of Austria after World War II as a natural experiment. In the summer of 1945, occupation zones in the Austrian federal state of Upper Austria were unexpectedly reallocated between the United States and the Soviets. US-liberated regions north of the Danube River were reassigned to the Soviets, while the southern bank remained under US control. People immediately started to flee to the US zone in large numbers; those who migrated to the south bank of the Danube were primarily Nazi elites fearing Soviet punishment. The zoning along the Danube thus divided an otherwise historically, economically and culturally homogeneous region into two areas - one with a high density of Nazi elite members and one with a comparably low density.

Austria's long tradition of far-right populism allows the authors to trace the effects of migrated Nazi elites from the late 1940s until today. The results indicate a substantial and persistent increase in extreme right-wing attitudes in the destinations of migrating extremists. Even seventy years after the Nazi influx, vote shares for far-right parties are still much higher in places where Nazi elites settled.

The authors provide two main explanations for the long-term persistence of far-right values: local institutions and family ties. Nazi migrants founded and penetrated local party branches at their destinations, and those institutions multiplied their impact: the researchers found that migrating Nazis leveraged far-right votes by a factor of at least 1.3 and up to 2.5. The other explanation for persistence is intergenerational transmission. The authors collected pre-war phone book entries and show that the names of far-right politicians today still reflect the long-past migration of Nazi elites after the war. All results hold when controls for socio-economic and time-invariant geographic characteristics are included.

It appears that political preferences are transmitted from generation to generation. Even after three or four generations, the attitudes and beliefs of Nazi migrant families and communities continue to differ. Descendants of migrating extremists, together with local party institutions, continue to spread their beliefs to residents through active engagement in local politics.

"We were surprised to learn that imported extremism can survive for generations and does not fade away," said the paper's lead author Felix Roesel. "The good news is that liberal and democratic values spread in a very similar manner. This is what new research has shown. Populism is not more contagious than other political ideas."

Credit: 
Oxford University Press USA

Lipid helps heal the eye's frontline protection

image: Drs. Wendy Bollag and Mitchell Watsky

Image: 
Phil Jones, Senior Photographer, Augusta University

A species of lipid that naturally helps skin injuries heal appears to also aid repair of common corneal injuries, even when other conditions, like diabetes, make healing difficult, scientists report.

Their findings show that the lipid DOPG, or dioleoyl-phosphatidylglycerol, aids healing of scratches on the cornea that can result from trauma such as finger pokes or mascara wands.

These results were observed in human cornea cells grown in a laboratory dish as well as in healthy mice and mice whose healing ability was compromised, they report in the journal Investigative Ophthalmology & Visual Science.

They also found that corneal epithelial cells, which are stacked like a protective brick wall at the front of the eye, use the same signaling pathway that prompts production of the parent lipid, phosphatidylglycerol, or PG, in keratinocytes, the most prominent skin cell type.

The findings indicate that a topical application of DOPG, possibly even added to existing eye drop products, could one day aid healing of this important protective barrier after injury as well as after common eye procedures like cataract surgery, say co-corresponding authors Drs. Wendy B. Bollag and Mitchell Watsky.

A superficial scratch to the eye's outermost corneal epithelial cell layer, the most common injury, typically heals in a few days without lasting implications, says Watsky, vision scientist, dean of The Graduate School at Augusta University and professor in the Medical College of Georgia's Department of Cellular Biology and Anatomy.

But when it doesn't heal, or if there is deeper tissue injury, there can be persistent irritation and pain, scarring and potentially vision loss. "Corneal epithelial scratches open the eye to the outside world," says Watsky, as infectious agents can then creep through this outer layer of protection, which is also essential to refracting light so we can see.

Watsky uses the analogy of trying to tolerate and see through a contact lens that has been roughed-up with sandpaper. "It's got to be perfectly smooth like a lens," he says. Injured epithelial cells can eventually die off, particularly in the face of an autoimmune disease where the immune system is misdirected, or in a condition like diabetes, which results in a constant state of inflammation and can itself impact vision.

Watsky and Bollag, cell physiologist in the MCG Department of Physiology and a leader in basic science studies of normal and abnormal skin cell turnover like psoriasis, have also just received a $1.14 million grant (R01EY030576) from the National Institutes of Health to figure out more about how DOPG helps the cornea heal and what treatment protocols could one day help patients.

The scientists say that their next three years of new studies, enabled by the grant, should pave the way for clinical trials.

Bollag has led studies that show naturally occurring PG has an important role in regulating keratinocytes as well as in suppressing inflammation in the skin and has evidence it can help restore healthy skin cell turnover in psoriasis.

That evidence and some common traits between skin cells and corneal epithelial cells got Bollag and her colleagues wondering if maybe PG could work similar magic with the outer cellular layers of the cornea, which function much like skin for the eyeball.

They found that a target of PG -- and now, they've shown, of DOPG -- is toll-like receptors, a family of receptors that work as part of the body's frontline, but rather nonspecific, immune response to a perceived invader or to what they perceive as elements of damage. Although such watchfulness sounds like a good thing, it's also about balance and perception, the scientists say.

Bollag reported in 2018 in the Journal of Investigative Dermatology that PG inhibits toll-like receptor activation by antimicrobial peptides, which skin cells produce for protection but which are produced in excess in psoriasis. At such high volume, the body views these antimicrobials as indicators of damage, called DAMPs, or damage-associated molecular patterns, which are known to activate toll-like receptors. The result helps perpetuate the vicious cycle of red, flaky, raised skin lesions that are a psoriasis hallmark and which PG can interrupt.

The many questions they are looking to answer now with the new grant include more about how DOPG works in the cornea.

Back in the skin, Bollag had seen that DOPG is particularly adept at increasing the proliferation of skin cells that have stopped growing as soon as they touch one another, instead of continuing to grow to form the impenetrable barrier of healthy skin.

Since cornea healing also requires enhanced cell growth, they decided to try this PG species in corneal cells. They found DOPG stimulated the healing of a scratched corneal cell layer by about 40% while a mixture of PGs derived from eggs actually inhibited healing about 30%. In mice with an impaired ability to heal, injuries were about 50% smaller 28 hours after DOPG treatment and healing was significantly enhanced in healthy mice as well.

The scientists found higher DOPG doses actually resulted in lower wound healing and now want to find the optimal dose, something the new grant will enable them to pursue, along with optimal timing for dosing.

The little-studied DOPG's apparent effect on toll-like receptors is a relatively new finding, but the scientists suspect, and will further explore, that at least one way DOPG inhibits toll-like receptor activation in the cornea is through their co-receptor CD14, which is good at detecting bacteria by sniffing out large molecules on the bacteria's exterior.

While there are definite differences, the skin and cornea both contain a lot of similar cell types and express a lot of similar proteins, notes Bollag. "They respond to a lot of the same things," she says.

How the cells are layered is one clear difference, with the cornea being transparent, notes Watsky. Like the skin for the body, the cornea provides a barrier protection to the eye, but here it needs to be clear to enable clear vision. "It's the strongest refractive part of the eye so if light does not enter there it doesn't get to the retina and you don't see," says Watsky. The cornea and inner eyelid also are both supposed to be smooth.

Today, a common course of treatment for corneal injury includes topical antibiotics to fight infection and a corticosteroid to fight inflammation. As with any medication, there are side effects, which in this case include acute -- and potentially blinding -- glaucoma from topical steroids, as well as an increased risk of infection, since inflammation, which the steroid suppresses, is itself a natural part of the body's fight against infection.

The newly published research was funded by the National Eye Institute.

The scientists note that PG already is added to several existing eye products including over-the-counter treatments for dry eyes and a prescription drug for macular degeneration.

Corneal wounds can result from seemingly innocuous things like vigorous rubbing, very dry eyes, wearing contact lenses too long or errant grains of sand or concrete dust. Tears, which should constantly bathe our eyes, have components that lubricate, fight infection and aid healing, the investigators note.

For most of us, a relatively superficial corneal scratch should heal even faster than a skin wound would.

The cornea is one of the most innervated tissues in the body -- a natural defense mechanism that becomes clear when we can't even ignore a single errant eyelash in the eye, Watsky says.

"The eyelash itself likely won't cause an injury but something as small as that, something as relatively insignificant as that, could cause an injury so we need to know how to take care of it."

Credit: 
Medical College of Georgia at Augusta University

Completely new antibiotic resistance gene has spread unnoticed to several pathogens

image: Joakim Larsson, Professor, University of Gothenburg

Image: 
Photo by Johan Wingborg

Aminoglycoside antibiotics are critically important for treating several types of infections with multi-resistant bacteria. A completely new resistance gene, which is likely to counteract the newest aminoglycoside drug, plazomicin, was recently discovered by scientists in Gothenburg, Sweden.

The bacterial gene the team discovered in river sediment from India does not resemble any known antibiotic resistance gene. But when the scientists compared its DNA sequence to already published bacterial DNA sequences, they found that it was already present in several pathogens, including Salmonella and Pseudomonas, from the USA, China and Italy. Until now, no one had realized that it was a resistance gene.

The research team has named the gene gar because it provides resistance to aminoglycoside antibiotics that carry a garosamine group. This is the case for the newest aminoglycoside drug, plazomicin, which was developed to circumvent most existing aminoglycoside resistance mechanisms.

Professor Joakim Larsson, senior author of the study and director of the Centre for Antibiotic Resistance Research at University of Gothenburg, Sweden, comments on the finding:

- It is good news that the gar gene still seems to be rather rare, but as it spreads, it will likely further complicate treatment of already multi-resistant bacteria. Pseudomonas aeruginosa, for example, is a common cause of hospital-acquired pneumonia. Being able to treat secondary bacterial lung infections is something that we are particularly worried about these days, when the world is hit by the COVID-19 pandemic.

Rather than investigating bacterial isolates from patients, the researchers looked for novel resistance genes in wastewater-impacted rivers in India, a country already struggling hard with increasing antibiotic resistance. The scientists' approach of investigating environmental samples turned out to be an effective way of discovering resistance genes that, so far, are carried by only a few people.

- Early discovery of resistance genes can help us manage their spread, facilitate gene-based diagnostics and perhaps also guide industry to develop drugs that can circumvent the resistance, says Joakim Larsson.

Around the world, companies and academic researchers are trying to develop new antibiotics, but success has been very limited. Even when they succeed, the development of resistance seems inevitable:

- Every antibiotic mankind has developed so far has eventually been met by resistance in at least some of the pathogens it was intended to treat. The gar gene is just the latest in a series of genes that, one by one, reduce the value of antibiotics, says Joakim Larsson.

The research group in Gothenburg studies the environment's role in antibiotic resistance, particularly as a source of resistance genes that can move from harmless environmental species to those that cause disease.

- The enormous diversity of bacteria in the environment around us probably already harbours resistance genes to every antibiotic we will ever develop - unless we start thinking very differently about how antibiotics are designed, says Joakim Larsson.

Credit: 
University of Gothenburg

Study: An aspirin a day does not keep dementia at bay

MINNEAPOLIS - Taking a low-dose aspirin once a day does not reduce the risk of thinking and memory problems caused by mild cognitive impairment or probable Alzheimer's disease, nor does it slow the rate of cognitive decline, according to a large study published in the March 25, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology.

Aspirin has anti-inflammatory properties and also thins the blood. For years, doctors have been prescribing low-dose aspirin for some people to reduce their risk of heart disease and stroke. However, there are also possible risks to taking aspirin, including bleeding in the brain, so guidance from a doctor is important.

Because aspirin can be beneficial to the heart, researchers have hypothesized, and smaller previous studies have suggested, that it may also be beneficial to the brain, possibly reducing the risk of dementia by reducing inflammation, minimizing small clots or by preventing the narrowing of blood vessels within the brain.

"Worldwide, an estimated 50 million people have some form of dementia, a number that is expected to grow as the population increases, so the scientific community is eager to find a low-cost treatment that may reduce a person's risk," said study author Joanne Ryan, PhD, of Monash University's School of Public Health in Melbourne, Australia. "Unfortunately, our large study found that a daily low-dose aspirin provided no benefit to study participants at either preventing dementia or slowing cognitive decline."

The study involved 19,114 people who did not have dementia or heart disease. A majority of participants were age 70 or older. They took thinking and memory tests at the start of the study as well as during follow-up visits.

Half of the participants were given a daily 100-milligram low-dose aspirin while the other half were given a daily placebo. They were followed for an average of 4.7 years, with annual in-person examinations.

Over the course of the study, 575 people developed dementia.

Researchers found no difference between those who took aspirin and those who took placebo in the risk of developing mild cognitive impairment, dementia, or probable Alzheimer's disease. There was also no difference in the rate of cognitive change over time.

"While these results are disappointing, it is possible that the length of just under five years for our study was not long enough to show possible benefits from aspirin, so we will continue to examine its potential longer-term effects by following up with study participants in the coming years," said Ryan.

A limitation of the study was that only relatively healthy people were enrolled, and such a population may benefit less from aspirin than the general population.

Credit: 
American Academy of Neurology

Ultrasound solves an important clinical problem in diagnosing arrhythmia

image: Electromechanical wave imaging (EWI) activation maps are capable of localizing the arrhythmic origin and differentiating irregular beats (right) from consecutive normal sinus rhythm beats (left) on the same patient before ablation. Red illustrates early and blue represents late activation (in milliseconds). The red arrow indicates the earliest activated region displayed by EWI, which corresponds to the source of the arrhythmia, in agreement with the intracardiac ablation site. LV = left ventricle, RV = right ventricle, ANT = anterior, POST = posterior.

Image: 
Ultrasound Elasticity Imaging Laboratory/Columbia Engineering

New York, NY--March 25, 2020--Cardiac arrhythmias are a major cause of morbidity and mortality worldwide. Currently, the 12-lead electrocardiogram (ECG) is the noninvasive clinical gold standard used to diagnose and localize these conditions, but it has limited accuracy, cannot visually localize the anatomical source of the arrhythmia, and its interpretation can vary from clinician to clinician.

Researchers at Columbia Engineering announced today that they have used an ultrasound technique they pioneered a decade ago--Electromechanical Wave Imaging (EWI)--to accurately localize atrial and ventricular cardiac arrhythmias in adult patients in a double-blinded clinical study. VIDEO: https://youtu.be/IvQQyYabhME

"This study presents a significant advancement in addressing a major unmet clinical need: the accurate arrhythmia localization in patients with a variety of heart rhythm disorders," says Natalia Trayanova, Murray B. Sachs Endowed Chair and professor of biomedical engineering and medicine Medicine, and director of the Alliance for Cardiovascular Diagnostic and Treatment innovation at Johns Hopkins University, who was not involved with the study. "The non-invasive nature of EWI using standard hospital hardware, and its ability to visualize the arrhythmia sources in 3D render it an attractive component for inclusion in the clinical ablation procedure."

EWI is a high-frame-rate ultrasound technique that can noninvasively map the electromechanical activation of the heart; it is readily available, portable, and can pinpoint the arrhythmic source by providing 3D cardiac maps. The new study, published online in Science Translational Medicine, evaluated the accuracy of EWI for localization of various arrhythmias in all four chambers of the heart prior to catheter ablation: the results showed that EWI correctly predicted 96% of arrhythmia locations as compared with 71% for 12-lead electrocardiogram (ECG).

"We knew EWI was feasible in individual patients and we wanted to see if it made a difference in the clinical setting where they treat many people with different types of arrhythmias," says Elisa Konofagou, Robert and Margaret Hariri Professor of Biomedical Engineering and Radiology (Physics) who directed the study. Her group has been working on several studies with electrophysiologists in the cardiology department at Columbia University Irving Medical Center (CUIMC) and for the purpose of this study, the Konofagou team partnered with Elaine Wan, Esther Aboodi Assistant Professor of Medicine at CUIMC and co-senior author, who saw the potential of this new technology and wanted to work together.

"So, we joined forces with cardiac electrophysiologists to determine clinical utility for the first time," Konofagou explains. "We were able to show that not only does our imaging method work in difficult cases of arrhythmia but that it can also predict the optimal site of radiofrequency ablation before the procedure where there is no other imaging tool currently available to do that in the clinic. Using EWI as a clinical visualization tool in conjunction with ECG and clinical workflow could improve discussions with patients about treatment options and pre-procedural planning as well as potentially reducing redundant ablation sites, prolonged procedures, and anesthesia times."

The researchers ran a double-blinded clinical study to evaluate the diagnostic accuracy of EWI for localizing and identifying the sites of atrial and ventricular arrhythmias. Fifty-five patients, who had pre-existing cardiac disease including previous catheter ablations and/or other cardiovascular co-morbidities, underwent EWI scans prior to their catheter ablation procedures to generate activation maps of their hearts. The team retrospectively compared EWI maps and 12-lead ECG assessments made by six expert electrophysiologists in a team led by Wan to the site of successful ablation found on the intracardiac electroanatomical maps obtained during invasive catheter mapping.

"The accuracy of EWI was higher than that of clinical diagnosis by electrophysiologists reading standard 12-lead ECGs," says the study's co-first author Lea Melki, a PhD student in the department of biomedical engineering working in Konofagou's team who teamed up with electrophysiology fellow and co-first author, Chris Grubb, to accomplish that task. "While the inter-observer variability of our expert electrophysiologists may have played a role, we also know that 12-lead ECGs are limited in diagnosing arrhythmias from the posterior side of the heart, while EWI allows for easier anatomical location in 3D. In fact, a big advantage of EWI is the ease with which activation maps can clearly demarcate the earliest sites of interest along with direct anatomic visualization using standard echocardiography scans that clinicians are already trained on."

The researchers are now planning a long-term clinical study, set to start later this year, that will use EWI prediction to improve ablation outcomes by increasing the accuracy of the ablation site and sparing normal tissue from ablation.

"It's really clear now that, when used in conjunction with standard 12-lead ECG, EWI can be a valuable tool for diagnosis, clinical decision making, and treatment planning of patients with arrhythmias," says Melki. "We believe our EWI technique, with minimal training, will result in higher accuracy in the site of ablation, a faster procedure, and fewer complications and repeat visits after the procedure. This is a win-win for everyone, both patients and clinicians."

About the Study

The study is titled "Noninvasive localization of cardiac arrhythmias using electromechanical wave imaging."

Authors are: Christopher S. Grubb 1, Lea Melki 2, Daniel Y. Wang 1, James Peacock 1, Jose Dizon 1, Vivek Iyer 1, Carmine Sorbera 1, Angelo Biviano 1, David A. Rubin 1, John P. Morrow 1, Deepak Saluja 1, Andrew Tieu 2, Pierre Nauleau 2, Rachel Weber 2, Salma Chaudhary 1, Irfan Khurram 1, Marc Waase 1, Hasan Garan 1, Elisa E. Konofagou 2,3, and Elaine Y. Wan 1.

1 Division of Cardiology, Department of Medicine, Vagelos College of Physicians and Surgeons, Columbia University, New York, NY 10032

2 Ultrasound Elasticity Imaging Laboratory, Department of Biomedical Engineering, Columbia University, New York, NY 10032

3 Department of Radiology, Columbia University Medical Center, New York, NY 10032

The study was supported by the National Institutes of Health (NIH R01 HL140646-01, R01 HL114358, and R01 EB006042).

The authors declare no financial or other conflicts of interest.

Credit: 
Columbia University School of Engineering and Applied Science

Stanford engineers find ankle exoskeleton aids running

image: Graduate student Delaney Miller runs on a treadmill aided by the ankle exoskeleton emulator. Fellow graduate student Guan Rong Tan controls the emulator and monitors Miller's gait and respiration.

Image: 
Farrin Abbott/Stanford News Service

Running is great exercise but not everyone feels great doing it. In hopes of boosting physical activity - and possibly creating a new mode of transportation - engineers at Stanford University are studying devices that people could strap to their legs to make running easier.

In experiments with motor-powered systems that mimic such devices - called exoskeleton emulators - the researchers investigated two different modes of running assistance: motor-powered assistance and spring-based assistance. The results, published March 25 in Science Robotics, were surprising.

The mere act of wearing an exoskeleton rig that was switched off increased the energy cost of running, making it 13 percent harder than running without the exoskeleton. However, the experiments indicated that, if appropriately powered by a motor, the exoskeleton reduced the energy cost of running, making it 15 percent easier than running without the exoskeleton and 25 percent easier than running with the exoskeleton switched off.

In contrast, the study suggested that if the exoskeleton was powered to mimic a spring there was still an increase in energy demand, making it 11 percent harder than running exoskeleton-free and only 2 percent easier than the non-powered exoskeleton.
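
The three comparisons quoted above are mutually consistent, which a quick back-of-the-envelope check makes clear. The short sketch below (Python) reproduces the 25 percent and 2 percent figures from the 13, 15 and 11 percent figures; normalizing the energy cost of unassisted running to 1.0 is our convention, not the authors'.

```python
# Back-of-the-envelope check of the reported energy-cost figures.
# The energy cost of running without any exoskeleton is normalized to 1.0.
no_exo = 1.00
unpowered = 1.13   # rig switched off: 13% harder than no exoskeleton
powered = 0.85     # motor-powered mode: 15% easier than no exoskeleton
spring = 1.11      # spring-like mode: 11% harder than no exoskeleton

# Powered mode vs. no exoskeleton: 15% easier, by construction.
print(f"powered vs no exoskeleton: {(1 - powered / no_exo) * 100:.0f}% easier")

# Powered mode vs. the switched-off rig: about 25% easier, as reported.
print(f"powered vs unpowered rig:  {(1 - powered / unpowered) * 100:.0f}% easier")

# Spring mode vs. the switched-off rig: only about 2% easier, as reported.
print(f"spring vs unpowered rig:   {(1 - spring / unpowered) * 100:.0f}% easier")
```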

"When people run, their legs behave a lot like a spring, so we were very surprised that spring-like assistance was not effective," said Steve Collins, associate professor of mechanical engineering at Stanford and senior author of the paper. "We all have an intuition about how we run or walk but even leading scientists are still discovering how the human body allows us to move efficiently. That's why experiments like these are so important."

If future designs could reduce the energy cost of wearing the exoskeleton, runners may get a small benefit from spring-like assistance at the ankle, which is expected to be cheaper than motor-powered alternatives.

Powering your step

The frame of the ankle exoskeleton emulator straps around the user's shin. It attaches to the shoe with a rope looped under the heel and a carbon fiber bar inserted into the sole, near the toe. Motors situated behind the treadmill (but not on the exoskeleton itself) produce the two modes of assistance - even though a spring-based exoskeleton would not actually use motors in the final product.

As the name implies, the spring-like mode mimics the influence of a spring running parallel to the calf, storing energy during the beginning of the step and unloading that energy as the toes push off. In powered mode, the motors tug a cable that runs through the back of the exoskeleton from the heel to the calf. With action similar to a bicycle brake cable, it pulls upward during toe-off to help extend the ankle at the end of a running step.

"Powered assistance took off a lot of the energy burden of the calf muscles. It was very springy and very bouncy compared to normal running," said Delaney Miller, a graduate student at Stanford who is working on these exoskeletons and also helping test the devices. "Speaking from experience, that feels really good. When the device is providing that assistance, you feel like you could run forever."

Eleven experienced runners tested the two assistance types while running on a treadmill. They also completed tests where they wore the hardware without any of the assistance mechanisms turned on.

Each runner had to become accustomed to the exoskeleton emulator prior to testing - and its operation was customized to accommodate their gait cycle and phases. During the actual tests, the researchers measured the runners' energetic output through a mask that tracked how much oxygen they were breathing in and how much carbon dioxide they were breathing out. Tests for each type of assistance lasted six minutes and the researchers based their findings on the last three minutes of each exercise.

The energy savings the researchers observed indicate that a runner using the powered exoskeleton could boost their speed by as much as 10 percent. That figure could be even higher if runners have additional time for training and optimization. Given the considerable gains involved, the researchers think it should be possible to turn the powered exoskeleton into an effective untethered device.

The Future

By providing physical support, confidence and possibly increased speed, the researchers think this kind of technology could help people in various ways.

"You can almost think of it as a mode of transportation," said Guan Rong Tan, a graduate student in mechanical engineering who, like Miller, is continuing this research. "You could get off a bus, slap on an exoskeleton, and cover the last one-to-two miles to work in five minutes without breaking a sweat."

"These are the largest improvements in energy economy that we've seen with any device used to assist running," said Collins. "So, you're probably not going to be able to use this for a qualifying time in a race, but it may allow you to keep up with your friends who run a bit faster than you. For example, my younger brother ran the Boston Marathon and I would love to be able to keep pace with him."

Credit: 
Stanford University

Culturally adapted materials boost Latino participation in diabetes education programs

CORVALLIS, Ore. -- An Oregon State University study published last week found that diabetes education programs that are linguistically and culturally tailored to Latinos lead to significantly higher rates of completion among Latino participants -- even higher than rates among non-Latinos enrolled in the English versions of those programs.

Cultural adaptation means that a program is not simply a word-for-word translation of an English-language version. For example, the Programa de Manejo Personal de la Diabetes (PMPD) was originally developed in Spanish, using idioms and examples that are familiar and applicable to Latinos specifically.

"Linguistic adaptation is important, obviously, when we're trying to reach people who speak languages other than English. But equally important is that it's culturally adapted," said lead author Carolyn Mendez-Luck, a researcher in OSU's College of Public Health and Human Sciences. "Those two go hand-in-hand."

Latinos in the U.S. are twice as likely as non-Latino whites to develop Type 2 diabetes, with over half of Latinos expected to develop Type 2 diabetes by age 70.

Latinos also tend to experience more complications from uncontrolled diabetes, including kidney disease, vision problems and heart disease. Such complications lead to high health care costs and significant disability.

Diabetes self-management education has been shown to improve healthy eating, and has been linked to lower medical costs and reduced ER visits. But few prior studies have focused on Latino participation in such programs.

Mendez-Luck and OSU co-authors Diana Govier, Jeff Luck, Esmeralda Julyan and Shyama Mahakalanda used data from the National Council on Aging to measure participation rates among Latinos and non-Latino whites in two programs: the PMPD and its equivalent targeting non-Latinos, the Diabetes Self-Management Program (DSMP). Angelica Herrera-Venson of the National Council on Aging was also a co-author.

The sample, drawn from the council's Chronic Disease Self-Management Education Database, included 8,321 Latinos and 23,537 non-Latino whites who participated in either program between January 2010 and March 2019.

The researchers found that, compared to non-Latino whites, Latinos enrolled in either the PMPD or DSMP program had a higher probability of completing at least four sessions of the six-session programs. Among Latinos, those enrolled in the PMPD Spanish-language program had the highest probability of completing all six sessions.

A potential explanation for higher rates, the study says, is that these kinds of programs are "sensitive to cultural values and beliefs related to diabetes, thus making them more relevant and interesting to Latino participants."

For example, in talking about food, a Latino-tailored diabetes program would emphasize the need to limit intake of rice and tortillas, rather than white bread and potatoes, as might be the case in a non-Latino program. Or if talking about exercise, a program based in a desert community would not be likely to recommend kayaking as an option.

In addition to language translation, linguistic adaptations may also use easy-to-understand terminology, the study says, which helps make programs more accessible to participants with lower educational levels. Such materials may also help boost Latinos' overall health literacy, which can improve health outcomes and increase motivation for self-care.

A unique factor in providing diabetes education to Latinos, Mendez-Luck said, is having to combat the cultural notion of "susto," the belief held by some Latinos that a major scare or trauma in someone's life is what initially causes them to get diabetes.

There are also challenges in bridging the gap between what some Latino elders believe about the disease and its treatment, and what their caregivers do to help them.

The National Council on Aging is continually working with community-based organizations to identify and disseminate culturally adapted versions of health education programs. Going forward, the researchers said, further study will be needed to determine whether those tailored approaches lead to similar participation rates among other racial and ethnic groups.

Credit: 
Oregon State University

A critical enzyme for sperm formation could be a target for treating male infertility

image: The activity of the Skp1 protein is crucial for sperm formation, Penn Vet scientists found. In a dividing sperm precursor cell, chromosomes (in purple) normally align in the middle, as shown on the left. But in cells lacking Skp1, as shown on the right, chromosomes fail to align and are instead distributed chaotically around the cell. 

Image: 
Courtesy of the Wang laboratory

While some of our body's cells divide in a matter of hours, the process of making sperm, meiosis, alone takes about 14 days from start to finish. And fully six of those days are spent in the stage known as the pachytene, when pairs of chromosomes from an individual's mother and father align and connect.

"This stage is really important, because the pair needs to be aligned for the exchange of genetic material between those two chromosomes," says P. Jeremy Wang, a biologist in Penn's School of Veterinary Medicine. "If anything goes wrong at this stage, it can cause a defect in meiosis and problems in the resulting sperm, leading to infertility, pregnancy loss, or birth defects."

In a new paper in Science Advances, Wang and colleagues have identified an enzyme that plays a crucial role in maintaining this chromosomal pairing during the pachytene stage of meiosis. Without this protein, named SKP1, meiosis cannot proceed to metaphase, the next major developmental stage involved in generating sperm cells.

The finding may help overcome hurdles that have stood in the way of treating certain forms of male infertility, in which a man makes no sperm but in whom sperm's precursor cells, spermatogonia, can be found.

"Reproductive technologies like in vitro fertilization have made a huge difference for infertile patients, but the male needs to have at least some sperm," says Wang. "If the male has no sperm, then the only option is to use donor sperm. But if you can find these spermatogonia, the pre-meiotic germ cells, they could be induced to go through meiosis and make sperm. So SKP1 could be part of the solution to ensuring meiosis continues."

Wang is also hopeful that his finding could aid in basic research on sperm development that his and many other labs pursue.

"Right now we use animals to do our research; we don't have a cell culture system to produce sperm," he says. "Manipulating SKP1 and the pathway in which it acts could allow us to set up an in vitro system to produce sperm artificially, which would be a boon for our studies."

The publication represents nearly a decade of work, led by Wang's postdoctoral researcher Yongjuan Guan, with major contributions from former postdoc Mengcheng Luo.

The team began focusing on SKP1 after conducting a screening test to look for proteins found in the area where the paired chromosomes come together during the pachytene stage of meiosis. From earlier studies, the researchers knew that SKP1 also plays a role in cell division in cells throughout the body, not just sperm and eggs. Without it, cells die.

That fact forced the Penn Vet team to get creative to understand the protein's function. Unable to simply eliminate it, they created a model system in mice in which they could turn off the protein only in the germ cells and only in adulthood.

"Taking this inducible, germ-cell-specific model, we found that taking away SKP1 caused the chromosomes to prematurely separate," says Wang.

While the normal alignment process in the pachytene stage takes six days in mice, in the cells that lost SKP1 the paired chromosomes separated far earlier.

Scientists had hypothesized the existence of a metaphase competence factor, or some protein required for a cell to enter metaphase. Wang believes that SKP1 is it.

While introducing a compound known as okadaic acid to sperm precursor cells can coax them into an early entrance to metaphase, cells lacking SKP1 did not progress to metaphase.

Experiments in developing eggs showed the researchers that SKP1 is also required for females to maintain viable eggs. Oocytes, the cells that develop through meiosis to form mature eggs, developed misaligned chromosomes when they lacked SKP1, and many eventually were lost.

In future work, Wang and his colleagues want to dig deeper into the mechanism of action by which SKP1 works to ensure cells can progress to metaphase, with the idea of eventually manipulating it to find strategies for addressing infertility and innovative laboratory techniques.

"Now that we know SKP1 is required, we're looking for the proteins it interacts with upstream and downstream so we can study this pathway," says Wang.

Credit: 
University of Pennsylvania

Mayo Clinic outlines approach for patients at risk of drug-induced sudden cardiac death in COVID-19

ROCHESTER, Minn. - SARS-CoV-2, the virus that causes COVID-19, continues to spread, leading to more than 20,000 deaths worldwide in less than four months. Efforts are progressing to develop a COVID-19 vaccine, but it's still likely 12 to 18 months away.

In the meantime, the pandemic, with over 400,000 confirmed cases worldwide already, is driving researchers to find safe and effective therapies for patients with COVID-19, and an antimalarial drug is potentially on the front lines of that effort. While new and repurposed drugs are being tested in clinical trials, some of these promising drugs are simultaneously being used off-label for compassionate use to treat patients.

Some of the medications being used to treat COVID-19 are known to cause drug-induced prolongation of the QTc in some people. The QTc is an indicator of the health of the heart's electrical recharging system. Patients with a dangerously prolonged QTc are at increased risk for potentially life-threatening ventricular rhythm abnormalities that can culminate in sudden cardiac death.

"Correctly identifying which patients are most susceptible to this unwanted, tragic side effect and knowing how to safely use these medications is important in neutralizing this threat," says Michael J. Ackerman, M.D., Ph.D., a Mayo Clinic genetic cardiologist. Dr. Ackerman is director of the Mayo Clinic Windland Smith Rice Comprehensive Sudden Cardiac Death Program.

A study published in Mayo Clinic Proceedings details more information about potential dangers and the application of QTc monitoring to guide treatment when using drugs that can cause heart rhythm changes. Dr. Ackerman is the senior author of the study.

Hydroxychloroquine is a long-standing preventive and treatment drug for malaria. It also is used to manage and minimize symptoms of inflammatory immune diseases, such as lupus and rheumatoid arthritis. In laboratory tests, hydroxychloroquine can prevent the SARS-CoV and SARS-CoV-2 viruses from attaching to and entering cells. If these antiviral abilities work the same way in animals and humans, the drug could be used to treat patients and limit the number of COVID-19 deaths.

On a cellular level, potential QT-prolonging medications, like hydroxychloroquine, block one of the critical potassium channels that control the heart's electrical recharging system. This interference increases the possibility that the heart's rhythm could degenerate into dangerous erratic heartbeats, ultimately resulting in sudden cardiac death.

Accordingly, Mayo Clinic cardiologists and physician-scientists have provided urgent guidance on how to use a 12-lead ECG, telemetry or smartphone-enabled mobile ECG to determine the patient's QTc as a vital sign to identify those patients at increased risk and how to ultimately minimize the chance of drug-induced sudden cardiac death.

"Right now, it is the Wild West out there, ranging from doing no QTc surveillance whatsoever and just accepting this potential tragic side effect as part of 'friendly fire,' to having ECG technicians going into the room of a patient with COVID-19 daily, exposing them to coronavirus and consuming personal protective equipment," says Dr. Ackerman. "Here Mayo Clinic has stepped forward to provide timely and critical guidance."

Guidelines for QTc monitoring during treatment

The antimalarial drugs chloroquine and hydroxychloroquine, as well as the HIV drugs lopinavir and ritonavir, all carry a known or possible risk of drug-induced ventricular arrhythmias and sudden cardiac death. Prior to starting treatment with these medications, it is important to get a baseline ECG to be able to measure changes. This starting-point measurement could come from a standard 12-lead ECG, telemetry or a smartphone-enabled mobile ECG device. On March 20, 2020, the Food and Drug Administration (FDA) granted emergency approval of AliveCor's Kardia 6L mobile ECG device as the only FDA-approved mobile device for QTc monitoring in patients with COVID-19.

The mobile device's ability to remotely provide the patient's heart rhythm and QTc value means no extra ECG technician is needed to take the measurement in person, avoiding additional exposure to COVID-19 and reducing the need for personal protective equipment.

Using the algorithm developed by Dr. Ackerman and colleagues, the potential risk of drug-induced arrhythmias can be rated and used to modify treatment accordingly. For example, patients with a baseline QTc value greater than or equal to 500 milliseconds, and those whose QTc increases by 60 milliseconds or more from baseline after starting treatment with one or more QTc-prolonging drugs, are at greatest risk for drug-induced arrhythmias. Simple QTc countermeasures can be implemented for patients with a cautionary "red light" QTc status if the decision is made to proceed with the intended COVID-19 therapies.
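
As a rough illustration of the thresholds described above, here is a minimal sketch of such a triage rule in Python. The 500-millisecond and 60-millisecond cutoffs come from the paragraph above; the simple two-level red/green output, the function name, and the example values are our simplification, and the published Mayo algorithm contains more gradations than shown here.

```python
def qtc_risk_status(baseline_qtc_ms: float, on_drug_qtc_ms: float) -> str:
    """Simplified sketch of a QTc triage rule, not the full Mayo algorithm.

    Patients with a baseline QTc >= 500 ms, or an on-treatment increase
    (delta QTc) >= 60 ms from baseline, are at greatest risk of
    drug-induced arrhythmia and get a cautionary "red light"; everyone
    else gets a "green light go". Illustration only.
    """
    delta = on_drug_qtc_ms - baseline_qtc_ms
    if baseline_qtc_ms >= 500 or delta >= 60:
        return "red light: high arrhythmia risk, apply QTc countermeasures"
    return "green light go: low risk, proceed with QTc monitoring"

# Hypothetical examples: a baseline of 470 ms rising to 540 ms on
# treatment (delta = 70 ms) trips the red light; 430 -> 460 ms does not.
print(qtc_risk_status(470, 540))
print(qtc_risk_status(430, 460))
```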

Information guides decisions

There are a number of considerations around the use of off-label drugs to treat COVID-19. The drugs may or may not be available in large enough supply to treat a worldwide pandemic, even at the current compassionate use stage of testing.

It will take careful consideration of COVID-19 patients' circumstances for treating clinicians and patients to decide on the use of drugs or drug combinations that may treat their infection, but which potentially could cause harmful drug-induced side effects.

Dr. Ackerman says that patients under 40 with mild symptoms and a QTc greater than or equal to 500 milliseconds may choose to avoid treatment altogether, as the arrhythmia risk may far outweigh the risk of developing COVID-19-related acute respiratory distress syndrome. However, in COVID-19 patients with a QTc greater than or equal to 500 milliseconds who have progressively worsening respiratory symptoms or are at greater risk of respiratory complications due to advanced age, immunosuppression or having another high-risk condition, the potential benefit of QTc-prolonging medicines may exceed the arrhythmia risk.

"Importantly, the vast majority of patients ? about 90% ? are going to be QTc cleared with a 'green light go' and can proceed, being at extremely low risk for this side effect," says Dr. Ackerman.

Ultimately, the weighing of risks to benefits depends on whether hydroxychloroquine, with or without azithromycin, is truly an effective treatment against COVID-19.

"If it is, we hope that this simple QTc surveillance strategy, enabled by innovation and the FDA's emergency approval, will help prevent or at least significantly minimize drug-induced ventricular arrhythmias and sudden cardiac death, particularly if the treatment is widely adopted and used to treat COVID-19," says Dr. Ackerman.

Credit: 
Mayo Clinic

New dataset reveals trends in social scientists' congressional testimony

image: Testimony by discipline at Congress: 1946-2016.

Image: 
Maher et al, 2020

From 1946 to 2016, testimony from economists accounted for more than two thirds of all instances of U.S. congressional testimony delivered by social scientists. Thomas Maher of Purdue University, Indiana, and colleagues present these findings in the open-access journal PLOS ONE on March 25, 2020.

The U.S. Congress regularly invites stakeholders and experts to speak before lawmakers at congressional hearings as a central component of the legislation process. Social scientists are among those who may be invited to testify (testimonies by social scientists represent about 2% of all congressional testimonies from 1946-2016). However, the impact of their testimony is difficult to ascertain, in large part due to a lack of quantitative data on their appearances before Congress.

To address this data gap, Maher and colleagues analyzed the congressional record and compiled a new, publicly available dataset on testimony from social scientists between 1946 and 2016. They categorized social scientists into five major disciplines: economists, political scientists, sociologists, psychologists, and anthropologists.

The new dataset revealed 15,506 instances of testimony from social scientists, 10,834 of which were from economists. Testimony from economists occurred more than four times as often as testimony from political scientists, and more than 10 times as often as testimony from sociologists. Anthropologists had the lowest rate of testimony.
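
For readers who want to work with the publicly available dataset, an aggregation along the following lines reproduces the headline counts. This is a sketch only: the CSV file name and the discipline column are hypothetical stand-ins for however the released data is actually organized.

```python
import pandas as pd

# Hypothetical export of the testimony dataset: one row per instance of
# testimony, with a "discipline" column naming one of the five fields.
df = pd.read_csv("congressional_testimony_1946_2016.csv")

# Count instances of testimony per discipline and their share of the total.
counts = df["discipline"].value_counts()
shares = counts / counts.sum() * 100

print(counts)              # e.g. economists: 10,834 of 15,506 instances
print(shares.round(1))     # economists' share: roughly 69.9%
```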

The researchers also examined the organizations represented by social scientists who delivered congressional testimony. Over the study period, they found an increase in the proportion of testimony from social scientists--especially political scientists--who represent think tanks, as opposed to academic institutions or other governmental or non-governmental organizations. Additionally, economists are most present at congressional hearings, while anthropologists and sociologists have a declining presence.

The new dataset could help inform research into the impact of social scientists' testimony on legislation, and why their testimony is increasingly associated with think tanks.

The authors add: "Economists are invited to testify before Congress significantly more often than any other social scientist, and their dominance has held even as the sources of expertise have diversified with the growth of think tanks and industry positions."

Credit: 
PLOS

SLAC researcher discovers giant cavity in key tuberculosis molecule

video: Cornelius Gati and other researchers were studying a protein thought to be important for the progression of tuberculosis when Gati made a strange discovery, unlike anything scientists have seen before: a giant cavity that could transport a wide range of molecules in and out of tuberculosis bacteria.

Image: 
Olivier Bonin/SLAC National Accelerator Laboratory

Menlo Park, Calif. -- Researchers from the Department of Energy's SLAC National Accelerator Laboratory have discovered a strange new feature of a protein that's thought to be important in the development of tuberculosis: The protein contains a "huge" interior pocket, the likes of which have never before been seen, that appears capable of passing a wide range of other molecules into the bacterial cell.

Cornelius Gati, a structural biologist at SLAC, discovered the pocket while investigating the role this "transporter protein" on the surface of tuberculosis bacteria plays in sucking up vitamin B12 from surrounding cells. As far as anyone knew, transporter proteins that import molecules into cells tend to be quite specialized, with nooks and crannies tailored to grab onto particular molecules and move them into cells. This one, Gati found, was a generalist that could in principle bring in small nutrients, larger molecules like vitamin B12 or even some antibiotics.

In theory, the new findings could lead to new ways to treat tuberculosis, but for the moment Gati and colleagues are simply trying to get a better handle on what the protein can and cannot transport - as well as what purpose such an odd protein might serve.

"We've never seen anything like this before," Gati said. "It doesn't really make sense."

The research, which Gati performed in collaboration with researchers at the University of Groningen, Stockholm University, and the Moscow Institute of Physics and Technology, was published March 25 in the journal Nature.

A still-deadly disease

Although tuberculosis is largely a thing of the past in the United States, it remains a serious public health threat in other parts of the world. There were 10 million new cases in 2018, and 1.5 million people died from tuberculosis that year alone, according to the World Health Organization. Worldwide, it remains one of the top 10 leading causes of death, the leading cause of death from infectious disease and the leading cause of death for people with HIV.

Yet Mycobacterium tuberculosis, the bacterium that causes tuberculosis, remains relatively poorly understood, as does the process of turning a tuberculosis infection into active disease. In the United States, for example, around 13 million people are infected with the bacteria, but only about one in 10 will ever actually develop the disease, and no one is quite sure why.

One clue to understanding the disease concerns the tuberculosis bacterium's uptake of vitamin B12, a step that seems to be crucial for the bacteria's survival and for the shift from TB infection to disease. How the bacterium imports the vitamin, however, was a mystery. Researchers could find no transporter protein in the bacterium's outer membrane dedicated specifically to vitamin B12. The one Gati and team studied had been linked via genetic studies to B12 uptake, but it was known for shuttling an entirely different class of molecules, including the antimicrobial bleomycin. Still, Gati and team knew that the protein and its connection to B12 were essential. "Without this transporter, tuberculosis bacteria cannot survive," Gati said.

A cryogenic magnifying glass

To get a handle on the transporter protein's structure, Gati turned to cryo-electron microscopy. Known as cryo-EM for short, the technique involves freezing molecules in place so that they can be studied in more or less their natural state under an electron microscope. Although the technique was first developed in the 1970s, a series of advances in the last few decades have made it more and more practical to use the technique to study biological molecules.

Still, when Gati took images of the transporter protein and analyzed the data, he was not entirely prepared for what it was about to show him. Rather than uncovering a hidden nook tailored to vitamin B12, cryo-EM revealed a cavity within the transporter roughly 8 cubic nanometers in size - a tiny volume by our everyday standards, but absolutely enormous in the context of transporter proteins. The pocket could easily fit a number of water molecules, vitamin B12 and perhaps many other molecules.

That generalist nature is particularly exciting, said Laura Dassama, a chemist at Stanford University and Stanford ChEM-H. "We have seen transporters that move a variety of drugs and molecules out of a cell, with little specificity, but not importers. If this is really an importer that can recognize and import multiple unrelated molecules, that would be fantastic" and might suggest a way to move antibiotics into the tuberculosis cell.

The million-dollar question

Although the most tantalizing possibility is that the transporter protein discovery could lead to new treatments for the disease, Gati said the team still doesn't know exactly what their molecule can and cannot transport. While they have a sense of what could fit inside the cavity, they do not yet know what can actually get in and out: so far, the team has only been able to observe the cavity in its closed state. To answer that question, they need to catch the cavity with its doors open.

Even then, the team will not know what the molecule actually does transport in practice. Future structural studies and biochemical screens, Gati said, could help answer those questions, although they will not be easy: Tuberculosis bacteria tend to grow and reproduce very slowly, which in turn hampers the methods scientists would normally use to study transporter molecules.

But even if Gati and his colleagues figure out exactly what their molecule is doing, there remain deeper questions: Why did nature cook up this molecule and its enormous interior cavity, why are such molecules so rare, and what purpose do they serve? On one hand, a cavity like the one the team has discovered is an "Achilles heel," particularly if it can help transport tuberculosis-killing antibiotics. On the other hand, it remains possible there is some evolutionary advantage to the structure.

"That is the million-dollar question," Gati said.

Credit: 
DOE/SLAC National Accelerator Laboratory

Too much salt weakens the immune system

image: (from left) Dr. Katarzyna Jobin, Natascha Ellen Stumpf, Melanie Eichler, Prof. Dr. Christian Kurts, Olena Babyak and Mirjam Meissner.

Image: 
(c) Photo: Max Germer

A high-salt diet is not only bad for one's blood pressure, but also for the immune system. This is the conclusion of a new study led by the University Hospital Bonn. Mice fed a high-salt diet were found to suffer from much more severe bacterial infections. Human volunteers who consumed an additional six grams of salt per day also showed pronounced immune deficiencies. This amount corresponds to the salt content of two fast food meals. The results are published in the journal Science Translational Medicine.

Five grams a day, no more: This is the maximum amount of salt that adults should consume according to the recommendations of the World Health Organization (WHO). It corresponds approximately to one level teaspoon. In reality, however, many Germans exceed this limit considerably: Figures from the Robert Koch Institute suggest that, on average, men consume ten grams a day and women more than eight.
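
The gap between recommendation and reality is easy to quantify. The following minimal sketch uses only the figures quoted in this article; the six-gram supplement from the Bonn study is included for comparison.

    # How far the typical intakes quoted above exceed the WHO limit (figures from the text).
    WHO_LIMIT_G = 5.0   # WHO recommended maximum, grams of salt per day
    intake_g = {"men (German average)": 10.0, "women (German average)": 8.0}

    for group, grams in intake_g.items():
        excess = grams - WHO_LIMIT_G
        print(f"{group}: {excess:.0f} g/day over the limit, "
              f"~{excess * 365 / 1000:.1f} kg/year excess")

    # The study added 6 g/day on top of normal intake -- itself more than
    # the entire WHO daily allowance.
    STUDY_EXTRA_G = 6.0
    print(f"Study supplement: {STUDY_EXTRA_G:.0f} g/day, "
          f"{STUDY_EXTRA_G / WHO_LIMIT_G:.1f}x the whole WHO daily limit")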

This means that we reach for the salt shaker much more often than is good for us. After all, salt - sodium chloride, to use its chemical name - raises blood pressure and thereby increases the risk of heart attack or stroke. But not only that: "We have now been able to prove for the first time that excessive salt intake also significantly weakens an important arm of the immune system," explains Prof. Dr. Christian Kurts from the Institute of Experimental Immunology at the University of Bonn.

This finding is unexpected, as some studies point in the opposite direction. For example, infections with certain skin parasites in laboratory animals heal significantly faster if the animals consume a high-salt diet: The macrophages - immune cells that attack, eat and digest parasites - are particularly active in the presence of salt. Several physicians concluded from this observation that sodium chloride has a generally immune-enhancing effect.

The skin serves as a salt reservoir

"Our results show that this generalization is not accurate," emphasizes Katarzyna Jobin, lead author of the study, who has since transferred to the University of Würzburg. There are two reasons for this: Firstly, the body keeps the salt concentration in the blood and in the various organs largely constant. Otherwise important biological processes would be impaired. The only major exception is the skin: It functions as a salt reservoir of the body. This is why the additional intake of sodium chloride works so well for some skin diseases.

However, other parts of the body are not exposed to the additional salt consumed with food. Instead, it is filtered out by the kidneys and excreted in the urine. And this is where the second mechanism comes into play: The kidneys have a sodium chloride sensor that activates the salt excretion function. As an undesirable side effect, however, this sensor also causes glucocorticoids to accumulate in the body - and these in turn inhibit the function of granulocytes, the most common type of immune cell in the blood.

Granulocytes, like macrophages, are scavenger cells. However, they do not attack parasites, but mainly bacteria. If they fail to do this to a sufficient degree, infections take a much more severe course. "We were able to show this in mice with a listeria infection," explains Dr. Jobin. "We had previously put some of them on a high-salt diet. In the spleen and liver of these animals we counted 100 to 1,000 times the number of disease-causing pathogens." Listeria are bacteria found, for instance, in contaminated food, and can cause fever, vomiting and sepsis. Urinary tract infections also healed much more slowly in laboratory mice fed a high-salt diet.

Sodium chloride also appears to have a negative effect on the human immune system. "We examined volunteers who consumed six grams of salt in addition to their daily intake," says Prof. Kurts. "This is roughly the amount contained in two fast food meals, i.e. two burgers and two portions of French fries." After one week, the scientists took blood from their subjects and examined the granulocytes. The immune cells coped much worse with bacteria after the test subjects had started to eat a high-salt diet.

In human volunteers, the excessive salt intake also resulted in increased glucocorticoid levels. That this inhibits the immune system is not surprising: cortisone, the best-known glucocorticoid, is traditionally used to suppress inflammation. "Only through investigations in an entire organism were we able to uncover the complex control circuits that lead from salt intake to this immunodeficiency," stresses Kurts. "Our work therefore also illustrates the limitations of experiments purely with cell cultures."

Credit: 
University of Bonn

How robots can help combat COVID-19: Science Robotics editorial

Can robots be effective tools in combating the COVID-19 pandemic? A group of leaders in the field of robotics, including Henrik Christensen, director of UC San Diego's Contextual Robotics Institute, say yes, and outline a number of examples in an editorial in the March 25 issue of Science Robotics. They say robots can be used for clinical care such as telemedicine and decontamination; logistics such as delivery and handling of contaminated waste; and reconnaissance such as monitoring compliance with voluntary quarantines.

"Already, we have seen robots being deployed for disinfection, delivering medications and food, measuring vital signs, and assisting border controls," the researchers write.

Christensen, who is a professor in the Department of Computer Science and Engineering at UC San Diego, particularly highlighted the role that robots can play in disinfection, cleaning and telepresence.

Other co-authors include Marcia McNutt, president of the National Academy of Sciences and chair of the National Research Council, as well as a number of other robotics experts from international and U.S. universities.

"For disease prevention, robot-controlled noncontact ultraviolet (UV) surface disinfection has already been used because COVID-19 spreads not only from person to person via close contact respiratory droplet transfer but also via contaminated surfaces," the researchers write.

"Opportunities lie in intelligent navigation and detection of high-risk, high-touch areas, combined with other preventative measures," the researchers add. "New generations of large, small, micro-, and swarm robots that are able to continuously work and clean (i.e., not only removing dust but also truly sanitizing/sterilizing all surfaces) could be developed."

In terms of telepresence, "the deployment of social robots can present unique opportunities for continued social interactions and adherence to treatment regimes without fear of spreading more disease," researchers write. "However, this is a challenging area of development because social interactions require building and maintaining complex models of people, including their knowledge, beliefs, emotions, as well as the context and environment of interaction."

"COVID-19 may become the tipping point of how future organizations operate," researchers add. "Rather than cancelling large international exhibitions and conferences, new forms of gathering--virtual rather than in-person attendance--may increase. Virtual attendees may become accustomed to remote engagement via a variety of local robotic avatars and controls."

"Overall, the impact of COVID-19 may drive sustained research in robotics to address risks of infectious diseases," researchers go on. "Without a sustainable approach to research and evaluation, history will repeat itself, and technology robots will not be ready ready to assist for the next incident."

Credit: 
University of California - San Diego

COVID-19 should be wake-up call for robotics research

PITTSBURGH--Robots could perform some of the "dull, dirty and dangerous" jobs associated with combating the COVID-19 pandemic, but that would require many new capabilities not currently being funded or developed, an editorial in the journal Science Robotics argues.

The editorial, published today and signed by leading academic researchers including Carnegie Mellon University's Howie Choset, said robots conceivably could perform such tasks as disinfecting surfaces, taking temperatures of people in public areas or at ports of entry, providing social support for quarantined patients, collecting nasal and throat samples for testing, and enabling people to virtually attend conferences and exhibitions.

In each case, the use of robots could reduce human exposure to pathogens -- which will become increasingly important as epidemics escalate.

"The experiences with the (2015) Ebola outbreak identified a broad spectrum of use cases, but funding for multidisciplinary research, in partnership with agencies and industry, to meet these use cases remains expensive, rare and directed to other applications," the researchers noted in the editorial.

"Without a sustainable approach to research, history will repeat itself, and robots will not be ready for the next incident," they added.

In addition to Choset, a professor in CMU's Robotics Institute and one of the founding editors of Science Robotics, the authors of the editorial include Marcia McNutt, president of the National Academy of Sciences; Robin Murphy of Texas A&M University; Henrik Christensen of the University of California, San Diego; and former CMU faculty member Steven Collins, now at Stanford University.

Choset stressed that the idea behind the editorial wasn't solely to prescribe how robots might be used in a pandemic.

"Rather, we hope to inspire others in the community to conceive of solutions to what is a very complicated problem," he explained.

Choset also emphasized that, like robots, artificial intelligence could help in responding to epidemics and pandemics. Researchers at Carnegie Mellon, for instance, are working on humanitarian aid and disaster response, for which they envision a combination of AI and robotics technologies, such as drones. Human-robot interaction, automated monitoring of social media, edge computing and ad hoc computer networks are among the technologies they are developing.

Credit: 
Carnegie Mellon University