Tech

Study offers insight into biological changes among invasive species

Image: A female Anolis maynardi (Credit: Ruth Smith, University of Plymouth)

A remote island in the Caribbean could offer clues as to how invasive species are able to colonise new territories and then thrive in them, a new study suggests.

Scientists from the University of Plymouth have recently completed extensive research into a lizard population on the Cayman Islands.

Until the mid-1980s, there had been no recorded sighting of Maynard's Anole (Anolis maynardi) on Cayman Brac, despite the island lying less than 10km from the species' native territory, Little Cayman.

However, since the species was first discovered on Cayman Brac in 1987 - in what is thought to have been a human-assisted colonisation - its population has spread right across the 39km² island.

For this study, recent graduate Vaughn Bodden and Lecturer in Conservation Biology Dr Robert Puschendorf conducted a detailed analysis of the invasive species.

They wanted to assess whether individuals at the forefront of the invasion have developed distinct biological traits that are advantageous for dispersal, and compared their findings to animals in the area of first introduction and the native population on Little Cayman.

They discovered that the Cayman Brac population has diverged morphologically from the native population, and that within the invasive range there was a trend of increasing forelimb length from the core to the range edge. This ran contrary to the expected finding that longer hindlimbs would be the trait selected as a dispersal-related phenotype.

They also showed that the introduced population had lower levels of parasite prevalence, and that both males and females were in significantly better body condition than the native population.

Writing in the Journal of Zoology, they say the results are a perfect example of how a species can colonise a new territory, and the biological adaptations it can make in order to do so.

Vaughn, who graduated with a First from the BSc (Hons) Conservation Biology programme in 2018, said: "There has been a history of lizard studies indicating that longer hindlimbs are an important factor affecting movement ability, so to not find longer hind limbed animals on the range edge was a surprise. For parasites, we found a clear decreasing trend in prevalence within the invasive population from the area of first introduction to the range edge, indicating that the parasites lag behind the host during periods of range expansion. We think our findings add to the growing body of literature that demonstrates the complex dynamics of species' invasions. The results highlight that the animals on the range edge of an invasion are likely to be experiencing different ecological selection pressures that can result in changes in behaviour, morphology, and health for the animals."

Dr Puschendorf has spent several years researching the consequences of emerging infectious diseases and climate change on biodiversity, with a particular focus on Central America. He added: "Biological invasions are an important conservation threat across the world. However, every invasion needs to be carefully investigated to identify impacts to native ecosystems and identify potential mitigation strategies. In this instance there is likely to be limited overlap with, and therefore a limited threat to, the endemic anole population - the Cayman Brac Anole (Anolis luteosignifer) - because one inhabits the crowns of trees while the other is found closer to the ground. This in some ways highlights the challenges biodiversity managers face when managing species invasions with limited resources, and emphasises the need for greater collaboration among scientific and policy communities."

Credit: 
University of Plymouth

Tracking records of the oldest life forms on Earth

Image: Controversies have surrounded the rock from Greenland (lower right), and the significance of graphite (red) with apatite (turquoise) in such rocks, for the past 20+ years. Note the coarse grain size of both highly metamorphosed banded iron formations. The thin sections (bottom) are paper-thin slices of the rock. (Credit: Dominic Papineau, Ph.D.)

The discovery provides a new characteristic 'biosignature' to track the remains of ancient life preserved in rocks that have been significantly altered over billions of years, and could help identify life elsewhere in the Solar System.

The research, published in two papers - one in the Journal of the Geological Society and another in Earth and Planetary Science Letters - solves the longstanding problem of how scientists can track records of life on Earth in highly metamorphosed rocks more than 3,700 million years old, with organic material often turning into the carbon-based mineral graphite.

In the first study, published in Earth and Planetary Science Letters, the team analysed ten rock samples of banded iron formations (BIF) from Canada, India, China, Finland, USA and Greenland spanning over 2,000 million years of history.

They argue that carbon preserved in graphite-like crystals - 'graphitic carbon' - located alongside minerals such as apatite, which our teeth and bones are made of, and carbonate, is a biosignature of the oldest life forms on Earth.

"Life on Earth is all carbon-based and over time, it decomposes into different substances, such as carbonate, apatite and oil. These become trapped in layers of sedimentary rock and eventually the oil becomes graphite during subsequent metamorphism in the crust," explained Dr Dominic Papineau (UCL Earth Sciences, Center for Planetary Sciences and the London Centre for Nanotechnology).

"Our discovery is important as it is hotly debated whether the association of graphite with apatite is indicative of a biological origin of the carbon found in ancient rocks. We now have multiple strands of evidence that these mineral associations are biological in banded iron formations. This has huge implications for how we determine the origin of carbon in samples of extra-terrestrial rocks returned from elsewhere in the Solar System."

The team investigated the composition of BIF rocks as they are almost always of Precambrian age (4,600 million years old to 541 million years old) and record information about the oldest environments on Earth.

For this, they analysed the composition of rocks ranging from 1,800 million years old to more than 3,800 million years old using a range of methods involving photons, electrons, and ions to characterise the composition of graphite and other minerals of potential biogenic origin.

"Previously, it was assumed that finding apatite and graphite together in ancient rocks was a rare occurrence but this study shows that it is commonplace in BIF across a range of rock metamorphic grades," said team member Dr Matthew Dodd (UCL Earth Sciences and the London Centre for Nanotechnology).

The apatite and graphite minerals are thought to have two possible origins: mineralised products of decayed biological organic matter, which includes the breakdown of molecules in oil at high temperatures, or formation through non-biological reactions which are relevant to the chemistry of how life arose from non-living matter.

By showing evidence for the widespread occurrence of graphitic carbon in apatite and carbonate in BIF along with its carbon-isotope composition, the researchers conclude that the minerals are most consistent with a biological origin from the remains of Earth's oldest life forms.

To investigate the extent to which high-temperature metamorphism causes a loss of molecular, elemental and isotope signatures from biological matter in rocks, they analysed the same minerals from a 1,850-million-year-old BIF rock in Michigan which had been metamorphosed at 550 degrees Celsius.

In this second study, published today in the Journal of the Geological Society, the team show that several biosignatures are found in the graphitic carbon and the associated apatite, carbonate and clays.

They used a variety of high-tech instruments to detect traces of key molecules, elements, and carbon isotopes of graphite and combined this with several microscopy techniques to study tiny objects trapped in rocks which are invisible to the naked eye.

Together, all of their observations of the composition are consistent with an origin from decayed biomass, such as that of ancient animal fossils in museums, but which has been strongly altered by high temperatures.

"Our new data provide additional lines of evidence that graphite associated with apatite in BIF is most likely biological in origin. Moreover, by taking a range of observations from throughout the geological record, we resolve a long-standing controversy regarding the origin of isotopically light graphitic carbon with apatite in the oldest BIF," said Dr Papineau.

"We've shown that biosignatures exist in highly metamorphosed iron formations from Greenland and northeastern Canada which are more than 3,850 million years old and date from the beginning of the sedimentary rock record."

Credit: 
University College London

Laying the ground for robotic strategies in environmental protection

Image: The robot is designed by Wyss Institute researchers to drive interlocking sheet piles into granular soils like sand on a beach. (Credit: Wyss Institute at Harvard University)

(CAMBRIDGE, Mass.) -- Along developed riverbanks, physical barriers can help contain flooding and combat erosion. In arid regions, check dams can help retain soil after rainfall and restore damaged landscapes. In construction projects, metal plates can provide support for excavations, retaining walls on slopes, or permanent foundations. All of these applications can be addressed with the use of sheet piles, elements folded from flat material and driven vertically into the ground to form walls and stabilize soil. Proper soil stabilization is key to sustainable land management in industries such as construction, mining, and agriculture; and land degradation, the loss of ecosystem services from a given terrain, is a driver of climate change and is estimated to cost up to $10 trillion annually.

With this motivation, a team of roboticists at Harvard's Wyss Institute for Biologically Inspired Engineering has developed a robot that can autonomously drive interlocking steel sheet piles into soil. The structures that it builds could function as retaining walls or check dams for erosion control. The study will be presented at the upcoming 2019 IEEE International Conference on Robotics and Automation.

Conventional sheet pile driving processes are extremely energy intensive. Only a fraction of the weight of typical heavy machinery is used for applying downward force. The Wyss team's "Romu" robot, on the other hand, is able to leverage its own weight to drive sheet piles into the ground. This is made possible by each of its four wheels being coupled to a separate linear actuator, which also allows it to adapt to uneven terrain and ensure that piles are driven vertically. From a raised position, Romu grips a sheet pile and then lowers its chassis, pressing the pile into the soil with the help of an on-board vibratory hammer. By gripping the pile again at a higher position and repeating this process, the robot can drive a pile much taller than its own range of vertical motion. After driving a pile to sufficient depth, Romu advances and installs the next pile such that it interlocks with the previous one, thereby forming a continuous wall. Once it has used all of the piles it carries, it may return to a supply cache to restock.
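The grip-lower-regrip cycle described above can be sketched as a simple control loop. This is a hypothetical illustration only; the function names, stroke length, and pile capacity are our own assumptions, not the Wyss team's control software.

```python
# Hypothetical sketch of Romu's pile-driving cycle (our illustration, not Wyss code).

TARGET_DEPTH = 1.5      # metres each pile must be driven (assumed value)
STROKE = 0.4            # vertical chassis travel per grip cycle (assumed value)

def drive_pile(depth_remaining, stroke=STROKE):
    """Drive one pile by repeatedly gripping, lowering the chassis, and re-gripping higher."""
    cycles = 0
    while depth_remaining > 0:
        # grip the pile, lower the chassis with the vibratory hammer engaged,
        # then release and re-grip at a higher point for the next stroke
        advance = min(stroke, depth_remaining)
        depth_remaining -= advance
        cycles += 1
    return cycles

def build_wall(n_piles, carried=4):
    """Install piles so each interlocks with the previous one; restock when empty."""
    restocks = 0
    onboard = carried
    for _ in range(n_piles):
        if onboard == 0:
            restocks += 1          # return to the supply cache for more piles
            onboard = carried
        drive_pile(TARGET_DEPTH)
        onboard -= 1               # advance so the next pile interlocks with this one
    return restocks
```

The key point the loop captures is that repeated re-gripping lets the robot drive a pile far taller than its own range of vertical motion.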

The study grew out of previous work at the Wyss Institute on teams or swarms of robots for construction applications. In work inspired by mound-building termites, Core Faculty member Radhika Nagpal and Senior Research Scientist Justin Werfel designed an autonomous robotic construction crew called TERMES, whose members worked together to build complex structures from specialized bricks. Further work by Werfel and researcher Nathan Melenbrink explored strut-climbing robots capable of building cantilevering truss structures, addressing applications like bridges. However, neither of these studies addressed the challenge of anchoring structures to the ground. The Romu project began as an exploration of methods for automated site preparation and installation of foundations for the earlier systems to build on; as it developed, the team determined that such interventions could also be directly applicable to land restoration tasks in remote environments.

"In addition to tests in the lab, we demonstrated Romu operating on a nearby beach," said Melenbrink. "This kind of demonstration can be an icebreaker for a broader conversation around opportunities for automation in construction and land management. We're interested in engaging with experts in related fields who might see potential benefit for the kind of automated interventions we're developing."

The researchers envision large numbers of Romu robots working together as a collective or swarm. They demonstrated in computer simulations that teams of Romu robots could make use of environmental cues like slope steepness in order to build walls in effective locations, making efficient use of limited resources. "The swarm approach gives advantages like speedup through parallelism, robustness to the loss of individual robots, and scalability for large teams," said Werfel. "By responding in real-time to the conditions they actually encounter as they work, the robots can adapt to unexpected or changing situations, without needing to rely on a lot of supporting infrastructure for abilities like site surveying, communication, or localization."
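The idea of using an environmental cue such as slope steepness to site walls can be illustrated with a toy allocation rule. This sketch is our own, not the Wyss simulation: robots measure local slope along a terrain transect and spend their limited piles on the steepest stretches first.

```python
# Toy illustration (ours, not the Wyss simulation) of slope-cued wall placement.

def steepest_sites(heights, n_segments):
    """Return the indices of the n steepest inter-cell slopes along a transect."""
    slopes = [abs(b - a) for a, b in zip(heights, heights[1:])]
    ranked = sorted(range(len(slopes)), key=lambda i: slopes[i], reverse=True)
    return sorted(ranked[:n_segments])

# terrain transect heights (arbitrary units); limited piles force prioritisation
terrain = [0.0, 0.1, 0.3, 0.9, 1.6, 1.8, 1.9]
print(steepest_sites(terrain, 2))   # → [2, 3], the two steepest stretches
```

Because each robot only needs its local slope measurement, the rule scales to a swarm without central coordination, in line with the real-time, infrastructure-light behaviour Werfel describes.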

"The name Terramanus ferromurus (Romu) is a nod to the concept of 'machine ecology' in which autonomous systems can be introduced into natural environments as new participants, taking specific actions to complement and promote human environmental stewardship," said Melenbrink. In the future, the Terramanus "genus" could be extended by additional robots carrying out different tasks to protect or restore ecosystem services. Based on their findings, the team now is interested in investigating interventions ranging from groundwater retention structures for supporting agriculture in arid regions, to responsive flood barrier construction for hurricane preparedness. Future versions of the robot could perform other interventions such as spraying soil-binding agents or installing silt fencing, such that a family of these robots could act to stabilize soil in a wide range of situations.

In many scenarios for environmental protection or restoration, the opportunity for action is limited by the availability of human labor and by site access for heavy machinery. Smaller, more versatile construction machines could provide a solution. "Clearly, the needs of many degraded landscapes are not being met with the currently available tools and techniques," said Melenbrink. "Now, 100 years after the dawn of the heavy equipment age, we're asking whether there might be more resilient and responsive ways to approach land management and restoration."

"This sheet pile driving robot with its demonstrated ability to perform in a natural setting signals a path on which the Wyss Institute's robotics and swarm robotics capabilities can be brought to bear on both natural and man-made environments where conventional machinery, man power limitations, or cost is inadequate to prevent often disastrous consequences. This robot also could address disaster situations where walling off dangerous chemical spills or released radioactive fluids makes it difficult or impossible for humans to intervene," said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at HMS and the Vascular Biology Program at Boston Children's Hospital, as well as Professor of Bioengineering at SEAS.

Credit: 
Wyss Institute for Biologically Inspired Engineering at Harvard

Current methods may inadequately measure health impacts from oil, natural gas extraction

FINDINGS

An examination of peer-reviewed studies published over six years on hazardous air pollutants associated with the extraction of oil and natural gas finds that measurements of hazardous air pollutant concentrations near operational sites have generally failed to capture levels above standard health benchmarks. Yet the majority of studies continue to find poorer health outcomes as distance from these operations decreases.

While it is unclear why there is a gap in the evidence between environmental sampling and health-based studies, the current review provides insight into methodological shortcomings that may help explain this discrepancy. The authors state that current health benchmarks may not provide accurate risk estimates for the broad range of pollutants associated with oil and natural gas development, and fail to adequately address potential risks from long-term, chronic, lower-level exposure or from mixtures of chemicals. Further, a failure of sampling methods to properly account for degradation and dispersion of pollutants, or inappropriate sampling timeframes that miss the peak emission periods characteristic of oil and natural gas extraction, may also contribute to the current gap in the literature.

The authors call for additional investigations that incorporate appropriate sampling timeframes and proximity to oil and gas extraction, and for research on the health impacts of chronic, low-level ambient hazardous air pollutant exposures, among other priorities.

BACKGROUND

Energy demands have increased over several decades as technical innovations have led to more extraction of oil and natural gas, making the United States one of the world's leading producers of petroleum and natural gas hydrocarbons. Several hazardous air pollutants, such as benzene, toluene and ethylbenzene - listed by the Environmental Protection Agency as known or suspected carcinogens or as carrying other health effects - have been measured at elevated concentrations around oil and natural gas extraction sites.

METHOD

The researchers reviewed 37 peer-reviewed journal articles published between Jan. 1, 2012 and Feb. 28, 2018. One focused on Poland and the rest on the U.S.

IMPACT

This review will help guide future research on air quality near oil and natural gas development sites by highlighting future research priorities. It may also bring insights into possible exposures of communities near oil and natural gas development and storage sites such as Aliso Canyon in Los Angeles' Porter Ranch, where there was a major methane leak that affected the community.

Credit: 
University of California - Los Angeles Health Sciences

Food additive may influence how well flu vaccines work

EAST LANSING, Mich. - Michigan State University scientists have linked a common food preservative to an altered immune response that possibly hinders flu vaccines.

The study conducted in mice, presented at the 2019 Experimental Biology meeting in Orlando, Fla., April 7 at 9 a.m., offers up a new potential factor in vaccine effectiveness.

Tert-butylhydroquinone, or tBHQ, can be found in several food products including cooking oils, frozen meats (especially fish) and processed foods such as chips and crackers. Products don't always have to include it on ingredient lists.

"If you get a vaccine, but part of the immune system doesn't learn to recognize and fight off virus-infected cells, then this can cause the vaccine to be less effective," said Robert Freeborn, a fourth-year doctoral student who led the study with Cheryl Rockwell, an associate professor in pharmacology and toxicology. "We determined that when tBHQ was introduced through the diet, it affected certain cells that are important in carrying out an appropriate immune response to the flu."

Using various flu strains including H1N1 and H3N2, Freeborn and Rockwell focused on CD4 and CD8 T cells and incorporated tBHQ into the food of mice in an amount comparable to human consumption.

"CD4 T cells are like movie directors that tell everyone else what to do," Freeborn said. "The CD8 T cells are the actors that do what the director wants."

The researchers looked at several response factors including whether the T cells showed up, were able to do the right job and ultimately, recognize and remember the invading virus.

"Overall, we saw a reduced number of CD8 T cells in the lung and a reduction in the number of CD4 and CD8 T cells that could identify the flu virus in the mice that were exposed to tBHQ," Freeborn said. "These mice also had widespread inflammation and mucus production in their lungs."

TBHQ also slowed down the initial activation of T cells, reducing their ability to fight off an infection sooner. This allowed the virus to run rampant in the mice until the cells fully activated.

A second phase of the study showed the additive hindered the immune system's ability to remember how to respond to the flu virus, particularly when another strain was introduced at another time. This resulted in a longer recovery and additional weight loss in the mice.

"It's important for the body to be able to recognize a virus and remember how to effectively fight it off," Freeborn said. "That's the whole point of vaccines, to spur this memory and produce immunity. TBHQ seems to impair this process."

Credit: 
Michigan State University

Experimental drug shows promise for opioid withdrawal symptoms

Orlando, Fla. (April 7, 2019) - While medicines are available to relieve withdrawal symptoms in people recovering from opioid addiction, they cause side effects and can maintain the brain changes that led to addiction in the first place, which can lead to relapse before treatment is completed. New research offers hope that a better solution may be on the horizon. Rapastinel, an experimental drug originally developed as an antidepressant, substantially reversed acute signs of opioid withdrawal in rats in just three days.

The findings suggest rapastinel could be useful to help manage withdrawal during the critical first days after someone has entered treatment and is trying to abstain from opioid use, according to researchers.

"We have found that rapastinel has potential as a new treatment for opioid dependence, as it is effective in reducing withdrawal signs and has not been shown to produce any negative side effects," said Julia Ferrante, an undergraduate at Villanova University who conducted the research with Cynthia M. Kuhn, PhD, professor of pharmacology and cancer biology at Duke University. "By reducing withdrawal symptoms, the patient feels less discomfort during treatment, and we hypothesize this would lead to a decreased risk of relapse."

Ferrante will present the research at the American Society for Pharmacology and Experimental Therapeutics annual meeting during the 2019 Experimental Biology meeting, held April 6-9 in Orlando, Fla.

"Our research suggests that new alternatives to standard treatments for opioid dependence have potential to be safer and more effective," Ferrante added. "Rapastinel research for opioid dependency is currently only being done in rodents, but if the drug continues to have successful trials, it may enter clinical trials for use in humans."

Buprenorphine and methadone, the most common drugs used to help people quit opioid abuse, are problematic because they are themselves opioids and can be addictive, have unpleasant and sometimes dangerous side effects and often must be used for months to avoid relapse. Ketamine, which has been proposed as an alternative, non-opioid treatment for opioid withdrawal, also has the potential for abuse and can cause hallucinations and other negative side effects.

Rapastinel, developed as an antidepressant, binds to the same receptor as ketamine but at a different site, where it has a milder effect. While a clinical trial recently concluded rapastinel is not effective against depression, trials have shown it is well tolerated and has no serious side effects.

In the new study, Ferrante and Kuhn modeled opioid dependence in rats and then tracked signs of withdrawal in groups of rats given either rapastinel, ketamine or a saline solution. On the third day, rats given rapastinel showed significantly fewer signs of withdrawal than rats given either ketamine or saline, both of which showed roughly equal numbers of withdrawal signs.

To move toward clinical trials in humans, researchers will continue to investigate rapastinel's effects on a molecular level and study whether the drug can reduce the likelihood of relapse. If approved for treating opioid dependence, rapastinel would likely be administered intravenously, possibly in an outpatient setting, Ferrante said. It is unknown how long patients would need to use rapastinel to ensure complete recovery from opioid dependence.

Julia Ferrante will present this research on Sunday, April 7, from 9 a.m.-4 p.m. in Exhibit Hall-West Hall B, Orange County Convention Center (abstract). Contact the media team for more information or to obtain a free press pass to attend the meeting.

Credit: 
Experimental Biology

Mystery of negative capacitance in perovskite solar cells solved

On the verge of outcompeting current thin-film solar cells, perovskite solar cells seem to embody the ideal solar cell: highly efficient and low-cost - were it not for their weak long-term stability, which remains a challenge. Related to this are peculiar phenomena occurring in perovskite materials and devices, where very slow microscopic processes can furnish them with a kind of "memory effect".

For instance, the measured efficiency of a perovskite solar cell can depend on factors like how long the device was illuminated prior to measurement or how the voltage was applied. A few years ago, this effect, known as current-voltage hysteresis, led to disputes on how to accurately determine the efficiency of perovskites. Another example of these obscure processes is a (partial) recovery of a previously degraded solar cell during day-night cycling.

Such effects are a concern when measuring the solar cells' performance as a function of frequency, which is a typical measurement for characterizing these devices in more detail (impedance spectroscopy). They lead to large signals at low frequencies (Hz to mHz) and giant capacitance values (on the order of mF/cm²), including strange, "unphysical" negative values that are still a puzzle to the research community.

Now, chemical engineers from the lab of Anders Hagfeldt at EPFL have solved the mystery. Led by Wolfgang Tress, a scientist in Hagfeldt's lab, they found that the large perovskite capacitances are not classical capacitances in the sense of charge storage, but just appear as capacitances because of the cells' slow response time.

The researchers show this by measurements in the time domain and with different voltage scan rates. They find that the origin of the apparent capacitance is a slow modification of the current passing the contact of the solar cells, which is regulated by a slow accumulation of mobile ionic charge. A slowly increasing current appears like a negative capacitance in the impedance spectra.
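The sign flip can be made plausible with a minimal small-signal sketch (our own illustration, not taken from the EPFL paper). Suppose a small voltage step $\Delta V$ produces a current that relaxes slowly upward with time constant $\tau$:

```latex
\Delta i(t) = \Delta V\left[g_0 + g_1\left(1 - e^{-t/\tau}\right)\right], \qquad g_1 > 0 .
```

The corresponding small-signal admittance is

```latex
Y(\omega) = g_0 + \frac{g_1}{1 + i\omega\tau}, \qquad
\operatorname{Im} Y(\omega) = -\frac{g_1\,\omega\tau}{1+\omega^2\tau^2},
```

so the apparent capacitance extracted from the spectrum,

```latex
C(\omega) = \frac{\operatorname{Im} Y(\omega)}{\omega}
          = -\frac{g_1\,\tau}{1+\omega^2\tau^2} \; < \; 0 ,
```

is negative, and large in magnitude at low frequencies when the process is slow (large $\tau$) - without any charge actually being stored.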

The work sheds light on the interaction between the photovoltaic effect in these devices and the ionic conductivity of perovskite materials. Gaining such in-depth understanding contributes to the endeavor to develop tailored, stable perovskite solar cells.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Screen time -- even before bed -- has little impact on teen well-being

Data from more than 17,000 teenagers show little evidence of a relationship between screen time and well-being in adolescents. The study, published in Psychological Science, a journal of the Association for Psychological Science, casts doubt on the widely accepted notion that spending time online, gaming, or watching TV, especially before bedtime, can damage young people's mental health.

"Implementing best practice statistical and methodological techniques we found little evidence for substantial negative associations between digital-screen engagement and adolescent well-being," said Amy Orben, a Researcher at the Oxford Internet Institute (OII) and College Lecturer at the Queen's College, University of Oxford.

"While psychological science can be a powerful tool for understanding the link between screen use and adolescent well-being, it still routinely fails to supply stakeholders and the public with high-quality, transparent, and objective investigations into growing concerns about digital technologies. Analyzing three different datasets, which include improved measurements of screen time, we found little clear-cut evidence that screen time decreases adolescent well-being, even if the use of digital technology occurs directly before bedtime," said Professor Andrew Przybylski, Director of Research at the OII and coauthor on the study.

The full research article is available online.

The research found that adolescents' total screen time per day had little impact on their mental health, both on weekends and weekdays. It also found that the use of digital screens 2 hours, 1 hour, or 30 minutes before bedtime didn't have clear associations with decreases in adolescent well-being, even though this is often taken as a fact by media reports and public debates.

Unlike other studies, the Oxford research analyzed data from Ireland, the US, and the UK to support its conclusions. The researchers used a rigorous methodology to measure how much time an adolescent spends on screens per day, including both self-reported measures and time-use diaries. This is important because many studies rely solely on self-reported digital technology use, even though recent work found that only one third of participants give accurate accounts of how much time they spend online when asked after the fact.

The researchers were also able to create a comprehensive picture of teens' well-being, examining measures of psychosocial functioning, depression symptoms, self-esteem, and mood, with data provided by both young people and their caregivers.

Additionally, the final of the three studies conducted was preregistered, meaning that the researchers publicly documented the analyses they would run before they analyzed the data. This prevents hypothesizing after the results are known, a challenge for controversial research topics.

"Because technologies are embedded in our social and professional lives, research concerning digital-screen use and its effects on adolescent well-being is under increasing scrutiny," said Orben. "To retain influence and trust, robust and transparent research practices will need to become the norm--not the exception. We hope our approach will set a new baseline for new research on the psychological study of technology," added Przybylski.

The insights come days ahead of the anticipated release of the UK government's new White Paper on Online Harms, which is expected to set out plans for legislation governing social media companies. This new study builds on previous work by Orben and Przybylski that used novel and transparent statistical approaches to show that technology use has a minuscule influence on adolescent well-being.

The study used data from Ireland, the US, and the UK. In Ireland, it covered 5,363 young people tracked under the Growing Up in Ireland project. In the US, the data covered 709 subjects of a variety of ages compiled by the United States Panel Study of Income Dynamics. And in the UK, the dataset included responses from 11,884 adolescents and their caregivers surveyed as part of the Millennium Cohort Study.

Credit: 
Association for Psychological Science

SwRI engineers develop novel techniques to trick object detection systems

Image: Many of today's vehicles use object detection systems to help avoid collisions. SwRI engineers developed unique patterns that can trick these systems into seeing something else, seeing the objects in another location or not seeing the objects at all. In this photo, the object detection system sees a person rather than a vehicle. This research will allow engineers to thoroughly test object detection systems and improve the security of the deep-learning algorithms they use. (Credit: Southwest Research Institute)

SAN ANTONIO -- April 4, 2019 -- New adversarial techniques developed by engineers at Southwest Research Institute can make objects "invisible" to image detection systems that use deep-learning algorithms. These techniques can also trick systems into thinking they see another object, or into misjudging where an object is located. The research will help mitigate the risk of compromise in automated image processing systems.

"Deep-learning neural networks are highly effective at many tasks," says Research Engineer Abe Garza of the SwRI Intelligent Systems Division. "However, deep learning was adopted so quickly that the security implications of these algorithms weren't fully considered."

Deep-learning algorithms excel at using shapes and color to recognize the differences between humans and animals or cars and trucks, for example. These systems reliably detect objects under an array of conditions and, as such, are used in myriad applications and industries, often for safety-critical uses. The automotive industry uses deep-learning object detection systems on roadways for lane-assist, lane-departure and collision-avoidance technologies. These vehicles rely on cameras to detect potentially hazardous objects around them. While the image processing systems are vital for protecting lives and property, the algorithms can be deceived by parties intent on causing harm.

Security researchers working in "adversarial learning" are finding and documenting vulnerabilities in deep- and other machine-learning algorithms. Using SwRI internal research funds, Garza and Senior Research Engineer David Chambers developed what look like futuristic, Bohemian-style patterns. When worn by a person or mounted on a vehicle, the patterns trick object detection cameras into thinking the objects aren't there, that they're something else or that they're in another location. Malicious parties could place these patterns near roadways, potentially creating chaos for vehicles equipped with object detectors.

"These patterns cause the algorithms in the camera to either misclassify or mislocate objects, creating a vulnerability," said Garza. "We call these patterns 'perception invariant' adversarial examples because they don't need to cover the entire object or be parallel to the camera to trick the algorithm. The algorithms can misclassify the object as long as they sense some part of the pattern."

While they might look like unique and colorful displays of art to the human eye, these patterns are designed in such a way that object-detection camera systems see them very specifically. A pattern disguised as an advertisement on the back of a stopped bus could make a collision-avoidance system think it sees a harmless shopping bag instead of the bus. If the vehicle's camera fails to detect the true object, it could continue moving forward and hit the bus, causing a potentially serious collision.
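The press release does not describe SwRI's pattern-generation method, but the class of vulnerability it exploits can be illustrated with the best-known adversarial-example technique, the Fast Gradient Sign Method (FGSM). The sketch below applies FGSM to a toy logistic classifier; the classifier, weights, and perturbation size are illustrative placeholders, not anything from the SwRI work:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, w, b, y, eps):
    """Fast Gradient Sign Method on a logistic classifier.

    Nudges the input x by eps in the direction that *increases* the
    loss for the true label y: x_adv = x + eps * sign(dL/dx).
    """
    p = sigmoid(w @ x + b)       # predicted probability of class 1
    grad_x = (p - y) * w         # gradient of cross-entropy loss w.r.t. x
    return x + eps * np.sign(grad_x)

# Toy classifier: decides class by the sign of the first input feature.
w = np.array([1.0, 0.0])
b = 0.0
x = np.array([0.1, 0.0])         # correctly classified as class 1
assert sigmoid(w @ x + b) > 0.5

# A small, targeted nudge flips the prediction.
x_adv = fgsm_perturb(x, w, b, y=1.0, eps=0.2)
print(sigmoid(w @ x_adv + b) > 0.5)  # False: the classifier is fooled
```

The same principle scales up: in a deep object detector the gradient is taken through millions of parameters, and the resulting perturbation can be rendered as a printable pattern rather than a per-pixel nudge.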

"The first step to resolving these exploits is to test the deep-learning algorithms," said Garza. The team has created a framework capable of repeatedly testing these attacks against a variety of deep-learning detection programs, which will be extremely useful for testing solutions.

SwRI researchers continue to evaluate how much, or how little, of the pattern is needed to misclassify or mislocate an object. Working with clients, the team will use this research to test object detection systems and ultimately improve the security of deep-learning algorithms.

Credit: 
Southwest Research Institute

Study: Protein key to Charcot-Marie-Tooth, other nerve diseases

image: This is a multi-color image of the human brain.

Image: 
National Institute of Mental Health, National Institutes of Health

LOS ANGELES (April 4, 2019) -- A new study provides critical insight into a little-known, yet relatively common, inherited neurological condition called Charcot-Marie-Tooth disease. The findings point to a pathway to possible treatments for this disease and better understanding of other neurodegenerative disorders, including Alzheimer's disease, that affect millions.

The study focused on two related proteins, MFN2 and MFN1, found on the outer membranes of mitochondria -- structures inside the body's cells that act as powerhouses by converting food into energy. Mitochondria play an especially critical role in nerve cells. Previous research has shown that mutated MFN2 causes mitochondria to malfunction in a common type of Charcot-Marie-Tooth disease -- CMT type 2A.

The new research, published in the April 1 issue of the Journal of Clinical Investigation, showed that increasing levels of MFN1 to counterbalance mutated MFN2 reduced symptoms of CMT type 2A and neurodegeneration in laboratory mice.

The multi-institutional study was co-led by Robert Baloh, MD, PhD, professor of Neurology, Ben Winters Chair in Regenerative Medicine and director of Cedars-Sinai Center for Neural Science and Medicine; and Yueqin Zhou, PhD, a postdoctoral researcher in his laboratory.

Charcot-Marie-Tooth disease affects an estimated 150,000 people in the U.S., according to the National Institutes of Health. It typically causes weakness, numbness, muscle cramps and movement problems in legs and arms. The CMT type 2A form of the disease also may cause wasting of the optic nerve, spinal cord damage leading to difficulty walking, hearing loss, developmental delay and changes in vital tissues of the brain known as white matter.

Despite the fact that mutated MFN2 can be expressed in every cell in the body, CMT type 2A primarily affects the nervous system. This is because levels of MFN1 are particularly low in brain cells, and restoring those levels can improve mitochondrial function. That fact is significant, Baloh said, "because findings about CMT2A can go beyond just a single disease. The hope is that similarly increasing MFN1 potentially could treat other neurodegenerative diseases that also involve mitochondrial dysfunction."

These other diseases include Alzheimer's disease, Parkinson's disease, and amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, which all have devastating consequences. Collectively, these three diseases are believed to affect about 7 million people in the U.S. Despite much research, the causes of these disorders and Charcot-Marie-Tooth disease remain elusive.

Because of their relevance to CMT type 2A and other neurodegenerative conditions, mitochondrial proteins have been the focus of intense study in recent years. Previous laboratory studies showed that the protein MFN1 could compensate for the loss of function of mutated MFN2. The new study advances these findings by testing the approach in laboratory mice.

To perform this experiment, the investigators incorporated a human gene with the mutation that causes the disease into the genome of the mouse. This technique allowed them to study CMT type 2A over the lifetime of the lab animal.

Mice with the mutated gene developed symptoms of CMT type 2A. Importantly, when levels of MFN1 or normal MFN2 were increased in mice with CMT type 2A, the disease process almost completely stopped. "It appears that MFN1 helps take over the work of the disabled, mutated protein in mice," Baloh said.

This finding raises the possibility that increasing levels of MFN1 using gene therapy or other approaches might in the future be used to treat patients with CMT type 2A and also other neurodegenerative diseases that involve mitochondrial dysfunction, he added.

Credit: 
Cedars-Sinai Medical Center

See and be seen

image: Schematic of the experimental setup: Depending on whether the coated glass beads have many or few neighbors within their field of vision (red), they are either illuminated by a laser beam or not. Researchers can use such an experiment to investigate the effects of visual information on the collective behavior and swarming of swimming microparticles.

Image: 
Noemi Furlani

Birds, fish and bacteria often gather into groups or swarms. This so-called collective behaviour requires all group members to continuously and reciprocally adapt their movements. It can be a challenging task, however, for researchers to ascertain the specific environmental stimuli that individuals respond to within the context of their group; in addition to optical and acoustic information, flow resistances or chemical messengers can also play a role. By designing experiments with artificial microswimmers, physicists at the University of Konstanz were able to show that the formation of stable groups requires only a few skills: forward visual perception over large distances and regulation of speed according to the number of perceived individuals. In addition to providing more insight into collective phenomena, their findings can also be used for research on autonomous systems. The results of their study were published in the current issue of the journal Science.

The ability to gather into compact swarms or groups is an effective skill that allows individuals to evade predators, find food or efficiently travel long distances. To begin to understand how swarms form, the following questions must be answered: What information does an individual perceive within its environment? And how does this individual then adapt its movement in response to such environmental stimuli? The so-called Vicsek model proposes that individual group members adjust their movement direction to that of the surrounding individuals. Additionally, there must be an attraction between the group members. If one of these two conditions (orientation or attraction) is not met, the swarm becomes unstable and disperses.

A simpler and more robust rule

As a result of their recent experiments, Clemens Bechinger, professor at the Department of Physics at the University of Konstanz, and his colleagues have discovered a much simpler and remarkably robust rule with which individuals spontaneously form a stable group: It only requires that individuals have forward, long-range vision, a basic ability of many living organisms. Each individual determines the number of peers visible in its own field of view. If this number reaches a certain threshold value, the particle begins to swim forward; otherwise its movements are entirely random. Here, it is not necessary for the individual to identify the exact locations of its neighbours. It must simply perceive them within its field of vision.

Instead of working with living organisms, the physicists use artificial microswimmers suspended in a liquid. These consist of glass beads with diameters of a few micrometers coated on one side with a thin layer of carbon. When a focused laser spot illuminates a bead, the carbon absorbs the light, causing the bead to heat up unevenly. The temperature gradient generates a fluid flow at the surface of the bead, which then starts swimming like a bacterium. The situation is comparable to a rotating ship propeller, which pushes water away and thus moves the vessel forwards.

To equip these microswimmers with a field of vision, the researchers use a trick: With the help of a computer, the positions and orientations of all glass particles are continuously monitored. This allows the researchers to determine the number of a particle's neighbours within a fixed angular range, which corresponds to the particle's field of vision. If this number exceeds a prescribed threshold value, a focused laser beam briefly illuminates the respective particle, causing it to perform a swimming motion. If, however, the number of particles remains below the threshold value, the corresponding particle is not illuminated by a laser beam, allowing the particle to undergo undirected and diffusive movements. Since this process is carried out several times a second, each microswimmer is induced to dynamically and continuously react to the slightest changes in its environment, just like a fish within its school. Using this procedure, the researchers observed that the particles spontaneously formed an artificial swarm.
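The feedback loop described above can be sketched as a minimal simulation of the vision-based rule: count the peers inside a forward cone of vision, swim only when the count reaches a threshold, and otherwise just diffuse. The parameter values (speed, half-angle, threshold, noise strengths) below are illustrative placeholders, not those used in the Konstanz experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(pos, theta, v=0.05, half_angle=np.pi / 4, threshold=3,
         d_rot=0.1, d_trans=0.005):
    """One update of the vision-based swarming rule.

    Each particle counts the neighbors inside its forward cone of
    vision (half_angle around its heading, unlimited range). If the
    count reaches the threshold it swims forward at speed v; otherwise
    it only diffuses. All particles get rotational and translational
    noise, mimicking undirected Brownian motion.
    """
    n = len(pos)
    dx = pos[None, :, :] - pos[:, None, :]          # dx[i, k]: vector from i to k
    dist = np.linalg.norm(dx, axis=-1)
    heading = np.stack([np.cos(theta), np.sin(theta)], axis=-1)
    # cosine of the angle between each heading and the direction to each neighbor
    cosang = np.einsum("ij,ikj->ik", heading, dx) / np.where(dist > 0, dist, 1.0)
    visible = (cosang > np.cos(half_angle)) & (dist > 0)   # exclude self
    active = visible.sum(axis=1) >= threshold
    pos = pos + v * active[:, None] * heading        # swim only if enough peers seen
    pos = pos + d_trans * rng.standard_normal((n, 2))
    theta = theta + d_rot * rng.standard_normal(n)   # rotational diffusion
    return pos, theta

# 50 particles scattered in a unit box; iterate the rule
pos = rng.uniform(0, 1, size=(50, 2))
theta = rng.uniform(0, 2 * np.pi, size=50)
for _ in range(200):
    pos, theta = step(pos, theta)
```

Note that, exactly as in the experiment, a particle never needs the positions or velocities of its neighbors; the only sensory input is a single number, the count of visible peers.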

Perceived information can be controlled in a precise manner

By adapting these "artificial organisms" for their research purposes, the physicists are not only able to precisely determine the information that individual group members perceive within their environment, they can also observe how changes in perception affect their collective behaviour. Modifying either their field of vision or perception threshold changes the respective level of group formation and cohesion. The physicists thus created particles with the broad field of view of herbivores and found that they stay together only by lowering their reaction threshold. In other words, herbivores need to keep a close eye on each other in order to stay within their protective group. The simple model also explains how the narrow field of vision of predators is an advantage for detecting prey over long distances.

Another important research finding is that gregarious individuals, in principle, do not have to adapt their velocity direction or gather information about the speed of their neighbours. From a control system point of view, this is extremely advantageous since minimal sensory and cognitive resources are required for such behaviour. This aspect might also prove useful for future applications, where, for example, millions of autonomous microrobots with limited computing capacity are expected to perform complex tasks. To ensure that such tasks are successfully carried out, they must be able to organise themselves and coordinate their behaviour. These abilities will also ensure that groups can master unforeseen situations, such as when schools of fish successfully evade an attacker.

Credit: 
University of Konstanz

Researchers have invented a quieter airplane toilet

Airplane toilets are loud. For some, they are downright terrifying. But chin up, frequent flyers, because a group of Brigham Young University physicists has figured out how to make them quieter.

After two years of trial and error, three academic publications and thousands of flushes, the BYU researchers have invented a vacuum-assisted toilet that is about half as loud as the regular airplane commode.

"People have told us they don't want their kids to be scared to use the bathroom on a flight," said lead researcher Kent Gee, BYU professor of physics. "So, we've used good physics to solve the problem."

It's been a really hard problem to solve, given the industry hasn't been able to improve vacuum-assisted toilets over the last 25 years. That's because getting airplane toilets to flush with very little water requires a partial vacuum, which at 38,000 feet, pulls air at nearly half the speed of sound. (According to research done in Gee and Scott Sommerfeldt's lab, an air-water mix in vacuum-assisted toilets travels more than 300 miles per hour.) When things move at that speed, any disturbance at all to the flow -- like the bend of a pipe or a valve -- generates significant noise.

And now that newer airplanes come with much quieter interiors, toilet flushes reverberate much more throughout the cabin. It can make for very patchy sleep on a red-eye flight on a plane like the Airbus A380 that can have as many as 20 toilets.

"Airline companies have always had standards for the toilet noise, but they've never met those and there has never been much pressure to do so," Sommerfeldt said. "Now with the reduced cabin sound levels, the sound of the toilet flushing is more noticeable and customers are pushing back."

To solve the problem, the BYU team focused on three valve conditions during the flush cycle: the initial noise level peak associated with the flush valve opening, an intermediate noise level plateau associated with the valve being fully opened and the final noise level peak associated with the flush valve closing. The researchers added additional piping to increase the distance between the toilet bowl and the flush valve and made the pipe attachment at the bowl more of a gradual bend as opposed to a sharp 90-degree angle. Tests of the new contraption show aeroacoustically-generated noise dropped up to 16 decibels during the flush valve opening and about 5 to 10 decibels when the valve is fully opened.

"It's a great mix between physics and engineering," said grad student Michael Rose, lead author on the team's most recent vacuum-assisted tech publication in Proceedings of Meetings on Acoustics. "The toilet is much quieter and now kids won't think they're going to get sucked out."

Along with Scott Thomson, professor of mechanical engineering, the researchers have already filed three patents on the new toilet and are now working with an industry partner to bring it to market. Part of the lure of the BYU invention is that it works with existing airplane toilets -- only the elbow need be removed during a retrofit, while the valve and the bowl stay where they are.

The vacuum-assisted tech could also be used for toilets on cruise ships and trains and even in some new green building projects where housing units are looking at more and more ways to reduce water usage.

"At the end of the day, this is about using science to improve a user experience," Gee said. "It's an important part of making flights more comfortable for customers."

Credit: 
Brigham Young University

Robots to autocomplete Soldier tasks, new study suggests

image: Army researchers are looking for ways to use brain data in the moment to indicate specific tasks Soldiers are performing. This knowledge, they say, will better enable AI to dynamically respond and adapt to assist the Soldier in completing the task.

Image: 
US Army graphic

Smart phones autocorrect in texting, search engines autocomplete queries, and mapping applications redirect navigation in real-time to avoid slowed traffic. These ubiquitous AI-based technologies adapt to everyday needs and learn user habits by focusing on making the algorithm better, but Army researchers want to enhance AI by providing more information about the intent of the user.

New research published in Science Advances today looks at Soldier brain activity during specific tasks for ways to incorporate AI teaming to dynamically complete tasks.

The Army envisions a future battlefield wrought with teams of Soldiers and autonomous systems, and as part of this future vision, the Army is looking to create technologies that can predict states and behaviors of the individual to create a more optimized team, said Dr. Jean Vettel, a senior neuroscientist at the Combat Capabilities Development Command Army Research Laboratory, the Army's corporate research laboratory also known as ARL.

Recent collaborative work between ARL and the University at Buffalo is looking at ways the dynamics and architecture of the human brain may be coordinated to predict such behaviors and consequently optimize team performance.

"While this research focuses on a single person, the purpose is to understand how an individual's brain activity can be used to create novel strategies for a teaming environment, both for teams with Soldiers as well as teams with Autonomy," said Vettel, a co-author of the recent paper.

"In military operations, Soldiers perform multiple tasks at once. They're analyzing information from multiple sources, navigating environments while simultaneously assessing threats, sharing situational awareness, and communicating with a distributed team. This requires Soldiers to constantly switch among these tasks, which means that the brain is also rapidly shifting among the different brain regions needed for these different tasks," Vettel said. "If we can use brain data in the moment to indicate what task they're doing, AI could dynamically respond and adapt to assist the Soldier in completing the task."

To achieve this future capability, the researchers first sought to understand how the brain coordinates its different regions while executing a particular task. They used a computational approach to understand how this may be characterized to inform the behavioral prediction.

To complete the study, researchers mapped how different regions of the brain were connected to one another in 30 different people via tracts of tissue called white matter. (The specific connectivity pattern linking different brain regions varies between individuals.)

Next, the scientists converted these maps into computational models of each subject's brain, and used computers to simulate what would happen when a single region of a person's brain was stimulated. The researchers then used a mathematical framework, which they developed, to measure how brain activity became synchronized across various cognitive systems in the simulations.

"The brain is very dynamic," Dr. Kanika Bansal, lead author on the work, says. "Connections between different regions of the brain can change with learning or deteriorate with age or neurological disease. Connectivity also varies between people. Our research helps us understand this variability and assess how small changes in the organization of the brain can affect large-scale patterns of brain activity related to various cognitive systems."

While the research describes foundational principles of brain coordination, Dr. Bansal points out that the method could potentially be extended beyond the brain as well, creating dynamic teaming assignments in the future.

"While the work has been deployed on individual brains of a finite brain structure, it would be very interesting to see if the coordination of Soldiers and autonomous systems may also be described with this method," Dr. Javier Garcia, ARL neuroscientist and co-author, points out. "Much as the brain coordinates regions that carry out specific functions, you can think of how this method may describe how coordinated teams of individuals and autonomous systems with varied skills work together to complete a mission."

Credit: 
U.S. Army Research Laboratory

New study finds poor diet kills more people globally than tobacco and high blood pressure

video: Lancet editor Richard Horton calls the new Global Burden of Disease study on diet a "blockbuster."

Image: 
IHME

SEATTLE - Poor diet is responsible for more deaths globally than tobacco, high blood pressure, or any other health risk, according to a new scientific study.

Consuming low amounts of healthy foods, such as whole grains, and too many unhealthy foods, including sweetened beverages, accounts for one in every five deaths globally.

"Poor diet is an equal opportunity killer," said Dr. Ashkan Afshin, lead author on the study and an assistant professor at the Institute for Health Metrics and Evaluation (IHME) at the University of Washington. "We are what we eat and risks affect people across a range of demographics, including age, gender, and economic status."

Afshin, who authored a global paper on obesity in 2017, emphasized that today's study focuses on the effects of food on chronic health problems, such as heart disease and diabetes, independent of their connections to obesity. More than 130 scientists from nearly 40 countries contributed to the analysis, which was published today in the international medical journal The Lancet. The paper is the most comprehensive analysis of the health effects of diet ever conducted.

Poor diets were responsible for 10.9 million deaths, or 22% of all deaths among adults in 2017, with cardiovascular disease (CVD) as the leading cause, followed by cancers and diabetes. They also resulted in 255 million disability-adjusted life years (DALYs), which equal the sum of years of life lost and years lived with disability. Poor diet represents 16% of all DALYs among adults globally.
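For readers unfamiliar with the metric, the DALY arithmetic implied by these figures can be checked in a few lines. This is a back-of-the-envelope sketch; the global totals below are inferred from the percentages quoted in the paragraph above, not reported directly:

```python
def dalys(yll, yld):
    """A DALY (disability-adjusted life year) is the sum of years of
    life lost (YLL) and years lived with disability (YLD)."""
    return yll + yld

# Back out the implied global adult totals from the study's figures:
# 10.9 million diet-related deaths = 22% of all adult deaths,
# 255 million diet-related DALYs = 16% of all adult DALYs.
deaths_from_diet = 10.9e6
dalys_from_diet = 255e6
total_adult_deaths = deaths_from_diet / 0.22   # ≈ 49.5 million
total_adult_dalys = dalys_from_diet / 0.16     # ≈ 1.6 billion
print(round(total_adult_deaths / 1e6, 1))  # 49.5
```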

In comparison, tobacco was associated with 8.0 million deaths, and high blood pressure was linked to 10.4 million deaths. (see: https://vizhub.healthdata.org/gbd-compare/#)

In 2017, CVD was the leading cause of diet-related deaths (9,497,300) and DALYs (207.2 million), followed by cancers (913,100 deaths and 20.2 million DALYs), diabetes (338,700 deaths and 23.7 million DALYs), and kidney diseases (136,600 deaths and 3.4 million DALYs).

The study finds that while the impact of individual dietary factors varies across countries, three dietary factors - low intake of whole grains, low intake of fruits, and high intake of sodium - accounted for more than 50% of diet-related deaths and 66% of DALYs. The other 50% of deaths and 34% of DALYs were attributed to high consumption of red meat, processed meats, sugar-sweetened beverages, and trans fatty acids, among other foods.

"We are highlighting the importance of low consumption of healthy foods as compared to the greater consumption of unhealthy foods," Afshin said. "Dietary policies focusing on promoting healthy eating can have a more beneficial effect than policies advocating against unhealthy foods."

The largest gaps between current and optimal diets were observed for nuts and seeds, milk, and whole grains. Some of those gaps, Afshin said, result from food producers and manufacturers.

"There is an urgent and compelling need for changes in the various sectors of the food production cycle, such as growing, processing, packaging, and marketing," Afshin said. "Our research finds the need for a comprehensive food system intervention to promote the production, distribution, and consumption of healthy foods across nations."
Harvard Professor Dr. Walter Willett, a co-author of the study, noted that the findings are consistent with a recent summary of randomized trials documenting benefits on risk factors for cardiovascular disease by replacing red meat with plant sources of protein.

"Thus, adoption of diets emphasizing soy foods, beans and other healthy plant sources of protein will have important benefits for both human and planetary health," he said.

While sodium, sugar, and fat have been the focus of diet policy debate in recent years, the assessment shows the leading risk factors resulting in death are diets high in sodium, low in whole grains, low in fruit, low in nuts and seeds, and low in vegetables. Each of these accounts for more than 2% of all deaths globally.

Among the world's 20 most populous countries, Egypt had the highest rate of diet-related deaths (552 per 100,000) and DALYs (11,837 per 100,000) in 2017; Japan had the lowest rate of diet-related deaths (97 per 100,000) and DALYs (2,300 per 100,000).

Credit: 
Institute for Health Metrics and Evaluation

The screening interval for high cardiovascular disease risk should be individualized

Current American Heart Association, European Society of Cardiology, and UK National Health Service guidelines recommend a 5-yearly health check interval for screening of individuals at high cardiovascular disease risk. This health check covers measurement of a variety of risk factors including systolic blood pressure, cholesterol profile, blood glucose, and smoking status.

If lifestyle interventions are inadequate to reduce the risk, the guidelines recommend primary preventive medication such as statins. However, the 5-yearly screenings are not based on direct research evidence.

According to a study published today in The Lancet Public Health, screening for high cardiovascular disease risk should be based on individual risk level. The authors conclude that this would be achieved without increased health care costs. The study showed that current 5-year screening intervals were unnecessarily frequent for low-risk individuals and insufficiently frequent for intermediate-risk individuals.

"Our study shows that by optimizing the screening intervals, 8% of myocardial infarcts and strokes could be prevented without increase in health care costs. This means that during the next 20 years, in the English population aged now 40 to 64, the number of new myocardial infarcts or strokes prevented annually could reach 5000," says lead author Joni Lindbohm MD, PhD from the University of Helsinki.

The authors estimated the optimal screening interval by following development of cardiovascular disease risk in 7000 English men and women who participated in the Whitehall II study. This study measured their cardiovascular disease risk factors according to the current guidelines in 5-yearly intervals over a 22-year follow-up and collected data on cardiovascular diseases using national electronic health and death records.

Those at low risk for cardiovascular diseases spent on average 9 years in that risk category before moving to intermediate-low risk. The participants then spent on average 7 years in this next category before progressing to intermediate-high risk. However, the time spent in intermediate-high risk was only 4 years; after this, over 70% of participants progressed to the high-risk category that leads to consideration of preventive medication if lifestyle intervention is insufficient to reduce the risk.

An individualized screening interval would enable more effective cardiovascular disease prevention by means of lifestyle intervention or preventive medication, because of more timely detection of those at high risk.

"The results are promising, but national guidelines are rarely changed based on one study. The benefits of individualized screening intervals should be further studied in a randomized controlled trial before changing the guidelines," emphasizes one of the authors, Professor Mika Kivimäki, Director of the Whitehall II study at University College London.

Credit: 
University of Helsinki