Tech

New study reveals local drivers of amplified Arctic warming

image: An international team of researchers, including Professor Sarah Kang (left) and DoYeon Kim (right) in the School of Urban and Environmental Engineering at UNIST, has shown that Arctic Amplification appears to be attributable largely to local greenhouse gas concentrations. | Photo Credit: Pixabay

Image: 
UNIST

The Arctic experienced an extreme heat wave in February 2018. Temperatures at the North Pole soared to the melting point of ice, about 30-35 degrees Fahrenheit (17-19 degrees Celsius) above normal. Recent studies also indicate that the mass of Arctic glaciers has declined by more than 70% since the 1980s. These sudden climate changes affect not just the Arctic regions but also the water, food, and energy security nexus throughout the globe. This is why climate scientists from around the world are paying increasing attention to this accelerated warming pattern, commonly referred to as 'Arctic Amplification'.

An international team of researchers, including Professor Sarah Kang and DoYeon Kim in the School of Urban and Environmental Engineering at UNIST, has shown that Arctic Amplification appears to be attributable largely to local greenhouse gas concentrations.

Published in the November 2018 issue of Nature Climate Change, their study on the causes of Arctic Amplification shows that local greenhouse gas concentrations and Arctic climate feedbacks outweigh other processes. The study was led by Assistant Project Leader Malte F. Stuecker of the IBS Center for Climate Physics (ICCP) in Busan, South Korea, with participating researchers from around the globe, including the United States, Australia, and China.

Long-term observations of surface temperatures show intensified surface warming in Canada, Siberia, Alaska and the Arctic Ocean relative to the global mean temperature rise. Arctic Amplification is also consistent with computer model simulations of the response to increasing greenhouse gas concentrations. However, the underlying physical processes behind the intensified warming remain elusive.

Using complex computer simulations, the scientists were able to disprove previously suggested hypotheses that emphasized the transport of heat from the tropics to the poles as one of the key contributors to the amplified warming in the Arctic.

"Our study clearly shows that local carbon dioxide forcing and polar feedbacks are most effective in Arctic amplification compared to other processes", says Assistant Project Leader Malte F. Stuecker, the corresponding author of the study.

Increasing anthropogenic carbon dioxide (CO2) concentrations trap heat in the atmosphere, which leads to surface warming. Regional processes can then further amplify or dampen this effect, thereby creating the typical pattern of global warming. In the Arctic region, surface warming reduces snow and sea-ice extent, which in turn decreases the reflectivity of the surface. As a result, more sunlight can reach the top layers of the soil and ocean, leading to accelerated warming. Furthermore, changes in Arctic clouds and in the vertical atmospheric temperature profile can enhance warming in the polar regions.

In addition to these factors, heat can be transported into the Arctic by winds. "We see this process for instance during El Niño events. Tropical warming, caused either by El Niño or anthropogenic greenhouse emissions, can cause global shifts in atmospheric weather patterns, which may lead to changes in surface temperatures in remote regions, such as the Arctic", said Kyle Armour, co-author of the study and professor of Atmospheric Sciences and Oceanography at the University of Washington.

Moreover, global warming outside the Arctic region will also lead to an increase in Atlantic Ocean temperatures. Ocean currents, such as the Gulf Stream and the North Atlantic drift can then transport the warmer waters to the Arctic ocean, where they could melt sea ice and experience further amplification due to local processes.

To determine whether tropical warming, atmospheric wind and ocean current changes contribute to future Arctic Amplification, the team designed a series of computer model simulations. "By comparing simulations with only Arctic CO2 changes with simulations that apply CO2 globally, we find similar Arctic warming patterns. These findings demonstrate that remote physical processes from outside the polar regions do not play a major role, in contrast to previous suggestions", says co-author Cecilia Bitz, professor of Atmospheric Sciences at the University of Washington.

In the tropics - fueled by high temperature and moisture - air can easily move up to high altitudes, meaning the atmosphere is unstable. In contrast, the Arctic atmosphere is much more stable with respect to vertical air movement. This condition enhances the CO2-induced warming in the Arctic near the surface. In the tropics - due to the unstable atmosphere - CO2 mostly warms the upper atmosphere and energy is easily lost to space. This is opposite to what happens in the Arctic: Less outgoing infrared radiation escapes the atmosphere, which further amplifies the surface-trapped warming.

"Our computer simulations show that these changes in the vertical atmospheric temperature profile in the Arctic region outweigh other regional feedback factors, such as the often-cited ice-albedo feedback" says Malte Stuecker.

The findings of this study highlight the importance of Arctic processes in controlling the pace at which sea ice will retreat in the Arctic Ocean. The results are also important for understanding how sensitive polar ecosystems, Arctic permafrost and the Greenland ice sheet will respond to global warming.

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Scientists turn carbon emissions into usable energy

image: This is a schematic illustration of the hybrid Na-CO2 system and its reaction mechanism.

Image: 
UNIST

A recent study affiliated with UNIST has developed a system that produces electricity and hydrogen (H2) while eliminating carbon dioxide (CO2), the main contributor to global warming.

This breakthrough has been led by Professor Guntae Kim in the School of Energy and Chemical Engineering at UNIST in collaboration with Professor Jaephil Cho in the Department of Energy Engineering and Professor Meilin Liu in the School of Materials Science and Engineering at Georgia Institute of Technology.

In this work, the research team presented a hybrid Na-CO2 system that can continuously produce electrical energy and hydrogen through efficient CO2 conversion, with stable operation for over 1,000 hours, driven by spontaneous CO2 dissolution in aqueous solution.

"Carbon capture, utilization, and sequestration (CCUS) technologies have recently received a great deal of attention for providing a pathway in dealing with global climate change," says Professor Kim. "The key to that technology is the easy conversion of chemically stable CO2 molecules to other materials." He adds, "Our new system has solved this problem with CO2 dissolution mechanism."

Much of the CO2 emitted by human activity is absorbed by the ocean, where it increases the water's acidity. The researchers focused on this phenomenon and came up with the idea of dissolving CO2 in water to induce an electrochemical reaction. As acidity increases, the number of protons increases, which in turn increases the pull on electrons. If a battery system is created based on this phenomenon, electricity can be produced while removing CO2.

Their hybrid Na-CO2 system, just like a fuel cell, consists of an anode (sodium metal), a separator (NASICON), and a cathode (catalyst). Unlike conventional batteries, the catalyst is contained in water and connected to the anode by a lead wire. When CO2 is injected into the water, the reaction begins, eliminating CO2 while creating electricity and H2. The CO2 conversion efficiency reaches a high 50 percent.
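The discharge chemistry described above can be sketched as a set of half-reactions. This is a plausible scheme consistent with the article's description, labeling electrodes by the usual discharge convention (oxidation at the anode); the published paper may write the reactions differently:

```latex
\begin{aligned}
\text{Sodium electrode (oxidation):} &\quad \mathrm{Na \;\rightarrow\; Na^{+} + e^{-}} \\
\text{Dissolution (acidification):} &\quad \mathrm{CO_{2} + H_{2}O \;\rightarrow\; H^{+} + HCO_{3}^{-}} \\
\text{Catalyst electrode (reduction):} &\quad \mathrm{2\,H^{+} + 2\,e^{-} \;\rightarrow\; H_{2}}
\end{aligned}
```

Dissolved CO2 supplies the protons that are reduced to hydrogen gas, which is why injecting CO2 both drives the cell and removes the gas from the water.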

"This hybrid Na-CO2 cell, which adopts efficient CCUS technologies, not only utilizes CO2 as the resource for generating electrical energy but also produces the clean energy source, hydrogen," says Jeongwon Kim in the Combined M.S/Ph.D. in Energy Engineering at UNIST, the co-first author for the research.

In particular, this system has shown stability to the point of operating for more than 1,000 hours without damage to the electrodes. The system can be applied to remove CO2 by inducing spontaneous chemical reactions.

"This research will lead to more derived research and will be able to produce H2 and electricity more effectively when electrolytes, separator, system design, and electrocatalysts are improved," said Professor Kim.

Credit: 
Ulsan National Institute of Science and Technology(UNIST)

Signs of memory problems could actually be symptoms of hearing loss

image: This is an older adult trying on a hearing aid.

Image: 
Baycrest

Older adults concerned about displaying early symptoms of Alzheimer's disease should also consider a hearing check-up, suggest recent findings.

What might appear to be signs of memory loss could actually point to hearing issues, says Dr. Susan Vandermorris, one of the study's authors and a clinical neuropsychologist at Baycrest.

A recent Baycrest study, published in the Canadian Journal on Aging, found that the majority (56 per cent) of participants being evaluated for memory and thinking concerns and potential brain disorders had some form of mild to severe hearing loss, but only about 20 per cent of individuals used hearing aids. A quarter of the participants did not show any signs of memory loss due to a brain disorder.

"We commonly see clients who are worried about Alzheimer's disease because their partner complains that they don't seem to pay attention, they don't seem to listen or they don't remember what is said to them," says Dr. Vandermorris. "Sometimes addressing hearing loss may mitigate or fix what looks like a memory issue. An individual isn't going to remember something said to them if they didn't hear it properly."

Hearing loss is the third most common chronic health condition in older adults, experienced by 50 per cent of individuals over the age of 65 and 90 per cent of people over the age of 80. On average, people wait 10 years before seeking treatment, and fewer than 25 per cent of individuals who need hearing aids buy them.

Hearing status is not always addressed in neuropsychology clinics, but can impact performance on memory assessments done verbally, adds Dr. Vandermorris.

"Some people may be reluctant to address hearing loss, but they need to be aware that hearing health is brain health and help is available," she adds.

The study analyzed results from 20 individuals who were receiving a neuropsychological assessment at Baycrest. Participants completed a hearing screening test after their cognitive evaluation.

Neuropsychologists received the hearing test results after their initial assessment, which altered some of their recommendations. For example, some clients were referred to a hearing clinic for a full audiology assessment or advised to consider using a hearing aid, and were given education on hearing loss and communication.

"Since hearing loss has been identified as a leading, potentially modifiable risk factor for dementia, treating it may be one way people can reduce the risk," says Marilyn Reed, another author on the study and practice advisor with Baycrest's audiology department. "People who can't hear well have difficulty communicating and tend to withdraw from social activities as a way of coping. This can lead to isolation and loneliness, which can impact cognitive, physical and mental health."

This study builds on earlier research that analyzed how addressing memory problems could benefit older adults seeking hearing loss treatment.

"We are starting to learn more about the important role hearing plays in the brain health of our aging population," says Dr. Kate Dupuis, lead author on the study, a former postdoctoral fellow at Baycrest, clinical neuropsychologist and Schlegel Innovation Leader at the Sheridan Centre for Elder Research. "In order to provide the best care to our older clients, it is imperative that neuropsychologists and hearing care professionals work together to address the common occurrence of both cognitive and hearing loss in individuals."

Since the studies, Baycrest's Neuropsychology and Cognitive Health Program and Hearing Services have incorporated general screening for hearing and memory issues into their assessments, as well as provided educational materials to clients.

Next steps for the study will involve optimizing screening strategies for hearing loss in memory assessments and ongoing interprofessional collaborations to create educational tools that counsel clients about the relationship between hearing, memory and brain health.

Credit: 
Baycrest Centre for Geriatric Care

UTA herpetologists describe new species of snake found in stomach of predator snake

image: Jonathan Campbell, UTA professor of biology; Eric Smith, UTA associate professor of biology; and Alexander Hall, who earned a UTA doctorate in quantitative biology in 2016, wrote a paper in the Journal of Herpetology about the snake found in a snake.

Image: 
UT Arlington

Herpetologists at The University of Texas at Arlington have described a previously unknown species of snake that was discovered inside the stomach of another snake more than four decades ago.

The new snake has been named Cenaspis aenigma, which translates from Latin as "mysterious dinner snake." It is described in a recent paper in the Journal of Herpetology titled "Caudals and Calyces: The Curious Case of a Consumed Chiapan Colubroid." The paper was co-authored by Jonathan Campbell, UTA professor of biology; Eric Smith, UTA associate professor of biology; and Alexander Hall, who earned a UTA doctorate in quantitative biology in 2016.

The researchers' work identifies Cenaspis as not only a new species but also an entirely new genus.

The specimen was found in the stomach of a Central American coral snake -- a species that has been known to eat smaller snakes -- by palm harvesters in the southern Mexico state of Chiapas in 1976. The 10-inch-long specimen was preserved in a museum collection. Amazingly, a live specimen has never been found in the ensuing 42 years.

"This small snake was obtained now over 40 years ago, and the report of its discovery has been a long time in coming," the co-authors wrote in the Journal of Herpetology paper. "We were optimistic that additional specimens might be secured, but after at least a dozen more trips into the region spanning several decades, we have been unrewarded."

Cenaspis has several unique features that defy placement in any known genus and clearly distinguish it from all known genera. These include undivided subcaudal scales, or enlarged plates on the underside of its tail; a lack of spines and presence of cup-like structures called calyces on its hemipenes, or paired male reproductive organs found in snakes and lizards; and the shape of its skull.

The first two of those features are not found in any other known snake in the family Colubridae in the Western Hemisphere. Colubridae is the largest snake family and includes just over 51 percent of all known living snake species.

Utilizing the vast resources of UTA's Amphibian and Reptile Diversity Research Center for comparative purposes, the researchers made CT scans of dozens of specimens of snakes. The biologists believe that due to some of the specimen's physical features, Cenaspis is likely a burrowing snake that feeds on insects and spiders. Campbell believes that Cenaspis is not extinct but has eluded capture due to its burrowing lifestyle and other elusive habits.

"This provides evidence of just how secretive some snakes can be," Campbell told National Geographic, which ran a story about the discovery in its Dec, 19, 2018, edition. "Combine their elusive habits with restricted ranges and some snakes do not turn up often."

He noted that because of the snake's unique nature, the Chiapas highlands of southern Mexico where it was found all those years ago should be considered for protected status, so that more unknown species can be discovered rather than face possible extinction.

Credit: 
University of Texas at Arlington

Smart microrobots that can adapt to their surroundings

video: Scientists at EPFL and ETH Zurich have developed tiny elastic robots that can change shape depending on their surroundings.

Image: 
EPFL

One day we may be able to ingest tiny robots that deliver drugs directly to diseased tissue, thanks to research being carried out at EPFL and ETH Zurich.

The group of scientists - led by Selman Sakar at EPFL and Bradley Nelson at ETH Zurich - drew inspiration from bacteria to design smart, biocompatible microrobots that are highly flexible. Because these devices are able to swim through fluids and modify their shape when needed, they can pass through narrow blood vessels and intricate systems without compromising on speed or maneuverability. They are made of hydrogel nanocomposites that contain magnetic nanoparticles allowing them to be controlled via an electromagnetic field.

In an article appearing in Science Advances, the scientists describe the method they have developed for "programming" the robot's shape so that it can easily travel through fluids that are dense, viscous or moving at rapid speeds.

Embodied intelligence

When we think of robots, we generally think of bulky machines equipped with complex systems of electronics, sensors, batteries and actuators. But on a microscopic scale, robots are entirely different.

Fabricating miniaturized robots presents a host of challenges, which the scientists addressed using an origami-based folding method. Their novel locomotion strategy employs embodied intelligence, which is an alternative to the classical computation paradigm that is performed by embedded electronic systems. "Our robots have a special composition and structure that allow them to adapt to the characteristics of the fluid they are moving through. For instance, if they encounter a change in viscosity or osmotic concentration, they modify their shape to maintain their speed and maneuverability without losing control of the direction of motion," says Sakar.

These deformations can be "programmed" in advance so as to maximize performance without the use of cumbersome sensors or actuators. The robots can be either controlled using an electromagnetic field or left to navigate on their own through cavities by utilizing fluid flow. Either way, they will automatically morph into the most efficient shape.

Inspired by nature

"Nature has evolved a multitude of microorganisms that change shape as their environmental conditions change. This basic principle inspired our microrobot design. The key challenge for us was to develop the physics that describe the types of changes we were interested in, and then to integrate this with new fabrication technologies," says Nelson. In addition to offering enhanced effectiveness, these miniaturized soft robots can also be manufactured easily at a reasonable cost. For now, the research team is working on improving the performance for swimming through complex fluids like those found in the human body.

Credit: 
Ecole Polytechnique Fédérale de Lausanne

Orchards in natural habitats draw bee diversity, improve apple production

ITHACA, N.Y. - Apple orchards surrounded by agricultural lands are visited by a less diverse collection of bee species than orchards surrounded by natural habitats, according to a new Cornell University-led study, published in the journal Science.

In turn, apple production suffers when fewer, more closely related species of bees pollinate an orchard. Production improves in orchards surrounded by natural habitats, which draw a broader selection of species to apple blossoms.

The researchers examined 10 years of data from 27 New York state apple orchards. The study accounted for the types of landscapes that surround these orchards, measured apple production and surveyed the species of bees that visited each orchard.

The researchers also reconstructed the evolutionary history and relatedness of New York native bee species to better understand species patterns that played out across these orchard bee communities. This reconstruction is represented by a branching tree-like diagram of related species, called a phylogeny.

"Orchards that have bee communities that are more closely related to each other did worse in terms of their fruit production, and the communities that are more broad across the phylogeny did much better," said Heather Grab, Ph.D., the paper's first author and a postdoctoral researcher in the lab of Katja Poveda, associate professor of entomology and a co-author of the study. Brian Danforth, professor of entomology, is a senior author of the study.

Species of bees exhibit different behaviors in how and when they pollinate flowers. Some species approach from the side, others from the top, and they each may feed at different times of day and with varied frequencies, all of which affect how completely an apple flower is pollinated.

Organs in apple flowers must receive a certain number of pollen grains in order to develop a full complement of seeds. When seeds do well, the tissue that supports those seeds, the fleshy part of the fruit, is also more fully developed.

"If only half of the seeds mature fully, then the fruit is misshapen," which in turn affects weight and salability, Grab said.

In this way, habitats that surround farms affect the diversity of bee communities and, thus, an orchard's productivity.

Credit: 
Cornell University

Automated text messages improve outcomes after joint replacement surgery

January 17, 2019 - An automated text messaging system increases patient engagement with home-based exercise and promotes faster recovery after total knee or hip replacement surgery, reports a study in the January 16, 2019 issue of The Journal of Bone & Joint Surgery. The journal is published in the Lippincott portfolio in partnership with Wolters Kluwer.

Patients receiving timely texts showed improvement in several key outcomes, including fewer days on opioid pain medications, more time spent on home exercises, faster return of knee motion, and higher satisfaction scores, according to the research by Kevin J. Campbell, MD, of Rush University Medical Center, Chicago, and colleagues. "A chatbot that texts timely, informative and encouraging messages to patients can improve clinical outcomes and increase patient engagement in the early postoperative period after total joint replacement," Dr. Campbell comments.

Automated Texts Lead to Improved Outcomes of Surgery

The randomized trial included 159 patients undergoing primary total knee or hip replacement. All received standard education, including instructions on home exercises after surgery.

In addition, one group of patients received a series of automated, physician-specific text messages. The pre-programmed texts provided recovery instructions along with encouraging and empathetic messages, personalized video messages from the surgeon, and brief instructional therapy videos. The texts were sent via a service called STREAMD; Dr. Campbell is the CEO and Co-Founder of STREAMD.

"The content of the text and video messages reinforced the perioperative instructions and were delivered to patients at the appropriate time based on their recovery progress," the researchers write. Over the six-week period after surgery, patients in the text-message group received about 90 texts. The system did not accept inbound text responses from patients, although patients could access further information on topics they selected.

Patients who received automated texts performed their home exercises an average of 46 minutes per day, compared with 38 minutes in the standard-care group, a significant difference of about nine minutes per day. The texted group had greater knee motion at three weeks' follow-up, suggesting faster short-term recovery, but by six weeks, knee motion was similar between groups.

Patients in the text-message group stopped using opioid pain medications about 10 days sooner than those in the control group (22 versus 32 days). They also had higher mood scores and were more likely to say that their postoperative instructions were clear. Patients assigned to automated texts also made fewer phone calls to the surgeon's office. There was a trend toward fewer emergency department visits as well, although this difference was not statistically significant.

There is growing interest in using text messages to increase patient engagement in recovery after surgery. But previous digital patient engagement platforms have not been widely adopted by either patients or healthcare providers.

This study provides evidence of improved outcomes when an automated text-message system makes daily contact with patients and provides them with relevant information and encouragement. Advantages include more time doing recommended home exercises, faster recovery of knee motion, and improved patient satisfaction.

The 10-day reduction in opioid use is a potentially important advantage, reducing the risk of persistent opioid use and other complications. "This finding could be related to improved patient education and to the encouraging and empathetic tone of the text and video messages," Dr. Campbell comments. "It could also reflect improved mood scores and patients' confidence in their ability to manage their recovery, which have been shown to be very effective pain relievers."

The benefits of such an automated system could be especially important at a time when more patients are undergoing joint replacement surgery with less overall contact with the treatment team. "As we search for practical methods to engage patients, automated messages providing education, support, and encouragement create a natural and convenient way for patients to receive information, potentially improving key outcomes without placing extra time demands on the surgeon and staff," Dr. Campbell concludes.

Credit: 
Wolters Kluwer Health

Measuring AI's ability to learn is difficult

Organizations looking to benefit from the artificial intelligence (AI) revolution should be cautious about putting all their eggs in one basket, a study from the University of Waterloo has found.

In a study published in Nature Machine Intelligence, Waterloo researchers found that contrary to conventional wisdom, there can be no exact method for deciding whether a given problem may be successfully solved by machine learning tools.

"We have to proceed with caution," said Shai Ben-David, lead author of the study and a professor in Waterloo's School of Computer Science. "There is a big trend of tools that are very successful, but nobody understands why they are successful, and nobody can provide guarantees that they will continue to be successful.

"In situations where just a yes or no answer is required, we know exactly what can or cannot be done by machine learning algorithms. However, when it comes to more general setups, we can't distinguish learnable from un-learnable tasks."

In the study, Ben-David and his colleagues considered a learning model called estimating the maximum (EMX), which captures many common machine learning tasks, such as identifying the best place to locate a set of distribution facilities to optimize their accessibility for expected future consumers. The research found that no mathematical method would ever be able to tell, given a task in that model, whether an AI-based tool could handle that task or not.
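The flavor of an EMX task can be shown with a toy sketch: from samples alone, estimate which set in a given family captures the most probability mass. Here that family is a set of hypothetical facility "neighborhoods," and we pick the one serving the most sampled consumers. This is an illustrative simplification, not the formal model from the paper (which concerns whether such learning is possible at all for general set families):

```python
import random

def empirical_emx(samples, candidate_sets):
    """Pick the candidate set covering the largest fraction of the samples.

    A toy stand-in for the EMX objective: maximize the (unknown)
    probability mass of a set, estimated from finite data.
    """
    def coverage(s):
        return sum(1 for x in samples if x in s) / len(samples)
    return max(candidate_sets, key=coverage)

# Consumers live at integer locations 0..9; each candidate set is the
# neighborhood an individual facility site could serve (site +/- 1).
random.seed(0)
consumers = [random.randint(0, 9) for _ in range(1000)]
neighborhoods = [set(range(c - 1, c + 2)) for c in range(10)]
best = empirical_emx(consumers, neighborhoods)
```

For a finite family like this, picking the empirically best set works; the paper's result is that for richer families there is provably no general way to decide whether the task is learnable.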

"This finding comes as a surprise to the research community since it has long been believed that once a precise description of a task is provided, it can then be determined whether machine learning algorithms will be able to learn and carry out that task," said Ben-David.

Credit: 
University of Waterloo

NSD2: Gene responsible for spread of prostate cancer identified

image: When Antonina Mitrofanova learned she couldn't become an oncologist, she changed majors to computer science. Now a pioneer in the emerging field of biomedical informatics, she is fighting cancer with big data.

Image: 
Nick Romanenko

A Rutgers study has found that a specific gene in cancerous prostate tumors indicates when patients are at high-risk for the cancer to spread, suggesting that targeting this gene can help patients live longer.

The study, which was published in the journal Nature Communications, identified the NSD2 gene through a computer algorithm developed to determine which metastasis-related genes from a mouse model were most relevant to humans. The researchers were able to turn off the gene in the mouse tumor cells, which significantly decreased the cancer's spread.

"Currently, when a patient is diagnosed with prostate cancer, physicians can determine how advanced a tumor is but not whether the patients' cancer will spread," said lead author Antonina Mitrofanova, an assistant professor at Rutgers School of Health Professions and a research member of Rutgers Cancer Institute of New Jersey. "If we can determine whether a patient's cancer is likely to spread at the time of diagnosis, we can start them on a targeted treatment plan as soon as possible to decrease the likelihood of their cancer spreading."

While the algorithm used in the study focused on prostate cancer, Mitrofanova said it can be applied more broadly to study other cancers to better understand what findings can be translated to people.

According to the American Cancer Society, prostate cancer is the second most common cancer in American men and the second leading cause of cancer deaths.

Credit: 
Rutgers University

Another piece of Ebola virus puzzle identified

image: Ebola virus (red) replicates much faster in human macrophages depleted of RBBP6 protein (right panel).

Image: 
Texas Biomed

A team of researchers has discovered an interaction between an Ebola virus protein and a protein in human cells that may be an important key to unlocking the replication pathway of the killer virus in human hosts. Scientists at Texas Biomedical Research Institute were part of a nationwide collaboration with scientists at Gladstone Institutes, UC San Francisco and Georgia State University for a study recently published in the journal Cell.

Scientists around the globe are trying to pinpoint potential drug targets to stop Ebola virus disease, a hemorrhagic fever that killed 382 people in the latest outbreak in the Democratic Republic of Congo in 2018. Thousands of people have died from Ebola since an outbreak erupted in West Africa four years ago.

Texas Biomed Staff Scientist Olena Shtanko, Ph.D., describes this new work as a "turning point for understanding how replication of Ebola virus is modulated." Her role in the project was to validate and test whether the interaction between an Ebola virus protein called VP30 and a host (human) protein called RBBP6 plays a role in the life cycle of the virus. Dr. Shtanko worked on this project while in the lab of Dr. Robert Davey, a former Texas Biomed scientist now at Boston University.

Earlier research by scientists in California used a protein interaction map to narrow down host-virus protein interactions, then used a yeast system and an artificial proxy virus system to demonstrate this particular protein-protein interaction. However, scientists still needed to use replicating virus and human immune cells to test the clinical significance of the finding.

"The interaction is important if you can show functional significance of what it does to the virus in cells that have clinical relevance," Shtanko stressed. "If you can figure out the mechanism within these cells, then you can potentially manipulate it and stop the disease progression."

Texas Biomed Staff Scientist Eusondia Arnett, Ph.D., and Texas Biomed President and CEO Dr. Larry Schlesinger - both tuberculosis researchers - have expertise in working with human macrophage (immune) cells drawn from donated blood samples. "We were able to capitalize on our experience with macrophages to over- and under-express the RBBP6 (host) protein to create an effective model for this important Ebola virus research," Arnett said.

By over- and under-expressing the RBBP6, Shtanko was able to test what impact the protein had on the growth of Ebola virus in the macrophages. Shtanko said the results were striking. When the host protein was under-expressed, the viral replication went up exponentially. She found similar results when working with vascular cells, which are also key to Ebola virus replication in an infected patient.

The study was also an example of the Institute's new team-science environment, whereby researchers capitalize not only on the resources available at Texas Biomed but also on the expertise of its cross-functional teams (i.e., Ebola virus and macrophage biology) to produce beneficial results.

Texas Biomed has one of only a handful of Biosafety Level 4 laboratories in the United States where scientists can safely work with deadly pathogens, such as Ebola virus, which cause diseases for which there is no cure. This allowed for the study to be completed in human cells using live virus. And, Shtanko explained, the study turnaround was very short, so having the expertise of Arnett and Schlesinger was integral in creating the human macrophage cells necessary to complete the study.

Credit: 
Texas Biomedical Research Institute

Research to improve welding process for manufacturing industries

Arc welding and additive manufacturing are hugely important for creating large metal components relatively inexpensively and quickly.

New research led by Professor Hongbiao Dong from the University of Leicester's Department of Engineering has shown how to optimise this process to improve efficiency and reduce cost.

The research, a collaboration between the University of Leicester, Delft University of Technology, Diamond Light Source, University College Dublin and TATA Steel Research UK, was recently published in Nature Communications.

It explores the internal flow behaviour in additive manufacturing of metals and arc welding - the most widely used welding process in modern manufacturing.

The work focused on examining the melt pools that are created during the welding process.

To do this, the team inserted small tungsten and tantalum particles into the melt pool. Due to their high melting points, the particles remained solid in the melt pool long enough for them to be tracked using intense beams of X-rays.

The X-rays were generated using the synchrotron particle accelerator at Diamond Light Source, the UK's national facility for synchrotron light. Beamline I12 was selected for this research due to its specialised high-energy, high-speed imaging capability at thousands of frames per second.

Using Beamline I12, the researchers were able to create high-speed movies showing how surface tension affects the shape of the welding melt pool and its associated speed and patterns of flow. The results captured experimentally, for the first time, melt flow behaviour that had previously been seen only in computer simulations.
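Frame-to-frame particle tracking of this kind can be sketched with a simple nearest-neighbour linker: a detection in each X-ray frame is joined to the closest detection in the next frame, accepted only if the displacement is physically plausible. This is a minimal illustration with invented coordinates and threshold, not the researchers' actual analysis pipeline.

```python
import numpy as np

def link_tracks(frames, max_disp=5.0):
    """Link particle detections across frames by nearest-neighbour matching.

    frames: list of (N, 2) arrays of particle (x, y) positions, one per frame.
    Returns a list of tracks, each a list of successive positions.
    """
    tracks = [[np.asarray(p)] for p in frames[0]]
    for frame in frames[1:]:
        remaining = [np.asarray(p) for p in frame]
        for track in tracks:
            if not remaining:
                break
            # Distance from the track's last known position to each candidate.
            dists = [np.linalg.norm(track[-1] - p) for p in remaining]
            j = int(np.argmin(dists))
            if dists[j] <= max_disp:  # accept only plausible displacements
                track.append(remaining.pop(j))
    return tracks

# A single tracer particle drifting across three frames (toy data).
frames = [np.array([[0.0, 0.0]]), np.array([[1.2, 0.3]]), np.array([[2.1, 0.5]])]
tracks = link_tracks(frames)
print(len(tracks), len(tracks[0]))  # prints "1 3": one track spanning all frames
```

Real melt-pool tracking must also cope with particles appearing, disappearing, and crossing paths, which this sketch deliberately ignores.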

The results revealed that arc welding can be optimised by controlling the flow of the melt pool and changing the associated active elements on the surface.

Professor Dong said: "Understanding what happens to the liquid in melt pools during welding and metal-based additive manufacturing remains a challenge. The findings will help us design and optimise the welding and additive manufacturing processes to make components with improved properties at a reduced cost.

"Welding is the most economical and effective way to join metals permanently, and is a vital component of our manufacturing economy."

Dr Thomas Connolley, Principal Beamline Scientist for I12 at Diamond Light Source commented: "The I12 team was closely involved in the experiment. The beamline was designed with these challenging in-situ experiments in mind and I am very happy that we have helped advance understanding of additive manufacturing and welding, given their technological importance."

It is estimated that over 50% of global domestic and engineering products contain welded joints. In Europe, the welding industry has traditionally supported a diverse set of companies across the shipbuilding, pipeline, automotive, aerospace, defence and construction sectors. Revenue from welding equipment and consumable markets reached €3.5 billion in Europe in 2017.

The results will help with the future design and optimisation of welding and additive manufacturing processes, and will have an important and far-reaching impact.

Credit: 
University of Leicester

Poisons or medicines? Cyanobacteria toxins protect tiny lake dwellers from parasites

image: This is a microscope image of two Daphnia dentifera that fed on different diets and were exposed to a fungal pathogen. The top Daphnia fed on a nutritious green alga. She is larger and has several embryos in her brood chamber but is infected with a virulent fungal pathogen. The animal on the bottom fed on a toxic cyanobacterium. As a result, she is smaller and only has one developing embryo but is not infected with the pathogen (and, therefore, will live longer).

Image: 
Meghan Duffy

ANN ARBOR--The cyanobacteria blooms that plague western Lake Erie each summer are both an unsightly nuisance and a potential public health hazard, producing liver toxins that can be harmful to humans and their pets.

But the toxins produced in cyanobacteria blooms may also have protective effects on sand-grain-sized lake animals that ingest them, much as the toxins in milkweed plants protect monarch butterflies from parasites, according to a new study from University of Michigan ecologists.

The laboratory-based study shows that tiny, shrimp-like freshwater crustaceans called Daphnia can gain protection from fungal parasites by consuming toxins produced by bloom-forming cyanobacteria. Commonly known as water fleas, Daphnia play a key role in freshwater food webs and are a vital food source for many fish.

The U-M researchers plan follow-up studies to see if the protective effects they observed in the lab are also occurring in lakes. They'll also explore the potential of developing anti-fungal drugs for human use.

The study is scheduled for publication Jan. 15 in the journal Proceedings of the Royal Society B. The first author is Kristel Sánchez, who did the work for her master's thesis in the U-M Department of Ecology and Evolutionary Biology. Co-authors include her faculty advisers, U-M ecologists Mark Hunter and Meghan Duffy.

"This paper shows that Daphnia living in Michigan lakes can gain protection from fungal parasites through the toxins that are present in bloom-forming cyanobacteria," said Hunter, who has studied monarchs at U-M's Biological Station for more than a decade.

"This is an amazing aquatic parallel of the monarch butterflies, which gain protection from their parasites from the toxins in milkweed. It suggests that animal medication may be even more common than we thought, extending into the aquatic realm."

It's been known for decades that animals such as chimpanzees seek out medicinal herbs to treat their diseases. In recent years, the list of animal pharmacists has grown much longer, and it now appears that the practice of animal "self-medication" is a lot more widespread than scientists had previously known.

But most studies of animal self-medication have focused on terrestrial ecosystems. Relatively little is known about how an animal's diet serves as sources of medications in aquatic systems.

To be clear, the current study does not show that Daphnia are self-medicating in the wild. To demonstrate self-medication, researchers would have to show that Daphnia feed preferentially on toxin-producing cyanobacteria and algae to reduce their disease risk. The researchers will look for evidence of this type of selective foraging in the next phase of the study.

But the current study does provide strong evidence for the protective properties of plant toxins consumed by Daphnia.

"Fungal pathogens have devastating impacts on crops, wildlife, and even people," Duffy said. "To me, the idea that compounds for fighting those might be just below the surface of our local lakes is really exciting."

Sánchez raised Daphnia dentifera, a common herbivore in North American lakes, in the laboratory and fed individuals one of eight species of green algae or cyanobacteria (formerly known as blue-green algae). In the wild, Daphnia feed on algae and bacteria floating in the water column; they are part of a group of tiny aquatic animals known as zooplankton grazers.

Five species of green algae and three species of cyanobacteria were used, and they differed in their nutritional value and toxin production. Green algae tend to be more nutritious for Daphnia than toxin-producing cyanobacteria.

Cyanobacteria used in the study included one species of Microcystis, a group of common colony-forming cyanobacteria largely to blame for Lake Erie's annual summer blooms. Some Microcystis species produce liver toxins called microcystins. In her experiment, Sánchez added pure microcystin to some of the Microcystis cultures to approximate levels of the toxin commonly observed during cyanobacteria blooms.

The lab-reared Daphnia in the U-M study were then exposed to one common fungal parasite and one common bacterial parasite. In the wild, Daphnia become infected by consuming parasite spores in the water.

The researchers found that diet had striking effects on the infection prevalence of the fungal parasite, but not the bacterial parasite.

Two of the cyanobacteria diets--including the diet treatment in which pure microcystin was added to beakers containing Microcystis cultures--completely prevented fungal infections. The diet of Microcystis alone, without additional microcystin, resulted in very low levels of fungal infection.

Fungal infections were also completely prevented when Daphnia fed on the green alga Chlorella. In all, four of the nine dietary treatments showed strong evidence of anti-fungal activity.

"I thought I would probably get one diet out of the nine that showed some medicinal effect," Sánchez said. "The fact that nearly half of the diet treatments showed some sort of medicinal effect was really surprising and suggests that we are not paying enough attention to the mechanism of self-medication in aquatic ecosystems."

In contrast, none of the diet treatments prevented infection with the bacterial parasite. Also, uninfected Daphnia that consumed a diet of green algae produced two- to three-fold more offspring than those that ate cyanobacteria, confirming the higher nutritional quality of the green algal diets. And, in general, growth was reduced by parasite infection.

Sánchez plans to continue the work for her doctoral research project, with Duffy as her adviser. The researchers will look for evidence of self-medication and will further examine the anti-microbial properties of the three toxins that showed fungal-fighting activity in the initial study: microcystin-LR, anatoxin-a, and chlorellin. Researchers from Minnesota and Georgia will join the project.

"A lot of natural products research is done by screening tons of plant or algal material and using different bioassays and hoping you get lucky and find a new compound or something that is active," Sánchez said. "But why not look at what is happening already in nature? I think this is what really appealed to me and why I chose to work on this project."

The work has broad ecological implications, the researchers say, in part because a warming global climate is expected to increase the frequency of cyanobacteria blooms in some regions. That change could alter disease dynamics among Daphnia, which in turn "could have important consequences for lake food webs, as Daphnia are key grazers and disease outbreaks within Daphnia populations can influence ecosystem-level processes," the authors wrote.

Credit: 
University of Michigan

Difficulties with audiovisual processing contribute to dyslexia in children

BUFFALO, N.Y. - A University at Buffalo psychologist has published a neuroimaging study that could help develop tests for early identification of dyslexia, a disorder that affects 80 percent of those diagnosed with difficulties reading, writing and spelling.

Tasks that require audiovisual processing are especially challenging for children with dyslexia, according to Chris McNorgan, an assistant professor in UB's psychology department and project lead for the research published in the journal PLoS ONE.

Designing tests sensitive to the problem of audiovisual integration could determine the presence of a disorder that often goes undetected during the early years of elementary education since many children with dyslexia are considered, initially, as simply being on the lower end of a normal range of reading levels.

"Until these kids with dyslexia are lagging so far behind their peers, there's no way to reasonably assume that they're not part of a continuum of ability, but rather a separate group altogether," says McNorgan.

The study's results suggest that the reading difficulty associated with dyslexia stems from a lack of coordinated processing in the four brain areas known as "the reading network."

"We find that the organization of the brain outside of the core reading network does not appear to be related to how well or poorly dyslexic children read," says McNorgan, an expert in neuroimaging and computational modeling. "This is notable because it would be consistent with dyslexia as a problem related to the wiring specifically of the brain areas associated with integrating auditory and visual information, and not with some other general cognitive disruption, such as memory or attention."

Unlike much previous research on dyslexia that focused on the strength of connections in the reading network, McNorgan and his colleagues looked not only at that strength, but also the manner in which these regions are connected, a critical point in order to better understand dyslexia.

"To think of the 'manner' of connections as separate from their 'strength,' consider an analogy: a city planner trying to optimize traffic flow is probably not going to be successful by just dropping a multi-lane highway down the middle of a city if the neighborhoods and other city streets are not organized in a way that can take advantage of the extra traffic capacity.

"While connection strength is absolutely an important factor, our results indicate that it is only one of several components of the brain network that is optimized for fluent reading through practice."

In cases of dyslexia, there is no problem with how someone's eyes or how someone's ears work. But reading isn't about just what's seen and heard; it's a multisensory task that involves decoding letters into their associated speech sounds.

"Don't imagine someone as seeing words with scrambled letters or seeing letters upside down," explains McNorgan. "Dyslexia is about being unable to figure out how a particular sequence of letters fits together and then mapping that sequence to a particular sound."

Consider coming across a new word, like reading "Brobdingnagian" for the first time in Jonathan Swift's "Gulliver's Travels." The unfamiliarity requires a laborious effort to unpack the letters' sounds into what becomes the word.

"It's a struggle," says McNorgan. "And though even fluent readers occasionally encounter this difficulty, the exertion required to get the word is what happens all the time for people with dyslexia."

For the current study, McNorgan and UB graduate students Erica Edwards and Kali Burke, and Vanderbilt University collaborator James Booth used fMRI, a technology that measures and maps brain activity, to look at how the regions of the reading network connect and interact.

The 24 participants, ages 8-13, completed rhyming tasks under three conditions: seeing two words; hearing two words; and hearing the first word while seeing the second. The rhyming tasks required participants to map visual representations to sounds.

As the participants completed the tasks, fMRI scans revealed what brain regions were activated and how they were communicating.

"We're taking a brain network perspective," says McNorgan. "We want to learn not just what these brain areas are doing, but how these areas are talking to each other."

The goal, says McNorgan, is to determine whether or not the network's configuration is determining the degree to which dyslexic children experience reading difficulty.

"The way things are wired is going to make a big difference in how communication occurs within this network," he says. "And why some children's brains seem to be resistant to becoming optimally wired remains an outstanding question."

Credit: 
University at Buffalo

Pioneering surgery restores movement to children paralyzed by acute flaccid myelitis

image: Dr. Scott Wolfe and patient Kale Hyder

Image: 
Hospital for Special Surgery

An innovative and complex surgery involving nerve transfers is restoring hope and transforming lives torn apart by a mysterious and devastating illness. Acute flaccid myelitis, also known as AFM, strikes without warning, shows no mercy and frequently results in paralysis. Most affected patients are children, and nearly all have partial or complete loss of muscle function in their arms or legs.

Dr. Scott Wolfe, an orthopedic surgeon specializing in nerve injuries at Hospital for Special Surgery (HSS), has restored arm movement and function in a number of young AFM patients previously told their paralysis would be permanent.

The journal Pediatric Neurology published two AFM case studies by Dr. Wolfe and colleagues. The article was made available online in the summer of 2018, in advance of final print publication in November 2018. The report documents patients, ages 12 and 14, who had suffered partial paralysis and regained movement in their arms after nerve transfer surgery at HSS.

"We published the case studies to raise awareness in the medical community," said Dr. Wolfe. "Since the procedure is so highly specialized and performed by very few surgeons, most people, even doctors, are unaware that nerve transfer could potentially help AFM patients. But there is a window of opportunity, and the surgery should ideally be performed within six to nine months of disease onset."

Cases of acute flaccid myelitis, which is considered a subtype of transverse myelitis, have been on the rise in the United States since 2014, according to the Centers for Disease Control and Prevention. The illness, which is most common in children and teenagers, appears to occur after a viral infection. Within a day or two, inflammation within the spinal cord leads to muscle weakness and rapid, progressive paralysis of the arms and/or legs. In 2014, and again in 2018, a disturbing spike in cases was reported in certain regions of the United States.

Patients suffer different degrees of paralysis, and while some regain function, many suffer some degree of permanent paralysis. No nonsurgical treatment has been shown to be effective.

At age 15, Kale Hyder of Davenport, Iowa, was not only the picture of health, but an accomplished athlete. At 6-foot-2, the active teenager played on his school's basketball team and on an elite traveling team in the summer. One day in June 2015, he woke up with a stiff neck and asked his mother for a new pillow. By the next day, he could not move his arms and could barely stand up. His mother rushed him to the emergency room. Soon after, he was paralyzed from the chest down.

Kale was diagnosed with transverse myelitis. After months had passed, Kale and his family were shocked to receive a terrible prognosis. "We went to an excellent neurologist in the Chicago area, and he said, 'I'm so sorry, you're not going to regain function of your hands.' It was devastating, I'll never forget receiving that news," recalls Marcy Hyder, Kale's mother. "Here's my basketball player, sitting next to me in a wheelchair. Everything changed in the blink of an eye. When we got out of there, we all just sobbed."

At that time, Kale was receiving occupational therapy at Shriners Hospitals for Children in Chicago, and the therapist told them about another young girl in her care who had been helped by Dr. Wolfe at HSS. In March 2016, the Hyders made the trip to New York.

Although several doctors had indicated that Kale would require assistive devices and a caregiver to help with activities of daily living, Dr. Wolfe gave them another option. And he gave them hope.

Dr. Wolfe performed nerve transfers to each arm, followed by tendon transfer surgery a year later to restore function in Kale's hands and enable him to lift his arm over his head. Nerve transfer surgery entails taking all or part of a working nerve with a less important or redundant function and transferring it to restore function to one or more muscles that have been paralyzed. The delicate and painstaking surgery requires hours of muscle and nerve testing and meticulous planning by the surgeon and his team beforehand. It's not uncommon for Dr. Wolfe to work much of the day and into the night to plan the surgery. Performed under a microscope, each set of nerve transfer surgeries can take five to seven hours.

"We have been performing nerve transfers for patients with brachial plexus injuries, so it made sense to try it for AFM patients," Dr. Wolfe said. "But it's more challenging. Since the disease causes almost random patterns of muscle paralysis, there's no roadmap to follow and we have to come up with a creative solution for each patient. We take a full inventory of what's working and what's not working in each limb by checking each muscle. We look for donor nerves, so if two muscles move an elbow, for example, we can take one of the nerves controlling that function and use it to restore function to a hand or shoulder."

Marcy says she is extremely grateful to the Shriners occupational therapist and a physician there who told her about HSS and Dr. Wolfe. "We would never have known. We would have missed that window of opportunity for surgery. Without it, Kale would not be functioning as he is today," she says. "Think of everything you do with your hands; he couldn't do it. But he can now. What Dr. Wolfe did for him is absolutely amazing. He gave him his independence."

Now Marcy is on a mission to raise awareness that nerve transfer surgery could be an option for transverse myelitis and AFM patients, but they need to be evaluated sooner rather than later.

Kale, now a freshman at Johns Hopkins University, credits Dr. Wolfe with enabling him to follow his dreams. A pre-med major, he plans to be a neurologist and a researcher so he can help others. He attends classes, engages in volunteer work on campus, types on a keyboard, shops for food and does his own laundry.

"Dr. Wolfe has been a role model and I admire him for the way he works with patients. They have such serious conditions, yet he finds hope in every case and gives his patients hope," Kale says. "And he's a very great problem solver, the way he knows anatomy and physiology. Those problem-solving skills are something that I hope I can have at least half of, one day. It's crazy what he can do."

Credit: 
Hospital for Special Surgery

Democratizing data science

MIT researchers are hoping to advance the democratization of data science with a new tool for nonstatisticians that automatically generates models for analyzing raw data.

Democratizing data science is the notion that anyone, with little to no expertise, can do data science if provided ample data and user-friendly analytics tools. Supporting that idea, the new tool ingests datasets and generates sophisticated statistical models typically used by experts to analyze, interpret, and predict underlying patterns in data.

The tool currently lives on Jupyter Notebook, an open-source web framework that allows users to run programs interactively in their browsers. Users need only write a few lines of code to uncover insights into, for instance, financial trends, air travel, voting patterns, the spread of disease, and other trends.

In a paper presented at this week's ACM SIGPLAN Symposium on Principles of Programming Languages, the researchers show their tool can accurately extract patterns and make predictions from real-world datasets, and even outperform manually constructed models in certain data-analytics tasks.

"The high-level goal is making data science accessible to people who are not experts in statistics," says first author Feras Saad '15, MEng '16, a PhD student in the Department of Electrical Engineering and Computer Science (EECS). "People have a lot of datasets that are sitting around, and our goal is to build systems that let people automatically get models they can use to ask questions about that data."

Ultimately, the tool addresses a bottleneck in the data science field, says co-author Vikash Mansinghka '05, MEng '09, PhD '09, a researcher in the Department of Brain and Cognitive Sciences (BCS) who runs the Probabilistic Computing Project. "There is a widely recognized shortage of people who understand how to model data well," he says. "This is a problem in governments, the nonprofit sector, and places where people can't afford data scientists."

The paper's other co-authors are Marco Cusumano-Towner, an EECS PhD student; Ulrich Schaechtle, a BCS postdoc with the Probabilistic Computing Project; and Martin Rinard, an EECS professor and researcher in the Computer Science and Artificial Intelligence Laboratory.

Bayesian modeling

The work uses Bayesian modeling, a statistical method that continuously updates the probability of a variable as more information about that variable becomes available. For instance, statistician and writer Nate Silver uses Bayesian-based models for his popular website FiveThirtyEight. Leading up to a presidential election, the site's models make an initial prediction that one of the candidates will win, based on various polls and other economic and demographic data. This prediction is the variable. On Election Day, the model weighs incoming votes and other data against that initial prediction to continuously update the probability that the candidate will win.
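The continuous updating described above can be made concrete with a toy beta-binomial sketch (the numbers are invented, and this is not FiveThirtyEight's actual model): a Beta prior over a candidate's vote share, set from polling, is updated as each batch of counted votes arrives.

```python
# Toy beta-binomial updating: a Beta(a, b) prior over a candidate's vote
# share, updated as batches of counted votes arrive on election night.
def update(a, b, votes_for, votes_against):
    # Conjugate update: successes add to a, failures add to b.
    return a + votes_for, b + votes_against

def expected_share(a, b):
    # Posterior mean of the Beta(a, b) distribution.
    return a / (a + b)

a, b = 52, 48                          # prior from polls: ~52% expected share
print(round(expected_share(a, b), 3))  # prints 0.52
a, b = update(a, b, 700, 300)          # first batch of counted votes
print(round(expected_share(a, b), 3))  # prints 0.684: belief shifts toward data
```

The key Bayesian property is visible here: the same update rule applies no matter how many batches arrive, and each batch pulls the estimate toward the observed data in proportion to its size.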

More generally, Bayesian models can be used to "forecast" -- predict an unknown value in the dataset -- and to uncover patterns in data and relationships between variables. In their work, the researchers focused on two types of datasets: time-series, a sequence of data points in chronological order; and tabular data, where each row represents an entity of interest and each column represents an attribute.

Time-series datasets can be used to predict, say, airline traffic in the coming months or years. A probabilistic model crunches scores of historical traffic data and produces a time-series chart with future traffic patterns plotted along the line. The model may also uncover periodic fluctuations correlated with other variables, such as time of year.
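As a rough sketch of the time-series case (synthetic data and a deliberately simple model, not the researchers' tool), a trend-plus-seasonal decomposition of an airline-style monthly series can be fit and extrapolated like this:

```python
import numpy as np

# Synthetic monthly traffic: upward trend plus yearly seasonality.
months = np.arange(48)
series = 100 + 2.0 * months + 10 * np.sin(2 * np.pi * months / 12)

# Fit a linear trend, then average the residuals by month-of-year to
# estimate the repeating seasonal pattern.
trend = np.polyfit(months, series, 1)
resid = series - np.polyval(trend, months)
seasonal = np.array([resid[m::12].mean() for m in range(12)])

# Forecast the next month as extrapolated trend + seasonal component.
t = 48
forecast = np.polyval(trend, t) + seasonal[t % 12]
print(round(float(forecast), 1))
```

A probabilistic model would additionally report uncertainty around the forecast and could discover the seasonal period itself rather than assuming twelve months.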

On the other hand, a tabular dataset used for, say, sociological research, may contain hundreds to millions of rows, each representing an individual person, with variables characterizing occupation, salary, home location, and answers to survey questions. Probabilistic models could be used to fill in missing variables, such as predicting someone's salary based on occupation and location, or to identify variables that inform one another, such as finding that a person's age and occupation are predictive of their salary.
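A crude stand-in for that kind of conditional prediction on tabular data is group-conditional imputation: fill a missing salary with the average among rows sharing the same occupation and location. The rows below are invented for illustration, and a real probabilistic model would also quantify the uncertainty of each filled value.

```python
from collections import defaultdict

# Toy tabular data: each row is a person; one salary is missing (None).
rows = [
    {"occupation": "teacher",  "city": "Ann Arbor", "salary": 52000},
    {"occupation": "teacher",  "city": "Ann Arbor", "salary": 48000},
    {"occupation": "engineer", "city": "Ann Arbor", "salary": 95000},
    {"occupation": "teacher",  "city": "Ann Arbor", "salary": None},
]

def fill_salary(rows):
    # Group observed salaries by (occupation, city).
    groups = defaultdict(list)
    for r in rows:
        if r["salary"] is not None:
            groups[(r["occupation"], r["city"])].append(r["salary"])
    # Replace each missing salary with its group's mean.
    for r in rows:
        if r["salary"] is None:
            g = groups[(r["occupation"], r["city"])]
            r["salary"] = sum(g) / len(g)
    return rows

filled = fill_salary(rows)
print(filled[-1]["salary"])  # prints 50000.0, the mean of the two teacher salaries
```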

Statisticians view Bayesian modeling as a gold standard for constructing models from data. But Bayesian modeling is notoriously time-consuming and challenging. Statisticians first take an educated guess at the necessary model structure and parameters, relying on their general knowledge of the problem and the data. Using a statistical programming environment, such as R, a statistician then builds models, fits parameters, checks results, and repeats the process until they strike an appropriate tradeoff between model complexity and model quality.

The researchers' tool automates a key part of this process. "We're giving a software system a job you'd have a junior statistician or data scientist do," Mansinghka says. "The software can answer questions automatically from the data -- forecasting predictions or telling you what the structure is -- and it can do so rigorously, reporting quantitative measures of uncertainty. This level of automation and rigor is important if we're trying to make data science more accessible."

Bayesian synthesis

With the new approach, users write a line of code detailing the raw data's location. The tool loads that data and creates multiple probabilistic programs that each represent a Bayesian model of the data. All these automatically generated models are written in domain-specific probabilistic programming languages -- coding languages developed for specific applications -- that are optimized for representing Bayesian models for a specific type of data.

The tool works using a modified version of a technique called "program synthesis," which automatically creates computer programs given data and a language to work within. The technique is basically computer programming in reverse: Given a set of input-output examples, program synthesis works its way backward, filling in the blanks to construct an algorithm that produces the example outputs based on the example inputs.
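The "programming in reverse" idea can be illustrated with a toy enumerative synthesizer that searches a tiny space of one-step arithmetic programs and keeps those consistent with the input-output examples. This is a cartoon of program synthesis in general, not the Bayesian probabilistic synthesis the MIT tool performs.

```python
# Toy program synthesis: given input-output examples, search a small space
# of candidate programs and return the names of those that fit every example.
def synthesize(examples):
    candidates = [
        ("add %d" % k, lambda x, k=k: x + k) for k in range(-10, 11)
    ] + [
        ("mul %d" % k, lambda x, k=k: x * k) for k in range(-10, 11)
    ]
    return [name for name, f in candidates
            if all(f(x) == y for x, y in examples)]

# Which candidate programs map each input to its paired output?
print(synthesize([(1, 3), (2, 6), (5, 15)]))  # prints ['mul 3']
```

Real synthesizers search vastly larger, structured program spaces; the difference in the researchers' approach is that the synthesized programs are themselves probabilistic models of the data, and several are produced at once.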

The approach is different from ordinary program synthesis in two ways. First, the tool synthesizes probabilistic programs that represent Bayesian models for data, whereas traditional methods produce programs that do not model data at all. Second, the tool synthesizes multiple programs simultaneously, while traditional methods produce only one at a time. Users can pick and choose which models best fit their application.

"When the system makes a model, it spits out a piece of code written in one of these domain-specific probabilistic programming languages ... that people can understand and interpret," Mansinghka says. "For example, users can check if a time series dataset like airline traffic volume has seasonal variation just by reading the code -- unlike with black-box machine learning and statistics methods, where users have to trust a model's predictions but can't read it to understand its structure."

Probabilistic programming is an emerging field at the intersection of programming languages, artificial intelligence, and statistics. This year, MIT hosted the first International Conference on Probabilistic Programming, which had more than 200 attendees, including leading industry players in probabilistic programming such as Microsoft, Uber, and Google.

Credit: 
Massachusetts Institute of Technology