Tech

Study finds people who feed birds impact conservation

image: A dark-eyed junco, an American goldfinch, and a house finch feed on sunflower seeds on a snowy day. Bird watchers report that cold weather influences how much they feed birds, more so than time or money. Photo by Cynthia Raught.

Image: 
Virginia Tech

People in many parts of the world feed birds in their backyards, often due to a desire to help wildlife or to connect with nature. In the United States alone, over 57 million households feed backyard birds, spending more than $4 billion annually on bird food.

While researchers know that bird feeding can influence nature, they do not know how it influences the people who feed those birds.

"Given that so many people are so invested in attracting birds to their backyard, we were interested in what natural changes they observe at their feeders beyond simply more birds," said Ashley Dayer, an assistant professor in the Department of Fish and Wildlife Conservation in the College of Natural Resources and Environment at Virginia Tech. "In particular, we wanted to know how they respond to their observations. For example, how do they feel if they see sick birds at their feeders, and what actions do they take to address these observations?"

Researchers Ashley Dayer and Dana Hawley of Virginia Tech recently published their findings in People and Nature, a new journal published by the British Ecological Society.

The study was conducted in collaboration with researchers from the Cornell Lab of Ornithology and the Odum School of Ecology at the University of Georgia.

The researchers analyzed how people who feed birds notice and respond to natural events at their feeders by collaborating with Project FeederWatch, a program managed by the Cornell Lab of Ornithology that engages more than 25,000 people to observe and collect data on their backyard birds.

Using a survey of 1,176 people who feed birds and record their observations of birds in the Project FeederWatch database, the researchers found that most people noticed natural changes in their backyards that could be due to feeding, including an increase in the number of birds at their feeders, a cat or hawk near their feeders, or a sick bird at their feeders.

"More and more, we see that humans are interacting less with nature and that more of our wildlife are being restricted to areas where there are humans around. Looking at how humans react to and manage wildlife in their own backyards is very important for the future of wildlife conservation and for understanding human well-being as the opportunities for people to interact with wildlife become more restricted to backyard settings," said Hawley, an associate professor in the Department of Biological Sciences in the College of Science. Hawley's research program at Virginia Tech focuses on wildlife disease ecology and evolution.

"From my 17 years working with people who feed birds as part of citizen science projects, I've heard a great deal about their impactful observations at their feeders," said co-author David Bonter, director of Citizen Science at the Cornell Lab of Ornithology. "This study provides important information about the breadth and pattern of these experiences through responses of over 1,000 participants. The findings will help us at Project Feederwatch improve how we work with bird watchers toward our shared goal of bird conservation."

People who feed birds also responded to what they observed, particularly to cats at their feeders, by scaring off the cats, moving feeders, or providing shelter for birds. When observing sick birds, most people cleaned their feeders. When observing more birds, people often responded by providing more food. Fewer people acted in response to seeing hawks; the most common response was providing shelter for the feeder birds. These human responses were, in some cases, tied to people's emotions about their observations, particularly anger. While cats near feeders most commonly evoked anger, sick birds led to sadness or worry. Emotions in response to hawks were more varied.

"Feeding wild birds is a deceptively commonplace activity. Yet, it is one of the most intimate, private, and potentially profound forms of human interaction with nature. This perceptive study uncovers some of the remarkable depth associated with bird feeding and discerns that people who feed birds are alert to a wide range of additional natural phenomena," said Darryl Jones, a professor at the Environmental Futures Research Institute and School of Environment and Sciences at Griffith University in Australia, who was not connected to the study.

One surprising result was that when deciding how much to feed birds, people prioritized natural factors, such as cold weather, more than time and money. Most people believed that the effects of their feeding on wild birds were primarily good, even though many observed and took action in response to natural events in their backyard that could impact the health of the birds and might partly result from their feeding.

"Overall, our results suggest that people who feed birds observe aspects of nature and respond in ways that may affect outcomes of feeding on wild birds. More work is needed to fully understand the positive and negative effects of feeding on wild birds and, thereby, the people who feed them," said Dayer, whose research focuses on the human dimensions of wildlife conservation, applying social science to understand human behavior related to wildlife.

Credit: 
Virginia Tech

NASA finds heavy rainfall around Tropical Cyclone Joaninha's center

image: On March 24, 2019, NASA's IMERG calculated heavy rain falling around Tropical Cyclone Joaninha's center at a rate of 0.75 inches (19 mm) per hour (dark red).

Image: 
JAXA/NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA calculated the rainfall rates occurring in Tropical Cyclone Joaninha as it moved through the open waters of the Southern Indian Ocean.

Data from the Global Precipitation Measurement mission (GPM) were used to estimate rainfall totals and provided a look at the rain Joaninha was generating.

GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency (JAXA). GPM also utilizes a constellation of other satellites to provide a global analysis of precipitation that is used in the IMERG calculation. NASA's Integrated Multi-satellitE Retrievals for GPM (IMERG) data were used to estimate the rate at which rain was falling in Joaninha on March 24. Rainfall rates were often greater than 0.75 inches (19 mm) per hour around the center.
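As a quick sanity check of the reported rainfall-rate conversion (0.75 inches per hour corresponding to about 19 mm per hour), one inch is exactly 25.4 mm:

```python
# Verify the release's inch-to-millimeter rainfall-rate conversion.
IN_TO_MM = 25.4  # exact by definition

def inches_to_mm(inches):
    """Convert a rainfall depth (or per-hour rate) from inches to millimeters."""
    return inches * IN_TO_MM

rate_mm = inches_to_mm(0.75)
print(rate_mm)  # about 19.05 mm/hr, consistent with the reported 19 mm
```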

What Is IMERG?

At NASA's Goddard Space Flight Center in Greenbelt, Maryland, those data are incorporated into NASA's IMERG or Integrated Multi-satellitE Retrievals for GPM. IMERG is used to estimate precipitation from a combination of passive microwave sensors, including the Global Precipitation Measurement (GPM) mission's core satellite's GMI microwave sensor and geostationary IR (infrared) data. IMERG real-time data are generated by NASA's Precipitation Processing System every half hour and are normally available within six hours.

IMERG creates a merged precipitation product from the GPM constellation of satellites. These satellites include DMSPs from the U.S. Department of Defense, GCOM-W from the Japan Aerospace Exploration Agency (JAXA), Megha-Tropiques from the Centre National d'Études Spatiales (CNES) and Indian Space Research Organisation (ISRO), the NOAA series from the National Oceanic and Atmospheric Administration (NOAA), Suomi-NPP from NOAA-NASA, and MetOps from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). All of the instruments (radiometers) onboard the constellation partners are inter-calibrated with information from the GPM Core Observatory's GPM Microwave Imager (GMI) and Dual-frequency Precipitation Radar (DPR).

Tropical Cyclone Joaninha's Status

On March 25 at 11 a.m. EDT (1500 UTC), Joaninha had maximum sustained winds near 105 knots (121 mph/195 kph). It was centered near 18.3 degrees south latitude and 62.9 degrees east longitude, about 335 miles east-northeast of Port Louis, Mauritius. Joaninha was moving to the southeast and is expected to continue in that general direction.

A tropical cyclone warning class 4 remains in force at Rodrigues. At 11 a.m. EDT (1500 UTC), wind gusts of 87 knots (100 mph/161 kph) were observed and were expected to increase as the cyclone passes.

The Joint Typhoon Warning Center expects Joaninha to continue moving southeast while intensifying to 110 knots (126 mph/204 kph) by March 26 as it passes close to Rodrigues Island. After three days, the cyclone will start to weaken as conditions deteriorate.
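The wind speeds quoted above can be reproduced with a small conversion helper; one knot is exactly 1.852 km/h and one mile is exactly 1.609344 km. Rounded results can differ by a unit or so from the quoted figures depending on rounding conventions:

```python
# Reproduce the knots -> mph/kph conversions quoted in the storm update.
KT_TO_KPH = 1.852        # exact: 1 knot = 1.852 km/h
MPH_TO_KPH = 1.609344    # exact: 1 mph = 1.609344 km/h

def knots_to_mph_kph(knots):
    """Return (mph, kph) for a wind speed given in knots."""
    kph = knots * KT_TO_KPH
    return kph / MPH_TO_KPH, kph

mph, kph = knots_to_mph_kph(105)
# Prints roughly 121 mph and 194 kph; the release quotes 121 mph/195 kph,
# so its kph figure is rounded slightly differently.
print(round(mph), round(kph))
```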

Credit: 
NASA/Goddard Space Flight Center

Model learns how individual amino acids determine protein function

A machine-learning model from MIT researchers computationally breaks down how segments of amino acid chains determine a protein's function, which could help researchers design and test new proteins for drug development or biological research.

Proteins are linear chains of amino acids, connected by peptide bonds, that fold into exceedingly complex three-dimensional structures, depending on the sequence and physical interactions within the chain. That structure, in turn, determines the protein's biological function. Knowing a protein's 3-D structure, therefore, is valuable for, say, predicting how proteins may respond to certain drugs.

However, despite decades of research and the development of multiple imaging techniques, we know only a very small fraction of possible protein structures -- tens of thousands out of millions. Researchers are beginning to use machine-learning models to predict protein structures based on their amino acid sequences, which could enable the discovery of new protein structures. But this is challenging, as diverse amino acid sequences can form very similar structures. And there aren't many structures on which to train the models.

In a paper being presented at the International Conference on Learning Representations in May, the MIT researchers develop a method for "learning" easily computable representations of each amino acid position in a protein sequence, initially using 3-D protein structure as a training guide. Researchers can then use those representations as inputs that help machine-learning models predict the functions of individual amino acid segments -- without ever again needing any data on the protein's structure.

In the future, the model could be used for improved protein engineering, by giving researchers a chance to better zero in on and modify specific amino acid segments. The model might even steer researchers away from protein structure prediction altogether.

"I want to marginalize structure," says first author Tristan Bepler, a graduate student in the Computation and Biology group in the Computer Science and Artificial Intelligence Laboratory (CSAIL). "We want to know what proteins do, and knowing structure is important for that. But can we predict the function of a protein given only its amino acid sequence? The motivation is to move away from specifically predicting structures, and move toward [finding] how amino acid sequences relate to function."

Joining Bepler is co-author Bonnie Berger, the Simons Professor of Mathematics at MIT with a joint faculty position in the Department of Electrical Engineering and Computer Science, and head of the Computation and Biology group.

Learning from structure

Rather than predicting structure directly -- as traditional models attempt -- the researchers encode predicted structural information directly into representations. To do so, they use known structural similarities of proteins to supervise their model as it learns the functions of specific amino acids.

They trained their model on about 22,000 proteins from the Structural Classification of Proteins (SCOP) database, which contains thousands of proteins organized into classes by similarities of structures and amino acid sequences. For each pair of proteins, they calculated a real similarity score, meaning how close they are in structure, based on their SCOP class.

The researchers then fed their model random pairs of protein structures and their amino acid sequences, which were converted into numerical representations called embeddings by an encoder. In natural language processing, embeddings are vectors of several hundred numbers that represent a letter or word in a sentence: the more similar two embeddings are, the more likely those letters or words are to appear together in a sentence.

In the researchers' work, each embedding in the pair contains information about how similar each amino acid sequence is to the other. The model aligns the two embeddings and calculates a similarity score to then predict how similar their 3-D structures will be. Then, the model compares its predicted similarity score with the real SCOP similarity score for their structure, and sends a feedback signal to the encoder.

Simultaneously, the model predicts a "contact map" for each embedding, which basically says how far away each amino acid is from all the others in the protein's predicted 3-D structure -- essentially, do they make contact or not? The model also compares its predicted contact map with the known contact map from SCOP, and sends a feedback signal to the encoder. This helps the model better learn where exactly amino acids fall in a protein's structure, which further informs each amino acid's function.

Basically, the researchers train their model by asking it to predict if paired sequence embeddings will or won't share a similar SCOP protein structure. If the model's predicted score is close to the real score, it knows it's on the right track; if not, it adjusts.
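The paired-embedding training signal described above can be sketched in a toy form. Everything in this snippet is invented for illustration: the "encoder" is just a random lookup table standing in for the real neural network, the alignment is a crude best-match average rather than the paper's alignment scheme, and the target similarity score is made up:

```python
# Toy sketch (NOT the authors' actual model) of pairwise similarity training:
# embed two sequences, align per-residue embeddings, predict a similarity
# score, and compare it with a structure-derived target to get a loss.
import math
import random

random.seed(0)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
DIM = 8
# Stand-in "encoder": a fixed random vector per residue. A real model would
# learn context-dependent embeddings from the whole sequence.
EMBED = {aa: [random.gauss(0, 1) for _ in range(DIM)] for aa in AMINO_ACIDS}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def predicted_similarity(seq_a, seq_b):
    """For each residue in seq_a, find its best cosine match in seq_b,
    then average -- a crude stand-in for a learned soft alignment."""
    ea = [EMBED[a] for a in seq_a]
    eb = [EMBED[b] for b in seq_b]
    best_matches = [max(cosine(u, v) for v in eb) for u in ea]
    return sum(best_matches) / len(best_matches)

# Made-up structure-derived target (e.g., from shared SCOP class membership).
target = 0.8
pred = predicted_similarity("ACDKLM", "ACDRLM")
loss = (pred - target) ** 2  # the feedback signal that would update the encoder
print(f"predicted={pred:.2f} loss={loss:.2f}")
```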

Protein design

In the end, for one inputted amino acid chain, the model will produce one numerical representation, or embedding, for each amino acid position in a 3-D structure. Machine-learning models can then use those sequence embeddings to accurately predict each amino acid's function based on its predicted 3-D structural "context" -- its position and contact with other amino acids.

For instance, the researchers used the model to predict which segments, if any, pass through the cell membrane. Given only an amino acid sequence, the researchers' model predicted all transmembrane and non-transmembrane segments more accurately than state-of-the-art models.

Next, the researchers aim to apply the model to more prediction tasks, such as figuring out which sequence segments bind to small molecules, which is critical for drug development. They're also working on using the model for protein design. Using their sequence embeddings, they can predict, say, at what color wavelengths a protein will fluoresce.

"Our model allows us to transfer information from known protein structures to sequences with unknown structure. Using our embeddings as features, we can better predict function and enable more efficient data-driven protein design," Bepler says. "At a high level, that type of protein engineering is the goal."

Berger adds: "Our machine learning models thus enable us to learn the 'language' of protein folding -- one of the original 'Holy Grail' problems -- from a relatively small number of known structures."

Credit: 
Massachusetts Institute of Technology

Matter waves and quantum splinters

Physicists in the United States, Austria and Brazil have shown that shaking ultracold Bose-Einstein condensates (BECs) can cause them to either divide into uniform segments or shatter into unpredictable splinters, depending on the frequency of the shaking.

"It's remarkable that the same quantum system can give rise to such different phenomena," said Rice University physicist Randy Hulet, co-author of a study about the work published online today in the journal Physical Review X. Hulet's lab conducted the study's experiments using lithium BECs, tiny clouds of ultracold atoms that march in lockstep as if they are a single entity, or matter wave. "The relationship between these states can teach us a great deal about complex quantum many-body phenomena."

The research was conducted in collaboration with physicists at Austria's Vienna University of Technology (TU Wien) and Brazil's University of São Paulo at São Carlos.

The experiments hark back to Michael Faraday's 1831 discovery that patterns of ripples are created on the surface of a fluid in a bucket shaken vertically at certain critical frequencies. The patterns, known as Faraday waves, are similar to resonant modes created on drumheads and vibrating plates.

To investigate Faraday waves, the team confined BECs to a linear one-dimensional waveguide, resulting in a cigar-shaped BEC. The researchers then shook the BECs using a weak, slowly oscillating magnetic field to modulate the strength of interactions between atoms in the 1D waveguide. The Faraday pattern emerged when the frequency of modulation was tuned near a collective mode resonance.

But the team also noticed something unexpected: When the modulation was strong and the frequency was far below a Faraday resonance, the BEC broke into "grains" of varying size. Rice research scientist Jason Nguyen, lead co-author of the study, found the grain sizes were broadly distributed and persisted for times even longer than the modulation time.

"Granulation is usually a random process that is observed in solids such as breaking glass, or the pulverizing of a stone into grains of different sizes," said study co-author Axel Lode, who holds joint appointments at both TU Wien and the Wolfgang Pauli Institute at the University of Vienna.

Images of the quantum state of the BEC were identical in each Faraday wave experiment. But in the granulation experiments the pictures looked completely different each time, even though the experiments were performed under identical conditions.

Lode said the variation in the granulation experiments arose from quantum correlations -- complicated relationships between quantum particles that are difficult to describe mathematically.

"A theoretical description of the observations proved challenging because standard approaches were unable to reproduce the observations, particularly the broad distribution of grain sizes," Lode said. His team helped interpret the experimental results using a sophisticated theoretical method, and its implementation in software, which accounted for quantum fluctuations and correlations that typical theories do not address.

Hulet, Rice's Fayez Sarofim Professor of Physics and Astronomy, and a member of the Rice Center for Quantum Materials (RCQM), said the results have important implications for investigations of turbulence in quantum fluids, an unsolved problem in physics.

Credit: 
Rice University

Breast cancer may be likelier to spread to bone with nighttime dim-light exposure

NEW ORLEANS--Exposure to dim light at night, which is common in today's lifestyle, may contribute to the spread of breast cancer to the bones, researchers have shown for the first time in an animal study. Results of the study will be presented Saturday at ENDO 2019, the Endocrine Society's annual meeting in New Orleans, La.

"To date, no one has reported that exposure to dim light at night induces circadian disruption, which then increases the formation of bone metastatic breast cancer," said Muralidharan Anbalagan, Ph.D., assistant professor, Tulane University School of Medicine in New Orleans, La.". "This is important, as many patients with breast cancer are likely exposed to light at night as a result of lack of sleep, stress, excess light in the bedroom from mobile devices and other sources, or night shift work."

More than 150,000 U.S. women had breast cancer in 2017 that metastasized, or spread outside the breast, according to an estimate from the National Cancer Institute. When breast cancer spreads, it often goes to the bones, where it can cause severe pain and fragile bones.

In this preliminary study, funded by the Louisiana Clinical and Translational Science Center (LACATS) in collaboration with the Louisiana Cancer Research Consortium (LCRC) and the Tulane Center for Circadian Biology, the researchers created a mouse model of bone metastatic breast cancer. They injected estrogen receptor-positive human breast cancer cells that have a low propensity to grow in bone into the tibia, or shinbone, of female mice. Like humans, the mice used in this study produce a strong nighttime circadian melatonin signal. This nighttime melatonin signal has been shown to produce strong anti-cancer actions and also promotes sleep.

All mice were kept in the light for 12 hours each day. One group of three mice was in the dark the other 12 hours, which helped them produce high levels of endogenous melatonin. Another group spent 12 hours in light followed by 12 hours in dim light at night, which suppresses their nocturnal melatonin production. The dim light was 0.2 lux, which is less than a night-light or a display light from a cell phone, according to Anbalagan.

X-ray images showed that mice exposed to a light/dim light cycle had much larger tumors and increased bone damage compared with mice kept in a standard light/dark cycle, he reported.

"Our research identified the importance of an intact nocturnal circadian melatonin anti-cancer signal in suppressing bone-metastatic breast tumor growth," Anbalagan said.

The ultimate goal of their research, he said, is to find a way to inhibit or suppress the progression of breast cancer metastases to bone.

Credit: 
The Endocrine Society

For migraine sufferers with obesity, losing weight can decrease headaches

NEW ORLEANS--For migraine sufferers with obesity, losing weight can decrease headaches and improve quality of life, researchers from Italy and the United States report. The results of their meta-analysis will be presented Saturday, March 23 at ENDO 2019, the Endocrine Society's annual meeting in New Orleans, La.

"If you suffer from migraine headaches and are obese, losing weight will ameliorate the quality of your family and social life as well as your work and school productivity. Your overall quality of life will greatly improve," said lead study author Claudio Pagano, M.D., Ph.D., an associate professor of internal medicine at the University of Padova in Padova, Italy.

"Weight loss in adults and children with obesity greatly improves migraine headache by improving all the main features that worsen migraineurs' quality of life," he added. "When people lose weight, the number of days per month with migraine decreases, as does pain severity and headache attack duration."

To investigate the effects of weight loss achieved through bariatric surgery or behavioral intervention on migraine frequency and severity, Pagano and his colleagues reviewed standard online medical research databases for studies that considered pain intensity, headache frequency, attack duration, disability, BMI, BMI change, intervention (bariatric surgery versus behavioral), and population (adult versus pediatric).

In a meta-analysis of the 473 patients in the 10 studies that met the researchers' inclusion criteria, they found that weight loss was linked with statistically significant reductions in headache frequency, pain intensity, and disability.

Migraine improvement was not linked with either degree of obesity at baseline or amount of weight reduction. Also, the effect on migraine was similar when weight reduction was achieved through bariatric surgery or behavioral intervention and was comparable in adults and children.

"Weight loss reduces the impact of conditions associated with obesity, including diabetes, hypertension, coronary heart disease, stroke and respiratory diseases," Pagano said. "Obesity and migraine are common in industrialized countries. Improving quality of life and disability for these patients will greatly impact these populations and reduce direct and indirect healthcare costs."

The mechanisms linking obesity, weight loss and migraine headache remain unclear, according to the authors, but they may include alterations in chronic inflammation, adipocytokines, obesity comorbidities, and behavioral and psychological risk factors.

Credit: 
The Endocrine Society

Researchers find method to prioritize treatment strategies in hepatitis C in US prisons

Key takeaway: New research identifies guidelines for prioritizing the treatment of hepatitis C infections in the U.S. prison system, resulting in a significant improvement in health outcomes for incarcerated persons while simultaneously reducing new infections in the community at large.

CATONSVILLE, MD, March 21, 2019 - There are currently more than three million people in the United States with hepatitis C, a condition that can lead to serious and even deadly liver complications. In the U.S. prison system, the prevalence of hepatitis C virus (HCV) infection is currently 10 times higher than the national average. And while new HCV treatment drugs are very effective, their high cost, combined with very limited prison healthcare budgets, impedes universal treatment. However, new research in the INFORMS journal Operations Research has identified new protocols that could substantially decrease HCV infection in the U.S. prison system.

The study, "Prioritizing Hepatitis C Treatment in U.S. Prisons," was conducted by Turgay Ayer of the Georgia Institute of Technology, Can Zhang of Duke University, Anthony Bonifonte of Denison University, Anne C. Spaulding of Emory University, and Jagpreet Chhatwal of Harvard Medical School.

While the prevalence of HCV infection in the general population is only 1-2 percent, within the prison system the prevalence of antibodies to hepatitis C jumps to 17 percent. This is primarily due to the fact that many HCV-infected people are current or past injection drug users (IDU). Nearly 80 percent of all HCV transmissions are IDU-related transmissions, and most Americans who inject drugs have been incarcerated at some point during their lives.

Currently the biggest barrier to treating persons in prison with HCV is that while the newest medications have a higher than 95 percent cure rate (versus a 50 percent cure rate of previous treatments), the cost of treatment is outrageously high. When the new treatments were approved in 2015, their cost was $84,000 per treatment course. Since then, the prices have come down to around $25,000. However, even at this price, treating incarcerated persons could cost $3.3 billion. Because the healthcare budget is very limited, only 1-13 percent of HCV-infected persons in prison receive treatment currently.

Because of the cost/budget constraints, prisons often prioritize patients for HCV treatment. The current approach emphasizes liver disease stage and often ignores other factors such as a patient's risk of transmitting the virus or age. The study's authors identified a new protocol to prioritize treatment among HCV-infected persons in the prison population to optimize the effect of HCV treatment on society's overall well-being. Their solution systematically considers factors including liver health state, remaining sentence length, propensity to inject drugs, age, disease progression over time, and reinfection rates.
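To make the idea of multi-factor prioritization concrete, here is a purely hypothetical scoring rule. The weights and thresholds below are invented for illustration only; the study derives its policy from a formal optimization model, not a fixed formula like this:

```python
# Hypothetical illustration of multi-factor treatment prioritization.
# All weights/thresholds are invented; they are not from the paper.

def priority_score(fibrosis_stage, years_left, idu, age):
    """Higher score = treat sooner. fibrosis_stage in 0..4 (F0-F4),
    years_left = remaining sentence in years, idu = injection drug use."""
    score = 2.0 * fibrosis_stage           # sicker livers first
    score += 1.5 if idu else 0.0           # higher transmission risk
    score += 1.0 if years_left < 2 else 0.0  # treat before release to limit community spread
    score += 0.5 if age < 40 else 0.0      # more life-years gained by curing earlier
    return score

patients = [
    ("A", priority_score(4, 5, False, 55)),   # advanced fibrosis
    ("B", priority_score(2, 1, True, 35)),    # IDU, releasing soon, young
    ("C", priority_score(1, 10, False, 60)),  # mild disease, long sentence
]
for name, score in sorted(patients, key=lambda p: -p[1]):
    print(name, score)
```

Under these invented weights, patient A (advanced disease) still ranks first, but patient B's transmission risk and imminent release nearly close the gap, which is the kind of trade-off the authors' policy formalizes.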

"We found that by simultaneously considering health state, remaining sentence length, IDU status, and age in prioritization, decisions can lead to a significant decrease in hepatitis C-caused mortality and infections both in correctional health systems and in the community," said Ayer.

This new system offers an alternative to the current controversial patient prioritization protocol, which focuses on liver status, or the level of scarring on the patient's liver. "Ideally, prisons would be allocated enough resources to treat everyone infected," said Spaulding, a public health physician-researcher, who has been working with incarcerated persons living with HCV since 1996. "In the meantime, this algorithm is designed to maximize the public health outcome of treatment."

"Due to the simplicity in implementing prioritization policy, our work is appealing to multiple stakeholders within the U.S. prison system, including medical directors and policy makers at the prisons," continued Ayer. "Ultimately, by reducing the prevalence of hepatitis C in the prison population, we are reducing the chances of persons spreading the disease in the general population once they return to society."

Credit: 
Institute for Operations Research and the Management Sciences

Ankle exoskeleton fits under clothes for potential broad adoption

image: The new ankle exoskeleton design integrates into the shoe and under clothing.

Image: 
Matthew Yandell

A new lightweight, low-profile and inexpensive ankle exoskeleton could be widely used among elderly people, those with impaired lower-leg muscle strength and workers whose jobs require substantial walking or running.

Developed by Vanderbilt mechanical engineers, the device is believed to be the first ankle exoskeleton that could be worn under clothes without restricting motion. It does not require additional components such as batteries or actuators carried on the back or waist.

The study, published online by IEEE Transactions on Neural Systems and Rehabilitation Engineering, builds on a successful and widely cited ankle exoskeleton concept from other researchers in 2015.

"We've shown how an unpowered ankle exoskeleton could be redesigned to fit under clothing and inside/under shoes so it more seamlessly integrates into daily life," said Matt Yandell, a mechanical engineering Ph.D. student and lead author of the study.

In a significant design advancement, the team invented an unpowered friction clutch mechanism that fits under the foot or shoe and is no thicker than a typical shoe insole. The complete device, which includes a soft shank sleeve and assistive spring, weighs just over one pound.

The unpowered ankle exoskeleton costs less than $100 to fabricate, without factoring in optimized design for manufacturing and economies of scale.

"Our design is lightweight, low profile, quiet, uses no motor or batteries, it is low cost to manufacture, and naturally adapts to different walking speeds to assist the ankle muscles," said Karl Zelik, assistant professor of mechanical engineering and senior author on the study.

Zelik will be presenting this work next week at the Wearable Robotics Association Conference in Phoenix, Arizona.

The potential applications are broad, from helping aging people stay active to assisting recreational walkers, hikers or runners, he said.

"It could also help reduce fatigue in occupations that involve lots of walking, such as postal and warehouse workers, and soldiers in the field," Zelik said.

Joshua Tacca, BE'18, is also a co-author. He is now a graduate student in the Integrative Physiology Department at the University of Colorado Boulder. Several other Vanderbilt undergraduate engineering students also contributed to the device design and pilot testing.

Credit: 
Vanderbilt University

Financial incentives didn't improve response rates to mailed colorectal cancer screening tests

Bottom Line: Financial incentives didn't increase completion rates of colorectal cancer screening tests mailed to patients. In a randomized clinical trial of almost 900 patients, none of the incentives (an unconditional $10; a promised $10 upon completion of the fecal immunochemical test, or FIT, kit to test for blood in a stool sample; or a chance at a lottery with a 1-in-10 chance of winning $100) was statistically better than no financial incentive at enticing patients to complete the FIT. The overall FIT completion rate at six months was nearly 29 percent, but the incentives used in this study may have been too small to improve response rates.
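One detail worth noting about the trial design: the lottery arm was matched in expected value to the fixed-payment arms, as a quick back-of-envelope check shows:

```python
# Expected value of the lottery incentive: a 1-in-10 chance of winning $100
# is worth $10 on average, matching the two fixed $10 incentive arms.
prize_dollars = 100
odds = 10  # 1-in-10 chance of winning
expected_value = prize_dollars / odds
print(expected_value)  # 10.0 -- the same average payout as the $10 arms
```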

Authors: Shivan J. Mehta, M.D., M.B.A., M.S.H.P., Perelman School of Medicine, University of Pennsylvania, Philadelphia, and co-authors.

(doi:10.1001/jamanetworkopen.2019.1156)

Editor's Note: The article contains conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Squishing blood stem cells could facilitate harvest for transplants

image: How deformable cells are, and thus how stiff or squishy they are, plays an important role in retaining blood-forming stem cells in their marrow niches and thus preserving their long-term repopulation capabilities.

Image: 
From Ni et al <i>Cell Stem Cell</i> (2019)

Scientists at Winship Cancer Institute of Emory University, Children's Healthcare of Atlanta and Georgia Tech have found that modulating blood-forming stem cells' stiffness could possibly facilitate mobilization procedures used for stem cell-based transplants.

Temporary squishiness could help drive blood-forming stem cells out of the bone marrow and into the blood, but the cells need to be stiff to stay put and replenish the blood and immune system, the researchers have found. The results from animal research were published on March 14 in the journal Cell Stem Cell.

How deformable cells are, and thus how stiff or squishy they are, plays an important role in retaining blood-forming stem cells in their marrow niches and thus preserving their long-term repopulation capabilities, says lead author Cheng-Kui Qu, MD, PhD. The research provides insights into how alterations in blood stem cell biomechanics can be associated with certain blood disorders, including leukemias.

Bone marrow transplants, as part of a treatment strategy for cancer, don't usually involve physically extracting bone marrow. Instead, doctors use a drug (G-CSF) that encourages blood-forming stem cells to leave the bone marrow and enter the blood, because it generally gives a higher yield. However, that is not the case for about a third of patients, for whom mobilization is insufficient. Qu says one of the experiments in the paper was a "proof-of-concept" for a strategy that could supplement conventional approaches.

Qu is professor of pediatrics at Emory University School of Medicine, Winship Cancer Institute and Aflac Cancer and Blood Disorders Center, Children's Healthcare of Atlanta. The first author of the paper is postdoctoral fellow Fang Ni, MD, PhD.

Qu and his colleagues were studying an enzyme, Ptpn21, which is highly expressed in blood stem cells and helps reshape parts of a cell's internal skeleton. The scientists generated mice without Ptpn21 and found that the bone marrow of the mutant mice contained fewer stem cells and early progenitor cells. In addition, blood-forming stem cells tended to sit twice as far from the niches where they usually reside.

The mutant mice were very sensitive to chemotherapy drugs, but it was also easier to spur blood stem cells out of their bone marrow. These observations suggested deformability as an explanation. Blood stem cells from mutant mice could more easily squeeze through narrow pores.

"Our initial observations led to a wonderful collaboration with the Lam and Sulchek labs," Qu says.

Qu approached Wilbur Lam and Todd Sulchek, biomedical engineers who are experts on studying the mechanical characteristics of cells. The Ptpn21-mutant cells were indeed squishier, and the scientists were able to measure exactly how much.

Qu's lab performed additional experiments to pin down how the loss of Ptpn21 affects cell deformability. They found they could make cells lacking Ptpn21 stiff again by interfering with the function of another protein, Septin1. In addition, they showed that treating normal mice with blebbistatin, which interferes with parts of a cell's internal skeleton, also results in mobilization of stem cells into the blood. Qu cautions that blebbistatin may also be having systemic effects on the mice.

"Our findings are that normal blood-forming stem cells are stiffer and less deformable than differentiated blood cells," Qu says. "This helps us better understand the pathogenesis of blood disorders associated with loss of stem cell quiescence. In addition, our findings suggest that cell biomechanics can be leveraged to improve current mobilization regimens for stem cell-based therapy."

Credit: 
Emory Health Sciences

Energy monitor can find electrical failures before they happen

A new system devised by researchers at MIT can monitor the behavior of all electric devices within a building, ship, or factory, determining which ones are in use at any given time and whether any are showing signs of an imminent failure. When tested on a Coast Guard cutter, the system pinpointed a motor with burnt-out wiring that could have led to a serious onboard fire.

The new sensor, whose readings can be monitored on an easy-to-use graphic display called a NILM (non-intrusive load monitoring) dashboard, is described in the March issue of IEEE Transactions on Industrial Informatics, in a paper by MIT professor of electrical engineering Steven Leeb, recent graduate Andre Aboulian MS '18, and seven others at MIT, the U.S. Coast Guard, and the U.S. Naval Academy. A second paper will appear in the April issue of Marine Technology, the publication of the Society of Naval Architects and Marine Engineers.

The system uses a sensor that simply is attached to the outside of an electrical wire at a single point, without requiring any cutting or splicing of wires. From that single point, it can sense the flow of current in the adjacent wire, and detect the distinctive "signatures" of each motor, pump, or piece of equipment in the circuit by analyzing tiny, unique fluctuations in the voltage and current whenever a device switches on or off. The system can also be used to monitor energy usage, to identify possible efficiency improvements and determine when and where devices are in use or sitting idle.
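
The on/off-signature idea described here can be sketched in a few lines. This is an illustrative toy, not the MIT team's algorithm: the device names, current values, and thresholds are invented, and the real system analyzes much finer fluctuations in both voltage and current.

```python
# Toy sketch of non-intrusive load monitoring (NILM): detect step changes
# in an aggregate current reading and match each step against known device
# "signatures". All names and numbers are illustrative assumptions.

def detect_events(current, threshold=0.5):
    """Return (index, delta) pairs where the current steps up or down."""
    events = []
    for i in range(1, len(current)):
        delta = current[i] - current[i - 1]
        if abs(delta) >= threshold:
            events.append((i, delta))
    return events

def classify(delta, signatures, tolerance=0.3):
    """Match a step change to the closest known device signature, or None."""
    best, best_err = None, tolerance
    for device, draw in signatures.items():
        err = abs(abs(delta) - draw)
        if err < best_err:
            best, best_err = device, err
    state = "on" if delta > 0 else "off"
    return (best, state) if best else None

signatures = {"pump": 2.0, "heater": 6.5, "fan": 1.2}   # amps, illustrative
current = [0.1, 0.1, 2.1, 2.1, 8.6, 8.6, 6.6, 6.6]      # aggregate reading
for idx, delta in detect_events(current):
    print(idx, classify(delta, signatures))
```

In this toy trace the monitor infers that the pump switched on, then the heater, then the pump switched off again, all from a single aggregate measurement point.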

The technology is especially well-suited for relatively small, contained electrical systems such as those serving a small ship, building, or factory with a limited number of devices to monitor. In a series of tests on a Coast Guard cutter based in Boston, the system provided a dramatic demonstration last year.

About 20 different motors and devices were being tracked by a single dashboard, connected to two different sensors, on the cutter USCGC Spencer. The sensors, which in this case had a hard-wired connection, showed that an anomalous amount of power was being drawn by a component of the ship's main diesel engines called a jacket water heater. At that point, Leeb says, crewmembers were skeptical about the reading but went to check it anyway. The heaters are hidden under protective metal covers, but as soon as the cover was removed from the suspect device, smoke came pouring out, and severe corrosion and broken insulation were clearly revealed.

"The ship is complicated," Leeb says. "It's magnificently run and maintained, but nobody is going to be able to spot everything."

Lt. Col. Nicholas Galanti, engineer officer on the cutter, says "the advance warning from NILM enabled Spencer to procure and replace these heaters during our in-port maintenance period, and deploy with a fully mission-capable jacket water system. Furthermore, NILM detected a serious shock hazard and may have prevented a class Charlie [electrical] fire in our engine room."

The system is designed to be easy to use with little training. The computer dashboard features dials for each device being monitored, with needles that will stay in the green zone when things are normal, but swing into the yellow or red zone when a problem is spotted.
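
The dial logic described above amounts to classifying how far a reading deviates from a device's expected draw. A minimal sketch, with fractional-deviation thresholds that are assumptions for illustration rather than values from the system:

```python
def zone(power_w, nominal_w, warn=0.15, alarm=0.30):
    """Classify a power reading relative to a device's nominal draw.

    warn/alarm are fractional deviations; the values are illustrative.
    """
    deviation = abs(power_w - nominal_w) / nominal_w
    if deviation >= alarm:
        return "red"
    if deviation >= warn:
        return "yellow"
    return "green"

print(zone(1000, 1000))  # prints green: reading matches nominal draw
print(zone(1200, 1000))  # prints yellow: 20% above nominal
print(zone(1400, 1000))  # prints red: 40% above nominal
```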

Detecting anomalies before they become serious hazards is the dashboard's primary task, but Leeb points out that it can also perform other useful functions. By constantly monitoring which devices are being used at what times, it could enable energy audits to find devices that were turned on unnecessarily when nobody was using them, or spot less-efficient motors that are drawing more current than their similar counterparts. It could also help ensure that proper maintenance and inspection procedures are being followed, by showing whether or not a device has been activated as scheduled for a given test.

"It's a three-legged stool," Leeb says. The system allows for "energy scorekeeping, activity tracking, and condition-based monitoring." But it's that last capability that could be crucial, "especially for people with mission-critical systems," he says. In addition to the Coast Guard and the Navy, he says, that includes companies such as oil producers or chemical manufacturers, who need to monitor factories and field sites that include flammable and hazardous materials and thus require wide safety margins in their operation.

One important characteristic of the system that is attractive for both military and industrial applications, Leeb says, is that all of its computation and analysis can be done locally, within the system itself, and does not require an internet connection at all, so the system can be physically and electronically isolated and thus highly resistant to any outside tampering or data theft.

Although for testing purposes the team has installed both hard-wired and noncontact versions of the monitoring system -- both types were installed in different parts of the Coast Guard cutter -- the tests have shown that the noncontact version could likely produce sufficient information, making the installation process much simpler. While the anomaly they found on that cutter came from the wired version, Leeb says, "if the noncontact version was installed" in that part of the ship, "we would see almost the same thing."

The research team also included graduate students Daisy Green, Jennifer Switzer, Thomas Kane, and Peer Lindahl at MIT; Gregory Bredariol of the U.S. Coast Guard; and John Donnal of the U.S. Naval Academy in Annapolis, Maryland. The research was funded by the U.S. Navy's Office of Naval Research NEPTUNE project, through the MIT Energy Initiative.

Credit: 
Massachusetts Institute of Technology

Antibodies stabilize plaque in arteries

image: Stephen Malin and Monica Centa, researchers at the Department of Medicine, Solna, Karolinska Institutet, Sweden.

Image: 
Alessandro Gallina

Researchers at Karolinska Institutet in Sweden have found that antibodies of the IgG type play an unexpected role in atherosclerosis. A study on mice shows that the antibodies stabilise the plaque that accumulates on the artery walls, which reduces the risk of it rupturing and causing a blood clot. It is hoped that the results, which are published in the journal Circulation, will eventually lead to improved therapies.

Atherosclerosis is the main underlying cause of heart attack and stroke, and is expected to be the leading cause of death in the world for a long time to come. Approximately a third of patients do not respond to statin treatment.

The disease is characterised by the narrowing of the arterial walls resulting from the accumulation of lipids and cells - the so-called atherosclerotic plaque. When the plaque ruptures, blood clots can form that restrict the blood flow to vital organs, such as the heart and brain. To reduce the number of deaths from atherosclerosis, researchers are therefore trying to find ways to prevent this from happening.

Immune system B lymphocytes produce antibodies that are involved in fighting infection. But the antibodies can also help to clean up damaged tissue, for instance in the form of atherosclerotic plaques. Scientists also know that the immune system has a bearing on the development of plaque, but exactly how this happens remains largely unresearched. The team behind the present study has studied how atherosclerotic plaque develops in mice that lack antibodies.

"We found that plaque formed in an antibody-free environment was unusually small," says study leader Stephen Malin, senior researcher at Karolinska Institutet's Department of Medicine in Solna. "But on closer inspection, we discovered that the plaque looked different and contained more lipid and fewer muscle cells than normal. This suggested that the plaque is unstable and more prone to rupturing, which also turned out to be the case."

The researchers found that the necessary ingredient for plaque stability was so-called IgG antibodies, the most common class of antibody in the blood. Further analyses showed that the smooth muscle cells of the aorta need these antibodies to divide correctly; when the cells cannot divide correctly, the plaque seems to become smaller and more unstable.

"It came as a huge surprise to us that antibodies can play such an important role in the formation of arterial plaque," says Dr Malin. "We now want to find out if it is some special type of IgG antibody that recognises plaque components. If so, this could be a new way of mitigating atherosclerosis and hopefully reducing the number of deaths from cardiovascular disease."

Credit: 
Karolinska Institutet

Energy stealthily hitches ride in global trade

image: How the energy embedded in materials and products has changed since the global economic crisis in 2008.

Image: 
Michigan State University

Fulfilling the world's growing energy needs summons images of oil pipelines, electric wires and truckloads of coal. But Michigan State University scientists show a lot of energy moves nearly incognito, embedded in the products of a growing society.

And that energy leaves its environmental footprint at home.

In this month's issue of the journal Applied Energy, MSU researchers examine China's flow of virtual energy - the energy used to produce goods and products in one place that are shipped away. What they found was that virtual energy flowed from less-populated, energy-scarce areas in China's western regions to booming cities in the energy-abundant east.

In fact, the virtual energy transferred west to east was much greater than the physical energy that moves through China's massive infrastructure investment, the West-To-East Electricity Transmission Project. China is a powerful model of energy use, having surpassed the United States. In 2013, nearly 22 percent of global energy use occurred in China.

"Conserving energy and managing its accompanying environmental impacts are a growing concern across the world, and it is crucial to take a holistic look at all the ways energy is used and moved," said Jianguo "Jack" Liu, Rachel Carson Chair in Sustainability of MSU's Center for Systems Integration and Sustainability (CSIS). "Only when we understand the full picture of who is producing energy and who is consuming it in all its forms can we make effective policy decisions."

Virtual energy is considered critical to avoiding regional energy crises since commodities traded from one location to another include virtual energy. This means the importing area can avoid expending the energy to produce the imported commodities. The paper "Shift in a National Virtual Energy Network" examines how a region's energy haves and have-nots meet economic and energy needs by acknowledging energy is tightly wound around economic growth and demand.

The researchers are first to focus on energy use after the 2008 global financial crisis, seeing how economic desperation can have a big, not always obvious, impact on energy - and the pollution and environmental degradation that can accompany its use.

"China, like a lot of places across the globe, has an uneven distribution of energy resources, and China also is developing quickly," said the article's first author Zhenci Xu, an MSU-CSIS PhD student. "We wanted to understand the true paths of energy use when economic growth kicks into gear after a financial crisis. Eventually, all the costs of energy use, including environmental damage and pollution, have to be accounted for, and current policies focus primarily on physical energy, not virtual energy."

The researchers found that the share of total virtual energy flowing from energy-scarce to energy-abundant provinces increased persistently, from 43.2% in 2007 to 47.5% in 2012. Following the framework of metacoupling (socioeconomic-environmental interactions within as well as between adjacent and distant places), they also discovered that, after the financial crisis, trade was taking place between distant provinces - trade that came with energy's environmental footprint.
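
The accounting behind a figure like this can be sketched simply: sum the virtual energy embodied in interprovincial trade, then take the share originating in energy-scarce provinces and ending in energy-abundant ones. The province names, flow values, and scarce/abundant classification below are invented for illustration:

```python
# Illustrative virtual-energy accounting. All provinces and numbers
# are made up; the real study uses China's interprovincial trade data.

flows = {  # (origin, destination): virtual energy embodied in trade (PJ)
    ("West A", "East A"): 120.0,
    ("West B", "East B"): 80.0,
    ("East A", "West A"): 30.0,
    ("West A", "West B"): 20.0,
}
energy_abundant = {"East A", "East B"}  # illustrative classification

total = sum(flows.values())
scarce_to_abundant = sum(
    v for (src, dst), v in flows.items()
    if src not in energy_abundant and dst in energy_abundant
)
share = scarce_to_abundant / total
print(f"{share:.1%} of virtual energy flows scarce -> abundant")
```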

The authors note that these types of analyses are needed across the globe to guide policies ensuring that areas shifting their energy consumption elsewhere contribute appropriately to mitigating the true costs of energy.

Credit: 
Michigan State University

Alpine tundra releases long-frozen CO2 to the atmosphere, exacerbating climate warming

Thawing permafrost in high-altitude mountain ecosystems may be a stealthy, underexplored contributor to atmospheric carbon dioxide emissions, new University of Colorado Boulder research shows.

The new findings, published today in the journal Nature Communications, show that alpine tundra in Colorado's Front Range emits more CO2 than it captures annually, potentially creating a feedback loop that could increase climate warming and lead to even more CO2 emissions in the future.

A similar phenomenon exists in the Arctic, where research in recent decades has shown that melting permafrost is unearthing long-frozen tundra soil and releasing CO2 reserves that had been buried for centuries.

"We wondered if the same thing could be happening in alpine terrain," said John Knowles, lead author of the new study and a former doctoral student in CU Boulder's Department of Geography and a researcher at the Institute of Arctic and Alpine Research (INSTAAR). "This study is a strong indication that that is indeed the case."

Forests have long been considered vital carbon 'sinks,' sequestering more carbon than they produce and helping to mitigate global CO2 levels. As part of the Earth's carbon cycle, trees and other vegetation absorb CO2 via photosynthesis while microbes (which decompose soil nutrients and organic material) emit it back to the atmosphere via respiration, just as humans release CO2 with every breath.

Melting permafrost, however, changes that equation. As previously frozen tundra soil thaws and becomes exposed for the first time in years, its nutrients become freshly available for microbes to consume. And unlike plants, which go dormant in winter, microscopic organisms can feast all year long if environmental conditions are right.

To study this effect in alpine conditions, researchers measured the surface-to-air CO2 transfer over seven consecutive years (2008-2014) at the Niwot Ridge Long Term Ecological Research (LTER) site in Colorado, a high-altitude research project funded by the National Science Foundation that has been in continuous operation for over 35 years. The team also collected samples of soil CO2 and used radiocarbon dating to estimate how long the carbon forming that CO2 had been present in the landscape.
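
The radiocarbon step can be illustrated with the textbook relation between a sample's remaining 14C content and its conventional radiocarbon age; the study's actual calibration pipeline is more involved than this sketch:

```python
import math

# Back-of-envelope radiocarbon dating: the conventional radiocarbon age
# uses the Libby mean life of 8033 years. This is the standard textbook
# relation, not the exact calibration procedure used in the study.

LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age (years BP) from the fraction of
    modern 14C remaining in a sample."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

print(round(radiocarbon_age(1.0)))   # prints 0: modern carbon
print(round(radiocarbon_age(0.95)))  # prints 412: centuries-old carbon
```

Even a few percent depletion in 14C relative to modern carbon thus implies the carbon has been out of exchange with the atmosphere for centuries, which is how "relatively old" CO2 can be identified in the winter flux.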

The study showed, somewhat surprisingly, that barren, wind-scoured tundra landscapes above 11,000 feet emitted more CO2 than they captured each year, and that a fraction of the CO2 emitted during winter was relatively old - the first such finding of its kind in temperate latitudes. The findings suggest higher-than-expected year-round microbial activity, even in the absence of a deep insulating snowpack.

"Microbes need it to be not too cold and not too dry, they need liquid water," said Knowles, now a researcher at the University of Arizona. "The surprise here is that we show winter microbial activity persisting in permafrost areas that don't collect much insulating snowpack due to wind stripping it away."

While the alpine tundra's net CO2 contributions are small compared to a forest's sequestration capability, the newly-documented effect may act as something of a counterweight, hampering atmospheric CO2 reductions from mountain ecosystems in general. The findings will need to be factored in to future projections of global warming, Knowles said.

"Until now, little was known about how alpine tundra behaved with regard to this balance, and especially how it could continue emitting CO2 year after year," Knowles said. "But now, we have evidence that climate change or another disturbance may be liberating decades-to-centuries-old carbon from this landscape."

Credit: 
University of Colorado at Boulder

Medicine and personal care products may lead to new pollutants in waterways

image: Pharmaceuticals and personal care products leave households through wastewater and may enter the environment after the wastewater treatment process.

Image: 
Abigail W. Porter/Rutgers University-New Brunswick

When you flush the toilet, you probably don't think about the traces of the medicine and personal care products in your body that are winding up in sewage treatment plants, streams, rivers, lakes, bays and the ocean.

But Rutgers scientists have found that bacteria in sewage treatment plants may be creating new contaminants that have not been evaluated for potential risks and may affect aquatic environments, according to a study in Environmental Toxicology and Chemistry.

The scientists tested the ability of bacteria in sludge from a sewage treatment plant to break down two widely used pharmaceutical products: naproxen, a non-steroidal anti-inflammatory drug, and guaifenesin, an expectorant in many cough and cold medications. They also tested two common compounds in personal care products: oxybenzone, a key ingredient in many sunscreens, and methylparaben, a preservative in many cosmetics.

Bacteria that don't require oxygen to grow in the sludge broke down methylparaben, but the microbes only partially broke down the three other chemicals - and created new contaminants in the process, according to the study.

"The partial breakdown of pharmaceuticals and personal care products is important because it results in a stream of possible contaminants in waterways that may have biological effects on impacted environments," said Abigail W. Porter, corresponding author and teaching instructor in the Department of Environmental Sciences at Rutgers University-New Brunswick. "These contaminants and their potential risks have yet to be studied."

Contaminants of emerging concern, including pharmaceuticals and personal care products, are increasingly found at low levels in surface water, according to the U.S. Environmental Protection Agency. There is concern that these chemical compounds may have an impact on aquatic life and human health.

"Our findings can help us assess other widely used pharmaceutical and personal care products with similar chemical structures," said co-author Lily Young, Distinguished Professor in the Department of Environmental Sciences. "By predicting or assessing the chemicals that might form during the breakdown process, we can identify and quantify them in the environment."

The Rutgers scientists are interested in how anaerobic microorganisms, such as bacteria that thrive in zero-oxygen conditions, break down the chemicals in pharmaceuticals and personal care products.

The team studied two bacterial communities: one in sludge from a sewage treatment plant and the other in low-oxygen subsurface sediment in a clean marine environment off Tuckerton, New Jersey. The researchers previously showed that bacteria can transform the anti-inflammatory drug naproxen.

The researchers found that the two microbial communities had different types of bacteria. But both communities transformed the four chemicals, which have very different structures, in the same way. Future research will look at sediment samples from different environmental locations to evaluate the long-term persistence of transformed chemicals.

Credit: 
Rutgers University