Study reveals bias in children even before they reach kindergarten

In a Developmental Science study of preschool-aged children, implicit and explicit evaluations of Black boys were less positive than evaluations of Black girls, White boys, or White girls.

This "gendered racial bias" was exhibited by both White and non-White children and was not correlated with their exposure to diversity. It also mirrors social bias observed in adults.

The study, which reveals the earliest evidence of bias at the intersection of race and gender, underscores the importance of addressing bias in the first years of life.

"Our results suggest that children are attuned to nuanced patterns of social bias at a surprisingly young age," said lead author Danielle Perszyk, of Northwestern University. "This means that efforts to counter such bias must begin very early in children's development."

Credit: 
Wiley

Conservation efforts help some rare birds more than others, study finds

image: The Conservation Reserve Enhancement Program appears to have benefited the Bell's vireo in Illinois.

Image: 
Photo by Michael Jeffords and Sue Post

CHAMPAIGN, Ill. -- Land conservation programs that have converted tens of thousands of acres of agricultural land in Illinois back to a more natural state appear to have helped some rare birds increase their populations to historic levels, a new study finds. Other bird species with wider geographic ranges have not fared as well, however.

The research, reported in the journal Ecosphere, finds that one of the four species studied, the Bell's vireo (Vireo bellii bellii), has bounced back from historic declines to more than double its last estimated abundance in Illinois.

"This increase surpasses state goals set for the bird in 2004, and speaks to some of the successes of the Conservation Reserve Enhancement Program, a national effort begun in 1996 to improve water quality, reduce erosion and restore lands and wildlife once lost to agricultural expansion," said Illinois Natural History Survey avian ecologist Bryan Reiley, who led the study. "Other rare birds - particularly those most reliant on early succession grasslands - are still struggling, however."

The growth of agriculture "has negatively affected biodiversity throughout the world," the study authors wrote. Grassland species have experienced some of the sharpest declines. Conservation programs like CREP use monetary incentives to entice private landowners to voluntarily convert some of their land back to grasslands, wetlands or forest. More than 140,000 acres have been restored so far in Illinois through CREP.

To determine how this conservation effort affects populations of specific rare birds, Reiley and his colleagues surveyed 172 randomly selected restored fields in 10 counties in central and west-central Illinois during the 2012-15 breeding seasons. They focused on the Bell's vireo and three other species in decline: the field sparrow (Spizella pusilla), northern bobwhite (Colinus virginianus) and willow flycatcher (Empidonax traillii traillii).

"We found that private land conservation efforts in Illinois are probably effective in achieving state population goals for some rare species, such as the Bell's vireo, which prefers shrubby areas near grasslands," Reiley said. "They also may help other species with similar habitat needs, like the willow flycatcher, which we estimate to be at 92 percent of the goal."

The field sparrow and northern bobwhite still appear to be in trouble, however. Based on the researchers' estimates, CREP has increased northern bobwhite populations by only 6 percent of the goal. Field sparrow abundance is better, but the conservation program has achieved only 33 percent of the goal for this species.

Reiley and his colleagues estimated that the amount of restored land would need to increase by 5 percent to rebuild populations of willow flycatchers to historic levels. Substantially more habitat would be required to support historic populations of field sparrows and northern bobwhites, however. To achieve state goals, those species would need habitat increases of 118 percent and 598 percent, respectively, the researchers found.

"Interestingly, all the species we studied, and probably many others not studied, would likely rebound to historic levels if 1 percent of the agricultural land in Illinois was restored through CREP," Reiley said. "This program is clearly important to populations of declining wildlife - not only in Illinois, but also in the other 26 states where it operates."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Blood test shows promise for early detection of severe lung-transplant rejection

image: Blood test for organ transplant monitoring using DNA.

Image: 
NHLBI

Researchers have developed a simple blood test that can detect when a newly transplanted lung is being rejected by a patient, even when no outward signs of the rejection are evident. The test could make it possible for doctors to intervene faster to prevent or slow down so-called chronic rejection--which is severe, irreversible, and often deadly--in those first critical months after lung transplantation. Researchers believe this same test might also be useful for monitoring rejection in other types of organ transplants. The work was funded by the National Heart, Lung, and Blood Institute (NHLBI), part of the National Institutes of Health.

The study's findings are scheduled to appear Jan. 22 in EBioMedicine, a publication of The Lancet.

"This test solves a long-standing problem in lung transplants: detection of hidden signs of rejection," said Hannah Valantine, M.D., co-leader of the study and lead investigator of the Laboratory of Organ Transplant Genomics in the Cardiovascular Branch at NHLBI. "We're very excited about its potential to save lives, especially in the wake of a critical shortage of donor organs."

The test relies on DNA sequencing, Valantine explained, and as such represents a prime example of personalized medicine: it will allow doctors to tailor transplant treatments to those individuals at highest risk of rejection.

Lung transplant recipients have the shortest survival rates of patients receiving any kind of solid-organ transplant--only about half live past five years. They also face a high incidence of chronic rejection, which occurs when the body's immune system attacks the transplanted organ. Existing tools for detecting signs of rejection, such as biopsy, either require the removal of small amounts of lung tissue or are not sensitive enough to discern the severity of the rejection. The new test appears to overcome those challenges.

Called the donor-derived cell-free DNA test, the experimental test begins with a few drops of blood taken from the arm of the transplant recipient. A set of machines sorts the DNA fragments in the blood sample and, in combination with computer analysis, determines whether each fragment comes from the recipient or the donor and how many of each type are present. Because injured or dying donor cells release far more donor DNA fragments into the bloodstream than healthy donor cells do, higher amounts of donor DNA indicate a higher risk of transplant rejection in the recipient.
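
To make the counting logic concrete, here is a minimal sketch in Python of how a donor fraction could be computed from classified DNA fragments. It is an illustration only: in practice the classification uses genotyped SNP differences between donor and recipient, and the alert threshold below is a hypothetical placeholder, not a value from the study.

    from dataclasses import dataclass

    @dataclass
    class Fragment:
        snp_alleles: dict  # SNP id -> allele observed on this cfDNA fragment

    def classify_fragment(frag, donor_gt, recipient_gt):
        """Call one fragment 'donor', 'recipient', or 'ambiguous'."""
        for snp, allele in frag.snp_alleles.items():
            if allele == donor_gt.get(snp) and allele != recipient_gt.get(snp):
                return "donor"
            if allele == recipient_gt.get(snp) and allele != donor_gt.get(snp):
                return "recipient"
        return "ambiguous"  # fragment carries no informative SNP

    def donor_fraction(fragments, donor_gt, recipient_gt):
        """Share of informative fragments that came from the donor organ."""
        calls = [classify_fragment(f, donor_gt, recipient_gt) for f in fragments]
        informative = [c for c in calls if c != "ambiguous"]
        return informative.count("donor") / len(informative) if informative else 0.0

    # Toy usage: injured donor tissue raises the donor-derived share.
    donor_gt = {"rs1": "A", "rs2": "G"}
    recipient_gt = {"rs1": "T", "rs2": "G"}
    frags = [Fragment({"rs1": "A"}), Fragment({"rs1": "T"}), Fragment({"rs2": "G"})]
    ALERT_THRESHOLD = 0.01  # hypothetical cutoff prompting closer follow-up
    frac = donor_fraction(frags, donor_gt, recipient_gt)
    print(frac, frac > ALERT_THRESHOLD)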

In the study, 106 lung transplant recipients were enrolled and monitored. Blood samples collected in the first three months after transplantation underwent the testing procedure. Recipients with higher levels of donor-derived DNA fragments during those first three months were six times more likely to subsequently develop transplant organ failure or die during the study follow-up period than those with lower donor-derived DNA levels. Importantly, researchers found that more than half of the high-risk subjects showed no outward signs of clinical complications during this period.

"We showed for the first time that donor-derived DNA is a predictive marker for chronic lung rejection and death, and could provide critical time-points to intervene, perhaps preventing these outcomes," Valantine said. "Once rejection is detected early via this test, doctors would then have the option to increase the dosages of anti-rejection drugs, add new agents that reduce tissue inflammation, or take other measures to prevent or slow the progression."

In 2010, Valantine was part of a research team that pioneered the first blood test to diagnose organ rejection. The now widely used test, called the AlloMap, analyzes the expression of 20 genes in a transplant recipient's blood sample to determine whether the patient's immune system is launching an attack. The following year, Valantine and her colleagues showed for the first time that a cell-free DNA blood test could be useful for monitoring early signs of rejection. However, those early studies of the cell-free DNA test only identified signs of "acute" transplant rejection, which is easily reversed. The current study shows that high cell-free DNA levels during the first three months after transplant predict chronic rejection. If validated, this blood test could become a routine tool used to monitor transplant patients at very early stages of rejection, the researchers said.

Credit: 
NIH/National Heart, Lung and Blood Institute

Possible Oahu populations offer new hope for Hawaiian seabirds

image: Newell's Shearwaters are endemic to Hawaii and face a variety of threats, but the discovery of a possible new population on Oahu is good news for the species.

Image: 
Lindsay Young

The two seabird species unique to Hawaii, Newell's Shearwaters and Hawaiian Petrels, are the focus of major conservation efforts--at risk from habitat degradation, invasive predators, and other threats, their populations plummeted 94% and 78%, respectively, between 1993 and 2013. However, a new study in The Condor: Ornithological Applications offers hope that previously undetected colonies of these birds persist on the island of Oahu, from which they were believed to have vanished by the late 1700s.

Shearwaters and petrels nest colonially in crevices, burrows, and under vegetation at mid to high elevations. They currently breed on other Hawaiian islands including Kauai and Maui, but both were believed to have been extirpated from Oahu prior to European contact in 1778; biologists assumed that occasional records from the island were of birds thrown off course at night by city lights.

Pacific Rim Conservation's Lindsay Young and her colleagues used a spatial model based on elevation, forest cover, and illumination to identify potential suitable breeding habitat for both species on Oahu, then deployed automated acoustic recording units at 16 sites on the island to listen for the birds' calls in 2016 and 2017, accessing remote mountain locations via helicopter. To their surprise, they detected petrels at one site and shearwaters at two sites.
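
The pre-screening step lends itself to a simple illustration. The sketch below scores grid cells on the three layers the authors name; the weights, elevation band, and cutoff are hypothetical stand-ins, not the study's actual model.

    # Toy habitat pre-screen: score each grid cell on elevation, forest cover
    # and night-time illumination, keep high scorers as candidate survey sites.
    # Weights, elevation band and cutoff are hypothetical, not the study's.

    grid_cells = [  # stand-ins for real GIS raster cells
        {"elev": 800, "forest": 0.9, "light": 0.05},
        {"elev": 150, "forest": 0.7, "light": 0.60},
    ]

    def suitability(elevation_m, forest_cover, illumination):
        """Score one cell in [0, 1]; higher means more promising habitat."""
        in_band = 1.0 if 300 <= elevation_m <= 1200 else 0.0  # mid-to-high elevation
        darkness = max(0.0, 1.0 - illumination)               # colonies favour dark areas
        return in_band * (0.6 * forest_cover + 0.4 * darkness)

    candidates = [c for c in grid_cells
                  if suitability(c["elev"], c["forest"], c["light"]) > 0.5]
    print(candidates)  # cells worth deploying acoustic recorders in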

"We were doing a statewide survey for these species for the U.S. Fish and Wildlife Service as part of recovery action planning, but Oahu was not initially included as one of the sites to survey, since evidence suggested they weren't there," says Young. "Since we're Oahu-based, we thought we would at least put a few recording units out to see if there was anything. And we were surprised, to say the least, that we not only had calls detected, but detected both species across two years."

These could be the last survivors of remnant breeding populations on Oahu, or they could be young birds from other islands that are searching for mates and breeding sites. "Either way, it gives us hope that we will be able to use social attraction--that is, using calls and decoys--to attract them to nest on an island where they were once abundant," says Young. Oahu birds could help boost connectivity between individual island populations and provide extra insurance in case any one island's seabird population is decimated by an event such as a hurricane. As petrel and shearwater numbers continue to decline, protecting Hawaii's remaining seabirds remains a major conservation priority in the region, and the possibility that they're continuing to breed on Oahu provides new reason for optimism.

Credit: 
American Ornithological Society Publications Office

Artificial intelligence can dramatically cut time needed to process abnormal chest X-rays

image: This is Professor Giovanni Montana, Chair in Data Science at WMG, University of Warwick.

Image: 
University of Warwick

New research has found that a novel Artificial Intelligence (AI) system can dramatically reduce the time it takes for abnormal chest X-rays with critical findings to receive an expert radiologist opinion, cutting the average delay from 11 days to less than 3 days. Chest X-rays are routinely performed to diagnose and monitor a wide range of conditions affecting the lungs, heart, bones, and soft tissues.

Researchers from WMG at the University of Warwick, working with Guy's and St Thomas' NHS Hospitals, extracted a dataset of half a million anonymised adult chest radiographs (X-rays) and developed a computer-vision AI system that can recognise radiological abnormalities in the X-rays in real time and suggest how quickly these exams should be reported by a radiologist. In the process of building the AI system, the team developed and validated a Natural Language Processing (NLP) algorithm that can read a radiological report, understand the findings mentioned by the reporting radiologist, and automatically infer the priority level of the exam. By applying this algorithm to the historical exams, the team generated a large volume of labelled training exams that allowed the AI system to learn which visual patterns in X-rays were predictive of their urgency level.
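
The data flow of that two-stage design can be sketched in a few lines of Python. The keyword rules below are a deliberately crude stand-in for the team's NLP algorithm (the terms and priority tiers are illustrative, not taken from the paper), meant only to show how historical reports become training labels for the vision model.

    # Stage 1 (illustrative): derive a priority label from free-text reports.
    CRITICAL = {"pneumothorax", "free air", "misplaced tube"}   # example terms
    URGENT = {"consolidation", "pleural effusion", "mass"}

    def infer_priority(report_text: str) -> str:
        text = report_text.lower()
        if any(term in text for term in CRITICAL):
            return "critical"
        if any(term in text for term in URGENT):
            return "urgent"
        return "routine"

    # Stage 2 (not shown): pairs of (x_ray_image, infer_priority(report_text))
    # become the training set for an image classifier that predicts urgency
    # from pixels alone, so unreported exams can be re-ordered in the worklist.
    print(infer_priority("Large right pneumothorax. Urgent review advised."))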

The research team, led by Professor Giovanni Montana, Chair in Data Science in WMG at the University of Warwick, found that normal chest radiographs were detected with a positive predictive value of 73% and a negative predictive value of 99%, and at a speed that meant that abnormal radiographs with critical findings could be prioritised to receive an expert radiologist opinion much sooner than under usual practice.

The results of the research are published today, 22nd January 2019 in the leading journal Radiology in a paper entitled "Automated triaging and prioritization of adult chest radiographs using deep artificial neural networks."

WMG's Professor Giovanni Montana said:

"Artificial intelligence led reporting of imaging could be a valuable tool to improve department workflow and workforce efficiency. The increasing clinical
demands on radiology departments worldwide has challenged current service delivery models, particularly in publicly-funded healthcare systems. It is no longer feasible for many Radiology departments with their current staffing level to report all acquired plain radiographs in a timely manner, leading to large backlogs of unreported studies. In the United Kingdom, it is estimated that at any time there are over 300,000 radiographs waiting over 30 days for reporting. The results of this research shows that alternative models of care, such as computer vision algorithms, could be used to greatly reduce delays in the process of identifying and acting on abnormal X-rays - particularly for chest radiographs which account for 40% of all diagnostic imaging performed worldwide. The application of these technologies also extends to many other imaging modalities including MRI and CT."

Credit: 
University of Warwick

Half of parents try Zicam or other fake cold prevention methods for kids

image: Many parents still believe in "folklore strategies" for cold prevention, or use vitamins and supplements that are not scientifically supported.

Image: 
C.S. Mott Children's Hospital National Poll on Children's Health at the University of Michigan.

ANN ARBOR, Mich. -- Vitamin C to keep the germs away. Never go outside with wet hair. Stay inside.

Despite little or no evidence suggesting these types of methods actually help people avoid catching a cold, more than half of parents have tried them with their kids, according to the C.S. Mott Children's Hospital National Poll on Children's Health at the University of Michigan.

The good news: Almost all parents (99 percent) say their approach to cold prevention involves strong personal hygiene, which science shows prevents spreading colds. These strategies include encouraging children to wash hands frequently, teaching children not to put their hands near their mouth or nose and discouraging children from sharing utensils or drinks with others.

Still, 51 percent of parents gave their child an over-the-counter vitamin or supplement to prevent colds, even without evidence that they work. Seventy-one percent of parents also say they try to protect their child from catching a cold by following non-evidence-based "folklore" advice, such as preventing children from going outside with wet hair or encouraging them to spend more time indoors.

Colds are caused by viruses, which most frequently spread from person to person. The most common mechanism is mucus droplets from the nose or mouth that are passed on through direct contact, or through the air by sneezing or coughing, and land on the hands and face or on surfaces such as door handles, faucets, countertops and toys.

"The positive news is that the majority of parents do follow evidence-based recommendations to avoid catching or spreading the common cold and other illnesses," says Gary Freed, M.D., M.P.H., co-director of the poll and a pediatrician at Mott.

"However, many parents are also using supplements and vitamins not proven to be effective in preventing colds and that are not regulated by the U.S. Food and Drug Administration. These are products that may be heavily advertised and commonly used but none have been independently shown to have any definitive effect on cold prevention."

There is no evidence that giving a child Vitamin C, multivitamins or other products advertised to boost the immune system is effective in preventing the common cold. Freed notes that the effectiveness of supplements and vitamins does not need to be proven in order for them to be sold.

The folklore strategies, he adds, have likely been passed down from generation to generation and started before people knew that germs were actually the cause of illnesses like colds.

On the bright side, even more parents use cold prevention strategies that are supported by science. In addition to helping children practice good hygiene habits, 87 percent of parents keep children away from people who are already sick. Sixty-four percent of parents reported that they ask relatives who have colds not to hug or kiss their child, and 60 percent would skip a playdate or activity if other children attending were ill. Some parents (31 percent) avoid playgrounds altogether during the cold season.

Eighty-four percent of parents also said they sanitize their child's environment as a strategy for preventing colds, for example by frequently washing household surfaces and cleaning toys.

On average, school-age children experience three to six colds per year, with some lasting as long as two weeks.

"When children are sick with a cold, it affects the whole family," Freed says. "Colds can lead to lack of sleep, being uncomfortable and missing school and other obligations. All parents want to keep families as healthy as possible."

But, he adds, "It's important for parents to understand which cold prevention strategies are evidence-based. While some methods are very effective in preventing children from catching the cold, others have not been shown to actually make any difference."

Credit: 
Michigan Medicine - University of Michigan

Leaf age determines the division of labor in plant stress responses

image: PBS3 protects young leaves from ABA-mediated immune suppression, which tips the scales towards enhanced pathogen resistance. In older leaves exposed to simultaneous stresses, this effect is lost, and immunity is dampened.

Image: 
Kenichi Tsuda

Unlike animals, plants cannot move freely to escape life-threatening conditions. This constraint means that they require strategies to protect themselves against the diverse stresses they encounter in their natural environments. These stresses can be physical (abiotic), such as drought and high salinity, or biotic, such as attack by microbial pathogens and insect pests. The underlying protective mechanisms involve inducible stress responses that are specialized to the respective stress. However, the finite resources available to plants mean that specialized defenses also pose a problem: inducible stress responses tailored to a physical stress, such as drought, lower resistance against pathogen attack. What happens, then, when a plant is simultaneously exposed to physical and biological stresses? This question has now been answered by researchers led by Kenichi Tsuda and Paul Schulze-Lefert at the Max Planck Institute for Plant Breeding Research in Cologne, Germany.

Many plant responses to stress are mediated by small signaling molecules called phytohormones, and the authors focused on two particular stress pathways in their study: one mediated by abscisic acid (ABA), which triggers a program that protects plants from abiotic stress, and another activated by salicylic acid (SA), which provides protection against pathogens. To allow efficient allocation of resources, activation of ABA-mediated defense dampens the SA response. To determine the significance of this crosstalk for plants simultaneously exposed to both physical and pathogen stresses, the authors first studied the crosstalk between the two phytohormone pathways more closely in the model plant Arabidopsis thaliana. Surprisingly, they found that pre-exposure of plants to ABA blocked activity of the SA-dependent response arm only in older leaves, rendering them more sensitive to bacterial infection, while younger leaves were protected from this block of the SA response. Using RNA sequencing, they identified a gene called PBS3 that proved responsible for protecting young leaves from ABA-mediated immunosuppression. They made similar observations under physical stresses such as drought and high salinity. Thus, plants actively balance trade-offs between biological and physical stress responses based on leaf age.

Crucially, the lack of PBS3 not only affected young leaves under combined stresses but also led to compromised growth and fewer seed capsules, and thus to lower overall reproductive fitness. An active, leaf-age-dependent balancing mechanism between biological and physical stress responses therefore increases plant fitness under combined stresses.

Several important questions remain: For example, do other plants such as crops balance stress response trade-offs to maintain growth and reproduction? How does PBS3 protect younger leaves from abiotic stress-triggered immunosuppression? Considering that trade-offs between stress responses exert a major constraint on crop productivity, it will be crucial to answer these questions for sustainable agriculture.

Credit: 
Max Planck Institute for Plant Breeding Research

New study raises hopes of eradication of malaria

image: This is Anders Björkman, Professor at the Department of Microbiology, Tumour and Cell Biology, Karolinska Institutet, Sweden.

Image: 
John Sennett

After major global successes in the battle against malaria, the positive trend stalled around 2015 - apart from in Zanzibar in East Africa, where only a fraction of the disease remains. In a new study published in BMC Medicine, researchers at Karolinska Institutet in Sweden explain why, and show that new strategies are needed to eradicate the disease. Among the problems are a change in mosquito behaviour and selection in the parasites.

Professor Anders Björkman has described the years around 2000 as catastrophic with respect to the global spread of malaria. The crisis triggered a worldwide initiative that was given a boost by new kinds of drugs and the widespread distribution of impregnated mosquito nets and domestic anti-mosquito sprays. The outcome was a halving of the global spread of the disease by 2015.

"But after that, the decline tailed off," says Professor Anders Björkman at the Department of Microbiology, Tumour and Cell Biology, Karolinska Institutet, who has been running the malaria project for 18 years. "Except for in Zanzibar, where the action taken for its 1.4 million citizens has led to approximately a 96 per cent decline in the incidence of malaria. We've optimised these measures with the Zanzibar Malaria Control Programme and can now explain why malaria has not yet been fully eliminated."

The study reveals altered behaviour in the malaria mosquitoes, which now bite outdoors instead of indoors. They have also developed a kind of resistance to modern pesticides. Furthermore, there has been a process of selection in the pathogenic parasite, where the remaining form is more difficult to detect but still spreads the disease as before. The researchers have been monitoring 100,000 or so residents of two districts in Zanzibar since 2002.

"Both the mosquitoes and the parasites have found ways to avoid control measures," says Professor Björkman. "We now need to develop new strategies to overcome this if we're to attain the goal of eliminating the disease from Zanzibar, an endeavour that can prove a model for the entire continent."

What surprised the researchers was the dramatic decline in child mortality in Zanzibar, where malaria control has caused more than a 70 per cent drop in the total child mortality rate. It was previously estimated that only 20 per cent of child deaths in Africa were malaria-related; the researchers now think the reason for this dramatic reduction is that the disease has a greater and more chronic effect on the general health of babies than suspected, thus lowering their resistance to other diseases throughout early childhood.

"Malaria is still the greatest obstacle to a healthy childhood in Africa," says Professor Björkman. "If you ask African women today, their greatest concern is usually that malaria doesn't affect their pregnancy and their babies. The global community must continue the fight for improved strategies and control measures. If this happens, I think we'll be able to reach the goal of ultimate elimination."

Zanzibar was one of the first places to put the global initiatives against malaria to use and has since been tireless in its work to control the disease. The researchers now hope that these lessons can revive anti-malaria strategies throughout Africa.

Credit: 
Karolinska Institutet

The diversity of rural African populations extends to their microbiomes

image: A nurse from the Kilimanjaro Christian Medical Center works in the field with a study participant in Tanzania.

Image: 
Alessia Ranciaro/Tishkoff Lab

Our microbiome, the complex community of bacteria, fungi, parasites, and other microorganisms in and on our bodies, reflects the way we live. If we own a pet, we likely share microbes with them. If we eat meat, the microbiome in our intestines may look different from that of a vegan.

In a growing field of study to determine how we acquire the microbes we carry within us and their influence on our health, most analyses have focused on people living in developed nations. But in the last several years, scientists have begun to investigate whether people in non-industrialized societies possess distinctly different microbiomes and, if so, what factors shape those differences.

A new report, published in the journal Genome Biology, has made significant strides in addressing these questions. Led by a team of geneticists from the University of Pennsylvania, in collaboration with researchers from Tanzania, Botswana, and the National Institutes of Health, the study is one of the largest to date to analyze the gut microbiomes of ethnically diverse Africans, with samples from 114 Botswanan and Tanzanian people from seven populations, as well as a comparison group from the United States.

The results point to the wide range of microbiome profiles across populations, which practice a variety of lifestyles, from agropastoralist to pastoralist to hunter-gatherer. The magnitude of the differences is on par with the differences seen between industrialized and non-industrialized populations. Yet the researchers were also intrigued by unexpected similarities between groups.

"When we started this," says geneticist Sarah Tishkoff, a Penn Integrates Knoweldge Professor at Penn and senior author, "my hypothesis was that diet was going to be the driving factor in distinguishing the microbiome of these diverse populations. My biggest surprise was that that wasn't the case."

In fact, a subset of the samples from Bantu-speaking people living in farming communities in Botswana were nearly indistinguishable from those collected from people living in the Philadelphia area, samples collected by study coauthor Frederick Bushman, a microbiologist at Penn's Perelman School of Medicine.

"The bacterial composition from some of the agropastoralists in Botswana was strikingly similar to the U.S. cohort," says Matthew Hansen, a scientist in Tishkoff's lab and co-lead author on the paper. "These are rural groups; they have a very different lifestyle but some factor is giving them a very similar microbiota to healthy Philadelphia-area residents."

Previous efforts to examine the gut microbiomes of rural Africans have typically compared a single African population to one or more populations from industrialized nations. These earlier studies pointed to differences between groups; for example, a comparison of gut microbiomes between Italians and Hadza hunter-gatherers in Tanzania identified several groups of bacteria present in the Hadza that had not previously been identified in Western populations.

But to get a more nuanced understanding of the factors influencing the microbial diversity in rural Africans, Tishkoff's team collected samples from seven far-flung African populations. From Tanzania: Hadza hunter-gatherers, Maasai cattle-herders, Sandawe agropastoralists, who were hunter-gatherers until the late 19th century, and Burunge agropastoralists. And from Botswana: San hunter-gatherers, Bantu-speaking Herero pastoralists, and several groups of Bantu-speaking agropastoralists.

Gathering the data, which involved asking residents of remote villages to provide fecal samples to the scientists, was by no means a simple process. Alessia Ranciaro, who helped lead data collection efforts for the work, says she and colleagues learned from early experiences how to navigate cultural and logistical hurdles to make the process smoother for both the participants and the scientists.

The researchers extracted DNA from the samples and sequenced a portion of the 16S ribosomal RNA gene, widely used in microbiome studies to help identify and compare bacteria.

"The analysis allows you to classify bacteria in a sample down to the genus level in many cases," says Meagan Rubel, a doctoral student in Tishkoff's lab who co-led the work. "But in the sample databases that we're using, the bacteria has been classified based on industrialized or western groups, so people living these traditional lifestyles may have bacteria that we have never seen before."

From the results they were able to obtain, broad patterns quickly emerged. Notably, the Botswanan samples differed from the Tanzanian ones. The gut microbiomes from Tanzanian populations tended to have a higher number of microbes in each sample, and individuals tended to share similar microbial profiles. In Botswana, on the other hand, samples tended to have fewer microbial species overall, and individuals' microbiomes tended to be more different from one another. The latter pattern was also present in the U.S. samples.

While the analysis didn't point to a "smoking gun" explaining this result, the researchers hypothesize that the reasons are tied to Botswana's comparative national wealth and access to medical care.

"Botswana has diamonds and is relatively wealthy," says Tishkoff. "They have a free medical system and a free educational system, which is very different from Tanzania."

"You can imagine that within the spectrum of very different groups in African countries," says Rubel, "there are groups undergoing these soft measures of industrialization that could be everything from increased access to clinical care to different kinds of foods. Antibiotic usage is something that can really change the gut microbiome, so people who have more access to that might be seeing marked shifts in their microbiome."

Such shifts could help explain the similarities between Botswana and U.S. samples, though the researchers say more work needs to be done to confirm that is the case.

Overall, however, broad differences in the gut microbiome were readily apparent between the U.S. and most Africans. Within Africa, the frequency of some bacterial groups could differentiate populations by ethnicity and lifestyle. For instance, the two hunter-gatherer populations, the San and the Hadza, possessed different patterns of gut bacteria compared to pastoralist or agropastoralist groups. In addition, in the Maasai and Hadza populations, two groups in which the division of labor between men and women is particularly extreme, the researchers found significant sex differences in the microbiomes they analyzed.

To understand more about what the bacteria in the gut were actually doing, the researchers looked for molecular pathways that were abundant across the various microbial species in a given sample. In the U.S. samples, they identified pathways involved in breaking down environmental pollutants, such as bisphenol, which includes bisphenol A, better known as "the dreaded BPA in plastics," says Rubel, as well as DDT, the insecticide responsible for thinning birds' eggshells that has been banned in the U.S. since the 1970s.

They found evidence of DDT-breakdown pathways as well in the samples from Botswana, a country that has continued to use the chemical to control the mosquitoes responsible for transmitting diseases such as malaria.

The findings raise a number of interesting questions, on which the researchers hope to shed more light with additional analyses and more thorough genetic and genomic sequencing in the future. They're also probing the samples further to see whether the presence of gastrointestinal pathogens may play a primary role in influencing the gut microbiota of people in areas where such infections are prevalent.

"Our work expands a growing narrative," says Rubel, "where microbiome trends seem to track with the level of industrialization across populations."

The scientists note that even these remote African populations are not static in their lifestyle practices; development and its influences on the environment and traditional lifestyles may manifest in shifting microbiome patterns.

"What would be great to do is a longitudinal study of some of these groups," says Hansen. "Over the next 20 years, their lives are certain to change rapidly, and it would be interesting to see if we find a shift in the microbiome as these lifestyle factors change."

Credit: 
University of Pennsylvania

Mystery orbits in outermost reaches of solar system not caused by 'Planet Nine'

The strange orbits of some objects in the farthest reaches of our solar system, hypothesised by some astronomers to be shaped by an unknown ninth planet, can instead be explained by the combined gravitational force of small objects orbiting the Sun beyond Neptune, say researchers.

The alternative explanation to the so-called 'Planet Nine' hypothesis, put forward by researchers at the University of Cambridge and the American University of Beirut, proposes a disc made up of small icy bodies with a combined mass as much as ten times that of Earth. When combined with a simplified model of the solar system, the gravitational forces of the hypothesised disc can account for the unusual orbital architecture exhibited by some objects at the outer reaches of the solar system.

While the new theory is not the first to propose that the gravitational forces of a massive disc made of small objects could avoid the need for a ninth planet, it is the first such theory which is able to explain the significant features of the observed orbits while accounting for the mass and gravity of the other eight planets in our solar system. The results are reported in the Astronomical Journal.

Beyond the orbit of Neptune lies the Kuiper Belt, which is made up of small bodies left over from the formation of the solar system. Neptune and the other giant planets gravitationally influence the objects in the Kuiper Belt and beyond, collectively known as trans-Neptunian Objects (TNOs), which encircle the Sun on nearly-circular paths from almost all directions.

However, astronomers have discovered some mysterious outliers. Since 2003, around 30 TNOs on highly elliptical orbits have been spotted: they stand out from the rest of the TNOs by sharing, on average, the same spatial orientation. This type of clustering cannot be explained by our existing eight-planet solar system architecture and has led to some astronomers hypothesising that the unusual orbits could be influenced by the existence of an as-yet-unknown ninth planet.

The 'Planet Nine' hypothesis suggests that to account for the unusual orbits of these TNOs, there would have to be another planet, believed to be about ten times more massive than Earth, lurking in the distant reaches of the solar system and 'shepherding' the TNOs in the same direction through the combined effect of its gravity and that of the rest of the solar system.

"The Planet Nine hypothesis is a fascinating one, but if the hypothesised ninth planet exists, it has so far avoided detection," said co-author Antranik Sefilian, a PhD student in Cambridge's Department of Applied Mathematics and Theoretical Physics. "We wanted to see whether there could be another, less dramatic and perhaps more natural, cause for the unusual orbits we see in some TNOs. We thought, rather than allowing for a ninth planet, and then worry about its formation and unusual orbit, why not simply account for the gravity of small objects constituting a disc beyond the orbit of Neptune and see what it does for us?"

Professor Jihad Touma, from the American University of Beirut, and his former student Sefilian modelled the full spatial dynamics of TNOs with the combined action of the giant outer planets and a massive, extended disc beyond Neptune. The duo's calculations, which grew out of a seminar at the American University of Beirut, revealed that such a model can explain the perplexing spatially clustered orbits of some TNOs. In the process, they were able to identify ranges in the disc's mass, its 'roundness' (or eccentricity), and forced gradual shifts in its orientations (or precession rate), which faithfully reproduced the outlier TNO orbits.

"If you remove planet nine from the model and instead allow for lots of small objects scattered across a wide area, collective attractions between those objects could just as easily account for the eccentric orbits we see in some TNOs," said Sefilian, who is a Gates Cambridge Scholar and a member of Darwin College.

Earlier attempts to estimate the total mass of objects beyond Neptune have only added up to around one-tenth the mass of the Earth. However, in order for the TNOs to have the observed orbits without a Planet Nine, the model put forward by Sefilian and Touma requires the combined mass of the Kuiper Belt to be between a few and ten times the mass of the Earth.

"When observing other systems, we often study the disc surrounding the host star to infer the properties of any planets in orbit around it," said Sefilian. "The problem is when you're observing the disc from inside the system, it's almost impossible to see the whole thing at once. While we don't have direct observational evidence for the disc, neither do we have it for Planet Nine, which is why we're investigating other possibilities. Nevertheless, it is interesting to note that observations of Kuiper belt analogues around other stars, as well as planet formation models, reveal massive remnant populations of debris.

"It's also possible that both things could be true - there could be a massive disc and a ninth planet. With the discovery of each new TNO, we gather more evidence that might help explain their behaviour."

Credit: 
University of Cambridge

Using bacteria to create a water filter that kills bacteria

image: The cover of the Jan. 2, 2019 issue of Environmental Science & Technology.

Image: 
Environmental Science & Technology

More than one in 10 people in the world lack basic drinking water access, and by 2025, half of the world's population will be living in water-stressed areas, which is why access to clean water is one of the National Academy of Engineering's Grand Challenges. Engineers at Washington University in St. Louis have designed a novel membrane technology that purifies water while preventing biofouling, or buildup of bacteria and other harmful microorganisms that reduce the flow of water.

And they used bacteria to build such filtering membranes.

Srikanth Singamaneni, professor of mechanical engineering & materials science, and Young-Shin Jun, professor of energy, environmental & chemical engineering, and their teams blended their expertise to develop an ultrafiltration membrane using graphene oxide and bacterial nanocellulose that they found to be highly efficient, long-lasting and environmentally friendly. If their technique were to be scaled up to a large size, it could benefit many developing countries where clean water is scarce.

The results of their work were published as the cover story in the Jan. 2 issue of Environmental Science & Technology.

Biofouling accounts for nearly half of all membrane fouling and is highly challenging to eradicate completely. Singamaneni and Jun have been tackling this challenge together for nearly five years. They previously developed other membranes using gold nanostars, but wanted to design one that used less expensive materials.

Their new membrane begins with feeding Gluconacetobacter hansenii bacteria a sugary substance so that they form cellulose nanofibers when in water. The team then incorporated graphene oxide (GO) flakes into the bacterial nanocellulose while it was growing, essentially trapping GO in the membrane to make it stable and durable.

After GO is incorporated, the membrane is treated with a base solution to kill Gluconacetobacter. During this process, the oxygen groups of GO are eliminated, making it reduced GO. When the team shone sunlight onto the membrane, the reduced GO flakes immediately generated heat, which dissipated into the surrounding water and bacterial nanocellulose.

Ironically, the membrane created from bacteria also can kill bacteria.

"If you want to purify water with microorganisms in it, the reduced graphene oxide in the membrane can absorb the sunlight, heat the membrane and kill the bacteria," Singamaneni said.

Singamaneni, Jun and their team exposed the membrane to E. coli bacteria, then shone light on the membrane's surface. After being irradiated for just 3 minutes, the E. coli died. The team determined that the membrane quickly heated to above 70 degrees Celsius, the temperature required to break down the cell walls of E. coli.

With the bacteria killed, the researchers were left with a pristine membrane of high-quality nanocellulose fibers that filtered water twice as fast as commercially available ultrafiltration membranes under high operating pressure.

When they did the same experiment on a membrane made from bacterial nanocellulose without the reduced GO, the E. coli bacteria stayed alive.

"This is like 3-D printing with microorganisms," Jun said. "We can add whatever we like to the bacteria nanocellulose during its growth. We looked at it under different pH conditions similar to what we encounter in the environment, and these membranes are much more stable compared to membranes prepared by vacuum filtration or spin-coating of graphene oxide."

While Singamaneni and Jun acknowledge that implementing this process in conventional reverse osmosis systems is taxing, they propose a spiral-wound module system, similar to a roll of towels. It could be equipped with LEDs or a type of nanogenerator that harnesses mechanical energy from the fluid flow to produce light and heat, which would reduce the overall cost.

Credit: 
Washington University in St. Louis

Classic double-slit experiment in a new light

image: An intense beam of high-energy X-ray photons (violet) hits two adjacent iridium atoms (green) in the crystal. This excites electrons in the atoms for a short time. The atoms emit X-ray photons which overlap behind the two iridium atoms (red) and can be analyzed as interference images.

Image: 
Markus Grueninger, University of Cologne

An international research team led by physicists from Collaborative Research Centre 1238, 'Control and Dynamics of Quantum Materials' at the University of Cologne has implemented a new variant of the basic double-slit experiment using resonant inelastic X-ray scattering at the European Synchrotron ESRF in Grenoble. This new variant offers a deeper understanding of the electronic structure of solids. Writing in Science Advances, the research group have now presented their results under the title 'Resonant inelastic x-ray incarnation of Young's double-slit experiment'.

The double-slit experiment is of fundamental importance in physics. More than 200 years ago, Thomas Young diffracted light at two adjacent slits, generating interference patterns (images based on superposition) behind the double slit and thereby demonstrating the wave character of light. In the 20th century, scientists showed that electrons or molecules scattered at a double slit produce the same interference pattern, which contradicts the classical expectation of particle behaviour but can be explained by quantum-mechanical wave-particle duality. The researchers in Cologne, in contrast, investigated an iridium oxide crystal (Ba3CeIr2O9) by means of resonant inelastic X-ray scattering (RIXS).
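
For reference, the classical result can be summarized in one line of textbook optics (this is background physics, not a finding of the new study): two slits separated by a distance d and illuminated with light of wavelength λ produce an intensity pattern at diffraction angle θ of

    I(\theta) \propto \cos^2\!\left( \frac{\pi d \sin\theta}{\lambda} \right),

with bright fringes wherever d sin θ = mλ for integer m. In the RIXS variant, the two iridium atoms of a dimer take the role of the slits, and the scattered X-ray photons take the role of the light.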

The crystal is irradiated with strongly collimated, high-energy X-ray photons. The X-rays are scattered by the iridium atoms in the crystal, which take over the role of the slits in Young's classical experiment. Due to the rapid technical development of RIXS and a skilful choice of crystal structure, the physicists were now able to observe the scattering on two adjacent iridium atoms, a so-called dimer.

'The interference pattern tells us a lot about the scattering object, the dimer double slit', says Professor Markus Grueninger, who heads the research group at the University of Cologne. In contrast to the classical double-slit experiment, the inelastically scattered X-ray photons provide information about the excited states of the dimer, in particular their symmetry, and thus about the dynamic physical properties of the solid.

These RIXS experiments require a modern synchrotron as an extremely brilliant X-ray light source and a sophisticated experimental setup. To specifically excite only the iridium atoms, scientists have to select the very small proportion of photons with the right energy from the broad spectrum of the synchrotron, and the scattered photons are selected even more strictly according to energy and direction of scattering. Only a few photons remain. With the required accuracy, these RIXS experiments are currently only possible at two synchrotrons worldwide, including the ESRF (European Synchrotron Radiation Facility) in Grenoble, where the team from Cologne conducted their experiment.

'With our RIXS experiment, we were able to prove a fundamental theoretical prediction from 1994. This opens a new door for a whole series of further experiments that will allow us to gain a deeper understanding of the properties and functionalities of solids', says Grueninger.

Credit: 
University of Cologne

New technologies enable better-than-ever details on genetically modified plants

image: Researchers used the latest DNA sequencing technologies to study exactly what happens at a molecular level when they insert new genes into plants. From left: Joseph Ecker, Florian Jupe, Todd Michael, Mark Zander and Angeline Rivkin.

Image: 
Salk Institute

LA JOLLA--(January 15, 2019) Salk researchers have mapped the genomes and epigenomes of genetically modified plant lines with the highest resolution ever to reveal exactly what happens at a molecular level when a piece of foreign DNA is inserted. Their findings, published in the journal PLOS Genetics on January 15, 2019, elucidate the routine methods used to modify plants, and offer new ways to more effectively minimize potential off-target effects.

"This was really a starting point for showing that it's possible to use the latest mapping and sequencing technologies to look at the impact of inserting genes into the plant genome," says Howard Hughes Medical Institute Investigator Joseph Ecker, a professor in Salk's Plant Molecular and Cellular Biology Laboratory and head of the Genomic Analysis Laboratory.

When a scientist wants to put a new gene into a plant--for basic research purposes or to boost the health or nutrition of a food crop--they usually rely on Agrobacterium tumefaciens to get the job done. Agrobacterium is the bacterium that causes crown gall tumors, large bulges on the trunks of trees. Decades ago, scientists discovered that when the bacterium infected a tree, it transferred some of its DNA to the tree's genome. Since then, researchers have co-opted this transfer ability of Agrobacterium for their own purposes, using its transfer-DNA (T-DNA) to move a desired gene into a plant.

Recently, DNA sequencing technologies have started to hint that when Agrobacterium T-DNA is used to insert new genes into a plant, it may cause additional changes to the structural and chemical properties of the native DNA.

"Biotech companies spend a lot of time and effort to characterize transgenic plants and disregard candidates with unwanted changes without understanding--from a basic biological perspective--why these changes may have occurred," says Ecker. "Our new approach offers a way to better understand these effects and may help to speed up the process."

"The biggest unknown was whether, and how many copies of, the T-DNA were inserted at the same time as the piece you wanted," says Florian Jupe, a former Salk research associate who now works at Bayer Crop Science. Jupe, Salk Staff Researcher Mark Zander and Research Assistant Angeline Rivkin are co-first authors of the new paper, along with Todd Michael of the J. Craig Venter Institute.

Since the T-DNA approach can lead to an integration of many copies of a desired gene into a plant, it can be difficult to study the final result with standard DNA sequencing, as most technologies struggle to sequence highly repetitive stretches of DNA. But Ecker and his colleagues turned to a new combination of approaches--including optical mapping and nanopore sequencing--to look at these long stretches in high resolution. They applied the technologies to four randomly selected T-DNA lines of Arabidopsis thaliana, a commonly used model plant in biology. (These plants are derived from a large population of T-DNA insertional mutants that were created using an Arabidopsis transformation method, called floral dip, to study gene function.)
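
One way to picture why long reads matter here: a single nanopore read can span an entire repetitive insertion, so scanning each read for the T-DNA border sequence exposes both junctions at once. In the Python sketch below, the 10-bp motif is a hypothetical placeholder, not the real Agrobacterium border repeat.

    TDNA_BORDER = "GGCAGGATAT"  # hypothetical stand-in for the T-DNA border motif

    def border_hits(read: str, motif: str = TDNA_BORDER):
        """Return every position where the border motif occurs in one long read."""
        hits, pos = [], read.find(motif)
        while pos != -1:
            hits.append(pos)
            pos = read.find(motif, pos + 1)
        return hits

    # A read long enough to span the whole insertion shows both junctions:
    read = "ACGT" * 5 + TDNA_BORDER + "TTAACC" * 4 + TDNA_BORDER + "ACGT" * 5
    print(border_hits(read))  # -> [20, 54]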

Optical mapping revealed that the plants had between one and seven distinct insertions or rearrangements in their genomes, ranging in size by almost tenfold. Nanopore sequencing and reconstruction of the genomes of two lines confirmed the insertions to single-letter resolution, including whole segments of DNA that had been exchanged--or translocated--between chromosomes in one of the selected lines. Gene insertions themselves showed a variety of patterns, with the inserted DNA fragment sometimes scrambled, inverted or even silenced.

"This study was not possible even a year ago," says Michael. "Nanopore sequencing, which some refer to as the 'holy grail' of DNA sequencing, has revolutionized the reading of even the most complex regions of the genome that were completely inaccessible and unknown until now."

Finally, the researchers found additional changes when they examined histones, the proteins that package DNA into structural units. Chemical modifications of these histones mediate whether a gene can be accessed for use by a cell (a level of regulation called epigenetics). Depending on where the T-DNA was integrated, certain nearby histone modifications appeared or disappeared, potentially changing the regulation or activation of other nearby genes.

"Now we have the first high-resolution insights on how T-DNA insertions can shape the local epigenome environment," says Zander.

In an ideal world, the researchers say, T-DNA would insert one single, functional copy of a desired gene with no nearby side effects on a plant's genome. While their findings show this is rarely the case in Arabidopsis, their methods offer a path to a better understanding and surveillance of the effects.

"This technology is exciting because it gives us a much clearer look at what's going on in some of these transgenic Arabidopsis lines," says Rivkin.

"With Arabidopsis, it's relatively easy because it has such a small genome, but because of continued improvements in DNA sequencing technology, we expect this approach will also be possible for crop plants," adds Ecker, who holds the Salk International Council Chair in Genetics. "Current methods require screening of hundreds of transgenic lines to find good performing ones, such as those without extra insertions, so this technology could provide a more efficient approach."

Credit: 
Salk Institute

Coming soon: A blood test for Alzheimer's disease?

People with symptoms of Alzheimer's disease (AD), such as cognitive difficulties, behavior changes and mood swings, may wait months or even years to get a definitive diagnosis. That's because doctors lack a simple, accurate and inexpensive test for it. But according to an article in Chemical & Engineering News (C&EN), the weekly newsmagazine of the American Chemical Society, researchers are getting much closer to developing the elusive blood test for AD.

About 5.5 million Americans are living with AD, according to the National Institute on Aging. Most do not seek treatment until their symptoms are well advanced, freelance contributor Jyoti Madhusoodanan writes. By then, substantial and irreversible damage to the brain has already occurred. Current tests for AD, such as positron emission tomography (PET) and lumbar puncture, are invasive, cost thousands of dollars and aren't covered by most health insurance plans in the U.S. For nearly 20 years, researchers have been trying to develop a blood test for AD, but they've been stymied by the low amounts of potential biomarkers in blood.

Recently, new biomarkers and assays have moved a reliable blood test closer to the clinic. For example, instead of measuring the total amount of amyloid -- the protein that forms clumps in the brains of AD patients -- in blood, researchers can more accurately diagnose AD by looking at ratios of different peptides that form when amyloid breaks down. Sensitive new assays can detect smaller amounts of these peptides in blood. These and other developments have made many researchers optimistic that a blood test for AD will be available within the next five years. Such a test would not only aid in diagnosis, but might also help in the search for better AD treatments because it could identify participants for clinical trials.

Credit: 
American Chemical Society

Frailty could make people more susceptible to dementia

New research published in The Lancet Neurology journal suggests that frailty makes older adults more susceptible to Alzheimer's dementia, and moderates the effects of dementia-related brain changes on dementia symptoms. The findings suggest that frailty should be considered in clinical care and management of Alzheimer's dementia.

The study found that older adults (59 years and older) with higher levels of frailty were more likely to have both Alzheimer's disease-related brain changes and symptoms of dementia, whilst others with substantial brain changes, but who were not frail, showed fewer clinical symptoms.

"By reducing an individual's physiological reserve, frailty could trigger the clinical expression of dementia when it might remain asymptomatic in someone who is not frail," explains Professor Kenneth Rockwood from Nova Scotia Health Authority and Dalhousie University, Canada, who led the study. "This indicates that a 'frail brain' might be more susceptible to neurological problems like dementia as it is less able to cope with the pathological burden." [1]

"This is an enormous step in the right direction for Alzheimer's research. Our findings suggest that the expression of dementia symptoms results from several causes, and Alzheimer's disease-related brain changes are likely to be only one factor in a whole cascade of events that lead to clinical symptoms. Understanding how individual risk factors work together to give rise to late-life dementia is likely to offer a new way to develop targeted treatment options." [1]

The findings support the idea that late-life dementia (and particularly Alzheimer's disease) is a complex phenomenon rather than a single disease entity marked by genetic risk or single protein abnormalities in the brain. However, the authors caution that this study is a cross-sectional comparison of pathology data from a single database that only includes adults living in Illinois, USA.

Previous research has shown that some people with Alzheimer's disease-related brain changes (eg, amyloid deposition) can have few characteristic symptoms of the disease (cognitive and functional decline), whereas others with few brain changes may have symptoms. These discrepancies suggest that some hidden factors might affect the relationship between Alzheimer's disease-related brain changes and Alzheimer's dementia.

Most people who develop Alzheimer's dementia are older than 65 years and have several other health problems. Frailty - a condition linked with reduced physiological reserve and increased vulnerability to other ailments - is associated with age and higher rates of cognitive deficit and dementia, but little research has explored how these conditions might be related.

In this study, researchers used modelling to assess relationships between frailty, Alzheimer's disease-related brain changes, and Alzheimer's dementia among 456 participants of the Rush Memory and Ageing Project (MAP) who had either no dementia or Alzheimer's dementia, and who subsequently died and underwent brain autopsy. MAP is a longitudinal clinical-pathological study of older adults living in Illinois, USA, which began in 1997 [2].

Every year participants received neuropsychological and clinical evaluations, which included detailed cognitive testing and neurological examinations. Clinical diagnosis of Alzheimer's dementia was based on clinician consensus, with just over half (53%; 242) the participants given a diagnosis of possible or probable Alzheimer's dementia at their last clinical assessment. Brain plaques and tangles were measured after death to quantify Alzheimer's disease-related changes. The researchers also developed a frailty index using a combination of 41 components of health status (eg, fatigue, joint and heart problems, osteoporosis, mobility, meal preparation) obtained at each clinical evaluation.
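
The index construction follows the deficit-accumulation approach long associated with Rockwood's group: each health item is scored between 0 (deficit absent) and 1 (deficit fully present), and the frailty index is the mean score across all items. A minimal sketch in Python, with illustrative item names and scores:

    # Deficit-accumulation frailty index: mean score across health items.
    deficits = {
        "fatigue": 1.0,
        "joint_problems": 0.5,        # partial deficits can score fractionally
        "osteoporosis": 0.0,
        "mobility_impairment": 1.0,
        # ... the study's index combines 41 such items
    }

    frailty_index = sum(deficits.values()) / len(deficits)
    print(round(frailty_index, 2))  # 0 = robust; values toward 1 = more frail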

Overall, 35 participants (8%) had substantial Alzheimer's disease-related brain changes without having been diagnosed with dementia, and 50 (11%) had Alzheimer's dementia but few disease-related brain changes.

The analysis revealed that frailty and Alzheimer's disease-related brain changes independently contribute to dementia status, after adjusting for age, sex, and education.

The researchers also found a significant association between frailty and Alzheimer's disease-related brain changes after excluding activities of daily living from the frailty index and adjusting for other risk factors such as stroke, heart failure, high blood pressure, and diabetes.

"While frailty is likely to reduce the threshold for Alzheimer's disease-related brain changes to cause cognitive decline, it probably also contributes to other mechanisms in the body that give rise to dementia, weakening the direct link between Alzheimer's disease-related brain changes and dementia," says Rockwood. [1]

"While more research is needed, given that frailty is potentially reversible, it is possible that helping people to maintain function and independence in later life could reduce both dementia risk and the severity of debilitating symptoms common in this disease." [1]

The authors say that future studies should examine longitudinal relationships between frailty, cognition, and biomarkers of Alzheimer's dementia to establish causation. They also note several limitations, including that a single definition of frailty has not been well established - some definitions are more biological, others are more physical, whilst some combine physical, biological, psychological, and social risk factors. They also note that frailty measurements were taken close to death and might reflect terminal decline, which could result in the relationship between Alzheimer's disease-related brain changes and dementia status among people with high levels of frailty being overestimated.

Writing in a linked Comment, Dr Francesco Panza from the University of Bari Aldo Moro, Italy, discusses how understanding frailty could help predict and prevent dementia. He concludes: "In light of current knowledge on the cognitive frailty phenotype, secondary preventive strategies for cognitive impairment and physical frailty can be suggested. For instance, individualised multidomain interventions can target physical, nutritional, cognitive, and psychological domains that might delay the progression to overt dementia and secondary occurrence of adverse health-related outcomes, such as disability, hospitalisation, and mortality."

Credit: 
The Lancet