Tech

iPhone plus nanoscale porous silicon equals cheap, simple home diagnostics

image: Silicon chips similar to the ones that would be used in the iPhone diagnostic system.

Image: 
Heidi Hall/Vanderbilt University

The simplest home medical tests might look like a deck of various silicon chips coated in special film, one that could detect drugs in the blood, another for proteins in the urine indicating infection, another for bacteria in water and the like. Add the bodily fluid you want to test, take a picture with your smart phone, and a special app lets you know if there's a problem or not.

That's what electrical engineer Sharon Weiss, Cornelius Vanderbilt Professor of Engineering at Vanderbilt University, and her students developed in her lab, combining their research on low-cost, nanostructured thin films with a device most American adults already own. "The novelty lies in the simplicity of the basic idea, and the only costly component is the smart phone," Weiss said.

"Most people are familiar with silicon as being the material inside your computer, but it has endless uses," she said. "With our nanoscale porous silicon, we've created these nanoscale holes that are a thousand times smaller than your hair. Those selectively capture molecules when pre-treated with the appropriate surface coating, darkening the silicon, which the app detects."

Similar technology under development relies on expensive hardware that complements the smart phone. Weiss' system uses the phone's flash as a light source, and the team plans to develop an app that could handle all the data processing necessary to confirm that the film darkened when fluid was added. What's more, in the future, such a phone could replace a mass spectrometry system that costs thousands of dollars. The Transportation Security Administration owns hundreds of those systems at airports across the country, where they're used to detect gunpowder on hand swabs.
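The core detection step the article describes, checking whether the silicon film darkened after fluid was added, can be pictured as a simple brightness comparison between two camera frames. This is an illustrative sketch, not the team's actual app: the threshold, patch size and intensity values below are invented for the example.

```python
import numpy as np

def detect_darkening(before, after, threshold=0.15):
    """Flag a positive test when the mean intensity of the chip
    region drops by more than `threshold` (fractional change)."""
    drop = (before.mean() - after.mean()) / before.mean()
    return drop > threshold, drop

# Synthetic 8-bit grayscale patches of the chip region, photographed
# under the phone's flash before and after adding fluid (values invented)
rng = np.random.default_rng(0)
before = rng.normal(180, 5, (64, 64)).clip(0, 255)
after = rng.normal(120, 5, (64, 64)).clip(0, 255)

positive, drop = detect_darkening(before, after)
```

In a real app the two patches would come from camera frames aligned to the chip, and the threshold would be calibrated for each assay's surface coating.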

Other home tests rely on a color change, which is a separate chemical reaction that introduces more room for error, Weiss said.

Weiss, Ph.D. student Tengfei Cao, and their team used a biotin-streptavidin protein assay and an iPhone SE, model A1662, to test their silicon films and found the accuracy to be similar to that of benchtop measurement systems. They also used a 3D printed box to stabilize the phone and get standardized measurements for the paper, but Weiss said that wouldn't be necessary if further research and development led to a commercialized version.

Credit: 
Vanderbilt University

New look at old data leads to cleaner engines

image: Sandia National Laboratories researcher Nils Hansen, shown here assembling equipment in the Combustion Research Facility, says new insights on how to control the chemistry of ignition behavior and pollutant formation will lead to the design of new fuels and improved combustion strategies.

Image: 
Dino Vournas

LIVERMORE, Calif. -- New insights about how to understand and ultimately control the chemistry of ignition behavior and pollutant formation have been discovered in research led by Sandia National Laboratories. The discovery eventually will lead to cleaner, more efficient internal combustion engines.

"Our findings will allow the design of new fuels and improved combustion strategies," said Nils Hansen, Sandia researcher and lead author of the research. "Making combustion cleaner and more efficient will have a huge impact, reducing energy use around the globe."

The work, which focuses on the chemical science of low-pressure flame measurements, is featured in the Proceedings of the Combustion Institute and was selected as a distinguished paper in Reaction Kinetics for the 37th International Symposium on Combustion. Authors include Hansen, Xiaoyu He, former Sandia intern Rachel Griggs and former Sandia postdoctoral appointee Kai Moshammer, who is now at the Physikalisch-Technische Bundesanstalt in Germany. The research was funded by the Department of Energy's Office of Science.

Creating a massive dataset of flames and fuels

The team combined the output from carefully controlled measurements on a wide range of fuels into a single categorized and annotated dataset. Correlations among the 55 individual flames involving 30 different fuels were then used to reduce uncertainty, identify inconsistent data and disentangle the effects of the fuel structure on chemical combustion pathways that lead to harmful pollutants. An initial analysis considered relationships among peak concentrations of chemical intermediates that play a role in molecular weight growth and eventual soot formation.
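One way to picture this cross-flame analysis is as an outlier check on correlated intermediate concentrations: if peak concentrations of related species track each other across flames, a flame that breaks the trend is a candidate inconsistent measurement. The sketch below uses invented mole fractions, not the paper's data.

```python
import numpy as np

# Invented peak mole fractions of two correlated intermediates
# measured across eight flames; flame_5 carries an inconsistent value
flames = ["flame_%d" % i for i in range(8)]
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]) * 1e-3
y = 0.6 * x + 1e-5          # the expected cross-flame trend
y_measured = y.copy()
y_measured[5] *= 2.5        # one measurement falls off the trend

# Fit the trend across all flames and flag large residuals
coeffs = np.polyfit(x, y_measured, 1)
residuals = y_measured - np.polyval(coeffs, x)
z = np.abs(residuals) / residuals.std()
inconsistent = [f for f, zi in zip(flames, z) if zi > 2]
```

With ten times as many flames and thousands of species measurements, the same idea yields the validation targets the paper describes.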

Hansen said that, to his knowledge, this is the first time that researchers have looked at these possibilities. By identifying inconsistencies, the new methods ultimately should lead to better models for understanding combustion. Typically, well-controlled experiments help validate computer models to understand the combustion process and to develop new combustion strategies.

Data from low-pressure premixed flames is typically used to validate chemical kinetic mechanisms in combustion. These detailed mechanisms then provide the basis for understanding the formation of pollutants and predicting behavior for combustion applications.

Historically, research papers reported data from a single flame or a few flames, along with one new mechanism for a specific fuel, and those many mechanisms were not usually cross-validated against other data and mechanisms. The approach pioneered by Hansen's team instead pools measurements from a large number of flames, making such cross-validation possible.

Hansen compares the discovery to the unearthing of an old artifact. Very few conclusions can be drawn from a single artifact. However, piecing together thousands of similar artifacts creates a more complete historical picture.

"Our work reveals information typically hidden in the ensemble of low-pressure flame data," Hansen said. "For example, useful targets for model validation can be gleaned from a database with more than 30,000 data points."

Analyzing flames

After analyzing the flames, researchers found that correlated properties provide new validation targets accessible only when examining the chemical structures of a wide set of low-pressure flames.

Hansen said comprehensive chemical-kinetic models for combustion systems increasingly are used as the basis for engineering models that predict fuel performance and emissions for combustor design. These models are often ambiguous due to the large set of parameters used to inform them, but synchrotron-based, single-photon ionization mass spectrometry measurements, pioneered in DOE's Gas Phase Chemical Physics program, have created an unprecedented surge of detailed chemical data.

Long-term benefits

The work eventually will help to assemble more accurate chemical mechanisms for describing combustion processes, Hansen said.

"Our goal is to better understand and ultimately control the chemistry of ignition behavior and pollutant formation," he said. "Subsequently, this will lead to clean and efficient internal combustion engines."

Hansen said that his team's findings unlock an entirely new avenue for research at Sandia's Combustion Research Facility.

"Applying data science and machine-learning tools extracts even more information from large datasets," he said. "Our work has opened the gate wide to show that data science can be applied to combustion research."

Credit: 
DOE/Sandia National Laboratories

Sex, lice and videotape

image: Lice hide in tiny gaps inside feathers, called the interbarb space, to escape deadly beaks. If the lice are too big, the birds quickly snatch them up. But bigger female lice lay more eggs. Evolutionary winners occupy a sweet spot: They're small enough to hide, but big enough to outbreed their neighbors.

Image: 
Scott Villa

A few years ago, Scott Villa of Emory University had a problem. Then a graduate student at the University of Utah, he was stumped with an issue never addressed in school: How does one film lice having sex?

Villa and University of Utah biologists had demonstrated real-time adaptation in their lab that triggered reproductive isolation in just four years, mimicking millions of years of evolution. They began with a single population of parasitic feather lice, split the population in two and transferred them onto different-sized hosts--pigeons with small feathers, and pigeons with large feathers. The pigeons preened at the lice, and the louse populations adapted quickly by evolving differences in body size. After 60 generations, the biologists saw larger lice on larger pigeons and smaller lice on smaller pigeons. When they paired the different-sized male and female lice together, the females laid zero eggs. The divergent body sizes were likely preventing the lice from physically mating with each other, demonstrating the beginning stages of a new species.

But the researchers needed to know for sure. They put the lice on a plate of pigeon feathers to set the mood, primed the camera and waited. But the lice had stage fright.

"There was a lot of trial and error. No one has filmed lice mating before, I guarantee you that," said Villa.

They were flummoxed until an undergraduate researcher brought a heating pad into the lab for her sore back. It gave Villa an idea. Turns out that for feather lice, a hot pad tuned to a bird's core temperature is where the magic happens.

"What we saw was amazing, the male lice physically could not mate with the females, so we think this is how new species start to form," said Villa. "We already knew that in the wild, larger species of birds have larger species of lice. What we didn't know, and what came out of this study, is that because of the way that the lice mate, adapting to a new host by changing size has this massive automatic effect on reproduction."

The study experimentally demonstrates ecological speciation, a concept first championed by Charles Darwin. Different populations of the same species locally adapt to their environments, and those adaptations can cause reproductive isolation and eventually, lead to the origin of a new species.

"People study this in all sorts of systems, everything from fruit flies to stickleback fish to walking sticks. But they are always taking recently evolved species or populations that have already diverged and trying to understand why they're no longer reproductively compatible," said Dale Clayton, professor of biology and co-author of the study. "Very few have taken a single population and evolved it under natural conditions into two different populations that cannot reproduce. That's the new piece of this."

The paper was published in the Proceedings of the National Academy of Sciences of the United States of America on June 10, 2019.

The sweet spot

Different-sized pigeons have different-sized lice; in most cases, the bigger the pigeon, the bigger their lice. In 1999, Clayton led a study that found that birds' preening drives this pattern.

Feathers consist of ridges, called barbs, that create tiny gaps known as the interbarb space. It's the pigeon's blind spot--lice wedge in their long, slender bodies to escape deadly beaks. When big lice crawl on smaller feathers, they stick out of the space and birds pick them off. So it's good to be small, right? Not quite. In 2018, this same research group found that bigger female lice lay more eggs. Evolutionary winners fall into a sweet spot--they're just small enough to squeeze into the interbarb space, but big enough to outbreed smaller neighbors.

"There's constant selective pressure to be as big as possible to pump out as many eggs as possible. But preening puts the breaks on getting too big. There's a sweet spot," said Villa. "If you put lice on different sized birds, the sweet spot shifts and the lice evolve optimal body sizes after a few generations."

The experimental change in size is heritable--the biologists showed that big parents had big offspring and small parents had small offspring, regardless of the size of the birds on which they were living.

The parasitic lice populations adapted quickly. "Significant size differences appeared after just 18 months," said co-author Sarah Bush, associate professor of biology at the U. This pattern informs more than just this system.

"The idea is that bigger hosts have bigger parasites. That's true for trees with parasitic insects, for fleas on animals, for ticks on mammals--it's true for life," Bush continued. "It's a bigger question than just this one particular system. It happens everywhere. Part of what we're doing is trying to figure out that pattern."

Lice, camera, no action!

The researchers are the first to capture how feather lice mate. By understanding the mechanics of lice sex, they saw what works, and what fails. In short--size matters.

Female lice are naturally about 13% bigger than male lice. This dimorphism between the two sexes is critical for reproduction. Males have thick antennae to cling to females during copulation. They approach the female from behind, slide underneath her and curl the tip of their abdomen while holding her thorax. If the male is too small, he may struggle to reach the female where he needs to. If he's too large, he'll overshoot the female. That's exactly what the researchers saw.

"There's a Goldilocks Zone. The males and females have to be just the right size for each other. Pairs of lice where dimorphism falls outside of that zone suffer massive reproductive consequences," said Villa.

They found that typically sized lice copulated the longest and laid the most eggs. Pairs of lice with dimorphism outside of the Goldilocks Zone copulated for shorter amounts of time and laid zero eggs. They think this is because the males either physically fail to inseminate the females or can't copulate long enough to fertilize their eggs. Their experiments tested this with lice on feathers and a heat pad on camera, and on pigeons themselves. The results were the same--pairs with sizes in the Goldilocks Zone had the most offspring.

The researchers think that the lice populations evolved reproductive isolation so quickly because body size is a 'magic' trait that is essential for both survival and reproduction. If there's a selective pressure on survival, such as preening, then reproductive isolation will automatically follow.

"The idea of a single trait governing both survival and reproduction has been known for some time. However, pinning down how these multipurpose traits actually drive speciation has been challenging. What makes this paper so interesting is that we actually identified how these "magic traits" work in real time. And just as theory predicts, selection on these traits can generate reproductive isolation in the evolutionary blink of an eye. Our study complements a lot of fantastic work on ecological speciation and adds our greater understanding of how new species actually form," said Villa.

Credit: 
University of Utah

A bubbly new way to detect the magnetic fields of nanometer-scale particles

video: A tiny magnetic rod is placed over a strip of magnetic film. This nanorod has a particular magnetic orientation, and a fringe field that interacts with the film, creating a bubble-shaped area where the direction of magnetism is reversed. By applying a second magnetic field, researchers can change the magnetic orientation of the nanorod, causing the magnetic bubble to shift from one end of the rod to the other. Measuring the location of the bubble can give scientists insight into the geometry and magnetic properties of the nanorod, and reveal if it is alone or clustered with other nanoparticles.

Image: 
S. Kelley/NIST

As if they were bubbles expanding in a just-opened bottle of champagne, tiny circular regions of magnetism can be rapidly enlarged to provide a precise method of measuring the magnetic properties of nanoparticles.

The technique, uncorked by researchers at the National Institute of Standards and Technology (NIST) and their collaborators, provides a deeper understanding of the magnetic behavior of nanoparticles. Because the method is fast, economical and does not require special conditions -- measurements can occur at room temperature and under atmospheric pressure, or even in liquids -- it provides manufacturers with a practical way to measure and improve their control of the properties of magnetic nanoparticles for a host of medical and environmental applications.

Magnetic nanoparticles can serve as tiny actuators, magnetically pushing and pulling other small objects. Relying on this property, scientists have employed the nanoparticles to clean up chemical spills and assemble and operate nanorobotic systems. Magnetic nanoparticles even have the potential to treat cancer -- rapidly reversing the magnetic field of nanoparticles injected into a tumor generates enough heat to kill cancer cells.

Individual magnetic nanoparticles generate magnetic fields like the north and south poles of familiar bar magnets. These fields create magnetic bubbles -- flat circles with initial diameters less than 100 nanometers (billionths of a meter) -- on the surface of a magnetically sensitive film developed at NIST. The bubbles surround the nanoparticle pole that points opposite to the direction of the magnetic field of the film. Although they encode information about the magnetic orientation of the nanoparticles, the tiny bubbles are not easily detected with an optical microscope.

However, like bubbles in champagne, the magnetic bubbles can be expanded to hundreds of times their initial diameter. By applying a small external magnetic field, the team enlarged the diameter of the bubbles to tens of micrometers (millionths of a meter) -- big enough to see with an optical microscope. The brighter signal of the enlarged bubbles rapidly revealed the magnetic orientation of individual nanoparticles.

After determining the initial magnetic orientation of the nanoparticles, the researchers used the enlarged bubbles to track the changes in that orientation as they applied an external magnetic field. Recording the strength of the external field required to flip the north and south magnetic poles of the nanoparticles revealed the magnitude of the coercive field, a fundamental measure of the magnetic stability of the nanoparticles. This important property had previously been challenging to measure for individual nanoparticles.

Samuel M. Stavis of NIST and Andrew L. Balk, who conducted most of his research at the Los Alamos National Laboratory and NIST, along with colleagues at NIST and the Johns Hopkins University, described their findings in a recent issue of Physical Review Applied.

The team examined two types of magnetic nanoparticles -- rod-shaped particles made of a nickel-iron alloy and irregularly shaped particle clusters made of iron oxide. The applied magnetic field that expanded the bubbles plays a similar role to that of the pressure in a bottle of champagne, Balk said. Under high pressure, when the champagne bottle is corked, the bubbles are essentially nonexistent, just as the magnetic bubbles on the film are too small to be detected by an optical microscope when no external magnetic field is applied. When the cork is popped and the pressure is lowered, the champagne bubbles expand, just as the external magnetic field enlarged the magnetic bubbles.

Each magnetic bubble reveals the orientation of the magnetic field of a nanoparticle at the instant that the bubble formed. To study how the orientation varied with time, the researchers generated thousands of new bubbles every second. In this way, the researchers measured changes in the magnetic orientation of the nanoparticles at the moment that they occurred.

To enhance the sensitivity of the technique, the researchers tuned the magnetic properties of the film. In particular, the team adjusted the Dzyaloshinskii-Moriya (DMI) interaction, a quantum-mechanical phenomenon that imposes a twist in the bubbles within the film. This twist reduced the energy needed to form a bubble, providing the high sensitivity necessary to measure the field of the smallest magnetic particles in the study.

Other methods to measure magnetic nanoparticles, which require cooling with liquid nitrogen, working in a vacuum chamber, or measuring the field at only a single location, do not allow such rapid determination of nanoscale magnetic fields. With the new technique, the team rapidly imaged the magnetic fields from the particles over a large area at room temperature. The improvement in speed, convenience and flexibility enables new experiments in which researchers can monitor the behavior of magnetic nanoparticles in real time, such as during the assembly and operation of magnetic microsystems with many parts.

The study is the most recent example of an ongoing effort at NIST to make devices that improve the measurement capabilities of optical microscopes, instruments available in most labs, said Stavis. This enables rapid measurement of the properties of single nanoparticles, both for fundamental research and for nanoparticle manufacturing, he added.

Credit: 
National Institute of Standards and Technology (NIST)

Past climate change pushed birds from the northern hemisphere to the tropics

image: Great Blue Turaco (Corythaeola cristata), Entebbe Botanical Gardens, Uganda.

Image: 
Daniel J. Field

Researchers have shown how millions of years of climate change affected the range and habitat of modern birds, suggesting that many groups of tropical birds may be relatively recent arrivals in their equatorial homes.

The researchers, from the Universities of Cambridge and Oxford, applied climate and ecological modelling to illustrate how the distribution of major bird groups is linked to climate change over millions of years. However, while past climate change often occurred slowly enough to allow species to adapt or shift habitats, current rates of climate change may be too fast for many species, putting them at risk of extinction. The results are reported in Proceedings of the National Academy of Sciences.

"Palaeontologists have documented long-term links between climate and the geographic distributions of major bird groups, but the computer models needed to quantify this link had not been applied to this question until now," said Dr Daniel Field from Cambridge's Department of Earth Sciences, the paper's co-lead author.

For the current study, the researchers looked at ten bird groups currently limited to the tropics, predominantly in areas that were once part of the ancient supercontinent of Gondwana (Africa, South America and Australasia). However, early fossil representatives of each of these groups have been found on northern continents, well outside their current ranges.

For example, one such group, the turacos ('banana eaters'), are fruit-eating birds found only in the forests and savannahs of sub-Saharan Africa, yet fossils of an early turaco relative have been found in modern-day Wyoming, in the northern United States.

Today, Wyoming is much too cold for turacos for most of the year, but during the early Palaeogene period, which began with the extinction of non-avian dinosaurs 66 million years ago, the Earth was much warmer. Over time, global climates have cooled considerably, and the ancestors of modern turacos gradually shifted their range to more suitable areas.

"We modelled the habitable area for each group of birds and found that their estimated habitable ranges in the past were very different from their geographic distributions today, in all cases shifting towards the equator over geological time," said Dr Erin Saupe from the University of Oxford, the paper's other lead author.

Saupe, Field and their collaborators mapped information such as average temperature and rainfall and linked it to where each of the bird groups is found today. They used this climatic information to build an 'ecological niche model' to map suitable and unsuitable regions for each bird group. They then projected these ecological niche models onto palaeoclimate reconstructions to map potentially suitable habitats over millions of years.
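A minimal form of such an ecological niche model is a 'climate envelope': take the range of temperature and rainfall where the group occurs today, and call any grid cell inside that range suitable. The numbers below are invented for illustration; the study's models are far more sophisticated.

```python
import numpy as np

# Invented present-day occurrence climates for one tropical bird group
occ_temp = np.array([24.0, 26.5, 25.0, 27.0, 23.5])  # degrees C
occ_rain = np.array([1500, 1800, 1600, 2000, 1400])  # mm per year

# Climate-envelope niche model: bounds of the observed climate
t_lo, t_hi = occ_temp.min(), occ_temp.max()
r_lo, r_hi = occ_rain.min(), occ_rain.max()

def suitable(temp, rain):
    """True if a grid cell's climate falls inside the envelope."""
    return bool(t_lo <= temp <= t_hi and r_lo <= rain <= r_hi)

# Project onto two (invented) palaeoclimate grid cells for Wyoming
eocene_wyoming = suitable(25.0, 1700)  # warm, wet early Palaeogene
modern_wyoming = suitable(7.0, 400)    # cold, dry today
```

Projecting the same envelope onto reconstructions for successive geological intervals traces how the suitable region shifts toward the equator as the climate cools.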

The researchers were able to predict the geographic occurrences of fossil representatives of these groups at different points in Earth's history. These fossils provide direct evidence that these groups were formerly distributed in very different parts of the world to where they are presently found.

"We've illustrated the extent to which suitable climate has dictated where these groups of animals were in the past, and where they are now," said Field. "Depending on the predictions of climate change forecasts, this approach may also allow us to estimate where they might end up in the future."

"Many of these groups don't contain a large number of living species, but each lineage represents millions of years of unique evolutionary history," said Saupe. "In the past, climate change happened slowly enough that groups were able to track suitable habitats as these moved around the globe, but now that climate change is occurring at a much faster rate, it could lead to entire branches of the tree of life going extinct in the near future."

Credit: 
University of Cambridge

Switchgrass hybrid yields insights into plant evolution

image: Scientists have identified specific parts of genetic code within switchgrass that could contribute to larger switchgrass harvests while reducing potential crop weaknesses.

Image: 
Courtesy of MSU

Switchgrass is attractive as a potential bioenergy crop because it can grow for years without having to be replanted. Requiring less fertilizer than typical annual crops like corn, switchgrass can keep more nitrogen, phosphorus and carbon in the soil and out of our air and waterways. But, unlike corn, breeding of switchgrass for optimal traits is still in its early stages.

Now, in a collaboration with the U.S. Department of Energy's, or DOE's, Joint Genome Institute, a DOE Office of Science User Facility, and the University of Texas at Austin, Great Lakes Bioenergy Research Center researchers at Michigan State University have identified specific parts of genetic code within the plant that could contribute to larger switchgrass harvests while reducing potential crop weaknesses.

David Lowry, assistant professor of plant biology at MSU, and Thomas Juenger, professor of integrative biology at UT Austin, led a team that identified genes that boost switchgrass growth across a wide range of climates and showed that plants can be bred to carry these genetic markers.

Additionally, the researchers found that some adaptive traits in switchgrass can often be improved without hurting plant health in other ways. The results of this study were published this week in the Proceedings of the National Academy of Sciences.

Like many domesticated plants, switchgrass has multiple cultivars adapted to different geographic regions. Northern upland switchgrass typically lives on ridgelines and in prairie areas in states such as South Dakota, Michigan, and Wisconsin. This breed does not mind the cold and can survive harsh winters, as is the case with many things from the north.

However, northern switchgrass does not grow as large or as quickly as the southern lowland ecotype that thrives in wet, marshy areas from Missouri and Kansas down to Texas. The southern cultivar also tends to be more pest-resistant and can better withstand extreme conditions such as drought and flooding. The southern ecotype, on the other hand, does not tolerate cold winters, often dying off when the weather turns too chilly.

To try and get the best of both worlds, Lowry, Juenger and their partners devised an experiment across the central United States - from Texas to South Dakota - to determine what loci cause differences between the varieties adapted to cold northern areas versus hot southern areas.

"We're interested in how often loci that improve performance in some sites (say in the south) come at a cost or tradeoff for performance at other locations (say in the north)," Juenger said. "The 10 sites were mainly chosen to involve people we'd met or had interacted with in the switchgrass community over the past few years - and cover the latitudinal distribution of the species. We've really tried to build the field plantings as a community resource."

First, they created a crossbreed of the two varieties, then planted hundreds of the same switchgrass hybrid in 10 locations covering the entire geographical region. Then, comparing growth of the hybrids to the parent cultivars at each site, the researchers isolated the genetic cause of beneficial outcomes using an approach called quantitative trait locus, or QTL, mapping.
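At its simplest, the QTL-mapping step amounts to asking, marker by marker, whether hybrids carrying one parent's allele differ in a trait from hybrids carrying the other's. The simulated numbers below (allele effect, sample size, noise) are invented to illustrate that single-marker test, not taken from the study.

```python
import numpy as np

# Simulated hybrid population: genotype at one marker (0 = northern
# allele, 1 = southern allele) and biomass per plant (values invented)
rng = np.random.default_rng(1)
genotype = rng.integers(0, 2, 200)
biomass = 50 + 8 * genotype + rng.normal(0, 3, 200)  # true allele effect: +8

# Single-marker QTL scan step: mean trait difference between classes
effect = biomass[genotype == 1].mean() - biomass[genotype == 0].mean()
```

A genome-wide scan repeats this comparison at many markers, corrects for multiple testing, and, done at each of the 10 field sites, shows whether an allele's benefit at one site comes with a cost at another.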

The southern variety is generally a larger grass, so researchers have often thought its loci would be solely responsible for increasing the size of hybrids. Surprisingly, the experiment revealed that northern switchgrass genes play a role in increased biomass as well.

"This means if we can combine those genomes together through hybridization, we can actually get a much higher-yielding plant," said Lowry.

Genetic adaptation often comes with tradeoffs. For example, traits that confer winter hardiness in the north might divert resources from growth. Similarly, the prevalence of growth genes in southern-adapted plants might reduce their cold tolerance.

In this study, advantageous characteristics for growth were expressed in one region of the country without others being diminished. "That tells us a little bit about how adaptation works and how it could provide a benefit for agriculture," said Lowry. "If we can select on these loci, we can get a benefit without much of a cost."

Because they designed the study to cover such a broad geographical area, the group could isolate these win/win genetic scenarios.

"Usually there's only two field sites involved in these types of experiments, but we had 10," Lowry said. "So we had a much better idea that there were these loci that have much lower costs associated with them."

"We're getting new insights into how plants adjust in different environments," Lowry said. "Understanding how switchgrass adapts to the northern U.S. versus the southern U.S. will allow us to use that knowledge to improve it as a bioenergy crop."

Credit: 
Michigan State University

New study shows how climate change could affect impact of roundworms on grasslands

image: A team of soil and plant ecologists simulated low and high precipitation at grassland research sites in Colorado, Kansas and New Mexico over two years. The site pictured is the Central Plains Experimental Range in northern Colorado.

Image: 
Andre Franco/Colorado State University

Soil food webs play a key role in supporting grassland ecosystems, which cover about one-quarter of the land on Earth. Climate change poses a threat to these environments, partly because of the uncertainty of extremes in rainfall, which is projected to increase.

To learn more about the effects of these extreme events, a team of soil and plant ecologists, led by Colorado State University faculty, studied nematodes, commonly known as roundworms, that play a key role in carbon and nutrient cycling and decomposition in soil.

Simulating low and high precipitation at grassland research sites in Colorado, Kansas and New Mexico over two years, the researchers found that, under extreme drought conditions, predator nematodes significantly decreased, which allowed root-feeding nematodes to increase. Typically, these predator roundworms feed on the root-feeding species.

Scientists said the findings may have serious implications for grassland productivity under climate change because under drought conditions, having fewer predator nematodes allows root herbivore populations to explode, which could decrease grass root production.

CSU Research Scientist André Franco and University Distinguished Professor Diana Wall led the study, teaming up with Arizona State University Professor Osvaldo Sala and Laureano Gherardi, a postdoctoral research associate.

The study was published June 10 in the Proceedings of the National Academy of Sciences.

Researchers already understood that the root-feeding nematodes are incredibly important in controlling how much soil carbon an ecosystem is able to store from the atmosphere. In this new study, the scientists found the root-feeding nematodes were thriving more in wetter regions experiencing drought, compared with the drier sites.

Franco said the research team is now analyzing the combined soil and plant results from these sites to learn more about whether plants will suffer more than previously thought, due to extreme drought conditions. CSU scientists are also replicating the research in a controlled environment in a greenhouse, looking at the interaction among changes in moisture and soil fauna composition.

"Root biomass accounts for most of the carbon sequestered in grasslands, and it might be that the increased population of root feeders is exacerbating the negative effects of drought on carbon sequestration in these ecosystems," he said.

The research team hopes to learn more about the interaction between water and nematode stresses to plants and whether these effects could be even worse for ecosystem functioning than they previously thought.

Credit: 
Colorado State University

SIRT1 plays key role in chronic myeloid leukemia to aid persistence of leukemic stem cells

image: Ravi Bhatia

Image: 
UAB

BIRMINGHAM, Ala. - Patients with chronic myeloid leukemia can be treated with tyrosine kinase inhibitors. While these effective drugs lead to deep remission and prolonged survival, primitive leukemia stem cells resist elimination during the remission and persist as a major barrier to cure.

As a result, the majority of patients with chronic myeloid leukemia, or CML, require indefinite inhibitor treatment to prevent disease recurrence. They also face risks of noncompliance, toxicity and financial burden. Development of effective therapeutic strategies to improve patient outcomes for CML and related cancers depends on identifying the key mechanisms that contribute to the persistence of these leukemic stem cells.

In a study published in the Journal of Clinical Investigation, Ajay Abraham, Ph.D., Shaowei Qiu, M.D., Ravi Bhatia, M.D., and colleagues at the University of Alabama at Birmingham show how the stress-responsive protein SIRT1 plays important roles in maintaining the regenerative potential of CML leukemic stem cells and promoting leukemia development in CML.

"Our studies provide a conceptual advance and new biological insights regarding the activity of SIRT1 and its role in CML leukemic stem cells," said senior author Bhatia. At UAB, Bhatia is a professor of medicine, director of the Division of Hematology and Oncology, and interim director of the O'Neal Comprehensive Cancer Center at UAB.

In 2012, Bhatia and colleagues reported that SIRT1 was overexpressed in CML leukemic stem cells compared to normal hematopoietic stem cells, and this overexpression contributed to CML leukemic stem cell maintenance and resistance to tyrosine kinase inhibitors. However, the underlying mechanisms were not known.

To study those mechanisms, the UAB researchers used a CML mouse model that also has a genetic deletion of SIRT1. This allowed them to compare wild-type leukemic stem cells with SIRT1-deletion leukemic stem cells.

Study details

They found that SIRT1 plays an important role to enhance oxidative phosphorylation by the mitochondria in leukemic stem cells. Furthermore, the researchers found that this increased mitochondrial metabolism in leukemic stem cells did not depend on activity of the mutated kinase that transforms the normally quiescent hematopoietic stem cells into leukemic stem cells.

Mitochondria are the powerhouses of the cell, supplying nearly all the energy a cell normally needs. Oxidative phosphorylation uses oxygen to produce that energy; non-leukemic hematopoietic stem cells (the non-cancerous blood-forming cells of healthy mice or humans) instead produce energy through an alternative metabolic pathway called glycolysis.

Treatment with tyrosine kinase inhibitors is known to suppress leukemic hematopoiesis. When SIRT1-deleted mice were treated with tyrosine kinase inhibitors, the UAB researchers found an even greater suppression of leukemic hematopoiesis.

The SIRT1 knock-out also impaired development of CML in the mouse model. Compared with the CML mice without SIRT1 knock-out, the researchers saw significant delays in developing increased numbers of leukocytes and neutrophils, and delayed enlargement of the spleen and time of death. The deletion also reversed redistribution of CML stem cells from the bone marrow to the spleen.

SIRT1 is a deacetylase enzyme, known to deacetylate and activate the transcriptional co-activator PGC-1-alpha. This enhances mitochondrial DNA replication and gene expression, and it promotes mitochondrial activity. Bhatia and colleagues showed that PGC-1-alpha inhibitors were able to significantly reduce mitochondrial oxygen consumption, a sign of oxidative phosphorylation. Thus, the inhibitors acted similarly to the SIRT1 deletion. This finding supports an important role for PGC-1-alpha in the regulation of mitochondrial metabolism in CML stem and progenitor cells.

The researchers also found that a chemical inhibitor of SIRT1 was able to reduce oxidative phosphorylation in both mouse and human CML leukemic stem cells.

Interestingly, the SIRT1 deletion in normal, non-leukemic blood-forming stem cells did not inhibit steady-state normal hematopoiesis in the mouse model. Bhatia noted that development of effective approaches to target the persistent CML leukemic stem cells that resist tyrosine kinase-inhibitor treatment -- while not causing toxicity to normal hematopoietic stem cells -- has been a challenge.

Bhatia says the impact of this study extends to other hematological malignancies, including acute myeloid leukemia, myelodysplastic syndromes and myeloproliferative neoplasms.

"Our research reveals new knowledge and concepts regarding the role of SIRT1 in metabolic regulation of hematopoietic stem cell and leukemic stem cell maintenance, growth and resistance," Bhatia said. "This raises the possibility of developing improved strategies to target kinase-independent metabolic alterations."

Credit: 
University of Alabama at Birmingham

Cognitive behavioral therapy delivered by telemedicine is effective for insomnia

SAN ANTONIO – Preliminary findings from two analyses of an ongoing study suggest that cognitive behavioral therapy for insomnia delivered by telemedicine is as effective as face-to-face delivery.

Results of a randomized controlled non-inferiority trial show that both delivery methods were equally effective at improving sleep outcomes measured by sleep diaries, reducing self-reported sleep latency and wake after sleep onset while increasing total sleep time and sleep efficiency. There also were no differences between the two delivery methods in patient perception of therapeutic alliance, warmth, and confidence in the therapist’s skills.

“The most surprising findings in the study were that, contrary to our hypotheses, participants who received CBT for insomnia via telemedicine rated therapist alliance similarly to participants who received face-to-face CBT for insomnia,” said principal investigator J. Todd Arnedt, Ph.D., an associate professor of psychiatry and neurology and co-director of the Sleep and Circadian Research Laboratory at Michigan Medicine, University of Michigan in Ann Arbor. “In addition, ratings of satisfaction with treatment were equivalent between face-to-face and telemedicine participants. Relative to other remote modalities, telemedicine may offer a unique blend of convenience for the patient while preserving fidelity of the face-to-face interaction.”

Insomnia involves difficulty falling asleep or staying asleep, or regularly waking up earlier than desired, despite allowing enough time in bed for sleep. Daytime symptoms associated with insomnia include fatigue or sleepiness; feeling dissatisfied with sleep; having trouble concentrating; feeling depressed, anxious or irritable; and having low motivation or energy.

The most effective treatment for chronic insomnia is cognitive behavioral therapy for insomnia (CBT-I). It combines behavioral strategies, such as setting a consistent sleep schedule and getting out of bed when struggling with sleep, with cognitive strategies, such as replacing fears about sleeplessness with more helpful expectations. CBT-I recommendations are customized to address each patient’s individual needs and symptoms.

The analysis comparing sleep and daytime functioning variables included 47 adults with chronic insomnia, including 33 women. The analysis of therapeutic alliance involved 38 adults with insomnia, including 25 women. Participants had a mean age of about 52 years. They were randomized to six sessions of CBT-I delivered face-to-face or via the AASM SleepTM telemedicine system. One therapist delivered CBT-I in both conditions.

“Preliminary findings from this study suggest that patients undergoing telemedicine for insomnia can feel just as close and supported by their therapist as if they were in the office,” said co-investigator Deirdre Conroy, Ph.D., a clinical associate professor of psychiatry and clinical director of the Behavioral Sleep Medicine Program at Michigan Medicine, University of Michigan in Ann Arbor. “Telemedicine could be utilized more for CBT-I to bridge the gap between supply and demand for this service.”

Both research abstracts were published recently in an online supplement of the journal Sleep and will be presented Monday, June 10, in San Antonio at SLEEP 2019, the 33rd annual meeting of the Associated Professional Sleep Societies LLC (APSS), which is a joint venture of the American Academy of Sleep Medicine and the Sleep Research Society.

The study was supported by funding from the American Academy of Sleep Medicine Foundation.

Abstract Title: Efficacy of Cognitive Behavioral Therapy Delivered via Telemedicine vs. Face-to-Face: Preliminary Results from a Randomized Controlled Non-Inferiority Trial

Abstract ID: 0363
Presentation Date: Monday, June 10
Poster Presentation: 5:15 p.m. to 7:15 p.m., Board 098
Presenter: J. Todd Arnedt, Ph.D.

Abstract Title: Comparison of Therapeutic Alliance for Telemedicine vs. Face-to-Face Delivered Cognitive Behavioral Therapy for Insomnia: Preliminary Results
Abstract ID: 0364
Presentation Date: Monday, June 10
Poster Presentation: 5:15 p.m. to 7:15 p.m., Board 099
Presenter: Deirdre Conroy, Ph.D.

For a copy of the abstracts or to arrange an interview with a study author or an AASM spokesperson, please contact the AASM at 630-737-9700 or media@aasm.org.

About the American Academy of Sleep Medicine
Established in 1975, the American Academy of Sleep Medicine (AASM) improves sleep health and promotes high quality, patient-centered care through advocacy, education, strategic research, and practice standards. The AASM has a combined membership of 10,000 accredited member sleep centers and individual members, including physicians, scientists and other health care professionals. For more information about sleep and sleep disorders, including a directory of AASM-accredited member sleep centers, visit www.sleepeducation.org.

Journal

SLEEP

DOI

10.1093/sleep/zsz067.363

Credit: 
American Academy of Sleep Medicine

Drug delays type 1 diabetes in people at high risk

image: Brothers volunteer for Type 1 Diabetes TrialNet.

Image: 
Benaroya Research Institute

A treatment affecting the immune system effectively slowed the progression to clinical type 1 diabetes in high risk individuals, according to findings from National Institutes of Health-funded research. The study is the first to show that clinical type 1 diabetes can be delayed by two or more years among people who are at high risk. These results were published online in The New England Journal of Medicine and presented at the American Diabetes Association Scientific Sessions in San Francisco.

The study, involving treatment with an anti-CD3 monoclonal antibody (teplizumab), was conducted by Type 1 Diabetes TrialNet, an international collaboration aimed at discovering ways to delay or prevent type 1 diabetes. Researchers enrolled 76 participants ages 8-49 who were relatives of people with type 1 diabetes, had at least two types of diabetes-related autoantibodies (proteins made by the immune system), and abnormal glucose (sugar) tolerance.

Participants were randomly assigned to either the treatment group, which received a 14-day course of teplizumab, or the control group, which received a placebo. All participants received glucose tolerance tests regularly until the study was completed, or until they developed clinical type 1 diabetes - whichever came first.

During the trial, 72% of people in the control group developed clinical diabetes, compared to only 43% of the teplizumab group. The median time for people in the control group to develop clinical diabetes was just over 24 months, while those who developed clinical diabetes in the treatment group had a median time of 48 months before progressing to diagnosis.
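As a back-of-the-envelope illustration of the reported effect sizes (the progression rates and median times come from the release; the relative-risk arithmetic below is ours, not a statistic from the trial):

```python
# Progression rates reported in the release: 72% of the control group
# vs. 43% of the teplizumab group developed clinical diabetes.
control_rate = 0.72
treatment_rate = 0.43

absolute_risk_reduction = control_rate - treatment_rate          # ~0.29
relative_risk_reduction = absolute_risk_reduction / control_rate # ~0.40

print(round(absolute_risk_reduction, 2))   # 0.29
print(round(relative_risk_reduction, 2))   # 0.4

# Median time to clinical diagnosis roughly doubled: 24 -> 48 months.
print(48 / 24)  # 2.0
```

In other words, the single 14-day course was associated with roughly a 40% relative reduction in progression over the trial period, alongside the doubling of median time to diagnosis.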

"The difference in outcomes was striking. This discovery is the first evidence we've seen that clinical type 1 diabetes can be delayed with early preventive treatment," said Lisa Spain, Ph.D., Project Scientist from the NIH's National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), sponsor of TrialNet. "The results have important implications for people, particularly youth, who have relatives with the disease, as these individuals may be at high risk and benefit from early screening and treatment."

Type 1 diabetes develops when the immune system's T cells mistakenly destroy the body's own insulin-producing beta cells. Insulin is needed to convert glucose into energy. Teplizumab targets T cells to lessen the destruction of beta cells.

"Previous clinical research funded by the NIH found that teplizumab effectively slows the loss of beta cells in people with recent onset clinical type 1 diabetes, but the drug had never been tested in people who did not have clinical disease," said Kevan C. Herold, M.D., of Yale University, the study's lead author. "We wanted to see whether early intervention would have a benefit for people who are at high risk but do not yet have symptoms of type 1 diabetes."

The effects of the drug were greatest in the first year after it was given, when 41% of participants developed clinical diabetes, mainly in the placebo group. Many factors, including age, could have contributed to the ability of teplizumab to delay clinical disease, since at-risk children and adolescents are known to progress to type 1 diabetes faster than adults. Faster progression of type 1 diabetes is associated with a highly active immune system, which may explain the impact of immune system-modulating drugs like teplizumab.

Other data collected from the trial may help researchers to understand why certain people responded to treatment. Participants who responded to teplizumab tended to have certain autoantibodies and other immune system characteristics. The research team also cautioned that the study had limitations, including the small number of participants, their lack of ethnic diversity, and that all participants were relatives of people with type 1 diabetes, potentially limiting the ability to translate the study broadly.

"While the results are encouraging, more research needs to be done to address the trial's limitations, as well as to fully understand the mechanisms of action, long-term efficacy and safety of the treatment," said Dr. Spain.

"This trial illustrates how decades of research on the biology of type 1 diabetes can lead to promising treatments that have a real impact on people's lives. We're very excited to see the next steps in this research," said Dr. Griffin P. Rodgers, NIDDK Director. "The dedicated researchers, volunteers and families participating in this program make discoveries like this possible."

Credit: 
NIH/National Institute of Diabetes and Digestive and Kidney Diseases

Research reveals how diet influences diabetes risk

image: Researchers found that eating rice first and then a vegetable and meat caused significantly higher blood sugar levels after eating compared to other sequences. Members of the research team are pictured with the food portions used for the study (From left: Melvin Leow, Sun Lijuan, Christiani Jeyakumar Henry, Priya Govindharajulu, Goh Huijen).

Image: 
Singapore Institute for Clinical Sciences, A*STAR.

Baltimore (June 8, 2019) - Could changing what we eat lower the chances of developing type 2 diabetes? Studies presented at Nutrition 2019 will examine how consuming certain foods, vitamins and even the order in which we eat can affect blood sugar levels and the risk of developing type 2 diabetes.

Nutrition 2019 is being held June 8-11, 2019 at the Baltimore Convention Center. Contact the media team for more information or to obtain a free press pass to attend the meeting.

What and how we eat influences diabetes risk

Fewer new diabetes cases seen in people who eat more plant-based foods
In a study of 2,717 young adults in the U.S. with long-term follow-up, people who increased the amount of fruits, vegetables, whole grains, nuts and vegetable oils in their diet over 20 years had a 60 percent lower risk of type 2 diabetes compared to those with a small decrease in plant foods. The findings suggest that long-term shifts toward a more plant-centered diet could help prevent diabetes. Yuni Choi, University of Minnesota-Twin Cities, will present this research on Tuesday, June 11, from 11 - 11:15 a.m. in the Baltimore Convention Center, Ballroom IV (abstract).

Large study points to importance of vitamins B2 and B6
Findings from a study examining three large cohorts of U.S. health professionals suggest that people with higher intakes of vitamins B2 and B6 from food or supplements have a lower risk for type 2 diabetes. The study, which included more than 200,000 people, also revealed that consuming higher levels of vitamin B12 from foods was associated with a higher type 2 diabetes risk, which may be due to consumption of animal products. Kim V. E. Braun, Erasmus University Medical Center, will present this research on Sunday, June 9, from 3:15 - 3:30 p.m. in the Baltimore Convention Center, Room 319/320 (abstract).

Food order can affect blood sugar levels
A new study reveals that changing the order in which food is eaten could reduce post-meal blood sugar spikes. The researchers found that eating rice first and then a vegetable and meat caused significantly higher blood sugar levels after eating compared to other sequences. The results point to a simple but effective way to lower blood sugar levels after eating, which could prevent the transition from prediabetes to diabetes. Christiani Henry, Singapore Institute for Clinical Sciences, will present this research on Saturday, June 8, from 12 - 1 p.m. in the Baltimore Convention Center, Halls A-B (poster 56) (abstract).

The interaction of genes and food

Coffee may lower risk, especially for people with gene variant
A study of more than 4,000 Koreans adds evidence that drinking coffee may lower the risk for developing type 2 diabetes and reveals that a genetic factor could be involved. The researchers found both Korean men and women who drank at least one cup of black coffee a day were less likely to develop prediabetes or type 2 diabetes than those who drank no coffee. The association between drinking coffee and lowered risk for type 2 diabetes was strongest for Koreans with a genetic variation known as rs2074356, which was recently found to be linked with habitual coffee consumption. An Na Kim, Seoul National University, will present this research on behalf of Taiyue Jin on Monday, June 10, from 1:45 - 2:45 p.m. in the Baltimore Convention Center, Halls A-B (poster 502) (abstract).

Credit: 
American Society for Nutrition

How do foams collapse?

image: The film is seen to recede into surrounding films, while a droplet is released which penetrates other films and causes further collapse.

Image: 
Rei Kurita

Tokyo, Japan - Researchers from Tokyo Metropolitan University have identified two distinct mechanisms by which foams can collapse, yielding insight into how foam rupture might be prevented or accelerated in industrial materials such as foods, cosmetics, insulation and stored chemicals. They found that when a bubble breaks, the collapse event propagates both through the impact of the receding film on neighboring bubbles and through tiny scattered droplets that break other bubbles. Identifying which mechanism is dominant in different foams may help tailor them to specific applications.

Foams play a key role in a wide range of industrial products, from foods, beverages, pharmaceuticals, cleaning products and cosmetics to material applications such as building insulation, aircraft interiors and flame-retardant barriers. They might also be an unwanted property of a product, e.g. frothing in stored chemicals during transit. From a scientific perspective, foams also constitute a unique form of matter, a fine balance between the complex network of forces acting on the liquid film network that makes up their structure and the pressure of the gas trapped inside; understanding how foams behave may yield new physical insights as well as better ways to use them.

Naoya Yanagisawa and Associate Professor Rei Kurita set out to observe how foams collapse. They took a solution of water, glycerol and a common surfactant (a film-stabilizing agent) and created a two-dimensional foam squashed between two pieces of glass. Using an ultra-fast camera and a needle, they were able to controllably break a bubble at the edge of the foam raft and observe "collective bubble collapse" (CBC). They identified two distinct ways in which the breakage of one bubble at the edge led to a cascade of breakage events around it, a "propagating" mode due to the film of the broken bubble being absorbed into surrounding liquid film, and a "penetrating" mode due to droplets being released from the rupture event flying away and breaking other bubbles.

As the investigators changed the amount of water in the film, they identified several key trends in how the bubbles reacted at a microscopic level. For example, they found that more liquid in the foam led to the release of slower droplets, unable to penetrate surrounding films. This was correlated with a drastic drop in the number of bubbles collapsed; CBCs were thus crucially underpinned by the "penetrating" mode of collapse. Droplet speed was determined by the speed at which the film receded; this "streaming velocity" was found to be proportional to the osmotic pressure of the film i.e. the pressure at which a liquid brought into contact with the foam is driven into the film network. The team showed that the Navier-Stokes equations, key relations describing how fluids behave over time, could be used to explain these trends.
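The chain described above (wetter foam, lower osmotic pressure, slower droplets, fewer secondary breakages) can be captured in a toy model. All functional forms, thresholds and constants below are our illustrative assumptions, not the relations fitted in the paper; only the proportionality of streaming velocity to osmotic pressure is taken from the study.

```python
# Toy model of the "penetrating" collapse mode. Constants and the
# threshold/linear breakage law are hypothetical, for illustration only.

def droplet_speed(osmotic_pressure, k=1.0):
    """Streaming (and hence droplet) velocity, taken proportional to the
    osmotic pressure of the film, as the study reports."""
    return k * osmotic_pressure

def bubbles_broken(speed, threshold=2.0, rate=5.0):
    """A droplet only penetrates neighbouring films above some critical
    speed (hypothetical threshold); above it, breakages scale with speed."""
    if speed < threshold:
        return 0
    return int(rate * (speed - threshold))

# Wetter foams have lower osmotic pressure -> slower droplets -> fewer breaks.
for pressure in (5.0, 3.0, 1.5):
    v = droplet_speed(pressure)
    print(pressure, v, bubbles_broken(v))  # cascade dies out at low pressure
```

This mirrors the observed trend: once droplets are too slow to penetrate surrounding films, the collective collapse is cut off, regardless of viscosity.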

A key finding was that changing the viscosity of the fluid did not lead to a significant change in the number of bubbles broken. Methods to stabilize foams commonly rely on changing the viscosity, yet the team's findings clearly show that both the number of bubbles collapsed and the velocity of the receding film are unaffected by viscosity. Coupled with the dominant role played by the "penetrating" mode, future strategies to prevent foam collapse may instead focus on e.g. combining multiple surfactants to make the film more resistant to droplet impact.

Credit: 
Tokyo Metropolitan University

Preliminary study finds health coaches and incentives help youth with type 1 diabetes

video: The endocrinology team at Children's National used weekly health coaching and incentives to help 25 pediatric patients with type 1 diabetes improve glycemic control, or A1C, over a 10-week period.

Image: 
Children's National Health System

The life of a type 1 diabetes patient - taking daily insulin shots or wearing an insulin pump, monitoring blood sugar, prioritizing healthful food choices and fitting in daily exercise - can be challenging at age 5 or 15, especially as holidays, field trips and sleepovers can disrupt diabetes care routines, creating challenges with compliance. This is why endocrinologists from Children's National Health System experimented with using health coaches over a 10-week period to help families navigate care for children with type 1 diabetes.

By assembling a team of diabetes educators, dietitians, social workers, psychologists and health care providers, Fran Cogen, M.D., C.D.E., director of diabetes care at Children's National, helped pediatric patients with type 1 diabetes manage their glycemic status, or blood-sugar control.

Starting Saturday, June 8, Dr. Cogen will share results of the pilot program as poster 1260-P, entitled "A Clinical Care Improvement Pilot Program: Individualized Health Coaching and Use of Incentives for Youth with Type 1 Diabetes and their Caregivers," at the American Diabetes Association's 79th Scientific Sessions, which takes place June 7-11 at the Moscone Center in San Francisco.

Dr. Cogen's study was offered at no cost to caregivers of 179 patients at Children's National seeking treatment for type 1 diabetes. The pilot program included two components: 1. Weekly phone calls or emails from a health coach to a caregiver with personalized insulin adjustments, based on patient blood sugars submitted through continuous glucose monitoring apps; and 2. Incentives for patients to participate in the program and reach health targets.

Twenty-five participants, ages 4-18, with a mean age of 11.6 and A1c values between 8.6% and 10%, joined the study. The average A1c was 9.4% at the beginning of the program and dropped by an average of 0.5% by the end of the trial. Twenty of the 25 participants (80%) improved A1c levels by 0.5%. Seventeen participants (68%) improved A1c levels by more than 0.5%, while seven participants (28%) improved A1c levels by more than 1%.
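The quoted fractions can be sanity-checked directly (participant counts come from the release; the percentage arithmetic below is ours):

```python
# Verify that the counts reported in the pilot match the stated percentages.
n = 25
results = [
    (20, 80),  # improved A1c by 0.5%
    (17, 68),  # improved by more than 0.5%
    (7, 28),   # improved by more than 1%
]
for count, expected_pct in results:
    assert round(100 * count / n) == expected_pct
print("all fractions match the reported percentages")
```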

"Chronic disease is like a marathon," says Dr. Cogen. "You need to have constant reinforcement and coaching to get people to do their best. Sometimes what drives people is to have people on the other end say, 'Keep it up, you're doing a good job, keep sending us information so that we can make changes to improve your child's blood sugar management,' which gives these new apps and continuous glucose monitoring devices a human touch."

Instead of waiting three months between appointments to talk about ways a family can make changes to support a child's insulin control and function, caregivers received feedback from coaches each week. Health coaches benefitted, too: They reported feeling greater empathy for patients, while becoming more engaged in personalizing care plans.

Families who participated received a gift card to a local grocery store, supporting a child's dietary goals. Children who participated were also entered into an iPad raffle. Improvements in A1c levels generated extra raffle tickets per child, which motivated participants, especially teens.

"These incentives are helpful in order to get kids engaged in their health and in an immediate way," says Dr. Cogen. "Teenagers aren't always interested in long-term health outcomes, but they are interested in what's happening right now. Fluctuating blood sugars can cause depression and problems with learning, while increasing risk for future complications, including eye problems, kidney problems and circulation problems. As health care providers, we know the choices children make today can influence their future health outcomes, which is why we designed this study."

Moving forward, Dr. Cogen and the endocrinologists at Children's National would like to study the impact of using this model over several months, especially for high-risk patients, while asynchronously targeting information to drive behavior change - accommodating the needs of families, while delivering dose-specific recommendations from health care providers.

Dr. Cogen adds, "We're moving away from office-centric research models and creating interventions where they matter: at home and with families in real time."

Credit: 
Children's National Hospital

Researchers warn: junk food could be responsible for the food allergy epidemic

image: The role of AGEs in the development of food allergy.

Image: 
ESPGHAN

(Glasgow, 8 June, 2019) Experts at the 52nd Annual Meeting of the European Society for Paediatric Gastroenterology, Hepatology and Nutrition are today presenting the results of a study that show higher levels of advanced glycation end products (AGEs), found in abundance in junk food, are associated with food allergy in children [1].

Researchers from the University of Naples 'Federico II' observed three groups of children aged between 6-12 years old (61 children in total): those with food allergies, those with respiratory allergies, and healthy controls. The study revealed a significant correlation between subcutaneous levels of advanced glycation end products (AGEs) and junk food consumption, and further, that children with food allergies presented with higher subcutaneous levels of AGEs than children with respiratory allergies or no allergies at all. In addition, the research team found compelling evidence relating to the mechanism of action elicited by AGEs in determining food allergy.

AGEs are proteins or lipids that become glycated after exposure to sugars [2] and are present at high levels in junk foods - deriving from sugars, processed foods, microwaved foods and roasted or barbequed meats. AGEs are already known to play a role in the development and progression of different oxidative-based diseases including diabetes, atherosclerosis (where plaque builds up inside the arteries), and neurological disorders [3] but this is the first time an association has been found between AGEs and food allergy.

While firm statistics on global food allergy prevalence are lacking, there is growing evidence that incidence is on the increase, especially amongst young children, and prevalence is reported to be as high as 10% in some countries [4,5,6]. Similarly, over recent decades there has been a dramatic increase in the consumption of highly-processed foods [7] (which are known to contain higher levels of AGEs), and highly-processed foods have been reported as comprising up to 50% of total daily energy intake in European countries [8].

Commenting on the research, principal investigator Roberto Berni Canani said:

"As yet, existing hypotheses and models of food allergy do not adequately explain the dramatic increase observed in recent years - so dietary AGEs may be the missing link. Our study certainly supports this hypothesis, but we now need further research to confirm it. If this link is confirmed, it will strengthen the case for national governments to enhance public health interventions to restrict junk food consumption in children."

Isabel Proaño, Director of Policy and Communications at the European Federation of Allergy and Airways Diseases Patients' Associations (EFA) added:

"These new findings show there are still many environmental and dietary issues affecting our health and wellbeing. Healthcare professionals and patients do not have all of the important information to face a disease that dramatically impacts their quality of life, and industrialised food processing and labelling gaps will not help them. We call on the public health authorities to enable better prevention and care of food allergy."

Credit: 
Spink Health

AI tool helps radiologists detect brain aneurysms

image: HeadXNet team members (from left to right, Andrew Ng, Kristen Yeom, Christopher Chute, Pranav Rajpurkar and Allison Park) looking at a brain scan. Scans like this were used to train and test their artificial intelligence tool, which helps identify brain aneurysms.

Image: 
L.A. Cicero/Stanford News Service

Doctors could soon get some help from an artificial intelligence tool when diagnosing brain aneurysms - bulges in blood vessels in the brain that can leak or burst open, potentially leading to stroke, brain damage or death.

The AI tool, developed by researchers at Stanford University and detailed in a paper published June 7 in JAMA Network Open, highlights areas of a brain scan that are likely to contain an aneurysm.

"There's been a lot of concern about how machine learning will actually work within the medical field," said Allison Park, a Stanford graduate student in statistics and co-lead author of the paper. "This research is an example of how humans stay involved in the diagnostic process, aided by an artificial intelligence tool."

This tool, which is built around an algorithm called HeadXNet, improved clinicians' ability to correctly identify aneurysms at a level equivalent to finding six more aneurysms per 100 scans that contain aneurysms. It also improved consensus among the interpreting clinicians. While the success of HeadXNet in these experiments is promising, the team of researchers - who have expertise in machine learning, radiology and neurosurgery - cautions that further investigation is needed to evaluate how well the AI tool generalizes before real-time clinical deployment, given differences in scanner hardware and imaging protocols across hospital centers. The researchers plan to address such problems through multi-center collaboration.

Augmented expertise

Combing brain scans for signs of an aneurysm can mean scrolling through hundreds of images. Aneurysms come in many sizes and shapes and balloon out at tricky angles - some register as no more than a blip within the movie-like succession of images.

"Search for an aneurysm is one of the most labor-intensive and critical tasks radiologists undertake," said Kristen Yeom, associate professor of radiology and co-senior author of the paper. "Given inherent challenges of complex neurovascular anatomy and potential fatal outcome of a missed aneurysm, it prompted me to apply advances in computer science and vision to neuroimaging."

Yeom brought the idea to the AI for Healthcare Bootcamp run by Stanford's Machine Learning Group, which is led by Andrew Ng, adjunct professor of computer science and co-senior author of the paper. The central challenge was creating an artificial intelligence tool that could accurately process these large stacks of 3D images and complement clinical diagnostic practice.

To train their algorithm, Yeom worked with Park and Christopher Chute, a graduate student in computer science, and outlined clinically significant aneurysms detectable on 611 computerized tomography (CT) angiogram head scans.

"We labelled, by hand, every voxel - the 3D equivalent to a pixel - with whether or not it was part of an aneurysm," said Chute, who is also co-lead author of the paper. "Building the training data was a pretty grueling task and there were a lot of data."

Following the training, the algorithm decides for each voxel of a scan whether there is an aneurysm present. The end result of the HeadXNet tool is the algorithm's conclusions overlaid as a semi-transparent highlight on top of the scan. This representation of the algorithm's decision makes it easy for the clinicians to still see what the scans look like without HeadXNet's input.
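The paper does not reproduce HeadXNet's code, but the overlay idea the researchers describe - voxel-wise predictions rendered as a semi-transparent tint so the underlying anatomy stays visible - can be sketched with simple alpha blending. The function name and toy arrays below are illustrative, not from the Stanford implementation:

```python
import numpy as np

def overlay_mask(slice_gray, mask, alpha=0.4, color=(1.0, 0.0, 0.0)):
    """Blend a binary prediction mask onto one grayscale scan slice.

    slice_gray: 2D array with values in [0, 1]; mask: 2D boolean array.
    Masked voxels are tinted semi-transparently, so the clinician still
    sees the original scan underneath the algorithm's highlight.
    """
    rgb = np.stack([slice_gray] * 3, axis=-1)           # grayscale -> RGB
    tint = np.array(color)                              # highlight colour
    rgb[mask] = (1 - alpha) * rgb[mask] + alpha * tint  # alpha blending
    return rgb

# Toy example: a 4x4 slice with a 2x2 region flagged as "aneurysm".
scan = np.full((4, 4), 0.5)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
out = overlay_mask(scan, mask)
```

In this sketch, unflagged voxels keep their original gray value, while flagged voxels shift toward the highlight color by the blend factor `alpha` - which is what lets clinicians view the scan with or without the algorithm's input in mind.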

"We were interested how these scans with AI-added overlays would improve the performance of clinicians," said Pranav Rajpurkar, a graduate student in computer science and co-lead author of the paper. "Rather than just having the algorithm say that a scan contained an aneurysm, we were able to bring the exact locations of the aneurysms to the clinician's attention."

Eight clinicians tested HeadXNet by evaluating a set of 115 brain scans for aneurysm, once with the help of HeadXNet and once without. With the tool, the clinicians correctly identified more aneurysms, and therefore reduced the "miss" rate, and the clinicians were more likely to agree with one another. HeadXNet did not influence how long it took the clinicians to decide on a diagnosis or their ability to correctly identify scans without aneurysms - a guard against telling someone they have an aneurysm when they don't.
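The two quantities at stake in a reader study like this are sensitivity (fewer missed aneurysms) and specificity (not flagging healthy scans). As a toy illustration - the tallies below are made up, not the paper's figures - the reported gain of "six more aneurysms per 100 positive scans" corresponds to a sensitivity improvement of 0.06:

```python
def sensitivity(tp, fn):
    """Fraction of aneurysm-positive scans correctly flagged (lower miss rate)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of aneurysm-free scans correctly cleared (guards against false alarms)."""
    return tn / (tn + fp)

# Illustrative tallies over 100 scans that each contain an aneurysm:
without_tool = sensitivity(tp=83, fn=17)  # 83 found, 17 missed
with_tool = sensitivity(tp=89, fn=11)     # 89 found, 11 missed
gain = with_tool - without_tool           # six more aneurysms per 100 scans
```

Holding specificity steady while sensitivity rises is exactly the pattern the study reports: more true detections without telling healthy patients they have an aneurysm.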

To other tasks and institutions

The machine learning methods at the heart of HeadXNet could likely be trained to identify other diseases inside and outside the brain. For example, Yeom imagines a future version could focus on speeding up the identification of aneurysms after they have burst, saving precious time in an urgent situation. But a considerable hurdle remains in integrating any artificial intelligence medical tools with daily clinical workflow in radiology across hospitals.

Current scan viewers aren't designed to work with deep learning assistance, so the researchers had to custom-build tools to integrate HeadXNet within scan viewers. Similarly, variations in real-world data - as opposed to the data on which the algorithm is tested and trained - could reduce model performance. If the algorithm processes data from different kinds of scanners or imaging protocols, or a patient population that wasn't part of its original training, it might not work as expected.

"Because of these issues, I think deployment will come faster not with pure AI automation, but instead with AI and radiologists collaborating," said Ng. "We still have technical and non-technical work to do, but we as a community will get there and AI-radiologist collaboration is the most promising path."

Credit: 
Stanford University