Tech

Demolishing abandoned houses does not reduce nearby crime, study finds

LAWRENCE -- Cities across the country have sought ways to improve neighborhood safety and in recent years have pointed to demolishing abandoned housing as a way to achieve the goal. While millions of dollars have been spent on the efforts, a recent University of Kansas study found a program demolishing more than 500 abandoned residential properties in Kansas City, Missouri, did not significantly reduce nearby violent or property crime.

Since the housing foreclosure crisis of 2007-08, the number of abandoned homes across the country has rapidly increased, drawing attention to dilapidated and abandoned residential properties and their effect on neighborhoods, including elevated crime rates. Hye-Sung Han, assistant professor of public affairs & administration at KU, conducted a study in which she examined 559 abandoned properties in Kansas City, Missouri, and nearby crime rates in the surrounding area. She found the demolition did not lead to a reduction in nearby crime and that localized socioeconomic and housing characteristics were much stronger predictors of any change in crime rates.

While scholars have long associated abandoned property and crime, there has been little research on whether demolishing abandoned properties reduces crime, even though police and city officials argue that demolition increases neighborhood safety and eases the burden on police by removing places for illegal activities.

"There is not much data out there on housing abandonment. Because of that, there aren't many studies done, and those that are, are mostly about how abandonment affects the neighborhood housing market," Han said. "But one other negative factor is crime and how it affects the quality of life it has for those still living in neighborhoods with abandoned housing. It has been pretty much proven that there is more crime where there is more abandoned housing."

Han said she decided to study the matter after hearing Kansas City's former police chief in 2016 touting a $10 million plan to demolish 800 dilapidated, abandoned homes to help reduce crime. The city has more than 10,000 abandoned properties, though not all have been deemed dangerous. Han identified 559 properties that were demolished between 2012 and 2016. Crime near each demolished property was compared with crime near another abandoned home that was not demolished. Her study, co-written with Scott Helm of the University of Missouri-Kansas City, was published in the journal Housing Policy Debate.

"I found out the crime around a property that was demolished did not change," Han said. "So, I looked at other explanations for crime going up or down. I think one of the reasons demolition doesn't make a difference is once you demolish the abandoned property, you are left a vacant lot. That is not necessarily better for neighborhood safety. In fact, I found that every single lot where the 559 properties were demolished was still vacant in 2020. Unkept vacant lots can encourage crime."

Han emphasized the study is not intended to criticize city officials or police but rather to shed light on whether programs using taxpayer funding to address crime are effective. Demolition is expensive, often costing $8,000 to $10,000 per home. The study focused on Kansas City, but similar demolition programs have taken place across the country, including in cities such as Detroit, Philadelphia, Pittsburgh and Baltimore, some of which tear down thousands of abandoned homes per year. The findings cannot be assumed to carry over to other cities, but Han said similar results would be likely. There are differences among cities to consider, however, including the types of properties being demolished: Kansas City razed mostly single-family dwellings, while in other cities the wrecking ball more often targets multifamily rowhouses, which carry a higher price tag.

The findings show that demolition of abandoned homes did not reduce nearby crime, which Han said indicates policymakers should consider holistic approaches to improve neighborhood safety. The focus should be on improving neighborhood social and physical characteristics, particularly in urban neighborhoods with high housing abandonment.

"In most cases, the very first thing that should happen is to get these houses and properties occupied," Han said. "So someone is at least living there and paying taxes, and so the city can provide more services."

Credit: 
University of Kansas

Rapid test can ID unknown causes of infections throughout the body

UC San Francisco scientists have developed a single clinical laboratory test capable of zeroing in on the microbial miscreant afflicting patients hospitalized with serious infections in as little as six hours -- irrespective of what body fluid is sampled, the type or species of infectious agent, or whether physicians start out with any clue as to what the culprit may be.

The test will be a lifesaver, speeding appropriate drug treatment for the seriously ill, and should transform the way infectious diseases are diagnosed, said the authors of the study, published November 9, 2020 in Nature Medicine.

"The advance here is that we can detect any infection from any body fluid, without special handling or processing for each distinct body fluid," said study corresponding author Charles Chiu, MD, PhD, a professor in the UCSF Department of Laboratory Medicine and director of the UCSF-Abbott Viral Diagnostics and Discovery Center. "It's a simple procedure."

Conventional diagnostic tests are designed to detect only one or sometimes a small panel of potential pathogens. In contrast, the new protocol employs powerful "next-generation" DNA-sequencing technology to account for all DNA in a sample, which may be from any species -- human, bacterial, viral, parasitic, or fungal. Clinicians do not need to have a suspect in mind. To identify a match, the new test relies on specially developed analytical software to compare DNA sequences in the sample to massive genomic databases covering all known pathogens.
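
The release does not detail the matching algorithm, but the general idea -- classifying sequencing reads by looking up their subsequences in a reference database -- can be illustrated with a toy k-mer index. Everything below (the k-mer length, reference fragments, and read) is invented for illustration and is not the UCSF pipeline:

```python
# Toy sketch of metagenomic read classification by k-mer lookup.
# A simplified illustration of the general idea only; reference
# fragments and the read are hypothetical.
from collections import Counter

K = 8  # k-mer length (real pipelines use much longer k-mers)

# Hypothetical reference "database": organism -> genome fragment
REFERENCE = {
    "Escherichia coli": "ATGGCTAGCTAGGCTTACGATCGATCGGCTA",
    "Candida albicans": "TTGACCGATGGCTTAGGCCATCGGATACGTT",
}

def build_index(reference):
    """Map every k-mer in every reference genome to its organism."""
    index = {}
    for organism, genome in reference.items():
        for i in range(len(genome) - K + 1):
            index.setdefault(genome[i:i + K], set()).add(organism)
    return index

def classify_read(read, index):
    """Vote over all k-mers in a read; return the best-matching organism."""
    votes = Counter()
    for i in range(len(read) - K + 1):
        for organism in index.get(read[i:i + K], ()):
            votes[organism] += 1
    return votes.most_common(1)[0][0] if votes else None

index = build_index(REFERENCE)
print(classify_read("GCTAGCTAGGCTTACGAT", index))  # -> Escherichia coli
```

Production tools work on the same lookup-and-vote principle, but against databases of all known pathogen genomes and with statistical filtering of human and contaminant reads.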

Chiu and colleagues at the UCSF Center for Next-Gen Precision Diagnostics first developed this method to identify infectious agents in spinal fluid in cases of encephalitis and meningitis, notably helping to save a long-sick boy's life, and later validated the protocol for use as a clinical test that is now being ordered by physicians at hospitals nationwide.

Chiu and collaborators also developed a similar blood test for sepsis, a leading killer of hospital patients, while other tests use respiratory fluid to diagnose infectious causes of pneumonia.

But each of these tests is designed to work only with specific body fluids, not all. Unfortunately, physicians are often uncertain of the origin of a patient's infection and must send off samples of several different body fluids simultaneously for lab analysis.

In the new study, the UCSF researchers, including Center for Next-Gen Precision Diagnostics co-founders Joe DeRisi, PhD, and Steve Miller, MD, PhD, compared performance of their new single-protocol "metagenomic" DNA test to gold-standard laboratory culture-based tests and now-standard PCR-based DNA tests, using two high-powered DNA sequencing technologies to diagnose bacterial or fungal infection. One was a portable, pocket-sized sequencer made by Oxford Nanopore Technologies, which can complete sequencing within six hours and to date has been used almost exclusively by research labs. The other was Illumina sequencing, which can simultaneously handle many samples in parallel and which already is used in some clinical labs (including at UCSF), but which requires more than 24 hours to complete.

The researchers analyzed body fluids -- 180 samples from in and around the lungs, the peritoneal cavity, pus-filled abscesses, the spinal cord, joints, and other sites such as tonsillar fluid and even vitreal (eye) fluid -- from 160 patients, 144 of whom were hospitalized.

Compared with gold-standard culture and PCR, the researchers diagnosed 79% of bacterial and 91% of fungal infections by Illumina sequencing, and 75% of bacterial and 91% of fungal infections by nanopore sequencing.

Using the metagenomic DNA test, Chiu and colleagues were also able to diagnose infections in seven of 12 patients whose illnesses had remained undiagnosed after standard culturing or PCR-based DNA testing.

"We think this one metagenomic test can potentially replace all PCR-based DNA tests now being used to detect hundreds of organisms that can't be adequately cultured," Chiu said.

The researchers are now moving towards FDA regulatory approval in hopes of making this test a standard part of clinical practice at UCSF and elsewhere.

Credit: 
University of California - San Francisco

Explaining the religious vote for Trump

New research by LSU sociologists indicates it wasn't Christian nationalism that drove churchgoers' Trump vote in 2016. Rather, surprisingly, Christian nationalism was important among non-churchgoers. Christian nationalism is thought to have been an important factor in the election of Donald Trump as President of the United States in 2016--and likely drove many of his supporters to the polls in 2020. Now, new research shows Christian nationalist support of Trump isn't tied to religious institutions or attending church on a regular basis. Instead, it's tied to not attending church.

Regardless of political or personal background, voters who hold strong Christian nationalist values voted for Trump at high levels if they didn't go to church, according to 2017 survey data analyzed by Samuel Stroope and Heather Rackin, associate professors of sociology in the LSU College of Humanities & Social Sciences, with co-authors Paul Froese of Baylor University and Jack Delehanty of Clark University. The researchers define Christian nationalism as a set of beliefs about how Christianity should be prioritized in public life, in laws, and in America's national identity. In a forthcoming paper in Sociological Forum, titled "Unchurched Christian Nationalism and the 2016 U.S. Presidential Election," they call for nuance in explaining the so-called "religious vote" for Trump.

"The 2016 election may not be a straightforward story of religious communities coalescing around the Christian nationalist candidate...Christian nationalism operates differently for those inside and outside of religious institutions [and] religion's most dynamic effects on U.S. politics may have less to do with what happens inside churches than with how people--whether they are individually religious or not--use religious ideas to draw and impose boundaries around national identity," write the authors.

Stroope and Rackin pull together several threads from previous research. First, how Christian nationalism can be seen as an aspect of a larger populist ethos of victimization, embattlement, and resentment. Trump received significant support from alienated Americans who appear to be disengaged from religious congregations and other social institutions. Second, how Christian nationalist rhetoric can indicate nostalgia or be used as a veil for increasingly unpopular opinions, such as racial bias or anti-LGBTQ views. Referencing previous research, the authors write that "many Americans now feel that they are victimized for expressing traditional values concerning marriage, sexuality, and gender identity."

Detachment from religious communities can also intensify conservative attitudes.

"Institutions in general can have a stabilizing effect on people's lives and ideologies," Stroope said. "People who want to have their views 'checked' might also self-sort into institutions. Furthermore, religious communities can have a stress-buffering effect, so people feel less desperate for an authoritarian figure like Trump."

Their analysis using national data confirmed that churchgoers overall were more likely to vote for Trump than non-churchgoers. But these findings became more interesting when the researchers took Christian nationalism into account, indicated by voters' agreement or disagreement with statements such as "the federal government should declare the United States a Christian nation," or "the success of the United States is part of God's plan."

For non-churchgoers, the percentage who voted for Trump varied sharply with Christian nationalist beliefs. Less than 10 percent of non-churchgoers who strongly disagreed with the Christian nationalist statements voted for Trump. Meanwhile, nearly 90 percent of those who strongly agreed with Christian nationalist statements did. For regular churchgoers, however, Trump support did not have the same dramatic swing across different levels of Christian nationalist sentiment. After Stroope and Rackin controlled for an array of background characteristics, such as voters' party affiliation, the effect of Christian nationalism on Trump-voting was only clear for non-churchgoers. Stroope and Rackin did not find any evidence that Christian nationalism was tied to Trump-voting among churchgoers.

What motivated Stroope to study the religious vote for Trump in the first place was the "dissonance" he perceived between why churchgoers would vote Republican and Trump's style of Christian nationalism.

"Some of what I saw didn't quite mesh for me," Stroope said. "On the one hand, I heard anecdotal reports of patriotic church services and commentators' claims that Christian nationalism explained the 'religious vote' for Trump. Clearly, just like in other recent elections, the religious vote mattered in 2016, but I questioned whether it was because of Christian nationalism. On the other hand, research coming out of Europe on right-wing populism suggests how it seems to activate religious identity among people who aren't regular churchgoers. In some ways, Trump is actually the perfect candidate for people who aren't very religiously observant yet have Christian nationalist sentiments. He may have attracted unchurched Christian nationalist voters because he uses pro-Christian language but is himself not personally religiously observant."

So, rather than being a story of how the religious nationalist vote for Trump was driven by Christian leaders, churches, and institutions, Stroope and Rackin suggest that it was buoyed by the religiously disconnected.

"You have to keep in mind that religion is complex and multidimensional," Stroope said. "It shouldn't be surprising that many people who don't attend church still have religious beliefs and identities, and these religious identities can be used to draw boundaries, infer value, and be a salve for alienation in a changing America."

"In a relatively short time in our country, we've also seen rapid demographic and cultural change," Stroope continued. "With the first Black president in Barack Obama and marriage equality, many people see rapid changes in American society, and this can feel distressing or at least disorienting to some. And if they don't belong to a community or church where they can feel anchored and emotionally supported, their feelings of distress probably aren't soothed by things like talk radio, cable news, or social media. Likely the opposite. If they fear their identity or way of life is threatened, their distress may fester."

With religious attendance generally in decline, great uncertainty with the U.S. economy due to COVID-19 and a changing climate, Stroope and Rackin cannot dismiss the possibility of Christian nationalism becoming an even stronger driver of American politics in the future.

"There is room for yet more surprises," Rackin said.

Credit: 
Louisiana State University

Glioblastoma nanomedicine crosses into brain in mice, eradicates recurring brain cancer

'I've worked in this field for more than 10 years and have not seen anything like this.'

A new synthetic protein nanoparticle capable of slipping past the nearly impermeable blood-brain barrier in mice could deliver cancer-killing drugs directly to malignant brain tumors, new research from the University of Michigan shows.

The study is the first to demonstrate an intravenous medication of this kind that can cross the blood-brain barrier to reach brain tumors.

The discovery could one day enable new clinical therapies for treating glioblastoma, the most common and aggressive form of brain cancer in adults, and one whose incidence is rising in many countries. Today's median survival for patients with glioblastoma is around 18 months; the average 5-year survival rate is below 5%.

In combination with radiation, the U-M team's intravenously injected therapy led to long-term survival in seven out of eight mice. When those seven mice experienced a recurrence of glioblastoma, their immune responses kicked in to prevent the cancer's regrowth--without any additional therapeutic drugs or other clinical treatments.

"It's still a bit of a miracle to us," said Joerg Lahann, the Wolfgang Pauli Collegiate Professor of Chemical Engineering and a co-senior author of the study. "Where we would expect to see some levels of tumor growth, they just didn't form when we rechallenged the mice. I've worked in this field for more than 10 years and have not seen anything like this."

The findings suggest that the U-M team's combination of therapeutic drugs and nanoparticle delivery methods not only eradicated the primary tumor, but resulted in immunological memory, or the ability to more quickly recognize--and attack--remaining malignant cancer cells.

"This is a huge step toward clinical implementation," said Maria Castro, the R.C. Schneider Collegiate Professor of Neurosurgery and a co-senior author of the study. "This is the first study to demonstrate the ability to deliver therapeutic drugs systemically, or intravenously, that can also cross the blood-brain barrier to reach tumors."

Five years ago, Castro knew how she wanted to target glioblastoma. She wanted to stop a signal that cancer cells send out, known as STAT3, to trick immune cells into granting them safe passage within the brain. If she could shut down that pathway with an inhibitor, the cancer cells would be exposed and the immune system could eliminate them. But she didn't have a way to get past the blood-brain barrier.

She attended a workshop at the Biointerfaces Institute, which Lahann leads, and the two discussed the problem. Lahann's team began working on a nanoparticle that could ferry a STAT3 inhibitor past the blood-brain barrier.

A protein called human serum albumin, which is present in blood, is one of the few molecules that can cross the blood-brain barrier, so Lahann's team used it as the structural building block for their nanoparticles. They used synthetic molecules to link these proteins up and then attached the STAT3 inhibitor and a peptide called iRGD, which serves as a tumor homing device.

Over the course of three weeks, a cohort of mice received multiple doses of the new nanomedicine, extending their median survival to 41 days, up from 28 days for those untreated. Following that success, the team performed a second mouse study using the drug alongside the current standard of care: focused radiotherapy. Seven of the eight mice reached long-term survival and appeared completely tumor-free, with no signs of malignant, invasive tumor cells.

The researchers say their synthetic protein nanoparticles could be adopted, after further development and preclinical testing, to deliver other small-molecule drugs and therapies to currently "undruggable" solid tumors.

Credit: 
University of Michigan

Making 3D nanosuperconductors with DNA

image: An illustration showing how highly nanostructured 3-D superconducting materials can be created based on DNA self-assembly.

Image: 
Brookhaven National Laboratory

UPTON, NY--Three-dimensional (3-D) nanostructured materials--those with complex shapes at a size scale of billionths of a meter--that can conduct electricity without resistance could be used in a range of quantum devices. For example, such 3-D superconducting nanostructures could find application in signal amplifiers to enhance the speed and accuracy of quantum computers and ultrasensitive magnetic field sensors for medical imaging and subsurface geology mapping. However, traditional fabrication tools such as lithography have been limited to 1-D and 2-D nanostructures like superconducting wires and thin films.

Now, scientists from the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, Columbia University, and Bar-Ilan University in Israel have developed a platform for making 3-D superconducting nano-architectures with a prescribed organization. As reported in the Nov. 10 issue of Nature Communications, this platform is based on the self-assembly of DNA into desired 3-D shapes at the nanoscale. In DNA self-assembly, a single long strand of DNA is folded by shorter complementary "staple" strands at specific locations--similar to origami, the Japanese art of paper folding.

"Because of its structural programmability, DNA can provide an assembly platform for building designed nanostructures," said co-corresponding author Oleg Gang, leader of the Soft and Bio Nanomaterials Group at Brookhaven Lab's Center for Functional Nanomaterials (CFN) and a professor of chemical engineering and of applied physics and materials science at Columbia Engineering. "However, the fragility of DNA makes it seem unsuitable for functional device fabrication and nanomanufacturing that requires inorganic materials. In this study, we showed how DNA can serve as a scaffold for building 3-D nanoscale architectures that can be fully "converted" into inorganic materials like superconductors."

To make the scaffold, the Brookhaven and Columbia Engineering scientists first designed octahedral-shaped DNA origami "frames." Aaron Michelson, Gang's graduate student, applied a DNA-programmable strategy so that these frames would assemble into desired lattices. Then, he used a chemistry technique to coat the DNA lattices with silicon dioxide (silica), solidifying the originally soft constructions, which required a liquid environment to preserve their structure. The team tailored the fabrication process so the structures were true to their design, as confirmed by imaging at the CFN Electron Microscopy Facility and small-angle x-ray scattering at the Complex Materials Scattering beamline of Brookhaven's National Synchrotron Light Source II (NSLS-II). These experiments demonstrated that the structural integrity was preserved after they coated the DNA lattices.

"In its original form, DNA is completely unusable for processing with conventional nanotechnology methods," said Gang. "But once we coat the DNA with silica, we have a mechanically robust 3-D architecture that we can deposit inorganic materials on using these methods. This is analogous to traditional nanomanufacturing, in which valuable materials are deposited onto flat substrates, typically silicon, to add functionality."

The team shipped the silica-coated DNA lattices from the CFN to Bar-Ilan's Institute of Superconductivity, which is headed by Yosi Yeshurun. Gang and Yeshurun became acquainted a couple years ago, when Gang delivered a seminar on his DNA assembly research. Yeshurun--who over the past decade has been studying the properties of superconductivity at the nanoscale--thought that Gang's DNA-based approach could provide a solution to a problem he was trying to solve: How can we fabricate superconducting nanoscale structures in three dimensions?

"Previously, making 3-D nanosuperconductors involved a very elaborate and difficult process using conventional fabrication techniques," said Yeshurun, co-corresponding author. "Here, we found a relatively simple way using Oleg's DNA structures."

At the Institute of Superconductivity, Yeshurun's graduate student Lior Shani evaporated a low-temperature superconductor (niobium) onto a silicon chip containing a small sample of the lattices. The evaporation rate and silicon substrate temperature had to be carefully controlled so that niobium coated the sample but did not penetrate all the way through. If that happened, a short could occur between the electrodes used for the electronic transport measurements.

"We cut a special channel in the substrate to ensure that the current would only go through the sample itself," explained Yeshurun.

The measurements revealed a 3-D array of Josephson junctions, or thin nonsuperconducting barriers through which superconducting current tunnels. Arrays of Josephson junctions are key to leveraging quantum phenomena in practical technologies, such as superconducting quantum interference devices for magnetic field sensing. In 3-D, more junctions can be packed into a small volume, increasing device power.

"DNA origami has been producing beautiful and ornate 3-D nanoscale structures for almost 15 years, but DNA itself is not necessarily a useful functional material," said Evan Runnerstrom, program manager for materials design at the U.S. Army Combat Capabilities Development Command Army Research Laboratory of the U.S. Army Research Office, which funded the work in part. "What Prof. Gang has shown here is that you can leverage DNA origami as a template to create useful 3-D nanostructures of functional materials, like superconducting niobium. This ability to arbitrarily design and fabricate complex 3-D-structured functional materials from the bottom-up will accelerate the Army's modernization efforts in areas like sensing, optics, and quantum computing."

"We demonstrated a pathway for how complex DNA organizations can be used to create highly nanostructured 3-D superconducting materials," said Gang. "This material conversion pathway gives us an ability to make a variety of systems with interesting properties--not only superconductivity but also other electronic, mechanical, optical, and catalytic properties. We can envision it as a "molecular lithography," where the power of DNA programmability is transferred to 3-D inorganic nanofabrication."

Credit: 
DOE/Brookhaven National Laboratory

Analysis of Trump's tweets reveals systematic diversion of the media

President Donald Trump's controversial use of social media is widely known and theories abound about its ulterior motives. New research published today in Nature Communications claims to provide the first evidence-based analysis demonstrating the US President's Twitter account has been routinely deployed to divert attention away from a topic potentially harmful to his reputation, in turn suppressing negative related media coverage.

The international study, led by the University of Bristol in the UK, tested two hypotheses: whether an increase in harmful media coverage was followed by increased diversionary Twitter activity, and if such diversion successfully reduced subsequent media coverage of the harmful topic.

Lead author Professor Stephan Lewandowsky, Professor of Cognitive Psychology at the University of Bristol, said: "Our analysis presents empirical evidence consistent with the theory that whenever the media report something threatening or politically uncomfortable for President Trump, his account increasingly tweets about unrelated topics representing his political strengths. This systematic diversion of attention away from a topic potentially damaging to him was shown to significantly reduce negative media coverage the next day."

Social media gives political leaders direct and immediate access to their constituents, offering an opportunity to explain their actions and policy proposals at an unprecedented scale. President Trump is one of the most prolific users among world leaders. Since the beginning of his candidacy in 2015, approximately 30,000 tweets have been sent from Trump's account. While anecdotal reports suggest the tweets have served to divert media attention away from news that can be assumed to be politically harmful to him, evidence for such diversion has remained unsubstantiated - until now.

The study focused on Trump's first two years in office, scrutinising the Robert Mueller investigation into potential collusion with Russia in the 2016 Presidential Election, as this was politically harmful to the President. The team analysed content relating to Russia and the Mueller investigation in two of the country's most politically neutral media outlets, the New York Times (NYT) and ABC World News Tonight (ABC). The team also selected a set of keywords judged to play to Trump's preferred topics at the time, which were hypothesized to be likely to appear in diversionary tweets. The keywords related to "jobs", "China", and "immigration", topics representing the president's supposed political strengths.

The researchers hypothesized that the more ABC and NYT reported on the Mueller investigation, the more Trump's tweets would mention jobs, China, and immigration, which in turn would result in less coverage of the Mueller investigation by ABC and NYT.

In support of their hypotheses, the team found that every five additional ABC headlines relating to the Mueller investigation were associated with one more mention of a keyword in Trump's tweets. In turn, every two additional mentions of one of the keywords in a Trump tweet were associated with roughly one fewer mention of the Mueller investigation in the following day's NYT.
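
The findings above are lagged associations between daily counts. As a rough illustration of how such two-step associations can be estimated, here is a minimal sketch using ordinary least squares on fabricated daily counts; the column names, simulated data, and model form are all assumptions, not the authors' actual specification, which uses a more sophisticated time-series approach:

```python
# Minimal sketch of the lagged-association idea behind the study
# (not the authors' actual model); all daily counts are fabricated
# for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
days = 200
df = pd.DataFrame({
    # Hypothetical daily counts of Mueller-related ABC headlines
    "abc_mueller": rng.poisson(3, days),
})
# Hypothetical diversionary tweets: partly driven by same-day coverage
df["tweet_keywords"] = rng.poisson(1, days) + (df["abc_mueller"] // 5)
# Hypothetical next-day NYT coverage: suppressed by today's tweets
df["nyt_mueller_next"] = np.clip(
    rng.poisson(4, days) - df["tweet_keywords"] // 2, 0, None)

# Step 1: does harmful coverage predict diversionary tweeting?
m1 = sm.OLS(df["tweet_keywords"],
            sm.add_constant(df["abc_mueller"])).fit()
# Step 2: does diversionary tweeting predict next-day coverage?
m2 = sm.OLS(df["nyt_mueller_next"],
            sm.add_constant(df["tweet_keywords"])).fit()
print(m1.params, m2.params, sep="\n")
```

In this toy data the step-one slope comes out near 0.2 (about one extra keyword mention per five headlines) and the step-two slope near -0.5, loosely mirroring the magnitudes reported above.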

Such a pattern did not emerge with placebo topics that presented no threat to the President, for instance Brexit, or with non-political issues such as football or gardening.

The researchers also conducted an expanded analysis considering the President's entire Twitter vocabulary as a potential source of diversion, which confirmed the generality of their conclusions. Specifically, the analysis identified nearly 90 pairs of words that were more likely to appear in tweets when Russia-Mueller coverage increased, and that suppressed media coverage the next day. Those word pairs largely represented the President's political strengths, again focusing in particular on the economy.

Both analyses accounted for a number of potentially confounding factors and included robustness checks, such as randomisation, sensitivity analyses, and the use of placebo keywords, to rule out artifactual explanations and strengthen claims of possible causal relationships.

Professor Lewandowsky said: "It's unclear whether President Trump, or whoever is at the helm of his Twitter account, engages in such tactics intentionally or if it's mere intuition. Either way, we hope these results serve as a helpful reminder to the media that they have the power to set the news agenda, focusing on the topics they deem most important, while perhaps not paying so much attention to the Twitter-sphere."

Credit: 
University of Bristol

Animal groups consider multiple factors before fighting

Groups of animals consider multiple factors before deciding whether to fight rivals, researchers say.

Before one-on-one fights, animals are known to make decisions based on factors including the size and strength of the opponent, the outcome of recent fights and the importance of the prize.

But scientists from the universities of Exeter and Plymouth say previous research has often overlooked complexity in group conflicts and assumed that larger groups will always win.

Instead, they say factors like group cohesion and teamwork, the strength of individual members and the location of battle all likely play a part - and animal groups weigh up the situation before fighting.

"Any potential fight - whether between humans or animals - gets more complex if there are multiple individuals on each side," said lead author Dr Patrick Green, of the Centre for Ecology and Conservation on the University of Exeter's Penryn Campus in Cornwall.

"Groups may assess both the importance of whatever they're fighting about, and a range of factors about their own group and the opponent.

"Research on dyadic (one-on-one) fights has developed an advanced framework on 'assessment' - how animals gather information and decide whether to fight, how much effort to put in, and if and when to give up.

"However, studies on group contests among social-living animals haven't generally focused on assessment.

"Understanding more about this can teach us not only about evolution, but also about conflict in humans."

Fights between social groups are common in nature.

Groups with more members are often assumed to be the likely winners of any fight, and indeed studies of animals including primates, lions, birds and ants show this is often correct.

However, the study highlights other factors that can play a part:

Strong individuals: Among grey wolves, smaller groups with more males - which are bigger and stronger than females - can overcome larger groups.

Motivation: Meerkat groups that contain pups can win despite inferior numbers - suggesting a "motivation advantage" because gaining new territory can result in more food for their young.

Chances of winning: Studies of turtle ants, which have multiple nests, suggest they prioritise defending those with narrower entrances - as larger entrances are harder to defend. This allows them to successfully defend certain parts of their territory.

"Winner/loser effect": Losers of baboon intergroup conflicts spend less time in the area where the fight occurred than they did before the fight, suggesting they avoid areas where they previously lost.

Social cohesion: In months in which they have lots of intergroup fights, chimpanzee social groups are more cohesive and males are less aggressive within the group, suggesting cohesion may be useful at times when fights are likely.

Co-author Mark Briffa, Professor of Animal Behaviour at the University of Plymouth, said: "Researchers have spent years wondering about the extent to which individual fighting animals use 'assessment' - effectively, sizing their opponent up.

"In this paper, we explore the scope for groups of rivals to do a similar thing.

"This could be a possibility in many examples across the animal kingdom where individuals work collectively, such as battles between rival groups of ants or even warfare between rival groups in humans."

Credit: 
University of Exeter

Stanford researchers develop DNA approach to forecast ecosystem changes

image: A night vision camera trap captured this image of mountain lions drinking from a stream at Stanford's Jasper Ridge Biological Preserve.

Image: 
Jasper Ridge Biological Preserve

When wolves returned to Yellowstone in 1995, no one imagined the predators would literally change the course of rivers in the national park through cascading effects on other animals and plants. Now, a Stanford University-developed approach holds the promise of forecasting such ecosystem changes as certain species become more prevalent or vanish altogether.

Outlined in Frontiers in Ecology and Evolution, the rapid, low-cost technique is the first to analyze DNA left behind in animals' feces to map out complex networks of species interactions in a terrestrial system. It could help redefine conservation as we know it, identify otherwise hard-to-find species and guide a global effort to rewild vast areas through the reintroduction of locally extirpated species.

"It's not just that we can rapidly capture the biodiversity of an area," said study lead author Jordana Meyer, a biology PhD candidate in the Stanford School of Humanities and Sciences. "We can also quantify the extent of indirect links among species, such as how a specific predator's behavior affects vegetation in an area. This allows us to measure impacts on species that are essential to the system or particularly vulnerable."

Just as the introduction of species, such as Yellowstone's wolves, can have widespread effects, their disappearance can be devastating in ways that are hard for scientists to predict. Meyer, whose work focuses primarily on African wildlife, has seen the impact first-hand in the Democratic Republic of Congo. There, the loss of large herbivores, such as rhinos and elephants, has led to the shrinking of the once-massive grassland savannahs the creatures grazed.

As human impacts on wild places accelerate, effective conservation and ecosystem management will require more rapid, inexpensive and non-invasive technologies for capturing changes in biodiversity and quantifying species interactions. One of the most promising tools is the study of so-called environmental DNA in left-behind animal materials, such as hair and skin. After extracting the DNA, scientists sequence and compare it to online databases to identify the organisms present in a certain area. It's a relatively fast, low-maintenance process compared to traditional approaches, such as live-trapping, animal-tracking and camera trapping.

Working at Stanford's 1,193-acre Jasper Ridge Biological Preserve, the researchers used their technique to analyze feces from carnivores such as mountain lions, omnivores such as gray foxes and herbivores such as black-tailed deer. By identifying the DNA in the diets of these animals, the researchers constructed an extraordinarily detailed, data-rich food web and accurately captured the biodiversity of the area when compared against other animal surveys and a long-term camera trap study in the preserve.

Among other surprises, the new analysis revealed the indirect effects of a predator cascade on vegetation and allowed the researchers to determine exactly how predators competed with each other. These results were validated against camera trap data gathered at Jasper Ridge over the past seven years, which showed that the return of mountain lions, the ecosystem's top predator, caused a decline in deer and coyote occurrence. Without its coyote competitor, the formerly rare gray fox returned to Jasper Ridge. Gray foxes subsist more on plants, namely fruits and seeds, than do coyotes. Thus, the rise in gray foxes can lead to alterations in the distribution and abundance of fruit plants at the preserve because seeds often remain viable after being digested by mammals. Armed with this type of knowledge, managers can predict the impacts of shifting animal and plant communities, which can, in turn, provide a framework for conservation-relevant decisions.
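
One way to picture the food-web construction described above: each fecal sample links a consumer species to the diet species detected in its DNA, and pooling those links gives a weighted, directed network from which indirect effects, such as predator-to-plant links, can be read off. The species and read counts below are invented, and this is a conceptual sketch only, not the study's pipeline:

```python
# Toy sketch of building a diet-based food web from fecal DNA hits
# (invented species and counts; not the study's actual data or code).
from collections import defaultdict

# (consumer, diet item detected in its feces, sequencing read count)
detections = [
    ("mountain lion", "black-tailed deer", 1200),
    ("mountain lion", "gray fox", 40),
    ("coyote", "black-tailed deer", 300),
    ("gray fox", "fruit/seeds", 500),
    ("black-tailed deer", "oak browse", 900),
]

# Weighted directed edges: consumer -> prey/plant, weight = total reads
food_web = defaultdict(lambda: defaultdict(int))
for consumer, item, reads in detections:
    food_web[consumer][item] += reads

def indirect_links(web, predator):
    """Species two steps away from a predator in the diet network."""
    links = set()
    for prey in web.get(predator, {}):
        links.update(web.get(prey, {}))
    return links

print(indirect_links(food_web, "mountain lion"))
# -> {'fruit/seeds', 'oak browse'} (set order may vary): the predator
# indirectly touches both the fox's fruit diet and the deer's forage
```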

The DNA the researchers collected in animal feces also identified plant and animal species not known to occur within the preserve, providing an early warning of invasive species.

"We are excited about this approach because it will not only help us to understand how and why species survive in protected areas based on what they eat, but also whether animals are able to capitalize on non-native plant and animal species," said study senior author Elizabeth Hadly, the Paul S. and Billie Achilles Professor in Environmental Biology in Stanford's School of Humanities and Sciences. Hadly's lab has pioneered work with left-behind and ancient DNA in the U.S., South America and India.

These methods could aid in rewilding protected areas by allowing researchers to model how ecosystems will respond to certain species before they are actually reintroduced. For example, before reintroducing the African lion to protected parts of Africa, scientists could first study the biodiversity and connectivity of the areas and predict how the lions could impact prey populations and other knock-on effects they might trigger throughout the entire ecosystem.

The researchers plan to scale up their model across protected areas in Africa to assist in strategic adaptive management and rewilding strategies. "I am hopeful that techniques like this can help us secure and monitor natural spaces on a global scale," Meyer said.

Credit: 
Stanford University

Penn researchers present findings on cardiac risks for cancer patients

PHILADELPHIA - Developments in cancer treatment and care have dramatically improved survival for cancer patients; however, these treatments can also damage other parts of the body, including the heart. At the American Heart Association's Scientific Sessions 2020, physician-researchers from the Perelman School of Medicine at the University of Pennsylvania will present findings about cardiac care for cancer patients and survivors.

Cardiovascular Toxicities of Cyclin Dependent Kinase 4/6 Inhibitors in Metastatic Breast Cancer Patients

While the field is aware of the potential damage cancer treatment can have on the heart, little data exists regarding the connection between cardiovascular adverse events--such as heart failures and arrhythmias--and Cyclin Dependent Kinase (CDK) 4/6 inhibitor therapy agents. CDK 4/6 inhibitors, such as palbociclib, ribociclib, and abemaciclib, are a novel class of cancer therapeutics which have significantly improved survival in patients with hormone receptor positive (ER/PR+), HER2 negative metastatic breast cancer.

An analysis of the OneFlorida dataset led by Michael Fradley, MD, medical director of Penn Cardio-Oncology and an associate professor of Cardiovascular Medicine, found that cardiovascular adverse events occurred in 16.8 percent of adult patients without prior cardiovascular disease who received at least one CDK 4/6 inhibitor. Of those patients, 17.2 percent died.

"While no single CDK 4/6 inhibitor is uniquely responsible for these findings, cardiovascular adverse events are common in these patients. Patients taking CDK 4/6 inhibitors should be monitored with aggressive risk mitigation strategies to minimize any potential issues," Fradley said.

Fradley will present the findings in a moderated digital poster session on November 13, 2020 at 10:00 a.m. EST/9:00 a.m. CST. It will be available through the meeting's OnDemand content until the end of the conference on November 17, 2020, 9:30 p.m. EST/8:30 p.m. CST.

Additional authors include Penn's Bonnie Ky, as well as Nam Nguyen, Yiqing Chen, Avirup Guha, Jenica N. Upshaw, and Yan Gong.

Structural Characterization of Cardiotoxicity Induced by Chemotherapy Agents

Marielle Scherrer-Crosbie, MD, PhD, director of the Cardiac Ultrasound Laboratory and a professor of Cardiovascular Medicine, will review recent advances in how cardiac imaging can provide insights into the mechanisms of chemotherapy-induced cardiotoxicity and identify patients at risk.

Scherrer-Crosbie's interest is the relationship between cardiovascular health and hematological malignancies--cancers that affect the blood, bone marrow, and lymph nodes, such as the various types of leukemia. Research from her group shows a high rate of adverse cardiovascular events, especially heart failure, in patients with acute leukemia. To identify a patient's risk for heart failure following chemotherapy treatment, Scherrer-Crosbie developed a risk score based on clinical and imaging variables.

"Our goal is to help clinicians identify patients with the highest risk for potential cardiac damage, so they can more closely tailor treatment plans and monitor patients in order to improve outcomes," Scherrer-Crosbie said.

The session will take place during the ATVB journal session on the cardiovascular effects of anticancer treatments/agents. The session is part of the meeting's OnDemand content, and will be available from the start of the conference November 13, 2020 at 10:00 a.m. EST/9:00 a.m. CST until the end of the conference November 17, 2020, 9:30 p.m. EST/8:30 p.m. CST.

Novel Practical Concepts in Cardio-Oncology

During the AHA sessions, Bonnie Ky, MD, MSCE, the Founders Associate Professor of Cardiovascular Medicine, Epidemiology, and Cardio-Oncology; director of the Penn Cardio-Oncology Translational Center of Excellence, and editor-in-chief of JACC: CardioOncology, will address advances in the field of cardio-oncology, and new approaches to improving the cardiovascular care of cancer patients through personalized medicine.

"We know that potentially life-saving cancer therapies can impact the heart, but we believe these potential adverse reactions are detectable, treatable, and preventable," Ky said. "To me, it's a travesty that a patient can survive cancer, but then suffer from cardiovascular disease. We are working to try to understand in whom, why, and when do these adverse cardiovascular events occur in patients, so we can ultimately improve the lives of our cancer patients."

By using novel tools such as blood tests, imaging techniques such as 2D and 3D echocardiography, and a person's genetic makeup, it may be possible to identify who is at increased risk before declines in cardiac function occur. There are also newer strategies, including cardioprotection-guided risk stratification and efforts to identify, prevent, and treat heart disease in cancer patients.

Ky's session, titled "Personalized Medicine and Permissive Cardiotoxicity," takes place on Saturday, November 14, 2020 at 8:40 p.m. EST/7:40 p.m. CST.

Credit: 
University of Pennsylvania School of Medicine

Chemicals in your living room cause diabetes

A new UC Riverside study shows flame retardants found in nearly every American home cause mice to give birth to offspring that become diabetic.

These flame retardants, called PBDEs, have been associated with diabetes in adult humans. This study demonstrates that PBDEs also cause diabetes in mice exposed to the chemicals only through their mothers.

"The mice received PBDEs from their mothers while they were in the womb and as young babies through mother's milk," said Elena Kozlova, lead study author and UC Riverside neuroscience doctoral student. "Remarkably, in adulthood, long after the exposure to the chemicals, the female offspring developed diabetes."

Results of the study have been published in the journal Scientific Reports.

PBDEs are common household chemicals added to furniture, upholstery, and electronics to prevent fires. They get released into the air people breathe at home, in their cars, and in airplanes because their chemical bond to surfaces is weak.

"PBDEs are everywhere in the home. They're impossible to completely avoid," said UCR neuroscientist and corresponding author of the study, Dr. Margarita Curras-Collazo.

"Even though the most harmful PBDEs have been banned from production and import into the U.S., inadequate recycling of products that contain them has continued to leach PBDEs into water, soil, and air. As a result, researchers continue to find them in human blood, fat, fetal tissues, as well as maternal breast milk in countries worldwide."

Given their previous association with diabetes in adult men and women, and in pregnant women, Curras-Collazo and her team wanted to understand whether these chemicals could have harmful effects on children of PBDE-exposed mothers. But such experiments can only be done on mice.

Diabetes leads to elevated levels of blood glucose, or blood sugar. After a meal, the pancreas releases insulin, a hormone that helps cells utilize glucose from food. When cells are resistant to insulin, it doesn't work as intended, and levels of glucose remain high in the blood even when no food has been eaten.

Chronically high levels of glucose can cause damage to the eyes, kidneys, heart, and nerves. They can also lead to life-threatening conditions.

"This study is unique because we tested both the mothers and their offspring for all the hallmarks of diabetes exhibited in humans," Curras-Collazo said. "This kind of testing has not been done before, especially on female offspring."

The researchers gave PBDEs to the mouse mothers at low levels comparable to average human environmental exposure both during pregnancy and lactation.

All of the babies developed glucose intolerance, high fasting glucose levels, insulin insensitivity, and low blood insulin levels, which are all hallmarks of diabetes. The researchers also found the babies had high levels of endocannabinoids in the liver, which are molecules associated with appetite, metabolism, and obesity.

Though the mothers developed some glucose intolerance, they weren't as affected as their offspring.

"Our findings indicate that chemicals in the environment, like PBDEs, can be transferred from mother to offspring, and exposure to them during the early developmental period is damaging to health," Curras-Collazo said.

The research team feels future longitudinal studies in humans are needed to determine the long-term consequences of early-life PBDE exposure.

"We need to know if human babies exposed to PBDEs both before and after birth go on to become diabetic children and adults," Kozlova said.

In the meantime, Curras-Collazo advises people to limit PBDE exposure by taking steps such as washing hands before eating, vacuuming frequently, and buying furniture and other products that do not contain them. She also hopes expectant mothers are well informed about stealth environmental chemicals that can affect their unborn and developing children, as well as their breast milk.

"We believe the benefits babies get from mothers' milk far outweigh the risks of passing on the PBDEs to children. We do not recommend curtailing breastfeeding," she said. "But let's advocate for protecting breast milk and our bodies from killer couch chemicals."

Credit: 
University of California - Riverside

5 mistakes people make when sharing COVID-19 data visualizations on Twitter

image: Over 25 percent of coronavirus-related data visualizations analyzed on Twitter failed to clearly cite their information sources, reducing trustworthiness.

Image: 
Image courtesy Francesco Cafaro, Indiana University

INDIANAPOLIS -- The frantic swirl of coronavirus-related information sharing that took place this year on social media is the subject of a new analysis led by researchers at the School of Informatics and Computing at IUPUI.

Published in the open-access journal Informatics, the study focuses on the sharing of data visualizations on Twitter -- by health experts and average citizens alike -- during the initial struggle to grasp the scope of the COVID-19 pandemic, and its effects on society. Many social media users continue to encounter similar charts and graphs every day, especially as a new wave of coronavirus cases has begun to surge across the globe.

The work found that more than half of the analyzed visualizations from average users contained one of five common errors that reduced their clarity, accuracy or trustworthiness.

"Experts have not yet begun to explore the world of casual visualizations on Twitter," said Francesco Cafaro, an assistant professor in the School of Informatics and Computing, who led the study. "Studying the new ways people are sharing information online to understand the pandemic and its effect on their lives is an important step in navigating these uncharted waters."

Casual data visualizations refer to charts and graphs that rely upon tools available to average users in order to visually depict information in a personally meaningful way. These visualizations differ from traditional data visualization because they aren't generated or distributed by the traditional "gatekeepers" of health information, such as the Centers for Disease Control and Prevention or the World Health Organization, or by the media.

"The reality is that people depend upon these visualizations to make major decisions about their lives: whether or not it's safe to send their kids back to school, whether or not it's safe to take a vacation, and where to go," Cafaro said. "Given their influence, we felt it was important to understand more about them, and to identify common issues that can cause people creating or viewing them to misinterpret data, often unintentionally."

For the study, IU researchers crawled Twitter to identify 5,409 data visualizations shared on the social network between April 14 and May 9, 2020. Of these, 540 were randomly selected for analysis -- with full statistical analysis reserved for 435 visualizations based upon additional criteria. Of these, 112 were made by average citizens.

Broadly, Cafaro said the study identified five pitfalls common to the data visualizations analyzed. In addition to identifying these problems, the study's authors suggest steps to overcome or reduce their negative impact:

Mistrust: Over 25 percent of the posts analyzed failed to clearly identify the source of their data, sowing distrust in their accuracy. This information was often obscured due to poor design -- such as bad color choices, busy layout, or typos -- not intentional obfuscation. To overcome these issues, the study's authors suggest clearly labeling data sources as well as placing this information on the graphic itself rather than in the accompanying text, as images are often unpaired from their original post during social sharing.

Proportional reasoning: Eleven percent of posts exhibited issues related to proportional reasoning, which refers to the users' ability to compare variables based on ratios or fractions. Understanding infection rates across different geographic locations is a challenge of proportional reasoning, for example, since similar numbers of infections can indicate different levels of severity in low- versus high-population settings. To overcome this challenge, the study's authors suggest using labels such as number of infections per 1,000 people to compare regions with disparate populations, as this metric is easier to understand than absolute numbers or percentages (see the sketch after this list).

Temporal reasoning: The researchers identified 7 percent of the posts with issues related to temporal reasoning, which refers to users' ability to understand change over time. These included visualizations that compared the number of deaths from flu in a full year to the number of deaths from COVID-19 in a few months, or visualizations that failed to account for the delay between the date of infection and death. Recommendations to address these issues included breaking metrics that depend upon different time scales into separate charts, as opposed to conveying the data in a single chart.

Cognitive bias: A small percentage of posts (0.5 percent) contained text that seemed to encourage users to misinterpret data based upon the creator's "biases related to race, country and immigration." The researchers state that information should be presented with clear, objective descriptions carefully separated from any accompanying political commentary.

Misunderstanding about virus: Two percent of visualizations were based upon misunderstandings about the novel coronavirus, such as the use of data related to SARS or influenza.
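
As referenced in the proportional-reasoning item above, the per-1,000-residents normalization the authors recommend is simple arithmetic. A minimal sketch with invented region names, case counts, and populations:

```python
# Sketch of the per-1,000-residents normalization recommended for the
# "proportional reasoning" pitfall; all figures below are invented.
regions = {
    # region: (infections, population)
    "Smallville": (500, 25_000),
    "Metro City": (5_000, 2_500_000),
}

for name, (cases, pop) in regions.items():
    per_1000 = 1000 * cases / pop
    print(f"{name}: {cases} cases, {per_1000:.1f} per 1,000 residents")

# Smallville: 500 cases, 20.0 per 1,000 residents
# Metro City: 5000 cases, 2.0 per 1,000 residents
# The raw counts suggest Metro City is worse; the normalized rate shows
# the outbreak is ten times more intense in Smallville.
```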

The study also found certain types of data visualizations performed strongest on social media. Data visualizations that showed change over time, such as line or bar graphs, were most commonly shared. They also found that users engaged more frequently with charts conveying numbers of deaths as opposed to numbers of infections or impact on the economy, suggesting that people were more interested in the virus's lethality than its other negative health or societal effects.

Credit: 
Indiana University

Creating 3D virtual personas of all-solid-state batteries, building a better tomorrow

image: Prof. Yong Min Lee of DGIST, Korea, led the study to develop this novel 3D digital twinning technique for studying all-solid-state batteries.

Image: 
Yong Min Lee, DGIST

We live in a battery-powered world, and as electronics steadily invade its every corner, the need to find robust batteries grows increasingly important. Today, most devices run on lithium-ion batteries; and while these are generally safe, sometimes they have been known to catch fire or explode.

An alternative that is quickly growing in prominence within the research community is the all-solid-state lithium battery (ASSLB). Unlike lithium-ion batteries, where the electrodes are solid and the electrolyte is liquid, in ASSLBs both the electrodes and the electrolyte are solid, which makes them extremely safe. However, this very property poses a problem: during operation, the volumes of the electrolyte and electrodes change, particularly in high-energy storage batteries. This can cause their surfaces to detach, resulting in poor performance.

If high-performance ASSLBs are to be developed, the complex structures at the interfaces will need to be scrutinized in detail; or so Prof. Yong Min Lee from Daegu Gyeongbuk Institute of Science and Technology (DGIST) believes: "While most researchers have focused on developing new materials or improving the properties of existing all-solid-state lithium batteries, we opted for a different route and decided to find solutions to minimizing the defects in the designs of electrodes and cells. This led us to wonder, 'Is there a way to quantitatively analyze the defects in these batteries?'"

Prof. Lee and team found the answer to their question when they came up with a clever technique: a 3D digital twinning platform in which the microstructures of the solid-solid interfaces can be rendered as detailed 3D replicas of the real thing. The details of their study are published in Elsevier's Nano Energy.

Using the platform, Prof. Lee and team explored the electrode-electrolyte interfaces of an oxide-based ASSLB, arguably the most promising kind of ASSLB. They captured 2D image slices of a selected target area, sequenced the images to digitally reconstruct a 3D structure, and then carried out structural analyses. As expected, they found that the ASSLB's specific contact area was much smaller than that of lithium-ion batteries. This validated the efficacy of their method.
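
The general workflow, stacking segmented 2D slices into a voxel volume and then measuring the electrode-electrolyte contact, can be sketched in a few lines. The Python sketch below is a simplified illustration under assumed conventions (label 0 for pores, 1 for electrode, 2 for solid electrolyte); it is not the DGIST code, and the labels, geometry, and units are placeholders.

```python
# A minimal sketch of slice-based 3D reconstruction and contact-area
# analysis, not the DGIST platform itself. We assume each 2D slice has
# already been segmented into labels: 0 = pore, 1 = electrode,
# 2 = solid electrolyte.
import numpy as np

def reconstruct_volume(slices):
    """Stack sequential 2D label images into one 3D voxel array."""
    return np.stack(slices, axis=0)

def specific_contact_area(volume, voxel_face_area=1.0):
    """Count voxel faces where electrode (1) touches electrolyte (2).

    Detached interfaces show up as electrode voxels bordering pores (0)
    instead of electrolyte, which lowers the contact area.
    """
    contact_faces = 0
    for axis in range(3):
        a = np.moveaxis(volume, axis, 0)
        lo, hi = a[:-1], a[1:]          # adjacent voxels along this axis
        contact_faces += np.sum((lo == 1) & (hi == 2))
        contact_faces += np.sum((lo == 2) & (hi == 1))
    return contact_faces * voxel_face_area

# Toy example: three 4x4 slices with an electrode/electrolyte boundary.
slices = [np.array([[1, 1, 2, 2]] * 4) for _ in range(3)]
volume = reconstruct_volume(slices)
print(specific_contact_area(volume))  # 12 contact faces in this toy case
```

On real tomographic data, the segmentation of the slices would dominate the effort; the counting itself is the easy part.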

Excited about these results, Prof. Lee explains the immense potential of this technique: "Given the broad applicability of this technique, we're sure that its benefits can extend to all electrode-containing devices. But for now, we're confident that our technique will help researchers save time and money while easily checking for defects during battery fabrication processes, helping to optimize designs, and ultimately speeding up the commercialization of all-solid-state batteries."

Perhaps, in the not-too-distant future, ASSLBs could be what powers our world!

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Blue whirl flame structure revealed with supercomputers

image: Blue whirls are a swirling flame phenomenon that evolves from a chaotic fire whirl and burns with nearly soot-free combustion. Supercomputer simulations have revealed the flame structure and flow structure of the blue whirl. (A) Volume rendering of the heat release rate from the numerical simulations. (B) Schematic diagram that summarizes a final result of the blue whirl simulation showing the combination of three different kinds of flame. (C) Observed blue whirl.

Image: 
H. Xiao, University of Science and Technology of China

Lightning struck a bourbon warehouse, setting fire to a cache of 800,000 gallons of liquor in the Bardstown countryside of Kentucky in 2003. Some of it spilled into a nearby creek, spawning a massive fire tornado, or "bourbonado," as reported locally.

Aerial video of it inspired scientists to investigate fire whirls, tornadoes of fire, as a promising tool for oil spill remediation, because the hydrocarbons burned with relatively little soot.

Their fire whirl investigations in the lab led to something that astonished them. The chaotic and dangerous fire whirl transformed into a tame, clean-burning flame they call a "blue whirl."

One of its discoverers is now on a science team using supercomputers allocated by the Extreme Science and Engineering Discovery Environment (XSEDE) to reveal the structure of the blue whirl, a new type of flame that consists of four separate flames. The scientists hope blue whirls can one day be used to burn fuels more cleanly.

"The main finding of this new computational study is that we now know the main structure of the blue whirl," said Elaine Oran, professor and O'Donnell Foundation Chair VI, Department of Aerospace Engineering, Texas A&M University. Oran is a co-discoverer of the blue whirl and a co-author of a study on its structure published in PNAS, August 2020. "We know that it's a combination of many types of flames which come together and form themselves into probably the most ideal configuration for burning, which we had seen before."

A blue whirl is akin to a spinning blue flame that looks like a child's toy top. Oran says the top of it has the same shape as the sorting hat from Harry Potter. Most of its burning occurs along a very bright blue rim that spins.

The researchers used experimental data from the 2016 study that first discovered the blue whirl. The experimental setup consisted of two half-cylinders and a cylindrical stainless steel pan full of water. A liquid fuel, n-heptane, was poured on the surface of still water at the center of the pan and then ignited. Two quartz half-cylinders were suspended over the pan. Offsetting the half-cylinders created two vertical slits that allowed air to be drawn in tangentially to the flame region, a configuration commonly used to create fire whirls for laboratory study.

A chaotic pool fire formed at first. Cold air drawn into the chamber created a strong vertical flow, which in turn produced a tall and intense fire whirl. Then, unexpectedly, it collapsed into the calm blue whirl flame structure.

"We studied the structure of this new flame through the numerical simulation, and we found out the type of burning, and where they occur," said study co-author Xiao Zhang, post-doctoral Researcher, Department of Aerospace Engineering, Texas A&M University, who works for Oran.

Supercomputer simulations helped tease out the blue whirl's structure, which turns out to be made of three types of flames. At the bottom is a rich premixed flame, crowned on top with a purplish, hat-shaped diffusion flame. The simulations also revealed a hidden flame surrounding the purple haze, just outside the diffusion flame. The three flames combine into a triple flame that forms the whirl's bright rim.

The scientists faced a few challenges in simulating the flames.

"The blue whirl in the [laboratory] experiments evolved and developed by itself," Zhang said. There were limited diagnostics from the experiments that didn't give us enough conditions to use to start off the calculations. We started out with a numerical hunt."

They developed new algorithms that could simulate low-Mach-number flows efficiently and implemented them in a computational fluid dynamics code that solves the unsteady, compressible, reactive Navier-Stokes equations. Using this code, they explored the effects of controlling parameters such as fuel and air inlet sizes and velocities. Eventually, they were able to capture the blue whirl in their simulations.
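
Described at that level, the "numerical hunt" amounts to a sweep over the controlling parameters, launching one simulation per combination and flagging the runs that settle into the blue-whirl regime. The schematic Python sketch below shows that shape; run_whirl_case is a hypothetical stand-in for a full CFD run, and all parameter names and values are illustrative rather than taken from the study.

```python
# A schematic sketch of a parameter sweep like the "numerical hunt".
# run_whirl_case is a hypothetical placeholder for launching one
# unsteady reactive Navier-Stokes simulation; values are illustrative.
import itertools

def run_whirl_case(slit_width, inlet_velocity, fuel_rate):
    """Placeholder for one CFD run.

    A real run would integrate the flow for many time steps and judge
    the regime from diagnostics; here we just return a dummy record.
    """
    return {"slit_width": slit_width, "inlet_velocity": inlet_velocity,
            "fuel_rate": fuel_rate, "blue_whirl": False}

slit_widths = [0.5, 1.0, 2.0]       # cm, illustrative
inlet_velocities = [0.5, 1.0, 1.5]  # m/s, illustrative
fuel_rates = [0.1, 0.2]             # g/s, illustrative

results = [run_whirl_case(w, v, f) for w, v, f
           in itertools.product(slit_widths, inlet_velocities, fuel_rates)]
hits = [r for r in results if r["blue_whirl"]]
print("ran %d cases, found %d blue-whirl candidates" % (len(results), len(hits)))
```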

"These simulations of the blue whirl involved multiple scales in time and space," Zhang said. "We also needed to model multiple physics and the heavy hydrocarbon chemistry. These can be very difficult and expensive to compute. On top of that, we wanted to keep the 3D dynamics of this new flame. These 3D aspects added more cost to the computation."

The scientists were awarded supercomputer allocations on XSEDE, funded by the National Science Foundation. Through XSEDE, they made use of the Stampede2 supercomputer and the Ranch data storage system at the Texas Advanced Computing Center.

The simulations for the numerical hunt and the final blue whirl simulation consumed 4 million CPU hours, distributed over the Deepthought2 system at the University of Maryland, the Thunder system at the Air Force Research Laboratory, and Stampede2, where they accounted for about 23,000 node hours on its Skylake nodes.

Besides the flame structure, the scientists also examined the flow structure of the blue whirl, which involves a fluid dynamics phenomenon called vortex breakdown. Essentially, the chaotic, whirling yellow flame collapses into a "bubble mode" of vortex breakdown and forms the blue whirl.

"What surprised me most was how it evolved from the fire whirl," Oran explained. "A fire whirl is a monster, a devastating thing. Then all of a sudden it turns into this quiet, tiny little flame with no turbulence. In the process of forming it, you saw all of these fluid dynamic modes of vortex breakdown, which is a beautiful fluid phenomenon that you might see from vortices shedding off of a wing of an airplane."

The researchers hope that further understanding of the blue whirl might help scientists develop ways to burn fuels more cleanly. "It can potentially be a new way to extract energy from traditional fossil fuels with minimal soot, reduced pollution, and environmental impact," Zhang said.

Oran emphasized that serendipity played a big part in discovering the blue whirl phenomenon.

Said Oran: "I think it's important to explore, follow your curiosity, and try out new ideas. If we had never seen, for example, the fire on the lake in Kentucky, when all of the bourbon spilled onto the lake there and lightning ignited it, and it formed fire whirls on the lake, we would never have found the blue whirl. Every time you look under the rug, you find something new. A new insect, a new flame."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

Machine learning advances materials for separations, adsorption, and catalysis

image: Metal-organic frameworks (MOFs) are a class of porous and crystalline materials that are synthesized from inorganic metal ions or clusters connected to organic ligands. Shown are two such materials, HKUST-1 and MIL-100(Fe).

Image: 
Tania Evans, Georgia Tech

An artificial intelligence technique -- machine learning -- is helping accelerate the development of highly tunable materials known as metal-organic frameworks (MOFs) that have important applications in chemical separations, adsorption, catalysis, and sensing.

Trained on data about the properties of more than 200 existing MOFs, the machine learning platform helps guide the development of new materials by predicting an often-essential property: water stability. Using guidance from the model, researchers can avoid the time-consuming task of synthesizing and then experimentally testing new candidate MOFs for their aqueous stability. Already, researchers are expanding the model to predict other important MOF properties.

Supported by the Office of Science's Basic Energy Sciences program within the U.S. Department of Energy (DOE), the research was reported Nov. 9 in the journal Nature Machine Intelligence. The research was conducted in the Center for Understanding and Control of Acid Gas-Induced Evolution of Materials for Energy (UNCAGE-ME), a DOE Energy Frontier Research Center located at the Georgia Institute of Technology.

"The issue of water stability with MOFs has existed in this field for a long time, with no easy way to predict it," said Krista Walton, professor and Robert "Bud" Moeller faculty fellow in Georgia Tech's School of Chemical and Biomolecular Engineering. "Rather than having to do the synthesis and experimentation to figure this out for each candidate MOF, this machine learning model now provides a way to predict water stability given a set of desired features. This will really speed up the process of identifying new materials for specific applications."

MOFs are a class of porous and crystalline materials that are synthesized from inorganic metal ions or clusters connected to organic ligands. They are known for their easily tunable components that can be customized for specific applications, but the large number of potential combinations makes it difficult to choose MOFs with the desired properties. That's where artificial intelligence can help.

Machine learning is playing an increasingly important role in materials science, said Rampi Ramprasad, professor and Michael E. Tennenbaum Family Chair in the Georgia Tech School of Materials Science and Engineering and Georgia Research Alliance Eminent Scholar in Energy Sustainability.

"When materials scientists plan the next set of experiments, we use the intuition and insights that we have accumulated from the past," Ramprasad said. "Machine learning allows us to fully tap into this past knowledge in the most efficient and effective manner. If 200 experiments have already been done, machine learning allows us to exploit all that has been learned from them as we plan the 201st experiment."

Beyond experimental data, machine learning can also use the results of physics-based simulations. And unlike simulations, the results from machine learning models can be instantaneous. The machine learning algorithm improves as it receives more information, he noted, and both negative and positive results are useful.

"Great discoveries are as important as not-so-exciting discoveries -- failed experiments -- because machine learning uses both ends of the spectrum to get better at what it does," Ramprasad said.

The machine learning model used information Walton and her research team had gathered on hundreds of existing MOF materials, both from compounds developed in her own lab and those reported by other researchers. To prepare the information for the model to learn from, she categorized each MOF according to four measures of water stability.

"The couple hundred data points used to build the model represented years of experiments," Walton said. "I spent basically the first half of my career working to understand this water stability problem with MOFs, so it's something we have studied extensively."

Using the model, researchers who are developing new adsorbents and other porous materials for specific applications can now check their proposed formulas to determine the likelihood that a new MOF would be stable in the presence of water. That could be particularly helpful for researchers who don't have this particular expertise or who don't have easy access to experimental methods for examining stability.

"The MOF community is diverse, with a variety of subfields. Not everyone has the chemical intuition about which materials' features lead to good framework stability, and experimental evaluation often requires specialty equipment that many labs may not have or wouldn't otherwise need for their specific subfield. However, with good predictive models, they wouldn't necessarily need to develop it to choose a material for a specific application," Walton said. "This capability potentially opens up this field to a broader group of researchers that could accelerate application development."

While screening for water stability is important, Ramprasad says it's just the beginning of the potential benefits from the project. The machine learning model can be trained to predict other properties as long as a sufficient amount of data exists. For instance, the team is already teaching its model about factors affecting methane adsorption under varying levels of pressure. In that case, simulations will provide much of the data from which the model will learn.

"We will have a very strong predictor that will tell us if a new MOF would be stable under aqueous conditions and a good candidate for methane uptake," he said. "What we are doing is creating a universal and scalable machine learning platform that can be trained on new properties. As long as the data is available, the model can learn from it, and make predictions for new cases."

In addition to those already mentioned, recent Georgia Tech postdoctoral fellow Rohit Batra and Georgia Tech graduate students Carmen Chen and Tania G. Evans were also coauthors on the Nature Machine Intelligence paper.

Ramprasad has experience with machine learning techniques applied to other materials and application spaces, and recently coauthored a review article, "Emerging materials intelligence ecosystems propelled by machine learning," about a range of artificial intelligence applications in materials science and engineering. Intended to demystify machine learning and to review success stories in the materials development space, it was published, also on Nov. 9, 2020, in the journal Nature Reviews Materials.

Credit: 
Georgia Institute of Technology

Slow-living animal species could be disease 'reservoirs'

Animals that live slowly - breeding less rapidly and living longer - could be "reservoirs" of diseases that could jump to new species including humans, new research suggests.

Some species "live fast and die young", devoting effort to reproduction, while others conserve more energy for survival.

The Covid-19 pandemic has drawn attention to fast-spreading infectious diseases, but the new study - by the University of Exeter - focusses on "endemic" diseases that co-exist with host species for long periods of time.

The researchers measured what they called "demographic competence" - the ability of a host species to survive in large numbers while sustaining high levels of infection.

They showed that slow-lived species often have higher demographic competence for persistent infections, and are therefore more likely to act as reservoirs of infection that can spill over into other species.

"Diseases of wildlife pose a threat to the survival of endangered species worldwide, and we know there is risk of spill-over of disease between closely related species of wildlife, livestock and humans," said Professor Dave Hodgson, director of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall.

"These spill-over events are known to be influenced by similarities in immune systems, and by increasing levels of contact between humans and wildlife caused by exploitation of natural ecosystems like rainforests.

"Our findings highlight the potential to use other, more ecological, characteristics like lifespan, reproductive capacity and population size to identify and predict the wildlife reservoirs from which new diseases could emerge."

The researchers used mathematical models to explore what kinds of animal species and pathogens (diseases) are likely to co-exist for long periods.
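
To give a feel for the kind of model involved, the Python sketch below couples simple SIR infection dynamics to host births and deaths and compares a fast and a slow life history. It is a toy illustration with made-up parameter values, not the authors' model; in this version the slow-living host's longer lifespan lets infection persist at a higher prevalence.

```python
# A minimal sketch of SIR infection dynamics with host demography,
# comparing a "fast" and a "slow" life history. Parameter values are
# illustrative, not taken from the study.

def simulate(birth_rate, death_rate, beta=0.5, recovery=0.05,
             virulence=0.02, years=200, dt=0.1):
    """Forward-Euler integration of SIR dynamics with births and deaths."""
    S, I, R = 99.0, 1.0, 0.0
    for _ in range(int(years / dt)):
        N = S + I + R
        births = birth_rate * N            # all classes reproduce
        infections = beta * S * I / N      # frequency-dependent transmission
        dS = births - infections - death_rate * S
        dI = infections - (recovery + death_rate + virulence) * I
        dR = recovery * I - death_rate * R
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
    N = S + I + R
    return N, I / N                        # population size, infection prevalence

# Fast-living host: high birth rate, high background mortality.
print("fast host: N = %.0f, prevalence = %.2f" % simulate(0.30, 0.30))
# Slow-living host: low birth rate, long lifespan.
print("slow host: N = %.0f, prevalence = %.2f" % simulate(0.05, 0.05))
```

The intuition matches the study's framing: when hosts die slowly, infected individuals stay in the population longer, so a persistent infection can be sustained at higher levels, the essence of high demographic competence.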

"As well as finding that slow-living species may be reservoirs of infectious disease, we show a 'flip-side' whereby species with low demographic competence may not be able to co-exist with new diseases and might therefore suffer local or complete extinction," said Dr Matthew Silk, of the University of Exeter.

"It is important to note that pace-of-life in the host species isn't the only important factor affecting 'demographic competence'.

"Traits of the pathogen itself - such as how easily it is transmitted and how likely it is to kill a host - will also play a key role, as will the social behaviour of the host species.

"We must also consider the role of immunity. Differences in immune systems that we know exist between fast and slow hosts can influence how long individuals are ill and whether they can be re-infected."

Credit: 
University of Exeter