Polymers jump through hoops on pathway to sustainable materials

image: Chemical and biomolecular engineering professor Charles Schroeder, left, and graduate student Yuecheng (Peter) Zhou study the flow dynamics of ring and linear polymer solutions to tease out clues about how synthetic polymers interact during processing.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Recyclable plastics that contain ring-shaped polymers may be a key to developing sustainable synthetic materials. Despite some promising advances, researchers said, a full understanding of how to process ring polymers into practical materials remains elusive. In a new study, researchers identified a mechanism called "threading" that takes place when a polymer is stretched - a behavior not previously observed. This new insight may lead to new processing methods for sustainable polymer materials.

Most consumer plastics are blends of linear polymers. The concept of plastics made purely from ring polymers - molecules that form a closed ring - presents an enticing opportunity for sustainability, as shown by the Autonomous Materials Systems group at the Beckman Institute for Advanced Science and Technology. Once a single bond holding ring polymers together breaks, the entire molecule falls apart, leading to disintegration on demand. However, processing such polymers into practical materials remains a challenge, the researchers said.

A 2013 University of Illinois-led study showed that ring polymers could be broken with heat, but this comes at a price - the resulting plastics would likely become unstable and begin to break down prematurely.

In the new study, U. of I. researchers Charles Schroeder and Yuecheng (Peter) Zhou examine the flow dynamics of DNA-based ring and linear polymer solutions to tease out clues about how synthetic polymers interact during processing. Their findings are published in the journal Nature Communications.

"We lack a fundamental understanding of how ring polymers stretch and move in flow while navigating around neighboring polymer chains. This work allowed us to probe these questions at a molecular level," said Schroeder, a chemical and biomolecular engineering professor, Beckman Institute researcher and study co-author.

In Schroeder's lab, the researchers stretch and squeeze polymers, causing them to flow and allowing direct observation of the behavior of individual molecules using single-molecule fluorescence microscopy.

"There is a fluctuation in the shape of the ring polymers, and this depends on the concentration of linear polymers in the solution," said Zhou, a graduate student, Beckman Institute researcher and lead author of the study. "We do not see this behavior in pure solutions of ring or linear polymers, so this tells us that something unique is happening in mixed solutions."

Using a combination of direct single-molecule observations and physical measurements, the team concluded that the changes in shape of the ring polymers occur because linear molecules thread themselves through the ring molecules when stressed, causing the ring shape to fluctuate under fluid flow.

"We observed this behavior even when there is a very low concentration of linear polymers in the mix," Zhou said. "This suggests that it only takes a very minute level of contamination to cause this phenomenon."

This threading of linear polymers through ring polymers under stress had been theorized before on the basis of bulk-scale studies of physical properties, but now it has been observed at the molecular scale, the researchers said.

"Bulk studies typically mask the importance of what is going on at the smaller scale," Schroeder said.

How these observations will translate into further development of sustainable consumer plastics remains unclear, the researchers said. However, any insight into the fundamental molecular properties of mixed-polymer solutions is a step in the right direction.

"To make pure ring polymer plastics a reality, we need to understand both mixed and pure solutions at a fundamental level," Schroeder said. "Once we can figure out how they work, then we can move on to synthesizing them and ultimately to using them in sustainable consumer plastics."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Finding the 'Goldilocks' level of enthusiasm for business pitches

image: Dong Liu, associate professor in Georgia Tech's Scheller College of Business.

Image: 
Georgia Tech

When it comes to pitching business ideas to potential investors, an entrepreneur's excitement and enthusiasm can be the difference between dreams taking shape or ultimately falling flat.

But it's not just the intensity of enthusiasm that's important, according to a recent study by a team led by Georgia Institute of Technology researchers. How long an entrepreneur displays the highest level of excitement during a pitch also plays a major role in predicting success in receiving funding.

Basically, too much enthusiasm can be a bad thing.

"The findings suggest that investors may interpret prolonged periods of high enthusiasm as over-optimistic," said Dong Liu, an associate professor in Georgia Tech's Scheller College of Business. "Over-optimistic entrepreneurs are thought to make irrational decisions and overestimate their products' profitability."

In the study, which was published April 8 in the Academy of Management Journal, the researchers described using artificial intelligence software to analyze video pitches for 1,460 business funding proposals for products posted on the crowdfunding website Kickstarter.

The software used facial expression recognition and big data analytics to measure the intensity of enthusiasm in more than 8 million frames of video, then recorded how long the presenters stayed at their maximum level of excitement, which the researchers described as the point of "peak joy."

They found that, generally speaking, the higher the peak level of enthusiasm, the more likely the entrepreneur was to receive funding, after controlling for differences in the products and business ideas. But the relationship followed an inverted U: the likelihood of funding tended to fall when "peak joy" went on for too long.
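The two measures the study combines can be made concrete with a short sketch. This is purely illustrative, not the study's pipeline (which used FaceReader and big-data analytics on 8 million frames); the function name and the 90%-of-peak threshold are assumptions chosen for the example.

```python
# Given per-frame joy-intensity scores (as a facial-expression tool
# might emit), compute the two quantities the study relates to funding
# success: the peak intensity and how long the presenter stays near it.

def peak_joy_stats(joy, near_peak_fraction=0.9):
    """joy: per-frame intensities in [0, 1].
    Returns (peak level, frames at >= near_peak_fraction * peak)."""
    peak = max(joy)
    threshold = near_peak_fraction * peak
    duration = sum(1 for level in joy if level >= threshold)
    return peak, duration

# Toy series: enthusiasm rises, peaks briefly, then settles.
frames = [0.2, 0.4, 0.8, 0.95, 0.9, 0.5, 0.3]
print(peak_joy_stats(frames))  # (0.95, 2): high peak, short dwell
```

On the study's reading, a pitch with a high peak but a short dwell near it would score better than one that holds peak joy for long stretches.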

"Although a higher level of peak joy displayed by entrepreneurs during their pitches leads to better funding performance over time, prolonged display of peak joy seemed to undermine funding performance," Liu said. "Another possible interpretation is that investors may believe the entrepreneur is acting and the pitch is manipulative. Maybe they feel the entrepreneur is using his or her excitement to manipulate the investors' perceptions in hopes of increasing the odds of getting funding."

The facial recognition software analyzed when the presenters made expressions linked to joy, such as raising the cheek, drawing the corners of the mouth into a smile, and the movement and position of the eyes. The researchers noted that the software, called FaceReader, was even more accurate in recognizing emotions than real people analyzing those video frames.

The researchers also found that the point during the presentation at which "peak joy" occurred influenced funding success. The most effective times to display enthusiasm were at the beginning of the pitch and near the end.

"The results of our research could be broadly applicable to different kinds of audiences, not just those funding projects through crowdfunding websites," Liu said. "Venture capitalists are looking for good business ideas on these websites too. But in general these findings could help inform any business pitch."

Credit: 
Georgia Institute of Technology

Machine learning speeds modeling of experiments aimed at capturing fusion energy on Earth

image: Fast-camera photo of a plasma produced by the first NSTX-U operations campaign.

Image: 
NSTX-U experiment

Machine learning (ML), a form of artificial intelligence that recognizes faces, understands language and navigates self-driving cars, can help bring to Earth the clean fusion energy that lights the sun and stars. Researchers at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) are using ML to create a model for rapid control of plasma -- the state of matter composed of free electrons and atomic nuclei, or ions -- that fuels fusion reactions.

The sun and most stars are giant balls of plasma that undergo constant fusion reactions. Here on Earth, scientists must heat and control the plasma to cause the particles to fuse and release their energy. PPPL research shows that ML can facilitate such control.

Neural Networks

Researchers led by PPPL physicist Dan Boyer have trained neural networks -- the core of ML software -- on data produced in the first operational campaign of the National Spherical Torus Experiment-Upgrade (NSTX-U), the flagship fusion facility, or tokamak, at PPPL. The trained model accurately reproduces predictions of the behavior of the energetic particles produced by powerful neutral beam injection (NBI) that is used to fuel NSTX-U plasmas and heat them to million-degree, fusion-relevant temperatures.

These predictions are normally generated by a complex computer code called NUBEAM, which incorporates information about the impact of the beam on the plasma. Such complex calculations must be made hundreds of times per second to analyze the behavior of the plasma during an experiment. But each calculation can take several minutes to run, making the results available to physicists only after an experiment that typically lasts a few seconds is completed.

The new ML software reduces the time needed to accurately predict the behavior of energetic particles to under 150 microseconds -- enabling the calculations to be done online during the experiment.
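The speedup comes from a standard surrogate-model pattern: run the expensive code offline to build a database, fit a small network to it, then evaluate the cheap network in real time. The sketch below is a stand-in under stated assumptions, not PPPL's code: slow_physics_code is a toy function playing NUBEAM's role, and the random-feature network is about the simplest trainable surrogate, fit by least squares.

```python
import numpy as np

# Hypothetical stand-in for a slow physics code: maps two plasma
# parameters to a scalar output. In the real workflow this role is
# played by NUBEAM; here it is just a smooth test function.
def slow_physics_code(x):
    return np.sin(x[:, 0]) * np.exp(-x[:, 1])

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(2000, 2))   # offline input database
y = slow_physics_code(X)                # expensive offline evaluations

# Surrogate: fixed random hidden layer + linear readout,
# trained in one shot by least squares.
W = rng.normal(size=(2, 64))
b = rng.normal(size=64)
H = np.tanh(X @ W + b)
coef, *_ = np.linalg.lstsq(H, y, rcond=None)

def surrogate(x):
    # A few matrix multiplies: cheap enough for real-time control.
    return np.tanh(x @ W + b) @ coef

X_test = rng.uniform(0, 2, size=(200, 2))
err = np.max(np.abs(surrogate(X_test) - slow_physics_code(X_test)))
print(f"max surrogate error: {err:.3f}")
```

The same pattern scales up: the real model maps profiles of plasma parameters to NUBEAM outputs, and inference cost stays at a few matrix multiplies no matter how expensive the training data were to generate.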

Initial application of the model demonstrated a technique for estimating characteristics of the plasma behavior not directly measured. This technique combines ML predictions with the limited measurements of plasma conditions available in real-time. The combined results will help the real-time plasma control system make more informed decisions about how to adjust beam injection to optimize performance and maintain stability of the plasma -- a critical quality for fusion reactions.

Rapid evaluations

The rapid evaluations will also help operators make better-informed adjustments between experiments that are executed every 15-20 minutes during operations. "Accelerated modeling capabilities could show operators how to adjust NBI settings to improve the next experiment," said Boyer, lead author of a paper in Nuclear Fusion that reports the new model.

Boyer, working with PPPL physicist Stan Kaye, generated a database of NUBEAM calculations for a range of plasma conditions similar to those achieved in experiments during the initial NSTX-U run. Researchers used the database to train a neural network to predict effects of neutral beams on the plasma, such as heating and profiles of the current. Software engineer Keith Erickson then implemented software for evaluating the model on computers used to actively control the experiment to test the calculation time.

New work will include development of neural network models tailored to the planned conditions of future NSTX-U campaigns and other fusion facilities. In addition, researchers plan to expand the present modeling approach to enable accelerated predictions of other fusion plasma phenomena. Support for this work comes from the DOE Office of Science.

Credit: 
DOE/Princeton Plasma Physics Laboratory

Scientists propose rethinking 'endangered species' definition to save slow-breeding giants

image: Mother and calf.

Researchers at the Smithsonian Institution have proposed a 'demographic safe space' for Asian elephants, to improve conservation of these and other large, slow-breeding animals.

Image: 
Udawalawe Elephant Research Project

Conservation decisions based on population counts may fail to protect large, slow-breeding animals from irrevocable decline, according to new research coinciding with Endangered Species Day.

"Critical thresholds in so-called vital rates - such as mortality and fertility rates among males and females of various ages - can signal an approaching population collapse long before numbers drop below a point of no return," says lead author Dr. Shermin de Silva, President & Founder of Asian elephant conservation charity Trunks & Leaves. "We propose that conservation efforts for Asian elephants and other slow-breeding megafauna be aimed at maintaining their 'demographic safe space': that is, the combination of key vital rates that supports a non-negative growth rate."

A mammoth insight

Published in Frontiers in Ecology and Evolution, the study suggests that a combination of key vital rates governing population growth is a better indicator of a species' viability than short-term trends in population size and distribution.

"History bears this out," argues de Silva. "Genomic studies of the last mammoths isolated on Wrangel Island - between Russia and Alaska - have shown that although they were able to persist for thousands of years beyond the extinction of mainland populations with just ~300 individuals, they had accumulated numerous genetic mutations that may have eventually contributed to their extinction."

In other words, populations of megafauna can become biologically inviable long before they disappear, if pushed beyond their 'demographic safe space.'

Females and calves key to saving the Asian elephant

The group applied the 'demographic safe space' concept to the case of the Asian elephant.

"Asian elephants are classified as 'Endangered' under the IUCN Red List because populations are thought to have declined by at least 50% in less than a century," explains de Silva. "There are fewer than 50,000 wild Asian elephants living today."

Studies show that wild Asian elephants breed extremely slowly, the majority producing just one calf in six years or more. Using mathematical modeling, de Silva and colleagues found that near-optimal reproduction and high calf survival are necessary to maintain non-negative population growth in the face of even modestly increased mortality among adult female age classes.
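The qualitative result can be reproduced with a toy two-stage projection. All rates below are illustrative assumptions, not the study's fitted values: females produce roughly 0.08 female calves per year (one calf every ~6 years, half of them female), calves mature after about five years on average, and adult survival is varied by three percentage points.

```python
# Two-stage female population (calves and adults), annual time steps.
def growth_factor(fertility, calf_survival, adult_survival,
                  maturation=0.2, years=200):
    """Asymptotic annual growth factor (>1 grows, <1 declines)."""
    calves, adults = 10.0, 100.0
    for _ in range(years):
        prev_adults = adults
        calves, adults = (
            calf_survival * (1 - maturation) * calves + fertility * adults,
            calf_survival * maturation * calves + adult_survival * adults,
        )
    return adults / prev_adults

# Near-optimal reproduction and high adult survival: slow growth.
print(growth_factor(0.08, 0.90, 0.97))  # ~1.02, just above replacement
# Same fertility, adult mortality up three percentage points: decline.
print(growth_factor(0.08, 0.90, 0.94))  # ~0.99, below replacement
```

Because the growth factor sits so close to 1, small shifts in adult female survival dominate the outcome; that is the sense in which the 'demographic safe space' is narrow for slow breeders.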

The approach shows a clear conservation priority for Asian elephants, a species in which the vast majority of individuals are tuskless.

"Measures to enhance survival of calves, and particularly females, are key to saving the Asian elephant," emphasizes de Silva.

"But while the attention of the world has been focused on the ivory trade, for critically endangered Asian elephant populations the greatest threat is habitat loss - followed by illegal trade in live animals and parts.

"Habitat loss can create something known as 'extinction debt' by slowing down birth rates and increasing mortality rates. For slow breeding long-lived species, even incremental changes make a big difference, but their longevity can obscure the risk of extinction."

A demographic safe space for all megafauna

Conservation efforts for other large, slow-breeding species - such as giraffes, rhinos, Bactrian camels and eastern gorillas - could also benefit from modelling the interaction between vital rates. Data on these species in the wild are scarce but urgently needed, the authors suggest.

"Rather than rely on simple population counts or estimates of near-term extinction probability, we urge that conservation resources for slow-breeding megafauna also be invested in identifying demographic tipping points and how to maintain populations within their safe spaces.

"Populations of slow-breeding taxa need proactive management well before numbers become critically low, when returns on investment are potentially greater and populations less likely committed to extinction," concludes de Silva.

Credit: 
Frontiers

Children describe technology that gives them a sense of ambiguity as 'creepy'

Many parents express concerns about privacy and online safety in technology designed for their children. But we know much less about what children themselves find concerning in emerging technologies.

Now University of Washington researchers have defined for the first time what children mean when they say technology is "creepy." Kids in a new study described creepy technology as something that is unpredictable or poses an ambiguous threat that might cause physical harm or threaten an important relationship. The researchers also pinpointed five aspects of emerging technologies that could contribute to this feeling of ambiguity.

The team presented its results May 8 at the 2019 ACM CHI conference on Human Factors in Computing Systems in Glasgow, Scotland.

"Over the years of working with kids we realized they use the word 'creepy' a lot as a way to reject specific technologies," said first author Jason Yip, an assistant professor in the UW's Information School. "But kids have a difficult time articulating what makes something creepy. So we designed a series of activities to give them the chance to work out their own thoughts and help us understand."

Previous research indicated that adults describe ambiguous threats as creepy, not scary, so the team conducted four separate design sessions to see if children felt similarly about creepy technology. In these sessions, 11 children aged 7 to 11 prototyped their own technologies or ranked real or imagined technologies as "creepy," "not creepy" or "don't know." Devices that could bring about physical harm or disrupt an important relationship were most consistently ranked as creepy.

"When we were brainstorming about what kids were going to be worried about, we never considered that they might be concerned that somehow technology would get between them and their parents, and that this would be such a salient issue in their minds," said co-author Alexis Hiniker, an assistant professor in the iSchool.

The team found five properties of technology that lead to those fears:

Deception versus transparency

Kids want to understand how technology works and what information a device is collecting. For example, when a child asked a digital voice assistant if it would kill him in his sleep and it said, "I can't answer that," the child was concerned.

"'I'm afraid I don't have an answer to that' works well if I ask how many hairs are on the top of my head," Yip said. "But with these types of questions, this response sounds deceptive."

Ominous physical appearance

Kids are sensitive to how a technology looks, sounds and feels. But that doesn't mean that only traditionally scary-looking technologies are creepy: The children were also wary of Maslo, an app with a large black dot as its interface, because it looked like a "black spirit" or a "black hole."

Lack of control

Kids want to control technology's access to their information and the flow of that information to their parents. For example, when kids were asked to design a technology that was trustworthy, some of the children designed an intelligent trash can that both scanned and deleted their facial recognition data each time they used it. Their trash can also had a button that allowed for manual deletion of data.

Unpredictability

Kids don't like it when technology does things unexpectedly, like automatically knowing their name or laughing. To kids, laughing could communicate hidden, and possibly malicious, intent.

Mimicry

Kids also don't like technology that pretends to be something else, especially when it's trying to mimic people in their lives or themselves. Technology that mimics them could be trying to steal their identities or to disrupt family relationships.

"All five themes are related to ambiguous threats. It's not a specific monstrosity coming after them, as when something is scary; it's more nuanced, so they're not sure of the consequences of their actions," Yip said. "The kids kept referencing the movie Coraline. In the story, the dolls ask Coraline to make a change: 'If you sew buttons over your eyes and become just like us, we will love you forever.' That prompts this feeling of, 'Wait a second, sew buttons over my eyes? What am I compromising here?'"

The team found that trusted adults had some influence over whether or not the children thought that specific devices were creepy. For example, one child deemed smartphones "not creepy" because he saw his parents using them. Another kid thought that laptops were creepy because his parents taped a piece of paper over the camera to "keep the robbers away."

The researchers acknowledge that their results could be used to make technology that tricks kids into a false sense of security. But the team thinks it is more important to have these results available to the public to help parents talk to their kids about technology and any types of fears that might arise.

"Children have access to so many different kinds of technologies compared to when we were growing up," Hiniker said. "But their basic fears haven't changed at all. Kids want to feel physically safe and anchored to trusted adults who are going to protect them."

Credit: 
University of Washington

Antibiotic treatment alleviates Alzheimer's disease symptoms in male mice, study reveals

image: Compared with a control (left), long-term antibiotic treatment (right) reduces the size of amyloid plaques (green) and alters the appearance of microglia (red) in the brains of male APPPS1-21 mice.

Image: 
Dodiya et al., 2019

Researchers at The University of Chicago have demonstrated that the type of bacteria living in the gut can influence the development of Alzheimer’s disease symptoms in mice. The study, which will be published May 16 in the Journal of Experimental Medicine, shows that, by altering the gut microbiome, long-term antibiotic treatment reduces inflammation and slows the growth of amyloid plaques in the brains of male mice, though the same treatment has no effect on female animals.

The community of bacteria that live in the gastrointestinal tract—the gut microbiome—is generally harmless, but, because they affect the activity of the body’s immune system, these bacteria can influence a wide range of diseases, even in distant tissues such as the brain.

“Recent evidence suggests that intestinal bacteria could play a major role in various neurological conditions including autism spectrum disorders, multiple sclerosis, Parkinson’s disease, and Alzheimer’s disease,” explains Professor Sangram S. Sisodia, director of the Center for Molecular Neurobiology at The University of Chicago.

Alzheimer’s disease is characterized by the formation of amyloid plaques and the activation of immune cells present in the brain known as microglia. These cells can help remove amyloid plaques, but their activation may also exacerbate the disease by causing neuroinflammation.

Alzheimer’s patients exhibit changes in their gut microbiome, and Sisodia and colleagues have previously reported that gut bacteria may influence the development of these symptoms in rodents. Long-term antibiotic treatment limited the formation of amyloid plaques and reduced microglia activation in male, but not female, mice expressing mutant proteins associated with familial Alzheimer’s disease. “While compelling, our published studies on the role of the gut microbiome on amyloid plaque formation were limited to a single strain of mice,” Sisodia says.

In the new study, Sisodia and colleagues therefore examined the effects of antibiotics on a different mouse model of Alzheimer’s disease known as APPPS1-21. Long-term treatment with a cocktail of antibiotics again reduced the formation of amyloid plaques in male mice but had no effect on females. Antibiotic treatment also appeared to alter the activation of microglia in male mice, changing them from a form that is thought to promote neurodegeneration to a form that helps to maintain a healthy brain.

To prove that these improvements in Alzheimer’s symptoms were caused by alterations in the gut microbiome, the researchers transplanted fecal matter from untreated mice into antibiotic-treated animals. This procedure restored the gut microbiome and caused an increase in amyloid plaque formation and microglial activation.

But why do alterations in the gut microbiome only affect male mice? Sisodia and colleagues discovered that long-term antibiotic treatment changed the gut bacteria of male and female mice in different ways. The changes in the microbiome of female mice caused their immune systems to increase production of several proinflammatory factors that could influence the activation of microglia.

“Our study shows that antibiotic-mediated perturbations of the gut microbiome have selective, sex-specific influences on amyloid plaque formation and microglial activity in the brain,” Sisodia says. “We now want to investigate whether these outcomes can be attributed to changes in any particular type of bacteria.”

Credit: 
Rockefeller University Press

Australian islands home to 414 million pieces of plastic pollution

video: Washed-up marine plastic debris on beaches of the Cocos (Keeling) Islands.

Image: 
Silke Stuckenbrock

A survey of plastic pollution on Australia's Cocos (Keeling) Islands has revealed the territory's beaches are littered with an estimated 414 million pieces of plastic debris.

The study, led by IMAS researcher Dr Jennifer Lavers and published in the journal Scientific Reports, estimated beaches on the Indian Ocean islands are littered with 238 tonnes of plastic, including 977,000 shoes and 373,000 toothbrushes.

Dr Lavers' research made headlines around the world when in May 2017 she revealed that beaches on remote Henderson Island in the South Pacific had the highest density of plastic debris reported anywhere on Earth.

While the density of plastic debris on Cocos (Keeling) Islands beaches is lower than on Henderson Island, the total volume dwarfs the 38 million pieces weighing 17 tonnes found on the Pacific island.

Dr Lavers said remote islands which don't have large human populations depositing rubbish nearby are an indicator of the amount of plastic debris circulating in the world's oceans.

"Islands such as these are like canaries in a coal mine and it's increasingly urgent that we act on the warnings they are giving us," Dr Lavers said.

"Plastic pollution is now ubiquitous in our oceans, and remote islands are an ideal place to get an objective view of the volume of plastic debris now circling the globe.

"Our estimate of 414 million pieces weighing 238 tonnes on Cocos (Keeling) is conservative, as we only sampled down to a depth of 10 centimetres and couldn't access some beaches that are known debris 'hotspots'.

"Unlike Henderson Island, where most identifiable debris was fishing-related, the plastic on Cocos (Keeling) was largely single-use consumer items such as bottle caps and straws, as well as a large number of shoes and thongs," Dr Lavers said.
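The kind of scaling behind such estimates can be sketched simply. The numbers below are hypothetical, not the study's data: debris is counted on sampled quadrats to a fixed depth, and the mean density is multiplied up by the total beach area.

```python
# Illustrative back-of-envelope scaling from sampled quadrats to a
# whole-shoreline estimate (all figures are made up for the example).

def island_estimate(counts, quadrat_area_m2, total_beach_area_m2):
    """Scale per-quadrat debris counts to a whole-shoreline estimate."""
    mean_density = sum(counts) / (len(counts) * quadrat_area_m2)  # items/m^2
    return mean_density * total_beach_area_m2

# Hypothetical survey: 25 quadrats of 0.5 m^2 each, 1.2 km^2 of beach.
counts = [12, 30, 7, 55, 21] * 5
print(f"{island_estimate(counts, 0.5, 1.2e6):,.0f} items")  # 60,000,000 items
```

Sampling only the top 10 centimetres, as the quote notes, makes any figure produced this way a lower bound.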

Co-author Dr Annett Finger from Victoria University said global production of plastic continues to increase, with almost half of the plastic produced over the past 60 years manufactured in the last 13 years.

"An estimated 12.7 million tonnes of plastic entered our oceans in 2010 alone, with around 40 per cent of plastics entering the waste stream in the same year they're produced," Dr Finger said.

"As a result of the growth in single-use consumer plastics, it's estimated there are now 5.25 trillion pieces of ocean plastic debris.

"Plastic pollution is a well-documented threat to wildlife and its potential impact on humans is a growing area of medical research.

"The scale of the problem means cleaning up our oceans is currently not possible, and cleaning beaches once they are polluted with plastic is time consuming, costly, and needs to be regularly repeated as thousands of new pieces of plastic wash up each day.

"The only viable solution is to reduce plastic production and consumption while improving waste management to stop this material entering our oceans in the first place," Dr Finger said.

Credit: 
University of Tasmania

New study analyzes tweets to reveal how ISIS still inspires low-level attacks

By analyzing 26.2 million Arabic-language Twitter comments, researchers found that despite losing territory, ISIS remains successful at inspiring low-level attacks because of its call for "lone jihad." The study, "ISIS at its apogee: The Arabic discourse about support for ISIS on Twitter and what we can learn from that," was recently published in SAGE Open.

The researchers analyzed Tweets from July 2014 to January 2015, when ISIS's strength reached its peak and the group was expanding its territory. Assessing the connection between ongoing events and Tweet timing, the researchers found:

1. Heavy media coverage leads to increased expressed support of ISIS online.

2. When online conversations surrounding ISIS surged, many individuals who did not normally engage in the debate entered to express their negative views towards the group.

3. Online support changes according to the target of action. For example, terrorist groups often gain legitimacy by claiming they are acting on behalf of a larger cause, but if they choose a target considered unpopular by their constituents, they lose support.

4. There is no clear evidence of backlash from ISIS members against ISIS leaders for their actions towards prisoners (such as beheadings).

The researchers also noticed that foreign fighters often lived in countries with a lower share of support towards ISIS, geographically isolating them from the group.

"This correlation suggests that ISIS sympathizers living in countries with lower support towards the group might feel marginalized, leading them to radicalize their views and join ISIS, because they don't have an easily accessible way to join or support the organization in their own country," said the study's authors Andrea Ceron, Luigi Curini and Stefano M. Iacus.

The researchers concluded that policy increasing censorship is not a solution to the ISIS threat, as suppressing the expression of support for the group can favor radicalization.

Credit: 
SAGE

Warming climate threatens microbes in alpine streams, new research shows

image: UW research scientist Lusha Tronstad takes notes while conducting research downstream of Petersen Glacier in Grand Teton National Park.

Image: 
Scott Hotaling

Changes to alpine streams fed by glaciers and snowfields due to a warming climate threaten to dramatically alter the types of bacteria and other microbes in those streams, according to a research team that included a University of Wyoming scientist.

But streams that are fed by underground ice insulated by rock -- called "icy seeps" -- offer some hope that the impact of climate change will be less severe in some areas, say the researchers, who include Lusha Tronstad, research scientist with UW's Wyoming Natural Diversity Database (WYNDD).

"Our results show that patterns of microbial diversity support an ominous trend for alpine stream biodiversity...," the researchers wrote in the journal Global Change Biology. "Icy seeps, however, represent a source of optimism for the future of biodiversity in these imperiled ecosystems."

Tronstad is an expert on the alpine streams in Grand Teton National Park, where her 2015 discovery of a rare insect called the western glacier stonefly provided information to assist in deciding whether the species should be protected under the federal Endangered Species Act. In the latest research, she joined scientists from the University of Kentucky, Washington State University, Rutgers University, Kansas State University, Missouri State University, the University of Montana and the U.S. Geological Survey to study much smaller organisms in those Grand Teton streams and others in Montana's Glacier National Park.

In those high-mountain streams -- some fed by glaciers, some by snowfields, some by underground ice and some by groundwater -- a wide variety of bacteria and other microbes exist. They provide the foundation for larger organisms such as insects in alpine waters and fish downstream.

In Wyoming, North America's Rocky Mountains and mountain ranges worldwide, glaciers and perennial snowfields are shrinking and, in some cases, disappearing as the climate warms. While previous research detailed expected impacts on insects and other larger species, Tronstad's research team examined the organisms that can't be seen with the naked eye in six alpine streams in Grand Teton and seven in Glacier National Park.

The scientists found that colder streams fed by glaciers and underground ice have less microbial diversity than those fed by snowmelt and groundwater, but the colder streams are home to some bacteria that don't exist in the warmer streams. As warmer temperatures and reduced snowfields and glaciers increase water temperatures and change the flow and variability of the streams, microbial diversity will decline across the alpine waters, the researchers say. That likely will result in broader environmental impacts -- though the scientists acknowledge that "the degree to which environmental shifts will translate to altered ecosystem functioning remains largely unknown."

While the new research expands understanding of microbial diversity across the range of alpine streams, the scientists say the most significant new insights relate specifically to icy seeps, which are fed by rock glaciers -- masses of underground ice surrounded by rocky debris. It's estimated there are more than 10,000 of these rock glaciers across the United States -- about double the number of surface glaciers and perennial snowfields. Icy seeps are closest in temperature to streams fed by surface glaciers, with less seasonal flow variability. One such underground-ice-fed stream that was part of the study originates from Wind Cave in the Targhee National Forest.

Because they're insulated by thick layers of rock, icy seeps may be more buffered against warming atmospheric conditions than glaciers and perennial snowfields, making them less susceptible to climate change.

"Consequently, there is strong potential for icy seeps to serve as (a refuge) for cold-adapted mountain stream species and unique ecological functions on a global scale," the researchers wrote, adding that icy seeps "may represent the last stronghold of meltwater-associated (life forms) in a landscape without glaciers and perennial snowfields."

Credit: 
University of Wyoming

Companies benefit from giving congressional testimony, study finds

image: Jason Ridge, University of Arkansas

Image: 
University Relations, University of Arkansas

Firms looking to boost their market value and make a favorable impression on investors might consider opportunities to testify before Congress, according to a new study by management researchers at the University of Arkansas.

If chosen to testify, these companies could benefit from publicizing such events, said Jason Ridge, assistant professor of management in the Sam M. Walton College of Business and the study's lead author.

"Firms rarely accentuate their testimony in Congress with public announcements," Ridge said. "Based on our findings - that testimony given to Congress has a positive impact on investor reactions - we think maybe they should. Firms could gain real market benefits by making announcements prior to upcoming testimony."

In "Market Reactions to Non-Market Strategy: The Signaling Role of Congressional Testimony," published this month in Strategic Management Journal, Ridge and co-authors Amy Ingram, associate professor of management at Clemson University, and Mirzokhidjon Abdurakhmonov and Dinesh Hasija, doctoral students at the University of Arkansas, studied the effect of congressional testimony given by representatives of large companies. Focusing on Fortune 500 firms that testified between 2004 and 2014, the researchers found that investors responded favorably to several aspects of testimony and that firms reaped positive returns after appearing before a congressional committee.

Previous research by Ridge and others has shown that corporate political activity, as a business strategy, can lead to several benefits, including government contracts, favorable tax rates and preferential access to financing. Congressional testimony, though it can be negative, is one of the most powerful forms of political influence. As one of the few publicly visible points of governmental access, it is highly coveted by firms. It is also a way for external groups, including investors, to gauge whether a firm has political influence.

To study this, Ridge and his colleagues examined "underlying mechanisms" such as status of the witness, length of testimony and "jurisdiction," that is, whether the congressional committee had legislative or regulatory power over the industry represented by the witness. Investors responded favorably to all three of these factors, the researchers found. Responses were even more favorable when a firm faced high regulatory risk and when the congressional committee members' tone was negative.

The researchers also studied whether the proceedings had a spillover effect on firms other than those represented in testimony. Similarly sized companies within the same industry experienced a limited positive effect following their competitor's testimony, the researchers found.

The researchers were also curious about the effect of media attention on public testimony.

"Perhaps surprisingly, we did not find that it amplified the effects of witness status, testimony length and committee jurisdiction," Ridge said. "This is important, because it runs counter to prevailing arguments. Our results suggest that neither the amount of media coverage nor the attention given to testimony mattered as much as the testimony itself."

Credit: 
University of Arkansas

Renal infarction is associated with acute kidney injury in patients with cardiac amyloidosis

Rochester, MN, May 15, 2019 - Systemic amyloidosis is a major cause of renal injury, mostly due to direct kidney damage caused by deposits of an abnormal protein called amyloid in the kidney parenchyma. In patients with cardiac amyloidosis, renal infarction is associated with acute kidney injury, according to a new study in Mayo Clinic Proceedings, published by Elsevier. The investigators recommend that a diagnosis of renal infarction be systematically considered in patients with unexplained acute kidney injury in the context of cardiac amyloidosis.

Amyloidosis is a heterogeneous group of diseases, classified according to the main precursor protein forming the amyloid fibrils. The most prevalent types are acquired systemic immunoglobulin light chain amyloidosis (AL amyloidosis); circulating acute-phase reactant serum amyloid A protein amyloidosis (AA amyloidosis); non-mutated or wild-type transthyretin amyloidosis; and hereditary forms caused by genetic variants that encode abnormal proteins. Cardiac involvement (the most relevant prognostic factor) is increasingly being diagnosed in these patients and is frequently accompanied by significant dysfunction of other major organs. Amyloid cardiomyopathy may be associated with thromboembolic events, mostly due to atrial fibrillation leading to intracardiac thrombus formation, or promoted by the hypercoagulability state observed in patients with nephrotic syndrome.

"The prevalence of renal infarction and the risk factors for this condition have never before been assessed in this patient population," explained lead investigator Vincent Audard, MD, PhD, Service de Néphrologie et Transplantation, Groupe Hospitalier Henri Mondor-Albert Chenevier, Université Paris Est Créteil, Créteil, France. "Computed tomography scans and magnetic resonance imaging are the gold standard method for confirming renal infarction but may be unsuitable in patients with systemic amyloidosis because these patients frequently exhibit impaired renal function and/or have heart devices."

In this observational study, investigators at the Amyloidosis Referral Center of Henri Mondor Hospital assessed the frequency of renal infarction in 87 patients with confirmed cardiac amyloidosis who underwent 99mTc-labeled DMSA renal scintigraphy from October 1, 2015, through February 28, 2018. Three groups of patients were defined on the basis of the underlying amyloidosis disorder: AL amyloidosis in 24 patients; mutated-transthyretin amyloidosis in 24 patients; and wild-type transthyretin amyloidosis in 39 patients.

One of the study's most significant findings is that acute kidney injury in the context of cardiac amyloidosis may be due to renal infarction. The prevalence of renal infarction was relatively high (20.7 percent) among the 87 patients with a definitive diagnosis of confirmed cardiac amyloidosis. These cases were evenly distributed among the three groups. At the time of renal scintigraphy, the frequency of acute kidney injury was higher in patients with renal infarction, and the likelihood of a renal infarction diagnosis in the presence or absence of acute kidney injury was 47.1 percent and 14.5 percent, respectively.

The investigators suggest that several factors, including direct kidney damage due to deposits of amyloid and indirect mechanisms of damage, such as renal failure due to low cardiac output, may be involved in the renal failure observed in these patients.

This study showed that after excluding heart transplant cases, patient survival did not differ significantly between patients with and without a diagnosis of renal infarction. By contrast, the authors found that death- and heart transplant-censored renal survival was significantly lower in patients with renal infarction.

"Overall, these data suggest that renal infarction should probably be added to the spectrum of renal manifestations related to systemic amyloidosis in cases of heart involvement," concluded Dr. Audard. "We recommend that a diagnosis of renal infarction should be systematically considered in patients with unexplained acute kidney injury in the context of cardiac amyloidosis."

Credit: 
Elsevier

Cofilin may be early culprit in tauopathy process leading to brain cell death

image: Neurofibrillary tau tangles (stained red) are one of the two major brain lesions of Alzheimer's disease. Blue fluorescent stain (DAPI) depicts the nerve cell nuclei.

Image: 
© USF Health, University of South Florida

TAMPA, Fla. -- The two primary hallmarks of Alzheimer's disease are clumps of sticky amyloid-beta (Aβ) protein fragments known as amyloid plaques and neurofibrillary tangles of a protein called tau. Abnormal accumulations of both proteins are needed to drive the death of brain cells, or neurons. But scientists still have a lot to learn about how amyloid impacts tau to promote widespread neurotoxicity, which destroys cognitive abilities like thinking, remembering and reasoning in patients with Alzheimer's.

While investigating the molecular relationship between amyloid and tau, University of South Florida neuroscientists discovered that the Aβ-activated enzyme cofilin plays an essential intermediary role in worsening tau pathology.

Their latest preclinical study was reported March 22, 2019 in Communications Biology.

The research introduces a new twist on the traditional view that adding phosphates to tau (known as phosphorylation) is the most important early event in tau's detachment from brain cell-supporting microtubules and its subsequent build-up into neurofibrillary tangles. These toxic tau tangles disrupt brain cells' ability to communicate, eventually killing them.

"We identified for the first time that cofilin binds to microtubules at the expense of tau - essentially kicking tau off the microtubules and interfering with tau-induced microtubule assembly. And that promotes tauopathy, the aggregation of tau seen in neurofibrillary tangles," said senior author David Kang, PhD, a professor of molecular medicine at the USF Health Morsani College of Medicine and director of basic research at Byrd Alzheimer's Center, USF Health Neuroscience Institute.

Dr. Kang also holds the Fleming Endowed Chair in Alzheimer's Research at USF Health and is a biological scientist at James A. Haley Veterans' Administration Hospital. Alexa Woo, PhD, assistant professor of molecular pharmacology and physiology and member of the Byrd Alzheimer's Center, was the study's lead author.

The study builds upon previous work at USF Health showing that Aβ activates cofilin through a protein known as Slingshot, or SSH1. Since both cofilin and tau appear to be required for Aβ neurotoxicity, in this paper the researchers probed the potential link between tau and cofilin.

The microtubules that provide structural support inside neurons were at the core of their series of experiments.

Without microtubules, axons and dendrites could not assemble and maintain the elaborate, rapidly changing shapes needed for neural network communication, or signaling. Microtubules also function as highly active railways, transporting proteins, energy-producing mitochondria, organelles and other materials from the body of the brain cell to distant parts connecting it to other cells. Tau molecules are like the railroad track ties that stabilize and hold train rails (microtubules) in place.

Using a mouse model for early-stage tauopathy, Dr. Kang and his colleagues showed that Aβ-activated cofilin promotes tauopathy by displacing tau molecules directly bound to microtubules, destabilizing microtubule dynamics, and disrupting synaptic function (neuron signaling) -- all key factors in Alzheimer's disease progression. Unactivated cofilin did not.

The researchers also demonstrated that genetically reducing cofilin helped prevent the tau aggregation leading to Alzheimer's-like brain damage in mice.

"Our data suggests that cofilin kicks tau off the microtubules, a process that possibly begins even before tau phosphorylation," Dr. Kang said. "That's a bit of a reconfiguration of the canonical model of how the pathway leading to tauopathy works."

Since cofilin activation is largely regulated by SSH1, an enzyme also activated by Aβ, the researchers propose that inhibiting SSH1 represents a new target for treating Alzheimer's disease or other tauopathies.

Dr. Kang's laboratory is working with James Leahy, PhD, a USF professor of chemistry, and Yu Chen, PhD, a USF Health professor of molecular medicine, on refining several SSH1 inhibitors that show preclinical promise as drug candidates.

Credit: 
University of South Florida

Arts education can provide creative counter narratives against hate speech

Hate speech has become a growing topic of discussion on a global scale, especially as advances in the internet have transformed communication on many levels. Nowadays, it's easy to spread hate speech on user-generated and anonymous online platforms.

A practical and creative way for policy-makers to raise awareness of these issues is to create culturally sensitive and effective counter narratives with the help of arts education. Arts education also helps teachers empower students to fight against hate speech.

According to research, teachers have seen positive results from educating their pupils about cyberbullying, but they need additional training to gain more knowledge on how to reduce involvement in and long-term exposure to bullying.

This is where arts education can make a difference, because banning hate speech doesn't reach the roots of hatred, says doctoral candidate Tuula Jääskeläinen from the University of the Arts Helsinki.

According to Jääskeläinen's paper 'Countering hate speech through arts and arts education', art can provide a space to support diverse viewpoints that can question hate speakers' simplified generalisations. Arts education can offer ways to disclose what is hidden and give tools to examine the ignorance, misunderstandings, and false beliefs within the historical and cultural contexts of hate speech.

Over the years, people have come up with clever solutions to strengthen solidarity. In the Council of Europe's Living Library, one can 'borrow' people instead of books. In this case, the people may be victims of hate speech or activists in combating hate speech, for example. In Finland, there is a community called ByHelpers, which fights against the bystander effect by encouraging people to help strangers in everyday life.

Recent research provides evidence that people's greater engagement with the arts often leads to greater pro-sociality through volunteering and charitable giving. Furthermore, research shows that children and young people who have been involved with arts in school become more active and engaged citizens than their less artistically involved peers when it comes to voting, volunteering, and general participation in society. Therefore, Jääskeläinen concludes that art can act as an important socio-psychological catalyst towards a cohesive and socially prosperous society.

Credit: 
University of the Arts Helsinki

Compositional design of multi-component alloys by high-throughput screening

image: (a) Schematic diagram of high-throughput, (b) physical shadow mask.

Image: 
©Science China Press

Recently, multi-component materials have become some of the most promising materials for engineering and biomedical applications. Compared with traditional alloys, the composition design of multi-component materials is more complicated, and many alloys with different compositions need to be prepared and tested. In addition, the relationship between the mixing entropy and the performance of multi-component materials is nonlinear, so structure and performance cannot be effectively predicted from mixing entropy values alone, which makes it more difficult to design the alloys efficiently. High-throughput technology is an effective way to address this issue. A recent study reported that high-throughput screening of the composition and Young's modulus of Ti-Zr-Nb alloys was successfully achieved by co-sputtering with the aid of a physical mask.
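For reference, the mixing entropy mentioned above is usually taken as the ideal configurational entropy, ΔS_mix = -R Σ x_i ln x_i, which for an equimolar N-component alloy reduces to R ln N. The short sketch below (not from the paper; the composition values are illustrative) computes this quantity for an equimolar ternary such as Ti-Zr-Nb:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_mixing_entropy(fractions):
    """Ideal configurational mixing entropy: dS = -R * sum(x_i * ln(x_i))."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "mole fractions must sum to 1"
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Equimolar ternary alloy (e.g. Ti-Zr-Nb): dS = R * ln(3) ~= 9.13 J/(mol K)
print(ideal_mixing_entropy([1/3, 1/3, 1/3]))
```

Because this value depends only on the mole fractions, alloys with identical mixing entropy can still differ widely in structure and properties, which is the nonlinearity the authors point to as motivating high-throughput screening.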

The research paper, entitled "High-throughput screening for biomedical applications in a Ti-Zr-Nb alloy system through masking co-sputtering", is published in SCIENCE CHINA Physics, Mechanics & Astronomy in Issue 9 of 2019, with Prof. Yong Zhang from the University of Science and Technology Beijing as the corresponding author.

Developing new alloys with special properties, such as excellent mechanical or biomedical properties, is usually a time-consuming process, and the conventional "trial and error" method cannot meet the requirements. Moreover, owing to the limitations of conventional research methods, only a few specific compositions can be obtained from a given set of experiments.

Taking biomedical materials as an example, the low Young's modulus value obtained is generally only a local minimum within a small composition region, rather than the lowest value of the global system. The conventional "trial and error" method therefore inevitably leaves research results incomplete and subject to chance.

High-throughput technology is an effective way to find a composition with desirable properties within a larger composition region while improving efficiency. On the basis of multi-target co-sputtering, an auxiliary physical mask was used to facilitate the preparation of compositional-gradient materials, and 16 independent specimens were obtained in this work.

In particular, the Young's modulus of the Ti-Zr-Nb alloys was measured by nanoindentation. The measured values were fitted to 3D surface maps and contour maps, as shown in Figure 2. Significantly, a low Young's modulus region is evident in Figure 2(a). To determine whether a composition with an even lower modulus existed in the blank areas between the specimens with the lowest measured moduli, the composition was optimized further. Based on the screening results, the formation, structure and mechanical properties of bulk alloys can be investigated further in detail.
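The mapping step described above can be sketched as a surface fit over a composition grid. The example below is a hypothetical illustration, not the authors' procedure: it fits a quadratic surface to 16 synthetic "measured" moduli on a 4x4 grid of assumed Zr and Nb fractions, then evaluates the fitted surface on a fine grid to locate the low-modulus region, including compositions between the measured specimens.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 16 specimens on a 4x4 composition grid (Zr and Nb mole
# fractions), with a synthetic modulus model plus noise standing in for
# nanoindentation measurements.
zr, nb = np.meshgrid(np.linspace(0.05, 0.35, 4), np.linspace(0.05, 0.35, 4))
zr, nb = zr.ravel(), nb.ravel()
E = 80 - 60 * zr - 40 * nb + 120 * zr**2 + 90 * nb**2 + rng.normal(0, 0.5, 16)

# Least-squares fit of a quadratic surface E(zr, nb) to the 16 data points.
A = np.column_stack([np.ones(16), zr, nb, zr**2, nb**2, zr * nb])
coef, *_ = np.linalg.lstsq(A, E, rcond=None)

# Evaluate the fitted surface on a fine grid (the analogue of the contour
# map) and report where the modulus minimum lies.
g = np.linspace(0.05, 0.35, 101)
gz, gn = np.meshgrid(g, g)
fit = (coef[0] + coef[1] * gz + coef[2] * gn
       + coef[3] * gz**2 + coef[4] * gn**2 + coef[5] * gz * gn)
i, j = np.unravel_index(fit.argmin(), fit.shape)
print(f"lowest fitted modulus near Zr={gz[i, j]:.2f}, Nb={gn[i, j]:.2f}")
```

The fitted surface lets the predicted minimum fall between measured specimens, which is exactly the "blank areas" question the composition optimization addresses.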

It should be noted that the physical mask is necessary to prevent component diffusion between the sample units. In general, the composition of materials obtained by multi-target co-sputtering changes continuously, which means that component diffusion is otherwise inevitable. To ensure a clear composition difference between specimens, a separate mask was used in this work.

This work not only offers novel multi-component alloys with prominent properties for practical applications, but also sheds new light on the development of high-throughput preparation technology in general.

Credit: 
Science China Press

Flu virus' best friend: Low humidity

New Haven, Conn. -- Yale researchers have pinpointed a key reason why people are more likely to get sick and even die from flu during winter months: low humidity.

While experts know that cold temperatures and low humidity promote transmission of the flu virus, less is understood about the effect of decreased humidity on the immune system's defenses against flu infection.

The Yale research team, led by Akiko Iwasaki, the Waldemar Von Zedtwitz Professor of Immunobiology, explored the question using mice genetically modified to resist viral infection as humans do. The mice were all housed in chambers at the same temperature, but with either low or normal humidity. They were then exposed to the influenza A virus.

The researchers found that low humidity hindered the immune response of the animals in three ways. It prevented cilia, which are hair-like structures in airway cells, from removing viral particles and mucus. It also reduced the ability of airway cells to repair damage caused by the virus in the lungs. The third mechanism involved interferons, or signaling proteins released by virus-infected cells to alert neighboring cells to the viral threat. In the low-humidity environment, this innate immune defense system failed.

The study offers insight into why the flu is more prevalent when the air is dry. "It's well known that when humidity drops, a spike in flu incidence and mortality occurs. If our findings in mice hold up in humans, our study provides a possible mechanism underlying this seasonal nature of flu disease," said Iwasaki.

While the researchers emphasized that humidity is not the only factor in flu outbreaks, it is an important one that should be considered during the winter season. Increasing water vapor in the air with humidifiers at home, school, work, and even hospital environments is a potential strategy to reduce flu symptoms and speed recovery, they said.

The study was published in the journal Proceedings of the National Academy of Sciences (PNAS).

Credit: 
Yale University