Tech

A deep look at nasal polyps offers insights into allergic diseases

Nasal polyps - soft, grape-like outgrowths that can appear in the nasal passages and sinuses - can be chronic and relentless. Although noncancerous, these outgrowths can grow large enough to block the nose and sinuses, leading to discomfort, breathing problems and infections. Nasal polyps can be surgically removed, but may grow back, sometimes in a matter of days. Although most patients are happy to be rid of their polyps, for researchers, that tissue is precious: It may hold critical clues about intense allergic inflammation. Investigators from Brigham and Women's Hospital, along with collaborators from the Broad Institute and MIT, have used some of the most advanced sequencing technology to peer into nasal polyps, gleaning new insights not only into this condition but also into the severe form of inflammation that may lead to other disorders, such as asthma, allergic rhinitis and allergic eczema. The team's findings were published online Aug. 22 in Nature.

"For our patients, one of the most frustrating things about chronic, allergic conditions is that we have no cure. Surgery can relieve discomfort for those with nasal polyps, but in many cases the effect is only temporary," said co-corresponding author Nora Barrett, MD, a physician in the Division of Rheumatology, Immunology, and Allergy at BWH. "My group's goal is to understand why the inflammatory process persists once it begins and to uncover the cause of these conditions."

Barrett and colleagues obtained samples from 12 patients with nasal polyps or other sinus conditions, for a total collection of 18,036 cells, and compared these to nasal scrapings from healthy individuals. Barrett and her team member Dan Dwyer, PhD, partnered with Alex K. Shalek, PhD, of the Broad Institute, the Ragon Institute and MIT, and his postdoctoral fellow Jose Ordovas-Montanes, PhD, to use massively parallel single-cell RNA sequencing - a technique that reveals which genes are turned on in each recovered cell. Rather than focusing on one suspected cell type, they used this approach to look at every cell type found in the harvested samples.
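
For readers unfamiliar with the workflow, here is a minimal sketch of what such an analysis typically looks like using the open-source scanpy library: all recovered cells are normalized, clustered without prior assumptions about cell type, and each cluster's most distinctive genes are ranked. This is a generic illustration under assumed defaults, not the team's published pipeline, and the input file name is hypothetical.

```python
# Generic single-cell RNA-seq analysis sketch (scanpy); illustrative only.
import scanpy as sc

adata = sc.read_h5ad("polyp_cells.h5ad")      # hypothetical cells-x-genes matrix

sc.pp.filter_cells(adata, min_genes=200)      # drop empty or low-quality cells
sc.pp.normalize_total(adata, target_sum=1e4)  # depth-normalize each cell
sc.pp.log1p(adata)                            # variance-stabilizing transform
sc.pp.highly_variable_genes(adata, n_top_genes=2000)

sc.tl.pca(adata)                              # compress to leading components
sc.pp.neighbors(adata)                        # build cell-cell similarity graph
sc.tl.leiden(adata)                           # unbiased clusters = cell types
sc.tl.rank_genes_groups(adata, "leiden")      # genes "turned on" per cluster
```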

What they found surprised them. One of the most striking findings was that epithelial progenitor cells - stem-like cells that give rise to the cells lining the airways - had been completely remodeled in the polyp samples, and the change appeared to be lasting: even when removed from the tissue and grown in the lab, the resulting cells showed marked genetic differences.

"We found that stem-like epithelial cells expressed an aberrant program," said Barrett. "To a dramatic extent, inflammation had changed the basic tissue architecture at the genetic level."

The Nature publication offers a global, cellular map of inflamed tissue for what is known as type 2 inflammation - a severe form of inflammation that involves immune cells that have gone rogue, triggering a cascade of immune action. The map points to many pathways that have been altered, and Barrett and colleagues are working through these to identify which ones may be driving inflammation and which ones may be resulting from the inflammatory process. The research team also tested a monoclonal antibody that helped restore normal genetic activity - suggesting that it may be possible to develop therapies in the future to restore a normal balance to cells that have been altered through inflammation.

The team also hopes to use the new information to develop a genetic signature that would allow clinicians to take a swab of nasal mucosa to test for lung conditions. Using the new map to chart the key characteristics of type 2 inflammation could yield the tools needed to predict disease states and, potentially, direct targeted therapy.

Credit: 
Brigham and Women's Hospital

AMP addresses clinical relevance of DNA variants in chronic myeloid neoplasms

ROCKVILLE, Md. -- Aug. 21, 2018 -- The Association for Molecular Pathology (AMP), the premier global molecular diagnostics professional society, today published consensus, evidence-based recommendations to aid clinical laboratory professionals with the management of most Chronic Myeloid Neoplasms (CMNs) and development of high-throughput pan-myeloid sequencing testing panels. The report, "Clinical Significance of DNA Variants in Chronic Myeloid Neoplasms (CMNs): A Report of the Association for Molecular Pathology," was released online ahead of publication in The Journal of Molecular Diagnostics.

CMNs are a complex group of hematopoietic disorders, encompassing myelodysplastic syndromes (MDS), myeloproliferative neoplasms (MPNs), and the overlap entities (MDS/MPNs), that cause a person's bone marrow to make too many, or too few, red blood cells, white blood cells or platelets. The increasing availability of targeted high-throughput next-generation sequencing (NGS) panels has enabled scientists to explore the genetic heterogeneity and clinical relevance of the small DNA variants in CMNs. However, the biological complexity and multiple forms of CMNs have led to variability in the genes included on the available panels, which are used to make an accurate diagnosis, provide reliable prognostic information and select an appropriate therapy based on the DNA variant profiles present at various time points. The AMP CMN Working Group was established to review the published literature, summarize key findings that support clinical utility and define a minimum set of critical gene inclusions for all high-throughput pan-myeloid sequencing panels.

"The molecular pathology community has witnessed a recent explosion of scientific literature highlighting the clinical significance of small DNA variants in CMNs," said Rebecca F. McClure, MD, Associate Professor, Health Sciences North and the Northern Ontario School of Medicine, AMP CMN Working Group Member and Co-lead Author. "AMP's working group recognized a clear unmet need for evidence-based recommendations to assist in the development of the high-quality pan-myeloid gene panels that provide relevant diagnostic and prognostic information and enable monitoring of clonal architecture."

The AMP CMN Working Group proposed the following 34 genes as a minimum recommended testing list: ASXL1, BCOR, BCORL1, CALR, CBL, CEBPA, CSF3R, DNMT3A, ETV6, EZH2, FLT3, IDH1, IDH2, JAK2, KIT, KRAS, MPL, NF1, NPM1, NRAS, PHF6, PPM1D, PTPN11, RAD21, RUNX1, SETBP1, SF3B1, SMC3, SRSF2, STAG2, TET2, TP53, U2AF1, and ZRSR2. The list of genes is meant to aid clinical laboratory professionals with the management of most CMNs and selection of myeloid testing panels.
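
A laboratory could verify a candidate panel against this minimum list with a simple set comparison. The short sketch below does exactly that; the 34 genes are taken from the report, while the example lab panel is hypothetical.

```python
# Check a hypothetical NGS panel against the AMP 34-gene minimum list.
AMP_MINIMUM_GENES = {
    "ASXL1", "BCOR", "BCORL1", "CALR", "CBL", "CEBPA", "CSF3R", "DNMT3A",
    "ETV6", "EZH2", "FLT3", "IDH1", "IDH2", "JAK2", "KIT", "KRAS", "MPL",
    "NF1", "NPM1", "NRAS", "PHF6", "PPM1D", "PTPN11", "RAD21", "RUNX1",
    "SETBP1", "SF3B1", "SMC3", "SRSF2", "STAG2", "TET2", "TP53", "U2AF1",
    "ZRSR2",
}

lab_panel = {"JAK2", "CALR", "MPL", "TP53", "TET2", "DNMT3A"}  # hypothetical

missing = sorted(AMP_MINIMUM_GENES - lab_panel)
if missing:
    print(f"Panel lacks {len(missing)} recommended genes: {', '.join(missing)}")
```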

"While the goal of the study was to distill the literature for molecular pathologists, in doing so we also revealed recurrent mutational patterns of clonal evolution that will aid hematologist/oncologists, researchers, and pathologists understand how to interpret the results of these panels as they reveal critical biology of the neoplasms," said Annette S. Kim, MD, PhD, Associate Professor of Pathology at Harvard Medical School and Brigham and Women's Hospital, AMP Hematopathology Subdivision Chair and CMN Working Group Chair.

"This new CMN report is another validation of AMP's commitment to continuously improve clinical practice and patient care," said Mark D. Ewalt, MD, Assistant Professor at University of Colorado, AMP CMN Working Group Member and Co-lead Author. "Moving forward, the AMP CMN Working Group will plan on revisiting and updating the gene list as insight on specific clinicopathologic characteristics of CMNs accumulates."

Credit: 
Association for Molecular Pathology

Simple leg exercises could reduce impact of sedentary lifestyle on heart and blood vessels

image: Data collection of popliteal artery blood flow measurements with the subject in the seated position.

Image: 
Vranish et al.

A sedentary lifestyle can impair the transport of blood around the body, increasing the risk of disease in the heart and blood vessels. New research published in Experimental Physiology suggests that performing simple leg exercises whilst lying down might help to prevent these problems.

Previous work has demonstrated that prolonged sitting for up to six hours results in a decline both in blood flow to the limbs and in the ability of our larger arteries to widen to accommodate increased blood flow. This is the first study to demonstrate that sitting for just 10 minutes is sufficient to reduce blood flow to the legs and impair the function of the small blood vessels supplying the leg muscles.

This paper also demonstrates a reduction in the function of small blood vessels when lying down. However, this study suggests we might be able to somewhat reverse this impairment in function by performing simple leg exercises when lying down in bed or on the sofa. These findings are important in increasing our understanding of the negative impact of sitting and physical inactivity on blood vessel function and the supply of blood to the legs.

The effects of sitting on blood circulation have been attributed to blood passing more sluggishly through arteries whilst sitting. The researchers who performed this study aimed to find out whether these reductions were caused by sustained sitting, or whether 10 minutes would be sufficient to have a negative effect.

The research group used a Doppler ultrasound technique at the popliteal artery behind the knee to measure blood flow and examined the extent to which blood vessels widened in 18 healthy, young males. These measurements were made before and after a 10-minute period of sitting, or during a period of rest whilst lying down, with or without leg exercises, which were performed by extending the foot back and forth every two seconds for a third of the time spent lying down. Results showed that a 10-minute period of sitting reduced participants' ability to rapidly increase blood flow to the lower legs via the small blood vessels, but did not affect the widening of larger arteries in response to increased blood flow. These findings suggest that a brief period of inactivity impairs an individual's ability to rapidly push blood to the lower limbs as efficiently as normal, but does not affect the ability of large blood vessels to widen. The results also suggest leg exercises can help maintain rapid increases in the blood supply to the limbs.
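
For context, limb blood flow is conventionally derived from two Doppler measurements: the mean blood velocity and the vessel diameter. The snippet below shows that standard calculation; the resting values plugged in are illustrative assumptions, not data from this study.

```python
# Standard Doppler-based blood flow calculation (illustrative values).
import math

def blood_flow_ml_per_min(mean_velocity_cm_s: float, diameter_cm: float) -> float:
    """Volume flow = mean velocity x cross-sectional area, in mL/min."""
    area_cm2 = math.pi * (diameter_cm / 2) ** 2
    return mean_velocity_cm_s * area_cm2 * 60  # cm^3/s -> mL/min

# plausible resting popliteal-artery values (assumed, not measured here)
print(blood_flow_ml_per_min(mean_velocity_cm_s=10.0, diameter_cm=0.6))  # ~170
```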

The current study demonstrates changes in blood vessel function measured at the level of the knee. However, the researchers only tested healthy young males, so their findings cannot be extended to females. It also remains unknown how these responses vary with age or in people with heart conditions. Further research may investigate the impact of sitting and inactivity on blood vessels elsewhere in the body. For example, would sitting impair the function of blood vessels supplying the brain? Finally, studies designed to investigate the impact of repeated bouts of short-term sitting on blood vessel function are needed.

Co-author Dr. Paul Fadel sheds light on his team's results: "These findings further our understanding of the negative impact of inactivity on blood vessel function and demonstrate the positive effects of simple leg exercises whilst lying down, providing further insight into how inactivity affects vascular health of the lower legs."

Credit: 
The Physiological Society

Preparing for chemical attacks with improved computer models

image: Plume development over time.

Image: 
Sudheer Bhimireddy and Kiran Bhaganagar

On April 4, 2017, the town of Khan Sheikhoun in northwest Syria experienced one of the worst chemical attacks in recent history. A plume of sarin gas spread more than 10 kilometers (about six miles), carried by buoyant turbulence, killing more than 80 people and injuring hundreds.

Horrified by the attack, but also inspired to do something useful, Kiran Bhaganagar, professor of mechanical engineering at The University of Texas at San Antonio, and her team from the Laboratory of Turbulence Sensing and Intelligence Systems used computer models to replicate the dispersal of the chemical gas. Results were published in Natural Hazards in May 2017. Despite a scarcity of information, her simulations accurately captured real-world conditions.

"If there is a sudden a chemical attack, questions that are important are: 'how far does it go' and 'what direction does it go,'" Bhaganagar said. "This is critical for evacuations."

Bhaganagar's research is supported by the U.S. Army Edgewood Chemical Biological Center (ECBC), which hopes to adopt her models to assist in the case of an attack on American soil.

Chemicals, whether toxic agents like sarin gas or exhaust from vehicles, travel differently from other particulates in the atmosphere. Like wildfires, which can move incredibly fast, chemicals create their own micro-conditions, depending on the density of the material and how it mixes with the atmosphere. This phenomenon is known as buoyant turbulence and it leads to notable differences in how chemicals travel during the day or at night, and during different seasons.
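
To make the transport problem concrete, the snippet below implements the textbook Gaussian plume formula for a continuous point release, with Briggs-style dispersion coefficients for neutral atmospheric conditions. It is a deliberately simple stand-in for intuition only: it ignores the buoyant turbulence central to Bhaganagar's work, and every parameter value is illustrative.

```python
# Textbook Gaussian plume model -- far simpler than the team's simulations.
import numpy as np

def plume_concentration(x, y, z, Q=1.0, u=3.0, H=10.0):
    """Steady concentration (kg/m^3) at (x, y, z) downwind of a point source.

    Q: emission rate (kg/s); u: wind speed (m/s); H: release height (m).
    sigma_y, sigma_z: Briggs dispersion coefficients, neutral stability.
    """
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    # reflection term: the ground bounces the plume back upward
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# ground-level concentration 2 km directly downwind of the source
print(plume_concentration(x=2000.0, y=0.0, z=0.0))
```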

"In the nighttime and early morning, even when you have calm winds, the gradients are very sharp, which means chemicals travel faster," Bhaganagar explained.

Even ordinary turbulence is difficult to mathematically model and predict. It functions on a range of scales, each interacting with the others, and disperses energy as it moves to the smallest levels. Modeling buoyant turbulence is even harder. To predict the effects of turbulence on the dispersal of chemical particles, Bhaganagar's team ran computer simulations on the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), the largest system at any U.S. university.

"We go into the physics of it and try to understand what the vertices are and where the energy is," Bhaganagar said. "We decompose the problem and each processor solves for a small portion. Then we put everything back together to visualize and analyze the results."

Bhaganagar used TACC's supercomputers through the University of Texas Research Cyberinfrastructure (UTRC) initiative, which, since 2007, has provided researchers at any of the University of Texas System's 14 institutions access to TACC's resources, expertise and training.

The background atmosphere and time of day play a big role in the dispersal. In the case of the Syria attack, Bhaganagar first had to determine the wind speeds, the temperature, and the kinds of chemicals involved. With that information in hand, her high-resolution model was able to predict how far and in what direction chemical plumes traveled.

"In Syria, it was very bad because the timing caused it to be ideal conditions to spread very fast," she said. "We ran the actual case of Syria on the TACC supercomputer, got all of the background information and added it to the models, and our models captured the boundaries of the plume and which cities it spread to. We saw it was very similar to what was reported in the news. That gave us confidence that our system works and that we could use it as an evacuation tool."

The research is targeted to short-term predictions: understanding in what direction chemicals will propagate within a four-hour window and working with first responders to deploy personnel appropriately.

However, running the high-resolution model takes time. In the case of the Syria simulation, it required five full days of number crunching on Stampede2 to complete. During a real attack, such time wouldn't be available. Consequently, Bhaganagar also developed a coarser model that uses a database of seasonal conditions as background information to speed up the calculations.

For this purpose, Bhaganagar's team has introduced a novel mobile sensing protocol: they deploy low-cost aerial drones and ground-based sensors to gather local wind data, then use the coarser model to predict the plume transport.

Using this method, the four-hour predictions can be computed in as little as 30 minutes. She is working to bring the time down even further, to 10 minutes. This would allow officials to rapidly issue accurate evacuation orders, or place personnel where they are needed to assist in protecting citizens.

"There are hardly any models that can predict to this level of accuracy," Bhaganagar said. "The Army uses trucks with mobile sensors, which they send into a circle around the source. But it's very expensive and they have to send soldiers, which is a danger to them." In the future, the army hopes to combine computer simulations and live monitoring in the case of a chemical attack.

Bhaganagar will conduct tests in the coming months at the U.S. Army experimental facility in Maryland to determine how accurately drones can measure wind conditions.

"The higher the accuracy of the data -- the wind speed, wind direction, local temperature -- the better the prediction," she explained. "We use drones to give us additional data. If you can feed this data into the model, the accuracy for the four-hour window is much higher."

Most recently, she and her graduate student Sudheer Bhimireddy, a Ph.D. candidate, integrated their buoyant turbulence model with the high-resolution Advanced Research version of the Weather Research and Forecasting (WRF) model to understand the role of atmospheric stability in the short-term transport of chemical plumes. The research appears in the September 2018 edition of Atmospheric Pollution Research.

Developing Tools to Detect Pollution in Your Community

In related work funded by the National Science Foundation, Bhaganagar has adapted her chemical plume model for pollution tracking. She hopes her code can help communities predict local pollution conditions.

According to Bhaganagar, low-cost wind and gas sensors purchased by a community could help produce daily forecasts so individuals can take the proper precautions when pollution is concentrated in an area. Recent efforts have tried to determine how many sensors are needed to allow accurate local predictions.

"Can we detect zones of pollution and take effective measures to avoid pollution?" Bhaganagar asked. "If we had our own small-scale models that we could use in our communities that would have a big impact on pollution."

Though community pollution forecasts would ultimately run on consumer-grade computers, such predictions would not be possible without access to supercomputers to test the models and generate a database of background conditions.

"TACC resources are so valuable," she said. "I wouldn't have even attempted these research projects if I wasn't able to access TACC supercomputers. They're absolutely necessary for developing new turbulence models that can save lives in the future."

Credit: 
University of Texas at Austin, Texas Advanced Computing Center

NIST details steps to keep buildings functioning after natural hazards

image: The devastation (left image) following the 1906 earthquake that dramatically impacted the economic and social growth of San Francisco, compared with the "West Coast's most resilient skyscraper," the 181 Fremont tower (right image, center), designed to withstand a 475-year seismic event with minimal disruption. Coincidentally, the tower stands one block from the corner seen in the 1906 photo.

Image: 
Left image: California State Library; right image: Jay Paul Company

After an earthquake, hurricane, tornado or other natural hazard, it's considered a win if no one gets hurt and buildings stay standing. But an even bigger victory is possible: keeping those structures operational. This outcome could become more likely with improved standards and codes for the construction of residential and commercial buildings, according to a new report recently delivered to the U.S. Congress by the National Institute of Standards and Technology (NIST).

"Current standards and codes focus on preserving lives by reducing the likelihood of significant building damage or structural collapse from hazards," said Steven McCabe, director of the NIST-led, multiagency National Earthquake Hazards Reduction Program (NEHRP) and one of the authors of the new publication. "But they generally don't address the additional need to preserve quality of life by keeping buildings habitable and functioning as normally as possible, what we call 'immediate occupancy.' The goal of our report is to put the nation on track to achieve this performance outcome."

The impact of a natural hazard on a community is usually most evident in the lives lost and physical destruction, but the accompanying economic shock, social disruptions and reduced quality of life can often be devastating as well. "Cities and towns can be rebuilt, but lifestyles are damaged, sometimes permanently, if businesses, schools, utilities, transportation and other essential operations are out of service for an extended period," said Therese McAllister, manager of NIST's Community Resilience Program and another report author.

The infamous 1906 San Francisco earthquake provides a dramatic example of that impact. In the half-century following the 1840s Gold Rush in California, San Francisco was the fastest growing metropolitan area in the region. That all changed on April 18, 1906, when the quake and resulting fires destroyed 80 percent of the city, killed some 3,000 residents and left nearly 300,000 people--three-fourths of the population--homeless, out of work and without essential services. Though San Francisco would rebuild quickly, the disaster diverted trade, industry and people south to Los Angeles, which then supplanted the "City by the Bay" as the largest, most important urban center in the western United States.

Even with modern building codes and standards in place, there is still room for improvement, as evidenced by the massive damage from the May 2011 tornado in Joplin, Missouri, and the three major 2017 hurricanes striking Texas, Florida and Puerto Rico.

"Immediate occupancy performance measures would help avoid catastrophes because they could build up a community's resiliency against natural hazards so that people still can live at home, still can go to work and still can have the supporting infrastructure providing them services such as water and electricity," McCabe said.

In 2017, Congress tasked NIST with defining what it would take to achieve immediate occupancy performance codes and standards for all buildings in all types of natural hazards, specifically in terms of fundamental research needs, possible technological applications based on that research, and key strategies that could be used to implement any resulting regulations.

The result of that effort is the new NIST report, Research Needs to Support Immediate Occupancy Building Performance Objective Following Natural Hazard Events (NIST Special Publication 1224). The publication identifies a large portfolio of research and implementation activities that target enhanced performance objectives for residential and commercial buildings.

"The report provides valuable information about steps that could be taken to achieve immediate occupancy in the future," McAllister said.

The potential research activities presented in the report to Congress were developed with the assistance of a steering committee of recognized experts and stakeholder input obtained during a national workshop hosted by NIST in January 2018. The workshop participants identified four key areas that they believe must be considered when developing plans to achieve immediate occupancy performance: building design, community needs, economic and social impacts, and fostering acceptance and use of new practices.

For example, the report states that immediate occupancy performance measures must be developed, established and implemented with a sensitivity to how they will economically affect building owners, business operators, occupants and even whole communities. "You have to make sure that the cost of keeping buildings functional after natural hazards remains reasonable enough that everyone will be able to afford them," McCabe said.

The report also discusses key challenges facing the effort to make buildings functional in the wake of natural hazards, such as motivating communities to make the investment, managing how costs and benefits are balanced, and garnering public support.

Finally, the report concludes by recognizing that "increasing the performance goals for buildings would not be easily achieved, but the advantages may be substantial" and that making such objectives a reality "would entail a significant shift in practice for development, construction and maintenance or retrofit of buildings." The report, its authors state, is the first step toward creating an action plan to achieve immediate occupancy across the nation with coordinated and detailed research goals and implementation activities.

"Our report outlines the steps that could be taken for a big raise of the bar--perhaps the biggest change in building standards and codes in 50 years--but one we believe is possible," McCabe said.

Credit: 
National Institute of Standards and Technology (NIST)

Summer weather stall: Nice sunny days can grow into heat waves and wildfires

Be it heavy downpours or super-hot spells, summer weather is becoming more persistent in North America, Europe and parts of Asia. When those conditions stall for several days or weeks, they can turn into extremes: heat waves resulting in droughts, health risks and wildfires, or relentless rainfall resulting in floods. A team of scientists now presents the first comprehensive review of research on summer weather stalling, focusing on the influence of the disproportionately strong warming of the Arctic caused by greenhouse-gas emissions from burning fossil fuels. Evidence is mounting, they show, that humanity is likely meddling with circulation patterns high up in the sky. These in turn affect regional and local weather patterns - with sometimes disastrous effects on the ground. This was the case with the 2016 wildfire in Canada, as another team of scientists shows in a second study.

"Giant airstreams encircle our globe in the upper troposphere - we call them planetary waves," explains Hans Joachim Schellnhuber, Director of the Potsdam Institute for Climate Impact Research (PIK) and co-author of the second paper. "Now evidence is mounting that humanity is messing with these enormous winds. Fueled by human-made greenhouse-gas emissions, global warming is probably distorting the natural patterns." Usually the waves, conveying chains of high- and low-pressure domains, travel eastwards between the equator and the North Pole. "Yet when they get trapped due to a subtle resonance mechanism," says Schellnhuber, "they slow down so the weather in a given region gets stuck. Rains can grow into floods, sunny days into heat waves, and tinder-dry conditions into wildfires."

Investigating the Arctic Factor and connecting the dots

"While it might not sound so bad to have more prolonged sunny episodes in summer, this is in fact a major climate risk," says Dim Coumou from the Potsdam Institute for Climate Impact Research (PIK) and Vrije Universiteit Amsterdam, lead-author of the review paper and co-author of the wildfire case study. "We have rising temperatures due to human-caused global warming which intensifies heat waves and heavy rainfall, and on top of that we could get dynamical changes that make weather extremes even stronger - this is quite worrying." This summer is an impressive example of how stalling weather can impact societies: persistent hot and dry conditions in Western Europe, Russia and parts of the US threaten cereal yields in these breadbaskets.

Many studies have appeared on this topic in recent years, sometimes with seemingly conflicting results. For the paper now published in Nature Communications, an international team of scientists set out to review the existing research and connect the dots, with a focus on the Arctic factor. Under global warming, the Arctic warms more than the rest of the Northern Hemisphere. This reduces the temperature difference between the North Pole and the equator, yet this very difference is a main driving force of the airstreams. "There are many studies now, and they point to a number of factors that could contribute to increased airstream stalling in the mid-latitudes - besides Arctic warming, there's also the possibility of climate-change-induced shifting of the storm tracks, as well as changes in the tropical monsoons," says Simon Wang from Utah State University in the US, a co-author of the review paper.

"Under global warming, the Indian summer monsoon rainfall will likely intensify and this will also influence the global airstreams and might ultimately contribute to more stalling weather patterns. All of these mechanisms do not work in isolation but interact," says Wang. "There is strong evidence that winds associated with summer weather systems are weakening and this can interact with so-called amplified quasi-stationary waves. These combined effects point towards more persistent weather patterns, and hence more extreme weather."

The case of the Canadian wildfire disaster

The 2016 wildfire in Alberta, Canada, is one stark example of the potentially disastrous impact of planetary-wave slow-down and the resulting summer-weather stalling. In a study now published in Scientific Reports, the other research team shows that the blaze was indeed preceded by the trapping of a specific kind of airstream in the region. In combination with a very strong El Niño event, this favored unusually dry and high-temperature conditions on the ground, raising the fire hazard. It took two months before officials could declare the fire under control. It was the costliest disaster in Canadian history, with total damages reaching 4.7 billion Canadian dollars.

"Clearly, the planetary wave pattern wasn't the only cause for the fire - yet it was an additional important factor triggering a deplorable disaster," says Vladimir Petoukhov from PIK, lead-author of the case study. "In fact, our analysis reveals that beyond that single event, actually from the 1980s on, planetary waves were a significant factor for wildfire risks in the region. Since it is possible to detect the wave patterns with a relatively long lead-time of ten days, we hope that our findings can help forest managers and fire forecasters in the future."

A phenomenon that sounds funny but isn't: "extreme extremes"

"Computer simulations generally support the observations and our theoretical understanding of the processes, so this seems pretty robust," concludes Coumou. "However, the observed changes are typically more pronounced than those seen in climate models." So either the simulations are too conservative, or the observed changes are strongly influenced by natural variability. "Our review aims at identifying knowledge gaps and ways forward for future research," says Coumou. "So there's still a lot to do, including machine learning and the use of big data. While we do not have certainty, all in all the state of research indicates that changes in airstreams can, together with other factors, lead to a phenomenon that sounds funny but isn't: extreme extremes."

Credit: 
Potsdam Institute for Climate Impact Research (PIK)

E-cigarettes can damage DNA, suggests study of oral cells from just 5 users

BOSTON, Aug. 20, 2018 -- The popularity of electronic cigarettes continues to grow worldwide, as many people view them as a safer alternative to smoking. But the long-term effects of e-cigarette usage, commonly called "vaping," are unknown. Today, researchers report that vaping may modify the genetic material, or DNA, in the oral cells of users, which could increase their cancer risk.

The researchers will present their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features more than 10,000 presentations on a wide range of science topics.

"E-cigarettes are a popular trend, but the long-term health effects are unknown," says Romel Dator, Ph.D., who is presenting the work at the meeting. "We want to characterize the chemicals that vapers are exposed to, as well as any DNA damage they may cause."

Introduced to the market in 2004, e-cigarettes are handheld electronic devices that heat a liquid, usually containing nicotine, into an aerosol that the user inhales. Different flavors of liquids are available, including many that appeal to youth, such as fruit, chocolate and candy. According to a 2016 report by the U.S. Surgeon General, 13.5 percent of middle school students, 37.7 percent of high school students and 35.8 percent of young adults (18 to 24 years of age) have used e-cigarettes, compared with 16.4 percent of older adults (25 years and up).

"It's clear that more carcinogens arise from the combustion of tobacco in regular cigarettes than from the vapor of e-cigarettes," says Silvia Balbo, Ph.D., the project's lead investigator, who is at the Masonic Cancer Center at the University of Minnesota. "However, we don't really know the impact of inhaling the combination of compounds produced by this device. Just because the threats are different doesn't mean that e-cigarettes are completely safe."

To characterize chemical exposures during vaping, the researchers recruited five e-cigarette users. They collected saliva samples before and after a 15-minute vaping session and analyzed the samples for chemicals that are known to damage DNA. To evaluate possible long-term effects of vaping, the team assessed DNA damage in the cells of the volunteers' mouths. The researchers used mass-spectrometry-based methods they had developed previously for a different study in which they evaluated oral DNA damage caused by alcohol consumption.

Dator and Balbo identified three DNA-damaging compounds, formaldehyde, acrolein and methylglyoxal, whose levels increased in the saliva after vaping. Compared with people who don't vape, four of the five e-cigarette users showed increased DNA damage related to acrolein exposure. The type of damage, called a DNA adduct, occurs when toxic chemicals, such as acrolein, react with DNA. If the cell does not repair the damage so that normal DNA replication can take place, cancer could result.

The researchers plan to follow up this preliminary study with a larger one involving more e-cigarette users and controls. They also want to see how the level of DNA adducts differs between e-cigarette users and regular cigarette smokers. "Comparing e-cigarettes and tobacco cigarettes is really like comparing apples and oranges. The exposures are completely different," Balbo says. "We still don't know exactly what these e-cigarette devices are doing and what kinds of effects they may have on health, but our findings suggest that a closer look is warranted."

A press conference on this topic will be held Tuesday, August 21, at 11 a.m. Eastern time in the Boston Convention & Exhibition Center. Reporters may check in at the press center, Room 102 A, or watch live on YouTube http://bit.ly/ACSLive_Boston2018. To ask questions online, sign in with a Google account.

The researchers acknowledge support and funding from the University of Minnesota.

Credit: 
American Chemical Society

Turning tracking codes into 'clouds' to authenticate genuine 3-D printed parts

image: In a new article in Advanced Engineering Materials, Nikhil Gupta, an associate professor of mechanical engineering at NYU Tandon, and collaborators at NYU Tandon and NYU Abu Dhabi exploited the layer-by-layer additive manufacturing (AM) printing process to 'explode' QR codes within computer-assisted design (CAD) files so that they present several false faces -- dummy QR tags -- to a micro-CT scanner or other scanning device.

Image: 
Nikhil Gupta

BROOKLYN, New York, Monday, August 21, 2018 - The worldwide market for 3D-printed parts is a $5 billion business with a global supply chain involving the internet, email, and the cloud - creating a number of opportunities for counterfeiting and intellectual property theft. Flawed parts printed from stolen design files could produce dire results: experts predict that by 2021, 75 percent of new commercial and military aircraft will fly with 3D-printed engine, airframe, and other components, and that the use of additive manufacturing (AM) in the production of medical implants will grow by 20 percent per year over the next decade.

A team at NYU Tandon School of Engineering has found a way to prove the provenance of a part by employing QR (Quick Response) codes in an innovative way for unique device identification. In the latest issue of Advanced Engineering Materials, the researchers describe a method for converting QR codes, bar codes, and other passive tags into three-dimensional features hidden in such a way that they neither compromise the part's integrity nor announce themselves to counterfeiters who have the means to reverse engineer the part.

Noted materials researcher Nikhil Gupta, an associate professor of mechanical engineering at NYU Tandon; Fei Chen, a doctoral student under Gupta; and joint NYU Tandon and NYU Abu Dhabi researchers Nektarios Tsoutsos, Michail Maniatakos and Khaled Shahin, detail how they exploited the layer-by-layer AM printing process to turn QR codes into a game of 3D chess. Gupta's team developed a scheme that "explodes" a QR code within a computer-assisted design (CAD) file so that it presents several false faces -- dummy QR tags -- to a micro-CT scanner or other scanning device. Only a trusted printer or end user would know the correct head-on orientation for the scanner to capture the legitimate QR code image.

"By converting a relatively simple two-dimensional tag into a complex 3D feature comprising hundreds of tiny elements dispersed within the printed component, we are able to create many 'false faces,' which lets us hide the correct QR code from anyone who doesn't know where to look," Gupta said.

The team tested different configurations -- from distributing a code across just three layers of the object, to fragmenting the code into up to 500 tiny elements -- on thermoplastics, photopolymers, and metal alloys, with several printing technologies commonly employed in the industry.

Chen, the study's lead author, said that after embedding QR codes in such simple objects as cubes, bars, and spheres, the team stress-tested the parts, finding that the embedded features had negligible impact on structural integrity.

"To create typical QR code contrasts that are readable to a scanner you have to embed the equivalent of empty spaces," she explained. "But by dispersing these tiny flaws over many layers we were able to keep the part's strength well within acceptable limits."

Tsoutsos and Maniatakos explored threat vectors to determine which AM sectors are best served by this security technology, a step that Gupta said was crucial in the research.

"You need to be cost efficient and match the solution to the threat level," he explained. "Our innovation is particularly useful for sophisticated, high-risk sectors such as biomedical and aerospace, in which the quality of even the smallest part is critical."

A 2016 article by Gupta and a team of researchers that included Maniatakos and Tsoutsos in JOM, the journal of The Minerals, Metals & Materials Society, explored how flaws caused by printing orientation and insertion of fine defects could be foci for AM cyber-attacks. The paper was the most-read engineering research that year across more than 245 Springer engineering journals. In a paper last year in Materials and Design, Gupta detailed methods of inserting undetectable flaws within CAD files so that only a trusted printer could correctly produce the parts.

Credit: 
NYU Tandon School of Engineering

The environmental cost of contact lenses

image: Contact lenses recovered from treated sewage sludge could harm the environment.

Image: 
Charles Rolsky

BOSTON, Aug. 19, 2018 -- Many people rely on contact lenses to improve their vision. But these sight-correcting devices don't last forever -- some are intended for a single day's use -- and they are eventually disposed of in various ways. Now, scientists are reporting that throwing these lenses down the drain at the end of their use could be contributing to microplastic pollution in waterways.

The researchers are presenting their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features more than 10,000 presentations on a wide range of science topics.

The inspiration for this work came from personal experience. "I had worn glasses and contact lenses for most of my adult life," Rolf Halden, Ph.D., says. "But I started to wonder, has anyone done research on what happens to these plastic lenses?" His team had already been working on plastic pollution research, and it was a startling wake-up call when they couldn't find studies on what happens to contact lenses after use.

"We began looking into the U.S. market and conducted a survey of contact lens wearers. We found that 15 to 20 percent of contact wearers are flushing the lenses down the sink or toilet," says Charlie Rolsky, a Ph.D. student who is presenting the work. Halden, Rolsky and a third member of the team, Varun Kelkar, are at the Biodesign Institute's Center for Environmental Health Engineering at Arizona State University (ASU). "This is a pretty large number, considering roughly 45 million people in the U.S. alone wear contact lenses."

Lenses that are washed down the drain ultimately end up in wastewater treatment plants. The team estimates that anywhere from six to 10 metric tons of plastic lenses end up in wastewater in the U.S. alone each year. Contacts tend to be denser than water, which means they sink, and this could ultimately pose a threat to aquatic life, especially bottom feeders that may ingest the contacts, Halden says.

Analyzing what happens to these lenses is a challenge for several reasons. First, contact lenses are transparent, which makes them difficult to observe in the complicated milieu of a wastewater treatment plant. Further, the plastics used in contact lenses differ from other plastic waste, such as polypropylene, which can be found in everything from car batteries to textiles. Contact lenses are instead frequently made from a combination of poly(methyl methacrylate), silicones and fluoropolymers to create a softer material that allows oxygen to pass through the lens to the eye. So it's unclear how wastewater treatment affects contacts.

These differences make processing contact lenses in wastewater plants a challenge. To assess their fate during treatment, the researchers exposed five polymers found in many manufacturers' contact lenses to the anaerobic and aerobic microorganisms present at wastewater treatment plants for varying lengths of time, then analyzed the polymers by Raman spectroscopy. "We found that there were noticeable changes in the bonds of the contact lenses after long-term treatment with the plant's microbes," says Kelkar. The team concluded that microbes in the wastewater treatment facility actually altered the surface of the contact lenses, weakening the bonds in the plastic polymers.

"When the plastic loses some of its structural strength, it will break down physically. This leads to smaller plastic particles which would ultimately lead to the formation of microplastics," Kelkar says. Aquatic organisms can mistake microplastics for food and since plastics are indigestible, this dramatically affects the marine animals' digestive system. These animals are part of a long food chain. Some eventually find their way to the human food supply, which could lead to unwanted human exposures to plastic contaminants and pollutants that stick to the surfaces of the plastics.

By calling attention to this first-of-its-kind research, the team hopes that industry will take note and at minimum, provide a label on the packaging describing how to properly dispose of contact lenses, which is by placing them with other solid waste. Halden mentions, "Ultimately, we hope that manufacturers will conduct more research on how the lenses impact aquatic life and how fast the lenses degrade in a marine environment."

Credit: 
American Chemical Society

A paper battery powered by bacteria

image: Researchers harnessed bacteria to power these paper batteries.

Image: 
Seokheun Choi

BOSTON, Aug. 19, 2018 -- In remote areas of the world or in regions with limited resources, everyday items like electrical outlets and batteries are luxuries. Health care workers in these areas often lack electricity to power diagnostic devices, and commercial batteries may be unavailable or too expensive. New power sources are needed that are low-cost and portable. Today, researchers report a new type of battery -- made of paper and fueled by bacteria -- that could overcome these challenges.

The researchers will present their results today at the 256th National Meeting & Exposition of the American Chemical Society (ACS). ACS, the world's largest scientific society, is holding the meeting here through Thursday. It features more than 10,000 presentations on a wide range of science topics.

"Paper has unique advantages as a material for biosensors," says Seokheun (Sean) Choi, Ph.D., who is presenting the work at the meeting. "It is inexpensive, disposable, flexible and has a high surface area. However, sophisticated sensors require a power supply. Commercial batteries are too wasteful and expensive, and they can't be integrated into paper substrates. The best solution is a paper-based bio-battery."

Researchers have previously developed disposable paper-based biosensors for cheap and convenient diagnosis of diseases and health conditions, as well as for detecting contaminants in the environment. Many such devices rely on color changes to report a result, but they often aren't very sensitive. To boost sensitivity, the biosensors need a power supply. Choi wanted to develop an inexpensive paper battery powered by bacteria that could be easily incorporated into these single-use devices.

So Choi and his colleagues at the State University of New York, Binghamton made a paper battery by printing thin layers of metals and other materials onto a paper surface. Then, they placed freeze-dried "exoelectrogens" on the paper. Exoelectrogens are a special type of bacteria that can transfer electrons outside of their cells. The electrons, which are generated when the bacteria make energy for themselves, pass through the cell membrane. They can then make contact with external electrodes and power the battery. To activate the battery, the researchers added water or saliva. Within a couple of minutes, the liquid revived the bacteria, which produced enough electrons to power a light-emitting diode and a calculator.

The researchers also investigated how oxygen affects the performance of their device. Oxygen, which passes easily through paper, could soak up electrons produced by the bacteria before they reach the electrode. The team found that although oxygen slightly decreased power generation, the effect was minimal. This is because the bacterial cells were tightly attached to the paper fibers, which rapidly whisked the electrons away to the anode before oxygen could intervene.

The paper battery, which can be used once and then thrown away, currently has a shelf-life of about four months. Choi is working on conditions to improve the survival and performance of the freeze-dried bacteria, enabling a longer shelf life. "The power performance also needs to be improved by about 1,000-fold for most practical applications," Choi says. This could be achieved by stacking and connecting multiple paper batteries, he notes. Choi has applied for a patent for the battery and is seeking industry partners for commercialization.
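
To see why stacking could plausibly close that gap, consider the usual series/parallel arithmetic: cells wired in series add their voltages, parallel banks add their currents, so total power scales with the cell count. The per-cell figures below are assumptions for illustration, not Choi's published measurements.

```python
# Series/parallel scaling arithmetic with assumed per-cell figures.
cell_voltage = 0.5     # volts per paper cell (assumed)
cell_current = 10e-6   # amps per paper cell (assumed)

n_series, n_parallel = 40, 25          # 1,000 cells in total
voltage = cell_voltage * n_series      # series stacking raises voltage
current = cell_current * n_parallel    # parallel banks raise current
power = voltage * current

print(f"{voltage:.0f} V x {current * 1e3:.2f} mA = {power * 1e3:.1f} mW")
# a single cell yields 0.5 V x 10 uA = 5 uW, so 1,000 cells -> 1,000x power
```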

Credit: 
American Chemical Society

AI could make dodgy lip sync dubbing a thing of the past

Researchers have developed a system using artificial intelligence that can edit the facial expressions of actors to accurately match dubbed voices, saving time and reducing costs for the film industry. It can also be used to correct gaze and head pose in video conferencing, and enables new possibilities for video postproduction and visual effects.

The technique was developed by an international team led by a group from the Max Planck Institute for Informatics and including researchers from the University of Bath, Technicolor, TU Munich and Stanford University. The work, called Deep Video Portraits, was presented for the first time at the SIGGRAPH 2018 conference in Vancouver on 16th August.

Unlike previous methods that are focused on movements of the face interior only, Deep Video Portraits can also animate the whole face including eyes, eyebrows, and head position in videos, using controls known from computer graphics face animation. It can even synthesise a plausible static video background if the head is moved around.

Hyeongwoo Kim from the Max Planck Institute for Informatics explains: "It works by using model-based 3D face performance capture to record the detailed movements of the eyebrows, mouth, nose, and head position of the dubbing actor in a video. It then transposes these movements onto the 'target' actor in the film to accurately sync the lips and facial movements with the new audio."
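
Conceptually, the transfer amounts to splitting each frame into a small set of face parameters and recombining them: identity and lighting stay with the on-screen target, while pose, expression and gaze come from the dubbing actor. The sketch below is schematic only; the class and field names are invented for illustration, and in the actual system the recombined parameters are rendered photorealistically by a trained neural network.

```python
# Schematic of the parameter transfer described above (names are invented).
from dataclasses import dataclass

@dataclass
class FaceParams:
    identity: list       # facial geometry -- kept from the target actor
    illumination: list   # scene lighting  -- kept from the target actor
    pose: list           # head rotation/translation -- from the source
    expression: list     # expression coefficients   -- from the source
    eyes: list           # gaze direction            -- from the source

def transfer(source: FaceParams, target: FaceParams) -> FaceParams:
    """Drive the target actor with the dubbing actor's motion."""
    return FaceParams(
        identity=target.identity,
        illumination=target.illumination,
        pose=source.pose,
        expression=source.expression,
        eyes=source.eyes,
    )
```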

The research is currently at the proof-of-concept stage and does not yet run in real time, but the researchers anticipate the approach could make a real difference to the visual entertainment industry.

Professor Christian Theobalt, from the Max Planck Institute for Informatics, said: "Despite extensive post-production manipulation, dubbing films into foreign languages always presents a mismatch between the actor on screen and the dubbed voice.

"Our new Deep Video Portrait approach enables us to modify the appearance of a target actor by transferring head pose, facial expressions, and eye motion with a high level of realism."

Co-author of the paper, Dr Christian Richardt, from the University of Bath's motion capture research centre CAMERA, adds: "This technique could also be used for post-production in the film industry where computer graphics editing of faces is already widely used in today's feature films."

A great example is 'The Curious Case of Benjamin Button' where the face of Brad Pitt was replaced with a modified computer graphics version in nearly every frame of the movie. This work remains a very time-consuming process, often requiring many weeks of work by trained artists.

"Deep Video Portraits shows how such a visual effect could be created with less effort in the future. With our approach even the positioning of an actor's head and their facial expression could be easily edited to change camera angles or subtly change the framing of a scene to tell the story better."

In addition, this new approach can also be used in other applications, which the authors show on their project website, for instance in video and VR teleconferencing, where it can be used to correct gaze and head pose such that a more natural conversation setting is achieved. The software enables many new creative applications in visual media production, but the authors are also aware of the potential of misuse of modern video editing technology.

Dr Michael Zollhöfer, from Stanford University, explains: "The media industry has been touching up photos with photo-editing software for many years, meaning most of us have learned to take what we see in photos with a pinch of salt. With ever improving video editing technology, we must also start being more critical about the video content we consume every day, especially if there is no proof of origin. We believe that the field of digital forensics should and will receive a lot more attention in the future to develop approaches that can automatically prove the authenticity of a video clip. This will lead to ever better approaches that can spot such modifications even if we humans might not be able to spot them with our own eyes."

To address this, the research team is using the same technology to develop in tandem neural networks trained to detect synthetically generated or edited video at high precision to make it easier to spot forgeries. The authors have no plans to make the software publicly available but state that any software implementing the many creative use cases should include watermarking schemes to clearly mark modifications.

Credit: 
University of Bath

Novel nanoparticle-based approach detects and treats oral plaque without drugs

image: In this illustration, nanoparticles attach to or are taken up by the bacterial cells. Pan and his students are the first group to demonstrate that early detection of dental plaque in the clinic is possible using a regular intraoral X-ray machine, which can seek out harmful bacteria populations.

Image: 
University of Illinois Laboratory for Materials in Medicine.

When the good and bad bacteria in our mouth become imbalanced, the bad bacteria form a biofilm (aka plaque), which can cause cavities and, if left untreated over time, can lead to cardiovascular and other inflammatory diseases such as diabetes and bacterial pneumonia.

A team of researchers from the University of Illinois has recently devised a practical nanotechnology-based method for detecting and treating the harmful bacteria that cause plaque and lead to tooth decay and other detrimental conditions.

Oral plaque is invisible to the eye so dentists currently visualize it with disclosing agents, which they administer to patients in the form of a dissolvable tablet or brush-on swab. While useful in helping patients see the extent of their plaque, these methods are unable to identify the difference between good and bad bacteria.

"Presently in the clinic, detection of dental plaque is highly subjective and only depends on the dentist's visual evaluation," said Bioengineering Associate Professor Dipanjan Pan, head of the research team. "We have demonstrated for the first time that early detection of dental plaque in the clinic is possible using the regular intraoral X-ray machine which can seek out harmful bacteria populations."

In order to accomplish this, Fatemeh Ostadhossein, a Bioengineering graduate student in Pan's group, developed a plaque detection probe that works in conjunction with common X-ray technology and is capable of finding specific harmful bacteria known as Streptococcus mutans (S. mutans) in a complex biofilm network. The team also demonstrated that by tweaking the chemical composition of the probe, it can be used to target and destroy the S. mutans bacteria.

The probe comprises nanoparticles made of hafnium oxide (HfO2), a non-toxic metal oxide that is currently under clinical trial for internal use in humans. In their study, the team demonstrated the efficacy of the probe to identify biochemical markers present at the surface of the bacterial biofilm and simultaneously destroy S. mutans. They conducted their study on Sprague Dawley rats.

In practice, Pan envisions a dentist applying the probe on the patient's teeth and using the X-ray machine to accurately visualize the extent of the biofilm plaque. If the plaque is deemed severe, then the dentist would follow up with the administering of the therapeutic HfO2 nanoparticles in the form of a dental paste.

In their study, the team compared the therapeutic ability of their nanoparticles with Chlorhexidine, a chemical currently used by dentists to eradicate biofilm. "Our HfO2 nanoparticles are far more efficient at killing the bacteria and reducing the biofilm burden both in cell cultures of bacteria and in [infected] rats," said Ostadhossein, noting that their new technology is also much safer than conventional treatment.

The nanoparticles' therapeutic effect is due, said Pan, to their unique surface chemistry, which provides a latch-and-kill mechanism. "This mechanism sets our work apart from previously pursued nanoparticle-based approaches where the medicinal effect comes from antibiotics encapsulated in the particles," said Pan, also a faculty member of the Carle Illinois College of Medicine and the Beckman Institute for Advanced Science and Technology. "This is good because our approach avoids antibiotic resistance issues, and it's safe and highly scalable, making it well-suited for eventual clinical translation."

In addition to Pan and Ostadhossein, the research team includes bioengineering postdoctoral researcher Santosh Misra, visiting scholar Indu Tripathi, undergraduate Valeriya Kravchuk, visiting scholar Gururaja Vulugundam, Veterinary Medicine clinical assistant professor Denae LoBato, and adjunct assistant professor Laura Selmic.

Credit: 
University of Illinois Grainger College of Engineering

Progress toward personalized medicine

A few cells that differ from the rest can have a big effect. Individual cancer cells, for example, may be resistant to a specific chemotherapy, causing a relapse in a patient who would otherwise be cured. In the journal Angewandte Chemie, scientists have now introduced a microfluidics-based chip for the manipulation and subsequent nucleic-acid analysis of individual cells. The technique, known as dielectrophoresis, uses local electric fields to trap the cells with high efficiency.

Molecular analyses of individual cells are necessary to better understand the role of heterogeneous cell populations in the development of diseases and to develop effective therapies for personalized medicine. Identifying individual cells in a mass of other cells is an enormous challenge in diagnostic medicine: the cells must be sorted, held, transferred into another container with an extremely small volume, and then processed for analysis.

Scientists from the University of Washington (Seattle, USA), Iowa State University (Ames, USA), and the Fred Hutchinson Cancer Research Center (Seattle, USA) have used microfluidic technology to overcome these problems. All of the necessary steps take place reliably on a specially developed microchip, using minimal amounts of solvent and without requiring the cells to be labeled. In contrast to conventional microfluidic chips, this one requires neither complex fabrication technology nor components such as valves or agitators.

The Self-Digitization Dielectrophoretic (SD-DEP) chip is about the size of a coin and has two parallel microchannels (50 μm deep × 35 μm wide × 3.2 cm long) connected by numerous tiny chambers, whose openings are only 15 μm wide. A thin electrode runs along the length of each channel. The channels and chambers are filled with a buffer, an alternating voltage is applied, and the sample is added to one of the microchannels. The team, headed by Robbyn K. Anand and Daniel T. Chiu, used leukemia cells in their experiments.
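
As a rough sense of scale, a back-of-the-envelope check (ours, not a figure from the paper) shows that the quoted channel dimensions correspond to a volume of only a few tens of nanoliters:

```python
# Back-of-the-envelope check: volume of one SD-DEP microchannel,
# computed from the dimensions quoted above (not a figure from the paper).
depth_m = 50e-6    # 50 um deep
width_m = 35e-6    # 35 um wide
length_m = 3.2e-2  # 3.2 cm long

volume_m3 = depth_m * width_m * length_m
volume_nl = volume_m3 * 1e12  # 1 m^3 = 1e3 L = 1e12 nL
print(f"channel volume ~ {volume_nl:.0f} nL")  # ~ 56 nL
```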

Local maxima of the electric field occur at the narrow entrances to the chambers, and cells that approach these entrances are trapped there. Because the dimensions of an entrance are similar to the average size of a cell, each chamber entrance can trap only a single cell. When the alternating voltage is switched off and the flow rate is increased by injecting the reagents required for subsequent analysis, the trapped cells are washed into the chambers. An oil is then added to seal the chambers. Finally, the cells are lysed, releasing their nucleic acids, which are amplified and can be identified as coming from leukemia cells via a marker gene.
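
For readers wondering why cells collect precisely at the field maxima: in the standard theory of dielectrophoresis (a textbook result, not spelled out in the press release), the time-averaged force on a spherical cell of radius $r$ suspended in a medium of permittivity $\varepsilon_m$ is

$$\mathbf{F}_{\mathrm{DEP}} = 2\pi\,\varepsilon_m\, r^{3}\,\mathrm{Re}\!\left[K(\omega)\right]\,\nabla\lvert\mathbf{E}_{\mathrm{rms}}\rvert^{2},$$

where $K(\omega)$ is the Clausius-Mossotti factor comparing the polarizabilities of the cell and the medium at the applied frequency $\omega$. When $\mathrm{Re}[K(\omega)] > 0$, the force points up the gradient of the field intensity, pulling cells toward the narrow chamber entrances where the field is strongest.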

In future studies, the researchers hope to use the chip to determine the distribution of genetic mutations that are related to resistance in leukemia cells and thus may cause relapses.

Credit: 
Wiley

Robots as tools and partners in rehabilitation

image: A robot congratulates a patient for correctly sorting the colored beakers.

Image: 
Shelly Levy-Tzedek

The need for effective strategies for medical rehabilitation will increase significantly in the coming decades, because more patients will survive diseases that leave severe functional deficits, such as stroke. Socially assistive robots (SARs) are already being used in rehabilitation for this reason. In the journal Science Robotics, a research team led by neuroscientist Dr. Philipp Kellmeyer of the Freiburg University Medical Center and Prof. Dr. Oliver Müller of the Department of Philosophy at the University of Freiburg analyzes the improvements necessary to make SARs valuable and trustworthy assistants for medical therapies.

The researchers conclude that the development of SARs requires not only technical improvements but, above all, social, trust-building measures. Rehabilitation patients in particular depend on a reliable relationship with their therapists, so there must be trust in the safety of the robotic system, especially in the predictability of the machines' behavior. This becomes all the more important as robots grow more intelligent and, with that, more independent.

In addition, the scientists explain, robots and patients can only interact well when they share the goals pursued through the therapy. To achieve this, insights from philosophy and developmental psychology must also be taken into account in the development of SARs: a robot's ability to recognize a patient's aims and motives is a critical requirement if cooperation is to succeed. Trust is likewise needed for the participants to adapt to one another. The frustration patients feel, for instance as a result of physical or linguistic limitations, could be avoided if robots were adapted to the specific needs and vulnerabilities of the patient in question.

Philipp Kellmeyer and Oliver Müller are members of the Cluster of Excellence BrainLinks-BrainTools of the University of Freiburg. The study also involved Prof. Dr. Shelly Levy-Tzedek and Ronit Feingold-Polak from the Ben Gurion University of the Negev, Israel. In the 2018/19 academic year, the Freiburg researchers, together with the legal academic Prof. Dr. Silja Vöneky and the IT specialist Prof. Dr. Wolfram Burgard, both from the University of Freiburg, are developing a Research Focus on normative aspects of interaction between people and autonomous intelligent systems at the Freiburg Institute for Advanced Studies (FRIAS).

Credit: 
University of Freiburg

New technology for smart materials

CHARLOTTESVILLE, Va. - University of Virginia mechanical engineers and materials scientists, in collaboration with materials scientists at Penn State, the University of Maryland and the National Institute of Standards and Technology, have invented a "switching effect" for thermal conductivity and mechanical properties that can be incorporated into the fabrication of materials including textiles and garments.

Using heat transport principles combined with a biopolymer inspired by squid ring teeth, the team studied a material that can dynamically regulate its thermal properties - switching back and forth between insulating and cooling - based on the amount of water that is present.

The invention holds great promise for all sorts of new devices and materials with the ability to regulate temperature and heat flow on demand, including "smart" fabrics.

"The switching effect of thermal conductivity would be ideal for many applications, including athletics," said John Tomko, a Ph.D. candidate in UVA's Department of Materials Science & Engineering and lead author of an article about the invention published this week in Nature Nanotechnology. "This material has the potential to revolutionize active wear, unleashing the possibility of clothing that can dynamically respond to body heat and regulate temperature. For example, the biopolymer has a low thermal conductivity while dry, essentially storing body heat and keeping the athlete (and his or her muscles!) warm while not active. As soon as the wearer begins to sweat, the material could become hydrated and instantly increase its thermal conductivity, allowing this body heat to escape through the material and cool the athlete down. When the person is done training and the sweat has evaporated, the material could go back to an insulative state and keep the wearer warm again.

"And while it may sound highly specialized and only for professional athletes, it would be equally useful from an apparel company perspective," said Tomko, whose research is being conducted as part of the ExSite Group led by Professor Patrick Hopkins of UVA's departments of Mechanical & Aerospace Engineering, Materials Science & Engineering and Physics.

Garments made using this technology would be a step above what is available on the market today because of the material's extremely wide range of capabilities. Polar fleece, for example, generally comes in different weights to accommodate different combinations of temperature and activity level; the new material could cover the whole gamut of athletic scenarios within a single garment. And while fleece is considered breathable, breathability is passive; the biopolymer material would actively conduct heat out of the garment.

"While realizing thermally and mechanically smart fabrics is one major advance of this work, the ability to provide such large and reversible modification in the thermal conductivity of a material `on-demand' has potential game-changing applications," said Hopkins, Tomko's Ph.D. advisor and co-lead on this research effort with Professor Melik Demirel at Penn State. "The thermal conductivity of materials is typically assumed to be a static, intrinsic property of a material. What we have shown is that you can 'switch' the thermal conductivity of a material in a similar way that you would turn on and off a light bulb via a switch on the wall, only instead of using electricity, we can use water to create this switch. This will allow for dynamic and controllable ways to regulate the temperature and/or heat flow of materials and devices.

"The magnitude of this on/off thermal conductivity ratio is large enough where we can now envision applications including not only smart fabrics, but also more efficient recycling of wasted heat to create electricity, making self-thermally regulating electrical devices, or creating new avenues for wind- and hydropower production."

The process of creating "programmable" materials could be good news for manufacturers and the environment. Textile companies usually have to rely on different types of fibers and different manufacturing processes to create clothing with varying attributes, but the tunability of these materials means that insulating and cooling attributes can be created from the same process. This could lead to lower manufacturing costs and reduced carbon emissions.

Squid ring teeth, the biomaterial that makes these programmable materials possible, are a new avenue of scientific research first explored at Penn State. They offer unique properties such as strength, self-healing and biocompatibility, making them exceptionally well suited to programming at the molecular level, in this case for thermal regulation. They are also good news for the environment: they can be extracted from the suction cups of squid or produced synthetically via industrial fermentation, both sustainable sources.

Tomko's and Hopkins' collaborators on the research are Abdon Pena-Francesch, a former Ph.D. student at Penn State and now a von Humboldt Fellow at the Max Planck Institute in Stuttgart, Germany; Huihun Jung, a doctoral candidate in engineering science and mechanics at Penn State; Madhusudan Tyagi, a researcher with the University of Maryland and the National Institute of Standards and Technology; Benjamin D. Allen, assistant research professor of biochemistry and molecular biology at Penn State; and Demirel, professor of engineering science and mechanics and director of the Center for Research on Advanced Fiber Technologies at Penn State.

"The beauty and unique power of neutron scattering helped us solve the puzzle of how tandem repeat units really influence the observed thermal conductivity in hydrated samples, as heavy water simply becomes 'invisible' to neutrons! We found that the increased and 'altered' dynamics of amorphous strands were, actually, responsible for this increased thermal conductivity in hydrated samples," said the University of Maryland's Tyagi. "I believe this research is going to change how we study thermal properties of soft matter, particularly proteins and polymers, using neutrons as typically hard condensed matter is where most of the work is done in this regard."

Tomko and fellow UVA Engineering researchers, along with graduate students from UVA's Darden School of Business, won first place this spring in a competition held by outdoor apparel company Patagonia to find the best ideas for attaining carbon neutrality. Raw materials production is responsible for about 80 percent of Patagonia's total carbon emissions, largely attributed to the production of polyester fabrics derived from fossil fuels. The UVA team proposed that the company transition to biopolymer textiles, which can be engineered solely from renewable resources. The new materials would look and function better than polyester and wool alternatives without relying on fossil fuels.

Credit: 
University of Virginia School of Engineering and Applied Science