Tech

'Leap forward' in risk management of rectal cancer

image: (From left) Ultrasound images, photoacoustic microscopy (PAM)/US images, and representative hematoxylin-eosin (H&E) stain of the tumor bed. Panel C: treated tumor bed with residual cancer; Panel E: treated tumor bed with no residual cancer.

Image: 
(Image courtesy of Zhu lab)

Rectal cancer, along with colon cancer, is the third-most common type of cancer in the United States, and treatment and surgery greatly affect patients' quality of life. A multidisciplinary team at Washington University in St. Louis has developed and tested an innovative imaging technique that can differentiate rectal tissues with residual cancer from tumor-free tissues after chemotherapy and radiation. The approach could one day help some patients whose tumors are completely destroyed by chemoradiation avoid unnecessary surgery.

Quing Zhu, PhD, professor of biomedical engineering at the McKelvey School of Engineering, and members of her lab developed a system using a new imaging technique -- acoustic resolution photoacoustic microscopy co-registered with ultrasound (AR-PAM/US) -- paired with a "deep learning" artificial intelligence neural network. This technique was better able to determine the presence of residual tumors in treated rectal tumor bed tissue than other types of imaging, such as MRI, which often cannot distinguish residual cancer from scar tissue. Results of the research -- the first feasibility study of AR-PAM imaging in patients with rectal cancer previously treated with radiation and chemotherapy -- were published March 23 in the journal Radiology.

"Our PAM/US system paired with the deep learning neural network has great potential to better identify patients suitable for nonoperative management and improve patient quality of life," Zhu said. "If we can tell after radiation and chemotherapy which patients may have a good response with no residual tumors, the patient may be able to avoid surgery."

Zhu, also a professor of radiology at the School of Medicine, was joined on the paper by doctoral students Xiandong Leng (co-first author with Shihab Uddin, PhD, who earned a doctorate in biomedical engineering from McKelvey Engineering in 2020) and Hongbo Luo, as well as Sitai Kou.

They conducted a yearlong prospective study of patients with rectal cancer treated at Barnes-Jewish Hospital in St. Louis by School of Medicine clinicians: Matthew Mutch, MD, the Solon and Bettie Gershman Chair in Colon and Rectal Surgery, chief of the Section of Colon and Rectal Surgery and professor of surgery; and William Chapman Jr., MD, a resident physician in surgery. Zhu said others who contributed significantly to the study were: Steven R. Hunt, MD, associate professor of surgery; Anup Shetty, MD, assistant professor of radiology; Deyali Chatterjee, MD, assistant professor of pathology and immunology; and Michelle Cusumano, study coordinator in the Section of Colon and Rectal Surgery.

Chapman said the team spent more than three years investigating this technology in surgically removed colon and rectum specimens with promising results before developing the prototype for patient studies.

"We hope that improved imaging provided by AR/PAM will significantly improve our ability to discriminate between patients with residual tumors and those who have been completely cured without surgery," he said. "By both avoiding morbid, unneeded surgery and reducing the burden of surveillance testing, photoacoustic imaging could be a leap forward in the current management of locally invasive rectal cancer."

In the study, after completing chemotherapy and radiation, patients underwent PAM/US imaging with a handheld endorectal laser probe developed in Zhu's lab. The probe has a rotating head that allows for a 360-degree image of the rectum, the last 6 inches of the colon. The end of the probe, which takes one image per second, is covered by a small latex balloon inflated with water that allows transmission of the ultrasound and photoacoustic waves to the rectal wall. These waves highlight changes in the tissue's vasculature as well as new tumor growth. The imaging procedure added about 20 minutes to the time patients were under anesthesia.

Leng, who has been working on this project since 2017, was instrumental in system and software development, Zhu said. He designed and built the AR/PAM endoscope -- the first of its kind -- and programmed the system to acquire data and process and display images in real time.

"From the very preliminary ex vivo data, my setup clearly disclosed multi-layer structure from ultrasound image and rich blood vessels in the submucosa of normal colorectal tissue," Leng said. "In contrast to normal tissue, malignant tumor bed shows a lack of multilayer structure and blood vessels. This important finding may reveal an important feature accessing patients' treatment response to chemotherapy and radiation therapy."

In the first phase of the study, the team used more than 2,000 images of surgically removed tissue specimens from 22 patients to train the neural network -- an artificial intelligence-based set of algorithms that operates similarly to the human brain -- to recognize normal and cancerous colorectal tissue. In the second phase, they used images of living tissue from 10 patients who had previously undergone chemotherapy and radiation. Several hundred images from five of those patients were used to fine-tune the neural network, and hundreds of images from the other five patients were withheld for testing.
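For readers curious what such a two-phase workflow looks like in practice, the sketch below is a minimal, hypothetical illustration (assuming PyTorch, a generic small CNN, and placeholder tensors standing in for real PAM/US images); it is not the authors' actual model, architecture or data.

```python
# Hypothetical two-phase workflow: pretrain on ex vivo images, fine-tune on a
# subset of in vivo images, then evaluate on images from held-out patients.
# Architecture, image size and data are placeholders, not the published model.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # residual cancer vs. no residual cancer

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def train(model, images, labels, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()

# Placeholder tensors standing in for single-channel 64x64 image patches.
ex_vivo_x, ex_vivo_y = torch.randn(256, 1, 64, 64), torch.randint(0, 2, (256,))
in_vivo_x, in_vivo_y = torch.randn(64, 1, 64, 64), torch.randint(0, 2, (64,))
test_x, test_y = torch.randn(64, 1, 64, 64), torch.randint(0, 2, (64,))

model = TinyCNN()
train(model, ex_vivo_x, ex_vivo_y, epochs=5, lr=1e-3)          # phase 1: ex vivo images
train(model, in_vivo_x, in_vivo_y, epochs=3, lr=1e-4)          # phase 2: fine-tuning
accuracy = (model(test_x).argmax(1) == test_y).float().mean()  # held-out patients
print(f"Held-out accuracy: {accuracy:.2f}")
```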

The deep-learning PAM model, designed and developed by Uddin, correctly predicted the cancerous status of all five of the patients who had undergone imaging, while the MRI images misclassified three out of five patients, and the ultrasound-only deep-learning model incorrectly declared three patients as cancer free.

Mutch said the team is very optimistic about the results.

"This is spectacular news, and it moves us closer in the transition from concept to clinically useful technology," he said. "The hope is that it will allow us to differentiate patients who had a complete response to chemotherapy and radiation from those patients with residual tumor. This will help better determine which patients can be managed nonoperatively versus those who truly need an operation."

Going forward, the team plans to conduct a clinical study to confirm these initial results in a large group of rectal cancer patients who have completed chemotherapy and radiation and will either undergo surgery or be followed after treatment.

Credit: 
Washington University in St. Louis

IQWiG publishes new version of its General Methods - English translation now available

The Institute for Quality and Efficiency in Health Care (IQWiG) revised its methods paper and published the German original version "Allgemeine Methoden 6.0" (General Methods 6.0) on http://www.iqwig.de in November 2020. This document is the basis for the scientific work of the Institute and its external experts as well as for the collaboration with its contracting agencies, the Federal Joint Committee (G-BA) and Federal Ministry of Health (BMG). The English translation is now available on http://www.iqwig.de/en/about-us/methods/methods-paper/.

New features include statements on the investigation of the relationship between volume of services and quality, a section on different treatment periods in studies, and a more concrete approach to the assessment of clinical relevance.

In contrast to the draft published for commenting in December 2019, the published Version 6.0 no longer contains a section on determining the extent of added benefit in the case of continuous data, as further clarification is needed here. In a future draft, IQWiG will therefore revise the derivation of thresholds and submit an updated proposal for renewed commenting.

The General Methods summarize the scientific standards used by the Institute. In order to reflect the expansion of the Institute's legal tasks and the further development of standards in scientific disciplines, this manual is updated regularly - partly in smaller steps, partly by fundamental revision, which is then reflected in a new version number.

Minimum volumes: How are volume of services and quality related?

In Germany, hospitals may only provide and charge for certain elective services if they have performed them frequently enough in previous years. This is based on findings on the relationship between the volume of services and the quality of treatment outcomes. On behalf of the G-BA, IQWiG has already investigated for several interventions whether a positive correlation has been demonstrated. Section 5.2 describes the Institute's approach to information retrieval and assessment in these cases: possible volume-outcome relationships are examined on the basis of observational studies or controlled intervention studies.

Different mean observation periods may reduce the certainty of conclusions

In oncology in particular, the dossiers submitted by manufacturers for the early benefit assessment of new drugs often contain data from studies in which the mean observation periods differ in the two groups to be compared. This makes it difficult to conduct fair comparisons in which the adverse effects of treatments should neither be over- nor underestimated. The manufacturers usually argue that treatment in one arm was discontinued or switched more often. However, IQWiG stresses that simple statistical methods based on relative frequencies or incidence densities cannot adequately compensate for the decrease in the certainty of conclusions that this causes. For this reason, in Version 6.0, the Institute emphasizes the necessity of complete data collection and in the new Section 9.3.12 calls for the use of adequate survival time methods, also in the case of treatment discontinuation or switching.
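To illustrate why relative frequencies alone can mislead when observation periods differ, here is a small, hypothetical numpy simulation (not taken from the methods paper): both arms share the same true event hazard, but the arm with shorter follow-up shows a much lower naive event proportion, while a survival-time method such as Kaplan-Meier evaluated at a common time point largely removes the artifact.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_arm(n, hazard, follow_up):
    """Exponential times to an adverse event, censored at the end of follow-up (months)."""
    times = rng.exponential(1 / hazard, n)
    observed = times <= follow_up
    return np.minimum(times, follow_up), observed

def km_event_free(times, observed, t):
    """Kaplan-Meier estimate of the event-free probability at time t."""
    surv = 1.0
    for ti in np.sort(np.unique(times[observed & (times <= t)])):
        at_risk = np.sum(times >= ti)
        events = np.sum((times == ti) & observed)
        surv *= 1 - events / at_risk
    return surv

# Identical true hazards, but arm B is observed for only half as long as arm A.
t_a, e_a = simulate_arm(1000, hazard=0.02, follow_up=24)
t_b, e_b = simulate_arm(1000, hazard=0.02, follow_up=12)

print("Naive event proportion  A:", round(e_a.mean(), 2), " B:", round(e_b.mean(), 2))
print("KM event prob by month 12  A:", round(1 - km_event_free(t_a, e_a, 12), 2),
      " B:", round(1 - km_event_free(t_b, e_b, 12), 2))
```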

In the course of the commenting procedure, the Institute adopted a proposal of the biometric societies and in the above section no longer refers only to "different mean observation periods", but generally to "variable observation periods". The new section now also includes references to special methods of survival time analysis, such as how to deal with competing risks.

Shift of emphasis in the assessment of clinical relevance

In order to assess the clinical relevance of a difference between two treatment alternatives, responder analyses with a response criterion for patient-relevant outcomes such as "health-related quality of life" or "symptoms" have increasingly been performed in recent years. IQWiG has now specified when responder analyses will be used for the assessment, namely by defining the required minimum scale range of a measurement instrument's score (Section 9.3.3). This is intended to provide clarity for manufacturers and to prevent arbitrary responder analyses based on incomprehensible responder definitions.

Further changes and updates

The Institute also supplemented or modified many other parts of the General Methods. For example, Section 3.1.3 on the presentation of aspects of harm in benefit assessments now contains more details than before. In Chapter 5, renamed "Assessments of health care", the Institute fundamentally revised its comments on evidence-based guidelines. And in Section 9.3.7 on meta-analyses, the methodological approach for applying the Knapp-Hartung method in meta-analyses with random effects, which was introduced in 2016 in the last revision of the methods paper, was substantiated.

An overview of the main changes is given in the methods paper, which now comprises almost 300 A4 pages, under the heading "What is new?"

Active participation in the commenting procedure

IQWiG published the draft for the new version of the General Methods at the end of 2019 and called for comments. The Institute then received 40 comments, some of them very detailed. After the deadline for the submission of comments expired, a debate with persons who had submitted comments was held in June 2020.

Helpful suggestions from the commenting procedure were incorporated into the present version. In addition to the major adjustments mentioned above, the Institute, among other things, adopted a further proposal by statistical societies and supplemented the methods for meta-analyses of diagnostic accuracy studies in Section 9.3.7. The removal of the section on the assessment of open-label studies with subjective outcomes, which was criticized in the commenting procedure, should also be mentioned here. Due to new literature on the topic published in the meantime, the final version of the General Methods no longer includes this section.

Together with the methods paper, IQWiG is publishing documentation and an evaluation of the hearing, which contains all written comments in full as well as the Institute's responses, addressing the main arguments raised in the comments.
IQWiG will provide timely information on the next processing step.

Credit: 
Institute for Quality and Efficiency in Health Care

3D-printed artificial lung model

image: Inkjet bioprinted alveolar barrier model.

Image: 
POSTECH

The warmer temperature and blooming flowers signal the arrival of spring. However, worries about respiratory diseases are also on the rise due to fine dust and viruses. The lung, which is vital to breathing, is rather challenging to create artificially for experimental use due to its complex structure and thinness. Recently, a POSTECH research team has succeeded in producing an artificial lung model using 3D printing.

Professor Sungjune Jung of the Department of Materials Science and Engineering, and Professor Joo-Yeon Yoo and Ph.D. candidate Dayoon Kang of the Department of Life Sciences at POSTECH have together succeeded in creating a three-dimensional lung model containing a variety of human alveolar cell lines using inkjet bioprinting. Inkjet bioprinting is attracting attention for enabling the production of standardized and patient-customized tissues, and is anticipated to replace conventional test models as it can be mass-produced. The findings of this study were recently published in Advanced Science.

Human lungs constantly breathe to take in oxygen necessary for vital activity and expel carbon dioxide generated as a by-product. Oxygen entering the body arrives at the alveoli through the airways and is exchanged for carbon dioxide carried by the blood in the capillaries of the alveoli.

Here, the alveoli, which resemble hollow grapes, are made of a thin layer of epithelial cells and are surrounded by thin capillaries. The alveolar membrane, through which oxygen and carbon dioxide travel, is a three-layered structure -- epithelial layer, basement membrane and endothelial capillary layer -- and is very thin to ease gas exchange. Until now, there have been limitations in accurately replicating alveoli with such a thin and complex structure.

To address this, the research team fabricated a three-layer alveolar barrier model with a thickness of about 10 micrometers (μm) through high-resolution deposition of alveolar cells using drop-on-demand inkjet printing. This newly produced model reproduced the tissue more faithfully than either a two-dimensional cell culture model or a three-dimensional, non-structured model made by mixing alveolar cells and collagen.

The research team also confirmed that the newly developed alveolar barrier model reproduced physiological responses at the level of actual tissue with regard to viral infectivity and antiviral response. When the model was infected with influenza virus, the researchers were able to observe the virus's replication and the tissue's antiviral response.

"We have been printing cells and fabricating tissues using the bioprinting method, but this is the first time in the world to simulate an alveolar barrier with a three-layer structure of about 10 μm thickness," explained Professor Sungjune Jung of POSTECH. "It is also the first time an artificial alveolar barrier was infected with a virus and a physiological antiviral response was observed."

Professor Jung added, "The artificial tissue produced this time can be used as an early platform for evaluating efficacy of therapeutic drugs and vaccines countering infectious respiratory viruses - including the COVID-19 virus - as it enables mass production and quality control as well as fabrication of patient-customized disease models."

Credit: 
Pohang University of Science & Technology (POSTECH)

Tired of video conferencing? Research suggests you're right to question its effectiveness

In the year since the coronavirus pandemic upended how just about every person on the planet interacts with one another, video conferencing has become the de facto tool for group collaboration within many organizations. The prevalent assumption is that technology that helps to mimic face-to-face interactions via a video camera will be most effective in achieving the same results, yet there's little data to actually back up this presumption. Now, a new study challenges this assumption and suggests that non-visual communication methods that better synchronize and boost audio cues are in fact more effective.

Synchrony Promotes Collective Intelligence

Researchers from Carnegie Mellon's Tepper School of Business and the Department of Communication at the University of California, Santa Barbara, have studied collective intelligence--the ability of a group to solve a wide range of problems--and how synchrony in non-verbal cues helps to develop it. There are many forms of synchrony, but the common view is that synchrony occurs when two or more nonverbal behaviors are aligned. Essentially, conversation is what happens when at least two speakers take turns sharing their thoughts, and nonverbal cues are how they establish when and how to take these turns.

Previous research has shown that synchrony promotes collective intelligence because it improves joint problem solving. So it's not too far-fetched that many would assume that if a conversation can't take place face-to-face, it would be best simulated with both video and audio software.

The researchers focused on two forms of synchrony: facial expression synchrony and prosodic synchrony. Facial expression synchrony is pretty straightforward and involves the perceived movement of facial features. Prosodic synchrony, on the other hand, captures the intonation, tone, stress, and rhythm of speech. They hypothesized that during virtual collaboration, collective intelligence would develop through facial expression synchrony when the collaborators had access to both audio and visual cues. Without visual cues, though, they predicted that prosodic synchrony would enable groups to achieve collective intelligence instead.
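As a purely illustrative example of how synchrony in one vocal cue might be quantified, the toy numpy sketch below scores prosodic synchrony as the maximum lagged correlation between two speakers' pitch contours; the measure, the data and all parameters are hypothetical and are not the analysis used in the study.

```python
import numpy as np

def prosodic_synchrony(pitch_a, pitch_b, max_lag=10):
    """Toy synchrony score: maximum correlation between two speakers'
    pitch contours across a range of time lags (higher = more aligned)."""
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = pitch_a[lag:], pitch_b[:len(pitch_b) - lag]
        else:
            a, b = pitch_a[:len(pitch_a) + lag], pitch_b[-lag:]
        n = min(len(a), len(b))
        if n > 1:
            best = max(best, np.corrcoef(a[:n], b[:n])[0, 1])
    return best

# Hypothetical pitch contours (Hz), one sample per second, for two speakers
# whose intonation rises and falls in a slightly offset rhythm.
rng = np.random.default_rng(1)
speaker_a = 120 + 20 * np.sin(np.linspace(0, 6, 60)) + rng.normal(0, 3, 60)
speaker_b = 180 + 20 * np.sin(np.linspace(0, 6, 60) - 0.5) + rng.normal(0, 3, 60)

print(f"Synchrony score: {prosodic_synchrony(speaker_a, speaker_b):.2f}")
```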

Collective Intelligence Is Achievable With or Without Video, but Even More So Without

"We found that video conferencing can actually reduce collective intelligence," says Anita Williams Woolley, Associate Professor of Organizational Behavior and Theory at Carnegie Mellon's Tepper School of Business, who co-authored the paper. "This is because it leads to more unequal contribution to conversation and disrupts vocal synchrony. Our study underscores the importance of audio cues, which appear to be compromised by video access."

Woolley and her colleagues pulled together a large, diverse sample of 198 individuals and divided them into 99 pairs. Forty-nine of these pairs formed the first group, which were physically separated with audio capabilities but not video capabilities. The remaining 50 pairs were also physically separated but had both video and audio capabilities. During a 30-minute session, each duo completed six tasks designed to test collective intelligence. As Woolley points out, the results challenge the prevailing assumptions.

The groups with video access did achieve some form of collective intelligence through facial expression synchrony, suggesting that when video is available, collaborators should be aware of these cues. However, the researchers found that prosodic synchrony improved collective intelligence whether or not the group had access to video and that this synchrony was enhanced by equality in speaking turns. Most striking, though, was that video access dampened the pairs' ability to achieve equality in speaking turns, meaning that using video conferencing can actually limit prosodic synchrony and therefore impede collective intelligence.

Specifically, groups regulate speaking turns via a set of interaction rules, which include yielding, requesting, or maintaining turns. Collaborators often subtly communicate these rules through nonverbal cues such as eye contact, or vocal cues such as changes in volume and rate. However, visual nonverbal cues appear to enable some collaborators to dominate the conversation. By contrast, the study shows that when groups have audio cues only, the lack of video does not prevent them from communicating these interaction rules; it actually helps them regulate the conversation more smoothly, by engaging in more equal exchanges of turns and establishing improved prosodic synchrony.

What does this mean for organizations whose members are still physically separated by the COVID-19 pandemic? It might be worth it to disable the video function in order to promote better communication and social interaction during collaborative problem solving.

Credit: 
Carnegie Mellon University

Leveraging the 5G network to wirelessly power IoT devices

video: Researchers at Georgia Tech's ATHENA lab discuss an innovative way to tap into the over-capacity of 5G networks, turning them into "a wireless power grid" for powering Internet of Things (IoT) devices.

Image: 
Christopher Moore

Researchers at the Georgia Institute of Technology have uncovered an innovative way to tap into the over-capacity of 5G networks, turning them into "a wireless power grid" for powering Internet of Things (IoT) devices that today need batteries to operate.

The Georgia Tech inventors have developed a flexible Rotman lens-based rectifying antenna (rectenna) system capable, for the first time, of millimeter-wave harvesting in the 28-GHz band. (The Rotman lens is key for beamforming networks and is frequently used in radar surveillance systems to see targets in multiple directions without physically moving the antenna system.)

But to harvest enough power to supply low-power devices at long ranges, large-aperture antennas are required. The problem with large antennas is that their field of view narrows as the aperture grows, which prevents them from operating when they are not aligned with a 5G base station.
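As a rough, back-of-the-envelope illustration of the gain-versus-coverage tradeoff, the Friis free-space equation relates received power to antenna gain and distance; the Python sketch below uses hypothetical numbers (base-station EIRP, distance) chosen only for illustration and is not drawn from the paper.

```python
import math

def friis_received_power_dbm(eirp_dbm, rx_gain_dbi, freq_hz, distance_m):
    """Free-space received power (dBm) via the Friis transmission equation."""
    wavelength = 3e8 / freq_hz
    path_loss_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)
    return eirp_dbm + rx_gain_dbi - path_loss_db

# Hypothetical example: 28 GHz carrier, 75 dBm EIRP base station, 100 m away.
for rx_gain in (6, 16, 26):  # higher gain -> more harvested power, but narrower beamwidth
    p = friis_received_power_dbm(75, rx_gain, 28e9, 100)
    print(f"Rx gain {rx_gain:2d} dBi -> received power ~{p:.1f} dBm")
```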

"We've solved the problem of only being able to look from one direction with a system that has a wide angle of coverage," said senior researcher Aline Eid in the ATHENA lab, established in Georgia Tech's School of Electrical and Computer Engineering to advance and develop novel technologies for electromagnetic, wireless, RF, millimeter-wave, and sub-terahertz applications.

The findings were reported in the Jan. 12 issue of the journal Scientific Reports.

The FCC has authorized 5G to focalize power much more densely compared with previous generations of cellular networks. While today's 5G was built for high-bandwidth communication, the high-frequency network holds rich opportunity to "harvest" unused power that would otherwise be wasted.

Tapping Into 5G High-frequency Power

"With this innovation, we can have a large antenna, which works at higher frequencies and can receive power from any direction. It's direction-agnostic, which makes it a lot more practical," noted Jimmy Hester, senior lab advisor and the CTO and co-founder of Atheraxon, a Georgia Tech spinoff developing 5G radio-frequency identification (RFID) technology.

With the Georgia Tech solution, all the electromagnetic energy collected by the antenna arrays from one direction is combined and fed into a single rectifier, which maximizes its efficiency.

"People have attempted to do energy harvesting at high frequencies like 24 or 35 Gigahertz before," Eid said, but such antennas only worked if they had line of sight to the 5G base station; there was no way to increase their angle of coverage until now.

Operating just like an optical lens, the Rotman lens provides six fields of view simultaneously in a pattern shaped like a spider. Tuning the shape of the lens results in a structure with one angle of curvature on the beam-port side and another on the antenna side. This enables the structure to map a set of selected radiation directions to an associated set of beam-ports. The lens is then used as an intermediate component between the receiving antennas and the rectifiers for 5G energy harvesting.

This novel approach addresses the tradeoff between rectenna angular coverage and turn-on sensitivity with a structure that merges unique radio frequency (RF) and direct current (DC) combination techniques, thereby enabling a system with both high gain and large beamwidth.

In demonstrations, Georgia Tech's technology achieved a 21-fold increase in harvested power compared with a referenced counterpart, while maintaining identical angular coverage.

This robust system may open the door to new passive, long-range, mm-wave 5G-powered RFID for wearable and ubiquitous IoT applications. The researchers used in-house additive manufacturing to print the palm-sized mm-wave harvesters on a variety of everyday flexible and rigid substrates. Providing 3D and inkjet printing options will make the system more affordable and accessible to a broad range of users, platforms, frequencies, and applications.

Replacing Batteries With Over-the-air Charging

"The fact is 5G is going to be everywhere, especially in urban areas. You can replace millions, or tens of millions, of batteries of wireless sensors, especially for smart city and smart agricultural applications," said Emmanouil (Manos) Tentzeris, Ken Byers Professor in Flexible Electronics in the School of Electrical and Computer Engineering.

Tentzeris predicts that power as a service will be the next big application for the telecom industry, just as data overtook voice services as a major revenue producer.

The research team is most excited by the prospect of service providers embracing this technology to offer power on demand "over the air," eliminating the need for batteries.

"I've been working on energy harvesting conventionally for at least six years, and for most of this time it didn't seem like there was a key to make energy harvesting work in the real world, because of FCC limits on power emission and focalization," Hester said. "With the advent of 5G networks, this could actually work and we've demonstrated it. That's extremely exciting -- we could get rid of batteries."

Credit: 
Georgia Institute of Technology

The 'great leveler' revisited: Why the Corona pandemic might boost inequality in society

How will Covid-19 affect inequality in countries worldwide? The current pandemic is sometimes framed as a ‘great equalizer’, but scientists from Utrecht University and Wageningen University show why the opposite might be true. A study by prof. Bas van Bavel and prof. Marten Scheffer shows that throughout history, most disasters and pandemics have boosted inequality instead of levelling it. Whether such disastrous events function as levellers or not depends on the distribution of economic wealth and political leverage within a society at the moment of crisis. Their findings on the historical effects of crises on equality in societies are now published open access in Nature HSS Communications.

It is often thought that the main levellers of inequality in societies were natural disasters such as epidemics or earthquakes, and social turmoil such as wars and revolutions. The most salient example is the Black Death of 1347-1352, a large-scale pandemic that killed up to half of the Eurasian population. In several European societies, wealth disparities seem to have been reduced afterwards. The suggested logic behind that equitable effect is the decimation of people while capital remained intact, thereby shifting the economic balance in favour of labour.

Crises as windows of opportunity

In most cases throughout history, the opposite is true. Bas van Bavel: "In spite of the marked differences in character and direct impact of the shocks we studied, most historical disasters were followed by a widening of wealth gaps." In their article, historian Bas van Bavel (Utrecht University) and ecologist Marten Scheffer (Wageningen University) critically review evidence of the effects of catastrophes such as the plague on inequality, from medieval times till the present. Van Bavel and Scheffer used empirical data to study the long-term effects of shocks on inequality.

Their research shows a twofold effect. First, the wealth distribution and institutional outlay of these societies at the moment of the shock to a large extent shaped the impact. Subsequently, the distribution of political leverage in society came into play in determining the institutional responses. Van Bavel explains: "Upon a crisis, rules tend to be rewritten. Social groups and organizations with the greatest leverage can therefore use that window of opportunity to adapt institutional rules, thereby shaping long-term wealth distribution. As most societies were historically unequal, in most cases the result was a further widening of disparities."

Power to the people: the importance of bottom-up organisations

Over the centuries, exceptions have occurred in situations where ordinary people had strong leverage in shaping the response to the crisis - through organizations such as guilds, fraternities, trade unions, cooperatives, and political movements. Scheffer: "Our results provide empirical support for the view that in nations where such leverage of ordinary people is weak, the responses to novel crises such as the COVID-19 pandemic may increase inequality instead of diminishing it. Furthermore, when explaining the effects of a disaster on equality, we need to distinguish between the immediate impact, the medium-term effects of the institutional measures taken in response to the disaster, and the indirect outcomes in the long run."

What history suggests about the current pandemic

Their insights also hold relevance when thinking about the effects of the COVID-19 crisis. Van Bavel: "The direct impact and long-term effects are likely to enlarge material inequalities. The social and economic context at present is much more similar to that during the 2008 crisis than to the context during the twentieth-century disasters - when societies were more equitable both in wealth distribution and societal leverage than at present."

Credit: 
Utrecht University

Scientists uncover a process that stands in the way of making quantum dots brighter

image: SLAC and Stanford researchers have made the first atomic-scale observations of how nanocrystals known as quantum dots lose their light-producing efficiency when excited with intense light. Dots were excited with green light (top) or higher-energy purple light (bottom), and scientists watched them respond with an "electron camera," MeV-UED. When hit with green light, the dots relaxed, and excited pairs of electrons and holes converted virtually all of the incoming energy to light. But when hit with purple light, some of the energy was trapped on the surface of the dot; this distorted the arrangement of surrounding atoms and wasted energy as heat.
The results have broad implications for developing future quantum and photonics technologies where light replaces electrons in computers and fluids in refrigerators.

Image: 
B. Guzelturk et al., Nature Communications, 25 March 2021

Bright semiconductor nanocrystals known as quantum dots give QLED TV screens their vibrant colors. But attempts to increase the intensity of that light generate heat instead, reducing the dots' light-producing efficiency.

A new study explains why, and the results have broad implications for developing future quantum and photonics technologies where light replaces electrons in computers and fluids in refrigerators, for example.

In a QLED TV screen, dots absorb blue light and turn it into green or red. At the low energies where TV screens operate, this conversion of light from one color to another is virtually 100% efficient. But at the higher excitation energies required for brighter screens and other technologies, the efficiency drops off sharply. Researchers had theories about why this happens, but no one had ever observed it at the atomic scale until now.

To find out more, scientists at the Department of Energy's SLAC National Accelerator Laboratory used a high-speed "electron camera" to watch dots turn incoming high-energy laser light into their own glowing light emissions.

The experiments revealed that the incoming high-energy laser light ejects electrons from the dot's atoms, and their corresponding holes - empty spots with positive charges that are free to move around - become trapped at the surface of the dot, producing unwanted waste heat.

In addition, electrons and holes recombine in a way that gives off additional heat energy. This increases the jiggling of the dot's atoms, deforms its crystal structure and wastes even more energy that could have gone into making the dots brighter.

"This represents a key way that energy is sucked out of the system without giving rise to light," said Aaron Lindenberg, a Stanford University associate professor and investigator with the Stanford Institute for Materials and Energy Sciences at SLAC who led the study with postdoctoral researcher Burak Guzelturk.

"Trying to figure out what underlies this process has been the subject of study for decades," he said. "This is the first time we could see what the atoms are actually doing while excited state energy is being lost as heat."

The research team, which included scientists from SLAC, Stanford, the University of California, Berkeley and DOE's Lawrence Berkeley National Laboratory, described the results in Nature Communications today.

Emitting a pure, brilliant glow

Despite their tiny size - they have about the same diameter as four strands of DNA - quantum dot nanocrystals are surprisingly complex and highly engineered. They emit extremely pure light whose color can be tuned by adjusting their size, shape, composition and surface chemistry. The quantum dots used in this study were invented more than two decades ago, and today they're widely used in bright, energy-efficient displays and in imaging tools for biology and medicine.

Understanding and fixing problems that stand in the way of making dots more efficient at higher energies is a very hot field of research right now, said Guzelturk, who carried out experiments at SLAC with postdoctoral researcher Ben Cotts.

Previous studies had focused on how the dots' electrons behaved. But in this study, the team was able to see the movements of whole atoms, too, with an electron camera known as MeV-UED. It hits samples with short pulses of electrons with very high energies, measured in millions of electronvolts (MeV). In a process called ultrafast electron diffraction (UED), the electrons scatter off the sample and into detectors, creating patterns that reveal what both electrons and atoms are doing.

As the SLAC/Stanford team measured the behavior of quantum dots that had been hit with various wavelengths and intensities of laser light, UC Berkeley graduate students Dipti Jasrasaria and John Philbin worked with Berkeley theoretical chemist Eran Rabani to calculate and understand the resulting interplay of electronic and atomic motions from a theoretical standpoint.

"We met with the experimenters quite often," Rabani said. "They came with a problem and we started to work together to understand it. Thoughts were going back and forth, but it was all seeded from the experiments, which were a big breakthrough in being able to measure what happens to the quantum dots' atomic lattice when it's intensely excited."

A future of light-based technology

The study was carried out by researchers in a DOE Energy Frontier Research Center, Photonics at Thermodynamic Limits, led by Jennifer Dionne, a Stanford associate professor of materials science and engineering and senior associate vice provost of research platforms/shared facilities. Her research group worked with Lindenberg's group to help develop the experimental technique for probing the nanocrystals.

The center's ultimate goal, Dionne said, is to demonstrate photonic processes, such as light absorption and emission, at the limits of what thermodynamics allows. This could bring about technologies like refrigeration, heating, cooling and energy storage - as well as quantum computers and new engines for space exploration - powered entirely by light.

"To create photonic thermodynamic cycles, you need to precisely control how light, heat, atoms, and electrons interact in materials," Dionne said. "This work is exciting because it provides an unprecedented lens on the electronic and thermal processes that limit the light emission efficiency. The particles studied already have record quantum yields, but now there is a path toward designing almost-perfect optical materials." Such high light emission efficiencies could open a host of big futuristic applications, all driven by tiny dots probed with ultrafast electrons.

Credit: 
DOE/SLAC National Accelerator Laboratory

Researchers harvest energy from radio waves to power wearable devices

image: An international team of researchers, led by Huanyu "Larry" Cheng, Dorothy Quiggle Career Development Professor in the Penn State Department of Engineering Science and Mechanics, has developed a stretchable antenna and rectenna system that harvests energy from radio waves in the ambient environment to power wearable devices.

Image: 
Larry Cheng, Penn State

From microwave ovens to Wi-Fi connections, the radio waves that permeate the environment are not just signals of energy consumed but are also sources of energy themselves. An international team of researchers, led by Huanyu "Larry" Cheng, Dorothy Quiggle Career Development Professor in the Penn State Department of Engineering Science and Mechanics, has developed a way to harvest energy from radio waves to power wearable devices.

The researchers recently published their method in Materials Today Physics.

According to Cheng, current energy sources for wearable health-monitoring devices have their place in powering sensor devices, but each has its setbacks. Solar power, for example, can only harvest energy when exposed to the sun. A self-powered triboelectric device can only harvest energy when the body is in motion.

"We don't want to replace any of these current power sources," Cheng said. "We are trying to provide additional, consistent energy."

The researchers developed a stretchable wideband dipole antenna system capable of wirelessly transmitting data collected from health-monitoring sensors. The system consists of two stretchable metal antennas integrated onto conductive graphene material with a metal coating. The wideband design of the system allows it to retain its frequency functions even when stretched, bent and twisted. This system is then connected to a stretchable rectifying circuit, creating a rectified antenna, or "rectenna," capable of converting energy from electromagnetic waves into electricity. This electricity can be used to power wireless devices or to charge energy storage devices, such as batteries and supercapacitors.

This rectenna can convert radio, or electromagnetic, waves from the ambient environment into energy to power the sensing modules on the device, which track temperature, hydration and pulse oxygen level. Compared to other sources, less energy is produced, but the system can generate power continuously -- a significant advantage, according to Cheng.

"We are utilizing the energy that already surrounds us -- radio waves are everywhere, all the time," Cheng said. "If we don't use this energy found in the ambient environment, it is simply wasted. We can harvest this energy and rectify it into power."

Cheng said that this technology is a building block for him and his team. Combining it with their novel wireless transmissible data device will provide a critical component that will work with the team's existing sensor modules.

"Our next steps will be exploring miniaturized versions of these circuits and working on developing the stretchability of the rectifier," Cheng said. "This is a platform where we can easily combine and apply this technology with other modules that we have created in the past. It is easily extended or adapted for other applications, and we plan to explore those opportunities."

Credit: 
Penn State

Turning wood into plastic

Efforts to shift from petrochemical plastics to renewable and biodegradable plastics have proven tricky -- the production process can require toxic chemicals and is expensive, and the mechanical strength and water stability are often insufficient. But researchers have made a breakthrough, using wood byproducts, that shows promise for producing more durable and sustainable bioplastics.

A study published in Nature Sustainability, co-authored by Yuan Yao, assistant professor of industrial ecology and sustainable systems at Yale School of the Environment (YSE), outlines the process of deconstructing the porous matrix of natural wood into a slurry. The researchers say the resulting material shows a high mechanical strength, stability when holding liquids, and UV-light resistance. It can also be recycled or safely biodegraded in the natural environment, and has a lower life-cycle environmental impact when compared with petroleum-based plastics and other biodegradable plastics.

"There are many people who have tried to develop these kinds of polymers in plastic, but the mechanical strands are not good enough to replace the plastics we currently use, which are made mostly from fossil fuels," says Yao. "We've developed a straightforward and simple manufacturing process that generates biomass-based plastics from wood, but also plastic that delivers good mechanical properties as well."

To create the slurry mixture, the researchers used a wood powder -- a processing residue usually discarded as waste in lumber mills -- and deconstructed the loose, porous structure of the powder with a biodegradable and recyclable deep eutectic solvent (DES). The resulting mixture, which features nanoscale entanglement and hydrogen bonding between the regenerated lignin and cellulose micro/nanofibrils, has a high solid content and high viscosity and can be cast and rolled without breaking.

Yao then led a comprehensive life cycle assessment to test the environmental impacts of the bioplastic against common plastics. Sheets of the bioplastic were buried in soil, fracturing after two weeks and completely degrading after three months; additionally, the researchers say the bioplastic can be broken back down into the slurry by mechanical stirring, which also allows the DES to be recovered and reused.

"That, to me, is what really makes this plastic good: It can all be recycled or biodegraded," says Yao. "We've minimized all of the materials and the waste going into nature."

The bioplastic has numerous applications, says Liangbing Hu, a professor at the Center for Materials Innovation at the University of Maryland and co-author of the paper. It can be molded into a film that can be used in plastic bags and packaging -- one of the major uses of plastic and causes of waste production. Hu also says that because the bioplastic can be molded into different shapes, it has potential for use in automobile manufacturing, as well.

One area the research team continues to investigate is the potential impact on forests if the manufacturing of this bioplastic is scaled up. While the process currently uses wood byproducts in manufacturing, the researchers say they are keenly aware that large-scale production could require usage of massive amounts of wood, which could have far-reaching implications on forests, land management, ecosystems and climate change, to name a few.

Yao says the research team has already begun working with a forest ecologist to create forest simulation models, linking the growth cycle of forests with the manufacturing process. She also sees an opportunity to collaborate with people who work in forest-related fields at YSE -- an uncommon convenience.

"It's not often an engineer can walk down the hall and talk to a forester," says Yao.

Credit: 
Yale School of the Environment

A clue to how some fast-growing tumors hide in plain sight

image: Misplaced or aberrant genetic material sends a danger signal.

Image: 
La Jolla Institute for Immunology

LA JOLLA--The glow of a panther's eyes in the darkness. The zig-zagging of a shark's dorsal fin above the water.

Humans are always scanning the world for threats. We want the chance to react, to move, to call for help, before danger strikes. Our cells do the same thing.

The innate immune system is the body's early alert system. It scans cells constantly for signs that a pathogen or dangerous mutation could cause disease. And what does it like to look for? Misplaced genetic material.

DNA and other nucleic acids -- the molecules that carry genetic information -- are supposed to be hidden away in the cell nucleus. Diseases can change that. Viruses churn out genetic material in parts of the cell where it's not supposed to be. Cancer cells do too.

"Cancer cells harbor damaged DNA," says Sonia Sharma, Ph.D., an associate professor at the La Jolla Institute for Immunology (LJI). "Mislocated DNA or aberrant DNA is a danger signal to the cell. They tell the cell, 'There's a problem here.' It's like the first ringing of the alarm bell for the immune system."

Now Sharma and her colleagues have published a new Nature Immunology study describing the process that triggers this alert system directly inside tumor cells. Their research shows that a tumor-suppressor enzyme called DAPK3 is an essential component of a multi-protein system that senses misplaced genetic material in tumor cells, and slows tumor growth by activating the fierce-sounding STING pathway.

In the world of cancer immunotherapy, the STING pathway is well-known as a critical activator of cancer-killing T cells that kicks off the body's powerful adaptive immune response. The new study shows that through DAPK3 and STING, the tumor's own innate immune system plays a greater role in cancer immunity than previously appreciated.

"The tumor-intrinsic innate immune response plays an important role in natural tumor growth and cancer immunotherapy response," says Sharma.

Tumors evolve mutations in tumor-suppressor genes that allow them to grow faster than normal tissue. Discovery of the critical role that DAPK3 plays in the STING pathway highlights a distinct problem in cancer and cancer immunotherapy. Tumor cells can acquire mutations that allow them to evade the immune system by keeping cells from sensing red flags such as misplaced DNA.

Sharma and her colleagues with the LJI Center for Cancer Immunotherapy, Max-Planck Institute of Biochemistry and UC San Diego found that loss of DAPK3 expression or function in tumor cells severely hindered STING activation. Their research in mouse models shows that these tumors were hidden from the immune system, and the researchers observed very few cancer-targeting CD8+ "killer" T cells in DAPK3-deficient tumors. As a result, loss of DAPK3 in tumors decreased responsiveness to cancer immunotherapy.

"Tumors lacking DAPK3 grow faster in vivo because they evade the immune system. They are also resistant to certain immunotherapy regimens, including combination therapies using the immune checkpoint blocker anti-PD1 to target anti-tumor T cells," says Sharma.

Pharmaceutical companies are pursuing immunotherapies to activate STING, which are intended to be used in combination with immune checkpoint blockers. The new findings emphasize the importance of activating STING in tumor cells themselves--to properly set off that early alert system.

"Tumor-intrinsic immune responses are important," says study co-first author Mariko Takahashi, Ph.D., a former LJI postdoctoral associate who now serves at Massachusetts General Hospital Cancer Center.

The researchers are now looking for additional proteins that play a role in the early innate immune response to cancer. "There are many players in the tumor microenvironment," says Takahashi.

Credit: 
La Jolla Institute for Immunology

Warm water has overlooked importance for cold-water fish, like salmon and trout

image: Arctic grayling in ephemerally warm lake outlet, Little Togiak River, Alaska.

Image: 
Jonny Armstrong

CORVALLIS, Ore. - Warm river habitats appear to play a larger than expected role supporting the survival of cold-water fish, such as salmon and trout, a new Oregon State University-led study published today found.

The research has important implications for fish conservation strategies. A common goal among scientists and policymakers is to identify and prioritize habitat for cold-water fish that remains suitably cool during the summer, especially as the climate warms.

This implicitly devalues areas that are seasonally warm, even if they are suitable for fish most of the year, said Jonny Armstrong, lead author of the paper and an ecologist at Oregon State. He called this a "potentially severe blind spot for climate change adaptation."

"Coldwater fish like trout and salmon are the polar bears of river ecosystems - iconic species that are among the most vulnerable to climate change," Armstrong said. "A huge challenge for conservation is to figure out how to help these fish survive a warmer future. The conclusion is that we should not waste money on warm habitats and instead focus on saving the coldest places, such as high mountain streams, which are already the most pristine parts of basins. Most people agree we should give up on places that are warm in summer, but forget that these places are actually optimal for much of the year."

In the new paper, published in Nature Climate Change, Armstrong and collaborators at Oregon State and several federal agencies, show that warm river habitats, typically lower in basins, provide pulses of growth potential during the spring and fall, so-called shoulder seasons, when the rivers are not at peak summer temperatures. Foraging in these warm habitats can provide fish the needed energy to travel to cooler parts of the river during the summer and to reproduce.

"The synergy between cold water and warm water is really important," said Armstrong, an assistant professor in the Department of Fisheries and Wildlife in the College of Agricultural Sciences. "We're not saying cold water is not important. We're saying that warm portions of basins are also important because they grow fish during the shoulder seasons. Conserving this habitat is critical for unlocking the full potential of rivers to support fisheries.

"In a warmer future, many fish will need fish to take a summer vacation and move to cold places to survive the hottest months of the year. Their ability to do that could often depend on how much energy they can get in the spring and how well they can feed in the fall to bounce back. The places that are stressfully warm in summer are just right in spring and fall, and there is growing evidence that they can fuel fisheries"

For the study, the researchers used data from another team of scientists that used remote sensing technology to obtain river water temperature data across entire landscapes throughout the year. That team compiled data for 14 river basins in Oregon, Washington and Idaho.

The OSU-led team plugged these temperature data into a "bioenergetics model" that predicts fish growth potential based on equations derived from lab studies. This provided new insights into how growth opportunities shift across river basins throughout the year, and how a large fraction of total growth potential can accrue during the spring and autumn in places that are too hot during summer.

To explore how these warm habitats could contribute to fisheries, the team created a simulation model in which virtual rainbow trout were given simple behavior rules and allowed to forage throughout the year in a basin with cold tributaries and a warm, productive main-stem river. Their simulations showed the majority of fish moved into cooler waters in the summer and exhibited meager growth rates. However, outside summer, the simulation showed the fish resided primarily in seasonally warm downstream habitats, which fueled the vast majority of their growth.
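As a purely illustrative sketch of how such a simulation might be structured (not the authors' bioenergetics or movement model), the toy Python example below gives a virtual trout a dome-shaped growth response to temperature and a simple rule of occupying whichever habitat offers more growth each day; all parameter values are hypothetical.

```python
import math

def growth_potential(temp_c, t_opt=15.0, width=6.0, max_rate=1.0):
    """Toy dome-shaped growth response to temperature (% body mass per day).
    Hypothetical parameters; real bioenergetics models are far more detailed."""
    return max_rate * math.exp(-((temp_c - t_opt) / width) ** 2)

def seasonal_temp(day, mean, amplitude):
    """Simple sinusoidal annual temperature cycle, peaking in midsummer."""
    return mean + amplitude * math.sin(2 * math.pi * (day - 110) / 365)

mass = 100.0  # grams, virtual trout
for day in range(365):
    warm_main_stem = seasonal_temp(day, mean=14.0, amplitude=9.0)   # hot in summer
    cold_tributary = seasonal_temp(day, mean=8.0, amplitude=4.0)    # cool all year
    # Simple behavior rule: occupy whichever habitat offers more growth today.
    rate = max(growth_potential(warm_main_stem), growth_potential(cold_tributary))
    mass *= 1 + rate / 100.0

print(f"End-of-year mass: {mass:.0f} g")
```

In this toy version, most growth accrues in the warm main stem during spring and fall, while the cold tributary carries the fish through midsummer, mirroring the synergy the study describes.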

"In conservation, we often judge streams by their summer conditions; this is when we traditionally do field work, and this is the season we focus on when planning for climate change," Armstrong said. "We place value on places that hold fish during summer and devalue those that don't. Our simulation showed why this can be a problem - the portions of rivers that contribute most to growth may not be the places where fish are found during summer, so they get written off."

The simulations reveal a synergy between seasonally warm and perennially cool habitats: fish that had access to both types of habitat grew much more than fish restricted to either habitat alone, Armstrong said.

"We think of things in this binary way - it's either warm-water habitat or its cold-water habitat," Armstrong said. "And we have definitions for fish - it's either a warm-water fish or a cold-water fish. But the places we think of as warm are, in fact, cold way more than they are warm."

He then mentioned an example using rivers in Oregon, including the Willamette, a tributary of the Columbia River that runs nearly 200 miles from Eugene to Portland.

"When it's warm enough for humans to swim, it's bad for cold-water fish. But there's only like six weeks of the year where it is comfortable to go swimming in Oregon," Armstrong said. "That speaks to the fact that we write off places because they get too hot through the lens of August. They're actually pretty nice for most of the year if you're a cold-water fish. And fish don't necessarily have to live there in August, just like you don't have to go swimming in the Willamette in December."

Credit: 
Oregon State University

Ocean's mammals at crucial crossroads

image: Humpback whale and researchers pictured from a drone.

Image: 
Duke Marine Robotics and Remote Sensing Lab

The ocean's mammals are at a crucial crossroads - with some at risk of extinction and others showing signs of recovery, researchers say.

In a detailed review of the status of the world's 126 marine mammal species - which include whales, dolphins, seals, sea lions, manatees, dugongs, sea otters and polar bears - scientists found that accidental capture by fisheries (bycatch), climate change and pollution are among the key drivers of decline.

A quarter of these species are now classified as being at risk of extinction (vulnerable, endangered or critically endangered on the IUCN Red List), with the near-extinct vaquita porpoise and the critically endangered North Atlantic right whale among those in greatest danger.

Conservation efforts have enabled recoveries among other species, including the northern elephant seal, humpback whale and Guadalupe fur seal.

The international research team - led by the University of Exeter and including scientists from more than 30 institutions in 13 countries - highlight conservation measures and research techniques that could protect marine mammals into the future.

"We have reached a critical point in terms of marine mammal conservation," said lead author Dr Sarah Nelms, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall.

"Very few marine mammal species have been driven to extinction in modern times, but human activities are putting many of them under increasing pressure.

"Our paper examines a range of conservation measures - including Marine Protected Areas (MPAs), bycatch reduction methods and community engagement - as well as highlighting some of the species that are in urgent need of focus."

The researchers say 21% of marine mammal species are listed as "data deficient" in the IUCN Red List - meaning not enough is known to assess their conservation status.

This lack of knowledge makes it difficult to identify which species are in need of protection and what actions should be taken to save them.

Professor Brendan Godley, who leads the Exeter Marine research group, said: "To continue conservation successes and reverse the downward trend in at-risk species, we need to understand the threats they face and the conservation measures that could help.

"Technology such as drone and satellite imaging, electronic tags and molecular techniques are among the tools that will help us do this.

"Additionally, sharing best practice will empower us - and this is why we are so proud to be part of such a large and international group for this project."

Credit: 
University of Exeter

Soft robotic dragonfly signals environmental disruptions

video: With the ability to sense changes in pH, temperature and oil, this completely soft, electronics-free robot dubbed "DraBot" could be the prototype for future environmental sentinels.

Image: 
Ken Kingery, Duke University

DURHAM, N.C. - Engineers at Duke University have developed an electronics-free, entirely soft robot shaped like a dragonfly that can skim across water and react to environmental conditions such as pH, temperature or the presence of oil. The proof-of-principle demonstration could be the precursor to more advanced, autonomous, long-range environmental sentinels for monitoring a wide range of potential telltale signs of problems.

The soft robot is described online March 25 in the journal Advanced Intelligent Systems.

Soft robots are a growing trend in the industry due to their versatility. Soft parts can handle delicate objects such as biological tissues that metal or ceramic components would damage. Soft bodies can help robots float or squeeze into tight spaces where rigid frames would get stuck.

The expanding field was on the mind of Shyni Varghese, professor of biomedical engineering, mechanical engineering and materials science, and orthopaedic surgery at Duke, when inspiration struck.

"I got an email from Shyni from the airport saying she had an idea for a soft robot that uses a self-healing hydrogel that her group has invented in the past to react and move autonomously," said Vardhman Kumar, a PhD student in Varghese's laboratory and first author of the paper. "But that was the extent of the email, and I didn't hear from her again for days. So the idea sort of sat in limbo for a little while until I had enough free time to pursue it, and Shyni said to go for it."

In 2012, Varghese and her laboratory created a self-healing hydrogel that reacts to changes in pH in a matter of seconds. Whether it be a crack in the hydrogel or two adjoining pieces "painted" with it, a change in acidity causes the hydrogel to form new bonds, which are completely reversible when the pH returns to its original levels.

Varghese's hastily written idea was to find a way to use this hydrogel on a soft robot that could travel across water and indicate places where the pH changes. Along with a few other innovations to signal changes in its surroundings, she figured her lab could design such a robot as a sort of autonomous environmental sensor.

With the help of Ung Hyun Ko, a postdoctoral fellow also in Varghese's laboratory, Kumar began designing a soft robot based on a fly. After several iterations, the pair settled on the shape of a dragonfly engineered with a network of interior microchannels that allow it to be controlled with air pressure.

They created the body--about 2.25 inches long with a 1.4-inch wingspan--by pouring silicone into an aluminum mold and baking it. The team used soft lithography to create the interior channels and connected them with flexible silicone tubing.

DraBot was born.

"Getting DraBot to respond to air pressure controls over long distances using only self-actuators without any electronics was difficult," said Ko. "That was definitely the most challenging part."

DraBot works by controlling the air pressure coming into its wings. Microchannels carry the air into the front wings, where it escapes through a series of holes pointed directly into the back wings. If both back wings are down, the airflow is blocked, and DraBot goes nowhere. But if both back wings are up, DraBot goes forward.

To add an element of control, the team also designed balloon actuators under each of the back wings close to DraBot's body. When inflated, the balloons cause the wings to curl upward. By changing which wings are up or down, the researchers tell DraBot where to go.
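
In rough terms, the steering reduces to a simple mapping from wing positions to motion. The sketch below is a hypothetical illustration of that logic, not code from the Duke team, and the left/right turn convention is assumed for the example:

    # Hypothetical sketch of DraBot's wing-state steering logic.
    # The function name and the turn convention are assumptions for
    # illustration; they are not from the published paper.

    def drabot_motion(left_back_up: bool, right_back_up: bool) -> str:
        """Map back-wing positions (set by balloon actuators) to motion.

        Air escaping from holes in the front wings is blocked by any back
        wing that is down, so only a raised side produces thrust.
        """
        if left_back_up and right_back_up:
            return "forward"       # both outlets open: thrust on both sides
        if left_back_up:
            return "turn right"    # thrust only on the left (assumed convention)
        if right_back_up:
            return "turn left"     # thrust only on the right (assumed convention)
        return "stopped"           # both outlets blocked: no thrust

    print(drabot_motion(True, True))    # forward
    print(drabot_motion(True, False))   # turn right
    print(drabot_motion(False, False))  # stopped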

"We were happy when we were able to control DraBot, but it's based on living things," said Kumar. "And living things don't just move around on their own, they react to their environment."

That's where self-healing hydrogel comes in. By painting one set of wings with the hydrogel, the researchers were able to make DraBot responsive to changes in the surrounding water's pH. If the water becomes acidic, one side's front wing fuses with the back wing. Instead of traveling in a straight line as instructed, the imbalance causes the robot to spin in a circle. Once the pH returns to a normal level, the hydrogel "un-heals," the fused wings separate, and DraBot once again becomes fully responsive to commands.

To beef up its environmental awareness, the researchers also added absorbent sponges under the wings and doped the wings with temperature-responsive materials. When DraBot skims over water with oil floating on the surface, the sponges soak it up and change to a color corresponding to that of the oil. And when the water becomes overly warm, DraBot's wings change from red to yellow.

The researchers believe these types of measurements could play an important part in a future environmental robotic sensor. Responsiveness to pH can detect freshwater acidification, a serious environmental problem affecting several geologically sensitive regions. The ability to soak up oils makes such long-distance skimming robots ideal candidates for early detection of oil spills. And color changes triggered by temperature could help spot signs of red tide or coral bleaching, both of which lead to declines in aquatic life.

The team also sees many ways that they could improve on their proof-of-concept. Wireless cameras or solid-state sensors could enhance the capabilities of DraBot. And creating a form of onboard propellant would help similar bots break free of their tubing.

"Instead of using air pressure to control the wings, I could envision using some sort of synthetic biology that generates energy," said Varghese. "That's a totally different field than I work in, so we'll have to have a conversation with some potential collaborators to see what's possible. But that's part of the fun of working on an interdisciplinary project like this."

Credit: 
Duke University

MIT engineers make filters from tree branches to purify drinking water

image: Xylem tissue in gymnosperm sapwood can be used for water filtration (top). Xylem is composed of conduits interconnected by membranes that filter out contaminants present in water (bottom).

Image: 
Courtesy: N.R. Fuller, Sayo Studio

The interiors of nonflowering trees such as pine and ginkgo contain sapwood lined with straw-like conduits known as xylem, which draw water up through a tree's trunk and branches. Xylem conduits are interconnected via thin membranes that act as natural sieves, filtering out bubbles from water and sap.

MIT engineers have been investigating sapwood's natural filtering ability, and have previously fabricated simple filters from peeled cross-sections of sapwood branches, demonstrating that the low-tech design effectively filters bacteria.

Now, the same team has advanced the technology and shown that it works in real-world situations. They have fabricated new xylem filters that can filter out pathogens such as E. coli and rotavirus in lab tests, and have shown that the filter can remove bacteria from contaminated spring, tap, and groundwater. They also developed simple techniques to extend the filters' shelf-life, enabling the woody disks to purify water after being stored in a dry form for at least two years.

The researchers took their techniques to India, where they made xylem filters from native trees and tested the filters with local users. Based on their feedback, the team developed a prototype of a simple filtration system, fitted with replaceable xylem filters that purified water at a rate of one liter per hour.

Their results, published today in Nature Communications, show that xylem filters have potential for use in community settings to remove bacteria and viruses from contaminated drinking water.

The researchers are exploring options to make xylem filters available at large scale, particularly in areas where contaminated drinking water is a major cause of disease and death. The team has launched an open-source website, with guidelines for designing and fabricating xylem filters from various tree types. The website is intended to support entrepreneurs, organizations, and leaders to introduce the technology to broader communities, and inspire students to perform their own science experiments with xylem filters.

"Because the raw materials are widely available and the fabrication processes are simple, one could imagine involving communities in procuring, fabricating, and distributing xylem filters," says Rohit Karnik, professor of mechanical engineering and associate department head for education at MIT. "For places where the only option has been to drink unfiltered water, we expect xylem filters would improve health, and make water drinkable."

Karnik's study co-authors are lead author Krithika Ramchander and Luda Wang of MIT's Department of Mechanical Engineering, and Megha Hegde, Anish Antony, Kendra Leith, and Amy Smith of MIT D-Lab.

Clearing the way

In their prior studies of xylem, Karnik and his colleagues found that the woody material's natural filtering ability also came with some natural limitations. As the wood dried, the branches' sieve-like membranes began to stick to the walls, reducing the filter's permeance, or ability to allow water to flow through. The filters also appeared to "self-block" over time, building up woody matter that clogged the conduits.

Surprisingly, two simple treatments overcame both limitations. By soaking small cross-sections of sapwood in hot water for an hour, then dipping them in ethanol and letting them dry, Ramchander found that the material retained its permeance, efficiently filtering water without clogging. Filtration could also be improved by tailoring a filter's thickness to its tree type.

The researchers sliced and treated small cross-sections of white pine from branches around the MIT campus and showed that the resulting filters maintained a permeance comparable to commercial filters, even after being stored for up to two years, significantly extending the filters' shelf life.

The researchers also tested the filters' ability to remove contaminants such as E. coli and rotavirus -- the most common cause of diarrheal disease. The treated filters removed more than 99 percent of both contaminants, a water treatment level that meets the "two-star comprehensive protection" category set by the World Health Organization.
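
Removal percentages like these are often compared by converting them to log-reduction values. The sketch below shows only that arithmetic, as a hedged illustration; the WHO's exact performance thresholds are not restated here:

    # Convert a fractional removal to a log10 reduction value (LRV).
    # Illustrative arithmetic only; no WHO thresholds are asserted here.
    import math

    def log_reduction(removal_fraction: float) -> float:
        """LRV implied by a fractional removal, e.g. 0.99 -> 2.0."""
        return -math.log10(1.0 - removal_fraction)

    for removal in (0.99, 0.999, 0.9999):
        print(f"{removal:.2%} removal -> {log_reduction(removal):.1f}-log reduction")
    # 99.00% removal -> 2.0-log reduction
    # 99.90% removal -> 3.0-log reduction
    # 99.99% removal -> 4.0-log reduction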

"We think these filters can reasonably address bacterial contaminants," Ramchander says. "But there are chemical contaminants like arsenic and fluoride where we don't know the effect yet," she notes.

Groundwork

Encouraged by their results in the lab, the researchers moved to field-test their designs in India, a country with the world's highest mortality rate from waterborne disease and where more than 160 million people lack access to safe and reliable drinking water.

Over two years, the engineers, including researchers in the MIT D-Lab, worked in mountain and urban regions, facilitated by local NGOs Himmotthan Society, Shramyog, Peoples Science Institute, and Essmart. They fabricated filters from native pine trees and tested them, along with filters made from ginkgo trees in the U.S., with local drinking water sources. These tests confirmed that the filters effectively removed bacteria found in the local water. The researchers also held interviews, focus groups, and design workshops to understand local communities' current water practices, and challenges and preferences for water treatment solutions. They also gathered feedback on the design.

"One of the things that scored very high with people was the fact that this filter is a natural material that everyone recognizes," Hegde says. "We also found that people in low-income households prefer to pay a smaller amount on a daily basis, versus a larger amount less frequently. That was a barrier to using existing filters, because replacement costs were too much."

With information from more than 1,000 potential users across India, they designed a prototype of a simple filtration system, fitted with a receptacle at the top that users can fill with water. The water flows down a 1-meter-long tube, through a xylem filter, and out through a valve-controlled spout. The xylem filter can be swapped out either daily or weekly, depending on a household's needs.
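
As a back-of-the-envelope illustration only (the article reports the 1 liter per hour rate, but hours of daily use and the 30-day month below are assumptions), the throughput and filter-replacement arithmetic works out as follows:

    # Rough throughput arithmetic for the prototype, using the reported
    # 1 liter/hour rate. Hours of use per day and the 30-day month are
    # assumptions for illustration, not figures from the study.

    FLOW_L_PER_HR = 1.0  # rate reported for the prototype

    def daily_output_liters(hours_in_use: float) -> float:
        return FLOW_L_PER_HR * hours_in_use

    def filters_per_month(swap_interval_days: float, days: int = 30) -> float:
        return days / swap_interval_days

    print(daily_output_liters(10))   # 10.0 L if run for 10 hours a day
    print(filters_per_month(1))      # daily swaps: 30 filters per month
    print(filters_per_month(7))      # weekly swaps: roughly 4 filters per month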

The team is exploring ways to produce xylem filters at larger scales, with locally available resources and in a way that would encourage people to practice water purification as part of their daily lives -- for instance, by providing replacement filters in affordable, pay-as-you-go packets.

"Xylem filters are made from inexpensive and abundantly available materials, which could be made available at local shops, where people can buy what they need, without requiring an upfront investment as is typical for other water filter cartridges," Karnik says. "For now, we've shown that xylem filters provide performance that's realistic."

Credit: 
Massachusetts Institute of Technology

Forty-three percent of melanoma patients have chronic complications from immunotherapies

Chronic side effects among melanoma survivors after treatment with anti-PD-1 immunotherapies are more common than previously recognized, according to a study published March 25 in JAMA Oncology.

The chronic complications, which occurred in 43% of patients, most commonly affected the joints and endocrine system, and less often involved the salivary glands, eyes, peripheral nerves and other organs. These complications may be long-lasting, with only 14% of cases resolved at last follow-up. This finding contrasted with previously reported immunotherapy-related acute complications affecting visceral organs -- including the liver, colon, lungs and kidneys -- which were effectively treated with steroids. However, the vast majority of chronic complications were not severe or life-threatening.

"Chronic and long-lasting side effects were more common than we expected and involved a variety of often overlooked organs like the thyroid, salivary glands and joints," said the study's senior author, Douglas Johnson, MD, MSCI, associate professor of Medicine at Vanderbilt University Medical Center and clinical director of Melanoma at Vanderbilt-Ingram Cancer Center.

The retrospective study reviewed the incidence and spectrum of chronic immune-related adverse events in melanoma patients who were treated with adjuvant anti-PD-1 immunotherapies (pembrolizumab or nivolumab) at eight academic medical centers between 2015 and 2020. Chronic complications were defined as those persisting at least 12 weeks after immunotherapy treatments had ended. The study is the first to systematically examine anti-PD-1-related chronic complications in patients with high-risk, resected melanoma.

Most of the chronic complications (96%) were grade 1 or 2 events with no or mild symptoms. The more common complications included adrenal insufficiency, arthritis, dermatitis and thyroiditis.

Rare and acute immune-related adverse events that occur early in the course of immunotherapy, such as cardiovascular complications, can be more serious. In 2016, Johnson and colleagues at Vanderbilt-Ingram were the first to report rare but fatal cardiac side effects from immunotherapies.

"While these side effects are important to monitor and treat, anti-PD-1 therapies remain life saving for many patients with melanoma" Johnson said.

Credit: 
Vanderbilt University Medical Center