
Spacer protects healthy organs from radiation exposure during particle therapy

image: The Neskeep is available in 5, 10 and 15 mm nonwoven fabric versions made of polyglycolic acid, the same material used in polyglycolic acid suture thread (Opepolix N and Opepolix, both already approved for medical use).

Image: 
Kobe University

It can be difficult to treat malignant tumors with radical particle therapy if the tumor is too close to healthy organs. A team led by Professor Takumi Fukumoto and Professor Ryohei Sasaki (Kobe University Graduate School of Medicine) has collaborated with Alfresa Pharma Corporation to develop a new medical device, the "Neskeep," which separates tumors from adjacent normal organs during particle therapy. The Neskeep is designed to be placed surgically.

The Neskeep received pharmaceutical approval in December 2018 and has been available for purchase since June 27, 2019.

Radiation therapy, one of the less invasive and less physically stressful options, is recognized as an important cancer treatment. Health insurance from the Japanese government covers particle therapy for bone and soft tissue tumors, pediatric malignant tumors, and non-squamous head and neck tumors.

In some cases, it can be difficult to apply particle therapy when malignant tumors are located near digestive tract organs that are sensitive to radiation (the small and large intestine). Doctors currently use non-absorbable materials such as silicone balloons and Gore-Tex sheets to act as spacers in the abdomen and intestines, or they place the intestine or other organs outside the radiation field using an absorbable mesh. Observations show that some patients then need a second surgery to remove the spacer, or the spacer causes serious complications for the rest of their lives.

The team succeeded in developing a nonwoven, bioabsorbable fabric spacer that creates a space between healthy and cancerous tissues during particle therapy and is then safely absorbed by the body.

The Neskeep is available in 5, 10 and 15 mm nonwoven fabric versions made of polyglycolic acid, the same material used in polyglycolic acid suture thread (Opepolix N (approval code: 16200BZZ01566000) and Opepolix (approval code: 16200BZZ01566000), both already approved for medical use).

In their preclinical evaluation, the team conducted physical experiments to check that the spacer fulfilled the requirements for isolating cancer from organs and providing a barrier against radiation. After safety studies in animal models, a clinical trial was carried out in Kobe University Hospital and Hyogo Ion Beam Medical Center (HIBMC). The human trial involved five patients with malignant tumors in the abdominal or pelvic region, for whom particle therapy was difficult because of the proximity of normal organs to the cancer.

In all treatments the spacer preserved enough distance between the tumor and healthy tissue during the particle therapy. Because radiation exposure to the intestines was successfully reduced, a full dose of radiation therapy could be used. No serious complications were observed with this treatment, and the team confirmed that the spacers safely disintegrated afterwards.

Particle therapy to treat malignant tumors is a promising new radiation treatment, and it is available in a growing number of clinics. Initially this spacer will be used mainly to treat bone tumors that are covered by insurance for particle therapy treatment. The team hopes to expand the use of this spacer alongside the expansion of insurance coverage for particle therapy, ultimately using it to treat conditions such as pancreatic and liver cancers.

Currently two kinds of treatment are used in clinical settings: proton therapy (using protons) and heavy-ion radiotherapy (using carbon ions).

Doses of concentrated radiation in particle therapy:
Particle beams display a physical trait called the Bragg peak, which can be used to focus the radiation dose on the tumor. This also reduces the impact on healthy tissue.

Biological effects of particle therapy:
Compared to X-rays, proton beams and carbon-ion beams are biologically more effective. Taking the relative biological effectiveness of X-rays as 1, that of proton and heavy-ion beams is about 1.1 and 3, respectively, so there are high expectations for effective treatment of radiation-resistant tumors.
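To make that scaling concrete, here is a minimal illustrative sketch that multiplies a physical dose by the relative biological effectiveness (RBE) values quoted above to obtain a photon-equivalent dose; the 2 Gy example dose is an arbitrary assumption, not a figure from the article.

```python
# Illustrative only: scaling a physical dose by relative biological
# effectiveness (RBE) to get a photon-equivalent dose. RBE values follow
# the text (X-rays = 1, protons ~1.1, carbon ions ~3); the 2 Gy physical
# dose is an arbitrary example, not a clinical figure.

RBE = {"x-ray": 1.0, "proton": 1.1, "carbon ion": 3.0}

def photon_equivalent_dose(physical_dose_gy, beam):
    """Return the RBE-weighted (photon-equivalent) dose in Gy(RBE)."""
    return physical_dose_gy * RBE[beam]

physical_dose = 2.0  # Gy (hypothetical example value)
for beam in RBE:
    print(f"{beam:>10}: {photon_equivalent_dose(physical_dose, beam):.1f} Gy(RBE)")
```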

Credit: 
Kobe University

A new paradigm for efficient upward culture of 3D multicellular spheroids

image: Schematic image of the upward culture process of 3D multicellular spheroids on a durable superamphiphobic silica aerogel surface. The antifouling surface kept cells from adhering and triggered cell self-organization into spheroids. The spheroids can be incubated long term and observed in situ on the surface.

Image: 
©Science China Press

Conventional in-vitro biological evaluations have mostly been performed on 2D monolayer cell cultures. In these systems, the intensive spatial interactions between cells are ignored entirely, which can lead to information loss and incorrect results. 3D cellular spheroid culture offers a way to mimic physiological conditions, with heterogeneous spatial distributions of oxygen, nutrients, metabolites and signaling molecules, making it an ideal model for cell research. The advent of 3D cellular spheroids is of great significance for the analysis of tumor development mechanisms, drug evaluation, stem cell differentiation and regenerative medicine.

The common principle for generating spheroids is that cells self-organize on a biomaterial surface to which they cannot attach and are thus forced to interact with one another. Numerous fabrication techniques have been established to achieve this; important examples include hanging-droplet culture, low-adhesion matrices, NASA bioreactors and magnetic manipulation. These methods have opened the door to applications of spheroids. Unfortunately, such advanced methods require specialized technologies, labor-intensive handling and time-consuming procedures, and most of them do not allow in situ observation. There is therefore an urgent need for a new culture method that is efficient, safe and capable of in situ observation.

In a research article recently published in National Science Review, the research group led by Professor Qing-hua Lu proposed a novel upward culture of 3D multicellular spheroids on a durable superamphiphobic surface (SSAS). Abundant hierarchical structures and chemical inertness were built into the antifouling surface, giving extremely low solid-liquid adhesion for a series of liquids with surface tensions ranging from that of water to that of n-dodecane. The SSAS exhibited long-term thermal and mechanical stability, meeting the requirements for long-term upward culture of 3D cell spheroids. Compact 3D cell spheroids with a consistent cell population were produced with the aid of gravity and droplet curvature in the quasi-spherical droplets of the medium. Distinguished from other methods, this method is more efficient, biocompatible and reproducible, and allows in situ observation. The authors demonstrated potential applications in drug screening, spheroid fusion and long-term tracing. Interestingly, the surface can be recycled without any treatment. The described SSAS offers a new technological space for in-vitro research on drug screening, stem cell differentiation and spheroid fusion.

Credit: 
Science China Press

Pesticides deliver a one-two punch to honey bees

image: Researchers conduct semi-field experiments on honey bees.

Image: 
Lang Chen

Adjuvants are chemicals that are commonly added to plant protection products, such as pesticides, to help them spread, adhere to targets, disperse appropriately, or prevent drift, among other things. It was widely assumed that these additives would not cause a biological reaction after exposure, but a number of recent studies show that adjuvants can be toxic to ecosystems and, as this study shows, to honey bees. Jinzhen Zhang and colleagues studied the effects on honey bees when adjuvants were co-applied at "normal concentration levels" with neonicotinoids. Their research, recently published in Environmental Toxicology and Chemistry, found that the mixture of pesticide and adjuvant increased the mortality rate of honey bees in the lab and in semi-field conditions, where it also reduced colony size and brooding.

When applied alone, the three pesticide adjuvants caused no significant, immediate toxicity to honey bees. However, when the pesticide acetamiprid was mixed with the adjuvants and applied to honey bees in the laboratory, the toxicity was significant and immediate. In groups treated with combined pesticide-adjuvant concentrates, mortality was significantly higher than in the control groups, which included a blank control (no pesticide, no adjuvant, only water) and a pesticide-only control (no adjuvant). Further, flight intensity, colony intensity and pupal development continued to deteriorate long after application compared to the control groups.

Zhang noted that this study, "contributed to the understanding of the complex relationships between the composition of pesticide formulations and bee harm," and stressed that "further research is required on the environmental safety assessment of adjuvants and their interactions with active ingredients on non-target species."

Credit: 
Society of Environmental Toxicology and Chemistry

Restoring forests means less fuel for wildfire and more storage for carbon

When wildfires burn up forests, they don't just damage the trees. They destroy a key part of the global carbon cycle. Restoring those trees as quickly as possible could tip the scale in favor of mitigating severe climate change.

Lisa A. McCauley, a spatial analyst at The Nature Conservancy, explains how quick action to thin out vegetation will actually increase carbon storage in forests by the end of this century. Her new paper is published in the Ecological Society of America's journal Ecological Applications, and she will present the findings this August at ESA's 2019 Annual Meeting in Louisville, KY.

"With predictions of widespread mortality of western U.S. forests under climate change," McCauley states, "our study addresses how large-scale restoration of overly-dense, fire-adapted forests is one of the few tools available to managers that could minimize the adverse effects of climate change and maintain forest cover."

Forests are a vital carbon sink - a natural sponge that pulls carbon out of the atmosphere through plant photosynthesis. Because carbon dioxide (CO2) emissions from human activities are a major cause of climate change, forests do humanity a huge service by disposing of much of its gaseous waste.

Unfortunately, wildfires are more common than they used to be. Higher tree density, more dry wood for fuel, and a warmer, drier climate have caused an increase in the frequency, size, and severity of wildfires in western U.S. forests. Restoring forests in a timely manner is critical to making subsequent wildfires less likely. The U.S. Forest Service states that rehabilitation and restoration take many years and include planting trees, reestablishing native species, restoring habitats, and treating for invasive plants. There is an urgent need for such restoration in the southwest U.S. to balance out the carbon cycle.

Enter the Four Forest Restoration Initiative (4FRI). The U.S. Forest Service began the 4FRI in 2010 to restore 2.4 million acres (3,750 square miles) of national forests in Arizona. The goals of the 4FRI are to restore the structure, pattern, composition, and health of fire-adapted (dependent on occasional fires for their lifecycle) ponderosa pine ecosystems; reduce fuels and the risk of unnaturally severe wildfires; and provide for wildlife and plant diversity. Doing so involves a full suite of restoration projects carried out by U.S. Forest Service personnel, partners and volunteers, and contractors. Managers of the four forests - the Kaibab, Coconino, Apache-Sitgreaves, and Tonto - are engaged in a huge, collaborative initiative with a diverse group of stakeholders to explore the best methods for restoring the ponderosa pine forests in the region.

One such exploration is a study in which researchers, including McCauley, use computer simulations to see how the carbon cycle and wildfire severity between 2010 and 2099 would be influenced by different rates of restoration across about 1,500 square miles of forest.

A potential drawback to a very rapid restoration plan is that it includes the thinning out (harvesting) of dense, dry trees -- possibly by controlled burns -- to get rid of plants that could act as potential wildfire fuel. Reduction in overall vegetation could mean that the overall carbon uptake and storage of these forests would drop.

"The conventional wisdom has been that forest restoration in the western U.S. does not benefit carbon stocks," McCauley says. "However, with wildfire size, frequency and severity increasing, we believe that additional research is needed across more forests so that we can better understand the fate of carbon and forest cover, particularly for fire-adapted forests where tree densities exceed historical norms and the risks of climate-induced forest loss are increasing."

Interestingly, the simulations show that despite early decreases in the ecosystem's stored carbon, a rapid restoration plan increases total carbon storage by 11-20%, or about 8-14 million metric tons of carbon, by the end of the century. This is equivalent to removing the carbon emissions of 67,000-123,000 passenger vehicles per year through 2100.
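As a rough back-of-the-envelope check of that vehicle-equivalence figure, the sketch below spreads the extra stored carbon over the simulation period, converts it to CO2, and divides by a per-vehicle emission rate. The ~90-year horizon and the EPA estimate of roughly 4.6 metric tons of CO2 per passenger vehicle per year are assumptions, not figures stated in the release.

```python
# Rough back-of-the-envelope check of the vehicle-equivalence figure.
# Assumptions (not stated in the release): a ~90-year horizon (2010-2099)
# and ~4.6 metric tons CO2 per passenger vehicle per year (EPA estimate).

CO2_PER_C = 44.0 / 12.0          # convert tons of carbon to tons of CO2
CO2_PER_VEHICLE_YEAR = 4.6       # metric tons CO2 per passenger vehicle per year
YEARS = 90                       # simulation horizon, 2010-2099

def vehicle_equivalents(extra_carbon_tons):
    """Passenger vehicles whose annual emissions match the extra carbon stored."""
    carbon_per_year = extra_carbon_tons / YEARS
    co2_per_year = carbon_per_year * CO2_PER_C
    return co2_per_year / CO2_PER_VEHICLE_YEAR

for tons in (8e6, 14e6):  # 8-14 million metric tons of carbon from the study
    print(f"{tons:.0e} t C -> about {vehicle_equivalents(tons):,.0f} vehicles per year")
```

Run with the study's 8-14 million metric tons, this gives roughly 71,000-124,000 vehicles per year, in line with the range quoted above.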

"By minimizing high-severity fires," McCauley explains, "accelerated forest thinning can stabilize forest carbon stocks and buy time - decades - to better adapt to the effects of climate change on forest cover."

Restored forests provide benefits beyond increased carbon storage over the next century. A restored fire-adapted forest would be less dense, with fewer trees but more diversity, allowing more sunlight to penetrate the canopy, increasing grass cover and encouraging a more diverse understory. The wildfires that do occur would burn at lower severity as ground fires that consume grasses, rather than crown fires that torch canopies and kill trees.

McCauley says this study is unique because it is a large, landscape-scale study that uses data from a real-world restoration project -- the largest restoration being implemented in the U.S. The results are promising, indicating that restoration is likely to stabilize carbon and that the benefits are greater when the pace of restoration is faster.

Credit: 
Ecological Society of America

Study of deceased football players with CTE examines contributors associated with dementia

What The Study Did: This study of 180 deceased former football players who had chronic traumatic encephalopathy (CTE) investigated the association of brain white matter pathologic changes and cerebrovascular disease with dementia.

Authors: Ann C. McKee, M.D., of Boston University, is the corresponding author.

(doi:10.1001/jamaneurol.2019.2244)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

# # #

Media Advisory: The full study and editorial are linked to this news release.

Embed this link to provide your readers free access to the full-text article This link will be live at the embargo time: https://jamanetwork.com/journals/jamaneurology/fullarticle/2739480?guestAccessKey=9aa67571-51c7-4aaa-be07-d23b71970b00&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=080519

Credit: 
JAMA Network

Industry payments to physician directors of NCI-designated cancer centers

What The Study Did: Data from the Centers for Medicare & Medicaid Services were used to examine industry payments to physician directors of National Cancer Institute-designated cancer centers in this research letter.

Authors: David Carr, M.D., of the University of California, San Diego, is the corresponding author.

(doi:10.1001/jamainternmed.2019.3098)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study is linked to this news release.

Embed this link to provide your readers free access to the full-text article This link will be live at the embargo time: https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2740204?guestAccessKey=948cb158-b3c6-45a5-a565-35ef8c3e75a5&utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=080519

Credit: 
JAMA Network

New Zealand's biodiversity will take millions of years to recover

image: This is a kaka, an endangered parrot species endemic to New Zealand. This kaka picture was selected for the cover of Current Biology.

Image: 
Juan Carlos Garcia

The arrival of humans in New Zealand, some 700 years ago, triggered a wave of extinction among native bird species. Many more species are currently under threat. Recent calculations by scientists from the University of Groningen in the Netherlands and Massey University in New Zealand show that it would take at least 50 million years of evolution to restore the biodiversity that has been lost. Their results were published on 5 August in the journal Current Biology.

Alfred Russel Wallace (a 19th-century naturalist who independently formulated a theory of evolution around the same time as Darwin) described New Zealand's biota as 'wonderfully isolated'. Until the first Maori arrived on the islands (in around 1280 CE), the only native mammals were bats. However, the Maori and the Europeans (who reached the islands in 1642) set in motion a wave of extinction that wiped out over 70 bird species, including the iconic giant moa.

Genetic data

But how extensive is the damage to this country's biodiversity? Dr. Luis Valente is an evolutionary biologist employed by the Naturalis Biodiversity Center (a museum of natural history) in Leiden, the Netherlands. He was recently appointed assistant professor at the University of Groningen. Several years ago, together with Professor Rampal Etienne (also from the University of Groningen), Dr. Valente developed a mathematical model to explore this question. The model can be used to estimate how long it will take evolution to restore biodiversity to its original level on these islands.

The model is based on the premise that island biodiversity eventually reaches an equilibrium. However, this idea has been challenged in recent years. Its critics suggest that the dynamics of island environments prevent equilibrium from ever being reached. In previous work, on a Caribbean island, Dr. Valente and Prof. Etienne used genetic data from extinct bat species to show that biodiversity levels had veered away from a state of equilibrium. They calculated that it would take the surviving bat population eight million years of evolution to recover.
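As a much-simplified illustration of why such recovery times run to millions of years (this is not the authors' island-biogeography model, which also accounts for colonization, extinction and equilibrium dynamics), a pure-birth approximation with a constant, low per-lineage speciation rate already yields waiting times of that order; the rate used below is a hypothetical placeholder.

```python
import math

# Much-simplified illustration, not the authors' island-biogeography model:
# under a pure-birth (Yule) approximation with a constant per-lineage
# speciation rate, the expected time for n surviving lineages to diversify
# back to N species is sum_{k=n}^{N-1} 1/(k*rate) ~= ln(N/n)/rate.
# The rate below is a hypothetical placeholder, not an estimate from the paper.

def expected_recovery_time(n_surviving, n_original, speciation_rate):
    """Expected waiting time (in the rate's time units) to regrow from n to N lineages."""
    return sum(1.0 / (k * speciation_rate) for k in range(n_surviving, n_original))

# e.g. regrowing from 30 to 100 lineages at 0.1 speciation events per lineage
# per million years takes on the order of 12 million years
print(expected_recovery_time(30, 100, speciation_rate=0.1))
print(math.log(100 / 30) / 0.1)  # closed-form approximation, ~12 My
```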

Kakapo

The researchers have since performed similar calculations on bird species in New Zealand. 'We have sequenced parts of the genome from nearly all these species, including many extinct ones', explains Luis Valente. Computer simulations show that bird biodiversity will take around 50 million years to recover from the extinctions that followed the arrival of humans. However, a number of bird species, like the kakapo and several species of kiwi, are now classified as 'endangered' or 'critically endangered'. If they go extinct, this would add another six million years to the recovery time. And if species that are currently classified as 'vulnerable' were also to disappear, this would add a further 10 million years to that figure.

It should be noted that here, the term 'recovery' is used to mean the point at which evolution has returned the number of species to its original level. However, these newly evolved species will not be identical to those that have been lost. As Dr. Valente points out, 'I am also working on Mauritius, where the dodo used to live. While evolution will not bring the dodo back, it was related to pigeons, so a new bird species might eventually evolve from the island's present-day pigeon population.'

Conservation

So what is the use of knowing the recovery time of an island ecosystem? 'Well, one practical consideration is that it enables us to quantify the ecological damage that has been done. Given the limited resources available for conservation, these numbers could show us how to achieve the best cost-benefit ratio, in terms of investing in protecting endangered species', says Luis Valente. 'Using our model, we can determine the natural rates of change on islands. If two different islands each have an equal number of endangered species, then we might need to focus our conservation efforts on the one with the longer recovery time.' He adds that recovery time is just one of the many factors that need to be considered when establishing conservation priorities.

Dr. Valente and his co-workers are planning to extend their studies to other islands, to further their understanding of evolutionary dynamics. 'This method would be useful for other isolated ecosystems as well, such as mountains or lakes.'

Credit: 
University of Groningen

Improving the magnetic bottle that controls fusion power on Earth

image: Physicist Nate Ferraro.

Image: 
Elle Starkman/PPPL Office of Communications

Scientists who use magnetic fields to bottle up and control, on Earth, the fusion reactions that power the sun and stars must correct any errors in the shape of the fields that confine the reactions. Such errors produce deviations from the symmetrical form of the fields in doughnut-like tokamak fusion facilities, deviations that can have a damaging impact on the stability and confinement of the hot, charged plasma gas that fuels the reactions.

Researchers led by scientists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) have found clear evidence of the presence of error fields in the initial 10-week run of the National Spherical Torus Experiment--Upgrade (NSTX-U), the flagship fusion facility at the laboratory. The exhaustive detection method they used could provide lessons for error correction in future fusion devices such as ITER, the large international fusion facility under construction in France to demonstrate the practicality of controlled fusion energy.

Fusion powers the sun and stars

Fusion, the power that drives the sun and stars, is the fusing of light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- that generates massive amounts of energy. Scientists around the world are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

At PPPL, researchers have put together a combination of experimental data, detailed measurement of the position of the magnets, and computer modeling of the response of the plasma to locate the source of the NSTX-U error fields. The analysis uncovered a spectrum of small error fields -- an inevitable result of the fact that a tokamak cannot be perfectly symmetrical -- but most had an easily correctible impact on the plasma. However, one major find stood out: a slight misalignment of the magnetic coils that run down the center of the tokamak and produce the fields that wrap horizontally -- or "toroidally" -- around the interior of the vessel.

The clue scientists sought

This misalignment was the clue the scientists had sought. "We looked for the source of the error with the biggest impact on the plasma," said physicist Nate Ferraro, first author of the research that reported the search and discovery in Nuclear Fusion. "What we found was a small misalignment of the center-stack coils with the casing that encloses them."

The slight misalignment generated errors that resonated in the behavior of the plasma. Among the issues was a braking and locking effect that kept the edge of the plasma from rotating, and increased localized heating on plasma-facing components inside the tokamak.

Discovery of the misalignment followed shut-down of the tokamak for ongoing repairs in the wake of a coil failure. The misalignment findings are now being used "to drive new engineering tolerance requirements for NSTX-U as it is rebuilt," the researchers said. Such requirements call for tighter tolerance between the center stack and the casing that encloses it. The tighter tolerance would narrow the deviation from optimal alignment of the two components to less than two one-hundredths of an inch along the vertical axis of the center stack.

The adjustment would alleviate concerns about increased localized heating and would reduce the magnetic braking and locking, according to the authors. Such developments would thereby improve the stability of the plasma. "Every tokamak is concerned about error fields," Ferraro said. "What we are trying to do is optimize the NSTX-U."

Partnership with experiments

The findings demonstrate the relationship between the PPPL Theory Department and the NSTX-U experiment, said Amitava Bhattacharjee, who heads Theory. "This is an excellent example of the NSTX-U-Theory Partnership program that has been beneficial for both the NSTX-U and Theory Departments at PPPL, and which continues even when NSTX-U is in recovery," Bhattacharjee said.

Members of the research team included scientists from PPPL, Sandia National Laboratory, General Atomics and Oak Ridge National Laboratory. The DOE Office of Science funded the work.

Credit: 
DOE/Princeton Plasma Physics Laboratory

Machine learning classifies word type based on brain activity

image: This is an activity comparison for real (left) and pseudo (right) words.

Image: 
Jensen et al., eNeuro 2019

Pairing machine learning with neuroimaging can determine whether a person heard a real or made-up word based on their brain activity, according to a new study published in eNeuro. These results lay the groundwork for investigating language processing in the brain and developing an imaging-based tool to assess language impairments.

Many brain injuries and disorders cause language impairments that are difficult to assess with standard language tasks because the patient is unresponsive or uncooperative, creating a need for a task-free diagnostic method. Using magnetoencephalography, Mads Jensen, Rasha Hyder, and Yury Shtyrov from Aarhus University examined the brain activity of participants while they listened to audio recordings of both similar-sounding real words with different meanings and made-up "pseudowords." The participants were instructed to ignore the words and focus on a silent film.

Using machine learning algorithms, the team was able to determine from brain activity whether a participant was hearing a real or made-up word, whether a word was grammatically correct or incorrect, and the word's meaning. They also identified specific brain regions and frequencies responsible for processing different types of language.
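The release does not detail the decoding pipeline, but a minimal sketch of this kind of analysis, assuming the MEG recordings have already been epoched into a trials-by-features matrix with word-type labels and that NumPy and scikit-learn are available, might look like the following (the data here are random placeholders, so the printed accuracy will hover around chance):

```python
# Minimal sketch of a word-type decoding analysis of the kind described,
# not the authors' actual pipeline. Assumes MEG responses have already been
# epoched and flattened into a feature matrix X (trials x sensor-time features)
# and a label vector y (0 = pseudoword, 1 = real word).

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 306 * 10))   # placeholder: 200 trials, 306 sensors x 10 time points
y = rng.integers(0, 2, size=200)       # placeholder labels: real vs. pseudoword

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated decoding accuracy
print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```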

Credit: 
Society for Neuroscience

Raman spectroscopy poised to make thyroid cancer diagnosis less invasive

WASHINGTON -- Researchers have demonstrated that an optical technique known as Raman spectroscopy can be used to differentiate between benign and cancerous thyroid cells. The new study shows Raman spectroscopy's potential as a tool to improve the diagnosis of thyroid cancer, which is the ninth most common cancer with more than 50,000 new cases diagnosed in the United States each year.

"Our encouraging results show that Raman spectroscopy could be developed into a new optical modality that can help avoid invasive procedures used to diagnose thyroid cancer by providing biochemical information that isn't currently accessible," said James W. Chan from the University of California, Davis, U.S.A. "This could have a major impact in the field of pathology and could lead to new ways to diagnose other diseases."

In The Optical Society (OSA) journal Biomedical Optics Express, a multidisciplinary team led by Chan; Michael J. Campbell, University of California, Davis, U.S.A.; and Eric C. Huang, University of Washington, Seattle, U.S.A., report that their Raman spectroscopy approach can distinguish between healthy and cancerous human thyroid cells with 97 percent accuracy.

"We are the first, to our knowledge, to use human clinical thyroid cells to show that Raman spectroscopy can identify cancer subtypes at the single cell level," said Chan. "However, we will need to increase both the number of cells and number of patients studied to confirm the accuracy of the Raman technique."

Improving cell-based diagnosis

A lump -- or nodule -- in the neck is a common symptom of thyroid cancer. However, most thyroid nodules aren't cancerous. Ultrasound-guided fine needle aspiration biopsies are typically used to check for cancer by inserting a thin needle into the nodule to obtain cells that are prepared on a microscope slide, stained and analyzed by a pathologist.

For about 15 to 30 percent of cases, the pathologist cannot determine whether cells acquired from the biopsy are benign or malignant. For these cases, a surgical procedure known as a thyroidectomy is required to remove tissue, which provides more information for a more accurate diagnosis.

The researchers turned to Raman spectroscopy as a possible solution because it is a non-invasive technique that requires no sample preparation or staining to determine subtle differences in the molecular composition of complex samples such as cells.

"We would like to use Raman spectroscopy to improve the pathologist's analysis of the cells obtained with fine needle aspiration to reduce the number of thyroidectomies necessary," said Chan. "This would both minimize surgical complications and reduce health care costs."

Biochemical information from an entire cell

For the new study, the researchers used a line-scan Raman microscope that rapidly acquires Raman signals from an entire cell volume. This allowed them to capture the chemical composition of whole cells more accurately than approaches that acquire a Raman spectrum from only part of a cell's volume. Multivariate statistical and classification methods were then used to analyze the Raman data and classify the cells in an objective, unbiased manner.
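The exact analysis is not specified in the release, but a generic sketch of such a multivariate workflow on single-cell spectra, assuming NumPy and scikit-learn and using synthetic placeholder spectra, could look like this:

```python
# Illustrative sketch of a generic multivariate classification workflow for
# single-cell Raman spectra (PCA for dimensionality reduction, then a
# cross-validated classifier). This mirrors the type of analysis described,
# not the authors' exact methods; the spectra below are synthetic placeholders.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
spectra = rng.normal(size=(120, 1024))   # 120 cells x 1024 Raman wavenumber bins (placeholder)
labels = rng.integers(0, 2, size=120)    # 0 = benign, 1 = malignant (placeholder)

model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
accuracy = cross_val_score(model, spectra, labels, cv=5).mean()
print(f"cross-validated accuracy: {accuracy:.2f}")
```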

The researchers applied this Raman spectroscopy approach to individual cells isolated from 10 patient thyroid nodules diagnosed as benign or cancerous. The data analysis identified unique spectral differences that could distinguish cancerous cells from benign with 97 percent diagnostic accuracy. They also showed that other subtypes could be identified by their spectral differences.

"These preliminary results are exciting because they involve single cells from human clinical samples, but more work will need to be done to take this from a research project to final clinical use," said Chan.

In addition to testing it on more cells and patients, the researchers also need to apply the technique to cells obtained with a fine needle aspiration and test it on samples for which the pathologist can't determine if the cells are benign or cancerous. They also want to develop an automated prototype system that can perform the Raman measurements and analysis with minimal human intervention.

Credit: 
Optica

Quantum entanglement in chemical reactions? Now there's a way to find out

image: Purdue researchers have modified a popular theorem for identifying quantum entanglement and applied it to chemical reactions. This quantum simulation of a chemical reaction yielding deuterium hydride validated the new method.

Image: 
Purdue University image/Junxu Li

WEST LAFAYETTE, Ind. -- Scientists have long suspected that a quantum phenomenon might play a role in photosynthesis and other chemical reactions of nature, but don't know for sure because such a phenomenon is so difficult to identify.

Purdue University researchers have demonstrated a new way to measure the phenomenon of entanglement in chemical reactions - the ability of quantum particles to maintain a special correlation with each other over a large distance.

Uncovering exactly how chemical reactions work could bring ways to mimic or recreate them in new technologies, such as for designing better solar energy systems.

The study, published on Friday (Aug. 2) in Science Advances, generalizes a popular theorem called "Bell's inequality" to identify entanglement in chemical reactions. In addition to theoretical arguments, the researchers also validated the generalized inequality through a quantum simulation.

"No one has experimentally shown entanglement in chemical reactions yet because we haven't had a way to measure it. For the first time, we have a practical way to measure it," said Sabre Kais, a professor of chemistry at Purdue. "The question now is, can we use entanglement to our advantage to predict and control the outcome of chemical reactions?"

Since 1964, Bell's inequality has been widely validated and serves as a go-to test for identifying entanglement that can be described with discrete measurements, such as measuring the orientation of the spin of a quantum particle and then determining if that measurement is correlated with another particle's spin. If a system violates the inequality, then entanglement exists.
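For reference, the discrete form of the test works as sketched below: compute the CHSH combination of spin correlations and check whether its magnitude exceeds 2. The singlet-state correlation E(a, b) = -cos(a - b) and the measurement angles chosen are the textbook example of maximal quantum violation; the paper's generalization to continuous measurements is not reproduced here.

```python
import math

# Standard discrete (CHSH) form of Bell's inequality, illustrated with the
# quantum-mechanical singlet-state correlation E(a, b) = -cos(a - b).
# Any local hidden-variable model satisfies |S| <= 2; the angles below give
# the maximal quantum violation, |S| = 2*sqrt(2).

def E(a, b):
    """Spin-correlation for measurement angles a and b (singlet state)."""
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), "violates 2?", abs(S) > 2)   # |S| = 2*sqrt(2) ~ 2.83 -> entanglement
```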

But describing entanglement in chemical reactions requires continuous measurements, such as the various angles of beams that scatter the reactants and force them to contact and transform into products. How the inputs are prepared determines the outputs of a chemical reaction.

Kais' team generalized Bell's inequality to include continuous measurements in chemical reactions. Previously, the theorem had been generalized to continuous measurements in photonic systems.

The team tested the generalized Bell's inequality in a quantum simulation of a chemical reaction yielding the molecule deuterium hydride, building off of an experiment by Stanford University researchers that aimed to study the quantum states of molecular interactions, published in 2018 in Nature Chemistry.

Because the simulations validated the generalized Bell's inequality and showed that entanglement can be classified in chemical reactions, Kais' team proposes to further test the method on deuterium hydride in an experiment.

"We don't yet know what outputs we can control by taking advantage of entanglement in a chemical reaction - just that these outputs will be different," Kais said. "Making entanglement measurable in these systems is an important first step."

Credit: 
Purdue University

New wood membrane provides sustainable alternative for water filtration

image: Schematic of the process of using the new wood membrane to distill water.

Image: 
T. Li, University of Maryland.

Inspired by the intricate system of water circulating in a tree, a team of researchers led by Princeton University has figured out how to use a thin slice of wood as a membrane through which water vapor can evaporate, leaving behind salt or other contaminants.

Most membranes used to distill fresh water from salt water are made of polymers, which are derived from fossil fuels and are difficult to recycle. The wood membrane is a more sustainable material and, according to the researchers, has very high porosity, which promotes water vapor transport and prevents heat loss. In a paper published Aug. 2 in the journal Science Advances, the researchers demonstrate that the new membrane they designed performs 20% better than commercial membranes in water distillation tests.

Credit: 
Princeton University, Engineering School

NASA catches birth of Northwestern Pacific's Tropical Storm Francisco

image: On August 2 at 8:20 a.m. EDT (1220 UTC) the MODIS instrument that flies aboard NASA's Terra satellite showed strongest storms (yellow) in Tropical Storm Francisco around the center and in fragmented bands of thunderstorms circling the center, where cloud top temperatures were as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Soon after Tropical Storm Francisco developed in the Northwestern Pacific Ocean, NASA's Terra satellite passed overhead.

NASA's Terra satellite used infrared light to gather temperature information from newly developed Tropical Storm Francisco. The strongest thunderstorms reach high into the atmosphere and have the coldest cloud top temperatures.

On August 2 at 8:20 a.m. EDT (1220 UTC), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Terra satellite found cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall. Those strongest storms were seen around the center and in fragmented bands of thunderstorms circling the center. Those cold temperatures were also found in a large band of thunderstorms feeding into the center from the east.

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Francisco was located near latitude 21.7 degrees north and longitude 151.2 degrees east. That's about 576 nautical miles east-southeast of Iwo To island, Japan. Francisco is moving toward the northwest. Maximum sustained winds are near 40 knots (46 mph/74 kph) with higher gusts.

On the forecast track, the Joint Typhoon Warning Center expects Francisco to move northwest toward southwestern Japan. Francisco is expected to strengthen to 85 knots (98 mph/157 kph) before making landfall in Kyushu by August 6, with a second landfall in South Korea the next day.

For updated forecasts, visit: https://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Study estimates frailty, prefrailty among older adults

What The Study Did: This study (called a systematic review and meta-analysis) combined the results of 46 observational studies involving nearly 121,000 nonfrail adults (60 or older from 28 countries) and estimated the rate of new cases of frailty and prefrailty, which is a high risk of progressing to frailty.

Authors: Danny Liew, M.B.B.S.(Hons), F.R.A.C.P., Ph.D., of Monash University in Melbourne, Australia, is the corresponding author.

(doi:10.1001/jamanetworkopen.2019.8398)

Editor's Note: The article includes conflict of interest disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

#  #  #

Media advisory: The full study and commentary are linked to this news release.

Embed this link to provide your readers free access to the full-text article This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2019.8398?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=080219

About JAMA Network Open: JAMA Network Open is the new online-only open access general medical journal from the JAMA Network. Every Wednesday and Friday, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Credit: 
JAMA Network

Paradoxical outcomes for Zika-exposed tots

image: 'Sequential neuroimaging of the fetus and newborn with in utero Zika virus exposure.'
Sarah B. Mulkey, M.D., Ph.D., et al. Published Jan. 2019 in JAMA Pediatrics.

Image: 
IMAGE COURTESY OF JAMA NETWORK. COPYRIGHT AMERICAN MEDICAL ASSOCIATION https://jamanetwork.com/journals/jamapediatrics/article-abstract/2716801

In the midst of an unprecedented Zika crisis in Brazil, there were a few flickers of hope: Some babies appeared to be normal at birth, free of devastating birth defects that affected other Brazilian children exposed to the virus in utero. But according to a study published online July 8, 2019, in Nature Medicine and an accompanying commentary co-written by a Children's National clinician-researcher, the reality for Zika-exposed infants is much more complicated.

Study authors led by Karin Nielsen-Saines at David Geffen UCLA School of Medicine followed 216 infants in Rio de Janeiro who had been exposed to the Zika virus during pregnancy, performing neurodevelopmental testing when the babies ranged in age from 7 to 32 months. These infants' mothers had had Zika-related symptoms themselves, including rash.

Although many children had normal assessments, 29% scored below average in at least one domain of neurological development, including cognitive performance, fine and gross motor skills and expressive language, Sarah B. Mulkey, M.D., Ph.D., and a colleague write in a companion commentary published online by Nature Medicine July 29, 2019.

The study authors found progressively higher risks of developmental, hearing and eye abnormalities the earlier in pregnancy the infants were exposed. Because the Zika virus has an affinity for immature neurons, even babies who were not born with microcephaly remained at risk of developing abnormalities.

Of note, 24 of 49 (49%) infants who had abnormalities at birth went on to have normal test results in the second or third year of life. By contrast, 17 of 68 infants (25%) who had normal assessments at birth had below-average developmental testing or had abnormalities in hearing or vision by age 32 months.

"This work follows babies who were born in 2015 and 2016. It's heartening that some babies born with abnormalities tested in the normal range later in life, though it's unclear whether any specific interventions help to deliver these positive findings," says Dr. Mulkey, a fetal-neonatal neurologist in the Division of Fetal and Transitional Medicine at Children's National in Washington, D.C. "And it's quite sobering that babies who appeared normal at birth went on to develop abnormalities due to that early Zika exposure."

It's unclear how closely the findings apply to the vast majority of U.S. women whose Zika infections were asymptomatic.

"This study adds to the growing body of research that argues in favor of ongoing follow-up for Zika-exposed children, even if their neurologic exams were reassuring at birth," Dr. Mulkey adds. "As Zika-exposed children approach school age, it's critical to better characterize the potential implications for the education system and public health."

Credit: 
Children's National Hospital