Fracture risk associated with bisphosphonate drug holidays

Bisphosphonates have been shown to reduce the risk of osteoporotic fractures. To avoid possible side effects of long-term therapy, many patients take a drug holiday after several years of bisphosphonate therapy. A team from the Department of Medical Informatics, Biometry and Epidemiology at Ruhr-Universität Bochum has examined the implications of such drug holidays on fracture risk. They found that in patients who had previously suffered a vertebral fracture, a longer bisphosphonate drug holiday was associated with an increase in the risk of so-called major osteoporotic fractures (MOFs). These are clinical spine, hip, shoulder (upper arm) or forearm fractures. The research team published its findings in the journal Bone on 15 May 2020.

Rare but severe side effects

Long-term treatment with bisphosphonates is associated with an increased risk of rare but serious side effects, especially osteonecrosis of the jaw and certain fractures of the thigh. "Residual effects on bone metabolism appear to persist for some time after bisphosphonates are discontinued," says Professor Johannes Pfeilschifter. "However, the number of studies that have investigated the risks and benefits of bisphosphonate drug holidays is limited."

Two years of interviews

In order to obtain additional information, the research team interviewed patients who had been treated with these drugs in the preceding four or more years. Five telephone interviews were conducted over a period of two years. The analysis included the observations of 1,973 participants who had been recruited from physicians' practices and clinics throughout Germany.

Rise in fracture risk in patients with previous vertebral fractures

In a simple two-group comparison, the research team found no difference in fracture risk between patients whose therapy was paused and those who continued bisphosphonate therapy. For a more detailed analysis of the changes in fracture risk in relation to the time since the start of a drug holiday, the researchers used a method that represented the therapy status of each patient at any time in the past year as a moving average. "This approach enables us to compare the risk of fractures between different periods of time since the therapy was interrupted and minimises the systematic bias related to the higher probability of taking a drug holiday in patients with a lower risk of fractures," says Dr. Henrik Rudolf, who performed the statistical analyses.
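The moving-average representation of therapy status described above can be sketched as follows. This is an illustrative reconstruction, not the study's exact specification: the 12-month window, the monthly on/off coding, and the function name are all assumptions.

```python
# Hypothetical sketch: represent each patient's bisphosphonate exposure over
# the past year as a moving average of monthly on-therapy indicators
# (1 = on therapy, 0 = drug holiday). Window length and coding are
# illustrative assumptions, not taken from the published analysis.

def exposure_moving_average(monthly_on_therapy, window=12):
    """For each month, return the fraction of the preceding `window`
    months (including the current one) the patient spent on therapy."""
    averages = []
    for t in range(len(monthly_on_therapy)):
        start = max(0, t - window + 1)
        recent = monthly_on_therapy[start:t + 1]
        averages.append(sum(recent) / len(recent))
    return averages

# A patient on therapy for 6 months, then on a drug holiday for 6 months:
status = [1] * 6 + [0] * 6
print(exposure_moving_average(status))
```

A time-varying covariate of this kind lets the model compare fracture risk across different durations since the start of a drug holiday, rather than a crude on/off comparison.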

The analyses suggest that longer drug holidays from bisphosphonates are associated with at least a partial loss of their protective effect against fractures. They also revealed a so-called interaction, meaning that a patient characteristic modified how the relative risk of fractures changed with increasing holiday length: "In patients who had already had vertebral fractures, the risk of MOFs more than 12 months after the therapy was paused was 3.5 times higher than the risk in the second half of the first year of the drug holiday," explains Professor Hans Joachim Trampisch, senior professor at the Department of Medical Informatics, Biometry and Epidemiology. The corresponding estimate was considerably lower in patients without a previous vertebral fracture. The sample size was too small for adequate estimates of how fracture risk changed with respect to other relevant risk factors in patients without previous vertebral fractures.

"The findings of our study should be considered in the context of all study reports on long-term therapy with bisphosphonates and bisphosphonate drug holidays," emphasizes Johannes Pfeilschifter. "The decision regarding further management of osteoporosis in patients on bisphosphonate therapy should be made individually for each patient based on the benefits and potential risks of the available treatment options, and should be re-evaluated on a periodic basis."

Study limitations

The researchers point out that, because different bisphosphonates were pooled for the analysis, the study does not allow any drug-specific conclusions to be drawn about fracture risk associated with bisphosphonate holidays. Due to the limited number of corresponding observations, the study also provides little information on bisphosphonate drug holidays in men and in patients who have been treated with bisphosphonates for more than ten years.

Credit: 
Ruhr-University Bochum

NASA's Terra Satellite finds no strong storms left in Tropical Storm Douglas  

image: On July 29 at 5:45 a.m. EDT (0945 UTC), the MODIS instrument that flies aboard NASA's Terra satellite gathered infrared data on Douglas showing that persistent south to southwest vertical wind shear had taken its toll on the storm. There were no strong thunderstorms remaining.

Image: 
NASA/NRL

Strong wind shear has been the undoing of Tropical Storm Douglas. Infrared data from NASA's Terra satellite revealed that the tropical cyclone was devoid of strong storms, indicating that wind shear had weakened it.

One Warning Remains for Douglas

On July 29, a Tropical Storm Warning is in effect for portions of the Papahanaumokuakea Marine National Monument from Maro Reef to Lisianski.

NASA's Terra Satellite Reveals Effects of Wind Shear 

NASA's Terra satellite uses infrared light to analyze the strength of storms by providing temperature information about the system's clouds. The strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures. However, Douglas showed no very cold cloud tops on July 29 at 5:45 a.m. EDT (0945 UTC) when it was imaged by the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Terra satellite.
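The analysis described here amounts to checking infrared brightness temperatures against a cold-cloud-top threshold. A minimal sketch of that idea, assuming a -70 C cutoff (a common rule of thumb for very strong thunderstorms, not a value stated in this release):

```python
# Illustrative sketch: flag strong convection in infrared satellite data by
# thresholding cloud-top brightness temperatures. The -70 C cutoff is an
# assumed rule of thumb, not a value from this release.

STRONG_STORM_THRESHOLD_C = -70.0

def has_strong_storms(cloud_top_temps_c):
    """Return True if any cloud-top temperature is colder than the threshold."""
    return any(t < STRONG_STORM_THRESHOLD_C for t in cloud_top_temps_c)

# Relatively warm cloud tops only, as MODIS saw in Douglas -> no strong storms:
print(has_strong_storms([-35.0, -28.5, -41.2]))  # False
```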

Wind shear had sapped the strength of the storm and prevented strong thunderstorms from forming. Tropical cyclones are made up of hundreds of thunderstorms, and when there are no strong storms present in satellite imagery, it is a sure sign of weakening.

At 5 a.m. EDT (0900 UTC) on July 29/11 p.m. HST on July 28, NOAA's Central Pacific Hurricane Center (CPHC) in Honolulu, Hawaii noted, "Due to persistent southerly vertical wind shear, Douglas has been devoid of deep convection for nearly 24 hours, and it appears that it will soon be a post-tropical remnant low."

About Wind Shear  

The shape of a tropical cyclone provides forecasters with an idea of its organization and strength. Wind shear occurs when outside winds batter a storm and change its shape, pushing much of the associated clouds and rain to one side of it.

In general, wind shear is a measure of how the speed and direction of winds change with altitude. Tropical cyclones are like rotating cylinders of winds. Each level needs to be stacked on top of the others vertically in order for the storm to maintain strength or intensify. Wind shear occurs when winds at different levels of the atmosphere push against the rotating cylinder of winds, weakening the rotation by pushing it apart at different levels. Southwesterly wind shear was pushing the bulk of Douglas' clouds to the north-northeast of the center.
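Since wind shear is defined above as the change in wind speed and direction with altitude, it can be quantified as the magnitude of the vector difference between the horizontal wind at two levels. A small sketch under that definition; the sample wind values are made up for illustration:

```python
import math

# Illustrative sketch: vertical wind shear as the magnitude of the vector
# difference between the horizontal wind at a low and a high level
# (forecasters often use the 850 hPa and 200 hPa levels for tropical
# cyclones). u = eastward component, v = northward component.

def wind_shear(u_low, v_low, u_high, v_high):
    """Magnitude of the vector wind difference between two levels
    (same units in, same units out)."""
    du = u_high - u_low
    dv = v_high - v_low
    return math.hypot(du, dv)

# Example (made-up values, in knots): weak southwesterly flow at low levels,
# much stronger southwesterly flow aloft gives roughly 20 kt of shear --
# enough to disrupt a weak tropical cyclone.
shear = wind_shear(3.5, 3.5, 17.7, 17.7)
print(round(shear, 1))
```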

Status of Tropical Storm Douglas on July 29, 2020

At 8 a.m. EDT (2 a.m. HST/1200 UTC), the center of Tropical Storm Douglas was located near latitude 24.7 degrees north and longitude 174.3 degrees west, about 320 miles (515 km) southeast of Midway Island. Douglas was moving toward the west near 23 mph (37 km/h), and this general motion was expected to continue until Douglas crossed the International Date Line in about 24 hours. Maximum sustained winds were near 40 mph (65 km/h) with higher gusts. The estimated minimum central pressure was 1010 millibars.

Forecast for Douglas

NOAA's CPHC said, "Large seas and swells generated by Douglas will impact portions of the Papahanaumokuakea Marine National Monument west of Maro Reef through Wednesday. These swells may produce large breaking waves that could inundate some of the lower-lying atolls. Rainfall associated with Douglas will impact portions of the Papahanaumokuakea Marine National Monument west of Maro Reef through Wednesday."

Weakening is forecast during the next two days, and Douglas is expected to dissipate shortly after crossing the Date Line.

NASA Researches Tropical Cyclones

Hurricanes/tropical cyclones are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For more than five decades, NASA has used the vantage point of space to understand and explore our home planet, improve lives and safeguard our future. NASA brings together technology, science, and unique global Earth observations to provide societal benefits and strengthen our nation. Advancing knowledge of our home planet contributes directly to America's leadership in space and scientific exploration.

For updated forecasts, visit: http://www.nhc.noaa.gov

By Rob Gutro
NASA's Goddard Space Flight Center

Credit: 
NASA/Goddard Space Flight Center

Butterfly genomics: Monarchs migrate and fly differently, but meet up and mate

image: Tens to hundreds of millions of monarchs blanket the trees and landscape of central Mexico through the winter.

Image: 
Jaap de Roode

Each year, millions of monarch butterflies migrate across eastern North America to fly from as far north as the U.S.-Canadian border to overwinter in central Mexico -- covering as much as 3,000 miles. Meanwhile, on the other side of the Rocky Mountains, western monarchs generally fly 300 miles down to the Pacific Coast to spend the winter in California. It was long believed that the eastern and western monarchs were genetically distinct populations.

A new study, however, confirms that while the eastern and western butterflies fly differently, they are genetically the same. The journal Molecular Ecology published the findings, led by evolutionary biologists at Emory University.

"It was surprising," says Jaap de Roode, Emory professor of biology and senior author of the study. His lab is one of a handful in the world that studies monarch butterflies.

"You would expect that organisms with different behaviors and ecologies would show some genetic differences," de Roode says. "But we found that you cannot distinguish genetically between the western and eastern butterflies."

The current paper builds on previous work by the de Roode lab that found similarities between 11 genetic markers of the eastern and western monarchs, as well as other more limited genetic studies and observational and tracking data.

"This is the first genome-wide comparison of eastern and western monarchs to try to understand their behavioral differences better," says Venkat Talla, first author of the current study and an Emory post-doctoral fellow in the lab.

Talla analyzed more than 20 million DNA mutations in 43 monarch genomes and found no evidence for genomic differentiation between eastern and western monarchs. Instead, he found identical levels of genetic diversity.
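A finding of "no genomic differentiation" is typically expressed with a statistic such as F_ST, which is near zero when two populations are genetically indistinguishable. The study's exact statistics are not given here, so the following Hudson-style per-site estimator is purely illustrative:

```python
# Hedged illustration: F_ST is a standard measure of genetic differentiation
# between two populations; values near zero, as reported for eastern vs.
# western monarchs, mean the populations are effectively indistinguishable.
# This simple per-site Hudson-style estimator is an assumption for
# illustration, not the statistic used in the study.

def fst(p1, p2, n1, n2):
    """Hudson's F_ST estimator for a single biallelic site.
    p1, p2: allele frequencies in each population;
    n1, n2: number of sampled allele copies in each population."""
    num = ((p1 - p2) ** 2
           - p1 * (1 - p1) / (n1 - 1)
           - p2 * (1 - p2) / (n2 - 1))
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Nearly identical allele frequencies give an F_ST close to zero:
print(fst(0.50, 0.51, 100, 100))
```

Averaging such per-site values across millions of variants is one common way a genome-wide comparison like this is summarized.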

"Our work shows that the eastern and western monarchs are mating together and exchanging genetic material to a much greater extent than was previously realized," Talla says. "And it adds to the evidence that it is likely differences in their environments that shape the differences in their patterns of migration."

Co-author Amanda Pierce, who led the earlier study on 11 genetic markers, launched the project while she was a graduate student in the De Roode Lab.

"Monarch butterflies are so fragile and so lightweight, and yet they are able to travel thousands of miles," Pierce says. "They are beautiful creatures and a great model system to understand unique, innate behaviors. We know that migration is ingrained in their genetic wiring in some way."

After monarchs leave their overwintering sites, they fly north and lay eggs. The caterpillars turn into butterflies and then fly further, mating and laying another generation of eggs. The process repeats for several generations until finally, as the days grow shorter and the temperatures cooler, monarchs emerge from their chrysalises and start to fly south. This migratory generation does not expend any energy on breeding or laying eggs, saving it all for the long journey.

"For every butterfly that makes it to California or to Mexico, that's its first journey there," Pierce marvels.

Previous work had identified a propensity for the eastern and western monarchs to have slight differences in their wing shapes. For the current paper, the researchers wanted to identify any variations in their flight styles.

They collected eastern monarchs from a migratory stopover site in Saint Marks, Florida, and western monarchs from one of their overwintering sites near Oceano, California. Pierce ran flight trials with the butterflies by tethering them to a mill that restricted their flight patterns to circles with a circumference of about 25 feet. The trials were performed in a laboratory under controlled light and temperature conditions that mimicked overwintering sites. Artificial flowers were arranged around the circumference of the flight mills.

"The idea was to try to give them some semblance of a 'natural' environment to help motivate them and to orient them," Pierce explains.

Butterflies were released unharmed from the flight mills after performing short trials.

The results showed that the eastern monarchs would choose to fly for longer distances while the western monarchs flew shorter distances but with stronger bursts of speed. "The more powerful flight trait of the western monarch is like a sprinter, essentially," Pierce says, "while the eastern monarchs show a flight trait more like marathoners."

Pierce has since graduated from Emory and now works as a geneticist for the Environmental Protection Agency in Washington, D.C.

Talla, who specializes in bioinformatics, grew up in India where the rich diversity of wildlife inspired him to become an evolutionary biologist. He moved to Sweden to get his PhD, where he studied the genomics of the European wood white butterfly. Although all wood whites appear identical visually, they are actually three different species.

"One of the big questions I'm interested in answering is how does an individual species wind up becoming multiple species?" Talla says. "I want to understand all the processes involved in that evolution."

He jumped at the chance to join the De Roode Lab. "Monarchs have always been at the top of my list of butterflies I wanted to study because of their incredible migrations," Talla says. "They are a fascinating species."

Last November, he joined de Roode on a lab field trip to the eastern monarch overwintering site, inside and adjacent to the Monarch Butterfly Biosphere Reserve in central Mexico. Tens to hundreds of millions of monarchs blanket the trees and landscape through the winter. "It's a mind-blowing sight," Talla says. "It makes you wonder how they all know how to get there."

Previous tracking and observational studies had shown that at least some western monarchs fly south to Mexico instead of west to California. The full-genome analysis suggests that more than just a few of the western monarchs may be making the trip to Mexico where they mix with the eastern monarchs. And when the butterflies depart Mexico, some may fly west instead of east.

"Evidence from multiple directions is coming together to support the same view," de Roode says.

The findings may help in the conservation of monarchs. Due to a combination of habitat loss, climate change and lack of nectaring flowers, numbers of both eastern and western monarchs have declined in recent decades, with the western ones showing the most precipitous drop. The U.S. Fish and Wildlife Service is currently considering whether the butterflies need special protections.

"If environmental factors are all that drives the differences between the eastern and western monarchs, it's possible that we could help the western population by transplanting some of the eastern ones to the west," de Roode says.

The De Roode lab now plans to investigate what exactly in the environments of the butterflies triggers different expressions of their genes.

Credit: 
Emory Health Sciences

C&EN names top 50 chemical companies

After being dethroned last year, German chemical giant BASF is once again number one in C&EN's annual Global Top 50 list of chemical companies for 2019. Chemical & Engineering News, the weekly newsmagazine of the American Chemical Society, reports that a shakeup in the international chemical markets was brewing even before the upheaval caused by the COVID-19 pandemic.

At the end of the 2019 fiscal year, the group of 50 chemical companies earned a collective $855.6 billion in revenue, a 5% decrease from the previous year. In addition, overall earnings dropped by over 28% compared to 2018. This was a sign that the global economy was slowing even before the novel coronavirus emerged, writes Senior Editor Alex Tullo. Executives pinned this decline on trade tensions between the U.S. and China, along with poor performance in the automotive industry and other key markets. 

With $66.4 billion in chemical sales in 2019, BASF's reemergence as the top firm was attributed to the breakup of DowDuPont into three separate companies. Following closely at number two is China-based Sinopec with $61.6 billion in sales, and the now-separate Dow and DuPont come in at numbers three and 14 on the list, respectively. With the global pandemic continuing to keep consumers at home, the outlook for 2020 is grim, experts note. They predict that U.S. chemical revenue will decline by 15%, and the worldwide gross domestic product will contract by 4.6%. In Europe, the picture is slightly rosier, with chemical production up in Germany by 3.2%, but with a decline in sales expected late in the year. Despite current volatility, the 2019 list is not much different from lists from recent years, but the coming year might shake things up even further.

The paper, "C&EN's Global Top 50 for 2020," is freely available here.

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS' mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and its people. The Society is a global leader in providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a specialist in scientific information solutions (including SciFinder® and STN®), its CAS division powers global research, discovery and innovation. ACS' main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive press releases from the American Chemical Society, contact newsroom@acs.org.


Credit: 
American Chemical Society

Transforming e-waste into a strong, protective coating for metal

image: The material (thick gray line in the center of the image) derived from e-waste remained intact when indented, and increased the hardness of the steel below it.

Image: 
Adapted from ACS Omega 2020, DOI: 10.1021/acsomega.0c00485

A typical recycling process converts large quantities of items made of a single material into more of the same. However, this approach isn't feasible for old electronic devices, or "e-waste," because they contain small amounts of many different materials that cannot be readily separated. Now, in ACS Omega, researchers report a selective, small-scale microrecycling strategy, which they use to convert old printed circuit boards and monitor components into a new type of strong metal coating.

In spite of the difficulty, there's plenty of reason to recycle e-waste: It contains many potentially valuable substances that can be used to modify the performance of other materials or to manufacture new, valuable materials. Previous research has shown that carefully calibrated high temperature-based processing can selectively break and reform chemical bonds in waste to form new, environmentally friendly materials. In this way, researchers have already turned a mix of glass and plastic into valuable, silica-containing ceramics. They've also used this process to recover copper, which is widely used in electronics and elsewhere, from circuit boards. Based on the properties of copper and silica compounds, Veena Sahajwalla and Rumana Hossain suspected that, after extracting them from e-waste, they could combine them to create a durable new hybrid material ideal for protecting metal surfaces.

To do so, the researchers first heated glass and plastic powder from old computer monitors to 2,732 F (1,500 C), generating silicon carbide nanowires. They then combined the nanowires with ground-up circuit boards, put the mix on a steel substrate and heated it again, this time to 1,832 F (1,000 C), melting the copper to form a silicon carbide-enriched hybrid layer atop the steel. Microscope images revealed that, when struck with a nanoscale indenter, the hybrid layer remained firmly affixed to the steel without cracking or chipping. It also increased the steel's hardness by 125%. The team refers to this targeted, selective microrecycling process as "material microsurgery" and says that it has the potential to transform e-waste into advanced new surface coatings without the use of expensive raw materials.

Credit: 
American Chemical Society

Healing an Achilles' heel of quantum entanglement

image: A mathematical formula developed by Louisiana State University physicist Mark Wilde and Xin Wang of Baidu Research, called κ entanglement or max-logarithmic negativity (upper left), makes it possible to efficiently calculate the entanglement cost of creating a two-party quantum state -- a cost that is itself an entanglement measure.

Image: 
Mark Wilde, LSU

Louisiana State University Associate Professor of Physics Mark M. Wilde and his collaborator have solved a 20-year-old problem in quantum information theory on how to calculate entanglement cost--a way to measure entanglement--in a manner that's efficiently computable, useful, and broadly applicable in several quantum research areas.

In a new paper published in Physical Review Letters, Wilde and co-author Dr. Xin Wang of Baidu Research describe how allowing a slightly wider range of physical operations than what's known as LOCC (local operations and classical communication)--which have boggled quantum scientists with difficult math for some time--makes it possible to characterize the exact entanglement cost of a given quantum state. Their work closes a longstanding investigation in entanglement theory that is known as the "PPT exact entanglement cost of a quantum state."

Quantum information science aims to understand and control the strange and sometimes spooky properties of quantum states (that is, entangled states) that enable information processing tasks that are impossible in the non-quantum world, such as teleportation, quantum computing, and absolutely secure communication.

The most basic unit of entanglement is known as a Bell state. You can think of it as the smallest possible molecule consisting of two entangled atoms (qubits, really) whose entanglement is absolute--implying that, if you could peek at one of them, you would know beyond a doubt that the other one would be its twin, with the same characteristics. It is like two people flipping coins: if one person gets tails, which reasonably is a 50/50 chance, the other is guaranteed to get tails as well (or they both get heads, same thing)--a consequence of the absolute entanglement of a Bell state. Additionally, no one else in the universe can know the exact outcome of the coin toss, and this is the main reason why secure communication based on quantum entanglement is possible as well as desirable.
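The perfectly matched coin flips described above can be made concrete with a few lines of linear algebra. A minimal sketch (not from the study): build the Bell state |Φ+> = (|00> + |11>)/√2 and check that measuring both qubits yields only the correlated outcomes 00 and 11, each with probability 1/2.

```python
import numpy as np

# Minimal sketch: the Bell state (|00> + |11>)/sqrt(2) and its measurement
# statistics in the computational basis. Only the perfectly correlated
# outcomes 00 and 11 occur, each with probability 1/2 -- the "two coins
# that always match" analogy from the text.

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Tensor product builds the two-qubit basis states |00> and |11>:
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Born rule: outcome probabilities are squared amplitudes (real here).
probabilities = bell ** 2
for outcome, p in zip(["00", "01", "10", "11"], probabilities):
    print(outcome, round(p, 3))
```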

"Quantum entanglement is a kind of super-correlation that two distant parties share," Wilde explained. "If the world were described by classical physics only, then it would not be possible to have the strong correlations available with quantum entanglement. However, our world is fundamentally quantum mechanical, and entanglement is an essential feature of it."

When quantum entanglement was first discovered in the 1930s, it was thought to be a nuisance because it was difficult to understand, and unclear what its benefits would be. But with the rise of quantum information science in the 1990s, it was understood in a theoretical sense as the key to remarkable quantum technologies. Recent examples of such technologies include the Chinese teleportation experiment from ground to satellite in 2017 as well as Google's quantum-computational supremacy achievement last year.

At LSU, quantum physicists like Omar Magaña-Loaiza and Thomas Corbitt routinely perform experiments that could benefit from Wilde and Wang's new and more precise measure. In their respective labs, Magaña-Loaiza recently generated entangled states via conditional measurements, which constitutes an important step in the development of entangled laser-like systems, while Corbitt performed a study of optomechanical entanglement, which has the potential to be a reliable source of multiphoton entanglement at short wavelengths. Wilde and Wang's new entanglement measure, called κ entanglement or max-logarithmic negativity, can be used to assess and quantify the entanglement produced in a wide range of quantum-physical experiments.

Basic entanglement units or Bell states are also known as e-bits. Entanglement can be looked at in two different ways: either how many e-bits it would take to prepare a quantum state, or how many e-bits one could extract or "distill" from a complex entangled state. The former is known as entanglement cost and is the problem Wilde and Wang considered.
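One computable entanglement measure closely related to Wilde and Wang's κ entanglement is the standard logarithmic negativity, E_N(ρ) = log2 ||ρ^{T_B}||_1, where T_B is the partial transpose. The sketch below is an illustration of that standard measure under the analogy to the paper, not the paper's exact formula; for a Bell state it evaluates to exactly 1, i.e. one e-bit.

```python
import numpy as np

# Hedged illustration: the standard logarithmic negativity,
# E_N(rho) = log2 || rho^{T_B} ||_1, an efficiently computable entanglement
# measure related to (but not identical with) the kappa entanglement /
# max-logarithmic negativity of Wilde and Wang.

def partial_transpose(rho, dA=2, dB=2):
    """Partial transpose over subsystem B of a (dA*dB x dA*dB) density matrix."""
    r = rho.reshape(dA, dB, dA, dB)      # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)  # swap b <-> b'

def log_negativity(rho):
    """log2 of the trace norm of the partial transpose."""
    eigs = np.linalg.eigvalsh(partial_transpose(rho))
    trace_norm = np.sum(np.abs(eigs))
    return np.log2(trace_norm)

# A Bell state costs exactly one e-bit:
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(bell, bell)
print(round(log_negativity(rho), 6))  # 1.0
```

For unentangled states the partial transpose stays positive with trace norm 1, so the measure is 0, matching the intuition that no e-bits are needed to prepare them.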

"E-bits are a precious resource and you want to use as few of them as possible," Wilde said. "In physics, you often want to look at both the forwards process and the backwards process. Is it reversible? And if it is, do I lose something along the way? And the answer to that is yes."

Wilde admits the problem he and Wang have solved is somewhat esoteric--a mathematical trick. However, it will allow quantum information scientists to efficiently calculate entanglement costs given certain constraints.

"Not all entanglement measures are efficiently computable and have a meaning such as entanglement cost. That is a key distinction between all previous work and ours," Wilde added.

While the lack of this kind of measure has been an Achilles' heel in quantum information science for over 20 years, it was--ironically--Wilde becoming max-negatively "entangled" during a game of basketball in 2018 that led to him and Wang eventually solving the problem.

"I ruptured my Achilles' heel while going for the winning point of the game, then had surgery to repair it, and couldn't get out of bed for a month and a half," Wilde remembers. "So, I wrote a research paper about entanglement cost, and when Xin Wang learned about it, he asked me if I would be interested in developing this problem further. We then started working together, back and forth, and that became the paper we now have published in Physical Review Letters. We became good friends and collaborators after that--it is remarkable the surprises that can occur in life."

Credit: 
Louisiana State University

Engineers find thinner tissues in replacement heart valves create problematic flutter

video: These are computer models of replacement aortic heart valves. The models show biological tissues built into the valves at thicknesses of 100%, 75%, 50% and 25%. You can see fluttering in the thinner tissues at the bottom.

Image: 
Videos courtesy of Ming-Chen Hsu, Iowa State University

AMES, Iowa - You're in the middle of the aorta, the body's pipeline for oxygen-rich blood, looking back toward the heart's primary pump, the left ventricle.

The ventricle muscle contracts and the aortic heart valve's three leaflets explode open and blood flows by at up to 200 centimeters a second. And what's this?

Those three leaflets are flapping in the flow - fluttering, in engineering terms. That's a problem. It could lead to leaflet tearing, calcium deposits, fatigue failure, even damage to the blood flowing by.

We have an eyewitness look, from a physically impossible point of view, thanks to computational models of the fluid-structure interactions of blood and heart valves developed by engineers at Iowa State University and The University of Texas at Austin.

The engineers used their technology to study what happens when thinner and thinner biological tissues from cows or pigs are used in transcatheter aortic valve replacement. That procedure involves collapsing an artificial valve into a catheter that is threaded through an artery to the aortic root, where it expands and secures itself in place. It makes sense to choose thin tissues when building the replacement valves - thinner tissues can be folded into smaller catheters for easier movement through the narrow tubes of the arteries.

But, in side-by-side models comparing tissue thicknesses of 100%, 75%, 50% and 25%, you can see there are problems with the two thinner options.

The engineers' findings are reported in a paper just published online by the Proceedings of the National Academy of Sciences. Corresponding authors are Ming-Chen Hsu, an associate professor of mechanical engineering at Iowa State; Thomas J.R. Hughes, the Peter O'Donnell Jr. Chair in Computational and Applied Mathematics and professor of aerospace engineering and engineering mechanics at Texas and its Oden Institute for Computational Engineering and Sciences; and Michael S. Sacks, the W. A. "Tex" Moncrief Jr. Chair in Simulation-Based Engineering and Sciences, professor of biomedical engineering and director of the Willerson Center for Cardiovascular Modeling and Simulation at Texas and the Oden Institute. Emily L. Johnson, a doctoral student in mechanical engineering and the Wind Energy Science, Engineering and Policy program at Iowa State, is the first author. (See sidebar for other co-authors.)

The engineers' comparison of the performance of thinner valve tissues was supported by grants from the National Institutes of Health.

Years and years of challenge

It's not easy to develop a predictive computational model of a heart valve in action, said Iowa State's Hsu, who has been modeling heart valves for more than five years.

There's constant contraction, pressure and flow. The valves are flexible. It's a highly dynamic system, with a lot of variables.

"We're really modeling the whole physiological system," Hsu said. "That's why it has taken several years to correctly model the blood flows, which can change from laminar to turbulent, the heart valves, which are very thin and nonlinear, and the multiphysics coupling, which can be numerically unstable."

This kind of modeling takes supercomputing power, Hsu said. The valves in this study were simulated using computing resources at the Texas Advanced Computing Center, with each cardiac cycle taking about two days to run on 144 processing cores.

But this is a problem worth the time and effort. Any time a replacement heart valve wears out, patients face another heart procedure. That makes avoiding leaflet flutter in a replacement valve a "crucial quality criterion," the engineers wrote.

Let's investigate the science, too

Hsu credits Johnson, a doctoral student in his lab who also works on wind turbine modeling, with helping him take his lab's work further in a new direction.

"My background is in computational methods," he said. "But students suggested we should look more at the science questions, too. We're not just developing computational tools anymore."

In this case, the computer models and resulting videos make the science easy to see and understand. (As Hsu says, "I think videos are the best way to show our results.")

When thrown open by a pumping heart, the thinner leaflets buckle in the middle and flutter in the blood flow. "It's like a flag flapping," Johnson said.

She said the engineers were able to quantify the flapping and found that thinner tissues had up to 80 times more "flutter energy" than thicker tissues.

The resulting conclusions are as clear as the engineers' views of the fluid-structure interactions inside a heart valve:

"Considering the risks associated with such observed flutter phenomena, including blood damage and accelerated leaflet deterioration, this study demonstrates the potentially serious impact of introducing thinner, more flexible tissues into the cardiac system."

Credit: 
Iowa State University

Breakthrough method for predicting solar storms

Extensive power outages and satellite blackouts that affect air travel and the internet are some of the potential consequences of massive solar storms. These storms are believed to be caused by the release of enormous amounts of stored magnetic energy due to changes in the magnetic field of the sun's outer atmosphere - something that until now has eluded scientists' direct measurement. Researchers at Lund University in Sweden and Fudan University in Shanghai have now found a way to measure these fields, a discovery they believe could lead to better "space weather" forecasts in the future.

"We are becoming increasingly dependent on space-based systems that are sensitive to space weather. Earth-based networks and the electrical grid can be severely damaged if there is a large eruption", says Tomas Brage, Professor of Mathematical Physics at Lund University in Sweden.

Solar flares are bursts of radiation and charged particles, and can cause geomagnetic storms on Earth if they are large enough. Currently, researchers focus on sunspots on the surface of the sun to predict possible eruptions. Another, more direct indication of increased solar activity would be changes in the much weaker magnetic field of the outer solar atmosphere - the so-called Corona.

However, no direct measurement of the actual magnetic fields of the Corona has been possible so far.

"If we are able to continuously monitor these fields, we will be able to develop a method that can be likened to meteorology for space weather. This would provide vital information for our society which is so dependent on high-tech systems in our everyday lives", says Dr Ran Si, post-doc in this joint effort by Lund and Fudan Universities.

The method involves what could be labelled a quantum-mechanical interference. Since basically all information about the sun reaches us through "light" sent out by ions in its atmosphere, the magnetic fields must be detected by measuring their influence on these ions. But the internal magnetic fields of ions are enormous - hundreds or thousands of times stronger than the fields humans can generate even in their most advanced labs. Therefore, the weak coronal fields will leave basically no trace, unless we can rely on this very delicate effect - the interference between two "constellations" of the electrons in the ion that are close - very close - in energy.

The breakthrough for the research team was to predict and analyze this "needle in the haystack" in an ion (nine times ionized iron) that is very common in the corona.

The work is based on state-of-the-art calculations performed in the Mathematical Physics division of Lund University and combined with experiments using a device that could be thought of as being able to produce and capture small parts of the solar corona - the Electron Beam Ion Trap, EBIT, in Professor Roger Hutton's group at Fudan University in Shanghai.

"That we managed to find a way of measuring the relatively weak magnetic fields found in the outer layer of the sun is a fantastic breakthrough", concludes Tomas Brage.

Credit: 
Lund University

A safer cell therapy harnesses patient T cells to fight multiple myeloma

video: Dr. Premal Lulla, assistant professor at the Center for Cell and Gene Therapy at Baylor College of Medicine in Houston, details promising results in a clinical trial testing multi-antigen targeted T cell therapy in patients with multiple myeloma. This material relates to a paper that appeared in the Jun. 29, 2020, issue of Science Translational Medicine, published by AAAS. The paper, by P.D. Lulla at Baylor College of Medicine in Houston, TX; and colleagues was titled, "The safety and clinical effects of administering a multiantigen-targeted T cell therapy to patients with multiple myeloma."

Image: 
[Baylor College of Medicine]

A treatment for multiple myeloma that harnesses the body's cancer-fighting T cells was safe in humans and showed preliminary signs of effectiveness, according to a clinical trial involving 23 patients with relapsed or treatment-resistant disease. Although more research is needed to determine how well the treatment works, the trial results indicate that the regimen may be safer than other cell therapies and may improve outcomes in people with advanced multiple myeloma. Patients with this cancer often stop responding to standard treatments after some initial success, leaving them with no curative options. Cell therapies such as CAR T cells - immune cells genetically modified to better hunt cancer cells - have shown promising response rates of more than 80%, but patients still frequently relapse after the first year of treatment and can suffer from severe side effects. There is therefore a substantial need for new treatments that are both safer and more durable over the long term. Premal Lulla and colleagues tackled both of these issues with their autologous multitumor-associated antigen-specific T cells, which they developed using cells from 23 patients. Instead of genetically modifying the cells, the authors isolated and enriched T cells bearing receptors that showed the strongest responses to 5 multiple myeloma-linked proteins. The team infused the cells into 21 patients and observed that the treatment was generally well-tolerated, as only one patient showed strong side effects. Three patients showed objective clinical responses, and the team noted that patients with active disease survived for an average of 22 months after treatment. Lulla et al. caution that phase 2 or 3 trials involving larger samples of patients will be needed to better establish their treatment's efficacy.

Credit: 
American Association for the Advancement of Science (AAAS)

Anti-Asian racism during COVID-19 has historical ties in United States

AMES, Iowa -- Anti-Asian hate crimes during health crises are unfortunately not new, according to a new academic paper examining the history of this phenomenon.

The paper, published recently in the peer-reviewed American Journal of Criminal Justice, was co-authored by Shannon Harper, assistant professor of criminal justice at Iowa State University; Angela Gover, professor of criminology and criminal justice at University of Colorado Denver; and Lynn Langton, senior research criminologist at RTI International.

The team looked at how anti-Asian hate crimes - including verbal harassment and physical violence - during the COVID-19 pandemic have furthered the historical "othering" of Asian Americans and reproduced inequalities.

"COVID-19 has allowed for racism and xenophobia to spread because the majority population looks for someone to blame who looks or seems inherently different from themselves, which may be why anti-Asian hate crime appears to have increased during the pandemic," Harper said.

Politicians' use of derogatory terms such as the "China virus" inappropriately associates the disease with an ethnicity, Harper says. Placing blame on China in this way, the team wrote, leads to unwarranted suspicion and fear toward Asian Americans, "creating the perfect climate to cultivate hate crimes."

A history of anti-Asian discrimination

Harassment and violence toward Asian Americans have existed since the first large group of Asian immigrants came to America during the Gold Rush in the mid-1800s, the authors found.

Individual-level racism and xenophobia eventually fueled institutional rhetoric and policies, including the Chinese Exclusion Act of 1882. In the face of housing segregation policies, many Chinese American communities formed Chinatowns in the late 1800s - which led to violent attacks by whites.

Racism and xenophobia continued into the early 20th century with the "yellow peril" myth and the U.S. government's Japanese concentration camps during World War II.

After the Immigration Act of 1965 led to a wave of East Asian immigrants into the U.S., another stereotype evolved: that of the "model minority."

Because the United States is a nation of immigrants, newcomers of all ethnicities have historically borne the brunt of discrimination and blame for infectious disease.

The COVID-19 pandemic is not the first public health crisis for which Asian Americans have been scapegoated. The authors described the bubonic plague in San Francisco in 1900, when public health officials quarantined Chinese residents in Chinatown, and the SARS outbreak of the early 2000s, when East Asians experienced stigmatization worldwide.

Racist policies and political rhetoric toward Asian Americans persisted, including during the current pandemic.

"Unfortunately, eruptions of xenophobia have historically followed close on the heels of pandemics," the authors wrote. "Especially when viral outbreaks are deadly, fear often drives those at risk to place blame on some 'other,' or some group external to their own national, religious, or ethnic identity.

"Sickness cultivates fear, which in turn cultivates bias."

Today, social media plays a significant role in exposing anti-Asian hate crimes, particularly when bystander videos are posted and then referenced by the news media.

"Not only is it critical to stop the spread of COVID-19, but also the racial hatred it has produced," the authors wrote.

Credit: 
Iowa State University

NASA follows potential tropical cyclone 9 into eastern Caribbean

image: NASA's Terra satellite provided a visible image to forecasters of Potential Tropical Cyclone 9 on July 29 at 11:20 a.m. EDT (1520 UTC) after it moved into the eastern Caribbean Sea.

Image: 
Image Courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Terra satellite obtained visible imagery of Potential Tropical Cyclone 9 after it moved into the Eastern Caribbean Sea and continued bringing heavy rainfall and gusty winds to the Leeward Islands, the U.S. and British Virgin Islands and Puerto Rico.

Warnings and Watches Abound

The National Hurricane Center has issued a number of warnings and watches associated with this potential tropical cyclone. A Tropical Storm Warning is in effect for Puerto Rico, Vieques, Culebra, the U.S. Virgin Islands, the British Virgin Islands, Montserrat, St. Kitts, Nevis, and Anguilla, St. Martin, and St. Barthelemy, Saba and St. Eustatius, St. Maarten, the Dominican Republic's entire southern and northern coastlines, the north coast of Haiti from Le Mole St Nicholas eastward to the northern border with the Dominican Republic, the Turks and Caicos Islands and the southeastern Bahamas including the Acklins, Crooked Island, Long Cay, the Inaguas, Mayaguana, and the Ragged Islands.

A Tropical Storm Watch is in effect for the central Bahamas, including Cat Island, the Exumas, Long Island, Rum Cay, and San Salvador.

Circulation Not Yet Well Defined on NASA Imagery

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite captured a visible image of Potential Tropical Cyclone 9 on July 29 at 11:20 a.m. EDT (1520 UTC). The image showed strong thunderstorms around the center of circulation and in fragmented bands northeast and southwest of the center. Deep convection has consolidated and there is some evidence of banding over the northern and western portions of the large circulation.

At 11 a.m. EDT, National Hurricane Center (NHC) forecaster Daniel Brown noted, "Surface observations from the Lesser Antilles show that the broader circulation of the disturbance has become slightly better defined but a recent Air Force Reserve reconnaissance aircraft was still unable to find a well-defined circulation. Therefore, the system has not yet become a tropical cyclone."

Potential Tropical Cyclone 9's Current Status at 2 p.m. EDT

At 2 p.m. EDT (1800 UTC), the disturbance was centered near latitude 16.2 degrees north and longitude 64.7 degrees west. The system is moving toward the west-northwest near 23 mph (37 kph), and this general motion with a reduction in forward speed is expected over the next few days. NHC noted the maximum sustained winds are near 45 mph (75 kph) with higher gusts. Some increase in strength is forecast through tonight, with weakening likely on Thursday due to land interaction. Some re-strengthening is possible by this weekend. The estimated minimum central pressure is 1006 millibars.

Environmental conditions are expected to be conducive to additional development, and a tropical storm is forecast to form later today or tonight. NHC said the formation chance through 48 hours remains high at 90 percent.

The Forecast Track

The NHC forecast calls for the low pressure area to strengthen into a tropical storm later in the day on July 29, 2020. If it does, it would be named Isaias.

On the forecast track, the system will move near or just south of Puerto Rico later today and tonight, near or over Hispaniola (the Dominican Republic and Haiti) on Thursday, and near or over eastern Cuba on Friday before heading northwest toward the Florida Keys.

NASA's Terra satellite is one in a fleet of NASA satellites that provide data for hurricane research.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Major climate initiative in the Northeastern US benefits children's health

A new study by researchers from the Columbia Center for Children's Environmental Health (CCCEH) at Columbia University Mailman School of Public Health reports that the Regional Greenhouse Gas Initiative (RGGI) has been successful in reducing fine particulate matter (PM2.5) emissions and substantially improving children's health, both major co-benefits of this climate policy. Among the benefits between 2009 and 2014 were an estimated 537 avoided cases of childhood asthma, 112 preterm births, 98 cases of autism spectrum disorder (ASD), and 56 cases of term low birthweight. The associated economic savings were estimated at $191 to $350 million over that period. These findings are published today in Environmental Health Perspectives.

Initiated in 2009, RGGI is the country's first regional cap-and-trade program designed to limit carbon dioxide (CO2) emissions from the power sector. Under the agreement, nine participating northeastern states--including New York--were required to limit emissions from fossil fuel power plants with a capacity of 25 megawatts or greater. Regulated power plants must obtain annual CO2 allowances, which are auctioned quarterly and may be traded between plants or offset. Although RGGI is focused on reducing GHG emissions, it has also lowered emissions of other pollutants, including PM2.5, nitrogen oxides (NOx) and sulfur dioxide (SO2).

The researchers estimated the health benefits to children and associated economic savings using the Environmental Benefits Mapping and Analysis Program (BenMAP) tool, a computer program supported by the U.S. Environmental Protection Agency. For the first time, the researchers incorporated additional health outcomes, not previously included in BenMAP, that are associated with prenatal and childhood exposure to PM2.5.

"As impressive as they are, these estimated benefits for children do not take into account their potential life-long consequences, so they are likely underestimates of the true benefits of this policy," says lead author Frederica Perera, PhD, DrPH, professor of environmental health sciences at Columbia Mailman School and director of translational research at CCCEH. "These results should spur more such initiatives to address climate change and improve the health of our children."

Credit: 
Columbia University's Mailman School of Public Health

Influx of electric vehicles accelerates need for grid planning

image: Smart charging strategies can help manage loads on the grid and smooth out the duck curve.

Image: 
Mike Perkins, PNNL

RICHLAND, Wash. -- Electric vehicles are coming--en masse. How can local utilities, grid planners and cities prepare? That's the key question addressed in a new study led by researchers at Pacific Northwest National Laboratory for the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy's Vehicle Technologies Office.

"While we don't know exactly when the tipping point will happen, fleets of fast-charging vehicles are going to change how cities and utilities manage their electricity infrastructure," said Michael Kintner-Meyer, an electrical systems engineer in PNNL's Electricity Infrastructure group and the study's lead author. "It's not a question of if, but when."

The study, published today, integrates multiple factors not evaluated before, such as electric trucks for delivery and long haul, as well as smart EV charging strategies.

Transportation electrification is coming

According to EV Hub, about 1.5 million EVs, mostly cars and SUVs, are currently on the road in the United States. PNNL researchers evaluated the capacity of the power grid in the western U.S. over the next decade as growing fleets of EVs of all sizes, including trucks, plug into charging stations at homes and businesses and on transportation routes.

For their study, the authors used the best available data about future grid capacity from the Western Electricity Coordinating Council, or WECC. The analysis revealed the maximum EV load the grid could accommodate without building more power plants and transmission lines.

The good news is that through 2028, the overall power system, from generation through transmission, looks healthy up to 24 million EVs--about 9% of the light-duty vehicles currently on the road in the United States.

However, at about 30 million EVs, things get dicey. At the local level, issues may arise at even smaller EV adoption numbers. That's because one fast-charging EV can draw as much load as up to 50 homes. If, for example, every house in a cul-de-sac has an EV, one power transformer won't be able to handle multiple EVs charging at the same time.
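A rough sketch of why local equipment is the pinch point. The charger, household and transformer figures below are typical published values, assumed for illustration only; they are not numbers from the PNNL study:

```python
# Why a single fast charger strains neighborhood equipment. All figures
# here are typical published values (assumptions, not from the study).
fast_charger_kw = 350   # high-end DC fast charger
home_peak_kw = 7        # rough peak demand of one house
transformer_kva = 50    # common shared residential transformer size

print(f"One fast charger ~ {fast_charger_kw / home_peak_kw:.0f} homes' peak load")  # 50

# Even ordinary 7 kW home chargers overload a shared transformer if a
# cul-de-sac's worth of EVs charge at the same time:
evs = 8
print(evs * home_peak_kw > transformer_kva)  # True
```

(Comparing kW against kVA ignores power factor, which is fine at this level of approximation.)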

Smoothing out the duck curve

As detailed in the report, current grid planning doesn't adequately account for a mass influx of EVs. That omission exacerbates an already stressful situation--the dreaded duck curve.

The duck curve is a 24-hour profile of load on the power system, and usually occurs in areas with a lot of photovoltaic--or solar--rooftop installations. The curve is based on moderate load in the morning, low load during the day when solar units feed electricity into the grid, and high load at night as people get home from work and the sun goes down.

When demand spikes, voltage plummets. This severe swing is hard on system operations that weren't designed to flip on and off like a light switch. And with more EVs plugging in to charge in the evening, the ramp-up becomes even steeper and drives up electricity costs.

Smart charging strategies--avoiding charging during peak hours in the morning and early evening--can smooth out demand peaks and fill in the duck curve, according to the study. The approach has two upsides. First, it would take advantage of relatively clean solar power during the day. It would also reduce or eliminate the sharp ramps in the evening when solar power fades and other sources kick in to make up the difference.
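A toy net-load series makes the duck-curve shape, and the evening ramp, concrete. All numbers below are invented for illustration, not taken from the study:

```python
# Toy "duck curve": hourly demand minus rooftop-solar output over 24 hours.
# All values are made-up illustrative numbers (MW).
demand = [  # moderate load in the morning, peak in the evening
    30, 28, 27, 27, 28, 32, 40, 45, 46, 45, 44, 43,
    43, 43, 44, 46, 50, 58, 65, 68, 64, 55, 45, 35,
]
solar = [  # zero at night, peaking midday
    0, 0, 0, 0, 0, 1, 4, 10, 16, 22, 26, 28,
    28, 26, 22, 16, 10, 4, 1, 0, 0, 0, 0, 0,
]
net_load = [d - s for d, s in zip(demand, solar)]

# The steepest hour-to-hour ramp lands in the early evening, as solar
# fades while demand climbs -- the "neck" of the duck.
ramps = [net_load[h + 1] - net_load[h] for h in range(23)]
worst_hour = max(range(23), key=lambda h: ramps[h])
print(f"Steepest ramp: +{ramps[worst_hour]} MW between hour {worst_hour} and {worst_hour + 1}")
```

Unmanaged evening EV charging adds load exactly where this ramp is steepest; shifting that charging into the midday solar trough flattens the curve.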

Plausible scenarios emphasize need for planning

Building from the WECC data, the team developed and modeled plausible scenarios for 2028. The scenarios were vetted with business leaders and included a mix of light- (passenger), medium- (delivery trucks and vans) and heavy- (semis and cargo) duty vehicles on the road--the first time all three vehicle classes have been included in such an analysis. PNNL also developed a transportation model for freight on the road, with charging stations on interstate freeways every 50 miles for all three vehicle classes.

The scenarios included the evolution of the grid and its capacity at state and regional levels. The team focused on scenarios with the greatest potential for grid impacts.

Bottlenecks due to new EV charging appeared most often in areas of California, including Los Angeles, which plans to go all-electric with its city fleet by 2030. The pinch came from the growth of fast-charging cars and commercial fleets of electric trucks. These vehicles can draw 400 amps through a circuit for as long as 45 minutes, instead of the 15 to 20 amps pulled over 6 to 8 hours by most EVs today.
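In rough numbers, using only the current-draw figures above (the two charger types run at different voltages, so amp-hours here compare draw profiles, not delivered energy):

```python
# Current-draw comparison from the figures above: a commercial fast charge
# (up to 400 A for up to 45 minutes) vs. a typical home charge today
# (15-20 A over 6-8 hours). Amp-hours ignore voltage differences.
fast_amps, fast_hours = 400, 0.75
home_amps, home_hours = (15, 20), (6, 8)

fast_ah = fast_amps * fast_hours
home_ah = (home_amps[0] * home_hours[0], home_amps[1] * home_hours[1])

print(f"Fast charge: {fast_ah:.0f} Ah")              # 300 Ah
print(f"Home charge: {home_ah[0]}-{home_ah[1]} Ah")  # 90-160 Ah
print(f"Peak current ratio: {fast_amps // home_amps[1]}x")  # 20x
```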

Dennis Stiles oversees PNNL's energy efficiency and renewable energy research portfolio. He said fast-charging vehicles and integrating mobile loads--fleets on the move--are among the biggest challenges for planners today.

"They never really had to think about EVs before, but some cities are already looking into intelligent controls and other ways to modify their distribution systems and operations," said Stiles. "The key is to figure out now how to avoid large capital outlays in the future. Adding a new transformer here and there is a lot different than a substation overhaul."

Getting ahead of the curve

But the challenge isn't limited to large areas like Los Angeles. Kintner-Meyer said smaller cities with limited resources need help planning for their charging infrastructure and hosting capacity. That's the next step.

In a follow-on study, researchers will take a closer look at ways to integrate EVs into local and regional power distribution systems across the nation.

"We have the data and the method to run what-if scenarios," said Kintner-Meyer. "With data from utilities about feeders and infrastructure, we can build out the models then hand it off so they can get ahead of the curve."

Credit: 
DOE/Pacific Northwest National Laboratory

Lead released in Notre Dame Cathedral fire detected in Parisian honey

image: Hives on Notre Dame sacristy rooftop

Image: 
PCIGR

Elevated levels of lead have been found in samples of honey from hives downwind of the Notre Dame Cathedral fire, collected three months after the April 2019 blaze.

In research outlined in Environmental Science & Technology Letters, scientists from UBC's Pacific Centre for Isotopic and Geochemical Research (PCIGR) analyzed concentrations of metals, including lead, in 36 honey samples collected from Parisian hives in July 2019.

While all the honey fell within the EU's allowable limits for safe consumption, honey from hives downwind of the Notre Dame fire had average lead concentrations up to four times that of samples collected in the suburbs or countryside surrounding the city, and up to three and a half times the amount found in Parisian honey pre-dating the fire.

"Because of the way the wind was blowing the night the fire burned, the direction that the smoke plume traveled is well-defined. The elevated lead concentrations were measured in honey that was collected from beehives within that plume footprint," said Kate Smith, lead author of the study and PhD candidate at PCIGR.

The researchers compared honey collected after the fire to a Parisian honey blend from 2018 and to samples from the Auvergne-Rhône-Alpes region collected in 2017. The highest concentration of lead, 0.08 micrograms per gram, was found in a sample from a hive located within five kilometres west of the cathedral. The pre-fire Parisian honey had 0.009 micrograms of lead per gram, and honey from the Rhône-Alpes had 0.002 to 0.009 micrograms of lead per gram. The EU's maximum allowable lead content is 0.10 micrograms per gram for syrups, sweeteners, and juices.
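Putting those reported concentrations side by side (note that the "four times" figure earlier compares downwind averages with suburban samples, whereas this compares the single highest sample):

```python
# Lead concentrations in micrograms per gram of honey, as reported above.
eu_limit       = 0.10   # EU maximum for syrups, sweeteners and juices
highest_sample = 0.08   # hive within five kilometres west of the cathedral
pre_fire_paris = 0.009  # 2018 Parisian honey blend

print(f"Highest sample vs EU limit: {highest_sample / eu_limit:.0%}")               # 80%
print(f"Highest sample vs pre-fire blend: {highest_sample / pre_fire_paris:.1f}x")  # 8.9x
```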

Cathedral roof and spire contained tonnes of lead

Lead was a common building material in Paris throughout the time of construction of Notre Dame, which dates back to the 12th century. The cathedral's roof and spire contained several hundred tonnes of lead. While most of it simply melted in the fire, parts of the blaze reached temperatures high enough to aerosolize various lead oxides, and an estimated 180 tonnes of lead remain unaccounted for in the rubble.

"The fact that the Notre Dame spire was loaded with lead was absolutely a unique research opportunity," said co-author Dominique Weis, director of the PCIGR. "We were able to show that honey is also a helpful tracer for environmental pollution during an acute pollution event like the Notre-Dame fire. It is no surprise, since increased amounts of lead in dust or topsoil, both of which were observed in neighbourhoods downwind of the Notre Dame fire, are a strong indicator of increased amounts of lead in honey."

Because honey bees forage within a two- to three-kilometre radius of their hive, honey can provide a useful localized snapshot of the environment. As the bees forage, they collect dust and airborne particles, which make their way into the honey.

Smith and Weis worked with Parisian apiary company Beeopic, which manages around 350 hives throughout the city, and collected the samples for this study. Honey was sampled directly from each individual hive and sent to the PCIGR's clean laboratory for testing.

This study marks the first time this method of heavy-metal analysis using honey has been used in a megacity, and one with a history of lead use dating back millennia. It came out of previous work by Smith and Weis, in which they measured trace amounts of metals in honey from urban beehives in six Metro Vancouver neighbourhoods, demonstrating the use of bees as an effective biomonitor.

"The highest levels of lead that we detected were the equivalent of 80 drops of water in an Olympic sized swimming pool," said Weis. "So even if the lead is relatively elevated, it's still very low. It's actually not higher than what we see in honey from downtown Vancouver. In a city as young as Vancouver, we are able to trace sources of the metal using distinct isotopic signatures. In Paris, however, the long history of lead use throughout the city made the interpretations more challenging. This provides an important consideration for future lead sourcing studies in very old cities."

Credit: 
University of British Columbia

Simulating quantum 'time travel' disproves butterfly effect in quantum realm

image: In research by a team at Los Alamos National Laboratory, Alice prepares her qubit and applies the information scrambling unitary U to this and many other qubits altogether. Bob measures her qubit in any basis, flipping the qubit to the state not known to Alice. Alice still can reconstruct her information via a single decoding unitary U†.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., July 28, 2020--Using a quantum computer to simulate time travel, researchers have demonstrated that, in the quantum realm, there is no "butterfly effect." In the research, information--qubits, or quantum bits--"time travel" into the simulated past. One of them is then strongly damaged, like stepping on a butterfly, metaphorically speaking. Surprisingly, when all qubits return to the "present," they appear largely unaltered, as if reality is self-healing.

"On a quantum computer, there is no problem simulating opposite-in-time evolution, or simulating running a process backwards into the past," said Nikolai Sinitsyn, a theoretical physicist at Los Alamos National Laboratory and coauthor of the paper with Bin Yan, a postdoc in the Center for Nonlinear Studies, also at Los Alamos. "So we can actually see what happens with a complex quantum world if we travel back in time, add small damage, and return. We found that our world survives, which means there's no butterfly effect in quantum mechanics."

In Ray Bradbury's 1952 science fiction story, "A Sound of Thunder," a character used a time machine to travel to the deep past, where he stepped on a butterfly. Upon returning to the present time, he found a different world. This story is often credited with coining the term "butterfly effect," which refers to the extremely high sensitivity of a complex, dynamic system to its initial conditions. In such a system, early, small factors go on to strongly influence the evolution of the entire system.

Instead, Yan and Sinitsyn found that simulating a return to the past to cause small local damage in a quantum system leads to only small, insignificant local damage in the present.

This effect has potential applications in information-hiding hardware and in testing quantum information devices. A computer can hide information by converting the initial state into a strongly entangled one.

"We found that even if an intruder performs state-damaging measurements on the strongly entangled state, we still can easily recover the useful information because this damage is not magnified by a decoding process," Yan said. "This justifies talks about creating quantum hardware that will be used to hide information."

This new finding could also be used to test whether a quantum processor is, in fact, working by quantum principles. Since the newfound no-butterfly effect is purely quantum, if a processor runs Yan and Sinitsyn's system and shows this effect, then it must be a quantum processor.

To test the butterfly effect in quantum systems, Yan and Sinitsyn used theory and simulations with the IBM-Q quantum processor to show how a circuit could evolve a complex system by applying quantum gates, running cause and effect both forward and backward in time.

Presto, a quantum time-machine simulator.

In the team's experiment, Alice, a favorite stand-in agent used for quantum thought experiments, prepares one of her qubits in the present time and runs it backwards through the quantum computer. In the deep past, an intruder - Bob, another favorite stand-in - measures Alice's qubit. This action disturbs the qubit and destroys all its quantum correlations with the rest of the world. Next, the system is run forward to the present time.

If Bradbury's story were a guide, Bob's small damage to the state and all those correlations in the past would be quickly magnified during the complex forward-in-time evolution. Hence, Alice should be unable to recover her information at the end.

But that's not what happened. Yan and Sinitsyn found that most of the presently local information was hidden in the deep past in the form of essentially quantum correlations that could not be damaged by minor tampering. They showed that the information returns to Alice's qubit without much damage despite Bob's interference. Counterintuitively, for deeper travels to the past and for bigger "worlds," Alice's final information returns to her even less damaged.
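The protocol can be sketched numerically in a few lines. This is not the authors' experiment: it substitutes a Haar-random unitary for their simulated scrambling dynamics and starts the background qubits in |0...0> rather than at infinite temperature. Even so, it shows the same qualitative behavior: after scrambling, Bob's measurement, and the decoding step, Alice's recovered qubit keeps a fidelity well above the 0.5 a random guess would give:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7          # total number of qubits (Alice's qubit plus background)
d = 2 ** n

def haar_unitary(dim, rng):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

# Alice's qubit in an arbitrary pure state; the other qubits start in |0...0>.
psi = np.array([np.cos(0.4), np.exp(0.7j) * np.sin(0.4)])
rest = np.zeros(d // 2); rest[0] = 1.0
state = np.kron(psi, rest)
rho = np.outer(state, state.conj())

# "Travel to the past": scramble with a random unitary standing in for U.
U = haar_unitary(d, rng)
scrambled = U @ rho @ U.conj().T

# Bob measures Alice's qubit in the computational basis. Averaged over his
# two outcomes, the damage is the dephasing map rho -> (rho + Z rho Z) / 2.
Z0 = np.kron(np.diag([1.0, -1.0]), np.eye(d // 2))
damaged = 0.5 * (scrambled + Z0 @ scrambled @ Z0)

# "Return to the present": undo the evolution with the decoding unitary U†,
# then trace out everything except Alice's qubit.
back = U.conj().T @ damaged @ U
rho_alice = np.trace(back.reshape(2, d // 2, 2, d // 2), axis1=1, axis2=3)

fidelity = float(np.real(psi.conj() @ rho_alice @ psi))
print(f"Recovered fidelity: {fidelity:.3f}")  # well above the 0.5 random-guess baseline
```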

"We found that the notion of chaos in classical physics and in quantum mechanics must be understood differently," Sinitsyn said.

Credit: 
DOE/Los Alamos National Laboratory