Tech

Noninvasive, self-adhesive sensor predicted worsening heart failure in veterans

DALLAS, Feb. 25, 2020 -- A removable, adhesive sensor patch accurately predicted worsening heart failure and the need for hospitalization several days before the hospitalization occurred among veterans with heart failure, according to research published today in the American Heart Association's journal Circulation: Heart Failure.

The participants in the LINK-HF multi-center study consisted of 100 heart failure (HF) patients who were veterans, average age 68 years, enrolled at four Veterans Affairs (VA) hospitals -- in Salt Lake City; Palo Alto, Calif.; Houston; and Gainesville, Fla. -- after an initial acute heart failure admission to the hospital. The participants wore an adhesive sensor patch on their chest 24 hours a day for a minimum of 30 days and up to three months after their initial hospital discharge for a heart failure event. Of the eligible participants, 90% continued to wear the sensor at 30 days and at 90 days; data was collected from August 2015 through December 2016.

The sensor monitored heart rate, heart rhythm, respiratory rate and physical activity such as walking, sleeping and body posture for each participant. The data was transmitted from the sensor via Bluetooth to a smartphone, and then uploaded from the smartphone to an analytics platform on a secure computer server. A machine learning algorithm, a type of artificial intelligence approach to data analysis, established a normal baseline for each patient and then examined the incoming data. When observed data deviated from the expected 'normal baseline' behavior, the algorithm generated an alert to indicate the patient's heart failure was getting worse. The technology accurately predicted hospitalization risk more than 80% of the time. This prediction of hospitalization took place an average of 6.5 days before the readmission.
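
The release does not spell out the algorithm itself, but the general approach it describes -- learn a patient-specific baseline from early sensor data, then alert on sustained deviations -- can be sketched in a few lines. The Python snippet below is an illustration only, not the LINK-HF analytics platform; the synthetic data, features, threshold and alert rule are all assumptions.

```python
# Illustrative sketch only -- not the LINK-HF platform or its actual algorithm.
# It shows the general idea described above: learn a per-patient "normal"
# baseline from early sensor data, then flag sustained deviations.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily feature summaries for one patient:
# columns = [heart rate (bpm), respiratory rate (breaths/min), activity (a.u.)]
baseline_days = rng.normal([72, 16, 1.0], [4, 1.5, 0.2], size=(14, 3))
later_days = rng.normal([85, 21, 0.5], [4, 1.5, 0.2], size=(7, 3))  # drifting toward decompensation

# 1. Establish the patient-specific baseline from the first days of wear.
mu = baseline_days.mean(axis=0)
sigma = baseline_days.std(axis=0, ddof=1)

def deviation_score(day_features):
    """Mean absolute z-score of today's features relative to the personal baseline."""
    return np.mean(np.abs((day_features - mu) / sigma))

# 2. Score each new day and raise an alert on sustained deviation.
THRESHOLD = 2.0        # assumed alert threshold (in baseline standard deviations)
CONSECUTIVE_DAYS = 2   # require the deviation to persist before alerting

streak = 0
for day, features in enumerate(later_days, start=1):
    score = deviation_score(features)
    streak = streak + 1 if score > THRESHOLD else 0
    if streak >= CONSECUTIVE_DAYS:
        print(f"Day {day}: alert -- sustained deviation from baseline (score {score:.1f})")
        break
    print(f"Day {day}: score {score:.1f}, no alert")
```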

"With the use of remote data from the sensor and through data analysis by machine learning, we have shown that we can predict the future. Next, we will look at whether we can change the future," said lead study author Josef Stehlik, M.D., M.P.H., the Christi T. Smith Professor of Medicine at the University of Utah School of Medicine, medical director of the Heart Transplant Program and co-chief of the Advance Heart Failure Program at the University of Utah Hospital and the Salt Lake City Veterans Affairs Medical Center.

Heart failure (HF) is a major public health issue - it impacts about 6.2 million U.S. adults and is the No. 1 hospital discharge diagnosis in the U.S. Heart failure-related health care costs were an estimated $30.7 billion in 2012 and are expected to reach $53 billion by 2030. About 80% of these costs are for hospitalization according to a study cited by the researchers and published in 2011.[1] In the first 90 days after discharge from the hospital for heart failure, patients are at a high, up to 30%, risk of readmission. Given this significantly increased risk of HF worsening after hospitalization, the researchers focused on testing a non-invasive solution to improve HF management during this critical time period.

"In chronic heart failure, a person's condition can get worse with shortness of breath, fatigue and fluid buildup, to the point many end up in the emergency room and spend days in the hospital to recover," said Stehlik. "If we can identify patients before heart failure worsens and if doctors have the opportunity to change therapy based on this novel prediction, we could avoid or reduce hospitalizations, improve patients' lives and greatly reduce health care costs. With the evolution of technology and with artificial intelligence statistical methods, we have new tools to make this happen."

Study limitations included:

Study participants were 98% male, so it is unknown if these findings would be consistent in females;

A majority of the participants had HFrEF (heart failure with reduced ejection fraction), so further study in patients with HFpEF (heart failure with preserved ejection fraction) is needed; and

Additional research is necessary to determine if treatment changes based on the alerts could lead to improved patient outcomes.

In a future study, the researchers will test changing patient treatment based on the alert generated by the algorithm. "We are hoping that with this information, we can intervene and decrease the hospitalization rate, improve quality of life, and, for patients who end up being admitted to the hospital, shorten the length of stay," concluded Stehlik.

Credit: 
American Heart Association

Wearable sensor powered by AI predicts worsening heart failure before hospitalization

image: Wearable sensors like this one could help doctors remotely detect cardiovascular changes in heart failure patients days before a crisis occurs.

Image: 
Charlie Ehlert/University of Utah Health

A new wearable sensor that works in conjunction with artificial intelligence technology could help doctors remotely detect critical changes in heart failure patients days before a health crisis occurs and could prevent hospitalization, according to a study led by University of Utah Health and VA Salt Lake City Health Care System scientists. The researchers say the system could eventually help avert up to one in three heart failure readmissions in the weeks following initial discharge from the hospital and help patients sustain a better quality of life.

“This study shows that we can accurately predict the likelihood of hospitalization for heart failure deterioration well before doctors and patients know that something is wrong,” says the study’s lead author, Josef Stehlik, M.D., M.P.H., co-chief of the advanced heart failure program at U of U Health.

“Being able to readily detect changes in the heart sufficiently early will allow physicians to initiate prompt interventions that could prevent rehospitalization and stave off worsening heart failure,” adds Stehlik, who also serves as medical director of the heart failure and heart transplant program at George E. Wahlen VA Medical Center in Salt Lake.

The study appears in Circulation: Heart Failure, an American Heart Association journal.

About 6.2 million Americans live with heart failure and it is the top hospital discharge diagnosis in the U.S. Up to 30% of these patients will likely be readmitted to the hospital within 90 days of discharge with recurrent symptoms including shortness of breath, fatigue and fluid buildup. In many cases, hospitalization diminishes a patient’s ability to care for themselves independently.

“Those individuals who have repeated hospitalizations for heart failure have significantly higher mortality,” says Biykem Bozkurt, M.D., Ph.D., a study co-author and director of the Winters Center for Heart Failure Research at the Baylor College of Medicine in Houston. “Even if patients survive, they have poor functional capacity, poor exercise tolerance and low quality of life after hospitalizations. This patch, this new diagnostic tool, could potentially help us prevent hospitalizations and decline in patient status.”

The researchers followed 100 heart failure patients, average age 68, who were diagnosed and treated at four VA hospitals in Salt Lake City, Utah; Houston, Texas; Palo Alto, California; and Gainesville, Florida. After discharge, participants wore an adhesive sensor patch on their chests 24 hours a day for up to three months.
The sensor monitored continuous electrocardiogram (ECG) and motion of each subject.

This information was transmitted from the sensor via Bluetooth to a smartphone and then passed on to an analytics platform, developed by PhysIQ, on a secure server, which derived heart rate, heart rhythm, respiratory rate, walking, sleep, body posture and other normal activities. Using artificial intelligence, the analytics established a normal baseline for each patient. When the data deviated from normal, the platform generated an indication that the patient’s heart failure was getting worse.

Overall, the system accurately predicted the impending need for hospitalization more than 80 percent of the time. On average, this prediction occurred 10.4 days before a readmission took place (median 6.5 days).

“There’s a high risk for readmission in the 90 days after initial discharge,” Stehlik says. “If we can decrease this readmission rate through monitoring and early intervention, that’s a big advance. We’re hoping even in patients who might be readmitted that their stays are shorter, and the overall quality of their lives will be better with the help of this technology.”

Next, the researchers plan to conduct a large clinical trial that will not only use the system to alert doctors of changes in a patient’s condition but also track whether early intervention based on these alerts leads to fewer rehospitalizations for heart failure.

Credit: 
University of Utah Health

Researchers develop framework that improves Firefox security

Researchers from the University of California San Diego, University of Texas at Austin, Stanford University and Mozilla have developed a new framework to improve web browser security. The framework, called RLBox, has been integrated into Firefox to complement Firefox's other security-hardening efforts.

RLBox increases browser security by separating third-party libraries that are vulnerable to attacks from the rest of the browser to contain potential damage--a practice called sandboxing. The study will be published in the proceedings of the USENIX Security Symposium.

Browsers, like Firefox, rely on third-party libraries to support media decoding (e.g., rendering images or playing audio files) among many other functionalities. These libraries are often written in low-level programming languages, like C, and highly optimized for performance.

"Unfortunately, bugs in C code are often security vulnerabilities--security vulnerabilities that attackers are really good at exploiting," noted senior author Deian Stefan, an assistant professor with UC San Diego's Department of Computer Science and Engineering.

RLBox allows browsers to continue to use off-the-shelf, highly tuned libraries without worrying about the security impact of these libraries. "By isolating libraries we can ensure that attackers can't exploit bugs in these libraries to compromise the rest of the browser," said the lead PhD student on the project, Shravan Narayan.

A key piece of RLBox is the underlying sandboxing mechanism, which keeps a buggy library from interfering with the rest of the browser. The study investigates various sandboxing techniques with different trade-offs. But the team ultimately partnered with the engineering team at San Francisco-based Fastly to adopt a sandboxing technique based on WebAssembly, a new intermediate language designed with sandboxing in mind. The team believes that WebAssembly will be a key part of future secure browsers and secure systems more broadly. The WebAssembly sandboxing effort is detailed in a recent Mozilla Hacks blog post.

"Unfortunately, it's not enough to put a library in a sandbox, you need to carefully check all the data that comes out of the sandbox--otherwise a sophisticated attacker can trick the browser into doing the wrong thing and render the sandboxing effort useless, " said Stefan. RLBox eliminates these classes of attacks by tagging everything that crosses the boundary and ensuring that all such tagged data is validated before it is used.

RLBox has been integrated into Mozilla's Firefox and will be shipping to Linux users in Firefox 74 and Mac users in Firefox 75, with plans to roll it out on other platforms.

"This is a big deal," says Bobby Holley, principal engineer at Mozilla. "Security is a top priority for us, and it's just too easy to make dangerous mistakes in C/C++. We're writing a lot of new code in Rust, but Firefox is a huge codebase with millions of lines of C/C++ that aren't going away any time soon. RLBox makes it quick and easy to isolate existing chunks of code at a granularity that hasn't been possible with the process-level sandboxing used in browsers today."

In the study, the team isolated half a dozen libraries using RLBox. To start, Firefox will ship with their sandboxed Graphite font shaping library. Mozilla plans to apply the sandboxing more broadly in the future, ultimately making millions of users' browsers more secure.

Credit: 
University of California - San Diego

Tropical nations worst hit by climate-related fish shifts

image: The number of species shifting out of each exclusive economic zone (EEZ) by the year 2100 under a moderate (RCP 4.5, left) and more severe (RCP 8.5, right) greenhouse gas emissions scenario.

Image: 
Kimberly L. Oremus et al., Nature Sustainability. Feb. 24, 2020

Policymakers will need to step up to the challenges caused by significant shifts in fish species distributions caused by climate change.

Tropical countries stand to lose the most fish species due to climate change, with few if any stocks replacing them, according to a study published in the journal Nature Sustainability. This could pose a serious governance challenge that warrants careful policy-making, the researchers say.

As sea temperatures rise, fish species migrate towards cooler waters to maintain their preferred thermal environments.

Jorge García Molinos of Hokkaido University and colleagues in Japan and the USA developed a computer model to project how the ranges of 779 commercial fish species will expand or contract under moderate and more severe greenhouse gas emissions scenarios between 2015 and 2100, compared to their 2012 distribution.

The model showed that, under a moderate emissions scenario, tropical countries could lose 15% of their fish species by the year 2100. If a more high-end emissions scenario were to occur, they could lose more than 40% of their 2012 species.

The model projects that northwest African countries could lose the highest percentage of species, while Southeast Asia, the Caribbean, and Central America could experience steep species declines under the more severe of the two climate scenarios.

The scientists wondered if existing regional, multilateral or bilateral policies contain the necessary provisions to adequately manage climate-driven fish stock exits from each country's jurisdictional waters (exclusive economic zones [EEZs]). They analyzed 127 publicly available international fisheries agreements. None contained language directly related to climate change, fish range shifts, or stock exits. Although some included mechanisms to manage short-term stock fluctuations, existing agreements also failed to contain long-term policies that prevent overfishing by nations losing fish stocks.

García Molinos and his colleagues suggest fair multilateral negotiations may be necessary among countries benefiting from the shift in fish species and those losing from it. Tropical nations in particular could press for compensation for fishery damage during these negotiations by drawing on international frameworks such as the Warsaw International Mechanism for Loss and Damage, which aims to address losses caused by climate change. Such cases could also be brought to other financial schemes, such as the Green Climate Fund, which have been set up to help developing countries adapt to and mitigate the effects of climate change.

"The exit of many fishery stocks from these climate-change vulnerable nations is inevitable, but carefully designed international cooperation together with the strictest enforcement of ambitious reductions of greenhouse gas emissions, especially by the highest-emitter countries, could significantly ease the impact on those nations," García Molinos concludes.

Credit: 
Hokkaido University

The dangers facing fireflies

image: A composite image of several long exposure images of starlight and fireflies.

Image: 
Mike Lewinski on Unsplash.

The BioScience Talks podcast features discussions of topical issues related to the biological sciences.

Worldwide declines in insect populations have sparked considerable concern among researchers and members of the general public alike. To date, however, significant research gaps exist, and many insect threats remain under-investigated and poorly understood. For instance, despite their charismatic bioluminescent displays and cultural and economic importance, the 2000-plus species of firefly beetles have yet to be the subject of a comprehensive threat analysis.

Writing in BioScience, Sara M. Lewis of Tufts University and her colleagues aim to fill the gap with a broad overview of the threats facing these diverse and charismatic species--as well as potential solutions that may lead to their preservation into the future. Lewis and colleagues catalog numerous threats, foremost among them habitat loss, followed closely by artificial light and pesticide use. The future is not bleak, however, and the authors describe considerable opportunities to improve the prospects of bioluminescent insects, including through the preservation of habitat, reduction of light pollution, lowered insecticide use, and more-sustainable tourism. Dr. Lewis and coauthors Candace Fallon and Michael Reed join us on this episode of BioScience Talks to shed light on these challenges and opportunities.

To hear the whole discussion, visit this link for this latest episode of the BioScience Talks podcast.

Credit: 
American Institute of Biological Sciences

Design of the W7-X fusion device enables it to overcome obstacles

image: PPPL physicist Novimir Pablant.

Image: 
Elle Starkman/PPPL Office of Communications.

A key hurdle facing fusion devices called stellarators -- twisty facilities that seek to harness on Earth the fusion reactions that power the sun and stars -- has been their limited ability to maintain the heat and performance of the plasma that fuels those reactions. Now collaborative research by scientists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) and the Max Planck Institute for Plasma Physics in Greifswald, Germany, has found that the Wendelstein 7-X (W7-X) facility in Greifswald, the largest and most advanced stellarator ever built, has demonstrated a key step in overcoming this problem.

Cutting-edge facility

The cutting-edge facility, built and housed at the Max Planck Institute for Plasma Physics with PPPL as the leading U.S. collaborator, is designed to improve the performance and stability of the plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei, or ions, that makes up 99 percent of the visible universe. Fusion reactions fuse ions to release massive amounts of energy -- the process that scientists are seeking to create and control on Earth to produce safe, clean and virtually limitless power to generate electricity for all humankind.

Recent research on the W7-X aimed to determine whether design of the advanced facility could temper the leakage of heat and particles from the core of the plasma that has long slowed the advancement of stellarators. "That is one of the most important questions in the development of stellarator fusion devices," said PPPL physicist Novimir Pablant, lead author of a paper describing the results in Nuclear Fusion.

His work validates an important aspect of the findings. The research, combined with the findings of an accepted paper by Max Planck physicist Sergey Bozhenkov and a paper under review by physicist Craig Beidler of the institute, demonstrates that the advanced design does in fact moderate the leakage. "Our results showed that we had a first glimpse of our targeted physics regimes much earlier than expected," said Max Planck physicist Andreas Dinklage. "I recall my excitement seeing Novi's raw data in the control room right after the shot. I immediately realized it was one of the rare moments in a scientist's life when the evidence you measure shows that you're following the right path. But even now there's still a long way to go."

Common problem

The leakage, called "transport," is a common problem for stellarators and for the more widely used fusion devices called tokamaks, which have traditionally coped better with the problem. Two conditions give rise to transport in these facilities, which confine the plasma in magnetic fields that the particles orbit.

These conditions are:

Turbulence. The unruly swirling and eddying of plasma can trigger transport;

Collisions and orbits. The particles that orbit magnetic field lines can often collide, knocking them out of their orbits and causing what physicists call "neoclassical transport."

Designers of the W7-X stellarator sought to reduce neoclassical transport by carefully shaping the complex, three-dimensional magnetic coils that create the confining magnetic field. To test the effectiveness of the design, researchers investigated complementary aspects of it.

Pablant found that measurements of the behavior of plasma in previous W7-X experiments agreed well with the predictions of a code developed by Matt Landreman of the University of Maryland that parallels those the designers used to shape the twisting W7-X coils. Bozhenkov took a detailed look at the experiments and Beidler traced control of the leakage to the advanced design of the stellarator.

"This research validates predictions for how well the optimized design of the W7-X reduces neoclassical transport," Pablant said. By comparison, he added, "Un-optimized stellarators have done very poorly" in controlling the problem.

Further benefit

A further benefit of the optimized design is that it reveals where most of the transport in the W7-X stellarator now comes from. "This allows us to determine how much turbulent transport is going on in the core of the plasma," Pablant said. "The research marks the first step in showing that high-performance stellarator designs such as W7-X are an attractive way to produce a clean and safe fusion reactor."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Forest 'duff' must be considered in controlled burning to avoid damaging trees

image: This infrared image shows that long-duration heating was most prominent where accumulated forest floor duff was deepest, particularly at the base of mature pines in long-unburned sites. Temperatures potentially lethal to plant tissues were sustained for several hours as deep as 4 inches near pines, and sustained temperatures high enough to impact soil nutrients were observed for up to 35 minutes at soil surfaces.

Image: 
Jesse Kreye, Penn State

Many decades of forest fire prevention and suppression have resulted in a thick buildup of organic matter on the forest floor in many regions of the United States, according to a Penn State researcher, whose new study suggests that the peculiar way that these layers burn should be considered in plans for controlled burns.

In both the eastern and western U.S., one of the consequences of avoiding fires for so long in fire-adapted pine forests is the build-up of forest floor "duff" -- a deep, dense layer of partially decomposed pine needles -- that would otherwise not accumulate under a frequent fire regime, explained Jesse Kreye, assistant research professor of fire and natural resources management in the College of Agricultural Sciences.

That accumulation of organic debris can complicate efforts to use prescribed fire as a forest management tool, he explained, and this buildup of duff, particularly pronounced at the base of pines, is problematic if there is a wildfire.

"When these forests do burn under dry conditions, the long-duration smoldering that occurs in this dense duff -- long after the 'flames' have gone out -- results in significant heat transfer to the tree as well as the soil," Kreye said. "That can result in mortality of large, older pines and potential ecological consequences below ground."

Restoring fire to these ecosystems with controlled burns requires particular burn prescriptions that will minimize duff smoldering, Kreye pointed out. This is primarily done, he said, through burning under a range of duff moistures that can result in some consumption of duff but not enough to cause significant damage.

"Repeated burns under these conditions can slowly restore bare ground by consuming some duff each time. That measured approach is important for pine regeneration as well as herbaceous plants that have been absent as a result of fire exclusion," said Kreye.

This duff-danger is a major issue in the South in longleaf pine ecosystems that are very well adapted to frequent fire, Kreye noted. Longleaf pine forests naturally have one of the most frequent-fire regimes of any forest ecosystems, burning every three to five years on average. But the phenomenon is not limited to the southeastern U.S., Kreye contends.

"This is an issue in many Western pine forests as well, particularly ponderosa pine forests that range across much of the western U.S.," he said. "This is also a likely problem in pine forests of the east that have depended on regular burning, such as red pine, pitch pine and shortleaf pine."

Prescribed fire is commonly used in southeastern U.S. forests and is being applied more widely in fire-prone ecosystems elsewhere. Research on the direct effects of burning has focused on above-ground impacts on plants, with less attention to below-ground effects, Kreye said. This study, recently published in Forest Science, is among the first to look at below-ground effects of forest fire.

Researchers used probes attached to thermometers to measure soil heat at sampling locations after controlled fires near the bases of mature trees in a longleaf pine flatwoods ecosystem and a longleaf pine sandhill ecosystem, both in northern Florida. They found that soil heating was minimal in frequently burned sites. Where fire had been excluded for several decades, however, they detected substantial soil heating sustained for considerable durations.

Long-duration heating was most prominent where accumulated forest floor duff was deepest, particularly at the base of mature pines in long-unburned sites. Temperatures potentially lethal to plant tissues -- at or above 140 degrees Fahrenheit -- were sustained for several hours as deep as 4 inches near pines in flatwoods sites. Sustained temperatures at or above 500 F -- when impacts to soil nutrients can occur -- were observed for up to 35 minutes at soil surfaces.

Also involved in the research were J. Morgan Varner, U.S. Forest Service, Tall Timbers Research Station, Tallahassee, Florida; and Leda Kobziar, University of Idaho.

Credit: 
Penn State

Quantifiable observation of cloud seeding

Two University of Wyoming researchers contributed to a paper that demonstrated, for the first time, direct observation of cloud seeding using radar and gauges to quantify the snowfall. Traditionally, cloud seeding -- used to increase winter snowpack -- has been evaluated using precipitation gauges and target/control statistics that led mostly to inconclusive results.

The research, dubbed SNOWIE (Seeded and Natural Orographic Wintertime Clouds -- the Idaho Experiment), took place Jan. 7-March 17, 2017, within and near the Payette Basin, located approximately 50 miles north of Boise, Idaho. The research was in concert with Boise-based Idaho Power Co., which provides a good share of its electrical power through hydroelectric dams.

"This looks at how much snow falls out of seeded clouds at certain locations. That's what's in this paper," says Jeff French, an assistant professor in UW's Department of Atmospheric Science and fourth author of the paper. "We want to see if we can apply what we learned over a number of cases over an entire winter."

The paper, titled "Quantifying Snowfall from Orographic Cloud Seeding," appears in the Feb. 24 (today's) issue of the Proceedings of the National Academy of Sciences (PNAS), one of the world's most prestigious multidisciplinary scientific journals, with coverage spanning the biological, physical and social sciences.

The paper is a follow-up to a previous PNAS paper, by the same research team, titled "Precipitation Formation from Orographic Cloud Seeding," which was published in January 2018. That paper focused on what happens in the clouds when silver iodide is released into the clouds. In the case of the SNOWIE Project, the silver iodide was released by a second aircraft funded through Idaho Power Co., while the UW King Air took measurements to understand the impact of the silver iodide, French says.

Katja Friedrich, an associate professor and associate chair of atmospheric and oceanic sciences at the University of Colorado-Boulder, was the newest paper's lead author. Bart Geerts, a UW professor and department head of atmospheric science, was sixth author on the paper. Other contributors were from the University of Illinois at Urbana-Champaign, the National Center for Atmospheric Research (NCAR) and Idaho Power Co.

Throughout the western U.S. and other semiarid mountainous regions across the globe, water supplies are fed primarily through snowpack melt. Growing populations place higher demand on water, while warmer winters and earlier spring reduce water supplies. Water managers see cloud seeding as a potential way to increase winter snowfall.

"We tracked the seeding plumes from the time we put the silver iodide into the cloud until it generated snow that actually fell onto the ground," Friedrich says.

French credits modern technology, citing the use of ground-based radar, radar on UW's King Air research aircraft and multiple passes over a target mountain range near Boise, with making the detailed cloud-seeding observations happen. Despite numerous experiments spanning several decades, no direct, unambiguous observation of this process existed prior to SNOWIE, he says.

Over the years, research of cloud seeding "has been clouded," so to speak, Geerts adds. He says it was difficult to separate natural snowfall and what amount was actually produced through cloud seeding. However, this study was able to provide quantifiable snowfall.

"Natural snowfall was negligible. That really allowed us to isolate snow added through cloud seeding," Geerts says. "However, we are still in the dark where there is lots of natural snowfall."

Following a brief airborne seeding period Jan. 19, 2017, snow fell from the seeded clouds for about 67 minutes, dusting roughly 900 square miles of land in about one-tenth of a millimeter of snow, based on the team's calculations. In all, that cloud-seeding event and two more later that month produced a total of about 235 Olympic-sized swimming pools' worth of water.

Other observations where snow from cloud seeding was measured took place Jan. 20 and Jan. 31 of that year.

In all, the UW King Air made 24 research flights or intense observation periods (IOPs) lasting 4-6 hours each during SNOWIE. Of those IOPs, cloud seeding occurred during 21 of the flights. During the last three flights, Idaho Power had to suspend cloud seeding because there was so much snow in the mountains already.

While a good deal of research took place aboard the King Air, much of it also occurred on the ground. Numerical modeling of precipitation measurements was conducted using the supercomputer, nicknamed Cheyenne, at the NCAR-Wyoming Supercomputing Center. The numerical models simulated clouds and snow precipitation -- created in natural storms and with cloud seeding -- over the Payette Basin near Boise. The numerical models also allow researchers to study future storm events where measurements have not been obtained in the field.

While the 24 research flights by the King Air were a good start, Geerts says, in an ideal world, even more flights are necessary to learn more about cloud seeding in other regions of the country.

Friedrich adds that the research is an important first step toward better understanding just how efficient cloud seeding can be at creating those winter wonderlands.

"Everyone you talk to will say, even if you can generate a little bit more snow, that helps us in the long run," she says.

French says the team has applied for a new National Science Foundation grant to continue analyzing cloud-seeding data collected from the remaining research flights during 2017.

"We will look at areas where natural snowfall occurs," French says. "We'll take what we learned and see if we can quantify how much snow was produced through silver iodide in areas already receiving snow.

"When we get done with the next three years, we'd like to go out and make similar-type measurements in Wyoming, Colorado or Utah, where clouds may have different characteristics," French adds. "We can broaden the types of clouds we can sample."

Credit: 
University of Wyoming

Quadrupling turbines, US can meet 2030 wind-energy goals

ITHACA, N.Y. - The United States could generate 20% of its electricity from wind within 10 years, without requiring any additional land, according to Cornell University research published in Nature Scientific Reports.

"The United States currently produces about 7% of its electricity from wind energy," said Sara Pryor, professor in the Department of Earth and Atmospheric Sciences. "This research shows that a quadrupling of the installed capacity of wind turbines from 2014 levels will allow us to attain the goal of 20% of electricity from the wind, without requiring additional land, or negative impacts on systemwide efficiency or local climates."

Pryor worked with Rebecca Barthelmie, professor in the Sibley School of Mechanical and Aerospace Engineering, and postdoctoral researcher Tristan Shepherd to develop scenarios for how wind energy can expand from current levels to one-fifth of the entire U.S. electricity supply by 2030, as outlined by the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) in 2008.

Called the "20% Wind Scenario," the NREL report noted that generating 20% of U.S. electricity from wind could eliminate approximately 825 million metric tons of carbon dioxide emissions in the electrical energy sector in 2030.

From 2016 to 2017, wind-generated electricity in the U.S. grew by 12% to 254 terawatt hours - then increased another 8.3% to 275 terawatt hours in 2018, the researchers said. In context, the U.S. currently uses approximately 310 to 320 terawatt hours of electricity each month - generated from coal, natural gas, nuclear and renewable energy power plants.
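
As a rough consistency check using only the figures quoted in this article (an illustration, not a calculation from the study), the 2018 wind generation and the monthly consumption figure imply roughly the 7% share mentioned above:

```python
# Back-of-the-envelope check using only the numbers quoted in this article.
wind_2018_twh = 275            # U.S. wind generation in 2018 (TWh)
monthly_use_twh = 315          # approximate U.S. electricity use per month (TWh)

annual_use_twh = 12 * monthly_use_twh          # ~3,780 TWh per year
wind_share = wind_2018_twh / annual_use_twh    # ~0.073

print(f"Wind share of U.S. electricity: {wind_share:.1%}")  # ~7%, matching the article
```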

"Wind energy is already playing a key role in decarbonizing the global energy system," Pryor said. "Wind turbines repay the lifetime carbon emissions associated with their deployment and fabrication in three to seven months of operation and provide nearly 30 years of virtually carbon-free electricity generation."

But, the researchers asked, does quadrupling the number of wind turbines reduce the efficiency of turbine fleets that gather energy? And can that generation negatively affect the local climate?

In high-density arrays of large wind turbines, the researchers examined possible declines in systemwide efficiency associated with a phenomenon called "wind turbine wakes," where the wind speed is slowed by extraction of momentum by wind turbines. This wake is eroded by mixing with undisturbed air in the atmosphere, but can reduce the wind speed that impinges on downstream wind turbines.

"The 'theft' of wind by upstream wind turbines reduces the overall power produced by the total ensemble of wind turbines and the enhanced mixing (turbulence) can alter local climate conditions close to wind turbines," said Barthelmie.

The researchers offered scenarios - such as repowering turbines with improved technology - for expanding the installed capacity of wind turbines without using additional land. The researchers demonstrated that expansion of the installed capacity has a tiny influence on systemwide efficiency and very small impacts on local climate that are reduced by deploying large, state-of-the-art turbines.

Credit: 
Cornell University

Let it snow: Researchers put cloud seeding to the test

image: The Seeded and Natural Orographic Wintertime Clouds: The Idaho Experiment (SNOWIE) project radar dish parked on a mountaintop in western Idaho.

Image: 
Joshua Aikins

For the first time, researchers have used radar and other tools to accurately measure the volume of snow produced through cloud seeding.

Led by University of Colorado Boulder atmospheric scientist Katja Friedrich and her colleagues, the research began on a chilly day in January 2017. That's when the team watched as a flurry settled over a patch of land in western Idaho.

The gentle snow wasn't a natural occurrence. It had been triggered through cloud seeding, a technique in which tiny particles are mixed into the atmosphere to try to generate more precipitation than might normally fall.

The approach has become increasingly popular in states like Idaho and Colorado that are grappling with how to quench their growing demands for water. It's also notoriously difficult to measure.

But on three days in January 2017 in Idaho's Payette Basin, that's just what Friedrich's team did, monitoring three attempts at cloud seeding from start to finish. Collaborators on the project included researchers from the National Center for Atmospheric Research in Boulder, University of Wyoming and University of Illinois at Urbana-Champaign.

"We tracked the seeding plume from the time we put it into the cloud until it generated snow that actually fell onto the ground," said Friedrich, an associate professor in the Department of Atmospheric and Oceanic Sciences.

In all, that cloud seeding event and two more later that month produced a total of about 282 Olympic-sized swimming pools worth of water. The group reported its findings today in the Proceedings of the National Academy of Sciences.

Friedrich added that the research is an important first step toward better understanding just how efficient cloud seeding can be at creating those winter wonderlands.

"Everyone you talk to will say even if you can generate a little bit more snow, that helps us in the long run," she said.

Idaho flurries

On Jan. 19, that little bit of additional snow kicked off with an airplane flight. Just before sunset, a plane owned by Idaho Power Company used a series of flares to inject particles of silver iodide into a natural cloud formation that was passing overhead.

The idea behind such cloud seeding is a simple one--to turn lightweight water vapor into heavy droplets.

"If everything goes according to plan, the water droplets will begin to freeze around the aerosols, forming snow," Friedrich said.

But, she added, it's also tricky to get a good sense of just how effective that transition really is, which is why most cloud seeding statistics lead to inconclusive results. Estimates range anywhere from zero to 50% additional snowfall, Friedrich said.

On that January day, however, she and her colleagues had a plan. The group used a nearby radar dish to peer into the clouds as the water inside thickened and eventually succumbed to gravity.

Based on the team's calculations, snow fell from those clouds for about 67 minutes, dusting roughly 900 square miles of land in about a tenth of a millimeter of snow.
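
As a rough back-of-the-envelope check using only the figures quoted here (not a calculation from the paper), 900 square miles covered by a tenth of a millimeter of water works out to roughly 90 Olympic-sized pools for this single event, which lines up with the roughly 282-pool total reported above for the three seeding events combined, assuming the events were of broadly similar size:

```python
# Rough check using only the figures quoted in this article; the tenth of a
# millimeter is treated here as liquid-equivalent depth.
area_m2 = 900 * 2.59e6        # 900 square miles in square meters
depth_m = 0.1e-3              # ~0.1 mm of water
pool_m3 = 2500                # volume of an Olympic-sized swimming pool (cubic meters)

volume_m3 = area_m2 * depth_m                  # ~2.3e5 cubic meters
print(volume_m3 / pool_m3)                     # ~93 pools for this one event
# Three events of similar size would total roughly 280 pools, close to the
# ~282-pool figure reported for the month.
```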

It was barely enough snow to cling to the researchers' eyelashes. But it was water that, if not for cloud seeding, would have stayed in the air.

"If we hadn't seeded these clouds, they would not have produced any precipitation," Friedrich said.

Every little bit helps

And some in Colorado have high hopes for that process.

In 2019, the state entered into a partnership with six others that border the Colorado River to step up its efforts at cloud seeding--an attempt to increase the supply of water to that valuable waterway.

Friedrich added that, for now, she can't say how useful cloud seeding might be for such efforts moving forward--every winter storm is different and interacts with aerosols in different ways. But the group's findings could get scientists closer to being able to make those cost-benefit calculations.

"We can now finally put a number on how much water we can produce through cloud seeding," Friedrich said.

Credit: 
University of Colorado at Boulder

Let it snow: Researchers put cloud seeding to the test

image: A suite of meteorological instrumentation used in the study, including a micro-rain radar.

Image: 
Photo by Joshua Aikins

CHAMPAIGN, Ill. -- Cloud seeding has become an increasingly popular practice in the western United States, where states grapple with growing demands for water. Measuring how much precipitation cloud seeding produces has been a longstanding challenge. Researchers have developed a way to use radar and other tools to more accurately measure the volume of snow produced through cloud seeding.

The findings of the multi-institution study, co-written by University of Illinois at Urbana-Champaign professor Robert Rauber, are published in the Proceedings of the National Academy of Sciences.

In January 2017, a team led by University of Colorado Boulder atmospheric scientist Katja Friedrich seeded clouds and tracked the associated snowfall over western Idaho. They fired flares composed of silver iodide from an airplane to inject microscopic particles into already-formed clouds to encourage water droplets to freeze into ice crystals.

"Ice particles grow rapidly and fall within clouds," said Rauber, a professor of atmospheric sciences and director of the School of Earth, Society and Environment at Illinois. "The goal was to generate more snow than might normally fall on the mountain slopes."

"We tracked the seeding plume from the time we put it into the cloud until it generated snow that reached the ground," said Friedrich, a professor of atmospheric and oceanic sciences at CU Boulder.

In all, the cloud seeding event and two more that month produced a total of about 282 Olympic-sized swimming pools worth of water in the form of snow.

"Even if you can generate a little bit more snow, that helps us in the long run," Friedrich said. "This is an important first step toward better understanding just how efficient cloud seeding can be at creating those winter wonderlands."

It is tricky to get a good sense of just how effective cloud seeding really is, the researchers said, which is why most statistics lead to inconclusive results. Estimates range anywhere from zero to 50% additional snowfall.

To overcome this obstacle, the group used two special mountaintop radars to peer into the clouds as the snow inside grew and succumbed to gravity. Based on the team's calculations, snow fell from those clouds for about 67 minutes, dusting roughly 900 square miles of land in about a tenth of a millimeter of snow.

"If we hadn't seeded these particular clouds, they would not have produced any precipitation," Friedrich said.

The researchers said it is too early to predict how useful cloud seeding might be in all mountain clouds moving forward - every winter storm is different and interacts with the seeded particles in different ways. However, the group's findings could get scientists closer to being able to make those cost-benefit calculations.

"We can now finally put a number on how much water we can produce through cloud seeding," Friedrich said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Modern technology reveals old secrets about the great, white Maya road

image: Built at the turn of the 7th century, the white plaster-coated road that began 100 kilometers to the east in Cobá ends at Yaxuná's ancient downtown, in the center of Mexico's Yucatan Peninsula.

Image: 
Photo courtesy Traci Ardren/University of Miami

Did a powerful queen of Cobá, one of the greatest cities of the ancient Maya world, build the longest Maya road to invade a smaller, isolated neighbor and gain a foothold against the emerging Chichén Itzá empire?

The question has long intrigued Traci Ardren, archaeologist and University of Miami professor of anthropology. Now, she and fellow scholars may be a step closer to an answer, after conducting the first lidar study of the 100-kilometer stone highway that connected the ancient cities of Cobá and Yaxuná on the Yucatan Peninsula 13 centuries ago.

Once used mainly by meteorologists to study clouds, lidar--short for "light detection and ranging"--technology is revolutionizing archaeology by enabling archaeologists to detect, measure, and map structures hidden beneath dense vegetation that, in some cases, has grown for centuries, engulfing entire cities. Often deployed from low-flying aircraft, lidar instruments fire rapid pulses of laser light at a surface, and then measure the amount of time it takes for each pulse to bounce back. The differences in the times and wavelengths of the bounce are then used to create digital 3D maps of hidden surface structures.
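
The ranging step at the heart of lidar reduces to a simple time-of-flight calculation (the generic relation, not anything specific to this survey): the distance to a reflecting surface is half the round-trip travel time multiplied by the speed of light. A minimal sketch, with assumed example numbers:

```python
# Generic lidar time-of-flight calculation (illustrative; example numbers only).
SPEED_OF_LIGHT = 3.0e8   # meters per second

def pulse_range(round_trip_seconds):
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after about 6.7 microseconds traveled to a surface
# roughly 1,000 meters away -- e.g. the ground seen from a low-flying aircraft.
print(pulse_range(6.7e-6))   # ~1005 m
```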

The lidar study, which Ardren and fellow researchers with the Proyecto de Interaccion del Centro de Yucatan (PIPCY) conducted in 2014 and 2017 of Sacbe 1--or White Road 1, as the white plaster-coated thoroughfare was called--may shed light on the intentions of Lady K'awiil Ajaw, the warrior queen who Ardren believes commissioned its construction at the turn of the 7th century.

In an analysis of the lidar study, recently published in the Journal of Archaeological Science, the researchers identified more than 8,000 tree-shrouded structures of varying sizes along the sacbe--with enough total volume to fill approximately 2,900 Olympic swimming pools. The study also confirmed that the road, which measures about 26 feet across, is not a straight line, as has been assumed since Carnegie Institute of Washington archaeologists mapped its entire length in the 1930s, with little more than a measuring tape and a compass.

Rather, the elevated road veered to incorporate preexisting towns and cities between Cobá--which, known for its carved monuments depicting bellicose rulers standing over bound captives, controlled the eastern Yucatan--and Yaxuná, a smaller, older city in the middle of the peninsula. Yet the isolated Yaxuná (pronounced Ya-shoo-na) still managed to build a pyramid nearly three times bigger than, and centuries before, Chichén Itzá's more famous Castillo, about 15 miles away.

"The lidar really allowed us to understand the road in much greater detail. It helped us identify many new towns and cities along the road--new to us, but preexisting the road," Ardren said. "We also now know the road is not straight, which suggests that it was built to incorporate these preexisting settlements, and that has interesting geo-political implications. This road was not just connecting Cobá and Yaxuná; it connected thousands of people who lived in the intermediary region."

It was partly Yaxuná's proximity to Chichén Itzá, Mexico's most famous Maya ruin, which flourished after Yaxuná and Cobá waned, that led Ardren and other PIPCY researchers to theorize that K'awiil Ajaw built the road to invade Yaxuná and gain a foothold in the middle of the peninsula. Cobá's ruler for several decades beginning in 640 A.D., she is depicted in stone carvings trampling over her bound captives.

"I personally think the rise of Chichén Itzá and its allies motivated the road," Ardren said. "It was built just before 700, at the end of the Classic Period, when Cobá is making a big push to expand. It's trying to hold on to its power, so with the rise of Chichén Itzá, it needed a stronghold in the center of the peninsula. The road is one of the last-gasp efforts of Cobá to maintain its power. And we believe it may have been one of the accomplishments of K'awiil Ajaw, who is documented as having conducted wars of territorial expansion."

To test their theory, Ardren, an expert on gender in ancient Maya society who edited the 2002 book "Ancient Maya Women," and fellow PIPCY scholars received funding from the National Science Foundation to excavate ancient household clusters along the great white road. Their goal is to determine the degree of similarities between the household goods in Cobá and Yaxuná before and after the road was built. The thinking, Ardren said, is that after the road linked the two cities, the goods found in Yaxuná would show increasing similarities to Cobá's.

So far, the researchers have excavated household clusters on the edge of both Cobá and Yaxuná, and they plan to begin a third dig this summer, at a spot informed by the lidar study. It sits between the two ancient Maya cities, on the great, white road that Ardren says would have glowed brightly even in the dark of night.

As she noted, the road was as much an engineering marvel as the monumental pyramids the Maya erected across southern Mexico, Guatemala, northern Belize and western Honduras. Although built over undulating terrain, the road was flat, with the uneven ground filled in with huge limestone boulders, and the surface coated with bright, white plaster. Essentially the same formula the Romans used for concrete in the third century B.C., the plaster was made by burning limestone and adding lime and water to the mixture.

"It would have been a beacon through the dense green of cornfields and fruit trees," Ardren said. "All the jungle we see today wasn't there in the past because the Maya cleared these areas. They needed wood to build their homes. And now that we know the area was densely occupied, we know they needed a lot of wood. Because they also needed it to burn limestone''--and build the longest road in the Maya world 13 centuries ago.

Credit: 
University of Miami

Listening to bursting bubbles

image: Sound signatures from violent fluid events, like bubbles bursting, can be used to measure forces at work during these events

Image: 
Bussonnière et al. Physical Review Letters (2020)

Analyzing sounds from fluids in motion can help scientists gather data from biological and physical events that can be hard to quantify. For example, scientists use the famous Doppler effect to calculate how fast blood is flowing in the body. Now, scientists have measured the acoustics of a bursting soap bubble, a common example of a violent event, to decipher the origin of the popping sound. Researchers Bussonnière et al. determined that the forces exerted by the liquid soap film on the air are those that create the "pop" as the bubble bursts. The results indicate how sound signatures from a violent event could be harnessed to measure forces during the event, according to the authors.
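
For context on the Doppler example mentioned above, the standard Doppler-ultrasound relation (not part of the bubble study) converts the shift between transmitted and returned sound frequencies into a flow speed. The numbers in this sketch are assumed for illustration:

```python
# Standard Doppler-ultrasound relation (illustrative; example numbers are assumed).
import math

def blood_speed(freq_shift_hz, transmit_freq_hz, sound_speed_m_s=1540.0, angle_deg=60.0):
    """Flow speed in m/s from the measured Doppler frequency shift.

    sound_speed_m_s  speed of sound in soft tissue (~1540 m/s)
    angle_deg        angle between the ultrasound beam and the flow direction
    """
    return sound_speed_m_s * freq_shift_hz / (2.0 * transmit_freq_hz * math.cos(math.radians(angle_deg)))

# A 1.3 kHz shift at a 2 MHz transmit frequency and a 60-degree beam angle
# corresponds to a flow speed of roughly 1 m/s.
print(blood_speed(1.3e3, 2.0e6))   # ~1.0 m/s
```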

Credit: 
American Physical Society

The neutron's electric dipole moment

image: Techniques for studying the neutron's electrical charges have reached a new level of sensitivity

Image: 
C. Abel et al., Physical Review Letters (2020)

Although the neutron has no net electrical charge, it still could have an electric dipole moment (EDM)--a structural measure of the distance between its positive and negative electrical charges. According to researchers C. Abel et al., this moment would "act like an electronic compass to electric fields just as an ordinary compass does with magnetic fields." These researchers are searching for the neutron's electric dipole moment to probe new physics that was at play just after the Big Bang. Now, they have refined the sensitivity of techniques used to search for the neutron's EDM. The new advancements with improved sensitivity will assist scientists who have been searching for the neutron's elusive electric dipole moment since the 1950s.

Credit: 
American Physical Society

Shining a new light on biomimetic materials

video: Schematic representation of optical self-trapping within SP-functionalized hydrogels with two remote beams; each beam is switched on and off to control the interaction.

Image: 
Aizenberg/Saravanamuttu Lab. Proceedings of the National Academy of Sciences Feb 2020, 201902872; DOI: 10.1073/pnas.1902872117

Advances in biomimicry - creating biological responses within non-biological substances - will enable synthetic materials to behave in ways that were typically only found in Nature. Light provides an especially effective tool for triggering life-like, dynamic responses within a range of materials. The problem, however, is that the applied light is typically dispersed throughout the sample, and thus it is difficult to localize the bio-inspired behavior to the desired, specific portions of the material.

A convergence of optical, chemical and materials sciences, however, has yielded a novel way to utilize light to control the local dynamic behavior within a material. In a general sense, the illuminated material mimics a vital biological behavior: the ability of the iris and pupil in the eye to dynamically respond to the in-coming light. Furthermore, once the light enters the sample, the material itself modifies the behavior of the light, trapping it within regions of the sample.

The latest research from the University of Pittsburgh's Swanson School of Engineering, Harvard University and McMaster University, reveals a hydrogel that can respond to optical stimuli and modify the stimuli in response. The group's findings of this opto-chemo-mechanical transduction were published this month in the Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.1902872117).

The Pitt authors include Anna C. Balazs, Distinguished Professor of Chemical and Petroleum Engineering and John A. Swanson Chair of Engineering; and Victor V. Yashin, Visiting Research Assistant Professor. Other members include Joanna Aizenberg, Amos Meeks (co-first author) and Anna V. Shneidman, Wyss Institute for Biologically Inspired Engineering and Harvard John A. Paulson School of Engineering and Applied Sciences; Ankita Shastri, Harvard Department of Chemistry and Chemical Biology; and Fariha Mahmood, Derek Morim (co-first author), Kalaichelvi Saravanamuttu and Andy Tran, McMaster University, Ontario, Canada.

"Until only a decade or so ago, the preferred state for materials was static. If you built something, the preference was that a material be predictable and unchanging," Dr. Balazs explained. "However, as technology evolves, we are thinking about materials in new ways and how we can exploit their dynamic properties to make them responsive to external stimuli.

"For example, rather than programming a computer to make a device perform a function, how can we combine chemistry, optics and materials to mimic biological processes without the need for hard-wired processors and complex algorithms?"

The findings continue Dr. Balazs' research with spiropyran (SP)-functionalized hydrogels and the material's photo-sensitive chromophores. Although the SP gel resembles gelatin, it is distinctive in its ability to contain beams of light and not disperse them, similar to the way fiber optics passively control light for communication. However, unlike a simple polymer, the water-filled hydrogel reacts to the light and can "trap" the photons within its molecular structure.

"The chromophore in the hydrogel plays an important role," she explains. "In the absence of light, the gel is swollen and relaxed. But when exposed to light from a laser beam about the width of a human hair, it changes it structure, shrinks and becomes hydrophobic. This increases the polymer density and changes the hydrogel's index of refraction and traps the light within regions that are denser than others. When the laser is removed from the source, the gel returns to its normal state. The ability of the light to affect the gel and the gel in turn to affect the propagating light creates a beautiful feedback loop that is unique in synthetic materials."

Most surprisingly, the group found that the introduction of a second, parallel beam of light creates a type of communication within the hydrogel. One of the self-trapped beams not only controls a second beam, but also the control can happen with a significant distance between the two, thanks to the response of the hydrogel medium. Dr. Yashin notes that this type of control is now possible because of the evolution of materials, not because of advances in laser technology.

"The first observation of self-trapping of light occurred in 1964, but with very large, powerful lasers in controlled conditions," he said. "We can now more easily achieve these behaviors in ambient environments with far less energy, and thus greatly expand the potential use for non-linear optics in applications."

The group believes that opto-chemo-mechanical responses present a potential sandbox for exploration into soft robotics, optical computing and adaptive optics.

"There are few materials designed with a built-in feedback loop," Dr. Balazs said. "The simplicity of the responses provides an exciting way to mimic biological processes such as movement and communication, and open new pathways toward creating devices that aren't reliant on human control."

Credit: 
University of Pittsburgh