Tech

First ICESat-2 global data released: Ice, forests and more

image: For the second straight year, NASA researchers endured low temperatures, biting winds, and high altitude to conduct another 88-South Traverse. The 470-mile expedition across one of the most barren landscapes on Earth provides the best means of assessing the accuracy of data collected from space by the Ice, Cloud and land Elevation Satellite-2 (ICESat-2).

Image: 
NASA's Goddard Space Flight Center/Ryan Fitzgibbons

More than a trillion new measurements of Earth's height - blanketing everything from glaciers in Greenland, to mangrove forests in Florida, to sea ice surrounding Antarctica - are now available to the public. With millions more observations added each day, data from NASA's Ice, Cloud and land Elevation Satellite-2 is providing a precise global portrait of elevation and will allow scientists to track even the slightest changes in the planet's polar regions.

"The data from ICESat-2 are really blowing our minds, and I'm really excited to see what people with different perspectives will do with it," said Lori Magruder, a senior research scientist at the University of Texas, Austin, and the ICESat-2 science team lead.

The long-awaited ICESat-2 mission, launched in September 2018, continues the record of polar height data begun with the first ICESat satellite, which operated from 2003 to 2009. NASA's airborne Operation IceBridge project bridged the data gap between the two satellites. The new satellite provides far more measurements than its predecessor. ICESat took approximately 2 billion measurements in its lifetime, a figure ICESat-2 surpassed within its first week.

When ICESat orbited over a rift in Antarctica's Filchner-Ronne Ice Shelf in October 2008, for example, it recorded a handful of data points indicating a crevasse in the ice. When ICESat-2 passed over 10 years later, it collected hundreds of measurements tracing the sheer walls and jagged floor of the growing rift.

ICESat-2 is taking these measurements in a dense grid across the Arctic as well as Antarctica, recording each spot every season to track both seasonal and annual changes in ice.

ICESat-2's ability to measure heights beyond the poles is also impressing scientists - Magruder pointed to coastal areas, where in clear waters the satellite can detect the seafloor up to 100 feet (30 meters) below the surface. Over forests, the satellite detects not only the top of the canopy but also the forest floor below - which will allow researchers to calculate the mass of vegetation in a given area.

All this is being done with six laser beams from a satellite 310 miles (500 kilometers) in space, noted Tom Neumann, ICESat-2 project scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland.

"Getting the exact latitude, longitude, and elevation of where a photon bounced off Earth is hard - lots of things have to happen and go really, really well," he said. To make sure everything is working, the science team conducts a series of checks using data from airborne surveys, ground-based campaigns, even the satellite itself.

That includes scientists traveling to Antarctica, where they drove modified snow-groomers along an arc of the 88-degree-south latitude line, taking highly accurate elevation measurements to compare with the data collected by ICESat-2 in space. Magruder compared measurements taken in White Sands, New Mexico, with what the satellite was tracking. In its most recent Antarctic and Arctic campaigns, NASA's airborne Operation IceBridge flew specific routes designed to take measurements over the same ice, at close to or exactly the same time the satellite flew overhead.

ICESat-2 is designed to precisely measure the height of ice and track how it changes over time. Earth's melting glaciers cause sea levels to rise globally, and shrinking sea ice can change weather and climate patterns far from the planet's poles.

Small changes across vast areas like the Greenland ice sheet can have large consequences. ICESat-2 will be able to measure the shift in annual elevation across the ice sheet to within a fraction of an inch. To do this, the satellite uses a laser altimeter - an instrument that times how long it takes light to travel to Earth's surface and back. With that time - along with the knowledge of where in space ICESat-2 is, and where on Earth the laser is pointing - computer programs create a height data point. The data are initially processed at NASA Goddard, then turned into advanced data products that researchers can use to study elevations across the globe.
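
The height retrieval itself is simple geometry once a photon's round-trip time is known. Here is a minimal sketch of that calculation, assuming a nadir-pointing beam and a known orbit altitude, and ignoring the pointing, atmospheric-delay and orbit corrections that the operational processing applies:

```python
# Minimal sketch of laser-altimeter height retrieval. Assumes a
# nadir-pointing beam and a known orbit altitude; the operational
# processing also corrects for pointing angle, atmospheric delay,
# and orbit/attitude errors.
C = 299_792_458.0  # speed of light, m/s

def surface_height(orbit_altitude_m: float, round_trip_time_s: float) -> float:
    """Elevation of the reflecting surface relative to the reference height."""
    one_way_range_m = C * round_trip_time_s / 2.0  # photon goes down and back
    return orbit_altitude_m - one_way_range_m

# From ~500 km up, a photon returns after about 3.336 milliseconds;
# a ~6.7-nanosecond change in round-trip time corresponds to a
# one-meter change in surface elevation.
print(surface_height(500_000.0, 3.3356e-3))  # surface near the reference height
```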

ICESat-2 data products are now available for free from the National Snow and Ice Data Center at https://nsidc.org/data/icesat-2.

For more information, visit http://www.nasa.gov/content/goddard/icesat-2 or https://icesat-2.gsfc.nasa.gov/. For more information on the data products, visit: https://earthdata.nasa.gov/icesat-2-data

Credit: 
NASA/Goddard Space Flight Center

New organic flow battery brings decomposing molecules back to life

image: This new flow battery brings molecules back from the dead. So-called zombie molecules cut the capacity fade rate of the battery by at least a factor of 40 while keeping the battery low-cost.

Image: 
Harvard SEAS

After years of making progress on an organic aqueous flow battery, Harvard University researchers ran into a problem: the organic anthraquinone molecules that powered their ground-breaking battery were slowly decomposing over time, reducing the long-term usefulness of the battery.

Now, the researchers -- led by Michael Aziz, the Gene and Tracy Sykes Professor of Materials and Energy Technologies at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Roy Gordon, the Thomas Dudley Cabot Professor of Chemistry and Professor of Materials Science -- have figured out not only how the molecules decompose, but also how to mitigate and even reverse the decomposition.

The death-defying molecule, named DHAQ in their paper but dubbed the "zombie quinone" in the lab, is among the cheapest to produce at large scale. The team's rejuvenation method cuts the capacity fade rate of the battery by at least a factor of 40, while enabling the battery to be composed entirely of low-cost chemicals.

The research was published in the Journal of the American Chemical Society.

"Low mass-production cost is really important if organic flow batteries are going to gain wide market penetration," said Aziz. "So, if we can use these techniques to extend the DHAQ lifetime to decades, then we have a winning chemistry."

"This is a major step forward in enabling us to replace fossil fuels with intermittent renewable electricity," said Gordon.

Since 2014, Aziz, Gordon and their team have been pioneering the development of safe and cost-effective organic aqueous flow batteries for storing electricity from intermittent renewable sources like wind and solar and delivering it when the wind isn't blowing and the sun isn't shining. Their batteries use molecules known as anthraquinones, which are composed of naturally abundant elements such as carbon, hydrogen, and oxygen, to store and release energy.

At first, the researchers thought that the lifetime of the molecules depended on how many times the battery was charged and discharged, as in solid-electrode batteries such as lithium-ion. However, in reconciling inconsistent results, the researchers discovered that these anthraquinones decompose slowly over time, regardless of how many times the battery has been used. The amount of decomposition depends on the calendar age of the molecules, not on how often they have been charged and discharged.
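
The distinction matters because the two hypotheses predict different capacity curves. The toy model below contrasts them; the rate constants are invented for illustration and are not values from the paper:

```python
# Toy comparison of cycle-driven vs. calendar-driven capacity fade.
# The rate constants are illustrative only, not values from the paper.
import math

def cycle_fade(n_cycles: int, loss_per_cycle: float = 1e-4) -> float:
    """Remaining capacity fraction if decomposition scaled with cycle count."""
    return (1.0 - loss_per_cycle) ** n_cycles

def calendar_fade(days: float, rate_per_day: float = 1e-3) -> float:
    """Remaining capacity fraction if decomposition depends only on age."""
    return math.exp(-rate_per_day * days)

# Two cells aged 100 days, one cycled ten times more than the other:
# the calendar model predicts identical remaining capacity, which is
# what reconciling the apparently inconsistent experiments revealed.
print(calendar_fade(100.0))               # same for both cells
print(cycle_fade(100), cycle_fade(1000))  # diverges if cycling drove the fade
```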

That discovery led the researchers to study the mechanisms by which the molecules were decomposing.

"We found that these anthraquinone molecules, which have two oxygen atoms built into a carbon ring, have a slight tendency to lose one of their oxygen atoms when they're charged up, becoming a different molecule," said Gordon. "Once that happens, it starts of a chain reaction of events that leads to irreversible loss of energy storage material."

The researchers found two techniques to avoid that chain reaction. The first: expose the molecule to oxygen. The team found that if the molecule is exposed to air at just the right part of its charge-discharge cycle, it grabs the oxygen from the air and turns back into the original anthraquinone molecule -- as if returning from the dead. A single experiment recovered 70 percent of the lost capacity this way.

Second, the team found that overcharging the battery creates conditions that accelerate decomposition. Avoiding overcharging extends the lifetime by a factor of 40.

"In future work, we need to determine just how much the combination of these approaches can extend the lifetime of the battery if we engineer them right," said Aziz.

"The decomposition and rebirth mechanisms are likely to be relevant for all anthraquinones, and anthraquinones have been the best-recognized and most promising organic molecules for flow batteries," said Gordon.

"This important work represents a significant advance toward low-cost, long-life flow batteries," said Imre Gyuk, Director of the Department of Energy's Office of Electricity Storage program. "Such devices are needed to allow the electric grid to absorb increasing amounts of green but variable renewable generation."

Credit: 
Harvard John A. Paulson School of Engineering and Applied Sciences

Societal values and perceptions shape energy production and use as much as new technology

CORVALLIS, Ore. - Societal values and perceptions have shaped the energy landscape as much as the technologies that drive its production and consumption, a new paper from an Oregon State University researcher suggests.

The finding challenges the longstanding supposition that technology is the ruler of energy policy in the United States. Whether the issue is fossil fuels, nuclear power or renewable energy, discussions in both public forums and policy circles have centered on energy technologies as a primary source of opportunity and risk.

But other factors also are in play, said Hilary Boudet, an associate professor of public policy at OSU and one of a growing number of researchers challenging the long-held notion.

"Rather than operating as an independent variable, energy production and use is ultimately harnessed and directed by society," Boudet said. "New technologies have expanded the boundaries of our energy systems. But it's been people and politics that have determined how energy is used within those boundaries, often in ways that draw upon both personal experience and long-standing beliefs and practices."

Boudet describes current "people-centered" trends in energy policy in a new article published this week in the journal Nature Energy. The piece is a broad overview of recent publications in the burgeoning field.

Boudet focuses specifically on public perceptions and responses to new energy technologies, ranging from large-scale wind, solar and ocean-wave energy projects to cutting-edge consumer technologies such as electric vehicles, rooftop solar panels and smart meters.

In her review, Boudet identifies four dominant factors shaping public perceptions of new energy technologies - technology, people, place and process, as well as the interaction between them. These factors have considerable overlap, she noted.

"The old way of thinking was to get people to accept new technology by providing them with more information about it. But what we are finding now is that is not enough," Boudet said. "Studies have shown that more information does not necessarily change opinions or result in a consensus on how to move forward."

Support or disapproval for new energy sources is more likely to be based on personal values and experiences, as well as what we perceive to be the stances of others we trust. As a result, it's important to understand that people may harbor different views concerning new energy technologies based on their personal values and experiences, as well as the views held within their social networks, Boudet said.

"Fracking - the hydraulic fracturing process for extracting oil and gas - is a good example of this dynamic," she said. "When the risks and benefits remain abstract and distant, people lean largely on their own values and political leanings to form an opinion. In contrast, those who live near fracking examine the 'facts on the ground' to shape their opinions of the risks and benefits. Their perceptions and attitudes are primarily driven by the impact that the technology is having on their lives and their community."

At the same time, how decisions are made about technology deployment also shapes public perceptions and reactions. Particularly for large-scale energy projects like wind turbines and solar farms, if residents in nearby communities feel that they are not being adequately consulted or that decision-making processes are unfair, even the most technologically sound projects can fail.

Those trying to win support for a project often attempt to change aspects of the technology or aspects of the decision-making process to address public concerns, but that strategy may not work as well as it has in the past given the current divisive political climate, Boudet said.

A better approach may be a longer-term strategic planning effort around energy development that includes an assessment of deeply rooted social and cultural values and perceptions that play a fundamental role in shaping public attitudes and actions, she said.

"A greater understanding of the human dimension of energy technologies," Boudet contended, "would likely lead to more sustainable and effective energy policies, built on a broader perception that encompasses not just what happens in our universities, laboratories and research centers but also in our homes, neighborhoods and work places each and every day."

Credit: 
Oregon State University

Army project develops agile scouting robots

image: UC Berkeley robotics graduate student Justin Yim discusses his jumping robot, Salto, which is funded by the US Army.

Image: 
University of California, Berkeley

RESEARCH TRIANGLE PARK, N.C. (May 28, 2019) - In a research project for the U.S. Army, researchers at the University of California, Berkeley developed an agile robot, called Salto, that looks like a Star Wars Imperial walker in miniature and may be able to aid in scouting and search-and-rescue operations.

Robots like this may one day be used to save lives of both warfighters and civilians, researchers said.

Topping out at less than a foot, Salto, which stands for saltatorial (leaping like a grasshopper) locomotion on terrain obstacles, now has a sophisticated control system that allows it to master increasingly complex tasks, like bouncing in place, navigating an obstacle course or following a moving target, all directed by a radio controller.

In 2016, the research team demonstrated how Salto could take a leap and then immediately spring higher by ricocheting off a wall, making it the world's most vertically agile robot - jumping more than three times its height.

With its new capabilities, the researchers hope Salto will propel the development of small, nimble robots that could leap through rubble to aid in search-and-rescue and other military missions.

"The physical environment the Army operates in is highly irregular, cluttered, and constantly changing," said Dr. Samuel Stanton, program manager at Army Research Office, an element of U.S. Army Combat Capability Development Command's Army Research Laboratory. "The science underlying the advancements is critical for achieving the desired mobility, speed of action, and situational awareness generation necessary for future Army operations."

The research team described the robot's new skills at the 2019 International Conference on Robotics and Automation in Montreal May 21.

"Small robots are really great for a lot of things, like running around in places where larger robots or humans can't fit. For example, in a disaster scenario, where people might be trapped under rubble, robots might be really useful at finding the people in a way that is not dangerous to rescuers and might even be faster than rescuers could have done unaided," said UC Berkeley robotics graduate student Justin Yim. "We wanted Salto to not only be small, but also able to jump really high and really quickly so that it could navigate these difficult places."

Yim works with Ronald Fearing, an electrical engineering and computer sciences professor at UC Berkeley, whose Biomimetic Millisystems Lab explores how the mechanics of animal movement can be applied to create more agile robots.

Fearing's lab is known for building insect-inspired robots that can safely crawl across tricky surfaces that are too smooth or too rough for a wheeled robot to navigate.

Salto's single, powerful leg is modeled after those of the galago, or Senegalese bush baby. The small, tree-dwelling primate's muscles and tendons store energy in a way that gives the spry creature the ability to string together multiple jumps in a matter of seconds. By linking a series of quick jumps, Salto also can navigate complex terrain -- like a pile of debris -- that might be impossible to cross without jumping or flying.

"Unlike a grasshopper or cricket that winds up and gives one jump, we're looking at a mechanism where it can jump, jump, jump, jump," Fearing said. "This allows our robot to jump from location to location, which then gives it the ability to temporarily land on surfaces that we might not be able to perch on."

Yim has also equipped Salto with new technology that allows it to sense its own body, telling it what angle it is pointing and how far its leg is bent. Without these abilities, Salto was confined to a room in one of Berkeley's engineering buildings, where motion-capture cameras tracked its exact angle and position and transmitted that data back to a computer, which rapidly crunched the numbers to tell Salto how to angle itself for its next leap.

Now that Salto has a sense of itself and its own motion, the robot can make these calculations for itself, allowing Yim to take the robot outside and use a joystick and radio controller to tell it where to go.
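
As a rough illustration of what replacing motion capture with onboard sensing involves, the sketch below shows a generic complementary-filter attitude estimator of the kind commonly used on small robots. The sensor samples and gain are hypothetical; this is not Salto's actual flight code:

```python
# Generic complementary-filter pitch estimator, a common way for a
# small robot to sense its own orientation from onboard sensors.
# Gains and sample data are hypothetical, not Salto's actual code.
import math

def update_pitch(pitch_prev: float, gyro_rate: float,
                 accel_x: float, accel_z: float,
                 dt: float, alpha: float = 0.98) -> float:
    """Fuse gyro (good short-term) with accelerometer (gravity reference)."""
    pitch_gyro = pitch_prev + gyro_rate * dt       # integrate angular rate
    pitch_accel = math.atan2(accel_x, accel_z)     # tilt from gravity vector
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

pitch = 0.0
samples = [(0.10, 0.02, 0.98), (0.12, 0.03, 0.97)]  # (gyro rad/s, ax, az in g)
for gyro, ax, az in samples:
    pitch = update_pitch(pitch, gyro, ax, az, dt=0.001)
print(pitch)  # running onboard estimate, no external cameras needed
```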

"By understanding the way that these dynamics work for Salto, with its mass and size, then we can extend the same type of understanding to other systems, and we could build other robots that are bigger or smaller or differently shaped or weighted," Yim said.

In the future, Fearing hopes to continue to explore the possibilities for hopping robots.

"This Army investment extends the current state of the art for small ground robot mobility beyond what is currently capable through traditional wheeled and tracked locomotion which are severely limited in complex three-dimensional terrain," said Dr. Brett Piekarski, Vehicle Technology Directorate, ARL. "These advances will inform and guide our Army Research Laboratory researchers as they continue to develop innovative solutions for robotic actuation and mobility and will enable agile robots that can go anywhere a Soldier can and beyond. This research brings us a step closer to providing our warfighters with effective unmanned systems that can be deployed in the field."

Credit: 
U.S. Army Research Laboratory

Energy researchers break the catalytic speed limit

image: A new discovery by University of Minnesota and University of Massachusetts Amherst researchers could increase the speed and lower the cost of thousands of chemical processes used in developing fertilizers, foods, fuels, plastics, and more.

Image: 
College of Science and Engineering

A team of researchers from the University of Minnesota and the University of Massachusetts Amherst has discovered a technology that can speed up chemical reactions to 10,000 times the current reaction rate limit. These findings could increase the speed and lower the cost of thousands of chemical processes used in developing fertilizers, foods, fuels, plastics, and more.

The research is published online in ACS Catalysis, a leading journal of the American Chemical Society.

In chemical reactions, scientists use catalysts to speed reactions. A reaction occurring on a catalyst surface, such as a metal, will speed up, but it can only go as fast as permitted by what is called Sabatier's principle. Often called the "Goldilocks principle" of catalysis, it holds that the best possible catalyst perfectly balances the two parts of a chemical reaction: reacting molecules should stick to the metal surface neither too strongly nor too weakly, but "just right." Since this principle was established quantitatively in 1960, the Sabatier maximum has remained the catalytic speed limit.

Researchers of the Catalysis Center for Energy Innovation, funded by the U.S. Department of Energy, found that they could break the speed limit by applying waves to the catalyst to create an oscillating catalyst. The wave has a top and bottom, and when applied, it permits both parts of a chemical reaction to occur independently at different speeds. When the wave applied to the catalyst surface matched the natural frequency of a chemical reaction, the rate went up dramatically via a mechanism called "resonance."
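
A toy kinetic model makes the resonance idea concrete. In the sketch below, a single scaling constraint ties adsorption and desorption together (the Sabatier trade-off), and square-wave switching between a strong-binding and a weak-binding state at a well-matched period beats the best static catalyst. All rate constants are invented for illustration and are not taken from the study:

```python
# Toy model of a dynamically oscillated catalyst. With the scaling
# constraint k_a * k_d = 1, the best static rate k_a*k_d/(k_a + k_d)
# peaks at k_a = k_d = 1 (the volcano top), giving 0.5. Switching
# between strong and weak binding at a period matched to the surface
# kinetics exceeds that limit. All numbers are illustrative.

def dynamic_rate(k_strong=(100.0, 0.01), k_weak=(0.01, 100.0),
                 half_period=0.02, n_periods=200, steps=500):
    theta, produced, elapsed = 0.0, 0.0, 0.0  # coverage, turnovers, time
    dt = half_period / steps
    for _ in range(n_periods):
        for k_a, k_d in (k_strong, k_weak):   # square-wave oscillation
            for _ in range(steps):
                adsorbed = k_a * (1.0 - theta) * dt
                desorbed = k_d * theta * dt   # desorption releases product
                theta += adsorbed - desorbed
                produced += desorbed
                elapsed += dt
    return produced / elapsed

print(1.0 * 1.0 / (1.0 + 1.0))  # best static (Sabatier) rate: 0.5
print(dynamic_rate())           # oscillating catalyst: far above 0.5
```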

"We realized early on that catalysts need to change with time, and it turns out that kilohertz to megahertz frequencies dramatically accelerate catalyst rates," said Paul Dauenhauer, a professor of chemical engineering and materials science at the University of Minnesota and one of the authors of the study.

The catalytic speed limit, or Sabatier maximum, is only accessible to a few metal catalysts. Other metals that bind more weakly or more strongly exhibit slower reaction rates. For this reason, plots of catalyst reaction rate versus metal type have been called "volcano-shaped plots," with the best static catalyst sitting right in the middle, at the volcano peak.

"The best catalysts need to rapidly flip between strong and weak binding conditions on both sides of the volcano diagram," said Alex Ardagh, post-doctoral scholar in the Catalysis Center for Energy Innovation. "If we flip binding strength quickly enough, catalysts that jump between strong and weak binding actually perform above the catalytic speed limit."

The ability to accelerate chemical reactions directly affects thousands of chemical and materials technologies used to develop fertilizers, foods, fuels, plastics, and more. In the past century, these products have been optimized using static catalysts such as supported metals. Enhanced reaction rates could significantly reduce the amount of equipment required to manufacture these materials and lower the overall costs of many everyday materials.

Dramatic enhancement in catalyst performance also has the potential to scale down systems for distributed and rural chemical processes. Due to cost savings in large-scale conventional catalyst systems, most materials are only manufactured in enormous centralized locations such as refineries. Faster dynamic systems can be smaller processes, which can be located in rural locations such as farms, ethanol plants, or military installations.

"This has the potential to completely change the way we manufacture almost all of our most basic chemicals, materials, and fuels," said Professor Dionisios Vlachos, director of the Catalysis Center for Energy Innovation. "The transition from conventional to dynamic catalysts will be as big as the change from direct to alternating current electricity."

Credit: 
University of Minnesota

Palm oil: Down from the conservation barricades and out of the rhetorical trenches

image: The view of an oil palm plantation in Indonesia.

Image: 
Douglas Sheil

Oil palm is neither the devil's work nor a godsend to humanity. Its effects on its surroundings depend largely on case-specific circumstances. Those who call for a boycott of all palm oil because of its contribution to deforestation should, to be consistent, also consider boycotting coffee, chocolate and coconut.

Are you for or against palm oil?

Ask anyone who has kept half an eye on the news the last couple of years, and they will most likely say "Against, obviously. The plantations destroy orangutan habitats, right? We've all seen the videos."

The environmental impacts of the palm oil industry are widely recognised. Unsurprisingly, many people, including many conservation pundits, consider oil palm a major evil. What is less widely recognised is the extent to which this industry has benefited people. Oil palm development, if well planned and managed, can provide improved incomes and employment and generate investments in services and infrastructure. These competing viewpoints fuel a polarised debate in which oil palm is cast either as a gift from god or as a crime against humanity.

According to science, it is neither.

Flawed debate

Two leading scientists on forest conservation and management are calling for a more nuanced debate when it comes to oil palm and its plantations.

"Our key message is the following: The effects of oil palm, on the environment and on human society, are case-specific and largely dependent on circumstances. This must be recognised when conducting debate, and making management and consumer decisions," professor Douglas Sheil from the Norwegian University of Life Sciences (NMBU) says.

In a new scientific article, he and collaborator, professor Erik Meijaard from the University of Queensland, Australia and University of Kent, UK, explore questions related to the production and use of palm oil and other vegetable oils. Between them, they have more than 50 years of research experience on tropical forest conservation.

The only cause of deforestation?

Oil palm is widely reviled for causing large-scale deforestation in the species-rich tropics. With 18.7 million hectares of industrial-scale oil palm plantations in 2017, it is ranked 4th in terms of planted area for an oil crop, behind soy, rapeseed (or canola) and maize.

"Currently oil palm produces about 35% of global vegetable oils on less than 10% of the total land under oil crops," Sheil says.

"Overall, conversion to industrial scale oil palm development appears associated with less than 0.5% of global deforestation but surpasses 50% in specific regions such as Malaysian Borneo."

Locally, it can be environmentally devastating, but on a global scale it is just one of many crops that should receive attention from environmentalists and governments.

"Bananas, beef, cane sugar, chocolate, coconuts, coffee, pineapples, soybeans, tea and vanilla, to name a few, are all produced in previously forested tropical areas," he comments.

"But the attention these receive is hardly comparable to the scrutiny that is directed towards palm oil."

A wider perspective

"There is no doubt that the impacts from oil palm plantings on the environment and biodiversity at local scales can be summed up as highly negative," Sheil says.

"But in terms of global outcomes, the debate changes."

It is imperative to assess to what extent the negative impacts can be reduced or avoided.

"For example, by planting palm oil, or other crops for that matter, in areas that are deforested already. It is better to utilise already degraded areas, than cutting down new ones."

This is already being done; it is just not widely acknowledged, and it needs to be encouraged.

"Another element that is insufficiently recognised, is that the negative consequences of the expansion of palm oil plantations in one location are potentially offset by, for example, reduced expansion of other oil crops elsewhere."

Poverty alleviator

Who would deny a parent the opportunity of feeding their starving children? For some, palm oil is a way out of poverty when few other options exist. And economically, it is often a sensible option. Oil palms will grow in conditions that would defeat most other crops, and decades of successful breeding have increased yields dramatically.

"Regardless of measure, land, labour or inputs invested, oil palm is an exceptionally profitable crop," Meijaard says.

Scams and broken promises

However, palm oil is tainted with stories of corruption and disreputable practices. Its ability to produce considerable profits, even from areas where comparable options were absent, has fuelled a boom in speculation, opportunism and dubious practices. In locations with weak or corrupt institutions, this has parallels to the resource curse seen in some other high value commodities, such as mineral oil.

The immediate returns from land clearance for oil palm can also be substantial, encouraging some unscrupulous investors to access and clear large areas for the timber value on the promise of longer-term oil palm developments that never appear.

"Such scams have been common across Indonesia in recent decades, with both officials and communities duped into giving away their forest and timber for a broken promise," Meijaard says.

The authors emphasise that benefits from oil palm development likely depend much on the local context, such as variation between companies in how they engage with communities.

North or south?

The world demands vegetable oils, and if palm oil is not available, other crops will replace it.

"A call for reductions in palm oil production will require an increase in other, higher latitude, oil crops, like soy, maize, sunflower and rapeseed," Meijaard says.

The largest areas allocated for the production of vegetable oils are in the USA, China and Brazil, although the predominant crops there, maize and soybeans, also produce non-oil products. Nevertheless, among the world's 20 largest producers of oil crops, only the tropical countries of Indonesia, Nigeria and Malaysia have more than 10% of their land areas allocated to oil palm. A global shift away from palm oil would require more production of other oils.

"This would most likely benefit economies in the global North, where deforestation for agriculture took place a lot earlier than in the tropics."

The danger of extremism

"Boycotts against palm oil by consumers or consuming countries are a legitimate expression of social and environmental concerns," Meijaard says.
He warns, however, that such boycotts punish innocent and guilty alike.

"Banning palm oil rather than seeking improved standards risks lowering rather than raising the practices."

If similar standards are not applied to other crops and commodities, including those produced in consumer countries themselves, such boycotts can appear political, prejudiced and protectionist.

"We are already seeing palm oil producing countries protesting against what they see as Western double-standards."

"If we are not careful, there is a risk of driving them even further away."

Introducing nuance

"Stepping outside this rhetorical extremism is necessary if we seek resolution and pragmatic advances. An important question is how to plan, guide, and assess oil palm developments to foster the greatest benefits and least harm," Meijaard comments.

The authors recommend that a more complete accounting should consider not just the environmental aspects but the influence on poverty, hunger, and all the factors considered under the 17 UN-Sustainable Development Goals (SDGs).

"What is right and what is wrong depends on who you ask, and it is unlikely that there are clear universal answers as to how to best tackle contemporary global problems in a just and equitable manner, apart from providing informed choice."

"We have to bring nuance back into the debate," he concludes.

Credit: 
Norwegian University of Life Sciences

Inconsistent choice-making a normal part of how the brain evaluates options

image: Ryan Webb is an Assistant Professor at the Rotman School of Management. His research integrates the disciplines of Neuroscience, Psychology, and Economics to provide insight into consumer behaviour. His recent research includes developing tools to predict consumer choices from measurements of neural activity (fMRI and EEG), and modelling consumer behaviour from the principles of information coding in the human brain.

He received a Ph.D. in Economics from Queen's University and research fellowships from New York University and the California Institute of Technology.

Image: 
Rotman School of Management

Toronto - Economists have noticed that people can behave inconsistently when making choices. According to economic theory, people should choose the same things every time, under the same circumstances, because they are recognized as holding the same value as before. But people don't always do that. Sometimes consumers will switch their preferences, known in industry terms as "customer churn." While economists have previously called that an error in rationality, a new study says an important part of inconsistent choice-making is due to idiosyncratic activity in the brain areas that assess value.

"If the value of a Coke is higher to you than a Pepsi, then you should choose the Coke every single time," explains study co-author Ryan Webb, an assistant professor at the University of Toronto's Rotman School of Management. "But because of these 'noisy' fluctuations in neural activity, every so often, the Pepsi is better than the Coke."

Prof. Webb, together with Vered Kurtz-David and Profs. Dotan Persitz and Dino Levy of Tel Aviv University, was able to observe the phenomenon by getting research volunteers to play a series of lotteries while lying inside a functional magnetic resonance imaging (fMRI) scanner. The fMRI monitors neural activity by detecting changes in blood flow to different parts of the brain.

The volunteers had to choose between different combinations of tokens directed towards two simultaneous lotteries, each with a 50% chance of being the winner. Each volunteer played the lotteries multiple times in quick succession while inside the fMRI.

The fMRI studies showed that the areas of the brain that were most active during the most inconsistent choices were the same areas responsible for evaluating value. In other words, the brain areas that usually make rational choices sometimes make irrational ones too. This contradicts previous theories suggesting that rational and irrational decision-making are influenced by activity in separate parts of the brain, or by different thinking processes, an idea popularized in Daniel Kahneman's book, Thinking, Fast and Slow.

The results suggest that occasional inconsistent choices are fundamental to how a typical brain works, regardless of efforts to ensure people stick religiously to their usual preferences.

"Sometimes people are going to choose another product," says Prof. Webb, who brings together neuroscience, psychology and economics in his research. "Every so often, they're going to switch."

Credit: 
University of Toronto, Rotman School of Management

NASA-supported monitoring network assesses ozone layer threats

image: Banner Photo: Map of AGAGE network https://agage.mit.edu/global-network

Image: 
AGAGE

On the heels of the first definitive signs of ozone layer recovery last year, an international team of scientists discovered that production and emission of a banned, potent ozone-depleting chemical is on the rise again. A new research finding, published in Nature on May 23, locates the source region for about half of those new emissions: since 2013, an increase of about 7,000 tons per year of trichlorofluoromethane, or CFC-11, added to the atmosphere has originated from eastern China.

The Gosan GAW Regional Station

The Gosan GAW Regional Station (Global Atmosphere Watch, program of the World Meteorological Organization) is located on the south-western tip of Jeju Island (Republic of Korea), facing the East China Sea. The station rests at the top of a 72 m cliff, about 100 km south of the Korean peninsula, 500 km northeast of Shanghai, China, and 250 km west of Kyushu, Japan.
Credits: AGAGE

Locating and identifying this particular source of CFC-11 was possible in part because of a NASA-supported monitoring network for atmospheric gases that has been in place since 1978.

CFC-11 was one of the first generation of ozone-depleting substances banned by the Montreal Protocol, an international agreement signed by 196 countries to protect the stratospheric ozone layer. The ozone layer protects life on Earth from harmful ultraviolet radiation. CFC-11 was completely phased out of production in 2010. Its reappearance in the atmosphere will likely delay the recovery of the ozone layer.

The discovery of increased CFC-11 emissions was first made by researchers in 2018 with observational data from the Mauna Loa Observatory in Hawaii, part of an ongoing activity at the National Oceanic and Atmospheric Administration (NOAA)'s Global Monitoring Division. In the United States, NASA and NOAA are charged with monitoring threats to the ozone layer under the 1990 amendment to the Clean Air Act.

From 2012 onward, they found that CFC-11 concentration levels were not decreasing as fast as expected. Using computer models and other analysis techniques that simulated wind patterns and the movements of gases throughout the world, they determined that the increase in emissions was most likely from eastern Asia. The increase in emissions also suggested new production. The question was, where exactly did it come from? The monitoring station in Hawaii was too far from other land masses to be able to narrow down the origin beyond eastern Asia.

The ozone research community turned to the Advanced Global Atmospheric Gas Experiment (AGAGE), a network of 15 monitoring stations set up around the world to measure 40 atmospheric gases that contribute to greenhouse warming and ozone depletion. The network operations are led by the Center for Global Change Science at the Massachusetts Institute of Technology. NASA funds five of these stations and provides basic infrastructure that supports the entire network, partnering with multinational environmental agencies and with station host countries of Ireland, Norway, Switzerland, Australia, Japan, and Korea. The network has produced a 40-year continuous data record of frequent measurements of some of the most difficult-to-detect trace gases in the atmosphere.

"The primary way we monitor changes in ozone is through satellite observations," said Ken Jucks program manager for NASA's Upper Atmosphere Research Program at NASA Headquarters in Washington. "But it's much harder to measure ozone-depleting substances by satellite. A few of the CFCs can be measured from space, but not to the level of accuracy you need to understand their changes over time."

Scientists need to be able to detect concentrations and concentration changes of much less than one part per trillion of ozone-depleting gases. This can only be done effectively with instruments on the ground. To find out where these gases are coming from, the observations also need to be made relatively close to the source of the emissions. Two newer stations fit the bill for observing East Asian emissions: the South Korean Gosan AGAGE station, run by Kyungpook National University in South Korea, and the AGAGE-affiliated station on Hateruma island in Japan, run by Japan's National Institute of Environmental Studies.

"What you see in those data sets are basically pollution events," Jucks said. Pollution events coming from a localized source get diluted and harder to detect and trace back the farther they travel away from their origin. "As we're getting these observations in more populated areas, we basically get a data point an hour, which is a big advantage for these types of measurements compared to whole air gas samples," Jucks said, because that gives the modeling teams sufficient data to back-trace the gases to their source.

The data observations of CFC-11 combined with atmospheric modeling by researchers at the University of Bristol, the UK Met Office, the Swiss Federal Laboratories for Materials Science and Technology (Empa) and the Massachusetts Institute of Technology determined that the most likely source of the CFC-11 was new production from China.

When the AGAGE network was originally set up, the emphasis was on understanding the background levels of greenhouse and ozone-depleting gases and how they were transported through the atmosphere. With older stations set up farther from sources, researchers were able to get a better sense of the global concentrations and lifetimes of these gases - and how high or low those levels have gotten through time. This context is essential for monitoring current and future changes to CFC levels.

"Without the record that goes back all this time it would be really difficult to fully understand with the modeling experiments what the CFCs have been doing," Jucks said. "It gives us a tight constraint for the changes seen in the atmosphere over the last 40 years. Without that we wouldn't be able to tell whether concentrations are changing as we expect them to over time."

Credit: 
NASA/Goddard Space Flight Center

New evidence supports surgery for rare type of brain lymphoma

image: Left: resectable tumor. Right: more diffuse, nonresectable tumor.

Image: 
Alicia Ortega, M.S.

Through a systematic review of published studies going back 50 years, Johns Hopkins Medicine researchers say they have identified a distinct subtype of primary central nervous system (PCNS) lymphoma that should be considered for surgical removal, suggesting a major shift in how this type of tumor is evaluated and managed.

Treatment for PCNS lymphoma -- a rare but aggressive form of cancer in the brain that involves infection-fighting immune system cells -- has traditionally been biopsy, radiotherapy, and high-dose chemotherapy with methotrexate, but surgical resection has not had a role because of the risk of damage to healthy brain tissue.

PCNS lymphoma is a form of non-Hodgkin lymphoma confined to the brain, eyes, spinal cord or tissues that cover the brain and spinal cord. It accounts for 1%-2% of central nervous system tumors, or approximately 1,400 new cases in the United States each year. It more often strikes the elderly and immunosuppressed. Ten-year survival is estimated at 10%-13%.

In a report on their study, published March 20, 2019, in World Neurosurgery, researchers from the Johns Hopkins University School of Medicine and Johns Hopkins Kimmel Cancer Center describe two subtypes of PCNS lymphoma easily distinguishable by MRI: one superficial and localized, which might be considered for surgical removal, and another deep-seated and diffuse, likely not suited to surgery. They estimate that about 20% of patients have the localized type of tumor and could potentially be cured with surgery followed by methotrexate treatment.

To see if that was the case, the researchers performed what they described as a comprehensive and systematic review of studies of PCNS lymphoma published between January 1968 and May 2018 on patients who had biopsies of their tumor or surgeries to remove them. They tracked treatment, side effects, progression-free survival and overall survival, comparing the outcomes and complications of patients who had biopsies only with those who had surgical resections.

Overall, they identified 1,291 citations and 244 manuscripts, and selected information from 24 for a focused data review. The selected studies included information on 15,280 patients and met certain criteria, such as involving human subjects, including data on at least five patients, reporting primary data, and providing survival or complication data on stereotactic biopsy versus resection of PCNS lymphoma.

Of the 24, the 15 older, smaller studies, largely single-institution retrospective series, found no benefit from surgery to remove some or all of the tumor. The most prominent was a 1990 study that confirmed the benefits of methotrexate but cited post-operative complications in 4 of 10 patients treated and concluded that surgical treatment was not recommended.

Nine larger and more recent studies, including a 526-patient randomized German clinical trial published in 2012, found surgery beneficial in select patients, particularly when the tumor was well defined and located in more superficial regions of the brain, and in younger patients.

Since the 2012 study, five other studies began to reveal differences among PCNS lymphoma subtypes and a potential role for surgery, according to study leader Debraj Mukherjee, M.D., M.P.H., assistant professor of neurosurgery and director of neurosurgical oncology at Johns Hopkins Bayview Medical Center.

"What our study showed us is that we really should be thinking about PCNS lymphoma as two types of tumors with different methods of treatment for each," Mukherjee says. "Surgery to remove the superficial, localized types of tumors does not seem to put patients at greater risk and also improves outcomes for these patients, while the larger, deeper tumors are not suited for surgery because of their location near the ventricle system in the brain."

The researchers say the identification of two different subtypes of PCNS lymphoma explains the discrepancies among the older studies that found no overall benefit to surgery and the newer studies that did. "The older trials never delved into this question of size, type and location of tumor," explains Mukherjee. "It was thought that these tumors were all diffuse and multifocal, and too difficult to remove with surgery without increased risk to patients."

The researchers call for more prospective studies to better define the role of surgery in treating PCNS lymphoma.

In addition to Mukherjee, other researchers included Collin Labak, Matthias Holdhoff, Chetan Bettegowda, Gary Gallia, Michael Lim and Jon Weingart.

Credit: 
Johns Hopkins Medicine

Comet inspires chemistry for making breathable oxygen on Mars

image: Konstantinos P. Giapis with his reactor that converts carbon dioxide to molecular oxygen.

Image: 
Caltech

Science fiction stories are chock full of terraforming schemes and oxygen generators for a very good reason--we humans need molecular oxygen (O2) to breathe, and space is essentially devoid of it. Even on other planets with thick atmospheres, O2 is hard to come by.

So, when we explore space, we need to bring our own oxygen supply. That is not ideal because a lot of energy is needed to hoist things into space atop a rocket, and once the supply runs out, it is gone.

One place molecular oxygen does appear outside of Earth is in the wisps of gas streaming off comets. The source of that oxygen remained a mystery until two years ago, when Konstantinos P. Giapis, a professor of chemical engineering at Caltech, and his postdoctoral fellow Yunxi Yao proposed the existence of a new chemical process that could account for its production. Giapis, along with Tom Miller, professor of chemistry, has now demonstrated a new reaction for generating oxygen that Giapis says could help humans explore the universe and perhaps even fight climate change at home. More fundamentally, though, he says the reaction represents a new kind of chemistry discovered by studying comets.

Most chemical reactions require energy, which is typically provided as heat. Giapis's research shows that some unusual reactions can occur by providing kinetic energy. When water molecules are shot like extremely tiny bullets onto surfaces containing oxygen, such as sand or rust, the water molecule can rip off that oxygen to produce molecular oxygen. This reaction occurs on comets when water molecules vaporize from the surface and are then accelerated by the solar wind until they crash back into the comet at high speed.

Comets, however, also emit carbon dioxide (CO2). Giapis and Yao wanted to test whether CO2 could also produce molecular oxygen in collisions with the comet surface, and to confirm that any such reaction was similar to water's. They designed an experiment that crashed CO2 onto the inert surface of gold foil, which cannot be oxidized and should not produce molecular oxygen. Nonetheless, O2 continued to be emitted from the gold surface. This meant that both atoms of oxygen came from the same CO2 molecule, which had effectively been split in an extraordinary manner.

"At the time we thought it would be impossible to combine the two oxygen atoms of a CO2 molecule together because CO2 is a linear molecule, and you would have to bend the molecule severely for it to work," Giapis says. "You're doing something really drastic to the molecule."

To understand the mechanism of how CO2 breaks down to molecular oxygen, Giapis approached Miller and his postdoctoral fellow Philip Shushkov, who designed computer simulations of the entire process. Understanding the reaction posed a significant challenge because of the possible formation of excited molecules. These molecules have so much energy that their constituent atoms vibrate and rotate around to an enormous degree. All that motion makes simulating the reaction in a computer more difficult because the atoms within the molecules move in complex ways.

"In general, excited molecules can lead to unusual chemistry, so we started with that," Miller says. "But, to our surprise, the excited state did not create molecular oxygen. Instead, the molecule decomposed into other products. Ultimately, we found that a severely bent CO2 can also form without exciting the molecule, and that could produce O2."

The apparatus Giapis designed to perform the reaction works like a particle accelerator, turning the CO2 molecules into ions by giving them a charge and then accelerating them using an electric field, albeit at much lower energies than are found in a particle accelerator. However, he adds that such a device is not necessary for the reaction to occur.

"You could throw a stone with enough velocity at some CO2 and achieve the same thing," he says. "It would need to be traveling about as fast as a comet or asteroid travels through space."

That could explain the presence of small amounts of oxygen that have been observed high in the Martian atmosphere. There has been speculation that the oxygen is being generated by ultraviolet light from the sun striking CO2, but Giapis believes the oxygen is also generated by high-speed dust particles colliding with CO2 molecules.

He hopes that a variation of his reactor could be used to do the same thing at more useful scales--perhaps one day serving as a source of breathable air for astronauts on Mars or being used to combat climate change by pulling CO2, a greenhouse gas, out of Earth's atmosphere and turning it into oxygen. He acknowledges, however, that both of those applications are a long way off because the current version of the reactor has a low yield, creating only one to two oxygen molecules for every 100 CO2 molecules shot through the accelerator.

"Is it a final device? No. Is it a device that can solve the problem with Mars? No. But it is a device that can do something that is very hard," he says. "We are doing some crazy things with this reactor."

Credit: 
California Institute of Technology

Does being seen really make cyclists safer on the road?

image: UBC Okanagan research shows a directional arrow on a high visibility vest can make a difference when it comes to driver behaviour regarding cyclists on the road.

Image: 
UBCO

Researchers from UBC Okanagan have determined that motorists tend to give cyclists wearing high-visibility vests more room on the road than cyclists without high-visibility clothing.

The vests, with arrows directing traffic away from pedestrians and cyclists, have been shown to reduce the number of traffic accidents involving these groups.

Gordon Lovegrove, a UBC Okanagan associate professor in the School of Engineering, suggests a bit of visual reinforcement, combined with driver education ingrained into safety apparel, may curb unnecessary accidents and fatalities.

Almost half of the world's traffic fatalities are pedestrians and cyclists, according to the World Health Organization. And while improved vehicle designs and technologies can protect drivers, vulnerable road users (VRUs)--mostly cyclists and walkers--rely primarily on infrastructure systems such as separated sidewalks and cycle track networks to reduce their risk and navigate roads securely, he explains.

"Safer vehicle designs and their supporting infrastructure networks have been planned, designed, funded, built, operated, monitored and maintained for decades in a relatively comprehensive state," says Lovegrove. "However, the same cannot be said for vulnerable road users, which have been gaining in popularity as an alternative transportation mode in recent decades."

Lovegrove and his industry collaborator, Takuro Shoji, began their research project by reviewing previous projects focused on the role communication plays in the safety of vulnerable road users.

"We were curious to find out if communication aids like signage could possibly be more important than visibility aids like reflectors," says Lovegrove.

Using proprietary high-visibility cycling apparel that features an arrow symbol, the team of researchers investigated cyclists' perception of driver responses. Although the research was based on a relatively small sample size, results indicate that passing traffic gave cyclists more respect by slowing their speeds and providing wider berths when the riders were wearing reflective apparel with an arrow symbol.

Lovegrove's research involved road tests using cyclists with and without visibility vests, as well as vests with differing graphics or communication tools. An online survey also found that participants preferred the arrow vest design, commenting that it felt the most effective and conveyed a safer 'keep left' message.

"It's funny that sometimes small visual cues for drivers can have a big impact," says Lovegrove. "Drivers have the narrowest margin of error in traffic environments due to the masses they control and the speeds at which they travel."

Lovegrove points out that 'be safe, be seen' is a statement often used when it comes to the safety of VRUs. For example, cyclists have been advised--or in some jurisdictions mandated--to use helmets, front and rear lights, reflectors and brightly-coloured clothing with retroreflective markings.

"This reflects a prevalent belief that visibility is the key to reducing vehicle-cyclist collisions," he says. "While overall detectability on the road is critical, evidence suggests that current conspicuity aids cannot provide sustainable safety in their current form, and a more optimal design is needed."

He adds that infrastructure for VRUs needs more investment. However, many governments and road authorities lack capital or have not made it a priority to implement full VRU safety measures, leaving many gaps in infrastructure and networks.

"These gaps leave VRUs to take safety into their own hands, including use of conspicuity aids such as high-visibility wear, helmets, bells, and lights with differing levels of effectiveness," he adds. "Until improved infrastructure networks are fully funded and completed, we hypothesize that communication aids are equally, if not more important, than visibility aids for VRU safety."

Credit: 
University of British Columbia Okanagan campus

Scientists discover signalling circuit boards inside body's cells

image: The first ever images of the cell-wide web have been captured by scientists at the University of Edinburgh thanks to computing techniques similar to those used for the first picture of a black hole. The findings reveal cells in the body are wired like computer chips to direct signals that instruct how they function. Unlike a fixed circuit board, however, cells can rapidly rewire their communication networks to change their behaviour.

Image: 
The University of Edinburgh

Cells in the body are wired like computer chips to direct signals that instruct how they function, research suggests.

Unlike a fixed circuit board, however, cells can rapidly rewire their communication networks to change their behaviour.

The discovery of this cell-wide web turns our understanding of how instructions spread around a cell on its head.

It was thought that the various organs and structures inside a cell float around in an open sea called the cytoplasm.

Signals that tell the cell what to do were thought to be transmitted in waves, with the frequency of the waves carrying the crucial part of the message.

Researchers at the University of Edinburgh found information is carried across a web of guide wires that transmit signals across tiny, nanoscale distances.

It is the movement of charged molecules across these tiny distances that transmits information, just as in a computer microprocessor, the researchers say.

These localised signals are responsible for orchestrating the cell's activities, such as instructing muscle cells to relax or contract.

When these signals reach the genetic material at the heart of the cell, called the nucleus, they instruct minute changes in structure that release specific genes so that they can be expressed.

These changes in gene expression further alter the behaviour of the cell. When, for instance, the cell moves from a steady state into a growth phase, the web is completely reconfigured to transmit signals that switch on the genes needed for growth.

Researchers say understanding the code that controls this wiring system could help researchers understand diseases such as pulmonary hypertension and cancer, and could one day open up new treatment opportunities.

The team made their discovery by studying the movement of charged calcium ions, the key messengers that carry instructions inside cells.

Using high-powered microscopes, they were able to observe the wiring network with the help of computing techniques similar to those that enabled the first ever image of a black hole to be obtained.

Scientists say their findings are an example of quantum biology - an emerging field that uses quantum mechanics and theoretical chemistry to solve biological problems.

The study, published in Nature Communications, was funded by the British Heart Foundation.

Professor Mark Evans, of the University of Edinburgh's Centre for Discovery Brain Sciences, said: "We found that cell function is coordinated by a network of nanotubes, similar to the carbon nanotubes you find in a computer microprocessor.

"The most striking thing is that this circuit is highly flexible, as this cell-wide web can rapidly reconfigure to deliver different outputs in a manner determined by the information received by and relayed from the nucleus. This is something no man-made microprocessors or circuit boards are yet capable of achieving."

Credit: 
University of Edinburgh

Mobile phone app designed to boost physical activity in women shows promise in trial

image: This mobile phone screenshot from the mPED trial app shows a home page including a daily message menu used by women in the study intervention, including the regular and plus groups. A pre-programmed interactive daily message or video clip automatically shows up at a predetermined time.

Image: 
University of California, San Francisco.

Activity trackers and mobile phone apps are all the rage, but do they really help users increase and maintain physical activity? A new study has found that one mobile phone app designed for inactive women did help when combined with an activity tracker and personal counseling.

Researchers said the findings offer important clues about how to make such app-based interventions successful--motivational messages and interactive feedback were notable features in this case. But they also highlight the limitations: the app did not appear to be key in helping the women stay motivated past the first three months. Understanding what did keep them motivated, the researchers said, could eventually help the development of more effective technologies that get people active and keep them active.

Funded by the National Heart, Lung, and Blood Institute (NHLBI), part of the National Institutes of Health, the study is one of the first to examine how an app-based program can help increase and maintain objectively measured daily physical activity. It was published online on May 24 in JAMA Network Open, a peer-reviewed online-only journal.

"We showed that if you design an activity app using an evidence-based approach, it will be more effective," said study leader Yoshimi Fukuoka, Ph.D., R.N., a professor in the Department of Physiological Nursing at the University of California, San Francisco. "Our findings could go a long way to get more people to move, particularly women."

Regular physical activity has long been shown to reduce the risk of obesity, heart disease, stroke, high blood pressure, diabetes and other chronic conditions. However, according to the 2018 Physical Activity Guidelines for Americans, nearly 80% of adults are not meeting the recommended activity level. Women across all age groups are less likely to be physically active than men. While apps and physical activity trackers have become an extremely popular way to break down some of those barriers, their long-term effectiveness remains unclear.

Previous trials of activity apps have frequently been short, with small sample sizes, and most did not monitor activity objectively and continuously. The current study, which lasted nine months, was called the mobile phone based physical activity education (mPED) trial. Fukuoka's research group designed its app specifically for physically inactive women, incorporating behavioral change strategies known to work well for this group, such as personalized goal setting, self-monitoring, social support, and feedback. It was critical, the researchers said, that the women were able to engage with the program at home.

The app, which was developed exclusively for the study and is not commercially available, had three main functions, including a pre-programmed interactive daily message or video that reinforced what was learned during an initial counseling session, and a daily activity diary to record progress. The app automatically increased participants' activity goals by 20 percent each week, up to 10,000 steps daily. To improve adherence, participants received an automated message if the app had not been used for three consecutive days.
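The study does not publish the app's code, but the goal-escalation and reminder rules described above are simple enough to sketch. The following Python fragment is a hypothetical illustration of that logic; the function names and the starting baseline are our own assumptions, not the mPED app's actual implementation.

# Hypothetical sketch of the goal-escalation and reminder rules
# described above; names and the 5,000-step baseline are assumptions,
# not the mPED app's actual code.

GOAL_CAP = 10_000         # daily step ceiling stated in the study
WEEKLY_INCREASE = 1.20    # goals rose 20 percent each week
INACTIVITY_LIMIT = 3      # days of no app use before a nudge

def next_weekly_goal(current_goal: int) -> int:
    """Raise the step goal 20% per week, capped at 10,000 steps/day."""
    return min(int(current_goal * WEEKLY_INCREASE), GOAL_CAP)

def needs_reminder(days_since_last_use: int) -> bool:
    """Send an automated message after three consecutive unused days."""
    return days_since_last_use >= INACTIVITY_LIMIT

# Example: a participant starting from an assumed 5,000-step baseline.
goal = 5_000
for week in range(1, 6):
    goal = next_weekly_goal(goal)
    print(f"week {week}: goal = {goal} steps/day")

print(needs_reminder(days_since_last_use=3))  # True -> send a nudge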

The trial involved 210 physically inactive women, ages 25 to 65. They were equally divided into three groups--a control group that had no intervention but used a tracking device for the nine months of the trial; a "regular" group that got counseling and used the tracker and the app for three months, then used only the tracker for the remaining six months; and a "plus" group that got counseling and used the tracker and the app for the entire nine months. Unlike most other studies, the researchers measured the women's activity every 60 seconds, every day for nine months, instead of relying on self-reported activity or intermittent activity measured by the tracker.

During the first three months, the tracker showed that, compared to the control group, the women in the regular and plus groups logged about 2,000 steps more per day, equivalent to approximately 1 mile or 20 minutes of walking. They also increased their moderate to vigorous physical activity by 18 minutes a day.
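The mile-and-minutes equivalence follows from typical walking figures (assuming a stride of about 0.8 m and a cadence of roughly 100 steps per minute; these are conventional estimates, not study measurements):

\[
2000\ \text{steps} \times 0.8\,\tfrac{\text{m}}{\text{step}} = 1600\ \text{m} \approx 1\ \text{mile},
\qquad
\frac{2000\ \text{steps}}{100\ \text{steps/min}} = 20\ \text{min}.
\]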

In the following six-month maintenance period, however, the regular and plus groups logged about 1,400 steps more than the control group and got in eight more minutes of moderate to vigorous physical activity. Researchers said these findings show that the women were able to sustain an impressive level of activity above their starting point.

However, continued use of the app by the plus group did not add any extra benefit to help maintain this increased activity, compared to the regular group, which had stopped using the app after the first three months.

"Sustaining any behavior change is difficult in general, and in particular, sustaining the increased physical activity that resulted after the intervention," Fukuoka said. "Still, it is encouraging to see that 97.6% of women in our trial completed a nine-month visit and kept up part of their increased activity."

The researchers' next goal is to refine strategies that can help sustain those increased activity levels over a longer period.

According to the study, the intervention appeared to be equally effective, no matter the user's age, race and ethnicity, body mass index, education, and household income, but the researchers cautioned that the findings might not be generalizable to men.

The research is part of a larger NIH effort to explore better ways to improve cardiovascular health.

"Exercise is just one pillar in a heart-healthy lifestyle and should complement other heart-healthy changes, such as choosing a healthy diet, aiming for a healthy weight, managing stress, getting sufficient sleep, and quitting smoking," said Josephine Boyington, Ph.D., the NHLBI project officer for the study. "People should talk to their doctors about what changes are best for optimizing their individual heart-health plans."

Credit: 
NIH/National Heart, Lung and Blood Institute

AI and high-performance computing extend evolution to superconductors

image: This image depicts the algorithmic evolution of a defect structure in a superconducting material. Each iteration serves as the basis for a new defect structure. Redder colors indicate a higher current-carrying capacity.

Image: 
Argonne National Laboratory/Andreas Glatz

Materials by design: Argonne researchers use genetic algorithms for better superconductors.

Owners of thoroughbred stallions carefully breed prizewinning horses over generations to eke out fractions of a second in million-dollar races. Materials scientists have taken a page from that playbook, turning to the power of evolution and artificial selection to develop superconductors that can transmit electric current as efficiently as possible.

Perhaps counterintuitively, most applied superconductors can operate at high magnetic fields because they contain defects. The number, size, shape and position of the defects within a superconductor work together to enhance its current-carrying capacity in the presence of a magnetic field. Too many defects, however, can block the electric current's pathway or cause a breakdown of the superconducting material, so scientists need to be selective in how they incorporate defects into a material.

In a new study from the U.S. Department of Energy's (DOE) Argonne National Laboratory, researchers used the power of artificial intelligence and high-performance supercomputers to introduce and assess the impact of different configurations of defects on the performance of a superconductor.

The researchers developed a computer algorithm that treated each defect like a biological gene. Different combinations of defects yielded superconductors able to carry different amounts of current. Once the algorithm identified a particularly advantageous set of defects, it re-initialized with that set of defects as a “seed,” from which new combinations of defects would emerge.

“Each run of the simulation is equivalent to the formation of a new generation of defects that the algorithm seeks to optimize,” said Argonne distinguished fellow and senior materials scientist Wai-Kwong Kwok, an author of the study. “Over time, the defect structures become progressively refined, as we intentionally select for defect structures that will allow for materials with the highest critical current.”
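In the study, each candidate defect configuration was scored by a large-scale vortex-dynamics simulation on a supercomputer. As a rough illustration of the evolutionary loop described above, here is a minimal Python sketch in which a defect configuration plays the role of a genome and a hypothetical stand-in fitness function replaces the full simulation; all names and numbers are our assumptions, not the authors' code.

import random

# Minimal sketch of the targeted-evolution loop described above.
# In the actual study each candidate was scored by a large-scale
# vortex-dynamics simulation; critical_current() below is a
# hypothetical stand-in fitness function.

def random_defect():
    # One defect "gene": position and size (illustrative fields only).
    return {"x": random.random(), "y": random.random(),
            "radius": random.uniform(0.01, 0.1)}

def critical_current(defects):
    # Stand-in fitness: reward moderate defect coverage, penalize
    # overcrowding (mimics "too many defects block the current").
    coverage = sum(d["radius"] ** 2 for d in defects)
    return coverage * max(0.0, 1.0 - 4.0 * coverage)

def mutate(defects):
    # A new generation perturbs, adds, or removes defects in the seed.
    children = [dict(d) for d in defects]
    for d in children:
        d["radius"] *= random.uniform(0.9, 1.1)
    if random.random() < 0.3:
        children.append(random_defect())
    elif len(children) > 1 and random.random() < 0.3:
        children.pop(random.randrange(len(children)))
    return children

seed = [random_defect() for _ in range(10)]
for generation in range(100):
    candidates = [mutate(seed) for _ in range(20)]
    best = max(candidates, key=critical_current)
    if critical_current(best) > critical_current(seed):
        seed = best  # re-initialize with the winning set as the new seed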

The reason defects form such an essential part of a superconductor lies in their ability to trap and anchor the magnetic vortices that form in the presence of a magnetic field. These vortices can move freely within a pure superconducting material when a current is applied, and when they do, they generate resistance, negating the superconducting effect. Keeping vortices pinned, while still allowing current to travel through the material, represents a holy grail for scientists seeking ways to transmit electricity without loss in applied superconductors.

To find the right combination of defects to arrest the motion of the vortices, the researchers initialized their algorithm with defects of random shape and size. While the researchers knew this would be far from the optimal setup, it gave the model a set of neutral initial conditions from which to work. As the researchers ran through successive generations of the model, they saw the initial defects transform into a columnar shape and ultimately a periodic arrangement of planar defects.

“When people think of targeted evolution, they might think of people who breed dogs or horses,” said Argonne materials scientist Andreas Glatz, the corresponding author of the study. “Ours is an example of materials by design, where the computer learns from prior generations the best possible arrangement of defects.”

One potential drawback to the process of artificial defect selection lies in the fact that certain defect patterns can become entrenched in the model, leading to a kind of calcification of the genetic data. “In a certain sense, you can kind of think of it like inbreeding,” Kwok said. “Conserving most information in our defect ‘gene pool’ between generations has both benefits and limitations as it does not allow for drastic systemwide transformations. However, our digital ‘evolution’ can be repeated with different initial seeds to avoid these problems.”

To run their model, the researchers used high-performance computing facilities at Argonne and Oak Ridge National Laboratory. The Argonne Leadership Computing Facility and the Oak Ridge Leadership Computing Facility are both DOE Office of Science User Facilities.

An article based on the study, “Targeted evolution of pinning landscapes for large superconducting critical currents,” appeared in the May 21 edition of the Proceedings of the National Academy of Sciences. In addition to Kwok and Glatz, Argonne’s Ivan Sadovskyy, Alexei Koshelev and Ulrich Welp also collaborated.

Funding for the research came from the DOE’s Office of Science.

Credit: 
DOE/Argonne National Laboratory

Origami-inspired materials could soften the blow for reusable spacecraft

video: Inspired by the paper folding art of origami, a University of Washington team created a paper model of a metamaterial that uses 'folding creases' to soften impact forces for potential applications in spacecraft, cars and beyond.

Image: 
Kiyomi Taguchi/University of Washington

Space vehicles like SpaceX's Falcon 9 are designed to be reusable. But this means that, like Olympic gymnasts hoping for a gold medal, they have to stick their landings.

Landing is stressful on a rocket's legs because they must handle the force from the impact with the landing pad. One way to combat this is to build legs out of materials that absorb some of the force and soften the blow.

University of Washington researchers have developed a novel solution to help reduce impact forces -- for potential applications in spacecraft, cars and beyond. Inspired by the paper folding art of origami, the team created a paper model of a metamaterial that uses "folding creases" to soften impact forces and instead promote forces that relax stresses in the chain. The team published its results May 24 in Science Advances.

"If you were wearing a football helmet made of this material and something hit the helmet, you'd never feel that hit on your head. By the time the energy reaches you, it's no longer pushing. It's pulling," said corresponding author Jinkyu Yang, a UW associate professor of aeronautics and astronautics.

Yang and his team designed this new metamaterial to have the properties they wanted.

"Metamaterials are like Legos. You can make all types of structures by repeating a single type of building block, or unit cell as we call it," he said. "Depending on how you design your unit cell, you can create a material with unique mechanical properties that are unprecedented in nature."

The researchers turned to the art of origami to create this particular unit cell.

"Origami is great for realizing the unit cell," said co-author Yasuhiro Miyazawa, a UW aeronautics and astronautics doctoral student. "By changing where we introduce creases into flat materials, we can design materials that exhibit different degrees of stiffness when they fold and unfold. Here we've created a unit cell that softens the force it feels when someone pushes on it, and it accentuates the tension that follows as the cell returns to its normal shape."

Just like origami, these unit cell prototypes are made out of paper. The researchers used a laser cutter to cut dotted lines into paper to designate where to fold. The team folded the paper along the lines to form a cylindrical structure, and then glued acrylic caps on either end to connect the cells into a long chain.

The researchers lined up 20 cells and connected one end to a device that delivered a push, setting off a reaction throughout the chain. Using six GoPro cameras, the team tracked the initial compression wave and the tension wave that followed as the unit cells returned to normal.

The chain of origami cells showed counterintuitive wave motion: even though the compressive pushing force from the device started the whole reaction, that force never made it to the other end of the chain. Instead, it was replaced by a tension force that started as the first unit cells returned to normal and propagated faster and faster down the chain. As a result, the unit cells at the end of the chain felt only the tension force pulling them back.
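The effect can be illustrated with a toy one-dimensional model: a chain of masses joined by springs that are soft in compression but stiff in tension, so tension signals travel faster than the initial push. The Python sketch below is a qualitative stand-in, not the paper's crease force law; the stiffness and mass values are assumptions chosen only to show the behavior.

import numpy as np

# Toy 1-D chain of 20 "origami" unit cells: point masses joined by
# springs that are soft in compression and stiff in tension. The real
# cells have a folding-crease force law; these piecewise-linear
# stiffnesses are assumptions.

N = 20                # unit cells, matching the experiment
m = 1.0               # mass per cell (arbitrary units)
k_tension = 100.0     # stiff response when a cell is stretched
k_compress = 5.0      # soft response when a cell is compressed
dt = 1e-3             # time step

x = np.arange(N, dtype=float)  # rest spacing of 1 between cells
v = np.zeros(N)
v[0] = 2.0                     # the device's initial push on cell 0

def spring_force(stretch):
    # stretch > 0 -> tension (stiff); stretch < 0 -> compression (soft)
    k = np.where(stretch > 0, k_tension, k_compress)
    return k * stretch

for step in range(20000):
    stretch = np.diff(x) - 1.0   # deviation from rest length
    f = spring_force(stretch)
    a = np.zeros(N)
    a[:-1] += f / m              # pull from the spring ahead
    a[1:] -= f / m               # reaction on the neighboring mass
    v += a * dt                  # semi-implicit (symplectic) Euler
    x += v * dt

# Tension waves here travel roughly sqrt(k_tension / k_compress) times
# faster than compression waves, so the pulling signal outruns the push.
print("spacing at far end of chain:", x[-1] - x[-2])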

"Impact is a problem we encounter on a daily basis, and our system provides a completely new approach to reducing its effects. For example, we'd like to use it to help both people and cars fare better in car accidents," Yang said. "Right now it's made out of paper, but we plan to make it out of a composite material. Ideally, we could optimize the material for each specific application."

Credit: 
University of Washington