Tech

Agricultural pickers in US to see unsafely hot workdays double by 2050

image: The first map shows the maximum daily heat index, in degrees Fahrenheit, that is the threshold for the top 5% of the hottest days in May through September. Other maps show the same value projected for 2 degrees Celsius warming, in 2050, and for 4 degrees Celsius warming, in 2100. The colors reflect federal health and safety recommendations for outdoor workers.

Image: 
Tigchelaar et al./Environmental Research Letters

The global pandemic has put a focus on essential workers, those we rely on for basic services. Workers who pick crops, from strawberries to apples to nuts, already face harsh conditions in the fields during the summer harvest months. Those conditions will worsen significantly over the coming decades.

A new study from the University of Washington and Stanford University, published online in Environmental Research Letters, looks at temperature increases in counties across the United States where crops are grown. It also looks at different strategies the industry could adopt to protect workers' health.

"Studies of climate change and agriculture have traditionally focused on crop yield projections, especially staple crops like corn and wheat," said lead author Michelle Tigchelaar, a postdoctoral researcher at Stanford University who did the work while at the UW. "This study asks what global warming means for the health of agricultural workers picking fruits and vegetables."

The average picker now experiences 21 days each year when the daily heat index -- a mix of air temperature and humidity -- would exceed workplace safety standards. Using projections from climate models, the study shows the number of unsafe days in crop-growing counties will jump to 39 days per season under 2 degrees Celsius warming, which is expected by 2050, and to 62 unsafe days under 4 degrees Celsius warming, which is expected by 2100.
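As a rough illustration of the kind of calculation involved (not taken from the study itself), the sketch below applies the U.S. National Weather Service's Rothfusz regression for the heat index and counts the days in a series of daily readings that exceed a chosen threshold. The 90-degree-Fahrenheit cutoff and the sample data are illustrative assumptions; the paper's exact heat index formulation and safety thresholds may differ.

```python
# Minimal sketch: count "unsafe" days from daily temperature and humidity.
# The threshold below is illustrative, not the study's value.

def heat_index_f(temp_f, rel_humidity):
    """NWS Rothfusz regression for the heat index (degrees F).
    Intended for temperatures of roughly 80 F and above."""
    T, R = temp_f, rel_humidity
    return (-42.379 + 2.04901523 * T + 10.14333127 * R
            - 0.22475541 * T * R - 6.83783e-3 * T ** 2
            - 5.481717e-2 * R ** 2 + 1.22874e-3 * T ** 2 * R
            + 8.5282e-4 * T * R ** 2 - 1.99e-6 * T ** 2 * R ** 2)

def count_unsafe_days(daily_temp_f, daily_rh, threshold_f=90.0):
    """Number of days whose heat index exceeds the chosen threshold."""
    return sum(1 for t, r in zip(daily_temp_f, daily_rh)
               if heat_index_f(t, r) > threshold_f)

# Illustrative stretch of hot, humid harvest days
temps = [88, 92, 95, 99, 84]      # daily maximum air temperature (F)
humidity = [55, 60, 50, 45, 70]   # relative humidity (%)
print(count_unsafe_days(temps, humidity))
```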

"I was surprised by the scale of the change -- seeing a doubling of unsafe days by mid-century, then a tripling by 2100. And we think that's a low estimate," Tigchelaar said.

The study also shows that heat waves, prolonged stretches of three or more of the hottest days for each county, will occur five times as often, on average, under 2 degrees Celsius of warming.

Roughly 1 million people are officially employed in the U.S. picking agricultural crops; the authors used U.S. Bureau of Labor Statistics job codes to determine their locations. The 20 counties that employ the most pickers are all in California, Washington, Oregon and Florida. The actual number of agricultural workers in the U.S. is estimated at more than 2 million.

This population is already vulnerable to health risks. Agricultural workers tend to have lower incomes and less health coverage, a majority say they are not fluent in English, and many do not have legal work status in the U.S., making them less likely to seek medical care. Farmworkers already report more kidney ailments and other conditions related to heat stress.

Tigchelaar began the study after a blueberry picker in Washington state died during a hot and smoky period in 2017. That death prompted Tigchelaar, then a postdoctoral researcher at the UW, to think about how agricultural workers are particularly at risk from climate change.

"The people who are the most vulnerable are asked to take the highest risk so that we, as consumers, can eat a healthy, nutritious diet," Tigchelaar said.

The authors also considered what steps might protect agricultural workers. The interdisciplinary team used an occupational health threshold value for heat stress that combines heat generated by physical activity with the external temperature and humidity. The four adaptation strategies they considered were working significantly less vigorously, taking longer breaks, wearing thinner and more breathable protective clothing, and taking breaks in a cooled shelter.

"This is the first study that I'm aware of that has attempted to quantify the effect of various adaptations, at the workplace level, to mitigate the risk of increased heat exposure with global warming for agricultural workers," said co-author Dr. June Spector, a UW associate professor of environmental and occupational health sciences.

Results show that the most effective way to reduce heat stress would be to develop lighter protective clothing that would still shield workers from pesticides or other hazards. And using any three of the four adaptation strategies in combination would be enough to offset the temperature increases.

Many workplaces are already protecting workers from heat, said Spector, who conducts research in the UW Pacific Northwest Agricultural Safety and Health Center. This new study helps employers and workers foresee future conditions and think about how to prepare.

The authors caution that the study is not an excuse to stop reducing greenhouse gas emissions. Lower emissions cannot avoid the temperature increases projected for 2050, but they would limit the more severe warming projected for 2100, and the adaptation measures considered would have a big impact on farm productivity and profitability.

"The climate science community has long been pointing to the global south, the developing countries, as places that will be disproportionately affected by climate change," said co-author David Battisti, a UW professor of atmospheric sciences. "This shows that you don't have to go to the global south to find people who will get hurt with even modest amounts of global warming -- you just have to look in our own backyard."

Credit: 
University of Washington

How the heart affects our perception

The first mechanism establishes a relationship between the phase of the heartbeat and conscious experience. In its regular rhythm, the heart contracts in the so-called systolic phase and pumps blood into the body. In the second phase, the diastolic phase, the blood flows back and the heart fills up again. A previous publication from the MPI CBS reported that the perception of external stimuli changes with the heartbeat: during systole, we are less likely to detect a weak electrical stimulus in the finger than during diastole.

Now, in a new study, Esra Al and colleagues have found the reason for this change in perception: brain activity changes over the cardiac cycle. During systole, a specific component of brain activity associated with consciousness, the so-called P300 component, is suppressed. In other words, it seems that during systole the brain makes sure that certain information is kept out of conscious experience. The brain appears to anticipate the pulse wave that sweeps through the body in systole and to treat the bodily changes that accompany it as "not real" but merely pulse-related. Normally, this keeps us from being constantly disturbed by our own pulse. However, weak stimuli that happen to coincide with systole may be missed, even though they are real.

During their investigations of heart-brain interactions, Al and colleagues also revealed a second effect of the heartbeat on perception: if a person's brain shows a stronger response to the heartbeat, the processing of the stimulus in the brain is attenuated and the person is less likely to detect it. "This seems to be a result of directing our attention between external environmental signals and internal bodily signals," explains study author Al. In other words, a large heartbeat-evoked potential seems to reflect a "state of mind" in which we are more focused on the functioning of our inner organs, such as the blood circulation, and less aware of stimuli from the outside world.

The results not only have implications for our understanding of heart-brain interactions in healthy persons, but also in patients. The senior author, Arno Villringer explains, "The new results might help to explain why patients after stroke often suffer from cardiac problems and why patients with cardiac disease often have impaired cognitive function."

The researchers investigated these relationships by sending weak electrical stimuli through electrodes attached to the study participants' fingers. In parallel, they recorded each participant's brain activity using EEG and cardiac activity using EKG.
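As a purely illustrative sketch (not the MPI CBS analysis pipeline), one simple way to relate the detection of weak stimuli to cardiac phase is to time-stamp each stimulus relative to the preceding R-peak in the EKG and compare hit rates between the early (roughly systolic) and later (roughly diastolic) parts of the cycle. The 0.3-second phase boundary and the toy data are assumptions for illustration only; the study's definitions may differ.

```python
# Hedged sketch: split stimulus trials by cardiac phase and compare hit rates.
import numpy as np

def detection_by_phase(r_peaks_s, stim_times_s, detected, systole_s=0.3):
    """Group trials by time since the last R-peak and return the hit rate
    (fraction detected) for the 'systole' and 'diastole' groups."""
    r_peaks = np.asarray(r_peaks_s, dtype=float)
    hits = {"systole": [], "diastole": []}
    for t, hit in zip(stim_times_s, detected):
        earlier = r_peaks[r_peaks <= t]
        if earlier.size == 0:
            continue                      # stimulus before the first R-peak
        delay = t - earlier[-1]
        phase = "systole" if delay < systole_s else "diastole"
        hits[phase].append(hit)
    return {p: float(np.mean(v)) if v else float("nan")
            for p, v in hits.items()}

# Toy data: R-peaks every 0.8 s, eight weak stimuli, 1 = detected
r_peaks = np.arange(0.0, 10.0, 0.8)
stims = [0.1, 0.9, 1.3, 2.5, 3.3, 3.9, 4.9, 5.7]
detected = [0, 1, 0, 1, 1, 0, 1, 1]
print(detection_by_phase(r_peaks, stims, detected))
```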

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

Making transitions from nursing home to hospital safer during COVID-19 outbreak

image: In addition to being a research scientist in the Indiana Center for Aging Research at Regenstrief Institute, Kathleen Unroe, M.D., MHA, is an associate professor of medicine at Indiana University School of Medicine and a practicing geriatrician.

Image: 
Regenstrief Institute

INDIANAPOLIS - The COVID-19 pandemic has severely impacted nursing home residents, and many have been hospitalized. To help nursing homes and emergency departments (EDs) handle these difficult patient transfers, Regenstrief Institute Research Scientist and Indiana University School of Medicine Associate Professor Kathleen T. Unroe, M.D., MHA, and colleagues have developed a list of the top 10 points for safe care transitions between nursing homes and emergency departments during the pandemic.

Acknowledging that issues such as scarce resources and the vulnerability of the nursing home population are universal, the authors write that evolving solutions are necessarily local. They produced the manuscript, they write, to guide conversation and planning between nursing homes and hospitals and to raise general awareness of the need to make transfers from nursing homes to hospitals safer.

The paper also emphasizes the need for nursing homes to work with residents and their families on advance care planning to determine whether or not hospitalization is desired.

"Especially during this pandemic, clarity on resident and family goals for care is critical," said Dr. Unroe. "Nursing home residents are at significant risk of having a poor outcome if they become sick from COVID-19. Clear communication around preferences for treatment and setting expectations has never been more important.

"For us to provide the best care for our seniors," Dr. Unroe said, "we must communicate across facilities and disciplines. Nursing home and ED providers need to directly and clearly hand-off care of patients transitioning across settings."

Credit: 
Regenstrief Institute

Scientists explore the power of radio waves to help control fusion reactions

image: Physicist Eduardo Rodriguez with images from paper.

Image: 
Photo and composite by Elle Starkman.

A key challenge to capturing and controlling fusion energy on Earth is maintaining the stability of plasma -- the electrically charged gas that fuels fusion reactions -- and keeping it millions of degrees hot to launch and maintain fusion reactions. This challenge requires controlling magnetic islands, bubble-like structures that form in the plasma in doughnut-shaped tokamak fusion facilities. These islands can grow, cool the plasma and trigger disruptions -- the sudden release of energy stored in the plasma -- that can halt fusion reactions and seriously damage the fusion facilities that house them.

Improved island control

Research by scientists at Princeton University and at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) points toward improved control of the troublesome magnetic islands in ITER, the international tokamak under construction in France, and other future fusion facilities that cannot allow large disruptions. "This research could open the door to improved control schemes previously deemed unobtainable," said Eduardo Rodriguez, a graduate student in the Princeton Program in Plasma Physics and first author of a paper in Physics of Plasmas that reports the findings.

The research follows up on previous work by Allan Reiman and Nat Fisch, which identified a new effect called "RF [radio frequency] current condensation" that can greatly facilitate the stabilization of magnetic islands. The new Physics of Plasmas paper shows how to make optimal use of the effect. Reiman is a Distinguished Research Fellow at PPPL and Fisch is a Princeton University professor and Director of the Princeton Program in Plasma Physics and Associate Director of Academic Affairs at PPPL.

Fusion reactions combine light elements in the form of plasma -- the state of matter composed of free electrons and atomic nuclei -- to generate massive amounts of energy in the sun and stars. Scientists throughout the world are seeking to reproduce the process on Earth for a virtually inexhaustible supply of safe and clean power to generate electricity for all humanity.

The new paper, based on a simplified analytical model, focuses on use of RF waves to heat the islands and drive electric current that causes them to shrink and disappear. When the temperature gets sufficiently high, complicated interactions can occur that lead to the RF current condensation effect, which concentrates the current in the center of the island and can greatly enhance the stabilization. But as the temperature increases, and the gradient of the temperature between the colder edge and the hot interior of the island grows larger, the gradient can drive instabilities that make it more difficult to increase the temperature further.

Point-counterpoint

This point-counterpoint is an important indicator of whether the RF waves will accomplish their stabilizing goal. "We analyze the interaction between the current condensation and the increased turbulence from the gradient the heating creates to determine whether the system is stabilized or not," Rodriguez says. "We want the islands not to grow." The new paper shows how to control the power and aiming of the waves to make optimal use of the RF current condensation effect, taking account of the instabilities. "Focusing on this can lead to improved stabilization of fusion reactors," Rodriguez said.

The researchers now plan to introduce new aspects into the model to develop a more detailed investigation. Such steps include work toward incorporating the condensation effect into computer codes that model the behavior of launched RF waves and their actual effect. The technique would ultimately be used in designing optimal island stabilization schemes.

Credit: 
DOE/Princeton Plasma Physics Laboratory

Scientists double understanding of genetic risk of melanoma

A global collaboration of scientists has more than doubled the known number of regions on the human genome that influence the risk of developing melanoma.

The research, involving the University of Leeds in the UK, QIMR Berghofer in Australia, and the National Cancer Institute in the US, has been published today in the journal Nature Genetics.

Melanoma is a sometimes-deadly skin cancer, with an estimated 350,000 cases worldwide in 2015, resulting in nearly 60,000 deaths.

The joint study leader and QIMR Berghofer statistical geneticist, Associate Professor Matthew Law, said the researchers identified 33 new regions of the genome and confirmed another 21 previously reported regions that are linked to a person's risk of developing melanoma of the skin.

"By finding new regions we can now narrow in on the specific underlying genes and better understand the pathways that lead to melanoma," Associate Professor Law said.

"Two of the new regions we've discovered that are linked to melanoma have previously been linked to autoimmune disorders. This provides further evidence that the immune system plays an important role in a person developing melanoma.

"We also found an association between melanoma and common genetic variants in the gene TP53, which is a gene critical in controlling DNA repair when cells divide, and in suppressing cancer."

The UK-based co-lead author, Dr Mark Iles from the University of Leeds' Institute for Data Analytics, said the researchers examined DNA from 37,000 people who had been diagnosed with melanoma and compared their genetic information to that of nearly 400,000 people with no history of the disease.

"The large population sample made it possible to recognise which regions of the genome were active in people with melanoma," Dr Iles said.

"The population sample we used is three times larger than any previous genetic study on melanoma risk and gives us strong confidence that the new regions we've discovered all play a role in the disease.

"It's a product of power in numbers. The only way to discover these things is by having such a large study population that spans across the globe, and we'd urge more people to sign up for these large melanoma research projects."

Melanoma begins in melanocytes, cells in the skin responsible for making the pigment melanin that gives colour to the skin.

Melanin can block some of the harmful effects of UV radiation, which is why people with pale skin are at a higher risk of skin cancer, but the protection is not complete.

Moles also develop from melanocytes and having a high number of moles is a risk factor for melanoma.

Dr Maria Teresa Landi, co-lead author on the study and senior investigator at the US National Cancer Institute, part of the National Institutes of Health, said the research also uncovered other important clues to the genetic causes of melanoma.

"We used the relationship between moles, pigmentation, and melanoma to identify 31 additional gene regions that potentially influence melanoma risk. For example, one of the regions we identified is involved in melanocyte growth," Dr Landi said.

"Moreover, we also included people from Mediterranean populations involved in the MelaNostrum Consortium. Most studies of melanoma use people with northern or western European ancestry (e.g. British) and by expanding our analysis to include Mediterranean populations, we will gain a greater understanding of the genetics of melanoma in this highly sun-exposed group.

"Larger datasets are needed to find further genes that influence melanoma risk, and in parallel we need functional studies to work out the exact mechanisms that lead to these genes causing melanoma."

The researchers said the study reaffirmed the importance of protecting skin from the sun by avoiding UV exposure between the hours of 10 AM and 4 PM, seeking shade, covering up exposed skin, and using sunscreen.

Credit: 
University of Leeds

Cancer care model could help us cope with COVID-19, says nanomedicine expert

As the UK government looks for an exit strategy to Britain's COVID-19 lockdown a nanomedicine expert from The University of Manchester believes a care model usually applied to cancer patients could provide a constructive way forward.

Kostas Kostarelos, is Professor of Nanomedicine at The University of Manchester and is leading the Nanomedicine Lab, which is part of the National Graphene Institute and the Manchester Cancer Research Centre.

The Manchester-based expert believes more scientific research should be brought to bear as we transform how we view the COVID-19 pandemic, or any future virus outbreak, and deal with it more like a chronic disease - an ever-present issue for humanity that needs systematic management if we are ever to return to our 'normal' lives.

Professor Kostarelos makes this claim in an academic thesis entitled 'Nanoscale nights of COVID-19' that offers a nanoscience response to the COVID-19 crisis and will be published on Monday, April 27, by the journal Nature Nanotechnology.

"As for any other chronic medical condition, COVID-19 stricken societies have families, jobs, businesses and other commitments. Therefore, our aim is to cure COVID-19 if possible," says Professor Kostarelos.

"However, if no immediate cure is available, such as effective vaccination," Professor Kostarelos suggests: "We need to manage the symptoms to improve the quality of patients' lives by making sure our society can function as near as normal and simultaneously guarantee targeted protection of the ill and most vulnerable."

Professor Kostarelos says his experience in cancer research and nanotechnology suggests a model that could also be applied to a viral pandemic like COVID-19.

"There are three key principles in managing an individual cancer patient: early detection, monitoring and targeting," explains Professor Kostarelos. "These principles, if exercised simultaneously, could provide us with a way forward in the management of COVID-19 and the future pandemics.

"Early detection has improved the prognosis of many cancer patients. Similarly, early detection of individuals and groups, who are infected with COVID-19, could substantially accelerate the ability to manage and treat patients.

"All chronic conditions, such as cancer, are further managed by regular monitoring. Therefore, monitoring should be undertaken not only for patients already infected with COVID-19, to track progression and responses, but also for healthy essential workers to ensure that they remain healthy and to reduce the risk of further spreading.

Finally, says Professor Kostarelos, nanomaterials - as well as other biologicals, such as monoclonal antibodies - are often used for targeting therapeutic agents that will be most effective only against cancer cells.

The same principle of 'targeting' should be applied for the management of COVID-19 patients to be able to safely isolate them and ensure they receive prompt treatment.

Also, a safeguarding strategy should be provided for the most vulnerable segments of the population by, for example, extending social distancing protocols in elderly care homes - but with emotional and practical support provided to ensure the wellbeing of this group.

"Protection of the most vulnerable and essential workers, must be guaranteed, with protective gear and monitoring continuously provided," he added. "Only if all three principles are applied can the rest of society begin to return to normal function and better support the activities in managing this and all future pandemics."

Credit: 
University of Manchester

Soil in wounds can help stem deadly bleeding

New UBC research shows for the first time that soil silicates--the most abundant material in the Earth's crust--play a key role in blood clotting.

"Soil is not simply our matrix for growing food and for building materials. Here we discovered that soil can actually help control bleeding after injury by triggering clotting," says the study's senior author Christian Kastrup, associate professor in the faculty of medicine's department of biochemistry and molecular biology and a scientist in UBC's Michael Smith Laboratories and Centre for Blood Research.

The study, published today in Blood Advances, found that the presence of soil in wounds helps activate a blood protein known as coagulation Factor XII. Once activated, the protein kicks off a rapid chain reaction that helps lead to the formation of a plug, sealing the wound and limiting blood loss.

While the researchers caution that there is a high risk of infection from unsterilized dirt, they say their findings may have implications for the future development of novel strategies using sterilized dirt to help manage bleeding and potentially understand infection after trauma.

"Excessive bleeding is responsible for up to 40 per cent of mortality in trauma patients. In extreme cases and in remote areas without access to healthcare and wound sealing products, like sponges and sealants, sterilized soil could potentially be used to stem deadly bleeding following injuries," says Kastrup.

The study also uncovered that the mechanism by which soil silicates activate Factor XII and promote faster clotting is unique to terrestrial mammals, or those that live predominantly or entirely on land.

"This finding demonstrates how terrestrial mammals, ranging from mice to humans, evolved to naturally use silicates as a specific signal to Factor XII to trigger blood clotting," says Lih Jiin Juang, the study's first author and UBC PhD student in the department of biochemistry and molecular biology. "These results will have a profound impact on the way we view our relationship with our environment."

The scientists next plan to test whether the response of blood to silicates helps prevent infection from microbes in soil. They will also test whether silicates from the moon's surface are able to activate Factor XII and stop bleeding.

"If moon silicates activate Factor XII, this discovery could prove useful in preventing death among people visiting or colonizing the moon, and it would provide further insight to identifying materials that may halt bleeding in very remote environments with limited resources and medical supplies," says Kastrup.

Credit: 
University of British Columbia

The North Atlantic right whale population is in poor condition

image: Healthy southern right whales from three populations (left three photographs) next to a much leaner North Atlantic right whale (right) in visibly poorer body condition.

Image: 
Fredrik Christiansen (left & center-left), Stephen M. Dawson (center-right), John W. Durban and Holly Fearnbach (right).

New research by an international team of scientists reveals that endangered North Atlantic right whales are in much poorer body condition than their counterparts in the southern hemisphere. The alarming results from this research, led by Dr Fredrik Christiansen from Aarhus University in Denmark, were published last week in the journal Marine Ecology Progress Series.

Since large-scale commercial whaling stopped in the last century, most populations of southern right whales have recovered well, and there are now about 10,000 to 15,000 right whales in the southern hemisphere. Unfortunately, the same cannot be said for the North Atlantic right whales found today off the east coast of North America. Only around 410 individuals remain, and the species is heading toward extinction.

Several known challenges account for the diverging fortunes of the two species: lethal vessel strikes and entanglement in fishing gear continue to kill and injure North Atlantic right whales, while frequent sub-lethal entanglements in fishing gear, in particular lobster and crab pot lines, drain their energy. These burdens, along with a change in the abundance and distribution of their main source of food, copepods and krill, have left the whales thin and unhealthy, which makes them less likely to have a calf.

All this contributes to the current overall decline of the species. But until now, it has not been fully understood how the whales' body condition is affected by the different conditions in the North Atlantic.

Cross-continental study of whales' body condition

To quantify the 'thin and unhealthy' state of the North Atlantic right whales, Dr Christiansen and an international team of scientists have investigated the body condition of individual North Atlantic right whales, and compared their condition with individuals from three increasing populations of Southern right whales: off Argentina, Australia and New Zealand.

"Good body condition and abundant fat reserves are crucial for the reproduction of large whales, including right whales, as the animals rely on these energy stores during the breeding season when they are mostly fasting," said Dr Fredrik Christiansen from Aarhus Institute of Advanced Studies and Department of Biology, Aarhus University, Denmark, and lead author of the study. "Stored fat reserves are particularly important for mothers, who need the extra energy to support the growth of their newborn calf while they are nursing."

How fat are right whales?

The study is the largest assessment of the body condition of baleen whales in the world, and involved researchers from 12 institutes in five countries. The international research team used drones and a method called 'aerial photogrammetry' to measure the body length and width of individual right whales in these four regions around the world. From aerial photographs, the researchers were able to estimate the body volume of individual whales, which they then used to derive an index of body condition or relative fatness.
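For a rough sense of how such a body condition index can be derived from aerial measurements, here is a minimal sketch under assumptions not taken from the paper: the body is modelled as a series of elliptical cross-sections at evenly spaced width measurements, and condition is expressed as the relative deviation of measured volume from the volume expected for a whale of that length. The height-to-width ratio, the regression coefficients and the sample measurements are all placeholders; the study's actual photogrammetry protocol and statistical model may differ.

```python
# Illustrative sketch of a photogrammetry-style body condition index.
import numpy as np

def body_volume(length_m, widths_m, height_to_width=1.0):
    """Approximate body volume (m^3) by integrating elliptical cross-sections
    measured at evenly spaced stations along the body axis."""
    widths = np.asarray(widths_m, dtype=float)
    heights = widths * height_to_width
    areas = np.pi * (widths / 2.0) * (heights / 2.0)   # ellipse area per station
    dx = length_m / (len(widths) - 1)                  # spacing between stations
    return float(np.sum((areas[:-1] + areas[1:]) / 2.0) * dx)  # trapezoid rule

def body_condition_index(volume, length, coeffs):
    """Relative deviation from the volume predicted for this body length,
    given a fitted relationship log10(V) = a + b * log10(L)."""
    a, b = coeffs
    predicted = 10 ** (a + b * np.log10(length))
    return (volume - predicted) / predicted            # >0 fatter, <0 leaner

# Placeholder measurements, not data from the paper
widths = [0.5, 2.0, 3.2, 3.5, 3.0, 2.2, 1.0, 0.3]      # widths along the body (m)
v = body_volume(14.0, widths)
print(round(v, 1), round(body_condition_index(v, 14.0, coeffs=(-1.5, 2.9)), 2))
```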

The analyses revealed that individual North Atlantic right whales - juveniles, adults and mothers alike - were all in poorer body condition than individuals from the three populations of southern right whales. This difference is alarming, since poor body condition helps explain why too many North Atlantic right whales are dying and why they are not giving birth to enough calves to boost the population's recovery. It may also be affecting their growth and delaying juveniles in reaching sexual maturity. The combined impacts on individuals help explain why the species is in decline.

"For North Atlantic right whales as individuals, and as a species, things are going terribly wrong. This comparison with their southern hemisphere relatives shows that most individual North Atlantic right whales are in much worse condition than they should be," said Dr Michael Moore from the Woods Hole Oceanographic Institution. "Sub-lethal entanglement trauma, along with changing food supplies is making them too skinny to reproduce well, and lethal entanglement and vessel trauma are killing them. To reverse these changes, we must: redirect vessels way from, and reduce their speed in, right whale habitat; retrieve crab and lobster traps without rope in the water column using available technologies; and minimize ocean noise from its many sources."

"Right whales are a sentinel species of ocean health. They are warning us, and their message is strong - the seas that used to be a safe haven for whales are now a threat. We must act now to protect their home, everybody´s home," said Dr Mariano Sironi from Instituto de Conservación de Ballenas and right whale researcher in Argentina.

Credit: 
Aarhus University

Electronics for high-altitude use can get smaller and sturdier with new nanomaterials

image: After being bombarded with ionizing radiation, this sample with copper-platinum nano-ink on its surface still conducts electricity.

Image: 
Sandia National Laboratories

WASHINGTON, April 27, 2020 — As demand for higher-efficiency and smaller electronics grows, so does demand for a new generation of materials that can be printed at ever smaller dimensions. Such materials are critical to national security applications and space exploration. But materials that work well on Earth don’t always hold up well at high altitudes and in space. Scientists are now creating new metal-based nanomaterials for circuit boards that could be resistant to the high-altitude radiation encountered by electronics in aerospace equipment, fighter jets and weapon systems.

The researchers are presenting their results through the American Chemical Society (ACS) SciMeetings online platform.

“This research grew out of the huge need at U.S. national laboratories for electronic materials that are stable when subjected to ionizing radiation,” says Timothy J. Boyle, Ph.D., the project’s principal investigator. “We started looking at nanometal materials because they can be printed at the required smaller dimension, but we had to overcome their tendency to be damaged by ionizing radiation.”

Ionizing radiation comes from a variety of natural and manmade sources, and it can change the structures of molecules. On Earth, most of this radiation comes naturally from the environment, or even from the sun or from outside the solar system, and is not normally harmful. But at very high altitudes — such as those experienced by airplanes or spacecraft — radiation levels are much higher and can affect humans, as well as materials, exposed to them for long periods.

Boyle says that copper was an obvious choice for miniaturizing electronics for high-altitude applications, because it has already been used in nano-inks in circuits, and it is a lower-cost alternative to more expensive materials, such as silver and gold. But like most metals, copper is susceptible to damage from elevated levels of radiation. To create new nano-inks that might hold up well at high altitudes, Boyle, Fernando Guerrero and others added, or doped, more than 10 different metals into copper to see which one would work best. They placed the metals inside the face-centered-cubic (fcc) crystal structure of copper — a cube with copper atoms at each corner and at the center of each face, leaving an open space at the center of the cube.

The group, located at Sandia National Laboratories, had to surmount numerous hurdles in synthesis, characterization, formulation and testing of the nano-inks. “We had to find a system that worked for each potential metal,” Guerrero says. “It also was challenging to insert the metal into the copper fcc and synthesize the doped metal alloy in sufficient quantity to make enough ink to print.” When Guerrero used platinum as the dopant, he finally obtained enough nano-ink to work with.

To test the copper-platinum nano-ink, Guerrero printed and sintered it in a furnace at about 700 F for an hour. Sintering compacts particles and creates a solid mass of material by heat (as in this case) or pressure. He then exposed this material to high-energy radiation in a unique instrument developed at Sandia by Khalid M. Hattar, Ph.D., called the in-situ irradiation transmission electron microscope (I3TEM). This instrument bombards the sample with ionizing radiation — similar to the type encountered by electronics operating in space or at high altitudes — while displaying it under a microscope.

“The preliminary results are promising,” Guerrero says. “The sample after exposure to the I3TEM still conducted electricity, suggesting it was resistant to the radiation.”

The next step is to test the materials to confirm the results using Sandia’s gamma irradiation facility (GIF), which provides a high-fidelity simulation of nuclear radiation environments for materials and component testing. GIF can produce a wide range of gamma radiation environments and can irradiate objects as small as electronic components and as large as an M1 Abrams tank.

“Our ultimate goal is to develop enhanced radiation-stable, copper-based electronic nanomaterials to meet munition, aerospace and nuclear monitoring needs,” Boyle says. “Our future research will focus on mastering and making smaller, more effective nanoparticles and inks and testing the printed circuits with the GIF.”

Credit: 
American Chemical Society

Erosion process studies in the Volga Region assist in land use planning

image: Location of the study region in the East European (Russian) Plain within European Russia.

Image: 
Kazan Federal University

Dr. Gusarov (Paleoclimatology, Paleoecology and Paleomagnetism Lab) has been working on erosion processes for two decades as a part of various teams. In this research, he tackled the Middle Volga Region, the one where the city of Kazan - and Kazan Federal University - are situated.

A few decades ago, until the 1990s, the Middle Volga Region was considered a region of very intense soil-rill-gully erosion, owing to a favorable combination of natural factors (especially the geological structure, topography and climate) and anthropogenic ones (high agricultural development of soils in the region due to their relatively high fertility). It is widely known that the last three decades, following the collapse of the Soviet Union, have been characterized by changes in climate and land use/cover. These changes have affected the intensity of soil-gully erosion processes and the volume of their products - sediments, which are redeposited in different parts of river basins or migrate out of river basins with water flow.

In the paper, a generalized picture of the development trends of these processes over the past 50 years was constructed, based mostly on analysis of the sediment that moves with river waters. The advantage of using long-term series of river sediment yield is that they are an integral indicator of changes over time in both soil and gully erosion, which makes them better suited to assessing trends on a regional scale. Additionally, the main trends in seasonal river water flow and their relationship with river sediment yield dynamics were identified. Information on river water flow and sediment yield is provided by the Federal Service for Hydrometeorology and Environmental Monitoring of the Russian Federation. Long-term data from hydrological stations on 14 small and medium-sized rivers (with basin areas from 237 to 12,000 km²) of the Middle Volga Region (from the Republic of Tatarstan and the Chuvash Republic in the north to the Samara and Orenburg Oblasts in the south) were analyzed. In one of the river basins (the Myosha River basin, Tatarstan), field studies were also carried out in 2016 in a small ploughed catchment, in order to use it as an example for assessing erosion dynamics in the upper links of the river network. Those results were partially included in the study.

The processes of soil and gully erosion are largely responsible for the general degradation of the soil cover, both globally and regionally. This is a matter of environmental and food security for the region and the country as a whole, and combating soil erosion requires mobilizing significant financial, technical, intellectual and other resources. This and other related studies have shown a significant decrease in erosion activity and in the volume of its products, as well as a decrease in the unevenness of intra-annual river water flow in the region over the past twenty years. From a practical point of view, this should lead to a redistribution of resources, the adjustment of a number of nature conservation programs, the rational use of natural (including water) resources, and other steps.

A quantitative assessment of the contribution (share) of changes in climate and land use to the observed trends in soil-rill-gully erosion is planned for the near future. It is also important to study the delivery of erosion products from the upper to the lower links of the region's river network under different scenarios of climate and land use change. Another important line of research is the migration and accumulation of sediment-associated pollutants in different parts of the region's river network.

The article was made available online in November 2019 and is due to appear in print in June 2020.

Credit: 
Kazan Federal University

NASA catches formation and final fate of Eastern Pacific's Tropical Depression 1E

image: NASA's Aqua satellite passed over Tropical Depression 1E on Sunday, April 26, 2020 at 5 a.m. EDT (0900 UTC) and the MODIS instrument aboard analyzed 1E in infrared light to determine temperature of cloud tops in storms. MODIS found cloud top temperatures as cold as minus 50 degrees Fahrenheit in the depression, not cold enough to suggest heavy rainfall.

Image: 
NASA/NRL

The Eastern Pacific Ocean's hurricane season may not officially start until mid-May, but the first tropical cyclone of the season formed over the weekend of April 25 and 26. NASA's Aqua satellite provided an infrared look at the small depression when it was at its peak and before it became post-tropical.

Tropical Depression 1E formed on Saturday, April 25 and by the next day, it was not in a favorable environment for further development, according to the National Hurricane Center (NHC). NASA's Aqua satellite provided forecasters with a look at 1E's cloud top temperatures to assess the strength of the storm.

NASA's Aqua satellite passed over Tropical Depression 1E on Sunday, April 26, 2020 at 5 a.m. EDT (0900 UTC) and the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard analyzed 1E in infrared light. Infrared light is used to determine temperature of cloud tops in storms, and the colder the cloud top, the higher it is in the troposphere (lowest layer of the atmosphere), and the stronger the storm tends to be. MODIS found cloud top temperatures as cold as minus 50 degrees Fahrenheit in the depression, not cold enough to suggest heavy rainfall.

On April 26, the National Hurricane Center (NHC) reported, "Deep convection has been waning quickly in the southeastern quadrant since the previous advisory, and the system barely met criteria for identifying it as a tropical cyclone at 1200 UTC (8 a.m. EDT)."

NHC said at 11 a.m. EDT (8 a.m. PDT) the center of Tropical Depression 1E was located near latitude 15.7 north, longitude 118.8 west. At that time, 1E reached its peak with maximum sustained winds near 35 mph (55 kph). The system weakened throughout the day and overnight as it encountered "deep-layer dry air and cooler sea-surface temperatures, combined with strong westerly shear of 25 to 30 knots," NHC noted.

On Monday, April 27, 2020, the NHC said that post-tropical remnant low 1E was far from land areas as it continued to degenerate. It was centered near latitude 17 degrees north and longitude 121 degrees west. The estimated minimum central pressure was 1010 millibars. Maximum sustained winds were 25 knots (29 mph/46 kph).

A post-tropical storm is a generic term for a former tropical cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Former tropical cyclones can become fully extratropical, subtropical, or remnant lows, the three classes of post-tropical cyclones. However, post-tropical cyclones can continue to bring heavy rains and high winds.

The system was retaining only shallow convection and continued to weaken as it moved toward the west-northwest. NHC expects the system to dissipate by Tuesday, April 28.

Hurricane season officially begins on May 15 in the Eastern Pacific Ocean, so this depression formed early for the season. For more information about NHC, visit: http://www.nhc.noaa.gov.

Typhoons and hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

The MODIS instrument is one of six instruments flying on board NASA's Aqua satellite which launched on May 4, 2002.

Credit: 
NASA/Goddard Space Flight Center

'Elegant' solution reveals how the universe got its structure

image: The Magellan telescopes at Carnegie's Las Campanas Observatory in Chile, which were crucial to the ability to conduct this survey.

Image: 
Photograph by Yuri Beletsky, courtesy of the Carnegie Institution for Science.

Pasadena, CA-- The universe is full of billions of galaxies--but their distribution across space is far from uniform. Why do we see so much structure in the universe today and how did it all form and grow?

A 10-year survey of tens of thousands of galaxies made using the Magellan Baade Telescope at Carnegie's Las Campanas Observatory in Chile provided a new approach to answering this fundamental mystery. The results, led by Carnegie's Daniel Kelson, are published in Monthly Notices of the Royal Astronomical Society.

"How do you describe the indescribable?" asks Kelson. "By taking an entirely new approach to the problem."

"Our tactic provides new--and intuitive--insights into how gravity drove the growth of structure from the universe's earliest times," said co-author Andrew Benson. "This is a direct, observation-based test of one of the pillars of cosmology."

The Carnegie-Spitzer-IMACS Redshift Survey was designed to study the relationship between galaxy growth and the surrounding environment over the last 9 billion years, when modern galaxies' appearances were defined.

The first galaxies were formed a few hundred million years after the Big Bang, which started the universe as a hot, murky soup of extremely energetic particles. As this material expanded outward from the initial explosion, it cooled, and the particles coalesced into neutral hydrogen gas. Some patches were denser than others and, eventually, their gravity overcame the universe's outward trajectory and the material collapsed inward, forming the first clumps of structure in the cosmos.

The density differences that allowed for structures both large and small to form in some places and not in others have been a longstanding topic of fascination. But until now, astronomers' abilities to model how structure grew in the universe over the last 13 billion years faced mathematical limitations.

"The gravitational interactions occurring between all the particles in the universe are too complex to explain with simple mathematics," Benson said.

So, astronomers either used mathematical approximations--which compromised the accuracy of their models--or large computer simulations that numerically model all the interactions between galaxies, but not all the interactions occurring between all of the particles, which was considered too complicated.

"A key goal of our survey was to count up the mass present in stars found in an enormous selection of distant galaxies and then use this information to formulate a new approach to understanding how structure formed in the universe," Kelson explained.

The research team--which also included Carnegie's Louis Abramson, Shannon Patel, Stephen Shectman, Alan Dressler, Patrick McCarthy, and John S. Mulchaey, as well as Rik Williams, now of Uber Technologies--demonstrated for the first time that the growth of individual proto-structures can be calculated and then averaged over all of space.

Doing this revealed that denser clumps grew faster, and less-dense clumps grew more slowly.

They were then able to work backward and determine the original distributions and growth rates of the fluctuations in density, which would eventually become the large-scale structures that determined the distributions of galaxies we see today.

In essence, their work provided a simple, yet accurate, description of why and how density fluctuations grow the way they do in the real universe, as well as in the computational-based work that underpins our understanding of the universe's infancy.

"And it's just so simple, with a real elegance to it," added Kelson.

The findings would not have been possible without the allocation of an extraordinary number of observing nights at Las Campanas.

"Many institutions wouldn't have had the capacity to take on a project of this scope on their own," said Observatories Director John Mulchaey. "But thanks to our Magellan Telescopes, we were able to execute this survey and create this novel approach to answering a classic question."

"While there's no doubt that this project required the resources of an institution like Carnegie, our work also could not have happened without the tremendous number of additional infrared images that we were able to obtain at Kit Peak and Cerro Tololo, which are both part of the NSF's National Optical-Infrared Astronomy Research Laboratory," Kelson added.

Credit: 
Carnegie Institution for Science

Superconductivity: It's hydrogen's fault

image: When hydrogen is incorporated into the nickelate structure, it is not a superconductor.

Image: 
TU Wien

Last summer, a new age for high-temperature superconductivity was proclaimed - the nickel age. It was discovered that there are promising superconductors in a special class of materials, the so-called nickelates, which can conduct electric current without any resistance even at high temperatures.

However, it soon became apparent that these initially spectacular results from Stanford could not be reproduced by other research groups. TU Wien (Vienna) has now found the reason for this: In some nickelates additional hydrogen atoms are incorporated into the material structure. This completely changes the electrical behaviour of the material. In the production of the new superconductors, this effect must now be taken into account.

The search for High-Temperature Superconductors

Some materials are only superconducting near absolute zero - such superconductors are not suitable for technical applications. Therefore, for decades, researchers have been looking for materials that remain superconducting at higher temperatures. In the 1980s, "high-temperature superconductors" were discovered. What is referred to as "high temperatures" in this context, however, is still very cold: even high-temperature superconductors must be cooled strongly in order to attain their superconducting properties. Therefore, the search for new superconductors at even higher temperatures continues.

"For a long time, special attention was paid to so-called cuprates, i.e. compounds containing copper. This is why we also speak of the copper age", explains Prof. Karsten Held from the Institute of Solid State Physics at TU Wien. "With these cuprates, some important progress was made, even though there are still many open questions in the theory of high-temperature superconductivity today".

But for some time now, other possibilities have also been under consideration. There was already a so-called "iron age" based on iron-containing superconductors. In summer 2019, the research group of Harold Y. Hwang at Stanford then succeeded in demonstrating high-temperature superconductivity in nickelates. "Based on our calculations, we already proposed nickelates as superconductors 10 years ago, but they were somewhat different from the ones that have now been discovered. They are related to cuprates, but contain nickel instead of copper atoms," says Karsten Held.

The Trouble with Hydrogen

After some initial enthusiasm, however, it has become apparent in recent months that nickelate superconductors are more difficult to produce than initially thought. Other research groups reported that their nickelates do not have superconducting properties. This apparent contradiction has now been clarified at TU Wien.

"We analysed the nickelates with the help of supercomputers and found that they are extremely receptive to hydrogen into the material," reports Liang Si (TU Vienna). In the synthesis of certain nickelates, hydrogen atoms can be incorporated, which completely changes the electronic properties of the material. "However, this does not happen with all nickelates," says Liang Si, "Our calculations show that for most of them, it is energetically more favourable to incorporate hydrogen, but not for the nickelates from Stanford. Even small changes in the synthesis conditions can make a difference." Last Friday (on 24.04.2020) the group around Ariando Ariando from the NUS Singapore could report that they also succeeded in producing superconducting nickelates. They let the hydrogen that is released in the production process escape immediately.

Calculating the Critical Temperature with Supercomputers

At TU Wien, new computer calculation methods are being developed and used to understand and predict the properties of nickelates. "Since a large number of quantum-physical particles always play a role here at the same time, the calculations are extremely complex," says Liang Si. "But by combining different methods, we are now even able to estimate the critical temperature up to which the various materials are superconducting. Such reliable calculations have not been possible before."
In particular, the team at TU Wien was able to calculate the allowed range of strontium concentration for which the nickelates are superconducting - and this prediction has now been confirmed in experiment.

"High-temperature superconductivity is an extremely complex and difficult field of research," says Karsten Held. "The new nickelate superconductors, together with our theoretical understanding and the predictive power of computer calculations, open up a whole new perspective on the great dream of solid state physics: a superconductor at ambient temperature that hence works without any cooling."

Credit: 
Vienna University of Technology

Researchers make key advance toward production of important biofuel

image: Graphic depicting biobutanol separation method.

Image: 
Oregon State University

CORVALLIS, Ore. - An international research collaboration has taken an important step toward the commercially viable manufacture of biobutanol, an alcohol whose strong potential as a fuel for gasoline-powered engines could pave the path away from fossil fuels.

The key breakthrough is the development of a new metal organic framework, or MOF, that can efficiently separate biobutanol from the broth of fermented biomass needed for the fuel's production. Findings were published today in the Journal of the American Chemical Society.

The researchers are now looking to partner with industry to try to scale up the separation method using the new metal organic framework, says the study's corresponding author, Kyriakos Stylianou of Oregon State University.

If it scales well, it could be an important milestone on the road toward non-reliance on fossil fuels.

"Biofuels are a sustainable and renewable fuel alternative, and biobutanol has recently emerged as an attractive option compared to bioethanol and biodiesel," said Stylianou, a chemistry researcher in OSU's College of Science. "But separating it from the fermentation broth has been a significant obstacle on the way to economically competitive manufacturing."

Butanol, also known as butyl alcohol, is more closely related to gasoline than ethanol and can be synthesized from petroleum or made from biomass. Bioethanol - ethyl alcohol - is a common biofuel additive but contains significantly less energy per gallon than gasoline and can also be harmful to engine components.

The process of creating biobutanol is known as ABE fermentation - acetone-butanol-ethanol. It yields a watery broth that maxes out at about 2% butanol by weight. Hence the need for a separation tool that can work well in an aqueous environment and also in the presence of organic solvents, in this case acetone, which is a key ingredient in products like nail polish remover and paint thinner.

Stylianou and colleagues at universities in Switzerland, China, the United Kingdom and Spain synthesized a novel metal organic framework, based on copper ions and carborane-carboxylate ligands, known as mCB-MOF-1. The MOF can pull butanol from the fermentation broth, via adsorption, with greater efficiency than distillation or any other existing method.

The MOF is stable in organic solvents, in hot water, and in both acidic and basic aqueous solutions.

"Biofuels can augment energy security and supply and also can be a big part of an energy plan that actually captures and stores carbon, which would be huge for meeting targets for combating climate change," Stylianou said. "Biobutanol is better than bioethanol for a variety of reasons, including that it's almost as energy-dense as gasoline and mixes well with gasoline. And biobutanol can also potentially replace synthetic butanol as an essential precursor for a range of industrial chemicals."

Credit: 
Oregon State University

Investigating the causes of the ozone levels in the Valderejo Nature Reserve

image: Weather station at the Valderejo Nature Reserve (Basque Country, Spain), where the study was conducted

Image: 
UPV/EHU

Atmospheric pollution is one of society's main concerns, with nitrogen dioxide (NO2) and tropospheric ozone (O3) among the pollutants causing the greatest concern. In principle, effective measures to reduce NO2 pollution are fairly easy to identify; reducing O3 pollution is much more complex, however, because ozone is a secondary pollutant that is not emitted directly but is generated by chemical reactions in the atmosphere. A number of pollutants, such as nitrogen oxides (NOx) and a large number of volatile organic compounds (VOCs), undergo chemical reactions and physical transformations driven by solar radiation, and these lead to the formation of ozone. "Ozone is formed in the atmosphere mainly as a result of the reaction of NOx and VOCs in the presence of high solar radiation," said María Carmen Gómez, researcher in the UPV/EHU's Atmospheric Research Group (GIA). So "the way to control the ozone involves controlling these contaminants, their precursors and getting to know their formation mechanisms better," she added.

The Valderejo Nature Reserve is one of the stations in the Basque Country where over the last few years the legal limits stipulated for ozone have been intermittently exceeded. Against this background and for the purpose of studying the ozone dynamics in this area, the UPV/EHU's Atmospheric Research Group has developed a database of more than 60 volatile organic compounds measured continuously over the last ten years in the Valderejo Nature Reserve. VOCs originate both naturally (biogenic VOCs, BVOCs) and anthropogenically (due to the evaporation of organic solvents, the burning of fuels, transport, etc.). So "we characterised the precursors in ozone formation and calculated the formation potential of each compound individually", explained María Carmen Gómez.

"The main results of this study centre on the episodes in which the target and/or information threshold values stipulated by legislation for ozone were exceeded, which tends to occur in the Valderejo Nature Reserve between June and September. The contribution of biogenic volatile organic compounds (BVOCs) towards the ozone forming potential over the summer months is reckoned to account for up to 68% of the total VOCs measured," said the UPV/EHU researcher. "BVOCs include isoprene and monoterpines emitted by vegetation. Isoprene is highly volatile and its emission increases with the rise in temperature and radiation. Monoterpines, emitted mostly by conifers, are stored by part of the vegetation; radiation has little effect on their emission, while temperature has a major effect," explained Gomez.

What is more, "we saw that between 13% and 51% of the ozone recorded in the reserve is due to local VOCs; the rest are transported to the measuring station, in particular when contaminated air masses arrive", said the researcher of the group.

María Carmen Gómez stresses the need for the study to be pursued further. "It is a way of expanding knowledge about ozone transportation and formation processes in the Basque Autonomous Community (region). As they are complex processes, the more information we have, the greater the possibilities will be of interpreting them properly so that they can be used to manage air quality and inform control strategies."

Credit: 
University of the Basque Country