
Heart rhythm disorders are best managed when patients are listened to

Sophia Antipolis, France - 29 Aug 2020: Atrial fibrillation is the most common heart rhythm disorder and increases the risk of stroke fivefold. Patients with irregular heartbeats should choose their treatment plan together with their health professionals, according to European Society of Cardiology (ESC) Guidelines published online today in the European Heart Journal and on the ESC website. The document was developed in collaboration with the European Association for Cardio-Thoracic Surgery (EACTS).

"Patients want to be involved in decisions about their care and their preferences should be respected," said Professor Gerhard Hindricks, Chairperson of the guidelines Task Force and medical director of the Rhythmology Department, Heart Centre Leipzig, Germany.

It is estimated that one in three Europeans will develop atrial fibrillation. It is associated with a twofold increased risk of death in women and a 1.5-fold increase in men. People with atrial fibrillation are twice as likely to be admitted to hospital as their peers without the condition.

Symptoms include palpitations, shortness of breath, fatigue, and difficulty sleeping. Up to one in five patients are depressed. More than 60% of patients report significantly impaired quality of life, while cognitive decline and dementia are around 50% more likely than in the general population.

The guidelines advocate the Atrial fibrillation Better Care (ABC) pathway. 'A' (Anticoagulation/Avoid stroke) involves anticoagulation medication to prevent stroke except in patients at low risk. 'B' (Better symptom management) refers to controlling heart rate and heart rhythm with medications and procedures. 'C' (Cardiovascular and Comorbidity optimisation) is management of other conditions such as high blood pressure and lifestyle - for example smoking cessation, improved nutrition to lose weight, avoiding excess alcohol, and moderate intensity exercise.

An individualised care plan should be agreed after patients and their family discuss the advantages and limitations of each treatment option with an interdisciplinary team including cardiologists, nurses, and psychologists. Success of treatment from the patient's perspective should be assessed by routinely collecting information on quality of life, symptoms, cognitive function, and ability to work and be physically active. Prevention of stroke is a vital part of treatment.

Atrial fibrillation is one of the most frequent heart rhythm disorders during pregnancy - especially in older women and those born with heart defects - and is associated with an increased risk of death. Vaginal delivery is contraindicated in women taking warfarin because of bleeding risks for the baby. Use of non-vitamin K antagonist oral anticoagulants (NOACs) is prohibited during pregnancy.

Athletes are around five times more likely to develop atrial fibrillation during their lifetime compared to sedentary individuals. Endurance sports such as running, cycling, and cross-country skiing carry the highest risk. Professional athletes should be advised that long-lasting intense sports participation may promote atrial fibrillation. Contact sports should be avoided in patients on oral anticoagulants due to the risks of bleeding.

Screening could identify people with previously undiagnosed atrial fibrillation who could then receive treatment to prevent stroke. More than 100,000 apps for smartphones, wrist bands, and watches and at least 400 wearable activity monitors are available - but the guidelines state that caution is needed as many are not clinically validated to detect atrial fibrillation.

Opportunistic screening is advised for people aged 65 and over and for people with high blood pressure, who should have their pulse taken or undergo an electrocardiogram (ECG). Individuals should be informed about the treatment implications of detecting atrial fibrillation. Those who test positive should be referred to a physician to confirm the diagnosis.

"People with unhealthy lifestyles are more likely to develop atrial fibrillation," said Professor Tatjana Potpara, Chairperson of the guidelines Task Force and head of the Department for Intensive Arrhythmia Care, Clinical Centre of Serbia, Belgrade. "Risk can be reduced by lifestyle modification - for example, weight control, and moderate physical activity."

Credit: 
European Society of Cardiology

New analysis reveals where marine heatwaves will intensify fastest

image: Small boat on a still ocean at sunset.

Image: 
Johannes Plenio (Pexels)

The world's strongest ocean currents, which play key roles in fisheries and ocean ecosystems, will experience more intense marine heatwaves than the global average over coming decades, according to a paper published today in Nature Communications by researchers from the ARC Centre of Excellence for Climate Extremes at the University of Tasmania and CSIRO.

Sections of Australia's Leeuwin Current and East Australian Current; the United States' Gulf Stream; Japan's Kuroshio Current; and the most powerful ocean current of all, the Antarctic Circumpolar Current, will all see the intensity of heatwave events ratchet up over the next 30 years.

However, while the intensity of individual marine heatwave events in these areas is likely to increase faster than the global average, the number of marine heatwave days appears to increase at a lower-than-average rate. And what happens around these currents is even more interesting.
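
The distinction drawn above between event intensity and the number of heatwave days can be made concrete with a small sketch. It assumes the commonly used definition of a marine heatwave (a run of at least five consecutive days above a climatological threshold, following Hobday and colleagues); the temperature series below is invented for illustration, not taken from the study.

```python
# Counting "marine heatwave days" in a daily sea-surface-temperature record.
# Definition assumed: a qualifying warm spell is a run of at least `min_run`
# consecutive days above the climatological threshold.

def heatwave_days(sst, threshold, min_run=5):
    """Return the total number of days that belong to qualifying warm runs."""
    total = 0
    run = 0
    for temp in sst:
        if temp > threshold:
            run += 1
        else:
            if run >= min_run:
                total += run
            run = 0
    if run >= min_run:  # a warm run may extend to the end of the record
        total += run
    return total

# One 10-day run above the 19.0 C threshold, plus a 3-day run too short to count.
sst = [18.0, 19.5, 19.6, 19.7, 19.8, 19.9, 20.0, 20.1, 20.2, 20.3, 20.4,
       18.0, 19.5, 19.6, 19.7, 18.0]
print(heatwave_days(sst, threshold=19.0))  # 10
```

A region can thus see individual events grow hotter (higher peak exceedance) while the count returned here grows more slowly, which is the pattern the study reports for the boundary currents.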

"We know marine heatwaves are on the rise globally, but policymakers, fisheries experts, aquaculture industries and ecologists need to know how this will play out at regional levels, especially in terms of where they will occur and how much hotter they will be," said lead author from the ARC Centre of Excellence for Climate Extremes Dr Hakase Hayashida.

"Our detailed modelling is the first step in peeling back these layers, revealing the temperature variation that occurs across these currents and around them, indicating where the sharpest rises in marine heatwaves are likely to occur.

"For instance, we found intense marine heatwaves were more likely to form well off the coast of Tasmania, while along the Gulf Stream more intense marine heatwaves start to appear more frequently close to the shore along the stretch of coastline from the state of Virginia to New Brunswick. This will almost certainly change ecosystems in these regions."

The key to this research was the use of two near-global high-resolution (1/10°) simulations over current and future periods, developed by the CSIRO Ocean Downscaling Strategic Project, which could reproduce eddies 100 km across and generate realistic boundary currents and fronts. This detailed approach revealed sometimes stark regional variability in ocean temperature extremes that coarser global climate models cannot resolve.

The researchers confirmed the accuracy of their model by comparing the detailed model outputs with observations from 1982-2018. They then used the same high-resolution model to project how marine heatwaves would alter with climate change out to 2050.

In every western boundary current they examined, more intense marine heatwaves appeared. In general, marine heatwaves also occurred more frequently.

But on the edge of these currents, it was a different story. Eddies that spun off from the main current created areas where the increases in numbers of heatwave days were lower than average and even some regions where heatwave intensity declined.

"Like so many aspects of the climate system, the warming of the oceans isn't the same everywhere, which means the ecology will respond differently to global warming, depending on location," said Assoc Prof Peter Strutton.

"Detailed modelling like this is the first step in understanding which ecosystems will thrive or decline, how the productivity of the ocean will change, and those parts of the food chain most likely to be affected. This is exactly the kind of knowledge we need to adapt to the inevitable consequences of global warming."

Credit: 
University of New South Wales

Structural colors from cellulose-based polymers

A surface displays structural colors when light is reflected by tiny, regular structural elements in a transparent material. Researchers have now developed a method to make structural colors from cellulose-based polymers by using coated droplets that exist in a surrounding fluid--so-called liquid marbles. The system readily responds to environmental changes, which makes it interesting for applications in bio-based sensors and soft photonic elements, according to the study published in the journal Angewandte Chemie.

Structural colors are a way to colorize a material without using a dye. Instead, the transparent material generates color through the regular arrangement of its molecules or other elements, as seen, for instance, in the ripples in the scales of colorful fish and butterflies, or in nanocrystals arranged at certain distances, as in the color-changing skin of chameleons.

Manos Anyfantakis and colleagues at the University of Luxembourg have identified a means to control the pitch, the distance of one full helical turn in a polymer, which serves as the structural element from which reflection occurs and structural colors appear. Scientists can prepare liquid crystalline phases of biopolymers, called cholesteric phases, with pitches that generate structural colors, but these preparations depend on many parameters and take a long time to reach equilibrium.
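
The link between pitch and color can be illustrated with the standard relation for cholesteric phases at normal incidence: the peak reflected wavelength is roughly the average refractive index times the pitch. The pitch and index values below are illustrative assumptions, not measurements from the study.

```python
# Sketch: peak reflected wavelength of a cholesteric helix at normal incidence,
# lambda = n_avg * pitch. Values are illustrative, not from the study.

def reflected_wavelength_nm(pitch_nm, n_avg=1.5):
    """Peak reflected wavelength (nm) for a cholesteric helix of given pitch."""
    return n_avg * pitch_nm

# A shorter pitch reflects bluer light; a longer pitch shifts toward red.
for pitch in (300, 350, 430):  # nm, assumed values
    print(pitch, "nm pitch ->", reflected_wavelength_nm(pitch), "nm reflected")
```

This is why the concentration-driven change in pitch described below tunes the marbles through red, green, and blue.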

Now, Anyfantakis and colleagues have discovered a faster and better controllable method, using liquid marbles as a platform for the controlled self-assembly of biopolymer-based structural colors. Liquid marbles are millimeter-sized droplets of liquid crystalline solutions, which are coated with nanoparticles. The coating protects the liquid from mixing with the outside fluid, but still allows for some interaction, dependent on the nature of both liquids.

In this case, the scientists prepared liquid marbles from an aqueous solution of hydroxypropyl cellulose--a modified cellulose polymer that orients itself in cholesteric phases--coated by silica nanoparticles. These cellulose-based liquid marbles were colorless at first, but allowing them to stay for some time in a defined volume of an organic solvent slowly brought about bright colors of red, green, and blue.

The colors were the result of a concentration change in the droplets, the authors found. The organic solvent slowly extracted water from the liquid marbles, which caused the biopolymer to adopt a crystalline form suitable for structural colors. Slowness and controllability were pivotal, the authors pointed out, because "this gives enough time for the polymer molecules to adjust to the concentration change, organizing with a new equilibrium pitch."

The method is elegant and simple, and the colors depend only on the volume of the organic solvent. The scientists also applied external stimuli such as heat, pressure, or exposure to chemicals and observed characteristic color shifts, corresponding to a varying pitch size. These shifts were reversible: when normal conditions were restored, the liquid marbles returned to their original colors, the authors observed.

The authors believe that the biopolymer-based liquid marbles could offer a route to synthesize cost-effective, environmentally friendly, and sustainable sensors.

Credit: 
Wiley

A novel salvinia-like slippery surface

image: Design of the Salvinia-like slippery surface. (a) Salvinia molesta floating leaf, on which a water drop displays a stable Cassie state. Though the rational design of an elastic eggbeater-shaped microstructure with a surface energy gradient in the vertical direction can stabilize the contact line to prevent impalement, such a structure has strong adhesion because of the hydrophilic patches. (b) A pitcher plant-inspired slippery surface with molecularly smooth lubricant fixed on top of the microstructure, which enables fast drop or liquid transportation. (c) The combination constitutes the SSS, on which a water drop shows a slippery stable Cassie state. The black, purple, and red solid arrows represent the directions of pressure (P), gravity (g) and the velocity of drop transport (V), respectively.

Image: 
©Science China Press

Superhydrophobic surfaces (SHPOS) are widely used in many industrial settings. They mainly consist of rough solid protrusions that entrap air to minimize the liquid/solid contact area. The stability of the superhydrophobic state favors a relatively small spacing between protrusions; however, this in turn increases the lateral adhesion force that retards the mobility of drops. Thus, simultaneously ensuring the stability of the Cassie state and minimizing the lateral adhesion force remains a great challenge for high-performance SHPOS.

In nature, Salvinia leaves show a long-lasting Cassie state under water, owing to hydrophobic eggbeater-like trichomes with hydrophilic pins on top. The hydrophilic-to-hydrophobic boundary pins the water/air contact line in the vertical direction. However, the pinning effect also diminishes the mobility of the contact line in the horizontal direction. In another line of research, slippery liquid-infused porous surfaces (SLIPS), inspired by Nepenthes pitcher plants, have been demonstrated as promising substrates where a low lateral adhesion force for drops of any liquid is required. A drop on a liquid-infused slippery surface, however, shows both a smaller contact angle and a lower shedding velocity compared to SHPOS. Thus, to obtain a structure that combines the stability of the Cassie state with a minimal lateral adhesion force, SHPOS and SLIPS need to be combined. However, introducing a stable air cushion between the protrusions of a slippery surface is challenging because of the low surface tension of the lubricant.

In response to this challenge, inspired by the Salvinia leaf with its stable water/air contact line and the Nepenthes pitcher plant with its mobile water/air contact line, the materials surface science research team led by Professor Xu Deng of the University of Electronic Science and Technology of China (UESTC), in cooperation with Professor Periklis Papadopoulos (University of Ioannina), recently proposed a Salvinia-like slippery surface (SSS). The SSS consists of a lubricant-infused cross-linked polydimethylsiloxane (PDMS) layer on top of pillars with hydrophobic side walls. The lubricant creates an additional energy barrier against quasi-static and dynamic impalement. Furthermore, the oil layer on top of the structure also works as a lubricant, reducing adhesion and improving drop mobility significantly. Therefore, drops on the SSS show a stable slippery Cassie state, avoiding the strong pinning effect of the hydrophilic patches on the Salvinia plant (Figure 1). Compared with a control surface with the same structure but without lubricant, the SSS exhibits increased stability against pressure and impact, enhanced lateral mobility of water drops, and reduced hydrodynamic drag. Owing to its easy fabrication and enhanced performance, the SSS will be useful for transporting viscous fluids and in pipelines and microfluidic devices.

Credit: 
Science China Press

How Neanderthals adjusted to climate change

Climate change occurring shortly before their disappearance triggered a complex change in the behaviour of late Neanderthals in Europe: they developed more complex tools. This is the conclusion reached by a group of researchers from Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and Università degli Studi di Ferrara (UNIFE) on the basis of finds in the Sesselfelsgrotte cave in Lower Bavaria.

Neanderthals lived approximately 400,000 to 40,000 years ago in large areas of Europe and the Middle East, even as far as the outer edges of Siberia. They produced tools using wood and glass-like rock material, which they also sometimes combined, for example to make a spear with a sharp and hard point made of stone.

From approximately 100,000 years ago, their universal cutting and scraping tool was a knife made of stone, with the handle consisting of a blunt edge on the tool itself. These Keilmesser (backed, asymmetrical bifacially-shaped knives) were available in various shapes, leading researchers to wonder why the Neanderthals created such a variety of knives. Did they use different knives for different tasks, or did the knives come from different sub-groups of Neanderthals? This was what the international research project hoped to find out.

Keilmesser are the answer

'Keilmesser are a reaction to the highly mobile lifestyle during the first half of the last ice age. As they could be sharpened again as and when necessary, they were able to be used for a long time - almost like a Swiss army knife today,' says Prof. Dr. Thorsten Uthmeier from the Institute of Prehistory and Early History at FAU. 'However, people often forget that bi-facially worked knives were not the only tools Neanderthals had. Backed knives from the Neanderthal period are surprisingly varied,' adds his Italian colleague Dr. Davide Delpiano from Sezione di Scienze Preistoriche e Antropologiche at UNIFE. 'Our research uses the possibilities offered by digital analysis of 3D models to discover similarities and differences between the various types of knives using statistical methods.'

The two researchers investigated artefacts from one of the most important Neanderthal sites in Central Europe, the Sesselfelsgrotte cave in Lower Bavaria. During excavations in the cave conducted by the Institute of Prehistory and Early History at FAU, more than 100,000 artefacts and innumerable hunting remains left behind by the Neanderthals have been found, even including evidence of a Neanderthal burial. The researchers have now analysed the most significant knife-like tools using 3D scans produced in collaboration with Prof. Dr. Marc Stamminger and Dr. Frank Bauer from the Chair of Visual Computing at the Department of Computer Science at FAU. They allow the form and properties of the tool to be recorded extremely precisely.

'The technical repertoire used to create Keilmesser is not only direct proof of the advanced planning skills of our extinct relatives, but also a strategical reaction to the restrictions imposed upon them by adverse natural conditions,' says Uthmeier, FAU professor for Early Prehistory and Archaeology of Prehistoric Hunters and Gatherers.

Other climate, other tools

What Uthmeier refers to as 'adverse natural conditions' are climate changes after the end of the last interglacial more than 100,000 years ago. Particularly severe cold phases during the following Weichsel glacial period began more than 60,000 years ago and led to a shortage of natural resources. In order to survive, the Neanderthals had to become more mobile than before, and adjust their tools accordingly.

The Neanderthals probably copied the functionality of unifacial backed knives, which are only shaped on one side, and used these as the starting point to develop bi-facially formed Keilmesser shaped on both sides. 'This is indicated in particular by similarities in the cutting edge, which consists in both instances of a flat bottom and a convex top, which was predominantly suited for cutting lengthwise, meaning that it is quite right to refer to the tool as a knife,' says Davide Delpiano from UNIFE.

Both types of knife - the simpler older version and the newer, significantly more complex version - obviously have the same function. The most important difference between the two tools investigated in this instance is the longer lifespan of bi-facial tools. Keilmesser therefore represent a high-tech concept for a long-life, multi-functional tool, which could be used without any additional accessories such as a wooden handle.

'Studies from other research groups seem to support our interpretation,' says Uthmeier. 'Contrary to what some have claimed, the disappearance of the Neanderthals cannot have been the result of a lack of innovation or methodical thinking.'

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Plant scientists study the interaction of heat stress responses in corn

image: The Enviratron robotic rover collects data on plants.

Image: 
ISU News Service

AMES, Iowa - Environmental extremes driven by climate change create stresses in crops, and plant breeders are attempting to untangle the genetic factors that endow plants with tolerance to stress. A new study from Iowa State University scientists shows how two seemingly unrelated responses in corn plants interact to help the crop survive heat stress.

The study, published on Tuesday in the academic journal The Plant Cell, shows how a response called the unfolded protein response helps to activate the heat shock response when corn plants are exposed to hot weather conditions. The two responses operate in different parts of plant cells, and scientists previously assumed the responses were independent. But data gathered using the Enviratron, a highly controlled and automated facility at Iowa State equipped with a robotic rover and growth chambers, allowed the research team to show how one response influences another.

"These two systems have been thought to operate independently," said Stephen Howell, Distinguished Professor of Genetics, Development and Cell Biology and senior author of the study. "We've been able to show these systems sometimes work together to mitigate damage caused by heat and to protect the plant from stress."

Heat stress causes proteins to denature and misfold in the endoplasmic reticulum, an organelle inside cells. Misfolded proteins can be toxic, and their buildup sets off an alarm that activates the expression of genes that protect plants from heat stress. A similar response plays out in different locations of the cell, including the cytoplasm, where excessive heat activates the expression of a different set of genes encoding heat-shock proteins.

The new study shows that, although the two responses take place in different parts of the cell, they actually work in concert during heat stress: a powerful transcription factor produced in the unfolded protein response activates the expression of a key factor helping to trigger the heat shock response.

The scientists found that knocking out the unfolded protein response made corn plants more susceptible to heat stress and hindered the heat shock response. That raises the question of whether overexpressing the unfolded protein response could strengthen the ability of corn plants to withstand high heat, but Howell said doing so may have other undesirable consequences.

"There's a seesaw balance, if you will, between defense and growth," he said. "The more you contribute to defense, the more you sacrifice growth. It may be that you could provide somewhat greater defense to crops but you might do so at the expense of growth."

In their study, the researchers drew on data gathered in the Enviratron, a state-of-the-art facility at the ISU Ag Engineering/Agronomy Research Farm that utilizes a robotic rover that travels through a series of specialized growth chambers that carefully control the environments in which the plants are raised. Development of the Enviratron was funded through a grant from the National Science Foundation. Zhaoxia Li, first author of the paper and a postdoctoral scientist in Howell's lab, said the facility allows researchers to control variables such as temperature, moisture, light and carbon dioxide concentrations to study their effect on plant development.

Howell said previous scientific papers have described the design and construction of the Enviratron, but this is the first publication in a journal based on data generated in the facility.

"We hope that studies like this will emphasize the value of conducting such research under controlled environmental conditions offered by the Enviratron," he said.

Credit: 
Iowa State University

Host tissue T cells may have an unexpected role in graft-versus-host disease

Allogeneic hematopoietic stem cell transplantation (HSCT) is a procedure that infuses a donor's healthy blood-forming stem cells into a recipient as part of a potentially curative therapy for cancer. While this therapy can be life-saving, a major complication is the development of graft-versus-host disease (GVHD), which causes significant morbidity and can be fatal. Before allogeneic HSCT, a patient receives a conditioning regimen, chemotherapy designed to deplete their normal white blood cells, including T cells. But a new study by investigators from Brigham and Women's Hospital, University of Oslo (Norway) and University of Newcastle (UK) has found that skin and intestinal T cells in the recipient survive conditioning regimens and continue to perform their normal functions. But, under certain conditions, these T cells can become activated by donor white blood cells and play a previously unappreciated role in acute GVHD. The investigators' results are published in The Journal of Clinical Investigation.

"In all the years that GVHD has been studied, it has been an article of faith that graft T cells mediate the disease and attack the body," said Thomas Kupper, MD, chair of the Department of Dermatology at the Brigham. "We discovered in skin and gastrointestinal tract that the T cells that cause GVHD are also from the host -- that is, the patient's own T cells. These durable T cells from the host become activated by cells from the graft, thus causing tissue injury. This completely novel finding was unexpected and opens the door to new approaches to treatment and prevention."

Conditioning regimens are meant to deplete the host of normal white blood cells, including T cells, in order to make room for the new immune system that will develop from the graft. When recipient blood has been examined after conditioning, T cells are difficult to detect. But Kupper and colleagues found that while the recipients' blood T cells were depleted, their tissue T cells in the skin and gut were not. The researchers used high-throughput DNA sequencing of T cells and "short tandem repeat" (STR) analysis, which together determine the proportion of blood (or tissue) cells that derive from the donor (graft) or the recipient (host), respectively. They further studied male-female host/donor mismatch transplants, using XY sex chromosomes to determine the origin of the cells. The team also used mouse models, grafting human skin onto immunocompromised mice to avoid rejection and test the ability of host skin T cells to mediate GVHD without donor T cells.

Based on the high-throughput sequencing and STR analysis, the team saw that there were still abundant host T cells present in the skin and small intestine during GVHD, even when blood cells were 100% of donor origin. The mouse models demonstrated that skin-resident host T cells could be activated by donor non-T cell white blood cells to generate GVHD-like skin inflammation. The results indicate that, unexpectedly, skin- and intestinal-resident T cells not only survive conditioning regimens but are also present in tissues during acute GVHD and very likely play an important role in the pathophysiology of this disease.

"Our new understanding of GVHD allows us to think about tissue resident memory T cells in the host/recipient as a new target for therapy, which is potentially game changing," said Kupper. "Hypothetically, we could use this information to intervene earlier and perhaps even prevent the emergence of GVHD. This study is an example of how we must never assume we know everything about disease mechanism and must always be willing to challenge prevailing paradigms if that's where the data leads."

Credit: 
Brigham and Women's Hospital

NASA Terra Satellite examines Tropical Storm Hernan's relocated center

image: On Aug. 28, 2020 at 1 a.m. EDT (0500 UTC), the MODIS instrument that flies aboard NASA's Terra satellite revealed the most powerful thunderstorms (yellow), with cloud top temperatures as cold as minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius), near Hernan's center and over the Gulf of California. Surrounding that area, cloud top temperatures were as cold as (red) minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). All of those areas were generating heavy rain.

Image: 
NASA/NRL

NASA infrared imagery revealed a burst of strength in Tropical Storm Hernan, located over the Gulf of California. At 12:30 a.m. EDT, NOAA's National Hurricane Center or NHC noted that recent satellite-based wind data indicated Hernan was located northeast of previous estimates.

The body of water located between the Baja California Peninsula and the Mexican mainland is known as the Gulf of California. It is a marginal sea of the Pacific Ocean.

NHC noted late on Aug. 27 that Hernan appeared poorly organized and that, despite a burst of strength, the storm had weakened to a depression.

Infrared Data Provides a Temperature Check

Infrared data provides temperature information, and the strongest thunderstorms that reach highest into the atmosphere have the coldest cloud top temperatures.

On Aug. 28 at 1 a.m. EDT (0500 UTC), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard NASA's Terra satellite captured an infrared image of cloud top temperatures in Hernan that showed what appears to be its final burst of strength. MODIS found the powerful thunderstorms that developed were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 degrees Celsius) near Hernan's center and over the Gulf of California. Surrounding that area, cloud top temperatures were as cold as minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius). All of those areas were generating heavy rain, but within a couple of hours, they diminished.
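
The paired Fahrenheit and Celsius cloud-top values quoted in these reports follow the standard conversion formula, which a quick sketch can verify; the temperatures are the ones from the article.

```python
# Sketch: checking the Fahrenheit/Celsius cloud-top temperature pairs quoted
# in the text using the standard conversion F = C * 9/5 + 32.

def c_to_f(celsius):
    """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

for c in (-62.2, -56.6):  # cloud-top temperatures reported by MODIS
    print(f"{c} C is about {round(c_to_f(c))} F")
```

Running this reproduces the rounded minus 80 F and minus 70 F figures used in the imagery description.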

Hernan Weakened to a Depression

The National Hurricane Center (NHC) noted at 5 a.m. EDT that Hernan had weakened to a depression and strong thunderstorms had weakened. NHC said, "Shortly after the release of the previous advisory, microwave imagery from a WindSat overpass showed no indication of a well-defined center near Hernan's estimated location. However, there was a hint of a small vortex well to the northeast. Confidence is therefore fairly high that Hernan has persisted as a tropical cyclone, at least through 12 a.m. EDT (0400 UTC) this morning." WindSat is the primary instrument aboard the Coriolis mission satellite, which is jointly sponsored by the U.S. Dept. of Defense Space Test Program and the U.S. Navy.

Hernan's Status on Aug. 28, 2020

At 11 a.m. EDT (1500 UTC), the center of Tropical Depression Hernan was located near latitude 23.4 north, longitude 109.1 west, about 60 miles (100 km) northeast of the southern tip of Baja California, Mexico. The depression is moving toward the west-northwest near 21 mph (33 kph) and this motion is expected to continue through tonight. Maximum sustained winds have decreased to near 35 mph (55 kph) with higher gusts. Additional weakening is forecast, and Hernan is expected to degenerate to a remnant low-pressure area tonight. The remnants are expected to dissipate on Saturday. The estimated minimum central pressure is 1006 millibars.

Forecast from NHC

Based on decreasing satellite intensity estimates, Hernan was downgraded to a tropical depression. Additional weakening is forecast, and Hernan is expected to degenerate to a remnant low pressure area as it moves over the Baja California peninsula later today and tonight. The system is then expected to weaken to a trough (elongated area of low pressure) on Saturday.

Credit: 
NASA/Goddard Space Flight Center

Natural disasters must be unusual or deadly to prompt local climate policy change

image: A map of the U.S. depicting the extreme weather events included in the study.

Image: 
Oregon State University

CORVALLIS, Ore. -- Natural disasters alone are not enough to motivate local communities to engage in climate change mitigation or adaptation, a new study from Oregon State University found.

Rather, policy change in response to extreme weather events appears to depend on a combination of factors, including fatalities, sustained media coverage, the unusualness of the event and the political makeup of the community.

Climate scientists predict that the frequency and severity of extreme weather events will only continue to increase in coming decades. OSU researchers wanted to understand how local communities are reacting.

"There's obviously national and state-level climate change policy, but we're really interested in what goes on at the local level to adapt to these changes," said lead author Leanne Giordono, a post-doctoral researcher in OSU's College of Public Health and Human Sciences. "Local communities are typically the first to respond to extreme events and disasters. How are they making themselves more resilient -- for example, how are they adapting to more frequent flooding or intense heat?"

For the study, which was funded by the National Science Foundation, Giordono and co-authors Hilary Boudet of OSU's College of Liberal Arts and Alexander Gard-Murray at Harvard University examined 15 extreme weather events that occurred around the U.S. between March 2012 and June 2017, and any subsequent local climate policy change.

These events included flooding, winter weather, extreme heat, tornadoes, wildfires and a landslide.

The study, published recently in the journal Policy Sciences, found there were two "recipes" for local policy change after an extreme weather event.

"For both recipes, experiencing a high-impact event -- one with many deaths or a presidential disaster declaration -- is a necessary condition for future-oriented policy adoption," Giordono said.

In addition to a high death toll, the first recipe consisted of Democrat-leaning communities where there was focused media coverage of the weather event. These communities moved forward with adopting policies aimed at adapting in response to future climate change, such as building emergency preparedness and risk management capacity.

The second recipe consisted of Republican-leaning communities with past experiences of other uncommon weather events. In these locales, residents often didn't engage directly in conversation about climate change but still worked on policies meant to prepare their communities for future disasters.

In both recipes, policy changes were fairly modest and reactive, such as building fire breaks, levees or community tornado shelters. Giordono referred to these as "instrumental" policy changes.

"As opposed to being driven by ideology or a shift in thought process, it's more a means to an end," she said. "'We don't want anyone else to die from tornadoes, so we build a shelter.' It's not typically a systemic response to global climate change."

In their sample, the researchers didn't find any evidence of mitigation-focused policy response, such as communities passing laws to limit carbon emissions or require a shift to solar power. And some communities did not make any policy changes at all in the wake of extreme weather.

The researchers suggest that in communities that are ideologically resistant to talking about climate change, it may be more effective to frame these policy conversations in other ways, such as people's commitment to their community or the community's long-term viability.

Without specifically examining communities that have not experienced extreme weather events, the researchers cannot speak to the status of their policy change, but Giordono said it is a question for future study.

"In some ways, it's not surprising that you see communities that have these really devastating events responding to them," Giordono said. "What about the vast majority of communities that don't experience a high-impact event -- is there a way to also spark interest in those communities?"

"We don't want people to have to experience these types of disasters to make changes."

Credit: 
Oregon State University

Fossil evidence of 'hibernation-like' state in 250-million-year-old Antarctic animal

image: Life restoration of Lystrosaurus in a state of torpor.

Image: 
Crystal Shin

Hibernation is a familiar feature on Earth today. Many animals -- especially those that live close to or within polar regions -- hibernate to get through the tough winter months when food is scarce, temperatures drop and days are dark.

According to new research, this type of adaptation has a long history. In a paper published Aug. 27 in the journal Communications Biology, scientists at the University of Washington and its Burke Museum of Natural History and Culture report evidence of a hibernation-like state in an animal that lived in Antarctica during the Early Triassic, some 250 million years ago.

The creature, a member of the genus Lystrosaurus, was a distant relative of mammals. Antarctica during Lystrosaurus' time lay largely within the Antarctic Circle, like today, and experienced extended periods without sunlight each winter.

The fossils are the oldest evidence of a hibernation-like state in a vertebrate animal, and indicate that torpor -- a general term for hibernation and similar states in which animals temporarily lower their metabolic rate to get through a tough season -- arose in vertebrates even before mammals and dinosaurs evolved.

"Animals that live at or near the poles have always had to cope with the more extreme environments present there," said lead author Megan Whitney, a postdoctoral researcher at Harvard University who conducted this study as a UW doctoral student in biology. "These preliminary findings indicate that entering into a hibernation-like state is not a relatively new type of adaptation. It is an ancient one."

Lystrosaurus lived during a dynamic period of our planet's history, arising just before Earth's largest mass extinction at the end of the Permian Period -- which wiped out about 70% of vertebrate species on land -- and somehow surviving it. The stout, four-legged foragers lived another 5 million years into the subsequent Triassic Period and spread across swathes of Earth's then-single continent, Pangea, which included what is now Antarctica.

"The fact that Lystrosaurus survived the end-Permian mass extinction and had such a wide range in the early Triassic has made them a very well-studied group of animals for understanding survival and adaptation," said co-author Christian Sidor, a UW professor of biology and curator of vertebrate paleontology at the Burke Museum.

Paleontologists today find Lystrosaurus fossils in India, China, Russia, parts of Africa and Antarctica. These squat, stubby creatures -- most were roughly pig-sized, but some grew 6 to 8 feet long -- had no teeth but bore a pair of tusks in the upper jaw, which they likely employed to forage among ground vegetation and dig for roots and tubers, according to Whitney.

Those tusks made Whitney and Sidor's study possible. Like elephant tusks, Lystrosaurus tusks grew continuously throughout the animals' lives. The cross-sections of fossilized tusks can harbor life-history information about metabolism, growth and stress or strain. Whitney and Sidor compared cross-sections of tusks from six Antarctic Lystrosaurus to cross-sections of four Lystrosaurus from South Africa.

Back in the Triassic, the collection sites in Antarctica were at about 72 degrees south latitude -- well within the Antarctic Circle, at 66.3 degrees south. The collection sites in South Africa were more than 550 miles north during the Triassic at 58-61 degrees south latitude, far outside the Antarctic Circle.
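A back-of-the-envelope conversion (assuming roughly 69 statute miles per degree of latitude) confirms that the stated palaeolatitudes are consistent with the "more than 550 miles" separation:

```python
MILES_PER_DEG_LAT = 69.0  # approximate statute miles per degree of latitude

antarctic_sites = 72.0             # degrees south, Triassic palaeolatitude
south_africa_sites = (58.0, 61.0)  # degrees south, Triassic palaeolatitude

separations = [(antarctic_sites - lat) * MILES_PER_DEG_LAT
               for lat in south_africa_sites]
print(separations)  # [966.0, 759.0] -- both well over 550 miles
```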

The tusks from the two regions showed similar growth patterns, with layers of dentine deposited in concentric circles like tree rings. But the Antarctic fossils harbored an additional feature that was rare or absent in tusks farther north: closely spaced, thick rings, which likely indicate periods of less deposition due to prolonged stress, according to the researchers.

"The closest analog we can find to the 'stress marks' that we observed in Antarctic Lystrosaurus tusks are stress marks in teeth associated with hibernation in certain modern animals," said Whitney.

The researchers cannot definitively conclude that Lystrosaurus underwent true hibernation -- which is a specific, weeks-long reduction in metabolism, body temperature and activity. The stress could have been caused by another hibernation-like form of torpor, such as a more short-term reduction in metabolism, according to Sidor.

Lystrosaurus in Antarctica likely needed some form of hibernation-like adaptation to cope with life near the South Pole, said Whitney. Though Earth was much warmer during the Triassic than today -- and parts of Antarctica may have been forested -- plants and animals below the Antarctic Circle would still experience extreme annual variations in the amount of daylight, with the sun absent for long periods in winter.

Many other ancient vertebrates at high latitudes may also have used torpor, including hibernation, to cope with the strains of winter, Whitney said. But many famous extinct animals, including the dinosaurs that evolved and spread after Lystrosaurus died out, don't have teeth that grow continuously.

"To see the specific signs of stress and strain brought on by hibernation, you need to look at something that can fossilize and was growing continuously during the animal's life," said Sidor. "Many animals don't have that, but luckily Lystrosaurus did."

If analysis of additional Antarctic and South African Lystrosaurus fossils confirms this discovery, it may also settle another debate about these ancient, hardy animals.

"Cold-blooded animals often shut down their metabolism entirely during a tough season, but many endothermic or 'warm-blooded' animals that hibernate frequently reactivate their metabolism during the hibernation period," said Whitney. "What we observed in the Antarctic Lystrosaurus tusks fits a pattern of small metabolic 'reactivation events' during a period of stress, which is most similar to what we see in warm-blooded hibernators today."

If so, this distant cousin of mammals isn't just an example of a hardy creature. It is also a reminder that many features of life today may have been around for hundreds of millions of years before humans evolved to observe them.

Credit: 
University of Washington

Rejuvenating old organs could increase donor pool

Boston, MA -- Despite the limited supply of organs available for patients on transplant waitlists, organs from older, deceased donors are frequently discarded or not utilized. Older organs could help close the gap between demand and supply, a gap responsible for wait-times so long that many patients do not survive until an organ becomes available. However, older organs often provoke a stronger immune response and may put patients at greater risk of adverse outcomes and transplant rejection. Still, as the world population ages, organs from older, deceased donors represent an untapped and growing resource for patients in need. Investigators from Brigham and Women's Hospital are leading efforts to breathe new life into older organs by leveraging a new class of drugs known as senolytics, which target and eliminate old cells. Using clinical and experimental studies, the team presents evidence that senolytic drugs may help rejuvenate older organs, which could lead to better outcomes and a wider pool of organs eligible for donation. Results are published in Nature Communications.

"Older organs are available and have the potential to contribute to mitigating the current demand for organ transplantation," said corresponding author Stefan G. Tullius, MD, PhD, chief of the Division of Transplant Surgery at the Brigham. "If we can utilize older organs in a safe way with outcomes that are comparable, we will take a substantial step forward for helping patients."

As organs age, senescent cells accumulate. These cells, which no longer divide, escape the body's usual means of destroying older, unneeded cells. Senescent cells release cell-free mitochondrial DNA (mt-DNA), which also accumulates in older organs. Recent studies have suggested that this rise in mt-DNA is tied to organ rejection.

In their Nature Communications paper, Tullius and colleagues identified senescent cells as the key source of mt-DNA and present evidence that the accumulation of mt-DNA provokes an immune response leading to organ failure and rejection. Senolytic drugs force senescent cells back into the cell cycle, allowing the body to clear them. The researchers therefore examined whether senolytic drugs could be used to improve outcomes. In a mouse model, they treated organ donors with a combination of the senolytic drugs dasatinib and quercetin. The drugs reduced the number of senescent cells, reduced mt-DNA levels and decreased inflammation. Most relevantly, the survival of old organs treated with senolytics was comparable to that of organs originating from young donors.

Since the authors carried out their therapeutic experiments in a mouse model, further mechanistic studies are needed to evaluate whether senolytic drugs have the same effects on human organs from older donors and the same degree of success in clearing senescent cells, as well as whether organs can be treated effectively with senolytic drugs after they are harvested. The authors have already taken first steps in humans, determining that augmented levels of mt-DNA circulate in older organ donors.

"We have not yet tested the effects clinically, but we are well prepared to take the next step toward clinical application by using a perfusion device to flow senolytic drugs over organs and measure whether or not there are improvements in levels of senescent cells," said Tullius. "Our data provide a rationale for considering clinical trials treating donors, organs, and/or recipients with senolytic drugs to optimize the use of organs from older donors. The goal is to help to close the gap between organ availability and the needs of the many patients currently on transplant waiting lists."

Credit: 
Brigham and Women's Hospital

First in situ radiation measurements 21 km up into the air over Tibetan Plateau

image: The release of the stratospheric balloon-based radiation measurement system.

Image: 
Jinqiang Zhang

Radiation variations over the Tibetan Plateau (TP) are crucial for global climate and regional ecological environment. Previous radiation studies over the TP were widely based on ground and satellite measurements of the radiation budget at the surface and at the top of the atmosphere.

In situ vertical radiation measurements from the surface up to the upper troposphere and lower stratosphere (UTLS), about 10-22 km in altitude, are rare over the TP, and indeed over much of China.

Dr. Jinqiang Zhang from the Institute of Atmospheric Physics (IAP) of the Chinese Academy of Sciences (CAS), in collaboration with scientists from the Aerospace Information Research Institute of CAS, developed a balloon-based measurement system to measure stratospheric radiation.

This original system, for the first time, provides in situ measurements of multiwavelength radiation profiles from the surface up to the UTLS over the TP. Using this system, scientists can study how and why radiation profiles vary over the TP during the Asian summer monsoon period.

Three observation campaigns were conducted in the summers of 2018 and 2019. The longest flight lasted more than 30 hours, achieving breakthrough measurements of diurnal radiation variation in the UTLS.

According to the team, the stratospheric balloon-based radiation profiles, combined with simultaneous operational radiosondes, ground measurements, satellite retrievals and radiative transfer model simulations, are valuable because the data can be used to study radiation variations and the radiative forcings of clouds and aerosols over the TP during the Asian summer monsoon period. The radiation retrievals from the radiative transfer model simulations and satellite observations are also validated.

"The results of these campaigns can improve our understanding of radiation properties in the UTLS and help us better comprehend the thermal conditions associated with clouds and aerosols over the TP during the Asian summer monsoon period," said Zhang.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Growing demand for zero-deforestation cacao might not help Colombian forests

image: Harvested cacao pods, near Gauchené, in the north of Colombia's Cauca Department.

Image: 
Neil Palmer / International Center for Tropical Agriculture

When Brazil's soy industry agreed to refuse soy grown on deforested land in the Amazon, the movement spread worldwide. Brazil's Soy Moratorium in 2006 became the first zero-deforestation agreement, and from cocoa in Ghana to palm oil in Indonesia, companies now had to explain: where was their product from? Did it contribute to deforestation?

But more than a decade later, there is little concrete evidence that zero-deforestation pledges have cut deforestation or carbon emissions. While support for zero-deforestation has grown, companies still have no guidelines against which to measure progress. The result is a vague nod in the direction of improvement, with little evidence that it works.

"There are no one-size-fits-all, silver bullet solutions, and zero-deforestation supply chains are no exception," said Augusto Castro-Nunez, a scientist at the Alliance of Bioversity International and CIAT. "It just doesn't work as a blanket approach for all countries and all supply chains. The proliferation of zero-deforestation pledges may create uncertainty for small and large producers alike, without a clear roadmap to implement those pledges."

A smarter approach might be to include a raft of sustainability measures along every part of the supply chain, embedded as value addition. This is because ending agriculture-driven deforestation requires global-level commitments - like zero-deforestation pledges - to be tailored to local contexts.

Colombia is a case in point. Here, cocoa does not appear to replace virgin forest. Rather, it tends to replace illicit coca, the plant used for cocaine production. The research led by Castro-Nunez also shows that in Colombia, cocoa production, even if only for local markets, offers a pathway out of conflict and poverty.

The research, which maps cacao production in Colombia and overlays it with deforestation hotspots, was published in Applied Geography and has been used by the Colombian Cacao, Forest and Peace Initiative. The study was funded by Germany's International Climate Initiative, or IKI, as part of its Sustainable Land Use Systems (SLUS) project led by the Alliance.

Yet even though the science tells us that cocoa is not a driver of deforestation in Colombia the way palm oil is in Indonesia, small producers must still adhere to zero-deforestation practices. Indeed, the top prices and niche markets zero-deforestation cocoa attracts continue to sweet-talk cacao stakeholders across the country.

Around 90 percent of these small farmers live in the poor and post-conflict areas where cocoa is produced, and they risk being cut out of supply chains if they do not comply with zero-deforestation requirements. The likely impact: a switch to other crops, like coca, exacerbating the very conflict these small producers are trying to escape.

"The expectation for producers is that, because their cacao does not drive deforestation, it could reach new international markets and command higher prices," said Castro-Nunez. "This has not happened at wide scale, but there is potential."

Castro-Nunez and colleagues say that instead of dropping producers from supply chains for not meeting zero-deforestation requirements, different value-addition strategies can be adopted along the chain. Relationships can be made and nurtured; producers can receive support, finance and information to build their businesses sustainably. Suppliers must work together to add value and help producers out of conflict and poverty.

Much of the burden for investing in zero-deforestation pledges would be carried by producers. Without access to finance, knowledge and the right networks, producers are often not in a position to invest in meeting these goals, further reducing any chance of creating sustainable markets and helping themselves out of the situation.

Investment in a value-chain approach that builds peace and supports zero-deforestation in context-specific ways is therefore critical. Since cocoa in Colombia is promoted as an alternative to illicit coca, support must be given to help legal businesses develop and thrive, in the service of peace and transparency.

Reducing deforestation in agricultural production is undoubtedly a must. But first, the extent to which a product is contributing to deforestation in any specific location must be determined. We can't manage what we can't measure: we need more data about what is happening on the ground to define commitments.

In Colombia, demand for zero-deforestation cocoa might even drive up competition to produce cocoa for these high-end markets touting certification as a solution.

"Eventually, the laws of supply and demand tell us that this would drive up deforestation in the future, undermining the very goals the zero-deforestation movement was set up to meet," said Castro-Nunez.

So, while we know that we must reduce deforestation caused by agricultural intensification, we still don't know how to do it. Pledges with no roadmap for implementation cannot deliver zero-deforestation outcomes, let alone the broader raft of sustainability outcomes.

"I do believe that bringing about zero-deforestation and a sustainable future is possible. But it requires more than sweet-talk," said Castro-Nunez. "We need granular data, context-specific and peace-driven motivations. And a roadmap for implementation: one which makes a real difference for every person along a product supply chain, from smallholder farmer to chocolate lover."

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Round nanoparticles improve quality factors of surface lattice resonances: Study

image: An energy flux propagates along a surface and bypasses the nanoparticle at SLR. The hemisphere shape introduces weaker perturbations than the rod shape, resulting in much lower loss and a much higher quality factor.

Image: 
SIAT

Plasmonic surface lattice resonances (SLRs) supported by metal nanoparticle arrays have many merits such as strong field enhancements extended over large volumes, as well as long lifetimes, narrow linewidths, angle-dependent dispersion, and a wide range of wavelength tunability.

In order to improve the performance of SLR-based nanophotonic devices such as nanolasers, nonlinear optical devices, and optical sensors, much effort has been put into improving SLR quality factors.

A research group led by Dr. LI Guangyuan from the Shenzhen Institutes of Advanced Technology (SIAT) of the Chinese Academy of Sciences has found that nanohemisphere arrays can significantly improve the quality factors of SLRs.

The group's study, entitled "Exceptionally narrow plasmonic surface lattice resonances in gold nanohemisphere array," was published in the Journal of Physics D: Applied Physics on August 24.

In previous studies, SLRs were supported mainly by periodic arrays of metal nanorods. According to a recent review, the quality factors of such SLRs are ~150 for visible light, ~300 for telecom wavelengths, and ~500 for the mid-infrared regime.

Although nanoparticle shape is vital for quality factors, studies involving various geometries did not produce the anticipated narrowing of the localized surface plasmon resonances (LSPRs) associated with these particles.

In this study, the researchers investigated SLRs supported by a 2D periodic nanohemisphere array embedded in a symmetric dielectric environment. Their simulation results showed that out-of-plane SLRs can have an ultra-narrow resonant linewidth (~0.9 nm) at visible wavelengths around 715 nm.

This result corresponded to an exceptionally high quality factor of 794, which was an order of magnitude larger than that of widely adopted nanorods.
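The quoted figure follows from the standard definition of the quality factor as resonance wavelength over linewidth (Q = lambda / delta-lambda), assumed here as the basis of the calculation:

```python
resonance_nm = 715.0  # out-of-plane SLR wavelength
linewidth_nm = 0.9    # ultra-narrow resonant linewidth

q_factor = resonance_nm / linewidth_nm  # Q = wavelength / linewidth
print(round(q_factor))  # 794
```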

In addition, the team also showed how to achieve high quality factors based on detuning between the Rayleigh anomaly and the LSPR of an isolated nanoparticle.

"The energy flux propagates along the surface and bypasses the nanoparticle, which mimics a stream bypassing a stone," said Dr. LI Guangyuan. "We all know that a round stone introduces weaker perturbations. This inspired us to replace nanorods with nanohemispheres."

The researchers are now working to fabricate 2D nanohemisphere array patterns with controlled feature size and shape, which is challenging but feasible.

They believe that SLRs supported by a 2D nanohemisphere array, featuring much higher quality factors than nanorods, will be attractive in diverse applications, including nanolasers, nonlinear optics, and ultrasensitive sensing.

Credit: 
Chinese Academy of Sciences Headquarters

QUT algorithm could quash Twitter abuse of women

Online abuse targeting women, including threats of harm or sexual violence, has proliferated across all social media platforms but QUT researchers have developed a statistical model to help drum it out of the Twittersphere.

Associate Professor Richi Nayak, Professor Nicolas Suzor and research fellow Dr Md Abul Bashar from QUT have developed a sophisticated and accurate algorithm to detect these posts on Twitter, cutting through the raucous rabble of millions of tweets to identify misogynistic content.

The team, a collaboration between QUT's faculties of Science and Engineering and Law and the Digital Media Research Centre, mined a dataset of 1M tweets, then refined it by searching for tweets containing one of three abusive keywords - whore, slut, and rape.
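The keyword refinement step can be sketched as a simple pre-filter. The helper name and sample tweets below are illustrative assumptions, not material from the study:

```python
ABUSIVE_KEYWORDS = ("whore", "slut", "rape")

def contains_abusive_keyword(tweet: str) -> bool:
    """Keep a tweet if it contains any of the three study keywords."""
    text = tweet.lower()
    return any(word in text for word in ABUSIVE_KEYWORDS)

# Illustrative examples only -- not tweets from the dataset
sample = [
    "what a lovely morning",
    "she is such a SLUT, honestly",
]
filtered = [t for t in sample if contains_abusive_keyword(t)]
print(len(filtered))  # 1
```

Note that keyword presence alone does not establish intent; in the study, the retained tweets were then manually labelled before training the classifier.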

Their paper - Regularising LSTM classifier by transfer learning for detecting misogynistic tweets with small training set - has been published by Springer Nature.

"At the moment, the onus is on the user to report abuse they receive. We hope our machine-learning solution can be adopted by social media platforms to automatically identify and report this content to protect women and other user groups online," said Professor Nayak.

"The key challenge in misogynistic tweet detection is understanding the context of a tweet. The complex and noisy nature of tweets makes it difficult.

"On top of that, teaching a machine to understand natural language is one of the more complicated ends of data science: language changes and evolves constantly, and much of meaning depends on context and tone.

"So, we developed a text mining system where the algorithm learns the language as it goes, first by developing a base-level understanding then augmenting that knowledge with both tweet-specific and abusive language.

"We implemented a deep learning algorithm called Long Short-Term Memory with Transfer Learning, which means that the machine could look back at its previous understanding of terminology and change the model as it goes, learning and developing its contextual and semantic understanding over time."

While the system started with a base dictionary and built its vocabulary from there, context and intent had to be carefully monitored by the research team to ensure that the algorithm could differentiate between abuse, sarcasm and friendly use of aggressive terminology.

"Take the phrase 'get back to the kitchen' as an example -- devoid of context of structural inequality, a machine's literal interpretation could miss the misogynistic meaning," said Professor Nayak.

"But seen with the understanding of what constitutes abusive or misogynistic language, it can be identified as a misogynistic tweet.

"Or take a tweet like 'STFU BITCH! DON'T YOU DARE INSULT KEEMSTAR OR I'LL KILL YOU'. Without context, distinguishing a tweet like this from a genuinely misogynistic and abusive threat is incredibly difficult for a machine to do.

"Teaching a machine to differentiate context, without the help of tone and through text alone, was key to this project's success, and we were very happy when our algorithm identified 'get back to the kitchen' as misogynistic -- it demonstrated that the context learning works."

The research team's model identifies misogynistic content with 75% accuracy, outperforming other methods that investigate similar aspects of social media language.

"Other methods based on word distribution or occurrence patterns identify abusive or misogynistic terminology, but the presence of a word by itself doesn't necessarily correlate with intent," said Professor Nayak.

"Once we had refined the 1M tweets to 5000, those tweets were then categorised as misogynistic or not based on context and intent, and were input to the machine learning classifier, which used these labelled samples to begin to build its classification model.

"Sadly, there's no shortage of misogynistic data out there to work with, but labelling the data was quite labour-intensive."

Professor Nayak and the team hoped the research could translate into platform-level policy that would see Twitter, for example, remove any tweets identified by the algorithm as misogynistic.

"This modelling could also be expanded upon and used in other contexts in the future, such as identifying racism, homophobia, or abuse toward people with disabilities," she said.

"Our end goal is to take the model to social media platforms and trial it in place. If we can make identifying and removing this content easier, that can help create a safer online space for all users."

Credit: 
Queensland University of Technology