Tech

Driver's-ed-inspired system could make automated parallel parking more accessible

One of the most challenging tasks for drivers is parallel parking, which is why automatic parking systems are becoming a popular feature on some vehicles. However, the cost of designing and implementing such computing-intensive systems can significantly increase a vehicle's price, creating a barrier to adding the feature in many models.

Now researchers have developed a more efficient automated parking guidance control strategy that mimics the approach to parallel parking commonly used by human drivers. This new, simpler automatic parking method has the potential to reduce the computing and storage resources required in the vehicle, which could lead to lower system costs and higher adoption rates by vehicle manufacturers.

The results of the research are published in IEEE/CAA Journal of Automatica Sinica, a joint publication of the Institute of Electrical and Electronics Engineers (IEEE) and the Chinese Association of Automation (CAA).

"We observed the way students typically learn how to parallel park in driving schools and determined that they use a relatively simple three-step process," said Li Li with the Department of Automation at Tsinghua University in Beijing, China. "Unlike conventional approaches to automatic parking, our new method focuses on simplifying control rules and strategies, rather than adding complicated feedback controllers and technical assistance systems."

The three-step guidance control strategy is based on the parallel parking method taught in many driver education classes. First, drivers align their vehicle next to the car in front of the open parking space. Next, the drivers back their vehicle up while making a hard-right turn until reaching a critical angle position. Finally, the drivers turn the steering wheel to a hard-left position and continue backing up until arriving in the parked position.

"By reducing the parking process to three simple steps, we limit the number of variables to five, of which the maximum allowable steering angle and velocity can be determined in advance," said co-author Lingxi Li, associate professor at Indiana University-Purdue University in Indianapolis. "Therefore, we can focus on controlling for just three variables?the starting point, the size of the open parking space and the critical angle position. This greatly simplifies designing and implementing the programming and computational resources for the onboard parking system."

The researchers plan to explore other methods of integrating human driving experiences in hybrid-augmented intelligence systems for future intelligent vehicle applications.

Credit: 
Chinese Association of Automation

Looking outside the fiber: Researchers demonstrate new concept of optical fiber sensors

image: Researchers at Bar-Ilan University in Israel have demonstrated a new concept of optical fiber sensors that addresses a decades-long challenge: the distributed mapping of refractive index outside the cladding of standard fiber, where light does not reach. The sensor can be used for leak detection in critical infrastructure, and process monitoring in the petrochemical industry, desalination plants, food and beverage production and more.

Distributed mapping of media outside the cladding along two meters of standard optical fiber. Two short segments immersed in water and ethanol are clearly identified by the local spectra of coupling to a cladding mode of the fiber.

Image: 
Prof. Avinoam Zadok

Optical fibers enable our era of the internet, as they carry vast amounts of data all around the world. Fibers are also an excellent sensor platform: they can reach over hundreds of kilometers, can be simply embedded within structures, and can be installed in hazardous environments where the use of electricity is prohibited. However, optical fiber sensors also face an inherent, fundamental challenge.

"Everything the light touches is our kingdom," says doctoral student Hagai Diamandi from the Faculty of Engineering at Bar-Ilan University in Israel. "In that, we mean to say that any optical measurement mandates that light should touch the medium under test." Standard optical fibers, however, are designed to do the exact opposite. "Standard fibers are made of a glass cladding, with a much thinner, inner core," continues Diamandi. "Light is guided at the inner core, and every effort is made to keep light from leaking outside. A substance under test, in most cases, lies outside the much larger cladding. Unfortunately, guided light does not touch upon much of the outside world."

A possible solution is available based on other forms of propagation in the same fiber. Doctoral student Yosef London explains: "In addition to the core mode, light can propagate in the fiber by filling out the entire cladding. In that case, it may 'feel' what's outside." But how do you get light to switch from the 'normal' core mode to those cladding modes? London continues: "Here there's a catch. Coupling to the cladding modes requires the inscription of permanent, periodic perturbations in the fiber medium, called 'gratings'. Gratings are written at specific, discrete locations. You cannot erase them or move them about." For that reason, cladding mode sensors are limited to point-measurements only.

The main strength of optical fiber sensors is spatially-distributed analysis, in which every fiber segment serves as an independent measurement node. Cladding modes could not support distributed measurements, until now. The breakthrough idea came from a third doctoral student in the group, Gil Bashan: "There is an alternative to the use of gratings. We can launch two strong optical waves into the fiber instead. When their frequencies are chosen correctly, the two waves can drive acoustic oscillations within the core of the fiber, at very high hypersonic frequencies. Those acoustic waves become our gratings." The principle is known as Brillouin dynamic gratings. Unlike permanent inscription, Brillouin dynamic gratings can be switched on and off at will. They can also be confined to short segments at arbitrary locations, and scanned along the fiber. "The principle has been used between core modes of fibers for over a decade," says Bashan. "We carry it over to the cladding modes."
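
For a sense of scale, the frequency offset between the two pump waves follows from the standard Brillouin phase-matching condition. A quick back-of-the-envelope calculation with textbook values for silica fiber -- generic assumptions, not numbers from the paper -- lands in the hypersonic range Bashan describes:

```python
# Brillouin frequency shift: nu_B = 2 * n_eff * V_a / wavelength
n_eff = 1.45          # effective refractive index of the core mode (typical)
V_a = 5960.0          # m/s, acoustic velocity in silica (textbook value)
wavelength = 1550e-9  # m, standard telecom wavelength

nu_B = 2 * n_eff * V_a / wavelength
print(f"pump frequency offset: {nu_B / 1e9:.1f} GHz")  # roughly 11 GHz
```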

In a paper published recently in the journal Optica, the group reports a distributed cladding-mode fiber sensor, the first of its kind. In doing so, they had to overcome considerable obstacles. Advisor Prof. Avi Zadok explains: "There is a large disparity in size between core and cladding modes. Core modes are confined to a very tight region. Cladding modes spread over an area 200 times larger. For that reason, we were concerned that the coupling between the two modes would be weak and inefficient." Nevertheless, the team was able to show precise measurement of the refractive index outside the cladding boundary of standard, unmodified optical fiber. The spatial resolution of the measurements was eight centimeters. The analysis correctly identified short fiber sections immersed in water and ethanol, and clearly distinguished between the two. The uncertainty in the index measurements was in the fourth decimal place.

Prof. Zadok concludes: "We have demonstrated a new concept of optical fiber sensors. It addresses a decades-long challenge: the distributed mapping of refractive index outside the cladding of standard fiber, where light does not reach." The sensor can be used for leak detection in critical infrastructure, and process monitoring in the petrochemical industry, desalination plants, food and beverage production and more.

Credit: 
Bar-Ilan University

Gratitude interventions don't help with depression, anxiety

COLUMBUS, Ohio - Go ahead and be grateful for the good things in your life. Just don't think that a gratitude intervention will help you feel less depressed or anxious.

In a new study, researchers at The Ohio State University analyzed results from 27 separate studies that examined the effectiveness of gratitude interventions on reducing symptoms of anxiety and depression.

The results showed that such interventions had limited benefits at best.

"For years now, we have heard in the media and elsewhere about how finding ways to increase gratitude can help make us happier and healthier in so many ways," said David Cregg, lead author of the study and a doctoral student in psychology at Ohio State.

"But when it comes to one supposed benefit of these interventions - helping with symptoms of anxiety and depression - they really seem to have limited value."

Cregg conducted the study with Jennifer Cheavens, associate professor of psychology at Ohio State. Their results were published online recently in the Journal of Happiness Studies.

There are two commonly recommended gratitude interventions, Cheavens said. One is the "Three Good Things" exercise: At the end of the day, a person thinks of three things that went well for them that day, then writes them down and reflects on them.

Another is a "gratitude visit," when a person writes a letter thanking someone who has made a difference in their life and then reads the letter to that person.

The 27 studies involved in this analysis often had participants do one of these exercises or something similar. The studies included 3,675 participants.

In many studies, participants who did the gratitude interventions were compared with people who performed a similar activity that was unrelated to gratitude. For example, instead of writing about what they were grateful for, a college student sample might write about their class schedule.

The gratitude intervention was not much better at relieving anxiety and depression than the seemingly unrelated activity.

"There was a difference, but it was a small difference," Cheavens said. "It would not be something you would recommend as a treatment."

As an alternative, Cheavens and Cregg recommend people pursue treatments that have been shown to be effective with anxiety and depression, such as cognitive behavioral therapy.

The results suggest that it isn't helpful to tell people with symptoms of depression or anxiety to simply be more grateful for the good things they have, Cheavens said.

"Based on our results, telling people who are feeling depressed and anxious to be more grateful likely won't result in the kind of reductions in depression and anxiety we would want to see," she said.

"It might be that these sort of interventions, on their own, aren't powerful enough or that people have difficulty enacting them fully when they are feeling depressed and anxious."

The results don't mean that there are no benefits to being grateful or to using gratitude interventions, the researchers said. In fact, some studies show that such interventions are effective at improving relationships.

"It is good to be more grateful - it has intrinsic virtue and there's evidence that people who have gratitude as a general trait have a lower incidence of mental health problems and better relationships," Cregg said.

"The problem is when we try to turn gratefulness into a self-help tool. Gratitude can't fix everything."

Credit: 
Ohio State University

Smart scaffolding to monitor tumor growth in real time and controlled environments

image: Photo taken beside the SERS equipment of the CIC biomaGUNE research group led by Ikerbasque Professor Luis Liz-Marzán.

Image: 
CIC biomaGUNE

4DbioSERS is a project geared towards the study of cancer -- melanoma and breast cancer in particular -- that seeks to better understand the growth and dynamics of tumours while avoiding the need for animal experiments. The five-year project is funded by 2.4 million euros from the European Research Council (ERC) as part of the prestigious ERC Advanced Grants call, which is awarded to high-risk, high-gain projects.

Liz-Marzán sums up the key aspects of the project thus: "We are working on the building of a kind of micrometric scaffolding using various methods (including 3D printing) and which has gold nanoparticles that will act as sensors built into it. A mixture of tumour cells, other types of cells and other components are cultured inside the scaffolding to reproduce a real tumour as faithfully as possible so that the aforementioned nanosensors will allow us to detect biomarkers relating to the evolution of the tumour in a range of conditions, which include changing the temperature or pH, adding drugs or creating other conditions that could affect it and which will help to design more effective treatments afterwards." They are also considering "marking some of the cells to see how they are displaced inside the tumour, or to see whether certain types of cells are segregated in a specific place so that the heterogeneity of the tumour can be studied", he added.

The tool used to detect the biomarkers and monitor the displacement of the cells is SERS (surface-enhanced Raman spectroscopy), which is capable of analysing a broad variety of substances with very high spatial resolution, even at extremely low concentrations. SERS uses the gold nanoparticles as sensors and also as labels, together with a laser that enables the molecules close to these nanoparticles to be seen.
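
As a rough illustration of how such SERS readouts are often interpreted -- a generic approach, not necessarily the group's own pipeline -- a measured spectrum can be decomposed into known reference spectra of the encoded reporters by least squares:

```python
import numpy as np

# Columns: reference SERS spectra of two hypothetical reporters,
# sampled at five Raman shifts (toy values, arbitrary units).
R = np.array([[0.90, 0.10],
              [0.50, 0.20],
              [0.10, 0.80],
              [0.05, 0.90],
              [0.30, 0.30]])

measured = 0.7 * R[:, 0] + 0.2 * R[:, 1]      # synthetic mixture
coeffs, *_ = np.linalg.lstsq(R, measured, rcond=None)
print("estimated reporter abundances:", np.round(coeffs, 2))  # ~[0.7, 0.2]
```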

Progress in parallel towards a single aim

In just over one year "we have achieved results that tell us we are heading in the right direction", confirmed the Ikerbasque professor. Firstly, they have shown that by using nanoparticles encoded for SERS "we can produce a three-dimensional reconstruction of systems formed by different types of cells organised into multilayers, with a resolution that allows us to differentiate between each layer of cells over relatively long periods of time". The advantage of the encoded particles used in this system is that they do not degrade over time, unlike the fluorescent molecules routinely used for detections of this type. The research group has managed to produce a kind of three-dimensional map of the positioning of the cells inside these complex systems; in other words, it has succeeded in controlling the fabricated cell system so as to demonstrate the three-dimensional detection of each cell that carries a code provided by the encoded particles. "This is a first step with a view to studying the dynamic evolution of these systems, in other words, to producing a 4D study (in three dimensions plus time)", he explained.

They have also demonstrated that it is possible to culture tumour cells and measure the evolution of different biomarkers in real time: cancer metabolites or substances that are generated as a result of the presence of cancer cells. "By using specially designed substrates, we have sufficient capacity to detect concentrations that are small enough to be significant in these tumour cultures. That way, we can see how the tumour cells that are developing in the system itself evolve over time and distinguish their behaviour under various conditions," explained Liz-Marzán. Specifically, they have managed to observe the evolution of two metabolites simultaneously; "we have seen that one increases its concentration while the other declines, which confirms that we are seeing in real time the metabolic process caused by enzymes that are expressed in these tumour cells," he added.

Likewise, the detection of a certain molecule indicates that "there are cells of a particular type that are dying in the system under those conditions". Liz-Marzán stresses the importance of this evidence because "it enables us to avail ourselves of practically remote detection, owing to the fact that our detector is not in direct contact with the cells, but simply studies the medium surrounding them. This is a significant step forward with respect to the ultimate objective". Finally, Liz-Marzán revealed that they are also working on building scaffolding for cell cultures using a 3D printer, which likewise allows detections to be made, but he concluded by saying that "there is still a long way to go".

Credit: 
Elhuyar Fundazioa

Spending on experiences versus possessions advances more immediate happiness

image: Consumers with piggy banks thinking about purchases.

Image: 
The University of Texas at Austin

Certain purchases are better than others at sparking people's in-the-moment happiness, according to new research from the McCombs School of Business at The University of Texas at Austin.

Lead author Amit Kumar, assistant professor of marketing, and his research team found that consumers are happier when they spend on experiential purchases versus material ones. The paper, "Spending on Doing Promotes More Moment-to-Moment Happiness than Spending on Having," is published in the May 2020 issue of the Journal of Experimental Social Psychology.

"One issue that hasn't really been examined much is what happens in the here and now -- are we happier spending our money on an experience or on a material item?" Kumar said. "The basic finding from a lot of experiments is that people derive more happiness from their experiences than from their possessions."

Kumar and his co-authors, Matthew Killingsworth from the University of Pennsylvania, and Thomas Gilovich from Cornell University, recruited 2,635 adults who were randomly assigned to a material or experiential group. The participants were sent random texts during the day to monitor their emotions and their purchasing behavior. Material purchasers bought things such as jewelry, clothing or furniture, while experiential shoppers attended sporting events, dined at restaurants, or engaged in other experiences. The results: Happiness was higher for participants who consumed experiential purchases versus material ones in every category, regardless of the cost of the item.

"It would be unfair to compare a shirt to a trip, but when we account for price, we still see this result where experiences are associated with more happiness," Kumar said.

To address possible differences in types of consumers, the researchers conducted a second study in which they asked more than 5,000 participants to first rate their happiness and then report whether they had used, enjoyed, or consumed either a material or experiential purchase within the past hour. If they responded "yes," participants were asked a series of questions about the details of their purchase.

"We still observed the same effect," Kumar said. "When the very same person was consuming an experience, that was associated with more happiness."

The researchers concluded that people are happier with experiential purchases than with material ones irrespective of when happiness is measured: before, during or after consumption. Experiences also provide more satisfaction even though people typically spend more time using their material possessions. The researchers said a possible explanation is the endurance of experiences in people's memories, while the perceived value of material goods weakens over time.

"If you want to be happier, it might be wise to shift some of your consumption away from material goods and a bit more toward experiences," Kumar said. "That would likely lead to greater well-being."

Credit: 
University of Texas at Austin

New guideline provides pathway to end homelessness, with housing as the foundation

A collaborative approach is required to build health care pathways that will end homelessness in Canada. Clinicians can play a role by tailoring their interventions using a comprehensive new clinical guideline on homelessness published in CMAJ (Canadian Medical Association Journal).

The guideline aims to inform clinicians and encourage collaboration with community organizations and policy-makers around priority steps and evidence-based interventions for homeless and precariously housed people and those at risk of homelessness.

"Homelessness has become a health emergency, not just a social issue. And we now know how to end it," says Dr. Kevin Pottie, Bruyère Research Institute and the University of Ottawa, Ottawa, Ontario. "It is critical to bring more clinicians into the conversation about homelessness and vulnerably housed people."

A network of clinicians, academics, governmental and nongovernmental stakeholders, called the Homeless Health Research Network, as well as five people with lived experience of homelessness, created the guideline. A steering committee with representatives from across Canada helped coordinate the process.

"Housing is medicine," says Amanda DiFalco, a fellow at the Institute of Global Homelessness and someone who has experienced homelessness herself. "We need to integrate this guidance into health policy and how we teach the next generation of clinicians."

Clinicians can learn to adapt their clinical approach to meet a patient's needs -- both social and medical.

The guideline recommends the following interventions to help patients who are homeless or vulnerably housed:

1. Permanent supportive housing: connect homeless or vulnerably housed people with a local housing coordinator or case manager to provide links to housing options

2. Income assistance: help people with income insecurity to find and access income-support resources

3. Case management: ensure people with mental health and substance use disorders access local mental health programs and psychiatric services

4. Opioid agonist therapy: provide access to opioid agonist therapy in primary care or a referral to an addiction specialist

5. Harm reduction: identify appropriate management for people with substance use issues, or refer them to local addiction and harm reduction services

The homeless population in Canada has changed considerably over the last 25 years, from mostly middle-aged men to increasing numbers of women, youth, Indigenous people, older adults and even families. The estimated homeless population in 2014 was 235,000, of whom 27.3% were women, 18.7% were youth, 6% were recent immigrants or migrants, and a growing number were veterans or seniors.

"The successful implementation of the guideline starts with permanent supportive housing, commonly known as Housing First," says coauthor Tim Aubry, University of Ottawa. "Once successfully housed, individuals and families are in a much better position to receive the health care that they need."

Inner City Health Associates, the Canadian Medical Association and the Public Health Agency of Canada funded the guideline. The Homeless Health Research Network will update the guideline every five years.

Credit: 
Canadian Medical Association Journal

Immune systems of babies born prematurely can catch up, study finds

Researchers from King's College London & Homerton University Hospital have found that babies born before 32 weeks' gestation can rapidly acquire some adult immune functions after birth, equivalent to those achieved by infants born at term.

In research published today in Nature Communications, the team followed babies born before 32 weeks' gestation to identify different immune cell populations, the state of these populations, their ability to produce mediators, and how these features changed post-natally. They also took stool samples and analysed them to see which bacteria were present.

They found that all the infants' immune profiles progressed in a similar direction as they aged, regardless of the number of weeks of gestation at birth. Babies born at the earliest gestations -- before 28 weeks -- showed a greater degree of change over a similar time period than those born at later gestations. This suggests that preterm and term infants converge on a similar immune profile within a similar time frame, and that immune development in all babies follows a set path after birth.

Dr Deena Gibbons, a lecturer in Immunology in the School of Immunology & Microbial Sciences, said: "These data highlight that the majority of immune development takes place after birth and, as such, even those babies born very prematurely have the ability to develop a normal immune system."

Infection and infection-related complications are significant causes of death following preterm birth. Despite this, there is limited understanding of the development of the immune system in babies born prematurely, and how this development can be influenced by the environment post birth.

Some preterm babies who went on to develop infection showed reduced CXCL8-producing T cells at birth. This suggests that infants at risk of infection and complications in the first few months of their life could be identified shortly after birth, which may lead to improved outcomes.

There were limited differences driven by sex, which suggests that the few differences identified may play a role in the observation that preterm male infants often experience poorer outcomes.

The findings build on previous research studying the infant immune system.

Dr Deena Gibbons said: "We are continuing to study the role of the CXCL8-producing T cell and how it can be activated to help babies fight infection. We also want to take a closer look at other immune functions that change during infection to help improve outcomes for this vulnerable group."

Credit: 
King's College London

Experts discover toolkit to repair DNA breaks linked to aging, cancer and MND

Experts discover a new 'toolkit' of proteins which can repair breaks in DNA.

An accumulation of DNA breaks can cause ageing, cancer and Motor Neurone Disease (MND).

The finding could also help repair DNA breaks caused deliberately during chemotherapy treatment to kill cancerous cells.

A new 'toolkit' to repair damaged DNA that can lead to ageing, cancer and Motor Neurone Disease (MND) has been discovered by scientists at the Universities of Sheffield and Oxford.

Published in Nature Communications, the research shows that a protein called TEX264, together with other enzymes, is able to recognise and 'eat' toxic proteins that can stick to DNA and cause it to become damaged. An accumulation of broken, damaged DNA can cause cellular ageing, cancer and neurological diseases such as MND.

Until now, ways of repairing this sort of DNA damage have been poorly understood, but scientists hope to exploit this novel repair toolkit of proteins to protect us from ageing, cancer and neurological disease.

The findings could also have implications for chemotherapy, which deliberately causes breaks in DNA when trying to kill cancerous cells. Scientists believe targeting the TEX264 protein may offer a new way to treat cancer.

Professor Sherif El-Khamisy, co-founder and deputy director of the Healthy Lifespan Institute at the University of Sheffield and a professor in its Department of Molecular Biology and Biotechnology and its Neuroscience Institute, who co-led the research, said: "Failure to fix DNA breaks in our genome can impact our ability to enjoy a healthy life at an old age, as well as leave us vulnerable to neurological diseases like Motor Neurone Disease (MND).

"We hope that by understanding how our cells fix DNA breaks, we can help meet some of these challenges, as well as explore new ways of treating cancer in the future."

Professor Kristijan Ramadan from the University of Oxford, who co-led the research, said: "Our finding of TEX264, a protein that forms the specialised machinery to digest toxic proteins from our DNA, significantly changes the current understanding of how cells repair the genome and so protect us from accelerated ageing, cancer and neurodegeneration. I believe this discovery has a great potential for cancer therapy in the future and we are already pursuing our research in this direction."

Professor Ramadan added: "I am very proud of my research team who initially discovered the involvement of TEX264 in DNA repair."

Oxford's research was supported by funding bodies, including the Medical Research Council. Backing was also received from the Oxford Institute for Radiation Oncology and Department of Oncology.

Professor El-Khamisy's lab is funded by the Wellcome Trust and the Lister Institute of Preventive Medicine.

The work forms part of the research taking place at the University of Sheffield's Healthy Lifespan Institute and the Neuroscience Institute.

The Healthy Lifespan Institute brings together 120 world-class researchers from a wide range of disciplines with the aim of slowing down the ageing process and tackling the global epidemic of multi-morbidity - the presence of two or more chronic conditions - in a bid to help everyone live healthier, independent lives for longer and reduce the cost of care.

The Neuroscience Institute aims to translate scientific discoveries from the lab into pioneering treatments that will benefit patients living with neurodegenerative disorders.

The next step of the research will be to test whether the behaviour and properties of the TEX264 protein are altered in ageing and in neurological disorders such as MND.

Credit: 
University of Sheffield

Rain, more than wind, led to massive toppling of trees in Hurricane Maria, says study

image: Researchers survey damage to a forest plot following hurricanes Irma and Maria. Uprooting of trees in many cases may have had more to do with rain than wind.

Image: 
Kevin Krajick/Earth Institute

A new study says that hurricanes Irma and Maria combined in 2017 to knock down a quarter of the biomass contained in Puerto Rico's trees -- and that massive rainfall, more than wind, was a previously unsuspected key factor. The surprising finding suggests that future hurricanes stoked by warming climate may be even more destructive to forests than scientists have already projected. The study appears this week in the journal Scientific Reports.

"Up to now, the focus on damage to forests has been on catastrophic wind speeds. Here, the data show that rain tends to be the greatest risk factor," said Jazlynn Hall, a Columbia University PhD. student who led the study. Her team identified several ways in which extreme rain might topple trees, but they do not completely understand the phenomenon yet, she said. She said that adding in climate-driven extreme rainfall to the various dangers threatening tropical and subtropical forests suggests that they may store less carbon in the future than previously thought.

When Irma arrived off Puerto Rico on Sept. 6, 2017, it was then the most powerful Atlantic hurricane ever recorded. (Dorian, two years later, surpassed it.) But the main storm passed well off the coast; it dumped a foot of rain, but spared the island the heaviest winds. Forests suffered little damage. Then, two weeks later, on Sept. 20, Maria hit directly, with sustained winds of up to 130 miles per hour and an astonishing 5 feet of rain over 48 hours in some areas.

Extrapolating from a combination of satellite imagery and on-the-ground surveys made a year before the hurricanes, and repeated shortly after, the researchers say that in the wake of Maria, some 10.4 billion tons of Puerto Rico's tree biomass went down, with trunks snapped off, uprooted or stripped of leaves and branches -- 23 percent of the island's pre-hurricane forest. But the damage was not uniform, and the researchers sorted through various risk factors that might account for differences.

Conventional wisdom has it that big trees high up on slopes directly exposed to high winds should suffer the most in storms. Indeed, the researchers did find that canopy height was an overarching factor; they confirmed earlier research showing that the island's biggest trees were prime victims. After that, conventional wisdom dissolved. Drilling down past tree height, the scientists found that the next most important factors were the amount of rain a specific locality got, plus the maximum local sustained wind speeds. Underlying those: the amount of antecedent rain from Irma, plus the amount of water that could be stored in the first five feet or so of soil from both storms. Adding it all up, the researchers concluded that rain, and its resulting storage in soil, dominated in determining which locales suffered the worst damage. Slope, elevation, topographic protection from wind and orientation toward the wind turned out to be the weakest factors.
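
The paper does not spell out its statistical machinery in this summary, but the kind of risk-factor ranking described above can be illustrated with a generic variable-importance analysis on synthetic data (the weights below are invented for the example):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(size=(n, 4))  # columns: canopy height, rain, wind, slope
# Synthetic "damage" in which height and rain dominate and slope barely
# matters -- mimicking the ordering reported above, not the real data.
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.15 * X[:, 2] + 0.05 * X[:, 3]
     + rng.normal(scale=0.05, size=n))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["canopy height", "rainfall", "max wind", "slope"],
                     model.feature_importances_):
    print(f"{name:14s} importance {imp:.2f}")
```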

"It's surprising, in the sense that when you think about hurricane damage to forests, you think about wind," said Hall's advisor and paper coauthor Maria Uriarte, a professor at Columbia's Earth Institute. "We're very aware of what flooding does to human infrastructure, but not so much to natural ecosystems." Uriarte led a series of prior studies on the storms, including one last year suggesting that forests in the paths of increasingly powerful and frequent hurricanes may eventually go into permanent decline.

The researchers say extreme rain potentially could affect trees in multiple ways. For one, in relatively flat areas where soils are porous and have a high capacity to store water for extended periods, Irma probably pre-loaded the dirt with liquid. When Maria came along, the ground around tree root zones became waterlogged. This theoretically would weaken the soil and make it easier for wind to uproot trees.

In addition to uprooting, the researchers also found that many trees in high-damage areas instead suffered snapped trunks. This, Hall speculated, could happen because rain simultaneously increases the weight of the soil and a tree's canopy, exerting increased strain on the trunk in the face of high winds. A heavier canopy could also contribute to uprooting by simply making it easier for the tree to tip over in saturated soil, she said. Counterintuitively, trees growing on slopes might in many cases resist damage better, because soils there might drain more quickly than those in low-lying areas that are protected from wind, but which collect more rainfall.

"The protective role of topography may be lessened in storms of Hurricane Maria's magnitude, which may foreshadow similar effects in future intense storms," says the paper. "Our study supports the idea that compounded disturbances can interact in ways that cannot be predicted."

Hurricanes derive their strength from heated air, and previous studies have projected that, due to warming climate, wind speeds of North Atlantic hurricanes may increase by 6 to 15 percent by 2100. Perhaps more salient in light of the new study: warmer air also pulls in more moisture, and current models project that rainfall will increase even more drastically -- by 20-plus percent. Added to that, hurricanes may stall over land for longer periods, meaning that rainfall will not only be more intense but will also last longer. This is what allowed 2017's Hurricane Harvey, the wettest tropical cyclone ever recorded to hit the United States, to devastate southeast Texas.

A study last year by other researchers says that things may be heading this way already. It estimates that trends in sea-surface temperatures over the last 60 years have made the probability of Hurricane Maria-scale precipitation five times more likely. In addition, intervals between high-rain storms like Irma and Maria have already decreased by 50 percent, hiking up the possibility of the sequence that took place in 2017.

Tropical forests are now absorbing a third less carbon from the air than they did in the 1990s, according to a study out last week. The main reasons right now are burning and logging of trees, higher temperatures and droughts. But if the new study holds up, in some places it may not be the fire next time, but water.

Credit: 
Columbia Climate School

Why organisms shrink

Everyone is talking about global warming. A team of palaeontologists at the GeoZentrum Nordbayern at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has recently investigated how prehistoric organisms reacted to climate change, basing their research on belemnites. These shrank significantly when the water temperature rose as a result of volcanic activity approximately 183 million years ago, during the period known as the Toarcian. The FAU research team published their results in the journal Royal Society Open Science.

'Belemnites are particularly interesting, as they were very widespread for a long time and are closely related to the squid of today,' explains palaeontologist Dr. Patricia Rita. 'Their fossilised remains, for example the rostrum, can be used to make reliable observations.' Within the context of the DFG-funded research project 'Temperature-related stresses as a unifying principle in ancient extinctions', the hypothesis was confirmed that climate has a significant influence on the morphology of adult aquatic organisms: the body size of dominant species fell on average by up to 40 percent. The team of researchers believe that this Lilliput effect was a precursor to the later extinction of the animals. It is still unclear whether rises in temperature influenced the organisms' metabolism directly or indirectly, for example through a shortage of food sources.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Scientists propose new method for large-scale production of thermally stable single-atom catalysts

A research group led by Prof. QIAO Botao from the Dalian Institute of Chemical Physics (DICP) of the Chinese Academy of Sciences proposed a new method for large-scale production of thermally stable single-atom catalysts (SACs) with high metal loading. Their findings were published in Nature Communications on Mar. 9.

SACs can maximize precious metal utilization and generate well-defined and uniform active sites. However, large-scale production of thermally stable SACs, especially in a simple way, remains a challenge.  

The researchers mixed RuO2 powder with a high-surface-area Fe-containing spinel support. After high-temperature calcination (900 °C), they found that the submicron RuO2 powder dispersed directly into Ru single atoms.

Detailed studies revealed that, unlike in the traditional gas-phase atom-trapping approach, the dispersion of RuO2 was promoted, and the atoms were trapped and stabilized, by a strong covalent metal-support interaction with FeOx in the spinel support.

In addition, the obtained Ru SAC showed excellent thermal stability and improved activity for N2O decomposition.

This environmentally friendly and low-cost preparation method could achieve kilogram-level production of commercial Fe2O3-supported Ru SACs. It paves the way for the large-scale production of thermally stable SACs for industrial applications.

Credit: 
Chinese Academy of Sciences Headquarters

A new method to improve tropical cyclone intensity forecasts

In numerical weather forecasting research, how to improve short-term forecasts of tropical cyclone intensity is a challenging problem that has long plagued meteorologists and operational forecasters, despite meteorologists having greatly increased the accuracy of the initial field through increasing observations in both quantity and quality. So, what else are we missing?

According to a recently published study in Advances in Atmospheric Sciences, the missing piece of the jigsaw is reducing the model error of numerical weather forecasting models. There are many sources of model error, such as our incomplete understanding of tropical cyclone physical processes, the uncertainty in many parameters of parameterization schemes, insufficiently fine resolution, truncation errors, and the oversimplification of parameterization schemes to save on calculation costs. "It would cost too much [computationally] if we try to overcome all the model errors of different sources one by one," says Wansuo Duan, a Senior Scientist with the Institute of Atmospheric Physics, Chinese Academy of Sciences.

Duan's team proposes a new idea to solve these problems, based on the Nonlinear Forcing Singular Vector (NFSV) method, which the team itself developed. "We consider the model errors, caused by different sources, as a whole to explore their impact on the forecast accuracy," he adds.

Utilizing this method, the team employed the WRF model to identify the variables and areas whose uncertainty is most likely to cause large errors in short-term tropical cyclone intensity forecasts. Taking into account the characteristics of observations made by the Fengyun-4 satellite, they identified that uncertainty in the rate of change of atmospheric temperature in the middle and lower layers of the typhoon core area has the greatest influence on the uncertainty of the typhoon's simulation.

"Satellites can adaptively collect corresponding observations in a given region, which can then be further used to reduce the model errors and improve short-term forecasts of tropical cyclone intensity," concludes Duan.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

New findings of chemical differences between PM1 and PM2.5 might reshape air pollution studies

image: A fog event in Gucheng, Hebei province in winter 2018. Small particles below 1 μm can grow rapidly to large particles, e.g., 2.5 μm during high relative humidity periods, yet the primary particles from coal combustion and traffic emissions would not change much.

Image: 
Xiaoxuan Zhao

Current air pollution studies largely rely upon aerosol mass spectrometers, most of which can only measure submicron aerosol (PM1) species -- particulate matter with an aerodynamic diameter of less than 1 μm. In many studies, PM1 aerosol species are therefore used to validate those of PM2.5 (particulate matter with an aerodynamic diameter of less than 2.5 μm) in chemical transport models, and to estimate particle acidity (pH) and aerosol water content, which are key parameters in studying heterogeneous reactions. However, are there chemical differences between PM1 and PM2.5? And will the differences introduce uncertainties into air pollution studies, especially in highly polluted environments such as China and India?

Professor Yele Sun and his team with the Institute of Atmospheric Physics, Chinese Academy of Sciences tried to answer these questions by characterizing the chemical differences between PM1 and PM2.5 in a highly polluted environment in north China in winter using a newly developed PM2.5 Time-of-Flight Aerosol Chemical Speciation Monitor. They found that the changes in PM1/PM2.5 ratios as a function of relative humidity (RH) were largely different for primary and secondary aerosol species.

"If organics is the dominant component (> 50%) of particulate matter and RH is below 80%, the chemical species in PM1 would be highly correlated with those in PM2.5. PM1 can be representative of PM2.5" says Sun, the first and corresponding author of this study, "however, if sulfate, nitrate, and secondary organic aerosol that are formed from secondary formation are dominant components, there would be large chemical differences between PM1 and PM2.5 at RH > 60%. The major reason is that these secondary species have higher hygroscopicity and can uptake more water during higher RH periods".

Sun also evaluated the impacts of the chemical differences between PM1 and PM2.5 on predictions of pH and aerosol water content with thermodynamic modeling. "The chemical differences between PM1 and PM2.5 have negligible impacts on pH prediction, but have a large impact on the prediction of aerosol water content, by up to 50-70%," says Sun.

"Our findings are important because current air pollution studies in highly polluted environment, particularly during severe haze events with high RH must consider the chemical differences between PM1 and PM2.5." says Sun, "Validation of model simulations in chemical transport models also need to consider such differences."

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences

Machine learning illuminates material's hidden order

Extreme temperature can do strange things to metals. In severe heat, iron ceases to be magnetic. In devastating cold, lead becomes a superconductor.

For the last 30 years, physicists have been stumped by what exactly happens to uranium ruthenium silicide (URu2Si2) at 17.5 kelvin (minus 256 degrees Celsius). By measuring heat capacity and other characteristics, they can tell it undergoes some type of phase transition, but that's as much as anyone can say with certainty. Theories abound.

A Cornell collaboration led by physicist Brad Ramshaw, the Dick & Dale Reis Johnson Assistant Professor in the College of Arts and Sciences, used a combination of ultrasound and machine learning to narrow the possible explanations for what happens to this quantum material when it enters this so-called "hidden order."

Their paper, "One-Component Order Parameter in URu2Si2 Uncovered by Resonant Ultrasound Spectroscopy and Machine Learning" published March 6 in Science Advances.

"In uranium ruthenium silicide, we have no idea what the electrons are doing in the hidden order state," said Ramshaw, the paper's senior author. "We know that they don't become magnetic, we know that they don't become superconducting, but what are they doing? There are a lot of possibilities - orbital order, charge density waves, valence transitions - but it's hard to tell these different states of matter apart. So the electrons are 'hiding,' in that sense."

Ramshaw and his doctoral student Sayak Ghosh used high-resolution ultrasound spectroscopy to examine the symmetry properties of a single crystal of URu2Si2 and how these properties change during the hidden order phase transition. Most phase transitions are accompanied by a change in symmetry properties. For example, solids have all their atoms lined up in an organized way, while liquids do not. These changes in symmetry aren't always obvious, and can be difficult to detect experimentally.

"By looking at symmetry, we don't have to know all the details about what the uranium is doing, or what the ruthenium is doing. We can just analyze how the symmetry of the system looks before the phase transition, and how it looks after," Ramshaw said. "And that lets us take that table of possibilities that theorists have come up with and say, 'Well, these are not consistent with the symmetry before and after the phase transition, but these are.' That's nice, because it's rare that you can make such definitive yes and no statements."

However, the researchers encountered a problem. To analyze the ultrasound data, they normally would model it with wave mechanics. But to study the purest form of URu2Si2, they had to use a smaller, cleaner sample. This "oddly-shaped little hexagon chip," Ramshaw said, was too tiny and had too much uncertainty for a straightforward wave-mechanics solution.

So Ramshaw and Ghosh turned to Eun-Ah Kim, professor of physics and a co-author of the paper, and her doctoral student Michael Matty, to produce a machine-learning algorithm that could analyze the data and uncover underlying patterns.

"Machine learning is not only for an image-like data or big data," Kim said. "It can dramatically change the analysis of any data with complexity that evades manual modeling."

"It's hard, because the data is just a list of numbers. Without any sort of method, it has no structure, and it's impossible to learn anything from it," said Matty, the paper's co-lead author with Ghosh. "Machine learning is really good at learning functions. But you have to do the training correctly. The idea was, there is some function that maps this list of numbers to a class of theories. Given a set of numerically approximated data, we could do what is effectively regression to learn a function that interprets the data for us."

The results from the machine-learning algorithm eliminated roughly half of the more than 20 likely explanations for the hidden order. It may not yet solve the URu2Si2 riddle, but it has created a new approach for tackling data analysis problems in experimental physics.

The team's algorithm can be applied to other quantum materials and techniques, most notably nuclear magnetic resonance (NMR) spectroscopy, the fundamental process behind magnetic resonance imaging (MRI). Ramshaw also plans to use the new technique to tackle the unruly geometries of uranium telluride, a potential topological superconductor that could be a platform for quantum computing.

Credit: 
Cornell University

Don't blame the messenger -- unless it's all stats and no story

BUFFALO, N.Y. - It's curious how an issue like climate change remains unsettled in segments of the population despite the overwhelming scientific consensus that human activity is responsible for the Earth's current warming trend.

Pick another science-based debate: Intelligent design and evolution? Crop circles and the possibility of extraterrestrial visits? How about information concerning the public health threat posed by the novel coronavirus?

What drives the lingering public doubt despite the conclusions of credible experts? Part of the answer might be a story, or more accurately, failing to tell one.

"Narrative affects an audience's perception of the person who is delivering the message," says Melanie Green, a professor of communication in the University at Buffalo College of Arts and

Sciences.

Green is co-author of a new study in the journal PLOS ONE that turns the rich literature of person-perception on its head to look at how the nature of the message affects our perception of the person delivering it, rather than how the person affects our perception of the message.

In some cases of ineffective messaging, it might be appropriate, despite the aphorism to the contrary, to blame the messenger.

"Our findings suggest that telling stories when communicating can make the speaker appear more warm and trustworthy, as opposed to speaking some other way, such as providing only statistics and figures," says Green, a social psychologist and an expert on narrative persuasion and the power of storytelling.

Green's current study with Jenna L. Clark, a senior behavioral researcher at Duke University, and Joseph J.P. Simons of the Agency for Science, Technology and Research in Singapore, was inspired by research in science communication.

"We wanted to explore why people are sometimes distrusting of what amounts to the best possible evidence we have on many issues," she says.

People rely on two qualities in particular when forming impressions of someone: warmth and competence, according to Green. Warmth is defined as being friendly, helpful and trustworthy, while competence relates to ability, intelligence and skill.

Previous research indicates that people perceive scientists as smart but distant, and high in competence but low in warmth -- a deficiency that implies a lack of trustworthiness.

"That perception might be a communication barrier that's responsible for people believing that regardless of someone's ability, they still might not have the best interests of others in mind," says Green. "We worked from the idea of science communication, but the results can be applied whenever there's someone perceived as high in competence, but cold and distant."

"Telling a story might be a way to improve that perception of warmth because stories create empathy, and we begin to appreciate what characters in the narrative are going through."

The researchers conducted three studies, each with between 235 and 255 participants. In the first two studies, people read a scenario that required them to give advice on a bank or a vacation destination using either storytelling or statistical information -- for example, describing how a family member was able to secure home financing because of the efforts of a loan officer, or running through the bank's interest rates and level of customer satisfaction.

In the third study, people again told stories or provided statistics, with the listener then deciding with whom they wanted to work on a specific task.

In each study, there was clear support for message features -- such as the type of evidence, stories versus statistics -- influencing perception of the source.

Green says she understands the hesitancy about scientists telling stories. She's one of those scientists, and realizes the risks of people drawing conclusions beyond the findings of established research.

"As scientists we're trained to be careful about the limits of our data and to be precise. One story is not going to explain everything," she says. "But there are many types of stories, and we can discuss things like how data were collected; why the research team came together; what interests us most about this field of study.

"These kinds of stories keep things precise, but help create warmth and trustworthiness without treading on scientific ideals."

And that trustworthiness does not necessarily come at the expense of competence, she notes.

"We do have evidence for a general positive effect," says Green. "Both qualities - warmth and competence -- can increase together."

Credit: 
University at Buffalo