Earth

Yellowstone National Park is hotter than ever

image: A woman sits on a rocky slope with her back to the camera, facing a mountainous landscape.

Image: 
Grant L. Harley

WASHINGTON--Yellowstone National Park is famous for harsh winters but a new study shows summers are also getting harsher, with August 2016 ranking as one of the hottest summers in the last 1,250 years.

The new study drew upon samples of living and dead Engelmann spruce trees collected at high elevations in and around Yellowstone National Park to extend the record of maximum summer temperatures back centuries beyond instrumental records. The findings were published in Geophysical Research Letters, AGU's journal for high-impact, short-format reports with immediate implications spanning all Earth and space sciences.

The team, led by Karen Heeter, a dendrochronologist at the University of Idaho in Moscow, found that the 20th and 21st centuries, and especially the past 20 years, are the hottest in the new 1,250-year record. Previously, temperature records for the Yellowstone region were only available going back to 1905.

The climate data gleaned from the tree ring samples fits closely with the instrumental record over the past 100 years. The team was also able to identify several known periods of warming in the tree ring record, including the Medieval Climate Anomaly that occurred between 950 and 1250, as well as several multidecadal periods of cooling that occurred prior to 1500.

"If we can find historical analogs to the warming conditions we're seeing now, that's really valuable," Heeter said. "The records show that the 1080s were extremely warm and in the 16th century, there was a period of prolonged warmth for about 130 years."

The warm periods of the past were characterized by substantial multidecadal temperature variability, markedly different from the prolonged, intense warming trends seen over the past 20 years. Today's unprecedented warming may spell trouble for the Greater Yellowstone Ecosystem, the pride of the US National Park system, by exacerbating droughts, wildfires, and other types of ecosystem stress.

The new record provides crucial data for scientists seeking to better understand the relationships between increasing temperatures and environmental factors like fire regimes, seasonal snowpack, and vegetation changes, Heeter said. "The warming trend we see beginning around 2000 is the most intense in the record. The rate of warmth over a relatively short period of time is alarming and has important implications for ecosystem health and function," she said.

In addition to providing one of the few millennial-length temperature records for North America, the study identified summer surface temperature trends using a new tree ring technique called Blue Intensity, Heeter said.

"Unlike traditional tree ring methods where we just measure annual or sub-annual growth rings, Blue Intensity gives us a representation of ring density," Heeter said. Density of the outermost part of annual growth rings, called the latewood, has been shown to correlate closely with maximum summer temperatures, she said.

Developed in Europe in the early 2000s, Blue Intensity has been shown to be a more cost effective method of assessing tree ring density than other methods, says Robert Wilson, a dendrochronologist at the University of St. Andrews in Scotland, who was not involved in the new study.

Engelmann spruce trees, found throughout North America from Canada to Mexico, are the "perfect species for BI methods due to their uniformly light-colored wood", Wilson said, helping to assuage the main drawback of the Blue Intensity method, which can be biased by color variations in wood samples. Engelmann spruce also live between 600 and 800 years and rot relatively slowly. The pristine setting of Yellowstone National Park provided an opportunity to source samples from living and downed trees dating back 1,250 years.

Heeter and colleagues are also working on applying Blue Intensity methods to more locations across North America, particularly in southern states, where obtaining a strong temperature signal from traditional tree ring data can be difficult. The team has already made the new Greater Yellowstone dataset available to other researchers by adding it to the International Tree-Ring Data Bank, which is publicly available from NOAA.

"I have all these things I'd like to do with [the Yellowstone dataset], such as looking at periods of drought through time or temperature and fire trends," Heeter said. "But I hope that it might also be useful to other researchers who are studying other aspects of the ecosystem. Honestly, I think the [research] possibilities are endless."

Credit: 
American Geophysical Union

Does correcting online falsehoods make matters worse?

So, you thought the problem of false information on social media could not be any worse? Allow us to respectfully offer evidence to the contrary.

Not only is misinformation increasing online, but attempting to correct it politely on Twitter can have negative consequences, leading to even less-accurate tweets and more toxicity from the people being corrected, according to a new study co-authored by a group of MIT scholars.

The study was centered around a Twitter field experiment in which a research team offered polite corrections, complete with links to solid evidence, in replies to flagrantly false tweets about politics.

"What we found was not encouraging," says Mohsen Mosleh, a research affiliate at the MIT Sloan School of Management, lecturer at University of Exeter Business School, and a co-author of a new paper detailing the study's results. "After a user was corrected ... they retweeted news that was significantly lower in quality and higher in partisan slant, and their retweets contained more toxic language."

The paper, "Perverse Downstream Consequences of Debunking: Being Corrected by Another User for Posting False Political News Increases Subsequent Sharing of Low Quality, Partisan, and Toxic Content in a Twitter Field Experiment," has been published online in CHI '21: Proceedings of the 2021 Conference on Human Factors in Computing Systems.

The paper's authors are Mosleh; Cameron Martel, a PhD candidate at MIT Sloan; Dean Eckles, the Mitsubishi Career Development Associate Professor at MIT Sloan; and David G. Rand, the Erwin H. Schell Professor at MIT Sloan.

From attention to embarrassment?

To conduct the experiment, the researchers first identified 2,000 Twitter users, with a mix of political persuasions, who had tweeted out any one of 11 frequently repeated false news articles. All of those articles had been debunked by the website Snopes.com. Examples of these pieces of misinformation include the incorrect assertion that Ukraine donated more money than any other nation to the Clinton Foundation, and the false claim that Donald Trump, as a landlord, once evicted a disabled combat veteran for owning a therapy dog.

The research team then created a series of Twitter bot accounts, all of which existed for at least three months and gained at least 1,000 followers, and appeared to be genuine human accounts. Upon finding any of the 11 false claims being tweeted out, the bots would then send a reply message along the lines of, "I'm uncertain about this article -- it might not be true. I found a link on Snopes that says this headline is false." That reply would also link to the correct information.

Among other findings, the researchers observed that the accuracy of news sources the Twitter users retweeted promptly declined by roughly 1 percent in the next 24 hours after being corrected. Similarly, evaluating over 7,000 retweets with links to political content made by the Twitter accounts in the same 24 hours, the scholars found an upturn by over 1 percent in the partisan lean of content, and an increase of about 3 percent in the "toxicity" of the retweets, based on an analysis of the language being used.

In all these areas -- accuracy, partisan lean, and the language being used -- there was a distinction between retweets and the primary tweets written by the Twitter users. Retweets, specifically, degraded in quality, while tweets original to the accounts being studied did not.

"Our observation that the effect only happens to retweets suggests that the effect is operating through the channel of attention," says Rand, noting that on Twitter, people seem to spend a relatively long time crafting primary tweets, and little time making decisions about retweets.

He adds: "We might have expected that being corrected would shift one's attention to accuracy. But instead, it seems that getting publicly corrected by another user shifted people's attention away from accuracy -- perhaps to other social factors such as embarrassment." The effects were slightly larger when people were being corrected by an account identified with the same political party as them, suggesting that the negative response was not driven by partisan animosity.

Ready for prime time

As Rand observes, the current result seemingly does not follow some of the previous findings that he and other colleagues have made, such as a study published in Nature in March showing that neutral, nonconfrontational reminders about the concept of accuracy can increase the quality of the news people share on social media.

"The difference between these results and our prior work on subtle accuracy nudges highlights how complicated the relevant psychology is," Rand says.

As the current paper notes, there is a big difference between privately reading online reminders and having the accuracy of one's own tweet publicly questioned. And as Rand notes, when it comes to issuing corrections, "it is possible for users to post about the importance of accuracy in general without debunking or attacking specific posts, and this should help to prime accuracy and increase the quality of news shared by others."

At least, it is possible that highly argumentative corrections could produce even worse results. Rand suggests the style of corrections and the nature of the source material used in corrections could both be the subject of additional research.

"Future work should explore how to word corrections in order to maximize their impact, and how the source of the correction affects its impact," he says.

Credit: 
Massachusetts Institute of Technology

Immunotherapy combination shows benefit for patients with advanced melanoma

Fixed-dose combination of nivolumab and relatlimab holds the cancer in check significantly longer than nivolumab alone

This is the first regimen to demonstrate a statistical benefit over anti-PD-1 monotherapy in metastatic melanoma

BOSTON - A combination of two drugs that target different proteins on immune system T cells kept advanced melanoma in check significantly longer than one of the drugs alone in a phase 3 clinical trial involving 714 patients. Dana-Farber Cancer Institute investigators co-led the study. Findings will be presented at the American Society of Clinical Oncology (ASCO) Annual Meeting, being held virtually June 4-8, 2021, and are included in the ASCO press program.

The trial, known as the RELATIVITY-047 study, compared the effectiveness of the drug nivolumab, an immune checkpoint inhibitor, by itself against a combination of the LAG-3 blocking antibody relatlimab and nivolumab given as a fixed-dose. Trial participants who received the combination therapy as their initial treatment had a median progression-free survival - the time in which the disease did not worsen - of 10.1 months, compared to 4.6 months for those treated with nivolumab alone, investigators found. Twelve months after treatment, 47.7% of patients treated with the two-drug regimen had no advance of their disease, compared to 36% of those who received only nivolumab. The side effects of the combination were generally manageable.

Both nivolumab and relatlimab are antibody drugs. They target separate proteins on T cells to revive and reinvigorate the cells' natural attack on tumor cells. Nivolumab targets PD-1, a checkpoint protein that prompts T cells to call off their attack when it binds to a corresponding protein on tumor cells. Relatlimab targets the protein LAG-3, an immune checkpoint receptor protein that functions to control T-cell response, activation and growth.

"Immune checkpoint inhibitors such as nivolumab have revolutionized the treatment of patients with advanced melanoma," said F. Stephen Hodi, MD, the director of the Melanoma Center and the Center for Immuno-Oncology at Dana-Farber and the co-senior author of the study.

"However, novel combinations of checkpoint inhibitors with other immune agents are needed to improve results. The RELATIVITY trial is the first study of a combination treatment to demonstrate a clinically important benefit by simultaneously inhibiting the LAG-3 and PD-1 pathways."

Credit: 
Dana-Farber Cancer Institute

Groundwater monitoring with seismic instruments

image: The catchment area of Bhote Koshi River lies in the bordering region between Nepal and China.

Image: 
Luc Illien/GFZ

Water in the high-mountain regions has many faces. Frozen in the ground, it is like a cement foundation that keeps slopes stable. Glacial ice and snow supply the rivers and thus the foothills with water for drinking and agriculture during the melt season. Intense downpours with flash floods and landslides, on the other hand, pose a life-threatening risk to people in the valleys. The subsoil with its ability to store water therefore plays an existential role in mountainous regions.

But how can we determine how empty or full the soil reservoir is in areas that are difficult to access? Researchers at the German Research Centre for Geosciences (GFZ), together with colleagues from Nepal, have now demonstrated an elegant method to track groundwater dynamics in high mountains: They use seismic waves, such as those generated by ground vibrations, which they record with highly sensitive instruments. Similar to medical ultrasound, they exploit the fact that the waves propagate differently in different subsurface conditions. The researchers led by Luc Illien, Christoph Sens-Schönfelder and Christoff Andermann from GFZ report on this in the journal AGU Advances.

Seismic waves well-known from earthquakes. After a rupture in the subsurface, they propagate rapidly and unleash destructive forces. However, there are also much smaller waves caused, for example, by trucks, streetcars or - in the mountains - by falling rock. The ground is actually vibrating all the time. In geoscience, this is referred to as "seismic noise." What has to be laboriously extracted from the measured data of seismometers in earthquake detection turns out to be a valuable source of information when looking into the subsurface. This is because seismic waves propagate differently in the water-saturated zone than in the unsaturated zone, also called vadose zone.

Luc Illien, a PhD student at GFZ, and his colleagues used two Nepalese seismic stations at 1,200 and 2,300 meters above sea level. Luc Illien says: "The Nepalese Himalayas provide vital water resources to a large part of the population of South Asia. Most of this water drains through mountain groundwater reservoirs that we can poorly delineate." The study area comprised the catchment area of a small tributary to the Bothe Koshi, a border river between China and Nepal. Using several weather stations and level gauges, the team collected data, sometimes every minute, over three monsoon seasons. From this, they established a groundwater model that they could compare with the seismic records. The result: runoff to the Bothe Koshi is fed mainly from the deep aquifer. In the dry season, little water flows down the valley. In the monsoon, levels rise, but two distinct phases can be identified. First, it rains without increasing the discharge, but later a clear correlation between rainfall and river level becomes apparent. Christoff Andermann, co-author of the study, explains, "The first rainfall initially replenishes reservoirs in the soil near the surface. Once the soil is saturated with water, the deep groundwater reservoir, which is directly linked to the rivers, fills up. An increase in groundwater is then immediately reflected in rising river water levels."

The comparison with the data from seismometers showed that the saturation of the vadose zone can be well deduced from the seismic noise. "Only by merging the hydrological observations with the seismic measurements we could analyze the function of the vadose zone as a link between precipitation and groundwater reservoir," says Christoph Sens-Schönfelder. First author Luc Illien: "Understanding how the reservoir fills and drains is crucial for assessing its sustainability. From this, we can not only make predictions for runoff, but also warn of increased risk of landslides and flash floods." For example, if the soil is already saturated with water, rainfall will run off more superficially and can carry away slopes. Climate change is exacerbating the situation by contributing to changes in large-scale weather patterns and destabilizing the mountain environment. GFZ Scientific Director Niels Hovius, who contributed to the study, says: "Our work in Nepal and its results show how important it is to monitor numerous influencing factors. These include groundwater storage, changes in land use, land cover and precipitation regimes. Capturing and anticipating such changes will help us better predict the future of freshwater resources and mountain landscapes, especially as glaciers continue to melt."

Credit: 
GFZ GeoForschungsZentrum Potsdam, Helmholtz Centre

A rapid antigen test for SARS-CoV-2 in saliva

image: The Lumipulse G600II instrument (left) and the Lumipulse SARS-CoV-2 Ag kit (right), both manufactured by Fujirebio, which were used in this study for the quantification of SARS-CoV-2 in saliva samples (Photo: Shinichi Fujisawa).

Image: 
Shinichi Fujisawa

Scientists from Hokkaido University have shown that an antigen-based test for quantifying SARS-CoV-2 in saliva samples is simple, rapid, and more conducive for mass-screening.

More than a year into the COVID-19 pandemic, the RT-PCR test remains the gold standard for detection of the SARS-CoV-2 virus. This method requires trained personnel at every step, from collection of nasopharyngeal swab (NPS) samples to interpretation of the results; in addition, the entire process ranges from 24-48 hours on average. As the virus can be transmitted by an infected person before symptoms develop, and is even transmitted by individuals who are asymptomatic, the ability to screen a large number of people quickly is vital to controlling and preventing the spread of the pandemic. Faster methods to detect the SARS-CoV-2 antigens have been developed, but they are not as sensitive as the RT-PCR test. In June 2020, a novel antigen-based kit, Lumipulse® SARS-CoV-2 Ag kit (Lumipulse), was developed by Fujirebio to quantitatively measure the viral antigen in biological samples within 35 minutes.

A team of scientists from Hokkaido University have used the antigen kit to detect SARS-CoV-2 in saliva samples, and have assessed the efficiency and accuracy of the test compared to RT-PCR. Their findings show that the antigen detection kit, which is used to perform chemiluminescent enzyme immunoassay (CLEIA), can rapidly detect SARS-CoV-2 with good accuracy in these samples. The study was published in the journal The Lancet Microbe.

The scientists tested 2056 individuals from three cohorts: patients with clinically confirmed COVID-19, individuals who had contacted patients with COVID-19, and individuals tested on arrival at Tokyo and Kansai International Airports. Saliva samples were collected from all individuals and used for RT-PCR tests as well as CLEIA using Lumipulse. The results of both were compared to determine the usefulness of CLEIA.

The scientists found that CLEIA is a reliable test, as it correlates well with RT-PCR. CLEIA alone can be used to detect SARS-CoV-2 within an hour; however, using CLEIA for screening and RT-PCR for confirmation increases the accuracy of diagnosis.

The benefit of using saliva samples is the ease of collection: it is quick and can be collected by the individuals being tested, reducing the risk that healthcare workers are exposed to the virus. Furthermore, self-collection of saliva allows multiple samples to be collected simultaneously for expeditious screening of visitors at large gatherings.

Combined CLEIA and RT-PCR testing on saliva samples has already been implemented at Japanese airport quarantines, and the authors recommend adopting it at a wider scale to rapidly screen for SARS-CoV-2.

Credit: 
Hokkaido University

New study shows flies mutant for schizophrenia-associated genes respond well to anti-psychotics

image: 1. Images of a groups of fruit flies (small black dots) placed in a triangular arena. Normal flies (left triangle) like to be a certain distance apart showing their preference for social space, flies mutants in the schizophrenia-associated Rim gene (right) prefer increased in social distance.

Image: 
University of Bristol

Scientists have successfully treated flies displaying behavioural problems linked to newly discovered schizophrenia-associated genes in humans, using common anti-psychotics.

Schizophrenia is a severe long-term mental health condition that is historically poorly understood and treated. It is relatively common, affecting one to two per cent of the population, and is known to be up to 80 per cent genetic in origin.

Recent advances in sequencing genomes of people with schizophrenia have identified a list of novel genes and mutations associated with the disease. Many are expressed in the brain and are involved in how neurons communicate with each other by electrical and chemical signals released at synapses.

The research was performed by the first student, Dr Sergio Hidalgo, on the dual PhD program from the Universities of Bristol (UK) and the Pontificia Universidad Catolica de Chile. He studied the role of two schizophrenia associated genes on behaviours associated with the disease, using the genetics of the fruit fly, Drosophila.

"We studied two of these schizophrenia-associated genes - one called Rim, which is involved in neurotransmitter release at synapses, and another called CACNA1A and CACNA1B in humans and cacophony in flies, voltage-sensitive calcium channels involved in electrical and chemical signalling in and between neurons. We found that fly Rim mutants showed several behavioural changes seen in people with schizophrenia who may have Rim mutations. These included preferring larger social distances between individuals when in a group and changes in smell or olfaction. We also found the circadian (24-hour body clock) deficits reported in schizophrenia were also present in Rim mutant flies," said Dr James Hodge, who supervised the research at School of Physiology, Pharmacology and Neuroscience at the University of Bristol.

Strikingly, treatment with the commonly used antipsychotic, haloperidol, rescued some of the Rim mutant's behavioural problems.

The second study looked at voltage-gated calcium channels, several of which are major contributors to the risk of developing schizophrenia. The team focussed on the negative symptoms of schizophrenia which include behavioural defects such as impaired memory, sleep and circadian rhythms.

"These symptoms are particularly poorly understood and treated. We found fly cacophony (cac) mutants showed several behavioural features including decreased night-time sleep and hyperactivity similar to those reported in human patients. We also found that loss of cac function in the clock of the fly's brain decreased their circadian rhythms, while loss of cac function in their memory centre reduced the fly's memory via a reduction in calcium signalling," said Dr Sergio Hidalgo.

Two new research papers have come out of the study, published in Translational Psychiatry and Neurobiology of Disease. The research represents important advances in understanding schizophrenia by demonstrating how loss of rim or cac Cav2 channel function causes a number of disease relevant cognitive and behavioural deficits and underlying reduction in synaptic growth and neuronal calcium transients.

"It is apparent from this study that these behaviours are caused by changes in calcium signalling, shape of synapses and their release of neurotransmitter. Along with the ability to return these behaviours to normal with a commonly used schizophrenia drug, these studies establish Drosophila as a high-throughput in vivo genetic model to study the Cav channel and neurotransmitter release pathophysiology related to schizophrenia.

"The next step is to understand how rim and different calcium channels act together at synapses to regulate behaviours affected by schizophrenia. By testing drugs or treatments directed at these targets we will develop a deeper understanding of therapies for schizophrenia, and how they work."

"It is important we secure funding for this novel research which has potential to inform truly significant advances in the way we treat this common and debilitating condition," added Dr James Hodge.

"Understanding the complexity of schizophrenia etiology could help us to develop new and more effective treatments. By showing that some molecular mechanisms are conserved between species, we can propose the use of flies as a new platform for drug testing," added Dr Sergio Hidalgo.

Dr Jorge Campusano, joint supervisor of the research from the Department of Cellular and Molecular Biology at Pontificia Universidad Catolica de Chile, said "Sergio Hidalgo is the first student in a new dual doctoral programme signed by University of Bristol and Pontificia Universidad Catolica de Chile. His research demonstrates how collaboration between our two universities can expand the frontiers of knowledge and the reach of our findings. This is particularly poignant on such an important area of biomedicine these days, which is mental health."

Credit: 
University of Bristol

New research could help manufacturers avoid 3D-printing pitfall

video: A 3D printer of the laser powder-bed fusion type, in action. Laser powder-bed fusion adds successive layers of metal powder and then uses a laser to melt each layer into place on the part being created.

Image: 
NIST

A research team has found that a method commonly used to skirt one of metal 3D printing's biggest problems may be far from a silver bullet.

For manufacturers, 3D printing, or additive manufacturing, provides a means of building complex-shaped parts that are more durable, lighter and more environmentally friendly than those made through traditional methods. The industry is burgeoning, with some predicting it to double in size every three years, but growth often goes hand in hand with growing pains.

Residual stress, a byproduct of the repeated heating and cooling inherent to metal printing processes, can introduce defects into parts and, in some cases, damage printers. To better understand how residual stress forms, and how it might be curbed, researchers at the National Institute of Standards and Technology (NIST), Lawrence Livermore National Laboratory, Los Alamos National Laboratory and other institutions closely examined the effects of different printing patterns in titanium alloy parts made with a common laser-based method.

Their results, published in Additive Manufacturing, show that a printing pattern often used in industry to decrease residual stress, known as island scanning, had the worst showing among the approaches studied, defying the team's expectations. The data they produced could help manufacturers test and improve predictive models for 3D printing, which, if accurate, could steer them away from destructive levels of residual stress.

"This was very surprising and underscores the complexity of the problem," said NIST materials research engineer Thien Phan, a co-author of the study. "It shows that, although island scanning may work in many cases, it did not work in ours, which really highlights the fact that we need to have accurate modeling."

The team's research centered on a prevalent additive manufacturing method called laser powder bed fusion (LPBF), in which a laser scans over a layer of metal powder in a predetermined pattern, melting and fusing particles at the surface together. As the molten metal cools into a solid, a stage supporting the material lowers and the printer adds a new coat of powder on top, allowing the laser to continue building the part layer by layer.

Once the second layer of a build begins, residual stress starts to rear its unpleasant head. The metals used in LPBF cool off quickly, meaning that by the time a printer's laser begins heating up a new layer, the metal from the previous layer is already solid. The melted layers contract inward as they cool, pulling on the solid metal below and creating stress. And the greater the difference in temperature, the more the melted layer pulls. This process repeats for every layer until the part is complete, locking the stresses into solid metal.

"You end up with an incredible amount of residual stresses inside your piece," said Phan. "So it's sitting there, tearing itself apart. The residual stress could crack the part and lift it up during the build, which could actually crash the machine."

The most straightforward printing pattern in LPBF is a continuous scan, where the laser scans back and forth from one end of the part to the other. But an alternative option called island scanning has emerged as a way to mitigate stress. The idea behind this approach is that melting small sections, or islands, of metal one at a time rather than an entire layer would result in less metal contracting at the same time, reducing the overall stress.

Island scanning has gained traction with manufacturers, but past studies on the technique have been inconsistent. And more broadly, the relationship between scanning strategies and residual stress largely remains a mystery. To begin filling in these gaps, the multi-institution team set out to analyze the effects of island scanning on stress in great detail.

The authors of the new study printed four titanium alloy bridges just over 2 centimeters (0.8 inches) in length. The samples were built via either continuous or island scanning, with lasers running along their length and width or at a 45-degree angle.

At a glance, the bridges looked similar coming out of the printer, but rather than take them at face value, the researchers scrutinized them in close detail.

They beamed high-energy X-rays, generated by a powerful tool called a synchrotron, deep into the samples. By measuring the wavelengths of X-rays that reflected off of the metal, the team extracted the distances between the metal atoms with high accuracy. From there the researchers calculated stress. The greater the distances, the more stressed the metal was. With that critical information in hand, they generated maps showing the location and degree of stress throughout the samples.

All samples contained stresses close to the titanium alloy's yield strength -- the point at which a material undergoes permanent deformation. But the maps revealed something else that caught the researchers by surprise.

"The island scan samples have these really large stresses on their sides and tops, which are missing or much less pronounced in the continuous scan samples," said NIST physicist and co-author Lyle Levine. "If island scanning is a way that industry is trying to mitigate these stresses, I would say, for this particular case, it is far from successful."

In another test, they detached a leg of each bridge from the metal base plates it was stuck to. The study's authors measured the distance the legs sprung upward, obtaining another indicator for how much residual stress was stored inside of the arch of each bridge. Again, the island scan samples performed poorly, their legs deflecting by more than twice as much as the other samples.

The authors propose that island scanning could be a double-edged sword. Although the small size of the islands may reduce contraction, the islands might also cool much faster than the larger melt pools, creating greater temperature differences and thus greater stress.

Although island scanning was not well suited to the particular part, material and equipment used in the study, it could still be a good choice under different circumstances, Phan said. The results do indicate it is not a cure-all for residual stress, however. To keep stress at bay, manufacturers may need to tailor the scanning strategy and other parameters to their specific build -- an effort that would be greatly aided by computer models.

Rather than optimize a print through trial and error, manufacturers could use models to quickly and cheaply identify the best parameters, if their predictions are accurate. Modelers could boost confidence in their tools by testing them against rigorously produced benchmark measurements, not unlike the data obtained in the new study, Levine said.

This work provides a new perspective on a popular printing strategy, adding a key piece to the puzzle of residual stress formation and ultimately bringing 3D printing a step closer to its full potential.

Credit: 
National Institute of Standards and Technology (NIST)

Scientists reconstruct past history of largest ice shelf on Antarctic Peninsula

For the first time, geological records have been used to reconstruct the history of Larsen C Ice Shelf in Antarctica. The ice shelf is the largest remaining remnant of a much more extensive area of ice on the Antarctic Peninsula that began to break up during the 1990s (Larsen A), and saw a huge collapse in 2002 (Larsen B). This new reconstruction enables scientists to better understand if and when the remaining ice shelf could collapse in the future.

Publishing this month in the journal Geology an international team describes how the largest remaining ice shelf on the Antarctic Peninsula, has been stable for the past ~10,000 years.

The vast Larsen Ice Shelf, twice the size of Wales, attracted global media attention, after a 5,800-square-kilometre iceberg weighing more than a trillion tonnes calved in 2017. Last month (April) it broke up completely, following a three year journey drifting from the Antarctic Peninsula to the sub-Antarctic island of South Georgia.

Over the past 25 years, several of the region's ice shelves have collapsed, including the rapid disintegration of the Larsen B Ice Shelf in 2002. The sequential breakup of ice shelves along the eastern Antarctic Peninsula is linked to warmer atmospheric temperatures which have gradually moved southward over the past 50 years. At the same time, warm ocean currents have also increased, weakening the region's ice shelves from below.

Using hot water drilling technology to penetrate through the 300 m-thick ice shelf, the team collected seabed sediment cores from beneath the Larsen C Ice Shelf in 2011. Data from these were combined with data from sediment cores recovered offshore a decade earlier, enabling the science team to reconstruct the first detailed history of the ice shelf. The authors conclude that despite modest retreat and advances of the ice shelf front there was no significant collapse during the past 10,000 years.

Lead author, marine geologist Dr James Smith from British Antarctic Survey, says:

"There is a huge international scientific effort underway to get a better understanding of what's happening to Antarctica's ice shelves. If we can understand what happened in the past we will have a sense of what might happen in the future. We can perhaps differentiate natural events that affect the ice shelves from environmental change related to human activity. This new study provides the final piece of the puzzle to the history of this last remaining ice shelf on the eastern Peninsula."

The team suggest that persistence of Larsen C, as well as Larsen B, implies that these ice shelves were more resilient to past climate warming because they were thicker, or that the heat from the atmosphere and ocean did not penetrate this far south.

In this context, the collapse of Larsen B in 2002 provided the first clue that the extent of contemporary ice shelf break-ups was starting to push further south than at any time during the past 10,000 years. Larsen C is also showing signs that it might be the next ice shelf in line to collapse.

"We now have a much clearer picture of the pattern and extent of ice shelf break-ups, both past and present. It starts in the north and progresses southward as the atmosphere and ocean warms. Should collapse of Larsen C happen, it would confirm that the magnitudes of ice loss along the eastern Antarctic Peninsula and underlying climate change are unprecedented during the past 10,000 years" says Smith.

Credit: 
British Antarctic Survey

Tree species diversity is no protection against bark beetle infestation

image: Aerial view of the IDENT tree diversity experiment near Freiburg before (left) and after (right) the 2018 drought and bark beetle infestation

Image: 
aerial photos by K. R. Kovach, Sixtoothed spruce bark beetle photo by U. Schmidt

In recent years, foresters have been able to observe it up close: First, prolonged drought weakens the trees, then bark beetles and other pests attack. While healthy trees keep the invaders away with resin, stressed ones are virtually defenseless. Freiburg scientist Sylvie Berthelot and her team of researchers from the Faculty of Environment and Natural Resources and the Faculty of Biology are studying the importance of tree diversity on bark beetle infestation. They are investigating whether the composition of tree species affects bark beetle feeding behavior. The team recently published their findings in the Journal of Ecology.

In a 1.1 hectare experimental set-up in Freiburg, six native deciduous and coniferous tree species from Europe and six deciduous and coniferous tree species from North America were each planted in different mono- and mixed plots. After the severe drought in the summer of 2018, the Sixtoothed spruce bark beetle mainly attacked the native species: the European spruce and the European larch. "We were surprised that the beetles exhibited only a slight interest in the exotic conifer species, such as the American spruce," Berthelot says.

While measuring the infestation, the researchers found that the position within the experimental site was also crucial. The trees at the edge were attacked the most. Therefore, Berthelot suspects that the bark beetle entered the testing plot from outside. "In addition, environmental influences weaken the unprotected outer trees more, so they are more susceptible."

At the same time, the likelihood of which trees the bark beetles will attack changes the more tree species there are. Until now, the researchers assumed that tree diversity reduces the infestation of insect pests such as the bark beetle. But their experiment shows that "increasing tree diversity can reduce the risk of bark beetle infestation for species that are susceptible to high infestation rates, such as larch and spruce. But the risk for less preferred species such as pine or exotic trees may increase with tree diversity, as beetles, once attracted, also attack these trees," Berthelot says. Although the study indicates that non-native tree species are less attacked because the bark beetles are unfamiliar with these species. "However, this effect may weaken over the years," she said. As a result, the risk of infestation in mixed forests is redistributed among tree species rather than reduced for all.

The team is conducting research as part of the International Diversity Experiment Network with Trees, or IDENT. The international network is dedicated to research on tree species diversity and its influence on ecosystem functions. The same experimental setup was established in Freiburg as in Canada, the US and Italy.

Credit: 
University of Freiburg

White shark population is small but healthy off the coast of Central California

image: Researchers use a camera on a pole to document the unique dorsal fin markings of a white shark off the California coast.

Image: 
Scot Anderson

NEWPORT, Ore. - The population of white sharks that call the Central California coast their primary home is holding steady at about 300 animals and shows some signs of growth, a new long-term study of the species has shown.

Between 2011 and 2018, researchers were able to identify hundreds of individual adult and subadult white sharks, which are not fully mature but are old enough to prey on marine mammals. They used that information to develop estimates of the sharks' abundance.

"The finding, a result of eight years of photographing and identifying individual sharks in the group, is an important indicator of the overall health of the marine environment in which the sharks live," said Taylor Chapple of the Coastal Oregon Marine Experiment Station at Oregon State University's Hatfield Marine Science Center and a co-author of the study.

White sharks, sometimes referred to as "great" white sharks, are apex predators, meaning they are the top animal of the food chain, preying on large marine mammals such as elephant seals, harbor seals and sea lions. As apex predators, they play an important role in the health of the marine ecosystem, said Chapple, who is an assistant professor in Department of Fisheries, Wildlife, and Conservation Sciences in OSU's College of Agricultural Sciences.

"Robust populations of large predators are critical to the health of our coastal marine ecosystem," said Chapple, a marine ecologist who specializes in the study of marine predators. "So our findings are not only good news for white sharks, but also for the rich waters just off our shores here."

The findings were just published in the journal Biological Conservation. The study's lead author is Paul Kanive of Montana State University. Additional co-authors are Jay Rotella of Montana State University; Scot Anderson of the Monterey Bay Aquarium; Timothy White and Barbara Block of Stanford University; and Salvador Jorgensen of University of California, Santa Cruz and the Monterey Bay Aquarium.

White sharks live in all of the world's oceans. They can grow to 20 feet in length, weigh more than 2,000 pounds and live up to 70 years. They are listed as a vulnerable species due to threats such as fishing, because they can be caught up in commercial fishing gear, and poaching due to trade interest for their fins and teeth.

White sharks' main aggregation in the California Current, the span of waters off the West Coast of North America, is off the coast of Central California in an area that stretches from Bodega Bay, north of San Francisco, south to Monterey Bay.

Through more than 20 years of study, researchers have learned this group of white sharks spends about half of the year in offshore waters of the northeast Pacific about halfway between Hawaii and Baja, Mexico, and about half the year along the Pacific Coast. They may travel as far north as Washington and as far south as Mexico, but tend to aggregate around islands and shores off the central California coast and Guadalupe Island in Mexico.

Monitoring animal populations and determining trends is important to understanding the health of the population and making decisions about population management and protections. In 2011, Chapple, working with Jorgensen and Block, published the first estimate of population size for the California sharks. The new study provides a longer term of observation of the population's size and growth trends.

From 2011 through 2018, researchers collected photographs from above the water and underwater video recordings of white sharks during peak periods of their residency in the fall and early winter in the waters off the California coast. The data was collected from more than 2,500 hours of observation at three sites: Southeast Farallon Island; Año Nuevo Island; and Tomales Point.

They lured white sharks to their research boat with a seal decoy and captured more than 1,500 photographs that they used to identify individual sharks, focusing on adults and subadults.

"Every white shark has a unique dorsal fin. It's like a fingerprint or a bar code. It's very distinct," Chapple said. "We were able to identify every individual over that eight-year period. With that information, we were able to estimate the population as a whole and establish a trend over time."

Researchers often can also use the underwater video to identify whether a shark is male or female. Following the same sharks year over year allows them to gain insight into the sharks' age and survival differences between males and females. About half of the sharks they saw each year were sharks they had seen previously, and about half were new sightings.

Overall, the researchers found that the population of white sharks numbers about 300, and evidence suggests that the adult population showed a modest uptick in numbers, while the subadult population held steady over the course of the study. The findings applied to both male and female sharks, though the estimate of the adult female population showed only about 60 sharks in this region.

"That underscores the need for continued monitoring of white sharks, as there are relatively few reproductively active females supplying the population with additional sharks," said Kanive, the study's lead author.

"Losing just a few animals can be really critical to the larger population," he said. "It's important that we continue to protect them and their surroundings."

The Marine Mammal Protection Act, which protects many of the prey that are critical to white sharks' survival, and restrictions on the use of gillnets in the California coastal region are likely factors helping the white shark population, Chapple said. But once white sharks leave U.S. waters, they continue to face threats.

"We can provide as much protection as possible while they are in coastal waters, but these sharks are highly migratory animals," he said. "It will take international cooperation, agreement and enforcement to protect them."

Credit: 
Oregon State University

Unexpected 'Black Swan' defect discovered in soft matter for first time

In new research, Texas A&M University scientists have for the first time revealed a single microscopic defect called a "twin" in a soft-block copolymer using an advanced electron microscopy technique. This defect may be exploited in the future to create materials with novel acoustic and photonic properties.

"This defect is like a black swan -- something special going on that isn't typical," said Dr. Edwin Thomas, professor in the Department of Materials Science and Engineering. "Although we chose a certain polymer for our study, I think the twin defect will be fairly universal across a bunch of similar soft matter systems, like oils, surfactants, biological materials and natural polymers. Therefore, our findings will be valuable to diverse research across the soft matter field."

The results of the study are detailed in the Proceedings of the National Academy of Sciences (PNAS).

Materials can be broadly classified as hard or soft matter. Hard materials, like metal alloys and ceramics, generally have a very regular and symmetric arrangement of atoms. Further, in hard matter, ordered groups of atoms arrange themselves into nanoscopic building blocks, called unit cells. Typically, these unit cells are comprised of only a few atoms and stack together to form the periodic crystal. Soft matter can also form crystals consisting of unit cells, but now the periodic pattern is not at the atomic level; it occurs at a much larger scale from assemblies of large molecules.

In particular, for an A-B diblock copolymer, a type of soft matter, the periodic molecular motif comprises of two linked chains: one chain of A units and one chain of B units. Each chain, called a block, has thousands of units linked together and a soft crystal forms by selective aggregation of the A units into domains and B units into domains that form huge unit cells compared to hard matter.

Another notable difference between soft and hard crystals is that structural defects have been much more extensively studied in hard matter. These imperfections can occur at a single atomic location within material, called a point defect. For example, point defects in the periodic arrangement of carbon atoms in a diamond due to nitrogen impurities create the exquisite "canary" yellow diamond. In addition, imperfections in crystals can be elongated as a line defect or spread across an area as a surface defect.

By and large, defects within hard materials have been extensively investigated using advanced electron imaging techniques. But in order to be able to locate and identify defects in their block copolymer soft crystals, Thomas and his colleagues used a new technique called slice-and-view scanning electron microscopy. This method allowed the researchers to use a fine ion beam to trim off a very thin slice of the soft material, then they used an electron beam to image the surface below the slice, then slice again, image again, over and over. These slices were then digitally stacked together to get a 3D view.

For their analysis, they investigated a diblock copolymer made of a polystyrene block and a polydimethylsiloxane block. At the microscopic level, a unit cell of this material exhibits a spatial pattern of the so-called "double gyroid" shape, a complex, periodic structure consisting of two intertwined molecular networks of which one has a left-handed rotation and the other, a right-handed rotation.

While the researchers were not actively looking for any particular defect in the material, the advanced imaging technique uncovered a surface defect, called a twin boundary. At either side of the twin juncture, the molecular networks abruptly transformed their handedness.

"I like to call this defect a topological mirror, and it's a really neat effect," said Thomas. "When you have a twin boundary, it's like looking at a reflection into a mirror, as each network crosses the boundary, the networks switch handedness, right becomes left and vice versa."

The researcher added that the consequences of having a twin boundary in a periodic structure that does not by itself have any inherent mirror symmetry could induce novel optical and acoustic properties that open new doors in materials engineering and technology.

"In biology, we know that even a single defect in DNA, a mutation, can cause a disease or some other observable change in an organism. In our study, we show a single twin defect in a double gyroid material," said Thomas. "Future research will explore to see whether there's something special about the presence of an isolated mirror plane in a structure, which otherwise has no mirror symmetry."

Credit: 
Texas A&M University

The 'Great Dying'

image: Outcrop photos are taken T.D. Frank and are from Frazer Beach, New South Wales, Australia. The end Permian extinction and disappearance of Glossopteris flora occurs at the top of the coal (black layer).

Image: 
T.D. Frank

Boulder, Colo., USA: The Paleozoic era culminated 251.9 million years ago in the most severe mass extinction recorded in the geologic record. Known as the "great dying," this event saw the loss of up to 96% of all marine species and around 70% of terrestrial species, including plants and insects.

The consensus view of scientists is that volcanic activity at the end of the Permian period, associated with the Siberian Traps Large Igneous Province, emitted massive quantities of greenhouse gases into the atmosphere over a short time interval. This caused a spike in global temperatures and a cascade of other deleterious environmental effects.

An international team of researchers from the United States, Sweden, and Australia studied sedimentary deposits in eastern Australia, which span the extinction event and provide a record of changing conditions along a coastal margin that was located in the high latitudes of the southern hemisphere. Here, the extinction event is evident as the abrupt disappearance of Glossopteris forest-mire ecosystems that had flourished in the region for millions of years. Data collected from eight sites in New South Wales and Queensland, Australia were combined with the results of climate models to assess the nature and pace of climate change before, during, and after the extinction event.

Results show that Glossopteris forest-mire ecosystems thrived through the final stages of the Permian period, a time when the climate in the region was gradually warming and becoming increasingly seasonal. The collapse of these lush environments was abrupt, coinciding with a rapid spike in temperatures recorded throughout the region. The post-extinction climate was 10-14°C warmer, and landscapes were no longer persistently wet, but results point to overall higher but more seasonal precipitation consistent with an intensification of a monsoonal climate regime in the high southern latitudes.

Because many areas of the globe experienced abrupt aridification in the wake of the "great dying," results suggest that high-southern latitudes may have served as important refugia for moisture-loving terrestrial groups.

The rate of present-day global warming rivals that experienced during the "great dying," but its signature varies regionally, with some areas of the planet experiencing rapid change while other areas remain relatively unaffected. The future effects of climate change on ecosystems will likely be severe. Thus, understanding global patterns of environmental change at the end of the Paleozoic can provide important insights as we navigate rapid climate change today.

Credit: 
Geological Society of America

Researchers shed light on the evolution of extremist groups

image: Early online support for the Boogaloos, one of the groups implicated in the January 2021 attack on the United States Capitol, followed the same mathematical pattern as ISIS, despite the stark ideological, geographical and cultural differences between their forms of extremism.

Image: 
Neil Johnson/GW

WASHINGTON (May 19, 2021)--Early online support for the Boogaloos, one of the groups implicated in the January 2021 attack on the United States Capitol, followed the same mathematical pattern as ISIS, despite the stark ideological, geographical and cultural differences between their forms of extremism. That's the conclusion of a new study published today by researchers at the George Washington University.

"This study helps provide a better understanding of the emergence of extremist movements in the U.S. and worldwide," Neil Johnson, a professor of physics at GW, said. "By identifying hidden common patterns in what seem to be completely unrelated movements, topped with a rigorous mathematical description of how they develop, our findings could help social media platforms disrupt the growth of such extremist groups," Johnson, who is also a researcher at the GW Institute for Data, Democracy & Politics, added.

The study, published in the journal Scientific Reports, compares the growth of the Boogaloos, a U.S.-based extremist group, to online support for ISIS, a militant, terrorist organization based in the Middle East. The Boogaloos are a loosely organized, pro-gun-rights movement preparing for civil war in the U.S. By contrast, ISIS adheres to a specific ideology, a radicalized form of Islam, and is responsible for terrorist attacks across the globe.

Johnson and his team collected data by observing public online communities on social media platforms for both the Boogaloos and ISIS. They found that the evolution of both movements follows a single shockwave mathematical equation.

The findings suggest the need for specific policies aimed at limiting the growth of such extremist movements. The researchers point out that online extremism can lead to real world violence, such as the attack on the U.S. Capitol, an attack that included members of the Boogaloo movement and other U.S. extremist groups.

Social media platforms have been struggling to control the growth of online extremism, according to Johnson. They often use a combination of content moderation and active promotion of users who are providing counter messaging. The researchers point out the limitations in both approaches and suggest that new strategies are needed to combat this growing threat.

"One key aspect we identified is how these extremist groups assemble and combine into communities, a quality we call their 'collective chemistry'," Yonatan Lupu, an associate professor of political science at GW and co-author on the paper, said. "Despite the sociological and ideological differences in these groups, they share a similar collective chemistry in terms of how communities grow. This knowledge is key to identifying how to slow them down or even prevent them from forming in the first place."

Credit: 
George Washington University

Children's sleep and adenotonsillectomy

While a pint-sized snorer may seem adorable tucked up in bed, studies show that children with sleep disordered breathing are likely to display aggressive and hyperactive behaviours during the day.

The recommended treatment is an adenotonsillectomy - the removal of the adenoids and tonsils - not only to fix the snoring, but also the behaviour.

Yet according to new research from the University of South Australia, while the surgery can cure a child's snoring, it doesn't change their behaviour, despite common misconceptions among parents and doctors alike.

Conducted in partnership with the University of Adelaide and the Women's and Children's Hospital, researchers examined children's behaviour at six months, two years and four years after an adenotonsillectomy for clinically diagnosed sleep disordered breathing (SDB).

Comparing them to a control group of non-snoring children, the study showed improvements to children's sleep and quality of life, but not behaviour.

Lead researcher, UniSA's Professor Kurt Lushington says the findings provide realistic expectations for parents and practitioners, particularly if the child already has a diagnosis of a behavioural disorder such as ADHD.

"As most parents would attest, when a child has a bad night's sleep, their behaviour reflects this the next day," Prof Lushington says.

"But when their sleep quality is affected by snoring, parents often hope that by fixing this problem, they'll also fix any associated behavioural issues. While I'd love to advise the opposite, this is not necessarily the case.

"Our research shows that a child's quality of life improves following an adenotonsillectomy, which is clearly linked to a more solid, less interrupted sleep. But when it comes to behavioural difficulties, we did not see any significant changes."

The recommended amount of sleep for school-age children (aged 5-12) is 9-11 hours a night. Up to 15 per cent of children snore regularly, and 1-4 per cent are formally diagnosed with obstructive sleep apnea syndrome, in which breathing repeatedly stops and starts.

But it's not all bad news.

"In clinical practice at a child's post-operative review, many parents report major improvements in behaviour and attentiveness," Prof Lushington says.

"No doubt this is reassuring, but it's probable that other factors are at play - most likely more sleep for the whole family and less worry from the parents, that together translate as a calmer, more attentive and emotionally responsive environment during the day.

"Beyond this, there is evidence to suggest that intervention much earlier in life may help. We may be leaving surgery too late."

"Previous work conducted by UniSA found that an adenotonsillectomy at a younger age of 3-5 years - may be important. Our previous work has suggested this too, so there is scope for further research.

"At this point, ensuring parents are fully aware of what an adenotonsillectomy can and can't achieve for their child, is vital."

Credit: 
University of South Australia

Getting "wind" of the future: Making wind turbines low-maintenance and more resilient

image: Researchers propose a computationally simple design for the simultaneous detection and real-time resolution of multiple faults in a wind turbine system.

Image: 
IEEE/CAA Journal of Automatica Sinica

A key driver of energy research is the ever-growing demand for energy. Traditional fossil-fuel-based energy sources currently meet this demand well, but they are non-renewable and cause major environmental pollution. In a world facing looming climate and resource crises, researchers have turned to renewable sources of energy as sustainable alternatives. Among renewables, wind energy in particular has gained considerable attention due to its low cost. As Dr. Afef Fekih, a computer engineer at the University of Louisiana, USA, who specializes in wind turbine design, notes, "Wind energy has been described as 'the world's fastest-growing renewable energy source', seeing a 30% annual growth on average over the last two decades."

But with the number of wind farms increasing worldwide, there has been a commensurate growth in the need for improved reliability. This is partly because ideal sites for wind farms are often in remote offshore areas characterized by rough weather and uncertain winds. Under such conditions, wind turbines become prone to faults, which lead to longer downtimes and less generated power. Furthermore, the consequent increase in maintenance needs only serves to make the generated energy costlier.

In a new study published in the IEEE/CAA Journal of Automatica Sinica, Dr. Fekih and her colleagues from Curtin University, Australia, and University of Ferrara, Italy, address these issues using an approach called fault-tolerant control (FTC)--a set of techniques that prevent simple faults from becoming serious failures. "FTCs can maintain satisfactory performance under faulty conditions using either passive approaches or active ones, such as online fault detection and isolation (FDI), which capture fault information and use it to optimize maintenance via remote diagnosis," explains Dr. Fekih.

The team focused on FDI. They noted that previous studies exploring FDI address faults either in the pitch actuator (an independent safety brake for the turbine) or in the speed sensor, but not both. In reality, however, simultaneous faults in both can and do occur. To enhance a turbine's performance and efficiency, a method to compensate for such simultaneous faults could be key. The researchers therefore set out to develop a computationally simple scheme for the simultaneous detection of sensor and actuator faults, without relying on redundant hardware components.

To achieve this, they adopted an approach called "the principle of separation": they designed and implemented on a computer a "state observer" model--a system that essentially duplicates the original. They then fed the same inputs to the observer and the turbine and, by noting discrepancies between the turbine's measured outputs and the observer's predictions, were able to detect system faults in real time. Based on the detection, a signal correction scheme can then be implemented to ensure that the system keeps operating with the desired efficiency.
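
As a concrete, hypothetical illustration of the observer idea (not the authors' turbine model or equations), the sketch below runs a textbook Luenberger observer on a toy two-state linear system: the observer receives the same control input as the plant, and the residual between the measured output and the observer's predicted output is thresholded to flag a fault. The system matrices, the injected sensor bias, and the threshold are all invented for illustration.

```python
# Minimal sketch of observer-based fault detection on a toy linear system.
# The plant, observer gain, injected fault, and threshold are illustrative
# assumptions, not the benchmark wind turbine used in the study.
import numpy as np

A = np.array([[0.9, 0.1],
              [0.0, 0.8]])        # toy plant dynamics (discrete time)
B = np.array([[0.0], [1.0]])      # input matrix
C = np.array([[1.0, 0.0]])        # only the first state is measured
L = np.array([[0.5], [0.1]])      # observer gain (A - L C is stable)

x = np.zeros((2, 1))              # true plant state
x_hat = np.zeros((2, 1))          # observer's state estimate
threshold = 0.5                   # residual level that flags a fault
rng = np.random.default_rng(0)
detected_at = None

for k in range(100):
    u = np.array([[np.sin(0.1 * k)]])         # known control input
    w = 0.05 * rng.normal(size=(2, 1))        # wind treated as an unknown disturbance
    x = A @ x + B @ u + w                     # plant update
    y = C @ x                                 # measurement
    if k >= 60:                               # inject a sensor bias fault at k = 60
        y = y + 1.0

    residual = y - C @ x_hat                  # measured output vs. observer prediction
    x_hat = A @ x_hat + B @ u + L @ residual  # observer update

    if detected_at is None and abs(residual.item()) > threshold:
        detected_at = k

print(f"fault injected at step 60, flagged at step {detected_at}")
```

In the study itself, the same residual information is what drives the signal-correction step, so the turbine can keep operating near its desired efficiency even while a fault is present; and because the wind enters only as an unknown disturbance, nothing in this loop requires the wind speed to be measured or estimated.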

In this approach, all wind variations are considered as unknown disturbances, which eliminates the need for their accurate measurement or estimation, thereby reducing computational complexity, and in turn, cost.

When the team evaluated their scheme numerically on a 4.8-megawatt benchmark wind turbine model and backed it up with Monte Carlo analysis, it proved reliable and robust: the desired operation could be maintained despite simultaneous faults and unknown wind speeds.
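
In this context, "Monte Carlo analysis" means re-running the simulation many times with randomized disturbances and fault scenarios and checking how often the scheme still behaves as intended. A toy version of that procedure, using a deliberately simple scalar system and made-up numbers rather than the benchmark turbine, might look like this:

```python
# Toy Monte Carlo robustness check: repeat a scalar observer/fault-detection
# trial under random "wind" disturbances and random fault onset times, and
# report how often the fault is caught. All numbers are illustrative.
import numpy as np

def trial(rng, a=0.9, l_gain=0.6, fault_size=1.0, threshold=0.4, steps=200):
    x, x_hat = 0.0, 0.0
    fault_step = int(rng.integers(50, 150))       # random fault onset
    for k in range(steps):
        x = a * x + 0.05 * rng.normal()           # plant driven by unknown disturbance
        y = x + (fault_size if k >= fault_step else 0.0)  # sensor bias fault
        residual = y - x_hat                      # measurement vs. prediction
        x_hat = a * x_hat + l_gain * residual     # scalar observer update
        if k >= fault_step and abs(residual) > threshold:
            return True                           # fault detected in this run
    return False

rng = np.random.default_rng(1)
runs = 1000
detected = sum(trial(rng) for _ in range(runs))
print(f"detection rate over {runs} randomized runs: {detected / runs:.1%}")
```

A real evaluation would of course randomize over operating points, fault types, and wind profiles on the full benchmark turbine model, but the bookkeeping is the same.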

Inspired by their findings, Dr. Fekih envisions a future governed by wind energy. "By helping realize an economically feasible wind turbine power plant as a renewable energy source with zero carbon footprint, we will be able to respond to the increasing power demand in a sustainable and environment-friendly way," she says.

It is indeed exciting to get "wind" of such a future.

Credit: 
Chinese Association of Automation