Tech

Reimagining everyday technologies in light of COVID-19

In his paper, Life Less Normal, published in the July-August 2020 special edition of the Association for Computing Machinery's journal Interactions, Dr Alex Taylor, Reader in Human Computer Interaction at City, University of London, invites an examination of life as normal and asks whether it is really something we want to return to, or indeed should return to.

The most recent edition of Interactions, co-edited by Dr Taylor, is dedicated to exploring the ways in which critical research and emerging practices in Human Computer Interaction and Design (HCID) are being applied to address societal challenges presented by the current COVID-19 pandemic.

With his academic concern for the ways technology is inextricably bound up with social life, he asks how technology might have played into discriminatory and exploitative structures in society.

Dr Taylor poses the question:

What are we to think of the seemingly benign call to work remotely (and the technologies designed to enable this) when we consider the often forgotten workforce that has no choice but to be at work, to touch and be touched by others? In the coronavirus pandemic and its aftermath, the urge has been to think about a time before lockdown and before what has felt like a never-ending series of curtailments to ordinary life. The urge has been to wish for a return to a life as normal. However, something the uncertainties of the pandemic have forced us to consider is what is this normal? What have we taken for granted and what, exactly, is entailed in the versions of life we want back?

Dr Taylor calls on the industries and practitioners who build technologies, and the scholars who study them, to imagine different futures, ones that are responsive to and responsible for the full diversity of lives lived, and that ensure many more actors are given a place at the table where what matters and who counts are decided:

"Finding ways to mitigate the spread of COVID-19 by supporting, for example, contact tracing, symptom tracking, and immunity certification are undoubtedly important goals. The longer-term challenge for those of us invested in design and technology's proliferation must be, however, to look beyond these immediate fixes. We need to be asking what multi-scalar modes and practices might be reimagined to be responsive to and responsible for the seemingly separate technoscientific realms of managing human pandemics and caring for our sociotechnical and multispecies relations. We need to be imagining worlds that resist singular or monolithic ways of valuing life, that question the logics of extraction and transaction, and that make possible a multiplicity of ways of living together."

Credit: 
City St George’s, University of London

COVID-19 infected workers return to work faster using time and symptom-based protocols

BOSTON - One of the most important questions in managing a hospital's response to the COVID-19 pandemic is determining when healthcare workers infected with COVID-19 can return to the job. Recently, investigators from Mass General Brigham (MGB) assessed the experience of using a test-based protocol in more than 1,000 infected healthcare workers.

Their research was published in the latest edition of Infection Control & Hospital Epidemiology.

The "test-based" approach involves repeat testing after resolution of symptoms until two consecutive negative tests are obtained 24 hours apart. In the alternative time-plus-symptoms approach, healthcare workers return to work after a set period of time since their symptom onset (or in the case of asymptomatic infection, the date of their positive test) has elapsed and symptoms, if present, have improved or resolved.

"We've learned a lot throughout the pandemic," explains Erica S. Shenoy, MD, PhD, associate chief of the Infection Control Unit at Massachusetts General Hospital (MGH) and the study's lead author. "For example, we now know from multiple published studies that individuals can have repeat positive tests for weeks and those positive test do not reflect infectivity after their initial illness has resolved." These findings have led to modifications in how public health and healthcare facilities determine how long individuals need to be isolated to prevent transmission to others, Shenoy says.

The Centers for Disease Control and Prevention (CDC)'s April guidance advised either a repeat "test-based" strategy to determine when workers could return to their healthcare jobs or a "time-plus-symptoms" approach. Under the test-based strategy, healthcare workers had to have two back-to-back negative PCR tests to return to work.

For this study, conducted between March 7 and April 22, 2020, employees from across the MGB health system who showed symptoms of COVID-19 were referred to its Occupational Health Services department for evaluation and a nasopharyngeal (NP) swab tested using viral RNA nucleic acid amplification methods.

Return-to-work criteria at that time required: resolution of fever without fever-reducing medications, improvement in respiratory symptoms, and at least two consecutive negative nasopharyngeal tests collected at least 24 hours apart. No minimum interval between symptom resolution and the first clearance test was specified.

The researchers then analyzed the data to evaluate the two strategies and found that a time-plus-symptom approach would have averted more than 4,000 days of lost worktime compared with the test-based approach, or a mean of 7.2 additional days of work lost per employee under test-based clearance. Both approaches are options per public health recommendations, though more recently the time-plus-symptom approach has become the preferred strategy per the CDC and the Massachusetts Department of Public Health, and has replaced the prior test-based approach at MGB. One additional potential benefit of moving away from test-based approaches, though not assessed in this study, was the psychological impact on employees of repeat testing. "We've had employees who tested positive repeatedly but had recovered for weeks and they were frustrated we couldn't bring them back to work," Shenoy says. About 70 percent of participants had at least one negative test result during the study, and of those, about 62 percent had two negative test results in a row, she adds.

A substantial number of healthcare workers diagnosed and treated for COVID-19 had repeatedly positive PCR tests. Such long duration of PCR positivity has been seen in other studies as well.

Determining when workers can return to work is a process that can affect many aspects of hospital operations, Shenoy says. "Patient and worker safety, flow of resources, speed and access to care, are some of the things impacted."

Based on the study's findings, and evolving public health guidance to prefer time-plus-symptom over test-based strategies, MGB moved to the time-plus-symptom approach over the summer. "Moving to a time+symptom approach was a vast improvement over past reliance on a less predictable test-based approach. Employees are now able to anticipate when they will be allowed to return to work, and it has reduced the strain on our testing capacity. This revision in testing strategy is consistent with our evolving medical understanding of test results and in keeping with our high commitment to workplace safety," said Dean Hashimoto, MD, chief medical officer of MGB Occupational Health Services.

Credit: 
Massachusetts General Hospital

Secondary variant of Photorhabdus luminescens interacts with plant roots

image: The "dual life" of the insect-pathogenic bacterium Photorhabdus luminescens: The primary variant (I) of the bacterium lives in symbiosis with its hosts, nematodes, which attack and kill insect larvae. Because this variant is bioluminescent, the larvae it kills glow with a blue light (left). The genetically identical secondary variant (II) cannot interact with nematodes, remains in the soil after an infection cycle, and instead interacts with plant roots (right).

Image: 
ill./©: Nazzareno Dominelli, Ralf Heermann

One of the basic approaches in organic farming is to use organisms beneficial to the system to combat pests. The bacterium Photorhabdus luminescens is one such beneficial organism. In the case of insect larvae infestation, the bacterium produces a variety of toxins that quickly kill the larvae. Yet, it seems this is not the only ability of Photorhabdus that can be exploited for organic plant cultivation. A research team led by Professor Ralf Heermann at Johannes Gutenberg University Mainz (JGU) has discovered additional properties that could significantly extend its range of uses. "We have identified a new form of the bacterium that was previously unknown," Heermann pointed out. This variant interacts directly with plant roots. The researchers think that there it promotes plant growth, primarily by releasing substances that combat plant-damaging fungi.

Bioluminescent symbiotic bacteria cause their insect victims to glow

Bacteria of the Photorhabdus luminescens family are close relatives of the plague pathogen Yersinia pestis. However, they do not pose a danger to humans, but rather stand out for a different characteristic: the insect larvae they kill become luminescent. As well as harmful toxins, Photorhabdus produces the enzyme luciferase, for reasons that are as yet unclear, which causes the body of the victim to glow as it decays. This form of Photorhabdus lives in close symbiosis with small nematodes that penetrate insect larvae and release the bacteria inside them. But it appears that the newly identified variant does not need a host. "We were surprised to find that a large proportion of the population was developing differently and looked into why this was," explained Heermann.

His team first used molecular biological techniques to analyze the transcriptome, i.e., the complete set of gene transcripts in a cell, and found that there were actually two variants. According to the results, the new variant differs on a number of levels: it is more mobile and sensitive, reacts to plant exudates, and is attracted towards them. "All this points to the fact that this bacterial variant interacts more intensely with plants," said Heermann. In the next stage, the research team looked at the interaction more closely. They discovered that the new variant changes its metabolism to increase the utilization of sugar instead of protein and produces substances that inhibit the growth of fungi that are pathogenic to plants. "A completely different set of natural substances is produced when this bacterium comes into contact with plants," Heermann added.

Second bacterial variant offers new prospects for organic farming

The researchers do not yet know why there is this second variant of Photorhabdus luminescens, which, despite being genetically identical to the primary form, behaves differently and does not produce luciferase. The two first authors of the paper published in Applied and Environmental Microbiology, Dr. Alice Regaiolo and Nazzareno Dominelli, postulate that the purpose is to ensure there is a variant of Photorhabdus luminescens that can survive and fend for itself even if the other variant is unable to prosper because the host nematodes do not find any insects. In any case, there may be completely new prospects for sustainable crop protection in agriculture: the bacteria could be used to combat pests and also promote plant growth. The question now arises of whether other pathogenic bacteria also lead such a "dual life". This is the first time the phenomenon has been observed.

Credit: 
Johannes Gutenberg Universitaet Mainz

Building a better stroke diagnosis

image: Grant O'Connell

Image: 
none

CLEVELAND--An interdisciplinary group of researchers at the Frances Payne Bolton School of Nursing at Case Western Reserve University has uncovered a new suite of human blood biomarkers that could someday help emergency clinicians quickly recognize, with a simple blood test, whether someone is experiencing a stroke.

While a viable test is probably still years away, the researchers have identified new biomarkers whose presence in the blood indicates damage to brain tissue, said Grant O'Connell, an assistant professor and director of the Biomarker and Basic Science Laboratory at the nursing school.

O'Connell and colleagues from the School of Nursing recently published their findings in the Proceedings of the National Academy of Sciences. Others on the research team, all students taught by O'Connell in the nursing PhD program at the School of Nursing, were Megan L. Alder, Christine G. Smothers and Julia H. C. Chang.

Major strokes, minor strokes

The symptoms of a major stroke are readily apparent, often repeated in public service announcements as FAST--the acronym for Face drooping, Arm weakness, Speech slurred and Time to call 911.

However, O'Connell said, most strokes cannot be definitively diagnosed until revealed by advanced radiological tests at a hospital, such as an MRI or CT scan.

"You would think that a stroke would be really obvious, and that's true with severe strokes, but most strokes are actually minor (in terms of the initial symptoms)," O'Connell said. "Many people might just think that they're having a bad migraine, so they don't go to the hospital."

More importantly, it can be difficult for health care workers such as paramedics, nurses and physicians to recognize that a stroke is happening in this group of patients who have less obvious symptoms. Because stroke treatment is time-sensitive, this can lead to life-threatening delays in care.

"(Clinicians) don't have CT scanners or MRI in the back of an ambulance, or even in the emergency rooms of some of the smaller hospitals," O'Connell said. "Because of this, up to one-third of strokes are missed at the initial contact with a clinician, which delays treatment that could prevent death or disability."

The discovery of blood biomarkers associated with stroke could be an avenue to avoid such delays, he said.

"If we had a blood test to tell us right away if someone is having a stroke, that could make a huge difference in patient care," O'Connell said.

Finding new biomarkers

The idea of finding biomarkers for brain damage, such as the damage caused by stroke, in the blood is not new. In fact, the problem with advancing the technique was more that the data were old, O'Connell said.

Neurodiagnostic researchers have known for years that if proteins can be identified that are only expressed within the brain, their detection in the blood could indicate that there is damage to the brain tissue.

"But what we've started to realize is that the proteins we study as candidate biomarkers had been identified some 20 to 40 years ago," O'Connell said. "And it turns out that a lot of these proteins aren't as specific to the brain as we thought because we're now seeing them expressed in other organs, so it could look like you've had a brain injury and you didn't."

The Frances Payne Bolton team used a custom-developed algorithm to assess gene expression patterns in thousands of tissue samples from the brain and other organs to identify proteins that could serve as more specific biomarkers of neurological damage. The analysis revealed up to 50 new possible markers, several of which were subsequently measured and successfully detected in the blood of a cohort of patients with stroke, O'Connell said.
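
The paper's algorithm itself is not reproduced here, but the general idea of scoring tissue specificity from expression data can be sketched as follows. The data layout, the column naming and the 50-fold enrichment cutoff are hypothetical illustrations, not the team's actual method.

```python
import pandas as pd

def brain_enriched_genes(expression: pd.DataFrame, brain_columns: list[str],
                         fold_threshold: float = 50.0) -> pd.Index:
    """Return genes whose mean brain expression exceeds their highest
    per-tissue mean elsewhere by at least `fold_threshold`-fold.

    `expression`: rows = genes, columns = samples named like "liver_1",
    "heart_2" (hypothetical layout); values are normalized expression
    levels such as TPM."""
    brain_mean = expression[brain_columns].mean(axis=1)
    other = expression.drop(columns=brain_columns)
    tissue = other.columns.str.split("_").str[0]           # sample -> tissue
    per_tissue_mean = other.T.groupby(tissue.values).mean().T
    other_max = per_tissue_mean.max(axis=1)
    score = brain_mean / (other_max + 1e-9)                # avoid divide-by-zero
    return score[score >= fold_threshold].index
```

Genes passing such a filter would be candidates whose appearance in blood is more plausibly attributable to brain injury than to release from other organs.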

"This could open up the door to a whole new wave of biomarker research," he said, "and that could lead to clinically useful tests (if we can) validate the findings."

Credit: 
Case Western Reserve University

Talking alone: Researchers use artificial intelligence tools to predict loneliness

image: Ellen Lee, MD, assistant professor of psychiatry at UC San Diego School of Medicine.

Image: 
UC San Diego Health Sciences

For the past couple of decades, there has been a loneliness pandemic, marked by rising rates of suicides and opioid use, lost productivity, increased health care costs and rising mortality. The COVID-19 pandemic, with its associated social distancing and lockdowns, has only made things worse, say experts.

Accurately assessing the breadth and depth of societal loneliness is daunting, limited by available tools such as self-reports. In a new proof-of-concept paper, published online September 24, 2020 in the American Journal of Geriatric Psychiatry, a team led by researchers at University of California San Diego School of Medicine used artificial intelligence technologies to analyze natural language patterns and discern degrees of loneliness in older adults.

"Most studies use either a direct question of ' how often do you feel lonely,' which can lead to biased responses due to stigma associated with loneliness or the UCLA Loneliness Scale which does not explicitly use the word 'lonely,'" said senior author Ellen Lee, MD, assistant professor of psychiatry at UC San Diego School of Medicine. "For this project, we used natural language processing or NLP, an unbiased quantitative assessment of expressed emotion and sentiment, in concert with the usual loneliness measurement tools."

In recent years, numerous studies have documented rising rates of loneliness in various populations of people, particularly those most vulnerable, such as older adults. For example, a UC San Diego study published earlier this year found that 85 percent of residents living in an independent senior housing community reported moderate to severe levels of loneliness.

The new study also focused on independent senior living residents: 80 participants aged 66 to 94, with a mean age of 83 years. But rather than simply asking and documenting answers to questions from the UCLA Loneliness Scale, the researchers also had trained study staff interview participants in more unstructured conversations, which were analyzed using natural language understanding software developed by IBM, plus other machine-learning tools.
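
The study's own pipeline relied on IBM's natural language tools; as a rough, generic illustration of the workflow (not the authors' code), transcript text can be turned into lexical features and fed to a classifier validated against a conventional loneliness label. The labels below are assumed to be dichotomized scale scores, which is an assumption for the sketch.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def loneliness_classifier(transcripts: list[str], lonely: list[int]):
    """transcripts: one string per participant; lonely: 1/0 labels
    (e.g., dichotomized UCLA Loneliness Scale scores -- hypothetical)."""
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # word/bigram features
        LogisticRegression(max_iter=1000),
    )
    scores = cross_val_score(model, transcripts, lonely, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    return model.fit(transcripts, lonely)
```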

"NLP and machine learning allow us to systematically examine long interviews from many individuals and explore how subtle speech features like emotions may indicate loneliness. Similar emotion analyses by humans would be open to bias, lack consistency, and require extensive training to standardize," said first author Varsha Badal, PhD, a postdoctoral research fellow.

Among the findings:

Lonely individuals had longer responses in the qualitative interview, and expressed greater sadness in response to direct questions about loneliness.
Women were more likely than men to acknowledge feeling lonely during interviews.
Men used more fearful and joyful words in their responses compared to women.

Authors said the study highlights the discrepancies between research assessments for loneliness and an individual's subjective experience of it, which NLP-based tools could help to reconcile. The early findings suggest there may be "lonely speech" that could be used to detect loneliness in older adults, improving how clinicians and families assess and treat it, especially during times of physical distancing and social isolation.

The study, said the authors, demonstrates the feasibility of using natural language pattern analyses of transcribed speech to better parse and understand complex emotions like loneliness. They said the machine-learning models predicted qualitative loneliness with 94 percent accuracy.

"Our IBM-UC San Diego Center is now exploring NLP signatures of loneliness and wisdom, which are inversely linked in older adults. Speech data can be combined with our other assessments of cognition, mobility, sleep, physical activity and mental health to improve our understanding of aging and to help promote successful aging" said study co-author Dilip Jeste, MD, senior associate dean for healthy aging and senior care and co-director of the IBM-UC San Diego Center for Artificial Intelligence for Healthy Living.

Credit: 
University of California - San Diego

Leading water scientists warn of risks in shift to monoculture crops, tree plantations

image: Irena Creed is a University of Saskatchewan hydrologist who co-led the think tank paper.

Image: 
University of Saskatchewan

Conversion of large swaths of land to uniform tree plantations and single-crop species may lead to unintended consequences for the water cycle, putting ecosystems at greater risk for fires, floods, droughts and even hurricanes, warns a think-tank group of almost 30 water scientists from 11 countries.

Worldwide, policies are increasingly aimed at planting more trees and crops both to combat climate change and increase food and fuel production. Already about 40 per cent of the world's ice-free land surface has been converted to forestry and agriculture--often with only a few choice tree species and crops where biodiversity once thrived. This trend is poised to continue or even accelerate.

But in an article published in Nature Geoscience, the scientists argue that mixed-species diversity is crucial to the water cycle pathways that enable soil-plant-water systems to recover quickly from environmental stresses. Forestry and agricultural monocultures (growing a single species repeatedly on the same land) can constrain these pathways, adversely affecting conditions such as soil moisture and erosion, streamflow, evaporation, and groundwater quality--and ultimately reducing ecological resilience.

The authors urge policy makers and land managers to take into account critical water-vegetation interactions to guide decisions about what to plant and where.

"When we modify landscapes to help combat climate change or meet human demands for food and energy, we need to be smart about it," said Irena Creed, a University of Saskatchewan hydrologist who co-led the think tank paper with University of Delaware researcher Delphis Levia.

"We need to emulate what was natural by not relying on just a few choice crops or trees but instead embracing biodiversity. When you narrow biodiversity to a few select crops, it makes the whole ecosystem vulnerable."

Creed explains that the rate, timing and magnitude of water released to the atmosphere varies with each plant species.

"By having a diverse range in the rate of water movement, you are building a more diverse water system that can withstand water stresses such as droughts and fires," she said.

For example, in a forest with a variety of tree species, some send their roots down to shallow depths, some to intermediate depths, and some deep.

"That means there's a lot more soil moisture available to some tree species than others," said Levia. "But if you're in a monoculture situation, as with many staple crops, the rooting depths are more uniform. They don't penetrate the soil to varying degrees like natural vegetation in forests. And so, they can be more susceptible to drought."

The paper notes that increased production of tree plantations to meet demand for wood can reduce, or even eliminate, streamflow, and sometimes lead to the salinization and acidification of soils, as well as to increased susceptibility to fire.

In highly managed landscapes that have replaced wetlands, plant uniformity has been linked to increases in the frequency and severity of both floods and droughts, as well as water quality deterioration.

More research is needed to pinpoint the water movement pathways that are most susceptible to being constrained in the conversion of natural vegetation to planted monocultures, the authors state.

"We need governments to prioritize research into how much diversity is enough to ensure that a given type of landscape can be resilient and withstand environmental stresses," Creed said.

Having data on precisely how the change in the water cycle is occurring would enable proper management practices to be put into place, said Oregon State University professor John Selker, a co-author on the paper. Such evidence gathering will be possible using new sensor technologies that are becoming available.

"By recognizing, preserving or enhancing the diverse array of hydrological responses among plant species, we can provide better stewardship of the Earth's finite water resources," the authors conclude.

Credit: 
University of Saskatchewan

COVID-19 shapes political approval ratings

Approval ratings of political leaders surged in the early days of the COVID-19 pandemic, according to a new study published in the Proceedings of the National Academy of Sciences.

In the days and weeks with high numbers of new COVID-19 cases, there were also large boosts to leader approval. These results support a "rally 'round the flag" phenomenon, in which citizens rally around their leaders during times of crisis, and may have implications for voting.

Data analyzed by the University of North Carolina at Chapel Hill and the National University of Singapore reveal world leaders, on average, experienced a 14-point boost in approval.

Citizens tend to support their leaders in times of national crisis, such as a war or terrorist attack, but the new study is the first to identify a rally effect during a health crisis - one that has been deadly and destructive across the globe.

The idea for the paper was developed by Kai Chi (Sam) Yam, associate professor and dean's chair in the Department of Management and Organization at the National University of Singapore. Yam collaborated with Joshua Conrad Jackson, a doctoral student in psychology at UNC-Chapel Hill who conducted the analyses and co-wrote the paper.

Drawing from political science and psychological theories, the study authors and their colleagues examined the effect of COVID-19 cases on approval ratings through the first 120 days of 2020.

More than 2 million daily approval ratings were collected for 11 heads of government from geographically and culturally diverse countries and all 50 United States governors.

U.S. President Donald J. Trump had a scant 4-point gain out of a possible 100 in approval during the time period compared to the substantial 24- to 61-point boosts in approval for leaders in the United Kingdom, Canada, Germany, and Australia. U.S. governors experienced 15- to 20-point gains.

"COVID-19 might serve as a catalyst to help some incumbent governments win election," said Yam

For example, South Korea's ruling party won more seats in the country's legislature than any party had since 1960, in an election held during the pandemic in April 2020.

"We collected our data during the early stages of the pandemic, so we aren't equipped to answer questions about the effect's endurance. Clearly the effect doesn't last forever, but its timeline may depend on several factors, including how effectively a leader is perceived to respond to the pandemic," Jackson said.

Credit: 
University of North Carolina at Chapel Hill

Stroke alarm clock may streamline and accelerate time-sensitive acute stroke care

DALLAS, September 24, 2020 -- A digital clock that sounds alarms signaling each step of acute stroke care at the hospital is a low-cost tool that helped doctors in Germany streamline and accelerate the time-sensitive process, according to new research published today in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

The success of emergency stroke treatment depends on how fast treatment is delivered. The American Heart Association/American Stroke Association's Target: Stroke quality improvement initiative recommends 60 minutes or less from the time a stroke patient arrives at the hospital to the time of clot-busting treatment.

"Time is brain. Minutes are easily lost in acute stroke management, despite standard protocols," said study author Klaus Fassbender, M.D., professor of neurology at Saarland University Medical Center in Homburg, Germany. "The stroke alarm clock is a low-cost intervention and an efficient way to quickly deliver life-saving treatment to acute stroke patients."

In the study, a large-display alarm clock was installed in the hospital's computed tomography (CT) room, which is where stroke patients are admitted, neurological examinations are performed and clot-busting medication (such as intravenous alteplase) is administered. The clock is set at the time of admission, and alarms sound when various treatment procedures should have been completed: 1) 15 minutes for the neurological exam; 2) 25 minutes for imaging and laboratory tests; and 3) 30 minutes for the start of intravenous thrombolysis treatment.
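
As a rough sketch of the logic only (not the device actually installed in Homburg), the checkpoint schedule described above could be written as:

```python
import time

# The three checkpoints described in the article, in minutes from admission.
ALARMS = [(15, "neurological exam complete"),
          (25, "imaging and laboratory tests complete"),
          (30, "intravenous thrombolysis started")]

def run_stroke_clock(admission_time: float | None = None) -> None:
    """Sound an alert at each checkpoint after the clock is started."""
    start = admission_time if admission_time is not None else time.monotonic()
    for minutes, step in ALARMS:
        remaining = start + minutes * 60 - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        print(f"\a{minutes} min: {step} should be done")  # \a = terminal bell
```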

This study was conducted from February 2016 to November 2017 in the department of neurology of the Saarland University Medical Center in Homburg, Germany, which is a comprehensive stroke center. The two patient groups, selected for either the alarm clock or regular care, were similar in demographics and medical characteristics, including final diagnoses, stroke impairment and degree of disability or dependence at hospital arrival. Of the 107 acute stroke patients selected, 51 were treated using the clock to time care, compared to 56 patients who were treated without it.

Researchers found:

The time from arrival to neurological examination completion was 7.28 minutes in the stroke clock group, versus 10 minutes in the comparison group.

Time from arrival through diagnostic workup, including imaging, was 16.73 minutes in the stroke clock group, versus 26 minutes in the comparison group.

The median time from arrival to the start of intravenous thrombolysis treatment was 18.83 minutes in the stroke clock group, versus 47 minutes in the comparison group.

Use of the clock did not remarkably improve arrival to mechanical clot-busting times.

Stroke patients' functional abilities were not notably different 90 days after treatment, regardless of whether they received care with or without the clock. "However, this study was not designed to measure results of treatment in the months or years following acute stroke care," noted Fassbender.

"A limitation of this study was its size. We need more patients to determine whether accelerated acute stroke management with the clock translates to less death and disability long-term," he said.

Credit: 
American Heart Association

5G wireless may lead to inaccurate weather forecasts

image: This image shows leakage (unintended radiation from a transmitter into an adjacent frequency band or channel) from a 5G cellular network affecting sensors on weather satellites.

Image: 
Mohammad Yousefvand

Upcoming 5G wireless networks that will provide faster cell phone service may lead to inaccurate weather forecasts, according to a Rutgers study on a controversial issue that has created anxiety among meteorologists.

"Our study - the first of its kind that quantifies the effect of 5G on weather prediction error - suggests that there is an impact on the accuracy of weather forecasts," said senior author Narayan B. Mandayam, a Distinguished Professor at the Wireless Information Network Laboratory (WINLAB), who also chairs the Department of Electrical and Computer Engineering in the School of Engineering at Rutgers University-New Brunswick.

The peer-reviewed study was published this month at the 2020 IEEE 5G World Forum, sponsored by the Institute of Electrical and Electronics Engineers. Fifth-generation cellular wireless technology (5G) stems from new, smarter ways to use the higher (mmWave) frequencies for mobile communications. This technology will revolutionize internet communication and telecommunication. It has faster connection times, increases the number of devices that can connect to a network and will be more widely available over the next two to three years, according to IEEE.

The Rutgers study used computer modeling to examine the impact of 5G "leakage" - unintended radiation from a transmitter into an adjacent frequency band or channel - on forecasting the deadly 2008 Super Tuesday Tornado Outbreak in the South and Midwest.

The signals from the 5G frequency bands potentially could leak into the band used by weather sensors on satellites that measure the amount of water vapor in the atmosphere and affect weather forecasting and predictions. Meteorologists rely on satellites for the data needed to forecast weather.

Based on modeling, 5G leakage power of -15 to -20 decibel watts (dBW, a logarithmic unit that expresses power relative to one watt) affected the accuracy of forecasting of precipitation (by up to 0.9 millimeters) during the tornado outbreak and of temperatures near ground level (by up to 2.34 degrees Fahrenheit).
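
For readers unfamiliar with the unit, the conversion from decibel watts to absolute power is straightforward, and the leakage levels modeled in the study correspond to quite small powers:

$$P_{\mathrm{watts}} = 10^{\,\mathrm{dBW}/10}, \qquad -15\ \mathrm{dBW} \approx 0.032\ \mathrm{W}, \qquad -20\ \mathrm{dBW} = 0.01\ \mathrm{W}.$$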

"It can be argued that the magnitude of error found in our study is insignificant or significant, depending on whether you represent the 5G community or the meteorological community, respectively," Mandayam said. "One of our takeaways is that if we want leakage to be at levels preferred by the 5G community, we need to work on more detailed models as well as antenna technology, dynamic reallocation of spectrum resources and improved weather forecasting algorithms that can take into account 5G leakage."

The lead author is Mohammad Yousefvand, a Rutgers electrical engineering doctoral student. Co-authors include Professor Chung-Tse Michael Wu in the Department of Electrical and Computer Engineering, Professor Ruo-Qian (Roger) Wang in the Department of Civil and Environmental Engineering and Joseph Brodie, director of atmospheric research in the Rutgers Center for Ocean Observing Leadership.

Credit: 
Rutgers University

Provide shady spots to protect butterflies from climate change, say scientists

image: Latin name: Gonepteryx rhamni.

Image: 
Andrew Bladon

Researchers have discovered significant variations in the ability of different UK butterfly species to maintain a suitable body temperature. Species that rely most on finding a suitably shady location to keep cool are at the greatest risk of population decline. The results predict how climate change might impact butterfly communities, and will inform conservation strategies to protect them.

The results, published today in the Journal of Animal Ecology, show that larger and paler butterflies including the Large White (Pieris brassicae) and Brimstone (Gonepteryx rhamni) are best able to buffer themselves against environmental temperature swings. They angle their large, reflective wings in relation to the sun, and use them to direct the sun's heat either away from, or onto their bodies. These species have either stable or growing populations.

More colourful larger species such as the Peacock (Aglais io) and Red Admiral (Vanessa atalanta) have greater difficulty controlling their body temperature, but even they are better than their smaller relatives like the Small Heath (Coenonympha pamphilus).

The study found that some butterfly species rely on finding a spot at a specific temperature within a landscape - termed a 'microclimate' - to control their body temperature. Air temperatures vary on a fine scale: a shaded patch of ground is cooler than one in full sun, for example. These 'thermal specialists', including Brown Argus (Aricia agestis) and Small Copper (Lycaena phlaeas), have suffered larger population declines over the last 40 years.

All butterflies are ectotherms: they can't generate their own body heat. The populations of two thirds of UK butterfly species are in decline: habitat loss and fragmentation, and more monotonous landscapes have removed many of the microclimates butterflies need to survive. Climate change is compounding the problem by causing more extreme weather events and greater fluctuations in temperature.

Insects, including butterflies, pollinate around 85% of our food crops - providing a vital service worth billions of pounds globally. Protecting a diverse range of species will provide long-term resilience: if numbers of one species fall there are others to fill the gaps. Insects are also an important food source for many other species, including birds.

"Butterfly species that aren't very good at controlling their temperature with small behavioural changes, but rely on choosing a micro-habitat at the right temperature, are likely to suffer the most from climate change and habitat loss," said Dr Andrew Bladon, a Postdoctoral Research Associate in the University of Cambridge's Department of Zoology, and first author of the report.

He added: "We need to make landscapes more diverse to help conserve many of our butterfly species. Even within a garden lawn, patches of grass can be left to grow longer - these areas will provide cooler, shady places for many species of butterfly. In nature reserves, some areas could be grazed or cut and others left standing. We also need to protect features that break up the monotony of farm landscapes, like hedgerows, ditches, and patches of woodland."

Landscapes with a diversity of heights and features have a greater range of temperatures than flat, monotonous ones. This applies on scales from kilometres to centimetres: from hillsides to flower patches. Such structural diversity creates different microclimates that many butterflies use to regulate their temperature.

The research involved catching nearly 4,000 wild butterflies in hand-held nets, and taking the temperature of each using a fine probe. The surrounding air temperature was measured, and for butterflies found perching on a plant, the air temperature at the perch was also taken. This indicated the degree to which butterflies were seeking out specific locations to control their body temperature. In total, 29 different butterfly species were recorded.
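
One simple way to compare species from such paired measurements is to regress body temperature on air temperature for each species, where a shallower slope indicates better thermal buffering. This is a sketch of the general idea with hypothetical column names, not the paper's published analysis.

```python
import numpy as np
import pandas as pd

def buffering_slopes(df: pd.DataFrame) -> pd.Series:
    """df columns (hypothetical names): 'species', 'air_temp_c', 'body_temp_c'.
    Returns the fitted slope of body vs. air temperature per species;
    smaller slopes suggest stronger buffering against ambient swings."""
    def slope(group: pd.DataFrame) -> float:
        m, _intercept = np.polyfit(group["air_temp_c"], group["body_temp_c"], deg=1)
        return m
    return df.groupby("species").apply(slope).sort_values()
```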

The study reveals that butterflies are either thermal generalists or thermal specialists, and this does not always correspond with their current categorisations as either habitat generalists or specialists.

"As we plan conservation measures to address the effects of climate change, it will be important to understand not only the habitat requirements of different butterfly species, but also their temperature requirements," said Dr Ed Turner in the University Museum of Zoology, Cambridge, who led the work.

He added: "With this new understanding of butterflies, we should be able to better manage habitats and landscapes to protect them, and in doing so we're probably also protecting other insects too."

Over the past thirty years, many species of butterfly have expanded their range northwards, as more northerly places have become warmer due to climate change. The ranges of species adapted to cooler environments are shrinking. These trends have been tracked for butterfly populations as a whole, but no previous study has investigated how the individual butterflies that make up these populations are able to respond to small scale temperature changes.

Bladon said: "I like to think of butterflies as the gateway drug. If we can get people involved in butterfly conservation, that's the first step to getting them to care about insects more broadly."

Credit: 
University of Cambridge

Pale melanomas masked by albino gene

image: Pale melanomas masked by albino gene

Image: 
iStock

People with pale coloured melanomas are more likely to have a gene mutation associated with albinism, University of Queensland research has found.

Study lead author Dr Jenna Rayner said albinism, a rare genetic disorder affecting one in 10,000 people, prevented brown pigment from being synthesised in the body and led to fair hair and extremely pale skin that was easily sunburned and prone to skin cancers.

"Albinism develops when there are two mutated genes, so people with one mutation usually don't know they have it," Dr Rayner said.

"These people may be more prone to developing pale coloured melanomas, called amelanotic, because tumours accumulate new mutations, and they already have a mutated albinism gene."

The researchers studied DNA samples from more than 380 volunteers using whole exome sequencing, while looking for rare genetic mutations that cause albinism.

Queensland has the highest rate of melanoma in the world and more than 14,000 cases are diagnosed in Australia each year.

UQ Dermatology Research Centre Associate Professor Rick Sturm said up to eight per cent of melanomas could be amelanotic, making them difficult to diagnose and easily mistaken for non-cancerous conditions like warts or scars.

"Amelanotic melanomas are normally diagnosed in advanced stage, compared with darker melanomas, causing patients to often miss out on early treatment and their best chance of a cure," he said.

When funding becomes available, researchers plan to collect amelanotic melanoma samples to compare their genotype with that of the patient.

Dr Rayner said it could lead to personalised medicine - where doctors would be alerted to monitor potential amelanotic melanomas in people with one albinism gene mutation.

"This could optimise early intervention and consequently improve patient outcomes," she said.

Credit: 
University of Queensland

Driven by climate, more frequent, severe wildfires in Cascade Range reshape forests

image: Subalpine forest severely and repeatedly burned on Mt. Jefferson, Oregon. A lack of live trees and exposed growing conditions created by the fires is limiting natural forest regeneration.

Image: 
Courtesy of Sebastian Busby | Portland State University

In recent years -- and 2020 is no exception -- parts of the Pacific Northwest that are typically too wet to burn are experiencing more frequent, severe and larger wildfires due to changes in climate. New research from Portland State University found that while the increased wildfire activity is causing widespread changes in the structure and composition of these mid-to-high elevation forests, the new landscapes are also likely more resilient to projected upward trends in future fire activity and climate conditions.

The study, led by PSU graduate student Sebastian Busby, examined temperate forests that burned expansively, severely and repeatedly between 2003 and 2015 in the central Cascade Range of Oregon and Washington. On Mt. Adams, these wildfires included the 2008 Cold Springs, 2012 Cascade Creek and 2015 Cougar Creek fires. On Mt. Jefferson, the wildfires included the 2003 Booth and Bear Butte Complex, 2007 Warm Springs Area Lightning Complex and 2014 Bear Butte 2 fires. Some areas Busby studied have burned again this summer as part of the Lionshead fire in the Mt. Jefferson area.

Busby said that historically, wet and cool climate limited fire events in these humid forest environments to an interval of 50 to 200-plus years. But climate change has led to warmer winters, reduced mountain snowpack and longer, drier summers and fire seasons. The time between repeated wildfire events in this study was less than 12 years.

"These forests are drying out earlier in the year, making them more vulnerable to frequent, severe and larger wildfires," Busby said. "Because these forests have not historically burned very often, they're composed of high densities of tree species that are not well-adapted to frequent and very large severe fires."

True firs were the dominant conifer tree species across the study areas, but post-fire tree regeneration was generally very poor due to a lack of live mature trees remaining after the fires to reseed the forest.

The burned areas, however, did support the establishment of pines at a low density, which are functionally better adapted to fire. The findings suggest that in the near term, these forests may transition from a dense fir-dominated conifer forest into a patchy, low-density, pine-dominated forest that will likely lack the fuel connectivity conducive to crown fires. Busby said that while widespread forest composition change and forest cover loss may be alarming, the results indicate that the altered structure and composition are likely to be more resilient in the face of future fire and climate conditions, such as drought and heatwave events.

"From an ecological point of view, these reburned forests are going to have a greater abundance of tree species that are better adapted to fire and potentially have less flammable forest structure overall," he said. "Now, in these post-reburned forests that are growing in a warmer and drier world, it will be up to us to decide whether we let future fires burn or not."

If forest managers and other stakeholders choose to suppress them, they risk returning these forests to their historical dense structures, which thrived in cool and wet climates. However, under ongoing warming conditions, this alternative might increase the likelihood of severe and expansive fires in the future, negatively impacting human life, property, and natural resources.

"Wildfires are a natural ecological process on these landscapes and have been for thousands of years," Busby said. "Wildfire can be a great catalyst for change, but that change doesn't have to be entirely negative. We must learn to co-exist with wildfires, use them effectively, and embrace the positive elements they bring to our regional forests and ecosystems."

Credit: 
Portland State University

Island-building in Southeast Asia created Earth's northern ice sheets

image: Mt. Sumbing, an arc volcano in Central Java, in 2016. The uplift of volcanic rock in the entire Southeast Asian island arc, starting 15 million years ago, triggered global cooling and eventually ice sheets that covered much of North America and Northern Europe 18,000 years ago, according to UC Berkeley scientists and their colleagues.

Image: 
UC Berkeley photo by Yuem Park

The Greenland ice sheet owes its existence to the growth of an arc of islands in Southeast Asia -- stretching from Sumatra to New Guinea -- over the last 15 million years, a new study claims.

According to an analysis by researchers at the University of California, Berkeley, UC Santa Barbara and a research institute in Toulouse, France, as the Australian continent pushed these volcanic islands out of the ocean, the rocks were exposed to rain mixed with carbon dioxide, which is acidic. Minerals within the rocks dissolved and washed with the carbon into the ocean, consuming enough carbon dioxide to cool the planet and allow for large ice sheets to form over North America and Northern Europe.

"You have the continental crust of Australia bulldozing into these volcanic islands, giving you really high mountains just south of the equator," said Nicholas Swanson-Hysell, associate professor of earth and planetary science at UC Berkeley and senior author of the study. "So, you have this big increase of land area that is quite steep, in a region where it's warm and wet and a lot of rock types that have the ability to naturally sequester carbon."

Starting about 15 million years ago, this tropical mountain-building drew down carbon dioxide in the atmosphere, decreasing the strength of the greenhouse effect and cooling the planet. By about 3 million years ago, Earth's temperature was cool enough to allow snow and ice to remain through the summer and grow into huge ice sheets over the Northern Hemisphere, like that covering Greenland today.

Once Northern Hemisphere ice sheets grew, other climate dynamics led to a cycle of glacial maxima and minima every 40,000 to 100,000 years. At the most recent glacial maximum, about 15,000 years ago, massive ice sheets covered most of Canada, the northern portions of the U.S., as well as Scandinavia and much of the British Isles.

"If it wasn't for the carbon sequestration that's happening in the Southeast Asian islands, we wouldn't have ended up with the climate that includes a Greenland ice sheet and these glacial and interglacial cycles," Swanson-Hysell said. "We wouldn't have crossed this atmospheric CO2 threshold to initiate Northern Hemisphere ice sheets."

The periodic growth and decline of the northern ice sheets -- the cycle of glacial maxima and minima -- is likely postponed, due to human emissions that have increased carbon dioxide concentrations in the atmosphere.

"A process that took millions of years we have reversed in 100 years," Swanson-Hysell said. "Over the next tens to hundreds of thousands of years, geological processes in places like Southeast Asia will once again decrease CO2 levels in the atmosphere -- a pace that is frustratingly slow when humanity is facing the impact of current global warming."

UC Berkeley doctoral student Yuem Park, Swanson-Hysell and their colleagues, including Francis Macdonald of UC Santa Barbara and Yves Goddéris of Géosciences Environnement Toulouse, will publish their findings this week in the journal Proceedings of the National Academy of Sciences.

Weathering of rock sequesters carbon

Geologists have long speculated about the processes that periodically warm and cool the planet, occasionally covering the entire globe with ice and turning it into a so-called snowball Earth.

Once scientists realized that, over the course of millions of years, tectonic processes move land masses around the planet like massive jigsaw puzzle pieces, they sought a connection between continental movements -- and collisions -- and ice ages. Cycles of Earth's orbit are responsible for the 40,000- or 100,000-year fluctuations in temperature that overlay the long-term warming and cooling.

The rise of the Himalayas in Asia in the mid-latitudes over the past 50 million years has been a prime candidate for cooling and the start of a glacial climate after an extended geologic interval without ice sheets. A few years ago, however, Swanson-Hysell and Macdonald saw a correlation between mountain-building in tropical areas and the onset of time intervals with ice ages over the past 500 million years.

In 2017, they proposed that a major ice age 445 million years ago was triggered by mountain-building in the tropics, and they followed that in 2019 with a more complete correlation of the last four time intervals of glacial climate and collisions between continents and tropical island arcs. They argue that the combination of increased exposure of rock with minerals that can sequester carbon and a plenitude of warm tropical rain is particularly effective in pulling carbon dioxide from the atmosphere.

The process involves chemical dissolution of the rocks, which consumes carbon dioxide; the carbon is then locked in carbonate minerals that form limestone rock in the ocean. The calcium within seashells that you find on the beach may have come out of a tropical mountain on the other side of the world, Swanson-Hysell said.
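
Schematically, this is the classic silicate-weathering pathway, written here for a generic calcium silicate (the study's model tracks a range of rock types):

$$\mathrm{CaSiO_3} + 2\,\mathrm{CO_2} + \mathrm{H_2O} \;\rightarrow\; \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^-} + \mathrm{SiO_2}$$

$$\mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^-} \;\rightarrow\; \mathrm{CaCO_3} + \mathrm{CO_2} + \mathrm{H_2O}$$

The net effect is that each unit of calcium silicate weathered removes one molecule of CO2 from the atmosphere-ocean system and buries it as carbonate.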

"We built up a new database of these types of mountain-building events and then reconstructed the latitude at which they happened," Swanson-Hysell said. "Then we saw, hey, there is a lot of cooling when there is a lot of this type of mountain being built in the tropics, which is the Southeast Asian setting. The Southeast Asian islands are the best analog for processes that we also see further in the past."

For the current paper, Park, Swanson-Hysell and Macdonald teamed up with Goddéris to model more precisely what carbon dioxide levels would be with changes in the size of the Southeast Asian islands.

The researchers first recreated the sizes of the islands as they grew over the last 15 million years, focusing primarily on the largest: Java, Sumatra, the Philippines, Sulawesi and New Guinea. They calculated that the area of the islands increased from 0.3 million square kilometers 15 million years ago to 2 million square kilometers today. UC Santa Barbara graduate student Eliel Anttila, who was an undergraduate student in earth and planetary science at UC Berkeley and is a co-author of the paper, contributed to this aspect of the research.

They then used Goddéris' GEOCLIM computer model to estimate how the growth of these islands altered carbon levels in the atmosphere. Together with UC Berkeley postdoctoral scholar Pierre Maffre, who recently obtained his Ph.D. in Goddéris' lab, they updated the model to account for the variable effect of different rock types. The model is linked with a climate model in order to relate CO2 levels to global temperatures and precipitation.

They found that the increase of land area along the western edge of the Pacific corresponded with global cooling, as reconstructed from oxygen isotope compositions in ocean sediments. The carbon dioxide levels inferred from the model also match some measurement-based estimates, though Swanson-Hysell admits that estimating CO2 levels more than a million years ago is difficult and uncertain.

Based on their model, chemical weathering in the Southeast Asian islands alone diminished CO2 levels from more than 500 parts per million (ppm) 15 million years ago to approximately 400 ppm 5 million years ago and, finally, to pre-industrial levels of 280 ppm. Fossil fuel-burning has now raised the level of carbon dioxide in the atmosphere to 411 ppm -- levels that haven't been seen on Earth for millions of years.

While the threshold for Arctic glaciation is estimated to be about 280 ppm of carbon dioxide, the threshold for ice sheet formation at the South Pole is much higher: about 750 ppm. That's why the Antarctic ice sheets began forming much earlier, about 34 million years ago, than those in the Arctic.

While the researchers' model doesn't allow them to isolate the climatic effects of the rise of the Himalayas, their Southeast Asian island scenario alone can account for the appearance of Northern Hemisphere ice sheets. They did explore the effect of volcanic events occurring around the same time, including massive lava flows, or flood basalts, such as those in Ethiopia and North America (the Columbia River basalts). Though the weathering of such rocks has been proposed as an ice age trigger, the model shows that this activity played a minor role compared to the rise of the Southeast Asian islands.

"These results highlight that the Earth's climate state is particularly sensitive to changes in tropical geography," the authors conclude.

Swanson-Hysell credits the campus's France-Berkeley Fund for providing resources for an initial collaboration with Goddéris that led to a large collaborative grant from the National Science Foundation's (NSF's) Frontier Research in Earth Science program to further pursue the research resulting in this paper.

The French-American team plans to model other past ice ages, including the one in the Ordovician period 445 million years ago that, in 2017, Swanson-Hysell and Macdonald proposed was triggered by a collision similar to that occurring today in the Southeast Asian islands. That collision took place during the first phase of Appalachian mountain-building, when the present-day eastern U.S. was located in the tropics.

Credit: 
University of California - Berkeley

'Save me Seymour!'

New international research led by Curtin University has found approximately a quarter of carnivorous plant species across the world may be at risk of extinction due to global climate change, illegal poaching, and the clearing of land for agriculture, mining and development.

Carnivorous plants are predatory plants which obtain some or most of their nutrients through specialised adaptations that allow them to attract, capture and kill their prey - mainly flies and other small insects but occasionally even birds and small mammals. Well-known species of carnivorous plants include the Venus fly trap and pitcher plants.

Lead researcher, restoration ecologist Dr Adam Cross from the School of Molecular and Life Sciences at Curtin University, said the loss of carnivorous plants would not only be devastating due to their captivating qualities, but could potentially have detrimental effects across ecosystems.

"Carnivorous plants are an iconic group of plants, and they are often involved in complex biological relationships with animals - sometimes providing habitats for animals, or even relying upon animals to digest the prey they catch for them," Dr Cross said.

"Our research has found around 25 per cent of the world's carnivorous plants are at increasing risk of extinction. Australia is currently sixth in the world for harbouring the most Critically Endangered carnivorous plant species, behind Brazil, Indonesia, Philippines, Cuba and Thailand."

Carnivorous plants usually occur in extremely sensitive habitats, and are often in areas experiencing direct conflict with human activities.

During the team's research, each of the over 850 known carnivorous plant species was assessed for its exposure to threats such as residential and commercial development; agriculture and aquaculture activities; energy production and mining; transport development, such as land clearing for roads or trains; human exploitation, such as illegal collection; pollution; geological events; climate change; and severe weather.

"Globally speaking, the biggest threats to carnivorous plants are the result of agricultural practices and natural systems modifications, as well as continental scale environmental shifts caused by climate change," Dr Cross said.

"In Western Australia, which harbours more carnivorous plant species than any other place on Earth, the biggest threat remains the clearing of habitat to meet human needs, resulting hydrological changes, and of course the warming, drying climate trend that affects much of Australia."

Research co-author Dr Andreas Fleischmann, from Botanische Staatssammlung Munich and Ludwig-Maximilians-University Munich, Germany, also noted that illegal poaching of carnivorous plants was a large problem.

"Noting their unique and fascinating features, some species of carnivorous plants are illegally collected from their natural habitats and sold to collectors. Poached plants of some species sell for hundreds of dollars," Dr Fleischmann said.

Looking to the future, research co-author Dr Alastair Robinson from Royal Botanic Gardens Victoria, stressed the need for immediate action in order to save carnivorous plants species from extinction.

"Conservation initiatives must be established immediately to prevent these species being lost in the coming years and decades," Dr Robinson said.

"Urgent global action is required to reduce rates of habitat loss and land use change, particularly in already highly-cleared regions that are home to many threatened carnivorous plant species, including habitats in Western Australia, Brazil, southeast Asia and the United States of America."

Credit: 
Curtin University

One of the world's driest deserts is the focus of a new study on our changing climate

image: University students Ruusa Gottlieb (L) and Priscilla Mundilo (R) measure carbon dioxide release from the soil at one of the high rainfall sites in the Namib Desert.

Image: 
Throop/ASU

Carbon, one of the main building blocks for all life on Earth, cycles among living organisms and the environment. This cycle, and how it works in one of the driest places on Earth, is the subject of a new study recently published in the journal Plant and Soil with lead author and Arizona State University (ASU) scientist Heather Throop.

While the natural carbon cycle should be balanced each year, with about as much carbon taken out of the atmosphere as is released back by natural processes, humans are upsetting this balance through carbon dioxide additions to the atmosphere, both through changing land use that releases carbon stored in soils and from burning fossil fuels.

In an effort to understand what controls the release of carbon dioxide from soils in deserts, Throop and a team of university students from Namibia conducted field work in the Namib Desert, one of the world's driest regions that stretches more than 1,200 miles along the Atlantic coasts of Angola, Namibia, and South Africa.

What Throop and her team ultimately determined from their research is that subtle differences in surface topography and erosion have big influences on microorganisms in the soil and these differences ultimately affect carbon cycling. Even in the driest places, they found signs of life influencing carbon cycling.

"The amount of carbon dioxide in the atmosphere affects our climate, so understanding what affects the release of carbon from soils is important predicting how climate will change in the future." says Throop, who is an associate professor in the School of Earth and Space Exploration and the School of Life Sciences.

To conduct their analyses, the research team chose six sites that differed in yearly rainfall and carried out a 48-hour sampling campaign at each, working continuously day and night to collect data. At each site, the team analyzed the landscape structure and plants and selected representative locations to sample. Then, they simulated rainfall and used gas analyzers to measure carbon dioxide release from the soil, to determine how carbon cycling responded as the soil dried after the simulated rain.
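
Converting a measured rise in chamber CO2 concentration into a soil flux is standard gas-exchange arithmetic. The sketch below assumes an idealized closed chamber and invented variable names rather than the team's exact instrument workflow.

```python
def soil_co2_flux(dC_dt_ppm_s: float, air_temp_c: float, pressure_pa: float,
                  chamber_volume_m3: float, soil_area_m2: float) -> float:
    """Return soil CO2 efflux in micromoles per square metre per second,
    from the rate of CO2 rise inside a closed chamber (ideal gas law)."""
    R = 8.314                                   # J mol^-1 K^-1, gas constant
    temp_k = air_temp_c + 273.15
    air_molar_density = pressure_pa / (R * temp_k)          # mol air per m^3
    # ppm/s -> mol CO2 per mol air per s, scaled by the moles of air enclosed
    mol_co2_per_s = dC_dt_ppm_s * 1e-6 * air_molar_density * chamber_volume_m3
    return mol_co2_per_s / soil_area_m2 * 1e6               # micromol m^-2 s^-1
```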

"It's really an incredible amount of data to collect manually," says Throop. "And having a crew of dedicated and enthusiastic students made this work possible. Often for remote field work like this we just get a snapshot of what is happening at one or two sites or at a few points in time. It was exciting to be able to collect the data continuously for a few days and at six different sites."

The students participating in this research came from the University of Namibia and the Namibia University of Science and Technology. They were each participating in the Summer Drylands Program, an intense research experience where students plan, execute, and report on an experiment within a short timeframe.

"The ability of technology to record soil carbon was outstanding," says co-author and student researcher Vimbai Marufu, who is now in graduate school at the Namibia University of Science and Technology. "What I treasure the most from the experience is what it means to work on an interdisciplinary team and the unexplainable satisfaction of being close to nature."

And there are plans to continue additional fieldwork in the Namib Desert with a recent grant from the National Science Foundation to ASU. This grant will provide support for U.S. students to conduct research in the Namib Desert in collaboration with Namibian researchers. "We hope to use this work to help us in understanding how deserts respond to a changing climate," says Throop. "How biological processes function in the extreme dry of the Namib Desert will give us clues about how relatively wet deserts will behave under drier conditions."

Credit: 
Arizona State University