
Weaving Indigenous knowledge with scientific research: a balanced approach

image: The Hooker River in Aoraki Mount Cook National Park, Aotearoa New Zealand

Image: 
University of Canterbury

MUNICH -- Indigenous knowledge, including oral histories, mythologies, place names, and classification schemes, can span many generations, preserving information that has helped native communities adapt to natural hazards as well as gradually changing conditions. Although Western scientists have historically deemed such information unreliable, the past decade has seen growing recognition of the advantages of bicultural approaches to scientific research, including demonstrations of the reliability of Indigenous knowledge.

Now a review published in the European Geosciences Union's journal Earth Surface Dynamics offers a roadmap for weaving together Indigenous knowledge with modern research, with a focus on the geosciences. "One goal is to encourage scientists to consider how their project might be of interest or relevance to Indigenous communities and to make conducting research with such groups more accessible," says Clare Wilkinson, a Ph.D. student at Te Whare Wānanga o Waitaha | University of Canterbury and the paper's lead author.

The review, which is co-authored by native and non-native researchers from both Te Whare Wānanga o Waitaha | University of Canterbury and Te Whare Wānanga o Tāmaki Makaurau | The University of Auckland, showcases a variety of tools for weaving Indigenous knowledge with Western science that maintain the integrity and validity of both methodologies, Wilkinson says. "There are clear links between Indigenous knowledge and values with respect to geomorphology," states Wilkinson, "but there is not much research that weaves these two cultural knowledge bases together."

Bicultural research: benefits for all

Bicultural research undertaken within respectful, reciprocal relationships can yield benefits for everyone involved, according to Wilkinson. Oral histories, for example, may provide insight into events that have been erased from the geologic record. Filling such gaps is crucial for projects such as the Aotearoa New Zealand Palaeotsunami Database, a catalogue of tsunamis that occurred prior to the start of historical written recordkeeping that is being used to better understand the distribution and magnitudes of these potentially destructive mega-waves.

Weaving Indigenous knowledge with Western scientific research also has the potential to help native communities make informed decisions regarding potential hazards on their ancestral lands. An example cited in the review describes Māori pūrākau (stories) about a ngārara, a mythological, lizard-like creature that lives in the Waitepuru River in Aotearoa New Zealand (a name that reflects the country's bicultural foundation). According to the authors, many Māori pūrākau are codified knowledge expressed through metaphors. These particular stories document the river's past geomorphic activity, expressed through the analogy of the ngārara flicking its tail back and forth.

"These stories, which my co-author Dan Hikuroa first published in 2017, document flood events," says Wilkinson, who notes that these have implications for understanding both the area's geomorphic history and the potential risks of living there. "The stories of the danger posed by the ngārara were taken into consideration when Māori built their homes, leaving them unharmed by past river-related hazards that have affected other nearby settlements," Wilkinson says.

Braided rivers tool: merging knowledge streams

In the review the authors describe tools that may help other researchers find respectful ways to initiate bicultural research projects. These include several potential frameworks--methodologies used during the theoretical design of the research--as well as step-by-step methods for acquiring data in ways that incorporate Indigenous values.

The most transferable framework, the authors suggest, is He Awa Whiria | Braided Rivers, which is based on the iconic Aotearoa New Zealand river systems characterised by networks of frequently shifting, sediment-choked channels. This framework consists of two streams, one symbolising Māori knowledge and a second representing Western science. "The two knowledge streams operate collaboratively as well as independently, but both have the same objective of providing a balanced research outcome," Wilkinson says.

An element of reciprocity

When working with Indigenous communities, it's essential to understand--or at least respect--Indigenous priorities, interests and worldview, according to Wilkinson. "You need to anticipate that rock formations and rivers can be ancestors; that when communities talk about fish, they are speaking about brothers and sisters; and when communities talk about the soil, they are describing their Earth mother."

Shifting language is also a challenge, explains Wilkinson; words must be chosen very carefully to maintain mutual respect and safety for all involved, and researchers shouldn't expect Indigenous input on a project that doesn't interest them or provide them with any benefit. "Purely extractive research is not acceptable; there must be an element of reciprocity," says Wilkinson. The authors strongly recommend that scientists wishing to participate in bicultural research find cultural advisors who know the preferred procedures for engaging with Indigenous people.

Ultimately, suggest the review authors, drawing from multiple knowledge systems will help researchers and native communities realise novel understandings that could not be reached in isolation. "It is an exciting time to be a researcher and to play a part in increasingly important engagements with Indigenous culture and knowledge," concludes Wilkinson.

Credit: 
European Geosciences Union

Heat stress in gestating dairy cows impairs performance of future generations

image: Number of dairy cows (dry and milking) per state (USDA-ERS, 2019) and number of heat stress days per state (NOAA, 2019). Taller bars represent more cows within each cow number range. A heat stress day was declared when the average daily temperature-humidity index was equal to or greater than 68. The number of heat stress days per state in each year from 2007 to 2013 was calculated and averaged across the years.

Image: 
Journal of Dairy Science

Philadelphia, July 16, 2020 - It is estimated that in the United States, environmental heat stress in cows costs the dairy industry more than $1.5 billion annually due to decreased milk production, impaired reproductive performance, increased rates of illness, and shortened lifespans. But what effects does heat stress in pregnant cows have on the productivity and health of their female offspring, and how much might this affect dairy producers' costs? In a recent article appearing in the Journal of Dairy Science, scientists from the University of Florida and the University of California, Davis investigated the performance and profitability of two future generations of cows born to mothers exposed to heat stress during pregnancy.

According to senior author Jimena Laporta, PhD, of the Department of Animal Sciences at the University of Florida, Gainesville, FL, USA, previous research has found that heifers born to cows that are heat-stressed during late gestation grow to be smaller and produce on average five kilograms per day less milk in their first lactation, compared with heifers born to dams that were cooled during the hottest days of the year.

"This suggests a permanent effect of fetal environment on genetic expression in adulthood," said Laporta. "We hypothesized that exposure of pregnant cows to heat stress during late gestation will impair daughters' and granddaughters' lifetime performances."

The authors' first objective was to measure the carryover effects of maternal exposure to heat stress during late gestation on milk yield, reproductive performance, and survival rates of daughters and granddaughters. Their second was to estimate the economic losses related to those outcomes across the United States. Laporta and colleagues pooled and analyzed data collected over a 10-year period on performance of Holstein cows in Florida, the state with the greatest number of heat stress days per year. This gave them information on the lifespans, productivity, and reproductive performance of two successive generations of cows born to dams exposed to heat stress during pregnancy and those born to dams that were provided active cooling during heat stress periods.
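The heat-stress criterion mentioned in the study's figure (a day counts as heat-stressed when the average daily temperature-humidity index, or THI, is at least 68) can be sketched in a few lines. The THI formula below is a commonly used NRC-style formulation and is an assumption for illustration; the authors may have used a different variant.

```python
def thi(temp_c: float, rel_humidity_pct: float) -> float:
    """Temperature-humidity index (NRC-style formulation; assumed here,
    not necessarily the exact variant used in the study)."""
    return (1.8 * temp_c + 32) - (0.55 - 0.0055 * rel_humidity_pct) * (1.8 * temp_c - 26)

def heat_stress_days(daily_temp_rh, threshold: float = 68.0) -> int:
    """Count days whose average daily THI meets or exceeds the threshold."""
    return sum(1 for t, rh in daily_temp_rh if thi(t, rh) >= threshold)

# Example: a hot, humid day versus a mild one
thi(30.0, 70.0)   # about 81.4, above the 68 threshold
thi(15.0, 50.0)   # about 58.7, below it
heat_stress_days([(30.0, 70.0), (15.0, 50.0)])  # counts 1 heat stress day
```

With daily temperature and humidity records per state, averaging such counts over 2007-2013 reproduces the kind of map shown in the figure.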

The authors found that, as expected, daughters and granddaughters of heat-stressed cows showed negative effects in rates of survival from birth to first calving, length of productive lifespan, and milk performance, including milk yields and nutrient contents through the first three lactations. The team calculated that these impairments translate to considerable annual costs to dairy producers in the United States, with nationwide losses, based on the US average milk price from 2010 to 2015, of up to $595 million annually.

Laporta notes that lactating cows tend to be the focus of heat reduction strategies, possibly because the effects of overheating are more immediately evident among them than among nonlactating pregnant cows, for which the damage may become apparent only later, when they resume milking. But considering the hidden costs that carry over to future generations of cows and the likelihood of increased heat stress due to ongoing climate change, Laporta and colleagues consider cooling techniques for dry cows--such as the use of fans and sprinkler systems--crucial to the continued success of the US dairy industry.

Credit: 
Elsevier

Patients with substance use disorder discriminated against by post-acute care facilities

Boston - A new study shows that 29 percent of private post-acute care facilities in Massachusetts explicitly discriminated against hospitalized individuals with opioid use disorder, rejecting their referrals for admission. Led by researchers at Boston Medical Center's (BMC's) Grayken Center for Addiction, the study found that 15 percent of referral rejections for patients with substance use disorders cited a substance use disorder diagnosis, or treatment of opioid use disorder with buprenorphine or methadone, as the reason for denial. These denials constituted documented, explicit discrimination. Published in the Journal of Addiction Medicine, the results demonstrate the urgent need to eliminate discrimination in order to ensure that patients taking a medication to treat opioid use disorder get access to the ongoing medical care they need.

In 2016, the Massachusetts Department of Public Health issued guidance to post-acute care facilities in the state, advising that individuals taking buprenorphine or methadone to treat opioid use disorder who were otherwise eligible for admission could not be denied admission because of their opioid use disorder medication status. In 2018 and 2019, the United States Attorney's office for the District of Massachusetts settled with two different private post-acute care organizations for their discriminatory practices of denying admittance of patients taking an opioid agonist therapy, putting them in violation of the Americans with Disabilities Act.

"As clinicians who treat individuals with opioid use disorder, we frequently encounter issues getting our patients accepted to post-acute care facilities," said Simeon Kimmel, MD, MA, an addiction medicine and infectious diseases specialist at the Grayken Center who serves as the study's corresponding author. "Given the landscape in Massachusetts, we wanted to take a look at the data to determine the reasons why this continues to be an issue."

Researchers analyzed electronic health record (EHR) data from BMC patients over the age of 18 diagnosed with opioid use disorder to track their referrals to private Massachusetts post-acute medical care facilities in 2018. They categorized the stated reasons for the rejections, singling out as discriminatory those that cited substance use or receipt of an opioid agonist (methadone or buprenorphine). In 2018, there were 219 hospitalizations at BMC associated with opioid use disorder that resulted in 1,648 referrals to 285 private post-acute care facilities in Massachusetts. Of those referrals, 81.8 percent (1,348) were rejected. Among those rejections, 15.1 percent were deemed discriminatory based on the reasons cited: 105 were rejected because the patient was treated with buprenorphine or methadone, and 98 because the patient had a substance use disorder diagnosis.
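The headline percentages follow directly from the counts reported above; a quick arithmetic check (numbers taken from the study as described in this article):

```python
referrals = 1648
rejections = 1348
medication_rejections = 105  # rejected for buprenorphine or methadone treatment
diagnosis_rejections = 98    # rejected for a substance use disorder diagnosis

rejection_rate = 100 * rejections / referrals
discriminatory_share = 100 * (medication_rejections + diagnosis_rejections) / rejections

round(rejection_rate, 1)        # 81.8 percent of referrals rejected
round(discriminatory_share, 1)  # 15.1 percent of rejections discriminatory
```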

Of the facilities included in the study, 29.1 percent (83) had at least one discriminatory rejection based on information they included in the patient's EHR. Interestingly, the study's data showed no difference in the proportion of discriminatory rejections made by post-acute care facilities before and after the 2018 settlement between the U.S. Attorney's Office for the District of Massachusetts and a post-acute care organization.

"This data demonstrates a troubling pattern of explicit discrimination aimed at patients with substance use disorders who need post-acute medical care, despite the guidance from the Massachusetts Department of Public Health and being in violation of federal law," said Kimmel, also an assistant professor of medicine at Boston University School of Medicine.

The study results also show that only two in three patients with SUD referred to post-acute medical care are ultimately discharged to such a facility for further treatment, including wound care, physical and/or occupational therapy, and medical care for other underlying conditions. This is another example of the health care system further stigmatizing individuals with SUD and creating barriers to accessing care.

"These facilities are telling us openly that they are discriminating and violating the law, and many more are likely engaging in the same practices but not documenting it as openly for us to see," added Kimmel. "We need to have more coordinated efforts to enforce the state and federal policies that prohibit these practices from occurring so that people are no longer being denied medical care."

The study authors note that future research should focus on the specific regulatory, behavioral, and technical barriers involved, in order to better understand them and determine the best approaches for reducing discriminatory practices. In 2019, the Massachusetts Department of Public Health funded an ongoing project that provides training and technical support to increase access to medications for opioid use disorder in long-term care and skilled nursing facilities.

Credit: 
Boston Medical Center

Regular physical activity seems to enhance cognition in children who need it most

image: Relationship between baseline performance and pre-post changes in cognitive performance, illustrating the group × baseline performance interaction.

Image: 
University of Tsukuba

Tsukuba, Japan -- A common school-age stereotype is that smart kids are unathletic. However, as a recent study led by Associate Professor Keita Kamijo at the University of Tsukuba and Assistant Professor Toru Ishihara at Kobe University shows, physical activity is linked to better cognitive ability, which is in turn related to academic performance in school. Understanding the effects of physical activity on cognition has been difficult for several reasons. "Previous studies looked at the issue too broadly," explains Professor Kamijo. "When we broke down the data, we were able to see that physical activity helps children the most if they start out with poor executive function."

Executive functions refer to three types of cognitive skills. The first is the ability to suppress impulses and inhibit reflex-like behaviors or habits. To assess this ability, children were asked to indicate the color in which words like "red" and "blue" were displayed on a computer screen. This is easy when the words and colors match ("red" displayed in red font), but often requires inhibition of a reflex response when they don't ("red" displayed in blue font). The second skill is the ability to hold information in working memory and process it. This was evaluated by testing how well children could remember strings of letters that vary in length. The third cognitive skill is mental flexibility. This was measured by asking children to frequently switch the rules for categorizing colored circles and squares from shape-based to color-based.
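The color-word inhibition task described above can be pictured with a minimal sketch. This is a toy trial scorer for illustration only, not the software the researchers used:

```python
# Toy Stroop-style trial: the correct answer is the display color, which
# conflicts with the word itself on incongruent trials, so naming the
# color requires inhibiting the reflex to read the word.
def score_trial(word: str, display_color: str, response: str) -> bool:
    """Return True if the child named the display color, not the word."""
    return response == display_color

def is_congruent(word: str, display_color: str) -> bool:
    """A trial is congruent when the word and its color match."""
    return word == display_color

trials = [
    ("red", "red", "red"),    # congruent: easy
    ("red", "blue", "blue"),  # incongruent: inhibition succeeded
    ("red", "blue", "red"),   # reflex-like reading response: incorrect
]
[score_trial(*t) for t in trials]  # [True, True, False]
```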

Professor Kamijo, Professor Ishihara, and their colleagues re-analyzed data from previous experiments in which executive function was assessed in children before and after several months of daily physical-activity interventions, such as aerobic activities, ball games, and playing tag. They looked at a factor that had been missed in the initial analyses: whether the effectiveness of the intervention depended on a child's initial baseline scores.

The researchers found that cognitive skills, which have been shown to closely associate with academic performance, improved most in children whose skills were initially poor. The team also found that increased time spent doing regular physical activity did not negatively affect cognitive function in children who started out with better cognitive functions.

The finding that daily physical activity can improve executive function in children who might need it the most has some practical implications. "Because the cognitive functions evaluated in our study are related to academic performance," says Professor Kamijo, "we can say that daily physical activity is critical for school-aged children. Our findings can help educational institutions design appropriate systems for maximizing the effects of physical activity and exercise."

Credit: 
University of Tsukuba

The smallest micro-gripper, grown on optical fibers, is operated remotely with light

image: The optical pliers next to the mandibles of a Formica polyctena ant for comparison (composite scanning electron microscope (SEM) image with added colors). The two jaws (red) close when light is sent through the optical fibers (light blue) that have the diameter of 125 microns, comparable to the diameter of a human hair. (Source: UW Physics)

Image: 
Source: UW Physics

Researchers at the Faculty of Physics, University of Warsaw, used liquid crystal elastomer technology to demonstrate a series of micro-tools grown on optical fibers. The 200-micrometer grippers are controlled remotely, without electric wiring or pneumatic tubing, by green light delivered through the fibers: absorbed light energy is converted directly into the action of the gripper jaws.

Gripping objects is a fundamental skill for living organisms, from the microscopic rotifers, through the amazing dexterity of the human hand, to the jaws of predatory whales and soft tentacles of giant squids, and is also vital for many ever-shrinking technologies. Mechanical grippers, powered by electric, pneumatic, hydraulic or piezoelectric servos, are used at scales down to millimeters, but their complexity and need for force transmission prevent miniaturization.

Researchers at the Faculty of Physics at the University of Warsaw, with colleagues from the AGH University of Science and Technology in Cracow, Poland, have now used liquid crystal elastomer microstructures that change shape in response to light to build a light-powered micro-tool: optical pliers. The device was built by growing two bending jaws on the tips of hair-sized optical fibers.

Liquid Crystalline Elastomers (LCEs) are smart materials that can reversibly change shape under illumination with visible light. In their prototype, the scientists combined light-powered LCEs with a novel method of fabricating micrometer-scale structures: when UV light is sent through the optical fiber, a cone-shaped structure grows at the fiber tip. The light-induced mechanical response of the resulting microstructure depends on the orientation of molecules inside the elastomer element and can be controlled to yield bending or contracting micro-actuators. The new elastomer growth technique readily offers a variety of micrometer-scale, remotely controlled functional structures - building blocks for the micro-toolbox.

The research on light-powered elastomer microstructures is funded by the National Science Center (Poland) within the project "Micro-scale actuators based on photo-responsive polymers".

Physics and Astronomy first appeared at the University of Warsaw in 1816, under the then Faculty of Philosophy. In 1825 the Astronomical Observatory was established. Currently, the Faculty of Physics' Institutes include Experimental Physics, Theoretical Physics, Geophysics, Department of Mathematical Methods and an Astronomical Observatory. Research covers almost all areas of modern physics, on scales from the quantum to the cosmological. The Faculty's research and teaching staff includes ca. 200 university teachers, of which 87 are employees with the title of professor. The Faculty of Physics, University of Warsaw, is attended by ca. 1000 students and more than 170 doctoral students.

Credit: 
University of Warsaw, Faculty of Physics

Avian speciation: Uniform vs. particolored plumage

Although carrion crows and hooded crows are almost indistinguishable genetically, they avoid mating with each other. Researchers from Ludwig-Maximilians-Universitaet (LMU) in Munich have now identified a mutation that appears to contribute to this instance of reproductive isolation.

The carrion crow and the hooded crow are genetically closely related, but they are distinguishable on the basis of the color of their plumage. The carrion crow's feathers are soot-black, while the hooded crow's plumage presents a particolored combination of black and light gray. Although crosses between the two forms can produce fertile offspring, the region of overlap between their geographical distributions in Europe is strikingly narrow. For this reason, the two forms have become a popular model for the elucidation of the processes that lead to species divergence. LMU evolutionary biologist Jochen Wolf and his team are studying the factors that contribute to the divergence of the two populations at the molecular level. Genetic analyses have already suggested that differences in the color of the plumage play an important role in limiting the frequency of hybridization between carrion and hooded crows. The scientists have now identified a crucial mutation that affects this character. Their findings appear in the online journal Nature Communications, and imply that all corvid species were originally uniformly black in color.

The ancestral population of crows in Europe began to diverge during the Late Pleistocene, at a time when the onset of glaciation in Central Europe forced the birds to retreat to refuge zones in Iberia and the Balkans. When the climate improved at the end of the last glacial maximum, they were able to recolonize their original habitats. However, during the period of their isolation, the populations in Southwestern and Southeastern Europe had diverged from each other to such an extent that they no longer interbred at the same rate, i.e. became reproductively isolated. In evolutionary terms, the two populations thereafter went their separate ways. The Western European population became the carrion crow, while their counterparts in Eastern Europe gave rise to the hooded crow. The zone in which the two now come into contact (the 'hybrid zone') is only 20 to 50 km wide, and in Germany it essentially follows the course of the Elbe River. "Within this narrow zone, there is a low incidence of interbreeding. The progeny of such crosses have plumage of an intermediate color," Wolf explains. "The fact that this zone is so clearly defined implies that hybrid progeny are subjected to negative selection."

Wolf wants to understand the genetic basis of this instance of reproductive isolation. In previous work, he and his group had demonstrated that the two populations differ genetically from each other only in segments of their genomes that determine plumage color. Moreover, population genetic studies have strongly suggested that mate selection is indeed based on this very character - the two forms preferentially choose mating partners that closely resemble themselves. These earlier studies were based on the investigation of single-base variation, i.e. differences between individuals at single sites (base-pairs) within the genomic DNA. "However, we have never been able to directly determine the functional effects of such single-base variations on plumage color," says Matthias Weissensteiner, the lead author of the study. "Even when we find an association between a single-base variant and plumage color, the mutation actually responsible for the color change might be located thousands of base-pairs away."

To tackle this problem, the researchers have used a technically demanding method to search for interspecific differences that affect longer stretches of DNA. These 'structural' variations include deletions, insertions or inversions of sequence blocks.  "Up until recently, high-throughput sequencing technologies could only sequence segments of DNA on the order of 100 bp in length, which is not long enough to capture large-scale structural mutations," says Wolf. "Thanks to the new methods, we can now examine very long stretches of DNA comprising up to 150,000 base pairs."

The team applied this technology to DNA obtained from about two dozen birds, and searched for structural variations that differentiate carrion crows from hooded crows. The data not only confirmed the results of the single-base analyses, they also uncovered an insertion mutation in a gene which is known to determine plumage color by interacting with a second gene elsewhere in the genome. In addition, phylogenetic analysis of DNA from related species revealed that their common ancestor carried the black variant of the first of these genes. The variant found in the hooded crow represents a new mutation, which first appeared about half a million years ago.  "The new color variant seems to be quite attractive, because it was able to establish itself very quickly, and therefore must have been positively selected," says Wolf. How the variant accomplished this feat is not yet clear. The evidence suggests that it first appeared in the region which now encompasses Iran and Iraq, and there are some indications that the lighter plumage confers a selective advantage in hot regions, because it effectively reflects sunlight. This supports the idea that the mutation might have initially been favored by natural selection. "Once it had reached a certain frequency within the local population, it would have been able to spread because parental imprinting, which enables nestlings to recognize their parents, also causes mature birds to choose mates that resemble their parents in appearance," Wolf explains. However, other possible scenarios, such as random genetic drift in small populations or the involvement of selfish genes (which promote their own propagation), are also conceivable and have yet to be ruled out.

Credit: 
Ludwig-Maximilians-Universität München

Wireless aquatic robot could clean water and transport cells

video: Researchers at Eindhoven University of Technology developed a tiny plastic robot, made of responsive polymers, which moves under the influence of light and magnetism. In this video, the artificial polyp of 1 by 1 cm attracts and grabs a floating oil droplet. The stem moves under the influence of magnetism, thereby creating a current in the water that draws the oil droplet toward the tentacles. The tentacles then close around the oil droplet under the influence of UV light. Blue light can open the tentacles again to release the droplet.

Image: 
Marina Pilz Da Cunha

Inspired by a coral polyp, this plastic mini robot moves by magnetism and light.

Researchers at Eindhoven University of Technology developed a tiny plastic robot, made of responsive polymers, which moves under the influence of light and magnetism. In the future this 'wireless aquatic polyp' should be able to attract and capture contaminant particles from the surrounding liquid or pick up and transport cells for analysis in diagnostic devices. The researchers published their results in the journal PNAS.

The mini robot is inspired by a coral polyp, a small soft creature with tentacles that makes up the corals in the ocean. Doctoral candidate Marina Pilz Da Cunha: "I was inspired by the motion of these coral polyps, especially their ability to interact with the environment through self-made currents." The stem of a living polyp makes a specific movement that creates a current which attracts food particles. Subsequently, the tentacles grab the food particles floating by.

The wireless artificial polyp the team developed is 1 by 1 cm, with a stem that reacts to magnetism and light-steered tentacles. "Combining two different stimuli is rare since it requires delicate material preparation and assembly, but it is interesting for creating untethered robots because it allows for complex shape changes and tasks to be performed," explains Pilz Da Cunha. The tentacles move when light is shone on them, and different wavelengths lead to different results: the tentacles 'grab' under the influence of UV light, while they 'release' with blue light.

FROM LAND TO WATER

The device presented now can grab and release objects underwater, a capability that the light-guided package-delivery mini robot the researchers presented earlier this year lacked. That land-based robot couldn't work underwater, because its polymers act through photothermal effects: the heat generated by the light, rather than the light itself, fueled the robot. Pilz Da Cunha: "Heat dissipates in water, which makes it impossible to steer the robot under water." She therefore developed a photomechanical polymer material that moves under the influence of light alone, not heat.

And that is not its only advantage. Besides operating underwater, this new material can hold its deformation after being activated by light. While a photothermal material immediately returns to its original shape once the stimulus is removed, the molecules in the photomechanical material actually take on a new state. This allows different stable shapes to be maintained for longer periods of time. "That helps to control the gripper arm; once something has been captured, the robot can keep holding it until it is addressed by light once again to release it," says Pilz Da Cunha.
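The latching, wavelength-dependent behavior described above (grab under UV, release under blue, hold shape when the light is off) can be pictured as a tiny state machine. The class and names below are purely an illustrative analogy, not the researchers' control software:

```python
# Toy state machine for the photomechanical gripper described above:
# UV light closes the tentacles, blue light opens them, and with no
# light the current state is latched (the material holds its shape).
class PolypGripper:
    def __init__(self):
        self.state = "open"

    def illuminate(self, light=None):
        if light == "UV":
            self.state = "closed"  # tentacles grab
        elif light == "blue":
            self.state = "open"    # tentacles release
        # light is None: photomechanical latching, state unchanged
        return self.state

g = PolypGripper()
g.illuminate("UV")    # 'closed'
g.illuminate(None)    # still 'closed' -- shape held without any light
g.illuminate("blue")  # 'open'
```

A photothermal material, by contrast, would behave like a spring: removing the light would immediately reset the state to "open".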

FLOW ATTRACTS PARTICLES

By placing a rotating magnet underneath the robot, the stem circles around its axis (see video). Pilz Da Cunha: "It was therefore possible to actually move floating objects in the water towards the polyp, in our case oil droplets."

The position of the tentacles (open, closed or something in between), turned out to have an influence on the fluid flow. "Computer simulations, with different tentacle positions, eventually helped us to understand and get the movement of the stem exactly right. And to 'attract' the oil droplets towards the tentacles," explains Pilz Da Cunha.

OPERATION INDEPENDENT OF THE WATER COMPOSITION

An added advantage is that the robot operates independently of the composition of the surrounding liquid. This is unique, because the dominant stimuli-responsive materials used for underwater applications today, hydrogels, are sensitive to their environment and therefore behave differently in contaminated water. Pilz Da Cunha: "Our robot also works in the same way in salt water, or water with contaminants. In fact, in the future the polyp may be able to filter contaminants out of the water by catching them with its tentacles."

NEXT STEP: SWIMMING ROBOT

PhD student Pilz Da Cunha is now working on the next step: an array of polyps that can work together. She hopes to realize transport of particles, in which one polyp passes a package on to another. A swimming robot is also on her wish list; here, she is thinking of biomedical applications such as capturing specific cells.

To achieve this, the researchers still have to work on the wavelengths to which the material responds. "UV light affects cells, and its depth of penetration into the human body is limited. In addition, UV light might damage the robot itself, making it less durable. We therefore want to create a robot that doesn't need UV light as a stimulus," concludes Pilz Da Cunha.

This research was published in PNAS on July 13th. It was carried out at the Department of Chemical Engineering and Chemistry and the Institute for Complex Molecular Systems of Eindhoven University of Technology. Title: An Artificial Aquatic Polyp that Wirelessly Attracts, Grasps and Releases Objects. DOI: https://doi.org/10.1073/pnas.2004748117

Credit: 
Eindhoven University of Technology

Experts' high-flying study reveals secrets of soaring birds

image: The Andean condor in flight - recording devices revealed it actually flaps its wings for just one per cent of its flight time.

Image: 
Picture: Facundo Vital

New research has revealed that when it comes to flying, the largest of birds don't rely on flapping to move around. Instead they make use of air currents to keep themselves airborne for hours at a time.

The Andean condor - the world's heaviest soaring bird, which can weigh up to 15 kg - actually flaps its wings for just one per cent of its flight time.

The study is part of a collaboration between Swansea University's Professor Emily Shepard and Dr Sergio Lambertucci in Argentina, which uses high-tech flight recorders on Andean condors. These log every wingbeat and every twist and turn in flight as condors search for food.

The team wanted to find out more about how birds' flight efforts vary depending on environmental conditions. Their findings will help to improve understanding about large birds' capacity for soaring and the specific circumstances that make flight costly.

During the study, the researchers discovered that more than 75 per cent of the condors' flapping was associated with take-off.

However, once in the sky condors can sustain soaring for long periods in a wide range of wind and thermal conditions - one bird managed to clock up five hours without flapping, covering around 172 km or more than 100 miles.
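A quick back-of-the-envelope check of the figures in that flight (the variable names are illustrative; the 172 km and five-hour figures are those quoted above):

```python
# Sanity-check the distances quoted above.
KM_PER_MILE = 1.609344   # exact definition of the international mile

flight_km = 172          # distance covered in one five-hour flapless bout
flight_hours = 5

miles = flight_km / KM_PER_MILE
avg_speed = flight_km / flight_hours

print(f"{miles:.0f} miles")      # ~107 miles, i.e. "more than 100 miles"
print(f"{avg_speed:.1f} km/h")   # ~34.4 km/h average ground speed
```

The implied average ground speed of about 34 km/h is our own derived figure, not one reported by the study.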

The findings are revealed in a new paper Physical limits of flight performance in the heaviest soaring bird, which has just been published by Proceedings of the National Academy of Sciences.

Dr Hannah Williams, now at the Max Planck Institute for Animal Behaviour, said: "Watching birds from kites to eagles fly, you might wonder if they ever flap.

"This question is important, because by the time birds are as big as condors, theory tells us they are dependent on soaring to get around.

"Our results revealed the amount the birds flapped didn't change substantially with the weather.

"This suggests that decisions about when and where to land are crucial, as not only do condors need to be able to take off again, but unnecessary landings will add significantly to their overall flight costs."

Professor Shepard, who is part of Swansea Lab for Animal Movement, said that because all the birds they studied were immature, the findings demonstrate that low investment in flight is possible even in the early years of a condor's life.

Closer examination showed the challenges the birds faced as they moved between weak thermals. The condors were seen to flap more as they reached the end of the glides between thermals when they were likely to be closer to the ground.

Dr Lambertucci explained: "This is a critical time as birds need to find rising air to avoid an unplanned landing. These risks are higher when moving between thermal updrafts.

"Thermals can behave like lava lamps, with bubbles of air rising intermittently from the ground when the air is warm enough. Birds may therefore arrive in the right place for a thermal, but at the wrong time."

"This is a nice example of where the behaviour of the birds can provide insight into the behaviour of the air."

Credit: 
Swansea University

Evolution after Chicxulub asteroid impact: Rapid response of life to end-Cretaceous mass extinction

image: Lead author Francisco J. Rodríguez-Tovar in Bremen, Germany, working with the K-Pg core from IODP Expedition 364.

Image: 
Geology and Francisco J. Rodríguez-Tovar

Boulder, Colo., USA: The impact event that formed the Chicxulub crater (Yucatán Peninsula, México) caused the extinction of 75% of species on Earth 66 million years ago, including non-avian dinosaurs. One place that did not experience much extinction was the deep, as organisms living in the abyss made it through the mass extinction event with just some changes to community structure.

New evidence from International Ocean Discovery Program (IODP) Expedition 364 - trace fossils of burrowing organisms that lived in the seafloor of the Chicxulub crater beginning a few years after the impact - shows just how quickly the seafloor ecosystem recovered, with a well-developed tiered community established within ~700,000 years of the event.

In April and May 2016, a team of international scientists drilled into the Chicxulub impact crater. This joint expedition, organized by the International Ocean Discovery Program (IODP) and the International Continental Scientific Drilling Program (ICDP), recovered an extended syn- and post-impact set of rock cores, allowing study of the effects of the impact on life and its recovery after the mass extinction event. The end-Cretaceous (K-Pg) event has been profusely studied and its effects on biota are relatively well known. However, the effect of these changes on the macrobenthic community - the community of organisms living on and in the seafloor that do not leave body fossils - is poorly known.

The investigators concluded that the diversity and abundance of trace fossils responded primarily to variations in the flux of organic matter (i.e., food) sinking to the seafloor during the early Paleocene. Local and regional-scale effects of the K-Pg impact included earthquakes of magnitude 10-11, causing continental and marine landslides, tsunamis hundreds of meters in height that swept more than 300 km onshore, shock waves and air blasts, and the ignition of wildfires. Global phenomena included acid rain, injection of aerosols, dust, and soot into the atmosphere, brief intense cooling followed by slight warming, and destruction of the stratospheric ozone layer, followed by a longer-term greenhouse effect.

Mass extinction events have punctuated the past 500 million years of Earth's history, and studying them helps geoscientists understand how organisms respond to stress in their environment and how ecosystems recover from the loss of biodiversity. Although the K-Pg mass extinction was caused by an asteroid impact, previous ones were caused by slower processes, like massive volcanism, which caused ocean acidification and deoxygenation and had environmental effects that lasted millions of years.

By comparing the K-Pg record to earlier events like the end Permian mass extinction (the so-called "Great Dying" when 90% of life on Earth went extinct), geoscientists can determine how different environmental changes affect life. There are similar overall patterns of recovery after both events with distinct phases of stabilization and diversification, but with very different time frames. The initial recovery after the K-Pg, even at ground zero of the impact, lasted just a few years; this same phase lasted tens of thousands of years after the end Permian mass extinction. The overall recovery of seafloor burrowing organisms after the K-Pg took ~700,000 years, but it took several million years after the end Permian.

Credit: 
Geological Society of America

Mismatched caregiver-infant interactions during feeding could boost babies' risk of later obesity

Infancy is a sensitive developmental period that presents opportunities and challenges for caregivers to feed their infants in ways that support healthy growth and development. A new integrative review examined evidence related to infants' self-regulation of behavior and emotion, and how that relates to interactions when they are fed by their caregivers, including how those interactions may derail infants' ability to regulate their intake of food. The review found that infants who are fed in the absence of hunger or beyond fullness may develop skewed perceptions of hunger and fullness, which could increase their risk of obesity and related health problems later in life.

The review, conducted by researchers at the University of North Carolina at Chapel Hill and the University of North Carolina at Wilmington, appears in Child Development Perspectives, a journal of the Society for Research in Child Development.

"We know that beginning in infancy, interactions with caregivers shape behavioral and physiological foundations of self-regulation, but we don't know as much about how these interactions influence self-regulation of feeding, eating, and energy intake," explains Eric A. Hodges, associate professor of nursing at the University of North Carolina at Chapel Hill's School of Nursing, the article's lead author. "In our work, we examined how the relationship between caregivers and their infants during feeding may affect infants' development, which has implications for the probability of subsequent preventable disease."

The first two years of life are a critical time during which eventual independent eating behavior and self-regulation of energy intake are shaped. Healthy babies appear to have the ability to adjust their energy intake--that is, how much food they ingest--to match their body's physiologic need for growth and development.

In this review of about 50 studies on nutrition, physiology, and psychology, researchers sought to determine how caregivers' feeding of infants affects the role of infants' nervous system in energy intake self-regulation, specifically, the role of the vagus nerve, which runs from the brain through the face and thorax and to the abdomen. The researchers also sought to understand how caregiver-infant interactions may disrupt infants' development of self-regulation of energy intake.

Based on their review, the researchers modified an existing model, linking feeding responsiveness to obesity. According to their model, in addition to caregivers' responsiveness to cues, infants are responsible for the clarity of cues (i.e., cues intended to communicate hunger and fullness). Factors that may affect infants' cues and caregivers' perceptions of them include infants' temperament and inherited traits related to appetite, such as perceived enjoyment of eating, responsiveness to fullness, and pace of eating.

The researchers concluded that feeding in the absence of hunger or feeding beyond fullness may undermine infants' self-regulation of energy intake not only through the learning about eating that happens in feeding interactions, but also through the function of the vagus nerve in communicating hunger and fullness to the brain. This, in turn, could boost infants' risk for subsequent obesity as they develop from complete dependence to increasing independence in eating.

"Our review adds a deeper understanding of the interplay of behavior and underlying autonomic physiology that supports communication of hunger and fullness to the brain," according to Cathi B. Propper, advanced research scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill, who coauthored the article. "While much of the research on responsive feeding has focused on the caregiver's effects on the infant, our model, which encompasses both caregiver and infant, suggests that the infant is evoking behavioral and physiologic responses in the caregiver and is responding to the caregiver's behavior and physiology, too."

The researchers acknowledge that their conclusions have limitations, including the sparse empirical evidence linking the quality of caregiver-child feeding interactions to vagus nerve function. Moreover, they note that other factors could influence infants' digestive behavior, including changes caused by modifications of gene expression.

Credit: 
Society for Research in Child Development

High-fat diet with antibiotic use linked to gut inflammation

image: Image of sections of the colon's inner lining, based on combinations of low-fat diet (LFD) vs. high-fat diet (HFD), and no treatment (mock) vs. with streptomycin (Strep) treatment.

Image: 
UC Davis Health

UC Davis researchers have found that combining a Western-style high-fat diet with antibiotic use significantly increases the risk of developing pre-inflammatory bowel disease (pre-IBD). The study, published July 14 in Cell Host and Microbe, suggests that this combination shuts down the energy factories (mitochondria) in cells of the large intestinal lining, leading to gut inflammation.

Irritable bowel syndrome (IBS) affects approximately 11% of people worldwide. It is characterized by recurring episodes of abdominal pain, bloating and changes in bowel habits. IBS patients with mucosal inflammation and changes in the gut’s microbial composition are considered pre-IBD.

Antibiotic usage with high-fat diet is a risk factor

The study included 43 healthy adults and 49 adult patients diagnosed with IBS. The researchers measured participants' fecal calprotectin, a biomarker of intestinal inflammation. Elevated levels of fecal calprotectin indicated a pre-IBD condition. The study identified 19 patients with IBS as pre-IBD.

The researchers found that participants who consumed a high-fat diet and had used antibiotics were at 8.6 times higher risk of having pre-IBD than those on a low-fat diet with no recent history of antibiotic use. Participants with the highest fat consumption were about 2.8 times more likely to have pre-IBD than those with the lowest fat intake. A history of recent antibiotic use alone was associated with a 3.9 times higher likelihood of having pre-IBD.

 “Our study found that a history of antibiotics in individuals consuming a high-fat diet was associated with the greatest risk for pre-IBD,” said Andreas Bäumler, professor of medical microbiology and immunology and lead author on the study. “Until now, we didn’t appreciate how different environmental risk factors can synergize to drive the disease.”

Shutting the cell’s powerhouse promotes gut microbial growth

Using mouse models, the study also tested the effect of a high-fat diet and antibiotic use on the cells of the intestinal lining. It found that a high-fat diet and antibiotics cooperate to disrupt the work of the cells' mitochondria, shutting down their ability to burn oxygen. This disruption reduces the cells' oxygen consumption and leads to oxygen leaking into the gut.

The body’s beneficial bacteria thrive in environments lacking oxygen such as the large intestine. Higher oxygen levels in the gut promote bacterial imbalances and inflammation. With the disruption in the gut environment, a vicious cycle of replacing the good bacteria with potentially harmful proinflammatory microbes that are more oxygen tolerant begins. This in turn leads to mucosal inflammation linked to pre-IBD conditions.

The study also identified 5-aminosalicylate (mesalazine), a drug that restarts the energy factories in the intestinal lining, as a potential treatment for pre-IBD.

“The best approach to a healthy gut is to get rid of the preferred sustenance of harmful microbes,” Lee said. “Our study emphasized the importance of avoiding high fat food and abuse of antibiotics to avoid gut inflammation.”

Credit: 
University of California - Davis Health

New bioink for cell bioprinting in 3D

image: PhD student Sajjad Naeimipour preparing the 3D printer.

Image: 
Magnus Johansson

A research group led by Daniel Aili, associate professor at Linköping University, has developed a bioink to print tissue-mimicking material in 3D printers. The scientists have developed a method and a material that allow cells to survive and thrive.

"Bioprinting is a new and exciting technology to manufacture three-dimensional tissue-mimicking cell cultures. It has been a major problem to develop the bioink required, i.e. a material that can encapsulate the cells and be used in printers. Our bioink has several exciting properties that open new opportunities to approach our vision - creating tissue and organs in the laboratory", says Daniel Aili, associate professor in the Division of Biophysics and Bioengineering at Linköping University.

The properties of the ink can be modified as required and they have achieved excellent results in tests when using the material with different cell types: liver cells, heart cells, nerve cells and fibroblasts (a type of cell found in connective tissue). The research group has also solved one of the major challenges when attempting to print organic material: they have found a method that allows the cells to survive and thrive, despite the harsh treatment they receive. The results have just been published in the journal Biofabrication.

The ink the group has developed contains hyaluronan and synthetic molecules similar to proteins, known as peptides. These are bound together in a water-rich network, a hydrogel, that functions as a scaffolding for the cells.

"We can use some advanced chemical techniques to control how rapidly the hydrogel forms, in other words the transition from liquid to a gel that gently encapsulates the cells", says Daniel Aili.

The scientists have developed a modular system, like Lego bricks, in which different components can be combined to create different types of hydrogel. The hydrogels provide mechanical support to the cells and encapsulate them without damaging them. They can also control cell growth and behaviour. A system of various peptides makes it possible to modify the properties to control the cells and incorporate various functionalities. One example from the wide array possible is to attach an enzyme that stimulates the growth of bone.

"We are one of the first research groups that can change the material properties both before and after it is printed. We can, for example, increase the degree of cross-linking during the process to provide more stability to the material, and we can change the biochemical properties. We can also adapt the material to different types of cells. This is a further step on the way to mimicking the support structures that surrounds most human cells, the extracellular matrix", says Daniel Aili.

Since the material is dynamic and can be given tailored properties when used as bioink in 3D printing, the result of the research is referred to as a 4D printed biomaterial - yet another step closer to mimicking the body's own functions.

"Our work is quite basic research, but we are aware that there is a huge medical need for tissue, and for better and biologically relevant models for drug development, not least as a replacement for animal experiments. Progress is rapid in this field at the moment", Daniel Aili concludes.

Credit: 
Linköping University

New PET radiotracer proven safe and effective in imaging malignant brain tumors

image: Representative maximum-intensity projection PET images of a healthy human volunteer injected with 64Cu-NOTA-EB-RGD at 1, 8, and 24 hours after injection. Axial MRI and PET slices of glioblastoma patient injected with 64Cu-NOTA-EB-RGD at different time points after injection.

Image: 
Jingjing Zhang et al., Peking Union Medical College Hospital, Beijing, China/ Xiaoyuan Chen et al., Laboratory of Molecular Imaging and Nanomedicine, NIBIB/NIH, Bethesda, USA

A first-in-human study presented at the Society of Nuclear Medicine and Molecular Imaging 2020 Annual Meeting has demonstrated the safety and the favorable pharmacokinetic and dosimetry profile of 64Cu-EBRGD, a new, relatively long-lived PET tracer, in patients with glioblastomas. The radiotracer proved to be a superior, high-contrast imaging diagnostic, visualizing tumors that express low or moderate levels of αvβ3 integrin with high sensitivity.

Glioblastoma is the most common and most aggressive primary malignant brain tumor in adults, with 17,000 diagnoses annually. It is a highly diffuse and invasive disease that is personally devastating and virtually incurable. Once diagnosed, most patients survive less than 15 months, and fewer than five percent survive five years.

The 64Cu-EBRGD radiotracer presented in this study has several unique qualities. The peptide sequence Arg-Gly-Asp (RGD) specifically targets the cell surface receptor αvβ3 integrin, which is overexpressed in glioblastomas. To slow clearance, Evans Blue (EB) dye, which reversibly binds to circulating albumin, is bound to RGD, significantly enhancing target accumulation and retention. The addition of the 64Cu label to EBRGD provides persistent, high-contrast diagnostic images in glioblastoma patients.

This first-in-human, first-in-class study included three healthy volunteers who underwent whole-body 64Cu-EBRGD PET/CT. Safety data--including vital signs, physical examination, electrocardiography, laboratory parameters and adverse events--were collected after one day and after one week. Regions of interest were drawn, time-activity curves were obtained and dosimetry was calculated. Two patients with recurrent glioblastoma also underwent 64Cu-EBRGD PET/CT. Seven sets of brain PET and PET/CT scans were obtained over two consecutive days. Tumor-to-background ratios were calculated for the target tumor lesion and normal brain tissue. One week after radiotracer administration, the patient underwent surgical treatment, and immunohistochemical staining of tumor samples was performed.

64Cu-EBRGD was well-tolerated in patients with no adverse symptoms immediately or up to one week after administration. The mean effective dose of 64Cu-EBRGD was very similar to the effective dose of an 18F-FDG scan. Injection of 64Cu-EBRGD to the patients with recurrent glioblastoma showed high accumulation at the tumor with continuously increased tumor-to-background contrast over time. Post-operative pathology revealed World Health Organization grade IV glioblastoma, and immunohistochemical staining showed moderate expression of the αvβ3 integrin.

"In this study, we have demonstrated a potential radiotheranostic agent that is safe, sensitive and highly selective in humans, which infers a future diagnostic tool and breakthrough targeted radiotherapy for glioblastoma patients," said Jingjing Zhang, MD, PhD, of Peking Union Medical College Hospital, Beijing, China. "We believe this innovative use of 64Cu-EBRGD will significantly improve therapeutic efficacy and patient outcomes."

"64Cu-labeled EBRGD represents a viable model compound for therapeutic applications since 177Lu, 90Y or 225Ac can be substituted for 64Cu," said Deling Li, MD, of Beijing Tiantan Hospital, Capital Medical University, Beijing, China. "We are currently studying the 177Lu homolog to treat glioblastoma and other αvβ3 integrin expressing cancers, including non-small cell lung, melanoma, renal and bone, and hope to build on the current wave of radiotherapies like 177Lu-DOTATATE."

Credit: 
Society of Nuclear Medicine and Molecular Imaging

When calling loudly, echolocation is costly for small bats

image: Nathusius's pipistrelle bat

Image: 
Photo: René Janssen

Calling in the ultrasonic range enables small bats to orient themselves in the dark and track down tiny insects. Louder calls travel farther, improving a bat's ability to detect its prey. It was long assumed that echolocation contributes little to the energy expenditure of flight because individuals simply couple their calls with the beat of their wings. Scientists at the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) in Berlin have now shown that high-intensity echolocation calls are by no means free and contribute substantially to energy expenditure. Bats must therefore find a balance between energy expenditure and effective echolocation, and use loud calls economically.

For many animals, vocalisations are essential for survival. With their calls, roars, croaks, chirps or songs, animals attract potential mating partners, repulse competitors or locate prey. These sounds can be deafening. A bison, for instance, roars at up to 127 decibels (dB), some birds reach 132 dB and sea lions even manage 137 dB! For comparison: a sound pressure level of 110 dB is equivalent to the sound of a jet engine 100 m away. Despite their small size, bats can also reach a sound pressure level of 137 dB, placing them amongst the loudest animals in the world. But because of their high frequency, these sounds are inaudible to the human ear.

In principle, generating a higher sound pressure level is associated with higher energy costs. If a bat in search of prey wants to increase the distance over which its echolocation calls travel, it has to call louder, which should cost more energy. Until now, the prevailing opinion amongst scientists was that, at least in flight, bats can boost the sound pressure level of their calls without additional energy expenditure. This is because they synchronize the contraction of the abdominal wall, necessary for sound production, with the contractions of the large, active flight muscles. According to conventional wisdom, the pressure generated by the wing beat is sufficient to support the production of very loud echolocation calls. The energy expenditure of flying bats should therefore remain more or less the same, regardless of whether they call softly or loudly.

A team of scientists from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) in Germany and Tel Aviv University in Israel has now shown that this is not the case. In their experiments, they allowed Nathusius' pipistrelle bats (Pipistrellus nathusii) to fly freely in a wind tunnel under controlled conditions. Using loudspeakers, the researchers generated a loud background noise inside the tunnel. This encouraged the bats to drown out the noise with more intense echolocation calls. Before flying in the wind tunnel, the animals had received an isotonic solution of 13C-labelled sodium bicarbonate, which is exhaled as carbon dioxide during breathing, a proxy for metabolic rate. From the isotopic composition of the breathing air before and after the flight, the scientists determined the animals' energy expenditure when flying in the tunnel.

"When bats were flying with only the noise from the wind tunnel, echolocation intensity was 113 dB on average" says Leibniz-IZW scientist Shannon Currie, joint first author of the study. "But when flying in background noise of 109dB, the bats increased their echolocation intensity to an average of 128 dB." Since the sound pressure level follows a logarithmic scale, the bat calls were actually 30 times (!) louder when there was a high background noise in the wind tunnel.

This had significant effects on energy expenditure. Metabolic power rose by 0.12 watts when bats were calling 15 dB louder. If a bat were to maintain this high sound pressure level throughout a typical night-time foraging flight, it would have to catch about 0.5 grams of additional insect prey to compensate for the added energy expenditure, or one fourteenth of their own body mass. This is an enormous amount for an animal that itself weighs only seven grams.

"Our study illustrates that in bats the coupling of the abdominal wall movement with the flight muscle contractions alone is not sufficient to produce very loud calls," explains Christian Voigt, head of the Department of Evolutionary Ecology at the Leibniz-IZW. "We therefore assume that with more intense echolocation, additional muscles must become active to support the production of sound. This clearly costs a great deal of energy - especially above 130 dB. A bat in search of prey cannot increase at will the intensity and thus the distance over which its calls range. Instead, it must use loud calls economically and find a good compromise between the associated energy expenditure and the efficiency of echolocation."

Credit: 
Forschungsverbund Berlin

Burrowing crabs reshaping salt marshes, with climate change to blame

image: A new study reveals how climate change has enabled a voracious crab species to dramatically alter salt marsh ecosystems across the southeastern U.S.

Image: 
Christine Angelini

PROVIDENCE, R.I. [Brown University] -- A new study reveals how climate change has enabled a voracious crab species to dramatically alter salt marsh ecosystems across the southeastern U.S.

The study, published in Proceedings of the National Academy of Sciences, shows that soils beneath salt marshes from South Carolina to Florida have been softened by higher sea levels and increased tidal inundation. That softening has allowed the burrowing crab species Sesarma reticulatum to thrive, feeding on the cordgrass that holds the marshes together.

The clearing of grass by crabs has dramatically altered the flow of creeks that run through the marshes, the study found, and is altering the dynamics between predator and prey species in the marshes. In fact, the researchers say that Sesarma, which had previously been a minor player in southeastern salt marshes, can now be considered a keystone species, meaning it plays a dominant role in shaping the ecosystem.

"What we've found is an example of how sea level rise can activate a keystone species that's now dramatically remodeling these salt marshes," said Mark Bertness, a professor emeritus of ecology and evolutionary biology at Brown University and a coauthor of the research. "That's a big deal because sea level rise is a pervasive global phenomenon, and this is a largely unexpected consequence. We need to start thinking about how global climate change could activate new keystone species in other ecosystems."

Research on Sesarma crabs and their impact on salt marshes has a long history in Bertness's lab at Brown. In 2011, Bertness and his students discovered that Sesarma, voracious grazers of cordgrass roots and leaves, were behind sudden die-offs of marshes on Cape Cod. In that case, overfishing had suddenly pulled predator species like striped bass out of the water, giving the crabs free rein to decimate the marshes. One of the undergraduate co-authors on that earlier research was Christine Angelini, now an associate professor at the University of Florida and a senior author on this new paper.

Sesarma were known to inhabit southern marshes in Florida and the Carolinas, but their populations hadn't boomed like those further north. One potential reason for that was differing soil substrates. While working several years ago as an undergraduate researcher in Bertness's lab, Sinead Crotty, now project director at Yale's Carbon Containment Lab, showed that ground hardness played a big role in where Sesarma are able to establish themselves. Her findings indicated that Sesarma had a much easier time building burrows and feeding on grass roots in the peaty New England soil compared to harder soil substrates often found in southern marshes.

But as sea levels continue to rise due to climate change, Crotty, Angelini and Bertness wondered if softening soils might be giving Sesarma more of a foothold in the South. Looking at aerial photos from nine locations across South Carolina and Florida, they found that the number of marsh creeks with evidence of Sesarma grazing increased by up to 240% from the late 1990s to the late 2010s. Meanwhile, surveys of sea level rise show that the ground in these areas is tidally submerged up to an hour longer per day now compared to the late 1990s.

"You've got the sea level rising, which softens the substrate that these crabs usually can't burrow in," Bertness said. "Now that it's softer you've got an ideal habitat to support these huge communal Sesarma burrows."

This new Sesarma activity is reshaping marshes, the researchers found. The elimination of grasses has increased the rate at which creeks form in the marshes and has raised the drainage density of marsh creeks by up to 35%.

Sesarma activity is also influencing interactions between predators and prey in the creeks. The clearing of grasses gives predators increased access to shellfish and other prey species. The researchers found that mussel populations were dramatically lower in Sesarma-grazed creeks than in creeks that weren't grazed.

"As they drown, southeastern U.S. marshes are fracturing from grasslands to patches of marsh, with depleted populations of mussels, snails and other invertebrates," Angelini said. "These dynamics reveal how quickly marshes may disappear with accelerating sea level rise and how long they will remain foraging grounds for commercially, recreationally and ecologically important species."

The fact that Sesarma is now altering both the geomorphology of the marshes and the ecological interactions among other species is evidence that it qualifies as a keystone species in southern marshes. This is the first documented example, the researchers say, of a new keystone species being activated as a result of anthropogenic climate change.

"This is going to be something for the textbooks," Bertness said. "This is an underappreciated way in which climate change alters ecosystems."

Credit: 
Brown University