
The effect of military training on the sense of agency and outcome processing

People may report a reduced feeling of responsibility when they comply with orders in situations of asymmetric power, such as hierarchies. Previous research has shown that complying with orders reduces the sense of agency, that is, the feeling that you are the author of your own actions and thus responsible for the outcomes. Obeying orders also reduces the brain's processing of the outcomes of actions, compared to outcomes that are chosen freely. These results could help to explain why people can commit atrocious acts under coercion: their sense of agency and their feeling of responsibility for the outcomes of their actions could be attenuated, as the brain reduces outcome processing, compared to scenarios where individuals decide for themselves on a course of action. The authors note that the psychological finding of a reduced sense of agency under coercion should be considered quite distinct from the important legal and moral questions about the extent to which people are responsible for their actions when coerced.

These effects have been shown in experiments with adult volunteers, for whom coercion should be a rare (and generally illegal) occurrence. In contrast, some social structures, such as the armed forces, rely on strict hierarchical organization in which people are required to follow orders. The professional role of military personnel implies compliance with hierarchical authority, based on the mandate society has given to that authority. In the present research, published in Nature Communications today, the authors wondered to what extent the hierarchical organization of military environments might influence the experience of agency and outcome processing. "We wanted to investigate how a military environment would affect our experimental measures of agency and outcome processing. We also wanted to compare results for different military ranks: the officers who typically give orders, and the subordinates who receive them", explains Dr. Emilie Caspar, first author of the present study. "Officers are trained to be accountable for their own actions, and also for the actions of troops under their command. We predicted that their experience of being a responsible agent might therefore be enhanced, compared to subordinates", she adds.

In the present study, the authors used a simple laboratory paradigm in which two volunteers took turns in the roles of 'agent' and 'victim'. The word 'victim' is used figuratively to describe the role in the experiments, which were conducted with ethical approval and informed consent, in accordance with the Declaration of Helsinki. All participants were free to end their participation at any time, without giving reasons and without personal or professional consequences. Agents were either free to decide whether or not to administer a mildly painful shock to the 'victim' in exchange for a +0.05€ monetary reward, or were coercively instructed by an experimenter to deliver, or withhold, the same shock. During the task, the authors used an implicit measure of the sense of agency based on time perception, and also recorded the agents' brain activity with electroencephalography (EEG). The team calculated a 'coercion effect', defined as the difference between the coercion condition and the free-choice condition. In a first study, the authors tested a group of civilians and a group of junior cadets, that is, officers in their first year of military training.

"Results of this study showed that for civilians, we replicated previous results. A useful implicit marker of sense of agency is the experience that your action and its outcome are compressed together in time. The interval between actions and outcomes that were freely chosen seemed shorter than the interval between the same actions and the same outcomes in a coercive condition. So, civilian participants appeared to have a reduced sense of agency in the coercive condition compared to the free-choice condition. However, for junior military cadets, there was no coercion effect: they showed no difference in our measure of agency between a condition in which they were following orders and a condition in which they could freely decide which action to execute", reports prof. Axel Cleeremans, co-senior author of the study. These results could reflect the negative influence of working in a highly hierarchical context on the sense of agency.

In a second study, the authors compared three groups of military personnel: junior cadets, privates with an average of 5 years' military experience, and senior cadets who had reached the rank of first lieutenant, again after 5 years' military experience, but this time of officer training. Results again showed no coercion effect on agency for junior cadets, and also showed no coercion effect for privates.

However, the senior cadets showed the standard coercion effect, with more agency in the free-choice condition than in the coercion condition. Further, senior cadets showed no attenuation in a brain measure of outcome processing, while privates did. Thus, both working in a military environment and one's position within the military hierarchy appeared to influence the sense of agency and outcome processing.

Importantly, these results are not simply explained by a selection process favouring junior cadets with a high sense of agency. An additional analysis conducted 3 years after the second study found that the junior cadets whose sense of agency and outcome processing were most attenuated in the free-choice condition relative to seniors were in fact those who persevered in the military, while those who maintained a high sense of agency and outcome processing were more likely to leave.

"These results have deep societal implication, both for civilian society and hierarchical organizations such as the military" says Prof. Patrick Haggard, co-Senior author of the study. "If they are followed up in longitudinal studies, they open the possibility of training the sense of individual responsibility, which could have major benefits". Dr. Major Salvatore Lo Bue, coauthor of the study and professor at the Royal Military Academy of Belgium added: "The contemporary soldier operates in volatile, uncertain, complex, and ambiguous environments in which he/she sometimes has to make critical decisions on his/her own. He/she does not simply execute orders given by others. Insights about the underlying mechanisms of agency in the military environment can help to tailor the training to enable the soldier to make the swift switch between obedience when required, and making his/her own decisions when necessary."

Credit: 
Université libre de Bruxelles

From virtual to reality! Virtual training improves physical and cognitive functions

image: The virtual body as experienced by the user

Image: 
Tohoku University

Researchers at the Smart-Aging Research Center (IDAC) at Tohoku University have developed an innovative training protocol that, utilizing immersive virtual reality (IVR), leads to real physical and cognitive benefits.

We all know that physical exercise is crucial for overall well-being and helps postpone aging-related disorders; what is more surprising is that physical exercise can have beneficial effects not only on the body but on cognitive functions too. Unfortunately, physical activities are not always possible for people suffering or recovering from long-term diseases.

IVR, which allows the creation of a realistic virtual world that we can explore with a virtual body, can help solve this problem. It sounds unreal, but the illusion is so effective that even when the person is sitting still while the virtual body walks, they feel as if they are moving - and it even generates comparable physiological reactions.

Professor Ryuta Kawashima, director of IDAC, led the team of researchers to explore whether virtual training can have benefits for cognitive functions similar to those of physical exercise. Healthy, young participants underwent the virtual training protocol. Wearing an IVR headset while sitting, they saw a virtual body (also called an avatar) displayed from a first-person perspective. This created the illusory feeling of being the avatar itself. The virtual body alternated between 30 seconds of walking and 30 seconds of running for 8 minutes.

Researchers found that participants' heart rate increased in step with the virtual movements, despite the fact that subjects were completely still. More importantly, cognitive functions (specifically, executive functions) and their neural basis were tested before and after the virtual training. The results showed that participants improved their cognitive performance (specifically, they were faster), as confirmed by increased activation of the relevant brain area (specifically, the left dorsolateral prefrontal cortex).

"The application of immersive virtual reality for clinical purposes is often doubted because it was originally designed for entertainment," says Professor Dalila Burin, who developed and conducted the experiment. "But this study proves that training protocols in IVR can be useful for people with motor impairments to have comparable benefits to real physical activity." Professor Burin adds, "It is also beneficial for people who want to start exercising in an entertaining and safe way."

By introducing virtual reality technology into the field of cognitive neuroscience, researchers aim to provide clinical solutions for patients and also to contribute to theoretical models of body representation and motor control.

Credit: 
Tohoku University

Ultraviolet B exposure expands proenkephalin+ regulatory T cells with a healing function

image: UVB irradiation induces proliferation and activation of skin Treg cells. UVB-expanded skin Treg (UVB-skin Treg) cells promote wound healing by producing enkephalin and amphiregulin (AREG), which enhance keratinocyte growth/proliferation to repair skin wounds.

Image: 
Department Immunology, Nagoya City University Graduate School of Medical Sciences

Ultraviolet B (UVB) is used as an effective therapy for individuals with psoriasis and atopic dermatitis because it has an immunosuppressive effect. The immune system develops to protect the body from infection and cancer, and its activation affects many physiological functions. Regulatory T (Treg) cells, which express CD25 and Foxp3, constitute about 5-10% of peripheral CD4+ T cells and act as a brake on the immune system by suppressing various immune responses. Researchers from Nagoya City University previously showed that skin Treg cells were expanded by UVB to up to about 60% of CD4+ T cells. Here they found that UVB-expanded skin Treg (UVB-skin Treg) cells have a healing function. UVB-skin Treg cells expressed proenkephalin (PENK), an endogenous opioid precursor, and amphiregulin (AREG), an epidermal growth factor receptor ligand, which promoted wound healing in vivo and keratinocyte outgrowth in a skin explant assay. These results provide new insight for developing a novel therapy using PENK+ UVB-skin Treg cells.

Credit: 
Nagoya City University

Legal performance-enhancing substances associated with future problematic alcohol use

A new study published in the journal Pediatrics found that young adults aged 18-26 who used legal performance-enhancing substances were significantly more likely to report several forms of problematic alcohol use and drinking-related risk behaviors seven years later. This relationship was especially strong among men.

The study, which analyzed a sample of over 12,000 U.S. participants from the National Longitudinal Study of Adolescent Health (Add Health), highlights the need for more research and government oversight and regulation of legal performance-enhancing substances.

"The results from our study are concerning given the common use of legal performance-enhancing substances among young people, particularly boys and men," says lead author Kyle T. Ganson, PhD, MSW, assistant professor at the University of Toronto's Factor-Inwentash Faculty of Social Work.

Performance-enhancing substances can be legal, such as creatine monohydrate or protein powders, or illegal, such as anabolic-androgenic steroids. Research has consistently shown adverse health and social outcomes due to illegal, unprescribed steroid use, but few studies have been conducted to identify outcomes associated with legal performance-enhancing substance use.

The researchers found that men who used legal performance-enhancing substances were more likely to experience five alcohol use problems and risk behaviors. These included binge drinking, getting hurt or engaging in risky behaviors while under the influence of alcohol, experiencing legal problems while under the influence of alcohol, continued alcohol use despite emotional or physical health problems, and reduced activities and socialization because of alcohol use.

"Risky alcohol use is a serious problem for adult men, who have higher rates of death associated with alcohol use compared to women," said Dr. Ganson. "Problematic alcohol use ultimately impedes economic and employment success, and increases health care and law enforcement costs."

Ganson hypothesizes that the social pressure that boys and men feel to achieve a lean and muscular body type may explain the different results between genders. "For most boys and men, this body ideal is unattainable, leading to performance-enhancing substance use," he says. "This body image contrasts with the thin ideal for girls and women."

There are other reasons to be concerned about legal performance-enhancing substances as well.

"Legal performance-enhancing substances are unregulated by the Food and Drug Administration," said senior author Jason M. Nagata, MD, MSc, assistant professor at the University of California, San Francisco's Department of Pediatrics. "These substances are also commonly mislabeled and may contain harmful ingredients, such as anabolic steroids, which can lead to heart, liver, and kidney problems and worsen mental health."

An earlier study, led by Dr. Nagata and published in JAMA Pediatrics, also showed a relationship between legal performance-enhancing substances and later use of illegal anabolic-androgenic steroids.

The study's authors say health professionals and policy makers need to adjust their practices and goals to account for the gateway-like relationship they have observed between legal performance-enhancing substance use and later problematic alcohol use.

"Health professionals should screen for these behaviors and counsel young people about potential health risks," says Ganson. "We also need state and federal policymakers to begin to take these substances seriously and recognize the adverse effects they have on youth."

Several states, including Massachusetts, California, and New York, are making efforts to regulate the sale of these substances to young people.

"These legislative efforts are a great start and we need to get them passed into law," says Ganson. "I hope that we see more efforts from federal regulators."

Credit: 
University of Toronto

Discovery of an ancient dog species may teach us about human vocalization

image: Photograph taken of a Highland Wild Dog in Indonesia

Image: 
New Guinea Highland Wild Dog Foundation

In a study published in PNAS, researchers used conservation biology and genomics to discover that the New Guinea singing dog, thought to be extinct for 50 years, still thrives. Scientists found that the ancestral dog population still stealthily wanders in the Highlands of New Guinea. This finding opens new doors for protecting a remarkable creature that can teach biologists about human vocal learning. The New Guinea singing dog can also be utilized as a valuable and unique animal model for studying how human vocal disorders arise and finding potential treatment opportunities. The study was performed by researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, Cenderawasih University in Indonesia, and other academic centers.

The New Guinea singing dog was first studied in 1897 and became known for its unique and characteristic vocalization, with the ability to produce pleasing and harmonic sounds with tonal quality. Only 200-300 captive New Guinea singing dogs exist in conservation centers, with none seen in the wild since the 1970s.

"The New Guinea singing dog that we know of today is a breed that was basically created by people," said Elaine Ostrander, Ph.D., NIH Distinguished Investigator and senior author of the paper. "Eight were brought to the United States from the Highlands of New Guinea and bred with each other to create this group."

According to Dr. Ostrander, a large amount of inbreeding within captive New Guinea singing dogs changed their genomic makeup by reducing the variation in the group's DNA. Such inbreeding is why the captive New Guinea singing dogs have most likely lost a large number of genomic variants that existed in their wild counterparts. This lack of genomic variation threatens the survival of captive New Guinea singing dogs. Their origins, until recently, had remained a mystery.

Another New Guinea dog breed found in the wild, called the Highland Wild Dog, has a strikingly similar physical appearance to the New Guinea singing dogs. Considered to be the rarest and most ancient dog-like animal in existence, Highland Wild Dogs are even older than the New Guinea singing dogs.

Researchers previously hypothesized that the Highland Wild Dog might be the predecessor to captive New Guinea singing dogs, but the reclusive nature of the Highland Wild Dog and lack of genomic information made it difficult to test the theory.

In 2016, in collaboration with the University of Papua, the New Guinea Highland Wild Dog Foundation led an expedition to Puncak Jaya, a mountain summit in Papua, Indonesia. They reported 15 Highland Wild Dogs near the Grasberg Mine, the largest gold mine in the world.

A follow-up field study in 2018 allowed researchers to collect blood samples from three Highland Wild Dogs in their natural environment as well as demographic, physiological and behavioral data.

NHGRI staff scientist Heidi Parker, Ph.D., led the genomic analyses, comparing the DNA from captive New Guinea singing dogs and Highland Wild Dogs.

"We found that New Guinea singing dogs and the Highland Wild Dogs have very similar genome sequences, much closer to each other than to any other canid known. In the tree of life, this makes them much more related to each other than modern breeds such as German shepherd or bassett hound," Dr. Parker said.

According to the researchers, the New Guinea singing dogs and the Highland Wild Dogs do not have identical genomes because of their physical separation for several decades and due to the inbreeding among captive New Guinea singing dogs--not because they are different breeds.

In fact, the researchers suggest that the vast genomic similarities between the New Guinea singing dogs and the Highland Wild Dogs indicate that Highland Wild Dogs are the wild and original New Guinea singing dog population. Hence, despite different names, they are, in essence, the same breed, proving that the original New Guinea singing dog population is not extinct in the wild.

The researchers believe that because the Highland Wild Dogs contain genome sequences that were lost in the captive New Guinea singing dogs, breeding some of the Highland Wild Dogs with the New Guinea singing dogs in conservation centers will help generate a true New Guinea singing dog population. In doing so, conservation biologists may be able to help preserve the original breed by expanding the numbers of New Guinea singing dogs.

"This kind of work is only possible because of NHGRI's commitment to promoting comparative genomics, which allows researchers to compare the genome sequences of the Highland Wild Dog to that of a dozen other canid species," Dr. Ostrander said.

Although New Guinea singing dogs and Highland Wild Dogs are a part of the dog species Canis lupus familiaris, researchers found that each contains genomic variants across its genome that do not exist in the other dogs we know today.

"By getting to know these ancient, proto-dogs more, we will learn new facts about modern dog breeds and the history of dog domestication," Dr. Ostrander said. "After all, so much of what we learn about dogs reflects back on humans."

The researchers also aim to study New Guinea singing dogs in greater detail to learn more about the genomics underlying vocalization (a field that, to date, heavily relies on birdsong data). Since humans are biologically closer to dogs than birds, researchers hope to study New Guinea singing dogs to gain a more accurate insight into how vocalization and its deficits occur, and the genomic underpinnings that could lead to future treatments for human patients.

Credit: 
NIH/National Human Genome Research Institute

Warning witnesses of the possibility of misinformation helps protect their memory accuracy

image: Unwarned participants were more susceptible to misinformation and displayed greater auditory activity associated with hearing the misinformation. Warned participants were less susceptible to misinformation and displayed greater visual activity associated with witnessing the original crime.

Image: 
Jessica M. Karanian, Fairfield University

MEDFORD/SOMERVILLE, Mass. (August 31, 2020)--Warning witnesses about the threat of misinformation--before or after an event--significantly reduces the negative impact of misinformation on memory, according to new research performed at Tufts University.

The study, published online today in the Proceedings of the National Academy of Sciences (PNAS), revealed that warnings can protect memory from misleading information by influencing the reconstructive processes at the time of memory retrieval. The researchers also found that, when making accurate "memory decisions," participants who were warned about the threat of misinformation displayed increased content-specific neural activity in regions of the brain associated with the encoding of actual event details. Likewise, warnings reduced content-specific neural activity in brain regions associated with the encoding of misleading event details.

The authors said the findings could have important implications for improving the accuracy of everyday memory and eyewitness testimony as part of the legal system.

"Memory is notoriously fallible and susceptible to error but our results show that a simple alert about possible misinformation is an effective tool to help eyewitnesses think back to the actual experience--with accuracy," said Ayanna Thomas, a psychology professor in the School of Arts and Sciences at Tufts, who is a co-author and co-principal investigator of the study. "We expect this work could enhance interview procedures and protocols, specifically within the U.S. criminal justice system, which would benefit as a public entity from low-cost interview practices to improve the accuracy of eyewitness reports."

The study involved a total of 161 people who engaged in two experiments--one behavioral and one using neuroscientific methods--in which they watched a silent film depicting a crime and responded to a test of recognition memory. In the first experiment, participants listened to an audio narrative that described the crime which included consistent details, misleading details and neutral details. After the audio narrative, participants were given a final recognition memory test that assessed memory for the original witnessed event. Importantly, participants were randomly assigned into one of three warning groups: no warning, pre-warning and post-warning.

The results demonstrated that people who were warned about the inaccuracy of the retelling were less susceptible to the misinformation than people who were not warned. The researchers also found that there was little difference whether participants were warned before or after.

The second experiment involved a neuroimaging analysis designed to investigate the mechanisms by which warnings influence memory accuracy in the context of misinformation. In this evaluation, participants completed the identical tasks as they did in the first experiment but completed the final memory test while undergoing functional magnetic resonance imaging (fMRI). The researchers found warnings not only increased reinstatement of visual activity associated with witnessing the actual crime but also decreased reinstatement of auditory activity associated with hearing misleading post-event information.

"This suggests that warnings increase memory accuracy by influencing whether we bring back to mind details from accurate or inaccurate sources of information," said Elizabeth Race, a psychology professor at Tufts who also serves as a co-author and co-principal investigator of the study. Additionally, the strength of the content-specific cortical reactivation in visual and auditory regions predicted behavioral performance and the susceptibility of memory to misinformation.

"Together, these results provide novel insight into the nature of memory distortions due to misinformation and the mechanisms by which misinformation errors can be prevented," said Jessica M. Karanian, first and corresponding author of the study, who was a postdoctoral fellow at Tufts while conducting the research and is now an assistant professor of psychology at Fairfield University. "The adoption of these interview practices by police departments might protect the integrity of eyewitness accounts and improve the likelihood of just outcomes for all involved."

Past research on memory retrieval found that both prospective and retrospective warnings were able to mitigate the negative effect of misinformation on memory. However, this study not only affirms the beneficial effect of warning using a realistic scenario, such as testifying in a criminal case, but also demonstrates that prospective warnings can reduce misinformation errors in the context of repeated testing.

According to the Innocence Project, false eyewitness reports have contributed to approximately 70 percent of wrongful convictions. Despite efforts by the U.S. Department of Justice to create evidence-based standards in interview practices across police departments, the majority of surveyed police departments have not implemented such recommendations due to challenges in providing training and/or a lack of resources.

The researchers noted that since eyewitnesses are often interviewed and questioned multiple times throughout an investigation, a third party--such as a police officer or an attorney--could provide a warning to an eyewitness that alerts them to the possibility of inaccuracies in post-event information that they might encounter in the future. The misinformation they might encounter could come from sources in the news, on social media or from a co-witness.

"We can envision the establishment of a new protocol for witness questioning in which witnesses are warned about the unreliability of subsequently-encountered information upon completing an initial interview," said Karanian. "We could also imagine that such pre-warnings might be useful if provided at the beginning of some interviews that may include unverified information reported by co-witnesses."

The research was a collaboration between Thomas's Cognitive Aging and Memory Lab at Tufts University, which investigates interactions between memory and metamemory to better understand the important role metamemory plays in memory acquisition, distortion and access; and Race's Integrative Cognitive Neuroscience Lab at Tufts, which uses a combination of cognitive neuroscience methods to answer questions about the neural mechanisms underlying human learning and memory.

Credit: 
Tufts University

Prior health insurance coverage disruptions linked to issues with healthcare access

ATLANTA - AUGUST 31, 2020 - A new American Cancer Society study finds health insurance coverage disruptions in the prior year led to issues with healthcare access and affordability for currently insured cancer survivors. The study appears in Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

Little is known about the effects of health insurance coverage disruptions on access to healthcare among cancer survivors. To learn more, investigators led by Jingxuan Zhao, MPH, estimated the prevalence of health insurance coverage disruptions and evaluated their associations with access to healthcare and affordability among cancer survivors aged 18-64 years in the United States using national data from years 2011 to 2018. Health insurance coverage disruption was measured as self-reports of any time in the prior year without coverage.

They found that approximately 260,000 currently insured cancer survivors aged 18-64 years had experienced coverage disruptions in 2018. Among both privately and publicly insured survivors, those with coverage disruptions were less likely to report use of all preventive services, including blood pressure checks, blood cholesterol checks, flu shots, and dental care, compared to those continuously insured (16.9% vs 36.2%; 14.6% vs 25.3%, respectively). Privately and publicly insured survivors with coverage disruptions were also more likely to report problems with care affordability (55.0% vs 17.7%; 71.1% vs 38.4%, respectively) and cost-related medication nonadherence (39.4% vs 10.1%; 36.5% vs 16.3%, respectively), such as skipping, taking less of, or delaying medication to save money, compared to those continuously insured (all p

"Our findings in this study are especially relevant because widespread unemployment and potential loss of employer-based private health insurance coverage due to the COVID pandemic can also result in coverage disruptions. More cancer survivors may experience coverage disruptions, which may adversely affect their access to care and affordability," writes Zhao.

Credit: 
American Cancer Society

Cell phone location used to estimate COVID-19 growth rates

New research shows that counties with a greater decline in workplace cell phone activity during stay-at-home orders showed a lower rate of COVID-19 infections. The researchers believe patterns they saw in publicly available cell phone location data could be used to better estimate COVID-19 growth rates and inform decision-making when it comes to shutdowns and "reopenings." This research was published today in JAMA Internal Medicine.

"It is our hope that counties might be able to incorporate these publicly available cell phone data to help guide policies regarding re-opening throughout different stages of the pandemic," said the study's senior author, Joshua Baker, MD, MSCE, an assistant professor of Medicine and Epidemiology. "Further, this analysis supports the incorporation of anonymized cell phone location data into modeling strategies to predict at-risk counties across the U.S. before outbreaks become too great."

Baker and the other researchers, including the study's lead author Shiv T. Sehra, MD, an assistant professor of Medicine at Harvard Medical School, used location data from cell phones - which were de-identified and made publicly available by Google - to analyze activity across up to 2,740 counties in the United States between early January and early May 2020. The data were broken down by the locations where activity took place, ranging from workplaces and homes to retail stores, grocery stores, parks, and transit stations. Roughly 22,000 to 84,000 data points were analyzed for each day in the study period.

The idea was to use where cell phone activity took place as a proxy for where people themselves spent their time. The data were compared between two time periods: the first in January and February, before the COVID-19 outbreak in the United States, and the second from mid-February through early May, during the virus's initial surges and the enactment of stay-at-home orders.

As expected, they noted an increase in time spent at home, while visits to workplaces dropped significantly, along with a decline in visits to retail locations (such as stores and restaurants) and transit stations.

They saw that in counties where there was initially a higher density of cases, visits to workplaces, as well as to retail locations and transit stations, fell more sharply than in counties less affected by COVID-19. At the same time, these counties showed a more prominent spike in activity at homes.

In addition, the researchers saw that the counties where workplace activity fell the most had the lowest rates of new COVID-19 cases in the days that followed. Lag times of 5, 10 and 15 days were examined to allow for COVID-19's incubation period, and the lower infection rates held across the range.
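
To make the logic of this lagged, county-level comparison concrete, here is a minimal sketch of one way it could be set up. This is not the study's code: the file names, column names and the five-day growth measure are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical county-day tables (column names are illustrative, not the study's):
# mobility.csv: county, date, workplace_change  (percent change vs. a Jan-Feb baseline)
# cases.csv:    county, date, new_cases
mobility = pd.read_csv("mobility.csv", parse_dates=["date"])
cases = pd.read_csv("cases.csv", parse_dates=["date"])

LAG_DAYS = 10  # also try 5 and 15 to allow for the incubation period

# Compute a simple 5-day case growth rate per county, then shift its date back
# by the lag so that mobility at time t lines up with case growth at t + lag.
cases = cases.sort_values(["county", "date"])
cases["growth"] = cases.groupby("county")["new_cases"].pct_change(periods=5)
cases["date"] = cases["date"] - pd.Timedelta(days=LAG_DAYS)

merged = mobility.merge(cases[["county", "date", "growth"]], on=["county", "date"])

# Rank (Spearman) correlation: if the association holds, counties with larger
# drops in workplace activity should show lower subsequent case growth.
print(merged["workplace_change"].corr(merged["growth"], method="spearman"))
```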

Moving forward, Baker hopes more work can be done to vet cell phone data to see if they can be specifically used to predict COVID-19 hotspots and guide decision-making.

"It will be important to confirm that cell phone data is useful in other stages of the pandemic beyond initial containment," Baker said. "For example, is monitoring these data helpful during the reopening phases of the pandemic, or during an outbreak?"

Past its immediate importance for COVID-19, Baker sees future utility for this type of data.

"They do have the potential to help us better understand behavioral patterns which could help future investigators predict the course of future epidemics or perhaps monitor the impact of different public health measures on peoples' behaviors," he said.

Credit: 
University of Pennsylvania School of Medicine

Sex cells have a sweet tooth, and they pass it on to the brain

image: Inside the ovary of the fruit fly, sex cells divide, multiply and grow to become mature eggs. Novel discovery shows that this normal physiological process causes female fruit flies to develop a preference for sugar.

Image: 
Zita Santos & Carlos Ribeiro

Our job seems easy when compared with that of our cells. While they are hard at work, breaking some molecules and building others, we mainly have to do one thing - feed them. But what exactly should we feed them? This is not an easy problem to solve, considering the constant competition happening inside: whereas some cell types, like fat cells, crave lipids, others may prefer protein or sugars. How does the brain factor in all these competing demands and spit out a decision when faced with difficult choices like steak or ice cream?

Now, in a study performed in fruit flies, a team of scientists at the Champalimaud Centre for the Unknown in Portugal makes a surprising discovery. Their results, published today (August 31st) in the scientific journal Nature Metabolism, reveal that changes in the nutritional requirements of sex cells make female flies crave sugar. Until now, this phenomenon had mainly been described in pathological conditions, namely cancer. Its discovery in the normal physiological process of egg formation provides important insight into the link between fertility and nutrition.

Cells with a sweet tooth

How can a small group of cells influence the behaviour of an entire organism? "A hint to the answer comes from oncology. When a cell becomes cancerous, it turns on cellular machinery that preferentially consumes sugar and turns it into building blocks necessary for cell multiplication. This process, where the cell changes its 'dietary preference' and function, is called metabolic reprogramming, and it is key for tumour growth", says Carlos Ribeiro, a principal investigator at Champalimaud and a senior author of the study.

"This phenomenon was also recorded in non-pathological processes, mainly related to development. However, it was not known whether the cells' metabolic transformation could hijack the feeding decisions of the organism", adds Ribeiro. "This is what we set out to explore."

Ribeiro, together with Zita Santos, the other senior author of the study, chose to focus on the reproductive system of the fruit fly, specifically on the process of egg generation. "An egg begins with a single sex cell, which divides, multiplies and grows. The descendants of this original cell transform into the different cell types that together make up the complete egg", Santos explains.

When the team examined the cells throughout the egg's assembly process, they discovered that just like cancer cells, they were undergoing metabolic reprogramming. But not only that, they were activating the exact same cellular mechanism cancer cells use to promote cell proliferation by increasing their sugar consumption. In other words, they developed a sweet tooth.

"We were fascinated by these results", says Santos. "They explain previous reports showing that the female's sex-cells absorb a high proportion of sugars eaten by the animal. And they also fit well with the role of the egg, which needs to synthesise nutrients for a developing embryo."

Driving food choice from below the belt

These encouraging results drove the team to test whether the metabolic reprogramming of the sex cells in the ovary influences the animal's food choice. When they compared the dietary preferences of normal female flies with flies that are unable to produce eggs, they observed a robust difference. "The group of sterile flies had a significantly lower appetite for sugar!"

Moreover, when the team manipulated the cells' ability to metabolise sugar, both the production of eggs and the animals' sugar appetite were affected. "This demonstrates that it's not the cells themselves that generate the change in behaviour, but their metabolic programme. It is this specific programme that drives the flies to obtain the fuel they need for egg production."

How do the cellular changes in the ovary reach the brain and change the flies' behaviour? To answer this question, the team investigated the expression of fit. This small molecule is produced in the fat tissue that surrounds the fly's brain. The more Fit a fly has in her system, the less she cares for sweets.

Again, the team discovered a clear difference between normal and sterile females. Fit levels were significantly higher in the infertile group. "This is a strong indication that the effect of the sex cells on the brain is mediated by Fit. We still don't know how the communication between the ovary and the brain's fat tissue happens, but we are looking into it", Santos adds with a smile.

Diet and Fertility

Together, the team's findings outline a novel mechanism by which the metabolism of a small group of cells in the ovary controls the feeding behaviour of the animal. Could these results be relevant for the field of fertility?

Santos and Ribeiro have recently received a pilot award from the Global Consortium for Reproductive Longevity and Equality to investigate this question. At the basis of their approach lies an original idea: reversing the process.

"It's a kind of a chicken and egg concept", says Santos. "What comes first: metabolic reprogramming, or changes in food preference? We discovered that the metabolic reprogramming of the cells causes the female to consume more sugar, which she needs to generate eggs. We wonder what happens during aging. Could changes in metabolism explain fertility decline? And if so, would we be able to influence the fly's fertility as she ages by manipulating her diet?"

As Santos explains, female flies, similarly to women, experience age-related infertility. She hypothesises that changes in the ovarian metabolic programmes drive reproductive decline and that this phenomenon can be reduced or even reversed using targeted dietary interventions.

"We will explore this hypothesis in the fruit fly by using a combination of single-cell RNA sequencing and metabolomics. In parallel, we will characterise the cellular outcomes of ovarian decline and monitor the feeding behaviour of these animals. This will allow us to devise dietary strategies to reverse the identified alterations and increase reproduction in older female flies. We believe that this is a powerful path to identify potentially reversible processes underlying reproductive age-related decline. Also, since this is a mechanism that is shared by cancer cells, our findings may also be relevant for treating cancer", Ribeiro Concludes.

Credit: 
Champalimaud Centre for the Unknown

Pesticide-free crop protection yields up to US$ 20 billion/year benefits in Asia-Pacific

image: The total number of country-level introductions and first regional deployments of a given biological control agent is depicted for successive decades, over a 1918-2018 window. For instance, BIOCAT contained two introductions of the larval parasitoid Psyttalia humilis (Silvestri) against Tephritid fruit flies, i.e., a first regional use in 1927 (Cook Islands) followed by a second country-level deployment in 1935 on Fiji. All introductions pertain to the deployment of insect natural enemies for insect pest management in local food and agricultural production. Records are drawn from CABI's BIOCAT database.

Image: 
Nature Ecology & Evolution

Scientists have estimated for the first time that nature-based solutions for agricultural pest control deliver US$ 14.6 to US$ 19.5 billion in benefits annually across 23 countries in the Asia-Pacific region.

The new research, published in the journal Nature Ecology & Evolution, suggests that non-chemical crop protection (or biological control) delivers economic dividends that far surpass those attained through improved "Green Revolution" rice germplasm (estimated at US$ 4.3 billion a year).

The study, led by Dr Kris Wyckhuys and including contributions from CABI's Dr Matthew Cock and Dr Frances Williams on the data collection, unveils the magnitude and macro-economic relevance of biodiversity-based contributions to productivity growth in non-rice crops over a 100-year period between 1918 and 2018.

Scientifically guided biological control of 43 exotic invertebrate pests allowed for between 73% and 100% yield-loss recovery in critical food, feed and fibre crops, including banana, breadfruit, cassava and coconut.

Dr Wyckhuys said, "The Green Revolution is credited with alleviating famine, mitigating poverty and driving aggregate economic growth since the 1960s - enabled through a tripling of rice output. Cornerstones of the Green Revolution were the 'packaged' seed x agro-chemical technologies and biological innovations such as high-yielding, disease-resistant cereal varieties.

"Our research is the first to gauge the financial benefit of using biological control to fight crop pests in the Asia-Pacific region and demonstrates how these ecologically-based approaches promoted rural growth and prosperity in marginal, poorly-endowed, non-rice environments.

"By thus placing agro-ecological innovations on equal footing with input-intensive measures, our work provides lessons for future efforts to mitigate invasive species, restore ecological resilience and sustainably raise output of global agri-food systems."

The scientists, who show how 75 different biological control agents mitigated 43 pests over a 100-year range, outline how biodiversity-driven ecosystem services underpin food systems and societal wellbeing in the face of environmental change.

Co-author Dr Michael Furlong added, "Biological control delivered durable pest control in myriad Asia-Pacific agriculture sectors, permitting yield-loss recoveries up to 73%, 81% and 100% in cassava, banana and coconut crops respectively.

"The ensuing economic dividends are substantial, as pest-induced losses up to US $6.8, $4.3 and $8.2 billion annually for the above crops were offset (at respective
values of $5.4-6.8 billion, $1.4-2.2 billion and $3.8-5.5 billion/year, for a conservative to high impact scenario range). As many of the underlying programs were run on a shoestring, the rate of return on biological control science is extraordinary.

"Our work constitutes an empirical demonstration of how insect biological control helped solidify the agrarian foundation of several Asia-Pacific economies and - in doing so - places biological control on an equal footing with other biological innovations such as Green Revolution germplasm.

"Not only does it spotlight its transformative impacts - especially in light of increasing global reliance on chemical pesticides - but it also celebrates the century-long achievements of dedicated, yet often, unacclaimed insect explorers and biological control pioneers."

Credit: 
CABI

Once infected, twice infected

image: Plantago lanceolata growing in a field in Wisconsin.

Image: 
Penczykowski lab, Washington University in St. Louis

Next time you head outside for a socially distant walk in between your Zoom meetings, notice the rich diversity of plants along your path. As we approach late summer, be sure to also notice the diversity of disease symptoms on those plants, including spots, blotches or fuzzy growth caused by bacteria, viruses or fungi.

A key to surviving in the wild is fighting off infection -- and not just once. As in humans, one infection may or may not leave a plant with lasting immunity.

In fact, an early infection might make things worse. New research from an international team including an assistant professor of biology at Washington University in St. Louis shows that infection actually makes a plant more susceptible to secondary infection -- in experiments and in the wild. The findings are published in the Aug. 31 issue of Nature Ecology & Evolution.

"We found that early infection facilitated later infection," said Rachel Penczykowski, assistant professor of biology in Arts & Sciences and co-first author on the study. She performed the field experiments as a postdoctoral researcher with Anna-Liisa Laine, senior author on the paper, now at the University of Zürich.

"And the order in which pathogen strains infect a plant matters," Penczykowski said. "Some pathogen strains are especially likely to facilitate infection by later-arriving strains."

The findings -- obtained through a series of elegant experiments that capture how pathogen strains naturally accumulate on plants over a growing season -- reveal the importance of understanding interactions among pathogens when developing strategies for maintaining healthy crop populations.

Early infection promotes later infection

A common roadside weed, Plantago lanceolata is native to Europe, where this study took place, and Asia; it is also commonly found in North America. Infection by the pathogen Podosphaera plantaginis, a powdery mildew fungus, is easy to spot with the naked eye.

In the wild, plant populations are exposed to and infected by multiple powdery mildew strains over the course of their lifetime. The authors wondered if prior exposure to one strain of powdery mildew affects the plant's susceptibility to a second.

To simulate what would happen in the wild, the authors took young, disease-free plants and brushed pathogen spores from one of four pathogen strains onto a single leaf per plant. The rest of the leaves were temporarily covered with a plastic bag.

The inoculated leaf was then covered with a spore-proof pouch for the duration of the experiment, which prevented infection from spreading between it and the other leaves. This method works because powdery mildew produces a localized, leaf-surface infection that does not spread systemically in the plant. Otherwise identical control plants received a sham inoculation instead of powdery mildew spores.

The plants were then placed in a common garden environment in a large field (without locally occurring Plantago or powdery mildew), where they were simultaneously exposed to all four pathogen strains.

Penczykowski and co-first author Fletcher Halliday, a current postdoctoral researcher in the Laine lab, found that none of the four strains of powdery mildew inoculated onto plants protected the plants from a secondary infection. In fact, prior exposure to mildew made plants more susceptible to a second powdery mildew infection compared to infection-naïve controls.

"If you look at each strain individually, some of the strains were better than others at promoting later infection," Penczykowski said.

"Because crop plants may also be exposed to a diversity of pathogen strains during a given growing season, understanding the ways in which different pathogen strains impact each other is important for developing sustainable disease control strategies in agricultural systems."

Into the wild

Scientists sometimes place cohorts of healthy, greenhouse-grown "sentinel plants" into field populations to measure the risk of pathogen infection. Doing this with sentinel plants allows researchers to control for genetic background, age and condition.

To test how prior inoculation affected the probability of plants becoming infected during epidemics in wild populations, the authors inoculated plants as they did in the common garden experiment (again, with uninoculated controls for comparison). Except this time, they moved the potted sentinel plants into wild populations and waited for naturally occurring mildew spores to arrive.

The researchers found that previously infected sentinel plants acquired secondary mildew infections more often than control plants that had never been infected. This was true even though the only way plants were catching the naturally occurring pathogen strains was through the wind.

"What we saw in both our common garden and our sentinel plant experiments was that previously inoculated plants were more susceptible to later infection," Halliday said. "But could we detect the signature of pathogen strain facilitation in naturally infected wild plant populations? That would require an intensive survey of wild plant-pathogen dynamics."

And into the wild the scientists went -- that is, using data from wild populations that were fortunately collected the previous year.

In an intensive survey of 13 field populations, the scientists tracked mildew infection in wild plants over the course of two months. They tagged plants as they found mildew infection; otherwise, they were not manipulated in any way and had been growing in the field their whole lives.

A small leaf area of each infected plant was cut and brought to the lab to identify the mildew strains that infected the plants at different times throughout the growing season.

The importance of being early

Powdery mildew strains vary in their ability to survive the winter and in the timing of their reproductive cycle.

Some strains arrive earlier in the growing season and are likely to be the ones that had successfully overwintered and reproduced quickly.

Halliday dove into the genetic data compiled from the surveys of the 13 field populations and found that strains detected early in the season commonly facilitated subsequent infections, and strains that arrived to the populations later in the season benefited from that facilitation.

"The early-arriving strains are the ones that are driving the course of epidemics and also affecting the diversity of pathogen strains that assemble in plant populations," Halliday said.

"In other words, the strains that are ready to hit the ground running in spring may impact both the ecological and evolutionary dynamics of plant-pathogen interactions," Penczykowski added.

Credit: 
Washington University in St. Louis

Study provides insight on how to build a better flu vaccine

Flu season comes around like clockwork every year, and sooner or later everyone gets infected. The annual flu shot is a key part of public health efforts to control the flu, but the vaccine's effectiveness is notoriously poor, falling somewhere from 40% to 60% in a typical year.

A growing body of evidence suggests that a history of exposure to influenza virus might be undermining the effectiveness of the annual flu vaccine. Partial immunity developed during prior flu seasons -- either through natural infection or vaccination -- might interfere with the body's response to a new vaccine, such that vaccination mainly boosts the recognition of prior influenza strains but does little to create the ability to fight new strains.

Now, a team led by researchers at Washington University School of Medicine in St. Louis has developed an approach to assess whether a vaccine activates the kind of immune cells needed for long-lasting immunity against new influenza strains. Using this technique, the researchers showed that the flu vaccine is capable of eliciting antibodies that protect against a broad range of flu viruses, at least in some people. The findings, published Aug. 31 in the journal Nature, could aid efforts to design an improved flu vaccine that provides protection not only against old influenza viruses but also new ones.

"Every year, about half of the U.S. adult population gets vaccinated against influenza," said senior author Ali Ellebedy, PhD, an assistant professor of pathology and immunology at Washington University. "It's necessary for public health, but it's also incredibly expensive and inefficient. What we need is a one-and-done influenza shot, but we are not there yet. Anything that helps us understand how immunity develops in the context of prior exposures would be important as we try to make a better vaccine."

The key to long-lasting immunity lies in lymph nodes, minuscule organs of the immune system positioned throughout the body. Easy to miss in healthy people, lymph nodes become swollen and tender during an infection as immune cells busily interact and multiply within them.

The first time a person is exposed to a virus - either by infection or vaccination - immune cells capture the virus and bring it to the nearest lymph node. There, the virus is presented to so-called naïve B cells, causing them to mature and start producing antibodies to fight the infection. Once the virus is successfully routed, most of the immune cells that take part in the battle die off, but a few continue circulating in the blood as long-lived memory B cells.

The second time a person is exposed to a virus, memory B cells quickly reactivate and start producing antibodies again, bypassing naive B cells. This rapid response quickly builds protection for people who have been reinfected with the exact same strain of virus, but it's not ideal for people who have received a vaccine designed to build immunity against a slightly different strain, as in the annual flu vaccine.

"If our influenza vaccine targets memory cells, those cells will respond to the parts of the virus that haven't changed from previous influenza strains," Ellebedy said. "Our goal is to get our immune system up to date with the new strains of influenza, which means we want to focus the immune response on the parts of the virus that are different this year."

To get decades-long immunity against the new strains, the flu strains from the vaccine need to be taken to the lymph nodes, where they can be used to train a new set of naïve B cells and induce long-lived memory B cells specifically tailored to recognize the unique features of the vaccine strains.

To find out what happens inside lymph nodes after influenza vaccination, Ellebedy enlisted the help of co-authors Rachel Presti, MD, PhD, an associate professor of medicine, and Sharlene Teefey, MD, a professor of radiology at Washington University. Presti led a team at the Infectious Disease Clinical Research Unit that coordinated the sampling of blood and lymph nodes from healthy volunteers before and after vaccination. Guided by ultrasound imaging, Teefey carefully extracted cells from so-called germinal centers, the immune-cell hubs within underarm lymph nodes, of eight healthy, young volunteers vaccinated with the 2018-19 quadrivalent influenza vaccine. That vaccine was designed to protect against four different strains of influenza virus. The immune cells were extracted at one, two, four and nine weeks after vaccination.

Ellebedy and colleagues - including co-senior authors Steven Kleinstein, PhD, a professor of pathology at Yale University School of Medicine, and Andrew Ward, PhD, a professor of integrative structural and computational biology at Scripps Research Institute, as well as co-first authors Jackson Turner, PhD, a postdoctoral researcher who works with Ellebedy, Julian Zhou, a graduate student in Kleinstein's lab, and Julianna Han, PhD, a postdoctoral scholar who works with Ward - analyzed the immune cells in the germinal centers to find the ones that had been activated by vaccination.

In three volunteers, both memory B cells and naïve B cells in the lymph nodes responded to the vaccine strains, indicating that the vaccine had initiated the process of inducing long-lasting immunity against the new strains.

"Our study shows that the influenza vaccine can engage both kinds of cells in the germinal centers, but we still don't know how often that happens," Ellebedy said. "But given that influenza vaccine effectiveness hovers around 50%, it probably doesn't happen as often as we would like. That brings up the importance of promoting strategies to boost the germinal centers as a step toward a universal influenza vaccine."

Credit: 
Washington University School of Medicine

Researchers discover a specific brain circuit damaged by social isolation during childhood

Loneliness is recognized as a serious threat to mental health. Even as our world becomes increasingly connected over digital platforms, young people in our society are feeling a growing sense of isolation. The COVID-19 pandemic, which forced many countries to implement social distancing and school closures, magnifies the need for understanding the mental health consequences of social isolation and loneliness. While research has shown that social isolation during childhood, in particular, is detrimental to adult brain function and behavior across mammalian species, the underlying neural circuit mechanisms have remained poorly understood.

A research team from the Icahn School of Medicine at Mount Sinai has now identified specific sub-populations of brain cells in the prefrontal cortex, a key part of the brain that regulates social behavior, that are required for normal sociability in adulthood and are profoundly vulnerable to juvenile social isolation in mice. The study findings, which appear in the August 31 issue of Nature Neuroscience, shed light on a previously unrecognized role of these cells, known as medial prefrontal cortex neurons projecting to the paraventricular thalamus, the brain area that relays signals to various components of the brain's reward circuitry. If the finding is replicated in humans, it could lead to treatments for psychiatric disorders connected to isolation.

"In addition to identifying this specific circuit in the prefrontal cortex that is particularly vulnerable to social isolation during childhood, we also demonstrated that the vulnerable circuit we identified is a promising target for treatments of social behavior deficits," says Hirofumi Morishita, MD, PhD, Associate Professor of Psychiatry, Neuroscience, and Ophthalmology at the Icahn School of Medicine at Mount Sinai, a faculty member of The Friedman Brain Institute and the Mindich Child Health and Development Institute, and senior author of the paper. "Through stimulation of the specific prefrontal circuit projecting to the thalamic area in adulthood, we were able to rescue the sociability deficits caused by juvenile social isolation."

Specifically, the team found that, in male mice, two weeks of social isolation immediately following weaning leads to a failure to activate medial prefrontal cortex neurons projecting to the paraventricular thalamus during social exposure in adulthood. Researchers found that juvenile isolation led to both reduced excitability of the prefrontal neurons projecting to the paraventricular thalamus and increased inhibitory input from other related neurons, suggesting a circuit mechanism underlying the sociability deficits caused by juvenile social isolation. To determine whether acute restoration of the activity of prefrontal projections to the paraventricular thalamus is sufficient to ameliorate sociability deficits in adult mice that underwent juvenile social isolation, the team employed a technique known as optogenetics to selectively stimulate these projections. The researchers also used chemogenetics in their study. While optogenetics enables researchers to stimulate particular neurons in freely moving animals with pulses of light, chemogenetics allows non-invasive chemical control over cell populations. By employing both of these techniques, the researchers were able to quickly increase social interaction in these mice once light pulses or drugs were administered to them.

"We checked the presence of social behavior deficits just prior to stimulation and when we checked the behavior while the stimulation was ongoing, we found that the social behavior deficits were reversed," said Dr. Morishita.

Given that social behavior deficits are a common dimension of many neurodevelopmental and psychiatric disorders, such as autism and schizophrenia, identification of these specific prefrontal neurons will point toward therapeutic targets for the improvement of social behavior deficits shared across a range of psychiatric disorders. The circuits identified in this study could potentially be modulated using techniques like transcranial magnetic stimulation and/or transcranial direct current stimulation.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Implant choice more important than surgeon skill for hip replacement success

A study analysing more than 650,000 hip replacement patients across England and Wales over a 14-year period sought to investigate why one hospital has consistently been identified as having better than expected outcomes compared with other settings. The findings show that the outstanding hip implant survival results seen in one centre in the UK are associated with implant choice more than surgeon skill.

The study, conducted by researchers from the Musculoskeletal Research Unit at the University of Bristol, the NIHR Bristol Biomedical Research Centre, and the University of Exeter using data from the National Joint Registry, has been published in PLOS Medicine.

Mr Jonathan Evans, Academic Clinical Lecturer at the Bristol Medical School: Translational Health Sciences (THS), based at Southmead Hospital, Bristol, and lead author, said: "These findings are vitally important to making sure as many of our patients have a good outcome from their hip replacement as possible.

"We want patients across the country to feel empowered to ask their surgeon not only what implants they plan to use for their hip replacement but more importantly to ask for the long-term evidence that the implant works well. If they do not feel happy with the answer, then patients should feel confident asking for another opinion or even vote with their feet and go to a different hospital."

In 2017, over 822 different types of hip replacement were implanted in England and Wales, but the Royal Devon & Exeter NHS Foundation Trust (RD&E) has used only three in the last 14 years. At the RD&E, only 1.7 per cent of hips needed to be re-done (revised) within 14 years of the original operation, whereas in the rest of the country this figure was 2.9 per cent. Given that about 100,000 people have a hip replacement every year, this difference could lead to many more patients needing further surgery.
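As a rough illustration of the scale implied by those figures, the sketch below applies the two revision rates to a single year's cohort of operations. The per-cohort framing and the calculation itself are illustrative assumptions made here, not an analysis from the study.

    # Illustrative calculation using the figures quoted above; applying the rates
    # to one annual cohort of operations is an assumption made for illustration only.
    annual_hip_replacements = 100_000   # roughly 100,000 operations per year
    revision_rate_rde = 0.017           # 1.7% revised within 14 years at the RD&E
    revision_rate_national = 0.029      # 2.9% revised within 14 years nationally

    extra_revisions = annual_hip_replacements * (revision_rate_national - revision_rate_rde)
    print(f"Additional revisions per annual cohort: {extra_revisions:.0f}")  # about 1,200

On these assumptions, roughly 1,200 additional revision operations would eventually be needed for each year's cohort of patients if national results matched the lower rather than the higher rate.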

The researchers considered age, sex and general health in their analyses and showed that when the patient's outcomes from the RD&E were compared to cases nationwide where the same implants had been used, there was no difference in how many of the hips lasted 14 years. This suggests that consistent use of a reliable hip replacement implant may be a more important determinant of success than the surgeon performing the operation.

It has long been recognised that success rates vary between hospitals (as shown in the National Joint Registry annual report), and reducing this variation to ensure the best possible results for all patients is a priority for the NHS. The Getting it Right First Time (GIRFT) initiative in the UK seeks to reduce variation in outcomes between hospitals, and learning from centres with statistically "better than expected" results is key to that.

A hip replacement principally consists of two components, one that replaces the ball and another that replaces the socket. There is variation in how these parts are fixed to the bone, as well as in the materials used to create the bearing (contact) surface.

Hip and knee replacements are two of the most common and effective forms of surgery. Yet even in the best-case scenarios, they will eventually fail due to processes such as infection, fracture, normal wear and tear or reaction to wear particles. In many of these cases, patients require revision surgery which is more prone to failure, associated with poorer function and more expensive than primary surgery. Making the first hip replacement last as long as possible is in the best interest of patients, surgeons and the NHS as a whole.

Mr Michael Whitehouse, Reader in Trauma and Orthopaedics at the Bristol Medical School: THS and joint senior author on the study, feels this information is critical to helping patients make the best decisions about their care. He explained: "It is important to recognise that this study is not about encouraging surgeons to use one particular implant but to use the information available to them in the National Joint Registry and other reliable sources to choose implants with a track record of long-term success.

"Our study shows that long-term survival of a hip replacement is primarily down to a surgeon's implant decisions rather than the particular way they perform the operation."

Credit: 
University of Bristol

Humans' construction 'footprint' on ocean quantified for first time

Image: A map showing the physical footprint of marine construction globally, in square kilometres. Credit: Bugnot et al., 'Current and projected global extent of marine built structures', Nature Sustainability.

In a world-first, the extent of human development in oceans has been mapped. An area totalling approximately 30,000 square kilometres - the equivalent of 0.008 percent of the ocean - has been modified by human construction, a study led by Dr Ana Bugnot from the University of Sydney School of Life and Environmental Sciences and the Sydney Institute of Marine Science has found.

The extent of ocean modified by human construction is, proportion-wise, comparable to the extent of urbanised land, and greater than the global area of some natural marine habitats, such as mangrove forests and seagrass beds.

When the modified area is calculated to include flow-on effects on surrounding areas, for example through changes in water flow and pollution, the footprint expands to two million square kilometres, or over 0.5 percent of the ocean.
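As a quick sanity check on those proportions, the sketch below divides the two footprint figures by the ocean's total surface area. The roughly 361 million square kilometre figure used for the ocean is a commonly cited value assumed here, not one taken from the study.

    # Rough check of the quoted proportions; the total ocean surface area used here
    # (~361 million km^2) is an assumed, commonly cited figure, not from the study.
    OCEAN_AREA_KM2 = 361e6

    direct_footprint_km2 = 30_000    # area directly modified by marine construction
    extended_footprint_km2 = 2e6     # footprint including flow-on effects

    print(f"Direct footprint:   {100 * direct_footprint_km2 / OCEAN_AREA_KM2:.3f}% of the ocean")   # ~0.008%
    print(f"Extended footprint: {100 * extended_footprint_km2 / OCEAN_AREA_KM2:.2f}% of the ocean")  # ~0.55%

Under that assumed ocean area, the two figures come out at about 0.008 percent and 0.55 percent, consistent with the proportions reported above.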

The oceanic modification includes areas affected by tunnels and bridges; infrastructure for energy extraction (for example, oil and gas rigs, wind farms); shipping (ports and marinas); aquaculture infrastructure; and artificial reefs.

Dr Bugnot said that ocean development is nothing new, yet it has changed rapidly in recent times. "It has been ongoing since before 2000 BC," she said. "Then, it supported maritime traffic through the construction of commercial ports and protected low-lying coasts with the creation of structures similar to breakwaters.

"Since the mid-20th century, however, ocean development has ramped up, and produced both positive and negative results.

"For example, while artificial reefs have been used as 'sacrificial habitat' to drive tourism and deter fishing, this infrastructure can also impact sensitive natural habitats like seagrasses, mudflats and saltmarshes, consequently affecting water quality.

"Marine development mostly occurs in coastal areas - the most biodiverse and biologically productive ocean environments."

Future expansion 'alarming'

Dr Bugnot, joined by co-researchers from multiple local and international universities, also projected the rate of future ocean footprint expansion.

"The numbers are alarming," Dr Bugnot said. "For example, infrastructure for power and aquaculture, including cables and tunnels, is projected to increase by 50 to 70 percent by 2028.

"Yet this is an underestimate: there is a dearth of information on ocean development, due to poor regulation of this in many parts of the world.

"There is an urgent need for improved management of marine environments. We hope our study spurs national and international initiatives, such as the EU Marine Strategy Framework Directive, to greater action."

The researchers attributed the projected expansion to people's increasing need for defences against coastal erosion and inundation caused by sea level rise and climate change, as well as to their transportation, energy extraction, and recreation needs.

Credit: 
University of Sydney