Culture

Overconfidence in news judgement

A new study published in the Proceedings of the National Academy of Sciences finds that individuals who falsely believe they are able to identify false news are more likely to fall victim to it. In the article published today, Ben Lyons, assistant professor of communication at the University of Utah, and his colleagues examine concerns about the public's susceptibility to false news, driven by people's inability to recognize their own limitations in identifying such information.

"Though Americans believe confusion caused by false news is extensive, relatively few indicate having seen or shared it," said Lyons. "If people incorrectly see themselves as highly skilled at identifying false news, they may unwittingly be more likely to consume, believe and share it, especially if it conforms to their worldview."

Lyons and his colleagues used two large nationally representative surveys with a total of 8,285 respondents. Individuals were asked to evaluate the accuracy of a series of Facebook headlines and then rate their own abilities to discern false news content. Lyons used these two measures to assess overconfidence among respondents and how it is related to beliefs and behaviors.

"Our results paint a worrying picture. Many people are simply unaware of their own vulnerability to misinformation."

The vast majority of respondents--about 90 percent--reported being above average in their ability to distinguish between false and legitimate news headlines. Three in four individuals overestimated this ability, and on average respondents placed themselves 22 percentiles higher than their scores warranted. About 20 percent of respondents rated themselves 50 or more percentiles higher than their scores warranted.
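
The overplacement measure described here can be illustrated with a short, purely hypothetical sketch: compare each respondent's self-placed percentile with the percentile their actual score earns within the sample. The data below are randomly generated, not from the study.

```python
import random
import bisect

random.seed(0)
n = 1000

# Hypothetical data: each respondent's actual headline-rating score (0-100)
# and their self-placed percentile relative to other people (0-100).
actual_scores = [random.randint(0, 100) for _ in range(n)]
self_ratings = [random.randint(0, 100) for _ in range(n)]

# Convert actual scores to percentile ranks within the sample.
sorted_scores = sorted(actual_scores)

def percentile_rank(score):
    # Share of the sample scoring strictly below this score, as a percentile.
    return 100.0 * bisect.bisect_left(sorted_scores, score) / n

# Overplacement = self-placed percentile minus the percentile the score warrants.
overconfidence = [s - percentile_rank(a) for s, a in zip(self_ratings, actual_scores)]

mean_gap = sum(overconfidence) / n
share_50plus = sum(1 for g in overconfidence if g >= 50) / n
print(f"Mean overplacement: {mean_gap:.1f} percentiles")
print(f"Share overrating by 50+ percentiles: {share_50plus:.0%}")
```

With random data the mean gap hovers near zero; in the study's real data it averaged 22 percentiles, indicating systematic overconfidence.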

"Using data measuring respondents' online behavior, we show that those who overrate their ability more frequently visit websites known to spread false or misleading news. These overconfident respondents are also less able to distinguish between true and false claims about current events and report higher willingness to share false content, especially when it aligns with their political leanings."

Prior research suggests it may be individuals' lack of skill itself that drives engagement with false news: people who are worse at discerning between legitimate and false headlines also engage with more false news in their actual browsing. However, Lyons' analysis shows that inflated perceptions of ability are independently associated with engaging with misinformation, suggesting these perceptual gaps are an additional source of vulnerability.

These results provide new evidence of an important potential mechanism by which people may fall victim to misinformation and disseminate it online. Although the design does not identify the causal effect of overconfidence, these findings suggest that the mismatch between one's perceived ability to spot false stories and people's actual abilities may play an important and previously unrecognized role in the spread of false information online.


Credit: 
University of Utah

Extreme CO2 greenhouse effect heated up the young Earth

Very high atmospheric CO2 levels can explain the high temperatures on the young Earth three to four billion years ago. At the time, our Sun shone with only 70 to 80 per cent of its present intensity. Nevertheless, the climate on the young Earth was apparently quite warm because there was hardly any glacial ice. This phenomenon is known as the 'paradox of the young weak Sun.' Without an effective greenhouse gas, the young Earth would have frozen into a lump of ice. Whether CO2, methane, or an entirely different greenhouse gas heated up planet Earth is a matter of debate among scientists. New research by Dr Daniel Herwartz of the University of Cologne, Professor Dr Andreas Pack of the University of Göttingen, and Professor Dr Thorsten Nagel of the University of Aarhus (Denmark) now suggests that high CO2 levels are a plausible explanation. This would also solve another geoscientific problem: ocean temperatures that were apparently too high. The article "A CO2 greenhouse efficiently warmed the early Earth and decreased seawater 18O/16O before the onset of plate tectonics" appears in the Proceedings of the National Academy of Sciences.

A much-debated question in earth science concerns the temperatures of the early oceans. There is evidence that they were very hot. Measurements of oxygen isotopes on very old limestone or siliceous rocks, which serve as geothermometers, indicate seawater temperatures above 70°C. Lower temperatures would only have been possible if the seawater had changed its oxygen isotope composition. However, this was long considered unlikely.

Models from the new study show that high CO2 levels in the atmosphere may provide an explanation, since they would also have caused a change in the ocean's composition. 'High CO2 levels would thus explain two phenomena at once: first, the warm climate on Earth, and second, why geothermometers appear to show hot seawater. Taking into account the different oxygen isotope ratio of seawater, we would arrive at temperatures closer to 40°C,' said Daniel Herwartz of the University of Cologne. It is conceivable that there was also a lot of methane in the atmosphere. But that would not have had any effect on the composition of the ocean, and thus would not explain why the oxygen geothermometer indicates temperatures that are too high. 'Both phenomena can only be explained by high levels of CO2,' Herwartz added. The authors estimate the total amount of atmospheric CO2 at approximately one bar of pressure. That would be as if today's entire atmosphere consisted of CO2.

'Today, CO2 is just a trace gas in the atmosphere. Compared to that, one bar sounds like an absurdly large amount. However, looking at our sister planet Venus with its approximately 90 bar of CO2 puts things into perspective,' explained Andreas Pack from the University of Göttingen. On Earth, CO2 was eventually removed from the atmosphere and the ocean and stored in the form of coal, oil, gas, and black shales as well as in limestone. These carbon reservoirs are mainly located on the continents. However, the young Earth was largely covered by oceans and there were hardly any continents, so the storage capacity for carbon was limited. 'That also explains the enormous CO2 levels of the young Earth from today's perspective. After all, roughly three billion years ago, plate tectonics and the development of land masses in which carbon could be stored over a long period of time was just picking up speed,' explained Thorsten Nagel from Aarhus University.

For the carbon cycle, the onset of plate tectonics changed everything. Large land masses with mountains provided faster silicate weathering, which converted CO2 into limestone. In addition, carbon became effectively trapped in the Earth's mantle as oceanic plates were subducted. Plate tectonics thus caused the CO2 content of the atmosphere to drop sharply. Repeated ice ages show that it became significantly colder on Earth. 'Earlier studies had already indicated that the limestone contents in ancient basalts point to a sharp drop in atmospheric CO2 levels. This fits well with an increase in oxygen isotopes at the same time. Everything indicates that the atmospheric CO2 content declined rapidly after the onset of plate tectonics,' Daniel Herwartz concluded. However, in this context 'rapidly' refers to several hundred million years.

Credit: 
University of Cologne

Duetting songbirds 'mute' the musical mind of their partner to stay in sync

image: The plain-tailed wren shows neurobiologists that the magic between collaborative performers sparks when the music-making parts of the brain go silent.

Image: 
Melissa Coleman

Art Garfunkel once described his legendary musical chemistry with Paul Simon: "We meet somewhere in the air through the vocal cords ... ." But a new study of a duetting songbird from Ecuador, the plain-tailed wren (Pheugopedius euophrys), offers another tune to explain the mysterious connection between successful performing duos.

It's a link of minds, and it happens, in fact, as each singer mutes the brain of the other while they coordinate their duets.

In a study published May 31 in the Proceedings of the National Academy of Sciences, a team of researchers studying the brain activity of singing male and female plain-tailed wrens has discovered that the species synchronizes its frenetically paced duets, surprisingly, by inhibiting the song-making regions of each partner's brain as the birds exchange phrases.

Researchers say that the auditory feedback exchanged between wrens during their opera-like duets momentarily inhibits motor circuits used for singing in the listening partner, which helps link the pair's brains and coordinate turn-taking for a seemingly telepathic performance. The study also offers fresh insight into how humans and other cooperative animals use sensory cues to act in concert with one another. 

"You could say that timing is everything," said Eric Fortune, co-author of the study and neurobiologist at New Jersey Institute of Technology's Department of Biological Sciences. "What these wrens have shown us is that for any good collaboration, partners need to become 'one' through sensory linkages. The take-home message is that when we are cooperating well... we become a single entity with our partners."

"Think of these birds like jazz singers," added Melissa Coleman, the paper's corresponding author and associate professor of biology at Scripps College. "Duetting wrens have a rough song structure planned before they sing, but as the song evolves, they must rapidly coordinate by receiving constant input from their counterpart.

"What we expected to find was a highly active set of specialized neurons that coordinate this turn-taking, but instead what we found is that hearing each other actually causes inhibition of those neurons -- that's the key regulating the incredible timing between the two."

For the study, the team had to travel to the heart of the plain-tailed wren music scene: remote bamboo forests on the slopes of Ecuador's active Antisana Volcano. Camped at the Yanayacu Biological Station's lab, the team made neurophysiological recordings of four pairs of native wrens as they sang solo and duet songs, analyzing sensorimotor activity in a premotor area of the birds' brains where specialized neurons for learning and making music are active.

The recordings showed that during duet turn-taking -- which often takes the form of tightly knit call-and-answer phrases, or syllables, that together sound as if a single bird is singing -- the birds' neurons fired rapidly when they produced their own syllables.

Yet, as one wren begins to hear their partner's syllables sung in the duet, the neurons quiet down significantly. 

"You can think of inhibition as acting like a trampoline," explained Fortune. "When the birds hear their partner, the neurons are inhibited, but just like rebounding off a trampoline, the release from that inhibition causes them to swiftly respond when it's their time to sing."

Next, the team played recordings of duets to wrens in a sleep-like state, anesthetized with a drug that affects a major inhibitory neurotransmitter in the wrens' brains that is also found in humans, gamma-aminobutyric acid (GABA). The drug transformed the activity in the brain from inhibition to bursts of activity when the wrens heard their own music.

"These mechanisms are shared or similar to what happens in our brains because we are doing the same kind of things," said Fortune. "There are similar brain circuits in humans that are involved in learning and coordinating vocalizations."

Fortune and Coleman say the results offer a fresh look into how the brains of humans and other cooperating animals use sensory cues to act in concert with each other, from flowing musical and dance performances to the disjointed feeling of inhibition commonly experienced today during video conferencing.

"These days, inhibition is occurring at all the wrong times when we have poor internet connections during our Zoom, WebEx, and Facetime conferences. The delays affect the sensory information that we rely on for coordinating the timing of our conversations," said Coleman. "I think this study is important for understanding how we interact with the world whenever we are trying to produce a single behavior as two performers. We are wired for cooperation, the same way as these jazz singing wrens."

Credit: 
New Jersey Institute of Technology

Researchers report reference genome for maize B chromosome

Three groups (Dr. James Birchler's group from the University of Missouri, Dr. Jan Bartoš's group from the Institute of Experimental Botany of the Czech Academy of Sciences and Dr. HAN Fangpu's group from the Institute of Genetics and Developmental Biology of the Chinese Academy of Sciences) recently reported a reference sequence for the supernumerary B chromosome in maize in a study published online in PNAS (doi:10.1073/pnas.2104254118).

Supernumerary B chromosomes persist in thousands of plant and animal genomes despite being nonessential. They are maintained in populations by mechanisms of "drive" that cause them to be inherited at higher than typical Mendelian rates. Key properties of the maize B chromosome, such as its origin, evolution, and the molecular mechanism of its accumulation, have remained unclear even though such chromosomes have been a potent tool for studying maize genetics.

The researchers used a well-established set of sequencing and mapping tools, including chromosome flow-sorting, Illumina sequencing, Bionano optical mapping, and chromatin conformation capture (Hi-C).

The rich availability of deletion derivatives enabled strong scaffolding and vetting of the assembly. In addition, 758 protein-coding genes were identified in the 125.9 Mb of chromosome sequence, of which at least 88 are expressed.

The scientists discovered that the current gene content is a result of continuous transfer from the A chromosomal complement over an extended evolutionary period. This process has been accompanied by subsequent degradation even though selection for maintenance of this nonvital chromosome has also continued.

The annotation results demonstrate that transposable elements in the B chromosome are shared with the standard A chromosome set. However, the failure of multiple lines of evidence to reveal a syntenic region between the B chromosome and any A chromosome indicates that the B chromosome has been present in the evolutionary lineage for millions of years, long enough for any such synteny to have disintegrated.

Sequence and deletion analysis reveals that a specific DNA repeat located in and around the centromere is involved in the chromosome's drive mechanism, which consists of nondisjunction at the second pollen mitosis and preferential fertilization of the egg by the B-containing sperm.

This analysis cleverly combines comparisons among a variety of translocation and B-deletion stocks along with many years of genetic analysis. This approach provides a unique view of the sequence of this chromosome, as well as characterization of potentially functional elements within it.

Credit: 
Chinese Academy of Sciences Headquarters

Study suggests tai chi can mirror healthy benefits of conventional exercise

UCLA HEALTH RESEARCH BRIEF

FINDINGS

A new study shows that tai chi mirrors the beneficial effects of conventional exercise by reducing waist circumference in middle-aged and older adults with central obesity. The study was done by investigators at the University of Hong Kong; The Chinese University of Hong Kong; the Chinese Academy of Sciences; and UCLA.

BACKGROUND

Central obesity is a major manifestation of metabolic syndrome, broadly defined as a cluster of cardiometabolic risk factors, including central obesity, dyslipidemia, hyperglycemia, low high-density lipoprotein cholesterol (HDL-C) level, and high blood pressure, that all increase risk for type 2 diabetes and cardiovascular disease.

METHOD

543 participants were randomly assigned in a 1:1:1 ratio to a no-exercise control group (n = 181), a conventional exercise group consisting of aerobic exercise and strength training (EX group; n = 181), or a tai chi group (TC group; n = 181). Interventions lasted 12 weeks.

The primary outcome was waist circumference. Secondary outcomes were body weight; body mass index; and high-density lipoprotein cholesterol (HDL-C), triglyceride, and fasting plasma glucose levels.
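
The 1:1:1 allocation described above can be sketched in a few lines. This is a hypothetical illustration of balanced randomization, not the trial's actual procedure, and the participant IDs and arm labels are invented for the example.

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical 1:1:1 allocation of the 543 participants (181 per arm),
# mirroring the trial's control, conventional-exercise, and tai chi groups.
arms = ["control"] * 181 + ["conventional_exercise"] * 181 + ["tai_chi"] * 181
random.shuffle(arms)

participants = [f"P{i:03d}" for i in range(543)]
assignment = dict(zip(participants, arms))

# Shuffling a fixed pool of labels guarantees exactly 181 per arm.
print(Counter(assignment.values()))
```

Shuffling a pre-built pool of labels (rather than drawing an arm at random per participant) is what keeps the group sizes exactly equal.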

IMPACT

The findings suggest that tai chi is an effective approach for managing central obesity. The investigators say the study has great translational significance because the findings support incorporating tai chi into global physical activity guidelines for middle-aged and older adults with central obesity.

Credit: 
University of California - Los Angeles Health Sciences

Ethnic diversity helps identify more genomic regions linked to diabetes-related traits

By including multi-ethnic participants, a large-scale genetic study has identified more regions of the genome linked to type 2 diabetes-related traits than if the research had been conducted in Europeans alone.

The international MAGIC collaboration, made up of more than 400 global academics, conducted a genome-wide association meta-analysis led by the University of Exeter. Now published in Nature Genetics, their findings demonstrate that expanding research into different ancestries yields more and better results, as well as ultimately benefitting global patient care.

Up to now, nearly 87 per cent of genomic research of this type has been conducted in Europeans. This means that the way these findings are implemented may not optimally benefit people from non-European ancestries.

The team analysed data across a wide range of cohorts, encompassing more than 280,000 people without diabetes. Researchers looked at glycaemic traits, which are used to diagnose diabetes and monitor sugar and insulin levels in the blood.

Individuals of East Asian, Hispanic, African-American, South Asian and sub-Saharan African origin made up 30 per cent of the overall cohort. By including them, the researchers discovered 24 more loci - or regions of the genome - linked to glycaemic traits than if they had conducted the research in Europeans alone.

Professor Inês Barroso, of the University of Exeter, who led the research, said: "Type 2 diabetes is an increasingly huge global health challenge - with most of the biggest increases occurring outside of Europe. While there are a lot of shared genetic factors between different countries and cultures, our research tells us that they do differ, in ways that we need to understand. It's critical to ensuring we can deliver a precision diabetes medicine approach that optimises treatment and care for everyone."

First author Dr Ji Chen, of the University of Exeter, said: "We discovered 24 additional regions of the genome by including cohorts which were more ethnically diverse than we would have done if we'd restricted our work to Europeans. Beyond the moral arguments for ensuring research is reflective of global populations, our work demonstrates that this approach generates better results."

The team found that though some loci were not detected in all ancestries, they were still useful to capture information about the glycaemic trait in that ancestry. Co-author Cassandra Spracklen, Assistant Professor at the University of Massachusetts-Amherst, said: "Our findings matter because we're moving towards using genetic scores to weigh up a person's risk of diabetes. We know that scores developed exclusively in individuals of one ancestry don't work well in people of a different ancestry. This is important as increasingly healthcare is moving towards a more precise approach. Failing to account for genetic variation according to ancestry will impact our ability to accurately diagnose diabetes."

Credit: 
University of Exeter

Global warming already responsible for one in three heat-related deaths

Between 1991 and 2018, more than a third of all deaths in which heat played a role were attributable to human-induced global warming, according to a new article in Nature Climate Change.

The study, the largest of its kind, was led by the London School of Hygiene & Tropical Medicine (LSHTM) and the University of Bern within the Multi-Country Multi-City (MCC) Collaborative Research Network. Using data from 732 locations in 43 countries around the world it shows for the first time the actual contribution of man-made climate change in increasing mortality risks due to heat.

Overall, the estimates show that 37% of all heat-related deaths in the recent summer periods were attributable to the warming of the planet due to anthropogenic activities. This percentage of heat-related deaths attributed to human-induced climate change was highest in Central and South America (up to 76% in Ecuador or Colombia, for example) and South-East Asia (between 48% and 61%).

Estimates also show the number of deaths from human-induced climate change that occurred in specific cities: 136 additional deaths per year in Santiago de Chile (44.3% of total heat-related deaths in the city), 189 in Athens (26.1%), 172 in Rome (32%), 156 in Tokyo (35.6%), 177 in Madrid (31.9%), 146 in Bangkok (53.4%), 82 in London (33.6%), 141 in New York (44.2%), and 137 in Ho Chi Minh City (48.5%).

The authors say their findings are further evidence of the need to adopt strong mitigation policies to reduce future warming, and to implement interventions to protect populations from the adverse consequences of heat exposure.

Dr Ana M. Vicedo-Cabrera, from the University of Bern and first author of the study, said: "We expect the proportion of heat-related deaths to continue to grow if we don't do something about climate change or adapt. So far, the average global temperature has only increased by about 1°C, which is a fraction of what we could face if emissions continue to grow unchecked."

Global warming is affecting our health in several ways, from direct impacts linked to wildfires and extreme weather, to changes in the spread of vector-borne diseases, among others. Perhaps most striking is the increase in mortality and morbidity associated with heat. Scenarios of future climate conditions predict a substantial rise in average temperatures, with extreme events such as heatwaves driving further increases in the related health burden. Until now, however, no research had been conducted into the extent to which these impacts have already occurred in recent decades.

This new study focused on man-made global warming through a 'detection & attribution' study that identifies and attributes observed phenomena to changes in climate and weather. Specifically, the team examined past weather conditions simulated under scenarios with and without anthropogenic emissions. This enabled the researchers to separate the warming and related health impact linked with human activities from natural trends. Heat-related mortality was defined as the number of deaths attributed to heat, occurring at exposures higher than the optimum temperature for human health, which varies across locations.
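
The core attribution arithmetic can be sketched simply: estimate heat-related deaths under the factual (observed) climate and under the counterfactual (no anthropogenic emissions) climate, and attribute the difference to human-induced warming. The numbers below are invented for illustration and are not from the study.

```python
# Hypothetical illustration of the detection & attribution calculation:
# the gap between heat-related deaths under factual and counterfactual
# climate simulations is attributed to human-induced warming.

def attributable_fraction(deaths_factual: float, deaths_counterfactual: float) -> float:
    """Fraction of heat-related deaths attributable to anthropogenic warming."""
    return (deaths_factual - deaths_counterfactual) / deaths_factual

# Illustrative numbers only: a city with 200 heat-related deaths per year
# under the factual climate vs 130 under the counterfactual climate.
af = attributable_fraction(200, 130)
attributable_deaths = 200 - 130

print(f"Attributable fraction: {af:.0%}")
print(f"Attributable deaths per year: {attributable_deaths}")
```

The same fraction-times-total logic is what converts a city's percentage (such as 33.6% in London) into an annual death count.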

While on average over a third of heat-related deaths are due to human-induced climate change, impact varies substantially across regions. Climate-related heat casualties range from a few dozen to several hundred deaths each year per city, as shown above, depending on the local changes in climate in each area and the vulnerability of its population. Interestingly, populations living in low and middle-income countries, which were responsible for a minor part of anthropogenic emissions in the past, are those most affected.

In the UK, 35% of heat-related deaths could be attributed to human-induced climate change, corresponding to approximately 82 deaths in London, 16 in Manchester, 20 in the West Midlands, and 4 in Bristol and Liverpool every summer season.

Professor Antonio Gasparrini from LSHTM, senior author of the study and coordinator of the MCC Network, said: "This is the largest detection & attribution study on current health risks of climate change. The message is clear: climate change will not just have devastating impacts in the future, but every continent is already experiencing the dire consequences of human activities on our planet. We must act now."

The authors acknowledge limitations of the study including being unable to include locations in all world regions--for example, large parts of Africa and South Asia--due to a lack of empirical data.

Credit: 
London School of Hygiene & Tropical Medicine

Lundquist investigators in global study expanding genomic research of different ancestries

image: Helix with Globes

Image: 
The Lundquist Institute

LOS ANGELES (May 31, 2021) -- Today The Lundquist Institute announced that its investigators contributed data from several studies, including data on Hispanics, African-Americans and East Asians, to the international MAGIC collaboration, composed of more than 400 global academics, which conducted a genome-wide association meta-analysis led by the University of Exeter. Now published in Nature Genetics, their findings demonstrate that expanding research into different ancestries yields more and better results, as well as ultimately benefitting global patient care. Up to now, nearly 87 percent of genomic research of this type has been conducted in Europeans.

"We are very excited about contributing to this global study," said Dr. Jerome I. Rotter, Investigator and Director of the Institute for Translational Genomics and Population Sciences at The Lundquist Institute and Professor of Pediatrics and Human Genetics at the Geffen School of Medicine at UCLA. "It is highly significant that this global study is multi-ethnic and trans-ethnic. Such multi-ethnic studies are a major strength and focus of the cardiometabolic genomic epidemiology at The Lundquist. Our studies examine type 2 diabetes, coronary artery disease, hypertension, cardiac arrhythmias, and fatty liver, as well as studying their risk factors and related traits such as insulin resistance, dyslipidemia, and obesity."


Credit: 
The Lundquist Institute

Gender stereotypes still hold true for youth and types of political participation

Gender roles absorbed at an early age appear to have shaped how today's youth get involved in politics, in line with traditional stereotypes, concludes a new study of adolescents and young adults aged 15 to 30 in Italy, conducted within the Horizon 2020 project "CATCH-EyoU. Processes in Youth's Construction of Active EU Citizenship".

In their research article, published in the peer-reviewed, open-access scientific journal Social Psychological Bulletin, the research team from the University of Bologna reports that young males more often engage directly with politics, for example by enrolling in a political party, acting to influence government policy, contacting a politician or taking part in a protest. Young females, on the other hand, tend to opt for civic activities such as volunteering, charities, religious-based initiatives and boycotting.

Interestingly, previous research has attributed the higher level of political participation amongst males to women generally having lower incomes and less access to education, as well as being busier with housework and caring for the family. However, by controlling for educational and socioeconomic background, the new study concludes that the gender gaps instead reflect the roles society has been instilling in the survey's participants from an early age.

The researchers explain that, culturally, traits like autonomy, leadership, self-affirmation and dominance are seen as manlike, and as such, they are taught to boys through all possible channels, including family, school, peers and the media. As a result, later on, these boys are likely to feel more confident in expressing their political views and taking actions to defend them.

"The findings suggest that reducing the gender gap in political participation requires to give girls from an early age the opportunity to exercise leadership, experience a sense of agency and gain a critical awareness of the constraints and barriers they face as women to overcome them. Educational programs on gender equality, Youth Participatory Action Research, and girls' empowerment projects may serve to this scope," comment the authors of the study.

Nevertheless, gender gaps in voter turnout are effectively non-existent, according to the study. This has been true for the last European parliamentary elections, national parliamentary elections, and local elections. In fact, in Italy, gender gaps in voter turnout have been negligible ever since women were allowed to vote.

However, the reasons why women and men vote might be quite different, speculate the researchers. While men might cast their votes simply as a logical part of their political behaviour, for women it might rather be a case of exercising the stereotypical role of a woman, associated with stronger feelings of civic duty, conscientiousness and proneness to abiding by the rules.

"Studying youth engagement is highly informative because participation at a young age is conducive to future engagement in one's life course," says the team in conclusion. "Future research should further examine the evolution of gender differences over time, their causes and effects among younger generations, as well as their impact on political equality."

"While the current era of the #metoo movement suggests that gender dynamics may be undergoing new and promising social changes toward greater female involvement, the existing data on the persistence of gender gaps in participation among youth--also confirmed by our results--poses important questions on the factors that determine differential preferences for specific typologies of actions by men and women."

Credit: 
The Polish Association of Social Psychology

The price is right: Modeling economic growth in a zero-emission society

Pollution from manufacturing is now widespread, affecting all regions in the world, with serious ecological, economic, and political consequences. Heightened public concern and scrutiny have led numerous governments to consider policies that aim to lower pollution and improve environmental quality. Inter-governmental agreements such as the Paris Agreement and the United Nations' Sustainable Development Goals all focus on lowering pollution emissions. Specifically, they aim to achieve a "zero-emission society," in which pollution is cleaned up as it is produced while total pollution is also reduced (an approach referred to as the "kindergarten rule").

Of course, any efforts to achieve this goal require monetary investment and changes to manufacturing strategies, which, many worry, could hurt the economy. Now, a modeling study conducted by researchers from Tokyo University of Science and The Shoko Chukin Bank, Japan, published in the Journal of Cleaner Production, shows that it is possible to achieve economic growth simultaneously with environmental preservation. "There are existing models that look at how economies fluctuate under various conditions, such as differing environmental quality or tax rates, but these models haven't examined the effects of implementing the kindergarten rule," Prof. Hideo Noda, the study's lead author, explained. "So we thought it was important to extend the model and include a condition where the hypothetical society spends a part of its GDP to achieve zero emissions. Looking at emissions is also more tangible and easier to grasp than a vaguer concept of 'environmental quality.'"

The researchers used an economic model that allows for movement back and forth between two stages: a no-innovation phase and an innovation phase. The key to this model is the importance of innovation; previous models focusing on the environment and the economy did not account for innovation (e.g., research and development), even though it is a major driver of economic growth in most developed nations. Acknowledging this connection is essential for improving our knowledge of how environmental problems and economic growth are linked.

When the researchers included rules for the zero-emission society, the model indicated that it is compatible with economic growth (i.e., sustained GDP growth), despite a portion of the GDP being dedicated to reducing pollution. For this to work, however, the model indicates that the GDP needs to be above a certain level, and the amount of GDP allocated to lowering pollution must be flexible. The researchers also observed that in the no-innovation phase, GDP growth is higher and the amount spent on pollution reduction decreases faster, whereas in the innovation phase, GDP growth is lower and the decrease in the amount spent combating pollution is slower.

According to Prof. Noda, this work provides important theoretical groundwork for policy, because currently, the relationship between zero emissions and economic growth isn't well understood. "Yet, this topic is extremely relevant to any policy push for sustainability--for example, one section of the UN's Sustainable Development Goals explicitly focuses on economic growth," he explains. "Our model should help persuade the leaders of some countries that it is feasible to reduce emissions without tanking the economy."

That, Prof. Noda hopes, may in turn make leaders more eager to implement the changes that are urgently needed to address global environmental crises like climate change.

Credit: 
Tokyo University of Science

Looking at future of Antarctic through an Indigenous Māori lens

It is time for the management and conservation of the Antarctic to begin focusing on responsibility, rather than rights, through an Indigenous Māori framework, a University of Otago academic argues.

In an article published in Nature Ecology & Evolution, Associate Professor Priscilla Wehi, of the Centre for Sustainability, says now is the time to be thinking of these potential changes.

"New Zealand is currently re-setting its priorities for future Antarctic research, and there may be review of the current international environmental conventions as we approach the 50-year anniversary of the protocols in 2048.

"We argue that Indigenous Māori frameworks offer powerful ways of thinking about how we protect the Antarctic, by focusing on responsibilities rather than rights, including the responsibilities we have to future generations," she says.

Antarctica is unlike any other place on Earth - it is remote, there are no permanent human settlements, and no one nation has sovereignty.

"Consensus decision making by Antarctic Treaty nations will determine what happens next, but within that group there are many different interests and perspectives on what should happen, from mining rights through to ecosystem protection.

"Global conceptions of Antarctica are dominated by colonial narratives. On the other hand, an Indigenous Māori framework, focussing on relational thinking and connectedness, humans and non-human kin, and drawing on concepts of both reciprocity and responsibility, offers transformational insight into true collective management and conservation of Antarctica.

"Because mātauranga has a foundation where relationships between different parts of the universe, between humans and other living beings, and so on, are acknowledged and embedded, this is a very different way of examining the world and offers a different perspective on what we could - or even should - do next."

The article brings together mostly Indigenous researchers from Otago, Massey University, and Te Rūnanga o Ngāi Tahu and is the first paper to bring Indigenous frameworks to the issue of what an Antarctic future might look like, Associate Professor Wehi says.

Human navigation into Antarctic waters dates back to about the 7th century with Hui Te Rangiora. Accounts of a frozen ocean and enormous ice cliffs were passed down through generations.

Observing and recording changes in the physical environment, naming and classifying areas of risk, and predicting environmental disturbance were critical for voyages like those of Hui Te Rangiora.

"Incorporating Indigenous environmental knowledge enhances our ability to understand, monitor, plan for, and adapt to weather and climate variability, but can also offer alternate frameworks from which to enhance practice," she says.

"Recognizing the intrinsic link between the well-being of tangata and whenua, Māori have an intergenerational obligation to ensure that the relationship with Antarctica is reciprocal and sustainable."

Co-author Associate Professor Krushil Watene, of Massey University, says Antarctica is one of the most important and one of the most fascinating places on Earth - both for science and philosophy.

"Antarctica challenges our perspectives, unsettles them and in doing so provides us with opportunities to reimagine our lives together, to reimagine our relationships with the natural environment, and to rethink our responsibilities globally.

"We need to urgently prompt thinking and discussion around the deep reflection and transformative change that Antarctica challenges us, as a global community to undertake. We need to be bold and brave in charting a future in which our planet can thrive. Philosophy generally, and indigenous philosophy in particular, brings important and valuable perspectives through which such futures can be charted," she says.

Credit: 
University of Otago

50 years of progress in women's health

Debates over women's health have long been contentious, but have also resulted in significant improvements in areas like equitable access to health care and survivorship. But the overall picture remains far from perfect. For example, the United States still has the highest rate of maternal death among high-income countries, particularly among African American women.

As the United States Supreme Court prepares to hear a Mississippi abortion case challenging the landmark 1973 Roe v. Wade decision, some experts are questioning whether women's health may be reversing course.

Cynthia A. Stuenkel, MD, clinical professor of medicine at University of California San Diego School of Medicine, and JoAnn E. Manson, MD, DrPH, professor of epidemiology at Harvard T.H. Chan School of Public Health, review 50 years of progress in women's health in a perspective article published in the May 29, 2021 online issue of the New England Journal of Medicine.

"Reproductive justice is broader than the pro-choice movement and encompasses equity and accessibility of reproductive health care, as well as enhanced pathways to parenthood," wrote the authors.

In addition to Roe v. Wade, advances in reproductive health include:

1972 US Supreme Court ruling in Eisenstadt v. Baird ensuring unmarried persons equal access to contraception
2010 Affordable Care Act made contraceptives an insured preventive health benefit
Advances in reproductive technologies, including in vitro fertilization, genetic testing and fertility preservation by cancer specialists

Advances in women's health go beyond reproduction, said the authors. As interest and focus have expanded to all stages of a woman's life, science has begun to catch up to the specialized needs of women and the sex-specific risk factors for chronic conditions that disproportionately affect women's health, such as autoimmune diseases, mental illness, osteoporosis and coronary heart disease.

Progress in breast cancer care and prevention yielded a five-year overall survival rate of 90 percent
The human papillomavirus (HPV) vaccine reduced cervical cancer mortality by 50 percent

"Moving forward, it will be essential to recognize and study intersectional health disparities, including disparities based on sex, race, ethnicity, gender identity, sexual orientation, income and disability status. Overcoming these challenges and addressing these inequities will contribute to improved health for everyone," wrote the authors.

Credit: 
University of California - San Diego

Less is more? New take on machine learning helps us "scale up" phase transitions

image: A correlation configuration (top left) is reduced using a newly developed block-cluster transformation (top right). Both the original and reduced configurations have an improved estimator technique applied to give configuration pairs of different size (bottom row). Using these training pairs, a CNN can learn to convert small patterns to large ones, achieving a successful inverse RG transformation.

Image: 
Tokyo Metropolitan University

Tokyo, Japan - Researchers from Tokyo Metropolitan University have enhanced "super-resolution" machine learning techniques to study phase transitions. They identified key features of how large arrays of interacting "particles" behave at different temperatures by simulating tiny arrays, then using a convolutional neural network on "correlation" configurations to generate a good estimate of what a larger array would look like. The massive saving in computational cost may open up new ways of understanding how materials behave.

We are surrounded by different states or "phases" of matter, i.e., gases, liquids, and solids. The study of phase transitions, how one phase transforms into another, lies at the heart of our understanding of matter in the universe, and remains a hot topic for physicists. In particular, the idea of universality, where wildly different materials behave in similar ways thanks to a few shared features, is a powerful one. That's why physicists study model systems, often simple grids of "particles" on an array that interact via simple rules. These models distill the essence of the common physics shared by materials and, amazingly, still exhibit many of the properties of real materials, like phase transitions. Thanks to their elegant simplicity, these rules can be encoded into simulations that tell us what materials look like under different conditions.
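As a concrete illustration of such a model system, the sketch below simulates the 2D Ising model, a grid of ±1 spins updated with the standard Metropolis rule. This is a hypothetical minimal example, not the code used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, beta):
    """One Metropolis sweep of a 2D Ising model with periodic boundaries
    (beta = 1/T, with coupling J and Boltzmann constant set to 1)."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        # Energy change from flipping spin (i, j): dE = 2 * s_ij * sum(neighbours)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb
        # Accept the flip if it lowers the energy, or with Boltzmann probability
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

L = 16
spins = rng.choice([-1, 1], size=(L, L))
for _ in range(100):
    metropolis_sweep(spins, beta=0.5)  # beta above the critical value ~0.44
print(abs(spins.mean()))  # magnetization per site, between 0 and 1
```

At low temperature (large beta) the spins align and the magnetization approaches 1; near the phase transition, simulations like this slow down dramatically, which is exactly the bottleneck the article describes next.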

However, like all simulations, the trouble starts when we want to look at lots of particles at the same time. The computation time required becomes particularly prohibitive near phase transitions, where dynamics slows down and the "correlation length," a measure of how the state of one atom relates to the state of another some distance away, grows larger and larger. This is a real dilemma if we want to apply these findings to the real world: real materials contain orders of magnitude more atoms and molecules than simulated systems do.

That's why a team led by Professors Yutaka Okabe and Hiroyuki Mori of Tokyo Metropolitan University, in collaboration with researchers at the Shibaura Institute of Technology and the Bioinformatics Institute of Singapore, has been studying how to reliably extrapolate smaller simulations to larger ones using a concept known as the inverse renormalization group (RG). The renormalization group is a fundamental concept in the understanding of phase transitions and led Kenneth Wilson to the 1982 Nobel Prize in Physics. Recently, the field found a powerful ally in convolutional neural networks (CNNs), the same machine learning tool that helps computer vision identify objects and decipher handwriting. The idea is to give an algorithm the state of a small array of particles and have it "estimate" what a larger array would look like. There is a strong analogy to super-resolution imaging, where blocky, pixelated images are used to generate smoother images at a higher resolution.

The team has been looking at how this applies to "spin" models of matter, where particles interact with other nearby particles via the direction of their "spins." Previous attempts have particularly struggled to apply this to systems at temperatures above a phase transition, where configurations tend to look more random. Now, instead of using spin configurations, i.e., simple snapshots of which direction the particle spins are pointing, they considered correlation configurations, where each particle is characterized by how similar its own spin is to that of other particles, specifically those that are very far away. It turns out correlation configurations contain more subtle cues about how particles are arranged, particularly at higher temperatures.

Like all machine learning techniques, the key is to be able to generate a reliable "training set". The team developed a new algorithm called the block-cluster transformation for correlation configurations to reduce these down to smaller patterns. Applying an improved estimator technique to both the original and reduced patterns, they had pairs of configurations of different size based on the same information. All that's left is to train the CNN to convert the small patterns to larger ones.

The group considered two systems, the 2D Ising model and the three-state Potts model, both key benchmarks for studies of condensed matter. For both, they found that their CNN could use a simulation of a very small array of points to reproduce how a measure of the correlation, g(T), changes across the phase transition point in much larger systems. Compared with direct simulations of larger systems, the same trends were reproduced for both models once a simple temperature rescaling, based on data at an arbitrary system size, was applied.

A successful implementation of inverse RG transformations promises to give scientists a glimpse of previously inaccessible system sizes, and help physicists understand the larger scale features of materials. The team now hopes to apply their method to other models which can map more complex features such as a continuous range of spins, as well as the study of quantum systems.

Credit: 
Tokyo Metropolitan University

Lessening the cost of strategies to reach the Paris Agreement

image: Temperature stabilization and overshoot pathways and best available conversion factors for methane under each illustrative pathway.

Image: 
NIES

Five researchers shed new light on a key aspect of reducing greenhouse gas (GHG) emissions: they provided the first economic analysis of the factors used to convert other GHGs, such as methane, into their CO2 equivalent under overshoot scenarios. Although the United Nations Framework Convention on Climate Change (UNFCCC) is considering settling on one reference value (known as a "Common Metric") to make this conversion under the Paris Agreement, the models presented here show the economic advantage of flexibility between conversion factors. "A key notion in the UNFCCC is to reduce GHG emissions in the least costly way so as to ensure global benefits," highlights Katsumasa Tanaka, primary author of the Science Advances study.

The research provides a series of dynamic variations of conversion factors, depending on possible global warming trajectories, that lessen the economic cost while maintaining enough stability for policies to be anticipated and implemented. The authors considered different scenarios: one in which the Paris Agreement's objectives of stabilization at 2°C and 1.5°C are reached, and others in which these objectives are overshot and efforts must be strengthened later on. Overshoot scenarios would violate the Paris Agreement, but the authors argue that such possibilities cannot be ruled out given current near-term climate policies. They further note that the feasibility of these scenarios depends on very deep mitigation later this century. They applied conversion factors in a numerical model and simulated the additional mitigation costs in all these scenarios to identify the most favourable values.

The choice of a common conversion factor

A system for converting emissions into CO2 equivalents is used to determine the contribution of different GHGs over a given time horizon and to prioritize actions. A well-known example is the global warming potential (GWP). To enable comparison among the parties to the Paris Agreement, the 100-year global warming potential (GWP100) was chosen as the reference. Because greenhouse gases have very different lifespans and radiative impacts, this conversion depends on the choice of time horizon.

"With GWP100 we look at the cumulative greenhouse effect over a 100-year period, which for methane gives a conversion factor of 28. This means 1 kilogram of methane is 28 times more potent than a kilogram of CO2," explains Johannes Morfeldt, a Sweden-based collaborator on the study. Yet, since methane has a shorter lifespan and a higher radiative impact than CO2, its cumulative effect over 20 years (GWP20) is much more significant: 84 times that of 1 kilogram of CO2.

Changing the time horizon changes the conversion factor, and therefore influences which gas rises on the agenda. If a kilogram of methane counts 84 times as much as one of CO2, it becomes more efficient to lower global emissions by reducing methane. A debate has been underway since the 1990s over which conversion factor should be used, and this team of researchers set out to provide additional information on the economic cost of each choice in light of possible global warming pathways.
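The arithmetic behind the choice of metric can be made concrete in a few lines. The sketch below (hypothetical helper names, using only the GWP values quoted above) converts a mass of methane to its CO2 equivalent under either time horizon.

```python
# CO2-equivalent conversion for methane using the values quoted in the article:
# GWP100 = 28, GWP20 = 84.
GWP_CH4 = {"GWP100": 28.0, "GWP20": 84.0}

def co2_equivalent(mass_ch4_kg: float, metric: str = "GWP100") -> float:
    """Convert a mass of methane (kg) to kg of CO2-equivalent."""
    return mass_ch4_kg * GWP_CH4[metric]

print(co2_equivalent(1.0))            # 28.0
print(co2_equivalent(1.0, "GWP20"))   # 84.0
```

The tripling of methane's weight between the two horizons is exactly why the choice of metric reshuffles mitigation priorities.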

"We realized with our model that GWP100 is good for the coming decades, but is far from ideal in the long run," says Philippe Ciais, one of the co-authors of the study. "We don't see much variation in an optimal scenario of stabilization at 2°C. But in an overshoot scenario we observe a large discrepancy between the ideal conversion factors for today and for when we reach 2°C of global warming. If we do not take a dynamic approach and change these values along the way, society will bear an additional cost to mitigate climate change," adds Olivier Boucher, another co-author.

An optimal agenda

The researchers then modelled these additional costs to estimate which conversion factor would be ideal at a given time under different temperature trajectories. They showed that setting GWP100 in stone would bring about additional mitigation costs that could be avoided by switching to a dynamic factor. In a scenario of stabilization at 2°C these additional costs amount to roughly 2%, but in high overshoot scenarios they rise to 5%. "This shows that the ideal conversion factors depend on a time horizon but are also primarily determined by the pathway, and strongly influenced by a temperature overshoot," underlines Daniel Johansson, a Swedish co-author.

This study shows that adapting to possible trajectories by switching from GWP100 to shorter time horizons in the future could spare additional mitigation costs compared to the sole use of GWP100. The researchers also recognize that these values cannot change continuously if policies are to be anticipated and implemented. Thus, they put forward a series of simple combinations of cost-effective conversion factors depending on the possible pathways. The authors suggest that "the UNFCCC and Parties to the Paris Agreement consider adapting the choice of conversion factors to the future pathway as it unfolds, to implement the cheapest options to reduce greenhouse gases emissions."

As the long-term pathway is not yet known, the matter of cost-effectiveness could be included in the technical assessment supporting the global stocktake within the UNFCCC. This key element of the Paris Agreement evaluates the countries' collective progress toward long-term goals every five years and aims to raise the ambition of national policies. Including the cost-effectiveness of conversion factors in this recurring stocktaking process would allow the necessary assessment to be made in time to inform subsequent sessions as the long-term pathway unfolds.

Credit: 
National Institute for Environmental Studies

Elucidating how the production of antibodies is regulated, one cell at a time

image: Each circle represents a different cell. Different colors are cells with different characteristics. The proximity between the circles represents similarity between the genes that these cells are using. This figure shows that the cells divide into two large groups: the most mature (on the bottom right) and the most immature (on the left side)

Image: 
Saumya Kumar (iMM)

A study coordinated by Luís Graça, principal investigator at the Instituto de Medicina Molecular João Lobo Antunes (iMM; Portugal) and Professor at the Faculty of Medicine of the University of Lisbon (FMUL), used lymph nodes, tonsils and blood to show how the cells that control the production of antibodies are formed and act. The results, published in the scientific journal Science Immunology, unveil key aspects of the regulation of antibody production, with significant importance for diseases in which antibody production is dysregulated, such as autoimmune diseases or allergies.

In the last few months we have witnessed the importance of vaccine-induced antibody protection against infections like COVID-19. However, it has been very difficult to study the human cells involved in the production of antibodies after vaccination, as this process takes place in the lymph nodes and not in the blood. To study this process, it was necessary to use emerging technologies for the sequencing and identification of genes in each individual cell. "To understand the power of this technology, we must note that all of our cells have the same genes. However, a cell like a lymphocyte uses a different combination of genes compared to a neuron. Thus, after vaccination, when a lymphocyte starts the process of controlling the production of antibodies, it will turn on some genes and turn off others. This is what we studied for hundreds of cells simultaneously", explains Luís Graça.

The difficulty of the process can be appreciated if we recall that, about 20 years ago, sequencing the human genome required a large group of laboratories in several countries, building on a series of further developments over more than 10 years. Now, this sequenced genome is available for scientists to study the activity of genes in hundreds of individual cells, something that would have been impossible a few years ago. Saumya Kumar, the first author of the work, says: "When the study started four years ago, we did not have the experimental tools needed, and the advances in technology have been extraordinary. Using omics technology offered an incredible solution to this problem and we ended up using it".

The information thus obtained allowed the researchers to study, in great detail, the genes and molecules involved in regulating the production of antibodies. This opens up a wide range of opportunities to manipulate some of these molecules, either to enhance the production of antibodies in vaccines or to decrease the production of antibodies in diseases caused by them (such as autoimmunity or allergy).

In the words of Luís Graça: "When the biological systems of our organism are not properly regulated, disease arises. It is knowledge of the organism's regulation that allows us to correct these pathological situations, restoring the healthy balance of a well-regulated system".

This study also demonstrates that science has no boundaries: the group at iMM includes scientists from different nationalities, with different skills, from clinicians to bioinformaticians.

Credit: 
Instituto de Medicina Molecular