Culture

Grasshoppers and roadblocks: Coping with COVID-19 in rural Mexico

image: A Zapotec woman making tamales using locally grown maíz, or corn.

Image: 
Jeffrey Cohen

COLUMBUS, Ohio - On the outskirts of some small Indigenous communities in the Mexican state of Oaxaca, a few volunteer guards keep watch along roads blocked by makeshift barricades of chains, stones and wood.

The invader they are trying to stop is COVID-19.

For many of Mexico's Indigenous people, poor and ignored by state and federal governments, the fight against the COVID-19 pandemic is one that rests primarily with themselves, said Jeffrey Cohen, a professor of anthropology at The Ohio State University.

That means they must take steps like limiting access to their villages.

"Most of these communities only have one road in and out," Cohen said. "So these guards, called topiles, block that road so that outsiders with the virus won't get in - and residents won't go to a nearby city and potentially bring the virus back."

Cohen has spent years in the central valleys of Oaxaca conducting anthropological research among the Zapotec people.

In the journal Global Public Health, Cohen recently co-authored an article about how the Indigenous people in Oaxaca are coping with the pandemic. His co-author is Nydia Delhi Mata-Sánchez, his former student and now the rector of the Universidad Tecnológica de los Valles Centrales de Oaxaca.

Oaxaca, in southern Mexico, is one of the country's most ethnically diverse states and home to many Indigenous minority groups, including the Zapotec. It is also one of the poorest states in the country. The Mexican government estimates that nearly 70 percent of the nation's Indigenous population lives in poverty.

Most of the Indigenous communities in Oaxaca are small and isolated, which has kept them comparatively less exposed to the coronavirus than the rest of Mexico.

About two-thirds of the approximately 500 Indigenous and rural communities in Oaxaca had no cases of COVID-19 when Cohen and Mata-Sánchez were doing research for this paper in the early months of the pandemic. Now about one-third still have no cases, Cohen said.

But as the virus has seeped into their villages, the Zapotec and other Indigenous people are finding ways to cope with the pandemic. One is through setting territorial boundaries like the roadblocks.

Also, village leaders are promoting social distancing and the use of masks. While these measures have become a political issue in many parts of Mexico, as they have in the United States, that's not a problem among most Indigenous communities.

"One of the strengths of these local leaders is that they have a more traditional form of leadership that isn't based on political affiliations," Cohen said.

"The village leaders are generally respected by the people and they are listened to when they promote health measures like wearing a mask and social distancing."

In addition, villagers are rethinking their eating habits and turning to traditional food sources that had lost popularity in recent years as residents traveled to food markets in larger towns for more modern fare.

For example, villagers are collecting wild honey, as they used to do more often in earlier times. And many have returned to eating "chapulines," grasshoppers harvested from the fields and quickly toasted over a fire.

"It is a protein-rich alternative to expensive, store-bought meats that are no longer available locally," Cohen said.

"These are the kinds of foods that never went away completely, but were less popular, especially among younger people who thought of these as things their grandparents ate."

But what may be the most important key to how the Zapotec are coping with the pandemic is the strengthening of the tradition of reciprocity among their peoples.

"It is a more formal arrangement than just helping neighbors as we see in America. It is so important that it has its own name - in the area where I have done research it is called guelaguetza," Cohen said.

When people have become sick with COVID or other diseases, community members will take care of their food crops, and share their water and food. No one is left to fend for themselves, he said.

Although the Zapotec and other Indigenous people in Mexico are fighting the pandemic as best they can, they need more government support.

"There is still so little response at the state and federal level to Indigenous concerns," he said. Many of these needs preceded COVID-19, but the pandemic has exacerbated the issues.

The most pressing concern for most communities is access to clean water, according to Cohen. The lack of potable water increases the risk of intestinal diseases such as cholera, among other health conditions, which can intensify the effects of COVID-19.

In addition, many Indigenous people must travel outside their villages for education, work and health care, which is difficult and dangerous during the pandemic.

"Many people were hurting before COVID-19 and the pandemic is only making things worse," Cohen said. "The Zapotec's best bet, they know, is still themselves."

Credit: 
Ohio State University

Applying quantum computing to a particle process

image: An ATLAS particle collision event display from 2018, showing the spray of particles (orange lines) emanating from the collision of protons, and the detector readout (squares and rectangles).

Image: 
ATLAS Collaboration

A team of researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) used a quantum computer to successfully simulate an aspect of particle collisions that is typically neglected in high-energy physics experiments, such as those that occur at CERN's Large Hadron Collider.

The quantum algorithm they developed accounts for the complexity of parton showers, which are complicated bursts of particles produced in the collisions that involve particle production and decay processes.

Classical algorithms typically used to model parton showers, such as the popular Markov Chain Monte Carlo algorithms, overlook several quantum-based effects, the researchers note in a study published online Feb. 10 in the journal Physical Review Letters that details their quantum algorithm.

"We've essentially shown that you can put a parton shower on a quantum computer with efficient resources," said Christian Bauer, who is Theory Group leader and serves as principal investigator for quantum computing efforts in Berkeley Lab's Physics Division, "and we've shown there are certain quantum effects that are difficult to describe on a classical computer that you could describe on a quantum computer." Bauer led the recent study.

Their approach meshes quantum and classical computing: It uses the quantum solution only for the part of the particle collisions that cannot be addressed with classical computing, and uses classical computing to address all of the other aspects of the particle collisions.

Researchers constructed a so-called "toy model," a simplified theory that can be run on an actual quantum computer while still containing enough complexity that it cannot be simulated using classical methods.

"What a quantum algorithm does is compute all possible outcomes at the same time, then picks one," Bauer said. "As the data gets more and more precise, our theoretical predictions need to get more and more precise. And at some point these quantum effects become big enough that they actually matter," and need to be accounted for.

In constructing their quantum algorithm, researchers factored in the different particle processes and outcomes that can occur in a parton shower, accounting for particle state, particle emission history, whether emissions occurred, and the number of particles produced in the shower, including separate counts for bosons and for two types of fermions.

The quantum computer "computed these histories at the same time, and summed up all of the possible histories at each intermediate stage," Bauer noted.

The research team used the IBM Q Johannesburg chip, a quantum computer with 20 qubits. Each qubit, or quantum bit, is capable of representing a zero, a one, or a state of so-called superposition in which it represents both a zero and a one simultaneously. This superposition is what makes qubits uniquely powerful compared with standard computing bits, which can represent only a zero or a one.
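The superposition described here can be illustrated with a minimal classical simulation of a single qubit. This is a sketch for intuition only, not the team's parton-shower algorithm; the numpy state-vector representation is our assumption:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit's state is a 2-component complex vector.
zero = np.array([1, 0], dtype=complex)           # the |0> state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying a Hadamard gate puts the qubit into an equal superposition.
qubit = hadamard @ zero

# The Born rule gives measurement probabilities: |amplitude|^2.
probs = np.abs(qubit) ** 2
print(probs)                                     # approximately [0.5 0.5]
```

Measuring such a qubit yields 0 or 1 with equal probability, which is the sense in which it "represents both a zero and a one simultaneously" before measurement.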

Researchers constructed a four-step quantum circuit using five qubits, and the algorithm requires 48 operations. Researchers noted that noise in the quantum computer is likely to blame for the differences between its results and those of a noise-free quantum simulator.

While the team's pioneering efforts to apply quantum computing to a simplified portion of particle collider data are promising, Bauer said that he doesn't expect quantum computers to have a large impact on the high-energy physics field for several years - at least until the hardware improves.

Quantum computers will need more qubits and much lower noise to have a real breakthrough, Bauer said. "A lot depends on how quickly the machines get better." But he noted that there is a huge and growing effort to make that happen, and it's important to start thinking about these quantum algorithms now to be ready for the coming advances in hardware.

Such quantum leaps in technology are a prime focus of an Energy Department-supported collaborative quantum R&D center that Berkeley Lab is a part of, called the Quantum Systems Accelerator.

As hardware improves it will be possible to account for more types of bosons and fermions in the quantum algorithm, which will improve its accuracy.

Such algorithms should eventually have broad impact in the high-energy physics field, he said, and could also find application in heavy-ion-collider experiments.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Changing the connection between the hemispheres affects speech perception

image: Top: Stimulation electrodes were centered over CP6 (right hemisphere) and CP5 (left hemisphere). Middle: The interhemispheric phase synchrony was manipulated using 40 Hz TACS with an interhemispheric phase lag of 0° (TACS 0°) or 180° (dotted line, TACS 180°). The colors represent the polarity (positive = red; negative = blue) of the current at the time stamp highlighted by the dotted line. Bottom: Simulation of the electric field strength induced by bi-hemispheric TACS in a template brain. RH: right hemisphere; LH: left hemisphere

Image: 
Basil Preisig

When we listen to speech sounds, the information that enters our left and right ear is not exactly the same. This may be because acoustic information reaches one ear before the other, or because the sound is perceived as louder by one of the ears. Information about speech sounds also reaches different parts of our brain, and the two hemispheres are specialised in processing different types of acoustic information. But how does the brain integrate auditory information from different areas?

To investigate this question, lead researcher Basil Preisig from the University of Zurich collaborated with an international team of scientists. In an earlier study, the team discovered that the brain integrates information about speech sounds by 'balancing' the rhythm of gamma waves across the hemispheres--a process called 'oscillatory synchronisation'. Preisig and his colleagues also found that they could influence the integration of speech sounds by changing the balancing process between the hemispheres. However, it was still unclear where in the brain this process occurred.

Did you hear 'ga' or 'da'?

The researchers decided to apply electric brain stimulation (high density transcranial alternating current stimulation or HD-TACS) to 28 healthy volunteers while their brains were being scanned (with fMRI) at the Donders Centre for Cognitive Neuroimaging in Nijmegen. They created a syllable that was somewhere in between 'ga' and 'da', and played this ambiguous syllable to the right ear of the participants. At the same time, the disambiguating information was played to the left ear. Participants were asked to indicate whether they heard 'ga' or 'da' by pressing a button. Would changing the connection between the two hemispheres also change the way the participants integrated information played to the left and right ear?

The scientists disrupted the 'balance' of gamma waves between the two hemispheres, which in turn affected what the participants reported to hear ('ga' or 'da').

Phantom perception

"This is the first demonstration in the auditory domain that interhemispheric connectivity is important for the integration of speech sound information", says Preisig. "This work paves the way for investigating other sensory modalities and more complex auditory stimulation". "These results give us valuable insights into how the brain's hemispheres are coordinated, and how we may use experimental techniques to manipulate this" adds senior author Alexis-Hervais Adelman.

The findings, to be published in PNAS, may also have clinical implications. "We know that disturbances of interhemispheric connectivity occur in auditory 'phantom' perceptions, such as tinnitus and auditory verbal hallucinations", Preisig explains. "Therefore, stimulating the two hemispheres with (HD-)TACS may offer therapeutic benefits. I will follow up on this research by applying TACS in patients with hearing loss and tinnitus, to improve our understanding of neural attention control and to enhance speech comprehension for this group."

Credit: 
Max Planck Institute for Psycholinguistics

Limited transmission of Covid-19 from open schools but teachers were affected

image: Helena Svaleryd, Professor at Department of Economics, Uppsala University

Image: 
Mikael Wallerstedt

Most countries introduced school closures during the spring of 2020 despite substantial uncertainty regarding their effectiveness in containing SARS-CoV-2. In Sweden, upper-secondary schools moved online while lower-secondary schools remained open. A comparison of parents with children in the final year of lower-secondary and the first year of upper-secondary school shows that keeping the former open had limited consequences for the overall transmission of the virus. However, the infection rate doubled among lower-secondary teachers relative to upper-secondary ones. The infection rate among partners of lower-secondary teachers was 30 percent higher than among their upper-secondary counterparts.

On March 18, 2020, Swedish upper-secondary schools moved to online instruction while lower-secondary schools remained open. This facilitates a comparison of infections and disease between groups that are comparable in other regards. In the study, all PCR-confirmed cases of SARS-CoV-2 and all healthcare registered cases of COVID-19 until the summer break are linked to register data on families and teachers in lower and upper-secondary schools.

Since the age of the student is likely to correlate with the severity of symptoms, student infectiousness and various types of risk behaviour, it is crucial to compare parents of children who are close in age. According to the study, the risk of infection was 17 percent higher among parents whose youngest child was in the final year of lower-secondary school rather than the first year of upper-secondary school. Had lower-secondary schools moved online, the estimates correspond to 500 fewer detected cases among a total of 450 000 lower-secondary parents (4.5 percent of the population). This can be compared to 53 000 PCR-confirmed cases in the total population until the summer break in mid-June.
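As a back-of-the-envelope check, the reported 17 percent relative risk increase and the 500 excess detected cases together imply a counterfactual baseline. This is a sketch that assumes the 17 percent figure acts as a uniform relative risk; the study's actual estimation model may differ:

```python
# Back-of-the-envelope arithmetic on the reported parent figures.
# Assumption: the 17% figure is a uniform relative risk increase.
relative_increase = 0.17     # 17% higher risk among lower-secondary parents
excess_cases = 500           # excess detected cases attributed to open schools
parents = 450_000            # lower-secondary parents in the comparison group

baseline_cases = excess_cases / relative_increase   # cases expected had schools closed
observed_cases = baseline_cases + excess_cases

print(round(baseline_cases))                   # counterfactual detected cases
print(round(100 * excess_cases / parents, 2))  # excess cases as % of all parents
```

Under this simple reading, roughly 2 900 cases would still have been detected among these parents had lower-secondary schools closed, and the excess amounts to about a tenth of a percent of the parent group.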

When comparing lower- to upper-secondary teachers, the study finds that the risk of both PCR-confirmed infection and healthcare treatment due to COVID-19 doubled when schools were kept open. Among 124 occupations, upper-secondary teachers had a median risk of infection, while lower-secondary teachers were the 7th most affected. This comparison excludes healthcare workers, who had markedly different access to PCR testing. By the end of June, 79 out of 39 500 lower-secondary teachers had been hospitalized due to COVID-19, one of whom died. According to the study, this number would have been 46 if lower-secondary schools had closed.

It is well-known that SARS-CoV-2 is transmitted within households. The study finds that the risk of a positive PCR-test was 30 percent higher among partners of lower-secondary teachers than among their upper-secondary counterparts. The estimates for more serious cases of COVID-19 are somewhat lower than for PCR-tests but - just as for parents - these estimates are imprecise.

Closing the schools is a costly measure with potentially long-run detrimental effects for students. The results for parents are in line with theoretical models predicting a limited impact of school closures on the general transmission of SARS-CoV-2. In an international comparison, the precautionary measures undertaken in Swedish schools are best described as mild. Thus, strict measures within open schools cannot explain the relatively minor impact on the overall rate of transmission. The results for teachers suggest that further precautionary measures could be considered.

The study does not analyse the impact of school closures for virus transmission among students. We note, however, that there are few cases of serious illness among the young. In particular, zero deaths from COVID-19 had been recorded among 2-19 year olds in Sweden until mid-summer 2020.

Credit: 
Uppsala University

How comparable different stress tests are

image: Gesa Berretz from the Bochum Biopsychology Department was first author of the review article.

Image: 
RUB, Marquard

Scientists use many different tests to investigate what happens in the brain in people experiencing stress. It is unclear to what extent the various methods with which subjects are placed under stress are comparable to each other. In a meta-analysis, a biopsychology team from Ruhr-Universität Bochum compared 31 previous studies that had investigated stress using functional magnetic resonance imaging (fMRI). The team worked out which regions of the brain are activated as standard during stress and which stress tests trigger similar activation patterns. They describe the results in the journal Neuroscience and Biobehavioral Reviews, published online on 5 February 2021.

To conduct the work, Gesa Berretz, Dr. Julian Packheiser and Prof. Sebastian Ocklenburg from the Department of Biopsychology collaborated with Professor Robert Kumsta, Genetic Psychology, and Professor Oliver Wolf, Cognitive Psychology.

Activation patterns from 31 studies compared

"We know that stress influences the entire information processing pathway, for example attention, working memory and long-term memory," says Gesa Berretz. "But there has not been any consensus so far on how these different situations induce the same feeling of stress and what happens in the brain." Many researchers are trying to clarify this question and use different methods to induce stress in their study participants. While doing this, they measure the brain activity of subjects using fMRI. The activation patterns are stated in the form of coordinates in a three-dimensional space, representing the regions of the brain that were active during the stress test.

The Bochum-based team evaluated 31 studies using what is known as an activation likelihood estimation analysis. During this, the researchers compared the coordinates of the activation patterns from all of the studies and checked statistically to what extent the patterns were similar. Data from 1279 subjects were included. The result: a range of areas of the brain, including the insula, the claustrum, the lentiform nucleus and the inferior frontal gyrus, were always activated, no matter which stress test was applied. "These areas of the brain appear to play a central role in stress," summarises Gesa Berretz.

The potential role of the regions of the brain in stress

The insula is, among other things, associated with pain perception, self-awareness and social perception, and integrates sensory and internal emotional information. It is additionally involved in controlling the hormonal stress response. The claustrum is also responsible for integrating various information and is important for consciousness. The activation of these regions indicates, according to the researchers, that the study participants direct their attention inwards towards their emotional processes when under stress.

The inferior frontal gyrus is responsible for semantic and phonological processing and for working memory. "Activation presumably occurs because many of the methods involve demanding cognitive tasks," the authors assume.

The lentiform nucleus is associated with movement and coordination. Its role in the context of stress is not known. "We speculate that acute stress leads to an increase in general muscle tension and preparation for a possible fight-or-flight response," explains Gesa Berretz.

Two stress tests emerge as outliers

The analysis also found that the methods used largely achieved consistent results and thus appear well-suited to investigating stress. Only two methods, called Cyberball and aversive viewing, represented exceptions. In the first method, subjects were socially excluded during a virtual ball game. The brain activity pattern triggered by this stress test showed fewer overlaps with the activation patterns from other methods. During aversive viewing, the subjects watch unsettling film scenes with violent content, while they are shown neutral film material under control conditions. In some experiments involving this method, the meta-analysis found no differences between the stress and control conditions. As a result, according to the researchers, particular care should be taken when interpreting studies that use these methods.

Credit: 
Ruhr-University Bochum

50 years since decimalisation: A very British compromise

image: A new coin from the Royal Mint marks 50 years since Britain went decimal.

Image: 
The Royal Mint

The move from the ancient system of £sd coinage to a decimal system of 100 pence to the pound was controversial, considered by some to be a sign of the UK's receding importance on the world stage and even as the beginning of the tumultuous relationship with Europe that ultimately led to Brexit.

But as a result of extensive analysis of a quiet backwater of recent history, Andy argues that the move towards decimalisation had little to do with Europe and in fact was a traditional British compromise.

"It was an event that I remember from my youth, but I found that for such a significant decision that affected everybody in the UK, decimalisation is something that is not covered in much depth," says Andy.

"It was the big news story in my teens. I mean, they changed the money! I wondered if anyone had done much about it, so I read around what people say about this period and what people have said recently about Brexit.

Commonwealth and not Europe a major factor

"Some see it as a harbinger of Europeanisation, leading up to the entry of the UK into the EU and ultimately the reaction to that process resulting in Brexit. My research shows that European considerations played virtually no part in that decision. There is no mention of Europe or the EEC in the original Halsbury Report that paved the way to decimalisation."

On the other hand, the Commonwealth was influential in the move to decimal currency, with South Africa, Australia and New Zealand all committing to decimalisation in the 1960s while Conservative and Labour governments in the UK wavered over whether to move in the same direction.

One option for the British currency was for it to be revalued in alignment with European counterparts like the Franc and the Deutschmark, which were around a tenth of the value of sterling. This was never considered seriously in Britain as it was seen as damaging to British prestige globally.

An exchange between Labour chancellor James Callaghan and the BBC's Graham Turner when decimalisation was announced in 1966 underlined the approach:

Callaghan: Speaking for myself I think there's a lot to be said for the pound. Every one of us in Britain is familiar with it, we know what it stands for, abroad they know what it stands for.

Turner: Why have you decided not to use the word 'cent'?

Callaghan: Oh, I much prefer 'penny', why should we go American - penny is a good, it is indeed the oldest coin in Britain, it was originally a silver coin. I see no reason why we should adopt 'cent', it's a miserable sounding word by comparison with penny.

Political and financial wrangles

To understand the development of the new currency, Andy looked at several aspects, "like the way it was managed politically, the lead up to it and how it played up the modernisation agenda - the attempt to arrest a period of British decline".

"I looked at the influence of pressure groups and I found that consumers and retailers, who might have been concerned about currency reform, wanted to follow the lead of Australia and New Zealand in basing the new currency on a unit equal to 10 shillings, or half a pound.

"However, the City and the Bank of England were implacably opposed to any reduction in the value of the major unit, and they decisively influenced the government, playing into the inherent conservative instincts of Harold Wilson's Labour government."

Irish angle still relevant in 2021

The UK's relationship with the Republic of Ireland was another significant element of decimalisation, and is an area of interest for Andy following his MA thesis, which looked at the last days of the Northern Ireland Labour Party in the same era.

He adds that, "Although Ireland eventually decimalised on the same date and on the same basis as the UK, the Irish government delayed their decision for two years, while they gave serious consideration to aligning their currency to that of the main European currencies. This to some extent presaged future developments in relationships between Ireland, the UK and Europe - clearly an issue with huge resonance in the present day.

"I visited the National Archive at Kew, and it was so helpful to be able to access old newspapers from the University, but I also accessed the national archives in Dublin for the Irish angle."

Thesis could be basis for a book

Now retired after working in finance and then teaching in further education, Andy is keen to expand his thesis into a book. COVID-19 put paid to a visit to New Zealand last October where he was due to speak to the Royal New Zealand Numismatic Society, but he was able to present his research at their annual conference via Zoom. Since then a contact from the RNZNS as well as Professor Barry Doyle at Huddersfield have given him ideas for further writing.

Prof Doyle, who supervised Andy's PhD, said, "This is a really original study that explores a topic overlooked by two generations of historians. The thesis was highly commended by the examiners, who were particularly impressed with the way it showed the influence of both the Commonwealth and the Republic of Ireland on the shaping of Britain's currency reform.

"I'm looking forward to seeing Andy's work in book form - it will make a huge contribution to our understanding of the modernisation of the British economy."

Credit: 
University of Huddersfield

Detecting single molecules and diagnosing diseases with a smartphone

Biomarkers play a central role in the diagnosis of disease and assessment of its course. Among the markers now in use are genes, proteins, hormones, lipids and other classes of molecules. Biomarkers can be found in the blood, in cerebrospinal fluid, urine and various types of tissues, but most of them have one thing in common: They occur in extremely low concentrations, and are therefore technically challenging to detect and quantify.

Many detection procedures use molecular probes, such as antibodies or short nucleic-acid sequences, which are designed to bind to specific biomarkers. When a probe recognizes and binds to its target, chemical or physical reactions give rise to fluorescence signals. Such methods work well, provided they are sensitive enough to recognize the relevant biomarker in a high percentage of all patients who carry it in their blood. In addition, before such fluorescence-based tests can be used in practice, the biomarkers themselves or their signals must be amplified. The ultimate goal is to enable medical screening to be carried out directly on patients, without having to send the samples to a distant laboratory for analysis.

Molecular antennas amplify fluorescence signals
Philip Tinnefeld, who holds a Chair in Physical Chemistry at LMU, has developed a strategy for determining levels of biomarkers present in low concentrations. He has succeeded in coupling DNA probes to tiny particles of gold or silver. Pairs of particles ('dimers') act as nano-antennas that amplify the fluorescence signals. The trick works as follows: Interactions between the nanoparticles and incoming light waves intensify the local electromagnetic fields, and this in turn leads to a massive increase in the amplitude of the fluorescence. In this way, bacteria that contain antibiotic resistance genes and even viruses can be specifically detected.

"DNA-based nano-antennas have been studied for the last few years," says Kateryna Trofymchuk, joint first author of the study. "But the fabrication of these nanostructures presents challenges." Philip Tinnefeld's research group has now succeeded in configuring the components of their nano-antennas more precisely, and in positioning the DNA molecules that serve as capture probes at the site of signal amplification. Together, these modifications enable the fluorescence signal to be more effectively amplified. Furthermore, in the minuscule volume involved, which is on the order of zeptoliters (a zeptoliter equals 10-21 of a liter), even more molecules can be captured.

The high degree of positioning control is made possible by DNA nanotechnology, which exploits the structural properties of DNA to guide the assembly of all sorts of nanoscale objects - in extremely large numbers. "In one sample, we can simultaneously produce billions of these nano-antennas, using a procedure that basically consists of pipetting a few solutions together," says Trofymchuk.

Routine diagnostics on the smartphone
"In the future," says Viktorija Glembockyte, also joint first author of the publication, "our technology could be utilized for diagnostic tests even in areas in which access to electricity or laboratory equipment is restricted. We have shown that we can directly detect small fragments of DNA in blood serum, using a portable, smartphone-based microscope that runs on a conventional USB power pack to monitor the assay." Newer smartphones are usually equipped with pretty good cameras. Apart from that, all that's needed is a laser and a lens - two readily available and cheap components. The LMU researchers used this basic recipe to construct their prototypes.

They went on to demonstrate that DNA fragments that are specific for antibiotic resistance genes in bacteria could be detected by this set-up. But the assay could be easily modified to detect a whole range of interesting target types, such as viruses. Tinnefeld is optimistic: "The past year has shown that there is always a need for new and innovative diagnostic methods, and perhaps our technology can one day contribute to the development of an inexpensive and reliable diagnostic test that can be carried out at home."

Credit: 
Ludwig-Maximilians-Universität München

Researchers have broken the code for cell communication

image: By studying yeast cells, researchers have successfully mapped how cells communicate and synchronize their behaviour.

Image: 
University of Gothenburg

Knowledge on how cells communicate is an important key to understanding many biological systems and diseases. A research team led by researchers at the University of Gothenburg has now used a unique combination of methods to map the mechanism behind cellular communication. Their findings can potentially improve understanding of the underlying mechanism behind type 2 diabetes.

We know that human communication is important, but communication between the cells in our bodies is just as vital. The processes by which cells synchronize and coordinate their behaviour are required for an organism to function and for human organs to be able to perform their functions.

"How do cells go from monologues to dialogues? How do cells transit from acting as individuals to acting as a community? We need to better understand this complex and difficult-to-study behaviour," says Caroline Beck Adiels, senior lecturer at the Department of Physics at the University of Gothenburg.

Mapping the mechanism behind cellular communication

She is responsible for the study now published in the scientific journal PNAS, in which the researchers established a method for studying cellular communication. In the study, they successfully mapped the mechanism behind cellular communication in the metabolic process, using small culture chambers that allow the control of the environment around the cells.

The researchers chose to study yeast cells, since they are similar to human cells, and their focus is on glycolytic oscillations - a series of chemical reactions during metabolism where the concentration of substances can pulse or oscillate. The study showed how cells that initially oscillated independently of each other shifted to being more synchronized, creating partially synchronized populations of cells.

"One of the unique things with this study is that we have been able to study individual cells instead of simply entire cell populations. This has allowed us to really be able to see how the cells transition from their individual behaviour to coordinating with their neighbours. We have been able to map their behaviour both temporally and spatially, that is to say, when something occurs and in which cell," says Beck Adiels.
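The transition from independent to partially synchronized oscillations can be pictured with a minimal coupled-oscillator sketch. This is a generic Kuramoto-style model with invented parameters, not the study's actual kinetic model of glycolysis:

```python
import cmath
import math
import random

def simulate(n_cells=30, coupling=0.0, steps=1500, dt=0.01, seed=0):
    """Evolve n_cells phase oscillators; return the order parameter r.

    r ranges from roughly 0 (cells oscillating independently) to 1
    (fully synchronized). The coupling term is a stand-in for the
    chemical signalling between neighbouring cells.
    """
    rng = random.Random(seed)
    freqs = [rng.gauss(1.0, 0.1) for _ in range(n_cells)]        # natural frequencies
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_cells)]
    for _ in range(steps):
        updated = []
        for i, p in enumerate(phases):
            # each cell is nudged toward the average phase of the population
            pull = sum(math.sin(q - p) for q in phases) / n_cells
            updated.append(p + dt * (freqs[i] + coupling * pull))
        phases = updated
    return abs(sum(cmath.exp(1j * p) for p in phases)) / n_cells
```

With `coupling=0.0` the population stays incoherent; raising the coupling drives the order parameter toward 1 - the same qualitative shift from monologue to dialogue described above.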

Opens up opportunities for understanding type 2 diabetes

According to Beck Adiels, this knowledge can be applied in many other biological systems and more complex cells where coordinated cell behaviour plays an important role. This type of behaviour is also found in cells such as heart muscle cells and in pancreatic cells, which can be an important piece of the puzzle in diabetes research.

"The study can contribute to understanding how pancreatic cells are regulated and how they secrete insulin, which can help us understand the underlying mechanism behind type 2 diabetes. Eventually, this could contribute to developing new medicines for treating the disease."

The study is a collaboration between eight researchers at Swedish and international universities, and Caroline Beck Adiels emphasizes that this interdisciplinary collaboration has been fundamental in studying the complex behaviour of cells from multiple perspectives.

"I am very proud of this work, which would not have been possible if we had not collaborated across disciplines," she says.

Credit: 
University of Gothenburg

Citizens versus the internet

The Internet has revolutionized our lives - whether in terms of working, finding information or entertainment, connecting with others, or shopping. The online world has made many things easier and opened up previously unimaginable opportunities. At the same time, it presents both individuals and societies with major challenges: The underlying technologies do not necessarily serve users' best interests.

"We're interested in questions such as: How can we create online environments that respect human autonomy and promote truth? And what can people themselves do to avoid being misled and manipulated?", says Anastasia Kozyreva, lead author and researcher at the Center for Adaptive Rationality at the Max Planck Institute for Human Development. The research team began by examining the differences between the online and offline worlds, and identified four major challenges.

User behavior is influenced by manipulative choice architectures. These "dark patterns" steer users toward unintended behaviors; they include advertising that blends into the content or navigation of a page to generate more clicks, or confusing privacy settings that prompt people to share more information than they really want to.

The information presented by AI-powered information architectures is not neutral; it is personalized on the basis of the data collected from users. This means that two people who enter the same term into a search engine will probably be shown different results. That can be helpful if, for example, we want to look up a restaurant and the search engine displays hits in our neighborhood at the top of the list, rather than a restaurant with the same name on the other side of the world. But if we are shown news or political content solely on the basis of our preferences, we risk finding ourselves in a filter bubble where we are no longer exposed to any other opinions.
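The mechanics of such personalization can be sketched in a few lines. The scoring scheme, topics, and preference weights below are invented for illustration, not how any real search engine ranks results:

```python
def rank(results, user_prefs):
    """Order results by base relevance plus the user's topic affinity."""
    return sorted(
        results,
        key=lambda r: r["relevance"] + user_prefs.get(r["topic"], 0.0),
        reverse=True,
    )

results = [
    {"title": "Local pizzeria review", "topic": "food",     "relevance": 1.0},
    {"title": "Election explainer",    "topic": "politics", "relevance": 1.2},
]

# Two users, same query, different learned preferences:
foodie = {"food": 1.0}
wonk   = {"politics": 1.0}
```

The same two results come back in a different order for each user - helpful when looking for restaurants, but applied to news and political content it is exactly the filter-bubble mechanism described above.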

The research team sees false and misleading information as another challenge for people online. Videos and posts propagating conspiracy theories and unverified rumors can spread rapidly through social media, causing real harm. For example, people may decide not to get vaccinated due to misinformation about vaccines, putting themselves and others at risk.

Distracting online environments constantly seek to attract users' attention - whether by means of push notifications, flashing displays, pop-up ads, or constantly updated content. The aim is to capture and hold users' attention for as long as possible: That is the very basis of Internet platforms' business models. We find ourselves spending far more time on our screens than we intended - with no real benefit and at the cost of our attention for other things.

Taking a behavioral science perspective, the researchers propose specific interventions to address these four challenges. They suggest that "boosting tools" can be used to train new competencies and enable better, more autonomous decisions in the online world.

Self-nudging is one of the cognitive tools that people can use to create "healthier" choice and information environments for themselves. Self-nudging empowers people to set up their digital environment in the way that works best for them. This might involve turning off notifications from apps or rearranging one's smartphone home screen so that only useful apps are displayed: the calendar, camera, and maps, for example, along with meditation and weather apps. Everything that is overly distracting, such as social media and games, is better tucked away in folders. The researchers also recommend that users consciously set time limits on their social media use.

"The digital world is full of traps," says Ralph Hertwig, Director of the Center for Adaptive Rationality at the Max Planck Institute for Human Development. "But we can take steps to avoid falling into them. In the same way as we might hide our chocolate stash at the back of the cupboard and put a bowl of apples on the table, we can turn off notifications from apps that permanently demand our attention. Out of sight is out of mind - whether in real life or in the digital world."

And just as we look right and left before crossing a street, we should make a habit of asking certain questions to evaluate the content we encounter online. Questions such as: What is the origin of the information? Which sources are cited? Can I find similar content on reputable websites? This approach can boost users' competence in evaluating the reliability of online information. But Internet platforms could also help users to assess content - for example, by displaying decision trees that remind users to check the source and the facts before sharing content.

More generally, however, policymakers also need to consider putting in place stronger regulatory measures to ensure that Internet users retain control over the digital environment and their personal data - for example, through default privacy settings. Last but not least, the smart and self-determined use of digital technologies needs to be taught in both school and adult education. The earlier, the better.

The researchers emphasize that none of the interventions they propose can singlehandedly counter online manipulation or prevent the spread of misinformation. "It will take a combination of smart cognitive tools, early media literacy education, and a regulatory framework that limits the power of commercial interests to hijack people's attention to make the online world a more democratic and truthful place," says Stephan Lewandowsky, professor of cognitive psychology at the University of Bristol.

Credit: 
Max Planck Institute for Human Development

Scientists identify how harmless gut bacteria "turn bad"

image: Professor Sam Sheppard from the Milner Centre for Evolution at the University of Bath led the study.

Image: 
University of Bath

An international team of scientists has determined how harmless E. coli gut bacteria in chickens can easily pick up the genes required to evolve to cause a life-threatening infection. Their study, published in Nature Communications, warns that such infections not only affect the poultry industry but could also potentially cross over to infect humans.

E. coli is a common bacterium that lives in the intestines of most animals, including humans. It is usually harmless when it stays in the gut, however it can become very dangerous if it invades the bloodstream, causing a systemic infection that can even lead to death.

Avian pathogenic E. coli (APEC) is the most common infection in chickens reared for meat or eggs. It can lead to death in up to 20 per cent of cases and causes multi-million pound losses in the poultry industry. The problem is made worse by increasing antibiotic resistance, and the infections also pose a risk of causing disease in humans.

The team of scientists, led by the Milner Centre for Evolution at the University of Bath, sequenced and analysed the whole genomes of E. coli bacteria found in healthy and infected chickens bred at commercial poultry farms to better understand why and how these normally innocuous bugs can turn deadly.

They found there was no single gene responsible for switching a harmless bacterium into a pathogenic one, but rather that it could be caused by several combinations of a diverse group of genes.

Their results indicate that all bacteria in chicken intestines have the potential to pick up the genes they need to turn into a dangerous infection, through a process called horizontal gene transfer.

Horizontal gene transfer enables bacteria to acquire new genetic material from other bacteria nearby. This can happen by scavenging DNA molecules from dead bacteria; by exchanging strands of DNA by having 'bacterial sex' or by getting infected by viruses which transfer DNA from one bacterium to another.

Professor Sam Sheppard, from the Milner Centre for Evolution at the University of Bath, led the study. He said: "Previously we thought that E. coli became pathogenic by acquiring specific genes from other bugs, often packaged in mobile elements called plasmids.

"But our study compared the genomes of disease-causing and harmless E. coli in chickens and found that they can 'turn bad' simply by picking up genes from their environment.

"Bacteria do this all the time inside the guts of chickens, but most of the time the scavenged genes are detrimental to the bacteria, so it becomes an evolutionary dead end.

"However, there are 26 billion chickens worldwide, representing around 70 per cent of all bird biomass on earth.

"That increases the likelihood of bacteria picking up genes that could help the bacteria survive and turn infectious, or even jump species to infect humans."

The study authors stress the need to monitor strains that are most likely to become pathogenic, so that they can be treated before they become dangerous.

Professor Sheppard said: "We were surprised to find that it's not just a single strain that causes APEC, but any strain can potentially acquire the 'monster combination' of genes needed to turn bad."

Strains with the potential to turn pathogenic could be identified using a similar method to that used to detect variant strains of COVID-19. After whole genome sequencing, rapid PCR tests can be used to probe for specific genes that could lead to an APEC infection.

Professor Sheppard said: "We identified around 20 genes that are common in pathogenic bugs and if we can look out for these key genes in a flock of birds, that would help farmers target those carriers before they cause a problem."
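In code, that kind of screen reduces to searching assembled genomes for known marker sequences and flagging isolates that accumulate too many of them. A toy sketch - the gene names, sequences, and threshold are invented placeholders, not the roughly 20 APEC-associated genes from the study:

```python
def find_markers(genome, markers):
    """Return the names of marker genes whose sequence occurs in the genome."""
    return [name for name, seq in markers.items() if seq in genome]

def flag_risky(genomes, markers, threshold=2):
    """Flag isolates carrying at least `threshold` marker genes."""
    flagged = {}
    for isolate, genome in genomes.items():
        hits = find_markers(genome, markers)
        if len(hits) >= threshold:
            flagged[isolate] = hits
    return flagged

markers = {"geneA": "ATGCC", "geneB": "GGTAA"}   # invented marker sequences
genomes = {
    "isolate_1": "TTATGCCTTGGTAATT",   # carries both markers
    "isolate_2": "TTATGCCTT",          # carries only geneA
}
```

In practice the same question is asked with PCR probes rather than string search, but the logic - flag the isolates that carry too many risk genes before they cause a problem - is the same.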

Credit: 
University of Bath

Algorithm that performs as accurately as dermatologists

image: Sam Polesie, Sahlgrenska Academy, University of Gothenburg

Image: 
Photo: University of Gothenburg

A study has now been presented that boosts the evidence for using AI solutions in skin cancer diagnostics. With an algorithm they devised themselves, scientists at the University of Gothenburg show the capacity of technology to perform at the same level as dermatologists in assessing the severity of skin melanoma.

The study, published in the Journal of the American Academy of Dermatology, and its results are the work of a research group at the Department of Dermatology and Venereology at Sahlgrenska Academy, University of Gothenburg.

The study was conducted at Sahlgrenska University Hospital in Gothenburg. Its purpose was, through machine learning (ML), to train an algorithm to determine whether a skin melanoma is invasive and there is a risk of it spreading (metastasizing), or whether it remains at a growth stage in which it is confined to the epidermis, with no risk of metastasis.

The algorithm was trained and validated on 937 dermatoscopic images of melanoma, and subsequently tested on 200 cases. All the cases included were diagnosed by a dermatopathologist.

The majority of melanomas are found by patients rather than doctors. This suggests that, in most cases, diagnosis is relatively easy. Before surgery, however, it is often much more difficult to determine the stage the melanoma has reached.

To make the classifications more accurate, dermatologists use dermatoscopes -- instruments that combine a type of magnifying glass with bright illumination. In recent years, interest in using ML for skin tumor classifications has increased, and several publications have shown that ML algorithms can perform on par with, or even better than, experienced dermatologists.

The current study is now giving a further boost to research in this field. When the same classification task was performed by the algorithm on the one hand and seven independent dermatologists on the other, the result was a draw.

"None of the dermatologists significantly outperformed the ML algorithm," states Sam Polesie, a researcher at the University of Gothenburg and specialist doctor at Sahlgrenska University Hospital, who is the corresponding author of the study.
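The head-to-head comparison boils down to computing each reader's accuracy against the pathology ground truth on the same test cases. A toy version - all labels and predictions here are invented, whereas the study used 200 cases and seven dermatologists:

```python
def accuracy(truth, preds):
    """Fraction of cases classified correctly."""
    return sum(t == p for t, p in zip(truth, preds)) / len(truth)

# 1 = invasive melanoma, 0 = in situ (confined to the epidermis)
truth     = [1, 0, 1, 1, 0, 1, 0, 0]
algorithm = [1, 0, 1, 0, 0, 1, 0, 1]
readers = {
    "reader_1": [1, 0, 0, 1, 0, 1, 0, 1],
    "reader_2": [1, 1, 1, 0, 0, 1, 0, 0],
}

algo_acc    = accuracy(truth, algorithm)
reader_accs = {name: accuracy(truth, preds) for name, preds in readers.items()}
```

In this toy data everyone lands on the same accuracy - a draw. The study additionally tested whether any dermatologist's advantage over the algorithm was statistically significant.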

In a developed form, the algorithm could serve as support in the task of assessing the severity of skin melanoma before surgery. The classification affects how extensive an operation needs to be, and is therefore important for both the patient and the surgeon.

"The results of the study are interesting, and the hope is that the algorithm can be used as clinical decision support in the future. But it needs refining further, and prospective studies that monitor patients over time are necessary, too," Polesie concludes.

Credit: 
University of Gothenburg

Study explores neurocognitive basis of bias against people who look different

PHILADELPHIA--The "scarred villain" is one of the oldest tropes in film and literature, from Scar in "The Lion King" to Star Wars' Darth Vader and the Joker in "The Dark Knight." The trope is likely rooted in a long-evolved human bias against facial anomalies -- atypical features such as growths, swelling, facial paralysis, and scars. A new brain-and-behavior study from researchers in the Perelman School of Medicine at the University of Pennsylvania illuminates this bias on multiple levels.

The researchers, whose findings were published this week in the Annals of the New York Academy of Sciences, used surveys, social simulations, and functional MRI (fMRI) to examine hundreds of participants' responses and attitudes towards attractive, average, and anomalous faces. The findings clarify how the "anomalous-is-bad" stereotype manifests, and implicate a brain region called the amygdala as one of the likely mediators of this stereotype.

"Understanding the psychology of the 'anomalous-is-bad' stereotype can help, for example, in the design of interventions to educate the public about the social burdens shouldered by people who look different," said lead author Clifford Workman, PhD, a postdoctoral researcher in the Penn Center for Neuroaesthetics. The center is led by Anjan Chatterjee, MD, a professor of Neurology at Penn Medicine, who was senior author of the study.

Bias against people with facial disfigurements has been demonstrated in various prior studies. Researchers broadly assume that this bias reflects ancient adaptive traits which evolved to promote healthy mate selection, for example, and to steer us clear of people who have potentially communicable diseases. Regardless of the cause, for many people, their facial anomalies render them unjust targets of discrimination.

In their study, Workman and colleagues investigated how this bias manifests at different levels, from expressed attitudes towards faces, to actual behavior during simulated social interactions, and even down to brain responses when viewing faces.

In one part of the study, the researchers showed a set of faces that were either average-looking, attractive, or anomalous to 403 participants from an online panel, and asked them to rate the depicted people on various measures. The researchers found that, compared to more attractive faces, participants considered anomalous faces less trustworthy, less content, and more anxious, on average. The anomalous faces also made the participants feel less happy. Participants also acknowledged harboring "explicit bias" reflected in negative expectations about people with anomalous faces as a group.

In the other part of the study, Workman and colleagues examined moral attitudes and dispositions, the behavior during simulated social interaction, and fMRI-measured brain responses, for 27 participants who viewed similar sets of faces.

Here again there was some evidence of the anomalous-is-bad habit of thinking, though it was not clear that this translated into mistreatment of people with anomalous faces. For example, in a simulated donation game measuring pro-sociality -- the willingness to be positive and helpful towards another -- the participants were not significantly less pro-social towards anomalous-looking people. However, participants in the highest tier of socioeconomic status, compared to the others, were significantly less pro-social towards anomalous-looking people.

On fMRI scans, brain regions called the amygdala and the fusiform gyri showed significant neural responses specifically to anomalous faces. Activity in a portion of the left amygdala, which correlated with less pro-sociality towards anomalous faces, also seemed related to participants' beliefs about justice in the world and their degree of empathic concern.

"We hypothesize that the left amygdala integrates face perception with moral emotions and social values to guide behavior, such that weaker emotional empathy, and a stronger belief that the world is just, both facilitate dehumanizing people with facial anomalies," Chatterjee said.

Analyzing such responses is inherently challenging, because they involve a mix of subjective perceptions, such as the "visual salience," or relative importance, of a face, and the "emotional arousal" elicited by seeing the face. To inform future research, as part of the study, the team used the fMRI data to clarify which brain regions are associated with these distinct aspects of the experience of seeing faces.

Credit: 
University of Pennsylvania School of Medicine

Going the distance--insights into how cancer cells spread

image: Schematic drawing of polyclonal metastasis. Metastatic cells generate metastatic niche by activation of hepatic stellate cells (HSCs). Non-metastatic cells can survive and proliferate with the presence of such metastatic niche, resulting in polyclonal metastasis development.

Image: 
Kanazawa University

Most tumors consist of a heterogeneous mix of cells. Genetic mutations found only in some of these cells are known to aid with the spread and progression of cancer. However, oncologists often find that when tumors metastasize to distant organs, they retain this heterogeneous nature--a phenomenon termed "polyclonal metastasis". The mechanism by which non-metastatic cells accompany the metastatic cells remains unclear. Now, Masanobu Oshima and his research team have used mouse models to explain how non-metastatic cells begin their long commute.

The team has previously developed various cancerous mutants of mice and analyzed them closely to reveal which cancer cells inherently spread and which do not. Cells with four mutations, colloquially termed AKTP, proved the most lethal. When these cells were transplanted into the spleens of mice, they migrated to and formed colonies in the livers within 3 days. In contrast, cells with two mutations, AK and AP, could not traverse this distance. To replicate polyclonal metastasis, AP cells were then co-transplanted with AKTP cells, and indeed both cell types moved into the livers. However, when AP cells were injected directly into the blood (without prior exposure to the AKTP cells), they could not metastasize. Certain processes seemed to be at play when the cells were incubated together.

Next, AKTP cells within the liver tumors were killed to see how that affected the AP cells. The AP cells continued thriving and grew into larger tumors, suggesting they no longer needed the AKTP cells. Thus, at some point in the journey from the spleen to the liver, the AP cells turned dangerous. To identify this point, the researchers traced back the chain of events. Within a day after transplantation, AKTP clusters were found in the sinusoidal vessels of the liver. By 14 days, these clusters had transformed into a mass termed a "fibrotic niche". The same mass was observed with a mix of AP and AKTP cells, but not with AP cells alone. What's more, within this mass the AKTP cells were activating hepatic stellate cells (HSCs), the cells responsible for scarring of liver tissue. Activated HSCs then set up the perfect environment for AP cells to proliferate indefinitely. Harboring the AP cells within this fibrotic environment was, therefore, a key step.

"These results indicate that non-metastatic cells can metastasize via the polyclonal metastasis mechanism using the fibrotic niche induced by malignant cells," conclude the researchers. Targeting this fibrotic niche might be a promising strategy to keep the spread of solid tumors in check.

Credit: 
Kanazawa University

Producing more sustainable hydrogen with composite polymer dots

image: The polymer dots in the black solution (inset image) can absorb more light, and show better photocatalytic properties, than the single-component polymer dots in the coloured solutions.

Image: 
P-Cat

Hydrogen for energy use can be extracted in an environmentally friendly way from water and sunlight, using photocatalytic composite polymer nanoparticles developed by researchers at Uppsala University. In laboratory tests, these "polymer dots" showed promising performance and stability alike. The study has been published in the Journal of the American Chemical Society.

How we are to meet future demand for sustainable energy is a much-debated question. One feasible way to go is hydrogen, which can be produced from renewable resources: water and solar energy. But the process requires what are known as photocatalysts. Traditionally, these have been made of metal-based materials that are often toxic. Instead, a research group headed by Haining Tian at Uppsala University's Ångström Laboratory is working to develop nano-sized organic photocatalysts - "polymer dots" - designed to be both environmentally friendly and cost-effective.

Since polymer dots (Pdots) are so tiny, they are evenly distributed in water. Compared with traditional photocatalysts, this provides a larger reaction surface, which means that more of the absorbed light energy can be converted into and stored as hydrogen gas. The research group has now developed a Pdot containing three components. In tests, the particle has shown very good catalytic performance and stability.

"Combining several components that absorb light at different wavelengths is the easiest way to create a system in which all the visible surfaces capture light. But getting these components to work well together in a photocatalytic system is challenging," says Haining Tian, Associate Professor (Docent) of Physical Chemistry at Uppsala University.

To investigate how well the various components work together, Tian and his colleagues used spectroscopic techniques in which the Pdot was exposed to light for a certain length of time. They were thus able to follow how photochemical intermediates were created and, under illumination, disappeared.

"It's exciting to see that both ultrafast energy transfer and electron transfer take place in one particle, and that this helps the system to make use of the light and separate the charge for the catalytic process," says the study's lead author Aijie Liu, a postdoctoral researcher at the Department of Chemistry - Ångström Laboratory.

The researchers have succeeded in optimising the system of triple-component polymer dots so that it catalyses the conversion of solar energy into hydrogen with a 7% efficiency rate at 600 nanometres (nm). This is significantly better than the 0.3% at 600 nm obtained by the group when they were working on Pdots consisting of only one component. One problem has previously been that the photocatalysts degrade prematurely, but now the researchers were unable to discern any distinct degradation even after 120 hours' testing.

Credit: 
Uppsala University

Flowers of St. John's Wort serve as green catalyst

image: The flowers of St. John's Wort (Hypericum perforatum) have not only healing but also catalytic effects.

Image: 
Julia Naumann

Since ancient times, St. John's Wort has been used as a medicinal herb covering a wide range of applications such as the treatment of burns, skin injuries, neuralgia, fibrosis, sciatica and depression. Due to its high medicinal potential, the plant known in technical terminology as Hypericum perforatum even became "Medicinal Plant of the Year" in 2015. Now, scientists at TU Dresden have shown that there is much more to the herb than its healing properties.

To this end, two interdisciplinary groups from biology and inorganic chemistry have joined forces and thus achieved astonishing results.

Originally, the research groups led by botanist Prof. Stefan Wanke and chemist Prof. Jan J. Weigand wanted to synthesize graphene-like 2D structures from natural products in the joint project funded by the Sächsische Aufbaubank (SAB; HyperiPhen project 100315829 in TG70 Bioleben). For this purpose, hypericin, a compound of St. John's Wort, served as a template and starting material. In the course of the investigations, it turned out that hypericin efficiently catalyzes photochemical reactions. Prof. Weigand then came up with the idea of using the dried flowers of St. John's Wort, from which hypericin can be obtained by extraction, as a green and sustainable alternative to common catalysts.

"The chemistry of natural substances and especially the background of botany were completely new to us. The exciting results that came out of it are all the more gratifying. The interdisciplinary project shows how important it is in science to think outside the box," says Prof. Weigand, commenting on the success of the collaboration.

The team is thus following a current trend in modern synthetic chemistry to include sustainable aspects. The search for sustainable, renewable and environmentally friendly photoredox catalysts is proving to be extremely challenging. The results now obtained are all the more promising. The plant compound hypericin, a secondary metabolite from St. John's Wort, is used as the active compound in chemical reactions without the need for prior chemical processing. The Dresden scientists have successfully applied for a German patent for this newly developed method (DE 10 2019 215 871).

Prof. Wanke is also amazed at the success of the collaboration: "Although the research project started with a good idea, bringing it to life was not entirely trivial, as the two working groups first had to 'get to know' each other. Our research fields and methods were far apart. But soon the first unusually exciting results emerged. Everyone involved learned a lot. We would like to continue the research, but the funding is still missing."

Credit: 
Technische Universität Dresden