
Princeton researcher bringing single-cell gene expression studies to a benchtop near you

image: Improvements by the team allow scientists to more easily link genetic perturbations to changes in single-cell gene expression profiles, and to perform Perturb-seq studies more cheaply. Image courtesy of the authors.

Image: 
Replogle, et al.

By disrupting the expression of a particular gene and observing how this change affects the expression of other genes, researchers can learn about the cellular roles of the disrupted gene. New technologies such as Perturb-seq offer unprecedented detail and depth of insight from such genetic disruption studies, but technical and practical hurdles have limited the use of Perturb-seq. A new study by Princeton researcher Britt Adamson and colleagues at the University of California-San Francisco and biotech company 10X Genomics, which appeared March 30th in the journal Nature Biotechnology, aims to change that.

"We present several improvements to this approach, which together lay the groundwork for performing Perturb-seq screens at larger scale and with combinatorial perturbations," Adamson said.

The first step in Perturb-seq is to perturb a set of target genes. This is accomplished using one of a suite of CRISPR-based technologies to edit a cell's genome.

CRISPR techniques are adapted from a defense system found in bacteria and archaea. These organisms contain stretches of DNA (known as clustered regularly interspaced short palindromic repeats, or CRISPRs) captured from the viruses that infect them. Bacteria and archaea use CRISPR sequences to produce RNA fragments that guide CRISPR-associated (Cas) enzymes to viral genomes (DNA, or in some cases RNA). Once there, these enzymes cut up the bound genome and halt infection.

Scientists have repurposed these systems for use in animal cells by using synthetic RNAs called "guides," or sgRNAs, which can target Cas enzymes to a cell's own DNA. There, cleavage introduces heritable mutations and disrupts targeted genes. Alternatively, scientists can employ an inactivated version of Cas that binds to target genes and prevents (CRISPRi) or enhances (CRISPRa) the gene's expression.

The second step in Perturb-seq is to investigate how the perturbation of targeted genes affects the pattern of other genes expressed by cells. This is done using a technique called single-cell RNA sequencing (scRNA-seq), which provides a read-out of gene expression from individual cells. In short, scRNA-seq collects and tags the molecules expressed by genes (called messenger RNAs, or mRNAs) from individual cells within a population of cells.

Using computers, researchers can then read the tags attached to mRNA sequences and group together mRNA identities from each cell. This allows the researchers to evaluate gene expression profiles, or "transcriptomes," from each cell to determine how cells are similar or different from each other.
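Conceptually, this grouping step is a simple barcode-keyed aggregation. The sketch below is purely illustrative - the barcodes and gene names are invented, and real scRNA-seq pipelines also handle sequencing errors, duplicate molecules, and genome alignment:

```python
from collections import defaultdict

# Each sequenced read carries a cell barcode (identifying the cell it
# came from) and the gene its mRNA maps to. These barcodes and gene
# names are invented for illustration.
reads = [
    ("AACGTT", "GAPDH"), ("AACGTT", "ACTB"), ("AACGTT", "GAPDH"),
    ("GGTCAA", "ACTB"), ("GGTCAA", "MYC"),
]

def build_profiles(tagged_reads):
    """Group tagged reads by cell barcode into per-cell gene counts."""
    per_cell = defaultdict(lambda: defaultdict(int))
    for barcode, gene in tagged_reads:
        per_cell[barcode][gene] += 1
    # Convert nested defaultdicts to plain dicts for readability.
    return {cell: dict(counts) for cell, counts in per_cell.items()}

profiles = build_profiles(reads)
# profiles["AACGTT"] is that cell's (toy) expression profile
```

Comparing these per-cell count dictionaries is, in miniature, how researchers determine which cells resemble each other and which do not.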

Importantly, in Perturb-seq, input cell populations carry CRISPR-based perturbations, and because sgRNAs are the key to mapping assembled transcriptomes to those perturbations, sgRNA identities must also be determined for each cell. However, sgRNAs are not captured by standard scRNA-seq methods. Therefore, to conduct Perturb-seq experiments, researchers have previously relied on methods of indirect mapping. Such methods are plagued by technical limitations. For example, they're difficult to use when multiple sgRNAs are delivered to each cell.

These are the problems Adamson and her collaborators, led by first author Joseph M. Replogle, an M.D.-Ph.D. trainee at the University of California-San Francisco, wanted to solve. "We developed protocols for capturing sgRNA sequences on different scRNA-seq platforms," Adamson said.

These new protocols, which the researchers call "direct capture Perturb-seq," provide a way to capture and amplify sgRNAs alongside the cellular transcriptome during scRNA-seq. Importantly, direct capture Perturb-seq allows researchers an easy way to track the presence of multiple sgRNAs in individual cells, which the team showed can also be useful for improving CRISPRi.

To demonstrate other ways in which direct capture Perturb-seq might be applied, the authors used it to replicate and extend the results of an earlier study. The initial study, which examined the effects of disrupting pairs of genes on cell growth, demonstrated that blocking expression of certain genes involved in cholesterol biosynthesis causes buildup of a metabolic intermediate that damages a cell's DNA. With direct capture Perturb-seq, the researchers were able to rapidly characterize how cells respond to repression of those genes, develop a model for how cells manage the effects of that intermediate, and discover what happens to cells when they cannot.

As a second improvement, the team also enabled mRNAs from particular genes of interest to be enriched from scRNA-seq transcriptomes, making scRNA-seq experiments like the ones described above cheaper to perform. "Together, these improvements should help researchers expand the scale of Perturb-seq experiments and better enable efforts to study how genes work," Adamson said.

Credit: 
Princeton University

NRL researchers create electronic diodes beyond 5G performance

image: David Storm, a research physicist, and Tyler Growden, a National Research Council postdoctoral researcher, at the U.S. Naval Research Laboratory with the molecular beam epitaxy system used to grow gallium nitride-based (GaN) semiconductors, Washington, D.C., March 10, 2020. Storm and Growden published their research on GaN semiconductor materials, which showed high yield and performance well suited to high-frequency, high-power electronic devices, in Applied Physics Letters.

Image: 
U.S. Navy photo by Jonathan Steffen

WASHINGTON -- David Storm, a research physicist, and Tyler Growden, an electrical engineer, both with the U.S. Naval Research Laboratory, developed a new gallium nitride-based electrical component called a resonant tunneling diode (RTD) with performance beyond the anticipated speed of 5G.

The fifth-generation network technology is now just starting to roll out across the United States.

Storm and Growden's electronic component diode research findings were published March 19, 2020 in the academic journal Applied Physics Letters.

"Our work showed that gallium nitride-based RTDs are not inherently slow, as others suggested," Growden said. "They compare well in both frequency and output power to RTDs of different materials."

The diodes enable extremely fast transport of electrons by taking advantage of a phenomenon called quantum tunneling. In this tunneling, electrons create current by moving through physical barriers, exploiting their ability to behave as both particles and waves.

Storm and Growden's design for gallium nitride-based diodes displayed record current outputs and switching speeds, enabling applications requiring electromagnetics in the millimeter-wave region and frequencies in terahertz. Such applications could include communications, networking, and sensing.

The team developed a repeatable process that increased the diodes' yield to approximately 90%; previously, typical yields ranged around 20%.

Storm said accomplishing a high yield of operational tunneling devices can be difficult because they require sharp interfaces at the atomic level and are very sensitive to many sources of scattering and leakage.

Sample preparation, uniform growth, and a controlled fabrication process at every step were the key elements behind the diodes' satisfactory on-chip results.

"Until now, gallium nitride was difficult to work with from a manufacturing perspective," Storm said. "I hate to say it, but our high yield was as simple as falling off a log, and a lot of it was due to our design."

Storm and Growden said they are committed to continuing to refine their RTD design to improve current output without sacrificing power. They performed the work with colleagues at Ohio State University and Wright State University, as well as with industry partners.

Credit: 
Naval Research Laboratory

Testing suggests 3% of NHS hospital staff may be unknowingly infected with coronavirus

Hospital staff may be carrying SARS-CoV-2, the coronavirus that causes COVID-19 disease, without realising they are infected, according to a study by researchers at the University of Cambridge.

Patients admitted to NHS hospitals are now routinely screened for the SARS-CoV-2 virus, and isolated if necessary. But NHS workers, including patient-facing staff on the front line, such as doctors, nurses and physiotherapists, are tested and excluded from work only if they develop symptoms of the illness. Many of them, however, may show no symptoms at all even if infected, as a new study published in the journal eLife demonstrates.

The Cambridge team pro-actively swabbed and tested over 1,200 NHS staff at Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust, throughout April. The samples were analysed using a technique called PCR to copy and read the genetic information of material present on the swab, producing a colour change whenever the coronavirus was present in a specimen. At the same time, staff members were asked about relevant coronavirus symptoms.

Of the more than 1,000 staff members reporting fit for duty during the study period, 3% nevertheless tested positive for the coronavirus. On closer questioning, around one in five reported no symptoms, two in five had very mild symptoms that they had dismissed as inconsequential, and a further two in five reported COVID-19 symptoms that had stopped more than a week previously.

To probe routes of possible transmission of the virus through the hospital and among staff, the researchers also looked at whether rates of infection were greater among staff working in "red" areas of the hospital, those areas caring for COVID-19 patients. Despite wearing appropriate personal protective equipment (PPE), "red" area staff were three times more likely to test positive than staff working in COVID-19 free "green" areas. It's not clear whether this genuinely reflects greater rates of transmission from patients to staff in red areas. Staff may have instead transmitted the virus to each other or acquired it at home. Staff working in the "red" areas were also swabbed earlier in the study, closer to when the lockdown was first initiated, so the higher rates of infection in this group might just be a symptom of higher rates of virus circulating in the community at the time.

Nevertheless, extrapolating these results to the more than half a million patient-facing staff working across the NHS UK-wide suggests that as many as 15,000 workers may have been on duty and infected, with the potential to transmit the virus to co-workers, family members and patients, during the month of April. In fact, this figure could be even higher in settings where the supply of PPE has been very problematic.
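The extrapolation is simple arithmetic on the article's round figures - a 3% positivity rate applied to roughly half a million patient-facing staff:

```python
# Figures taken from the article; the staff total is the article's
# rough "more than half a million" estimate, not an exact census.
positivity_rate = 0.03          # 3% of fit-for-duty staff tested positive
patient_facing_staff = 500_000  # patient-facing NHS staff, UK-wide

estimated_infected = round(positivity_rate * patient_facing_staff)
print(estimated_infected)  # → 15000
```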

The implications of the new study, say senior authors Dr Mike Weekes and Professor Stephen Baker from the Cambridge Institute of Therapeutic Immunology and Infectious Disease (CITIID), are that hospitals need to be vigilant and introduce screening programmes across their workforces.

"Test! Test! Test! And then test some more," Dr Weekes explains. "All staff need to get tested regularly for COVID-19, regardless of whether they have any sort of symptoms - this will be vital to stop infection spreading within the hospital setting."

Credit: 
University of Cambridge

Vitamin D determines severity in COVID-19 so government advice needs to change

Researchers from Trinity College Dublin are calling on the government in Ireland to change recommendations for vitamin D supplements [Monday 11th May 2020].

A new publication from Dr Eamon Laird and Professor Rose Anne Kenny, School of Medicine, and the Irish Longitudinal Study on Ageing (TILDA), in collaboration with Professor Jon Rhodes at University of Liverpool, highlights the association between vitamin D levels and mortality from COVID-19.

The authors of the article, just published in the Irish Medical Journal, analysed all European adult population studies, completed since 1999, which measured vitamin D and compared vitamin D and death rates from COVID-19. The article can be viewed at: http://imj.ie/irish-medical-journal-may-2020-vol-113-no-5/

Vitamin D is produced in the skin from UVB sunlight exposure and is transported to the liver and then the kidney, where it is converted into an active hormone. This hormone increases the absorption of calcium from food in the gut, ensuring there is enough calcium to keep the skeleton strong and free of osteoporosis.

But vitamin D can also support the immune system through a number of immune pathways involved in fighting SARS-CoV-2. Many recent studies confirm the pivotal role of vitamin D in viral infections.

This study shows that, counter-intuitively, populations at lower latitudes in typically sunny regions, such as Spain and northern Italy, had low concentrations of vitamin D and high rates of vitamin D deficiency. These regions also experienced the highest infection and death rates in Europe.

The northern-latitude countries of Norway, Finland and Sweden have higher vitamin D levels, despite less UVB sunlight exposure, because supplementation and fortification of foods are more common. These Nordic countries also have lower COVID-19 infection and death rates. The correlation between low vitamin D levels and death from COVID-19 is statistically significant.
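The cross-country comparison rests on an ordinary correlation between national vitamin D levels and death rates. A minimal pure-Python sketch of a Pearson correlation computation follows; the input numbers are invented placeholders, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented placeholders: mean vitamin D level (nmol/L) per country
# versus COVID-19 deaths per million. A strongly negative r would
# mirror the inverse association the authors describe.
vitamin_d = [45, 50, 60, 65, 80]
deaths_per_million = [500, 420, 300, 250, 100]
r = pearson_r(vitamin_d, deaths_per_million)
```

A value of r near -1 indicates that higher vitamin D levels track with lower death rates; judging statistical significance additionally requires a test against the sample size, which the authors report.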

The authors propose that, whereas optimising vitamin D levels will certainly benefit bone and muscle health, the data suggests that it is also likely to reduce serious COVID-19 complications. This may be because vitamin D is important in regulation and suppression of the inflammatory cytokine response, which causes the severe consequences of COVID-19 and 'acute respiratory distress syndrome' associated with ventilation and death.

Professor Rose Anne Kenny said:

"In England, Scotland and Wales, public health bodies have revised recommendations since the COVID-19 outbreak. Recommendations now state that all adults should take at least 400 IU vitamin D daily. Whereas there are currently no results from randomised controlled trials to conclusively prove that vitamin D beneficially affects COVID-19 outcomes, there is strong circumstantial evidence of associations between vitamin D and the severity of COVID-19 responses, including death."

"This study further confirms this association. We call on the Irish government to update guidelines as a matter of urgency and encourage all adults to take supplements during the COVID-19 crisis. Deficiency is frequent in Ireland. Deficiency is most prevalent with age, obesity, in men, in ethnic minorities, in people with diabetes, hypertension and in nursing homes."

Dr Eamon Laird added:

"Here we see observational evidence of a link of vitamin D with mortality. Optimising vitamin D intake to public health guidelines will certainly have benefits for overall health and support immune function. Research like this is still exploratory and we need further trials to have concrete evidence on the level of vitamin D that is needed for optimal immune function. However, studies like this also remind us how low our vitamin D status is in the population (even in sunny countries) and add further weight to some sort of mandatory vitamin D fortification policy. If the Nordic countries are allowed to do this, there is no reason Ireland, the UK or the rest of Europe can't either."

Credit: 
Trinity College Dublin

Passive immunization may slow down SARS-CoV-2 and boost immunity in patients, buying time

image: The convalescent plasma option for COVID-19 treatment

Image: 
The Hashemite University

Amsterdam, May 12, 2020 - The worldwide COVID-19 pandemic has infected more than 4 million people and killed close to 280,000 [1]. Finding a vaccine has become a global public health priority. However, creating a viable vaccine might take a long time; scientists estimate a vaccine may be available in 12 to 18 months. A potential interim solution reported in the International Journal of Risk & Safety in Medicine may be a passive vaccine, or passive immunization (PI), which can provide instant, short-term fortification against infectious agents.

"Using valuable plasma from recovered patients might be useful in our global war against COVID-19," explained Foad Alzoughool, PhD. He and his co-author, Lo'ai Alanagreh, PhD, both of the Department of Laboratory Medical Sciences, Faculty of Applied Medical Sciences, The Hashemite University, Jordan, have studied the application of PI in previous pandemics and conclude that this approach is a potential solution to address the immediate health threat of COVID-19.

After exposure to a viral infection, an individual's body creates antibodies to fight off the virus. These antibodies in the blood of a recovered patient can be collected as convalescent plasma and transferred to the blood of a newly infected patient where it can neutralize the pathogen, eliminate it from the blood stream, and boost immunity. While PI does not provide long-term protection against the virus, it can reduce the aggressiveness and mortality of an infection.

The use of passive immunization dates to the beginning of the twentieth century, during the Spanish flu pandemic, when patients who received convalescent plasma serum had lower mortality rates than others. Experimental use of PI during outbreaks of Ebola virus, chikungunya virus and the H1N1 flu also shows the potential of PI in the prevention and treatment of viral infections.

There is evidence as well of the effectiveness of the PI technique in the SARS-CoV epidemic in Guangdong, China, and the MERS-CoV outbreak in Saudi Arabia, particularly when it is introduced soon after symptom onset. In one report, patients who received PI had a significantly shorter hospital stay and lower mortality than other groups. In another, patients who received convalescent plasma before day 14 of illness had a higher discharge rate. Healthcare workers who were infected with SARS-CoV and failed to respond to treatment survived after transfusion with convalescent plasma.

"If you are looking for COVID-19 treatment, you will find it in the blood of survivors," said Dr. Alanagreh. "In a time when no registered antiviral drug or vaccine is available, PI might help in slowing down the deadly virus and save lives, particularly for the elderly and patients with pre-existing conditions."

More than 1.5 million people have recovered from the disease [2], and many of them would be willing to donate plasma to help slow down the pandemic. Dr. Alzoughool and Dr. Alanagreh noted, importantly, that practicing this method now will help health systems be prepared in case a second wave of disease occurs.

Credit: 
IOS Press

Do democracies behave differently from non-democracies when it comes to foreign policy?

The question of whether democracies behave differently from non-democracies is a central, and intense, debate in the field of international relations. Two intellectual traditions - liberalism and realism - dominate. Liberals argue that democracies do indeed behave differently, while realists insist that regime type and ideology are of little relevance in understanding foreign policy behavior.

Arman Grigoryan, a faculty member in the Department of International Relations at Lehigh University, has contributed to this debate with a recent article in a top journal, International Security. Grigoryan has focused on a particularly controversial subtopic of this debate, which is whether supporting and spreading democracy is an important priority for democratic states. His answer to that question: No.

Two events triggered Grigoryan's decision to write the paper. The first was the democratic mass movement in his native Armenia in 2007-2008 - or, more accurately, the posture the West adopted toward that movement, which Grigoryan describes as one "between indifference and hostility." The other was the mass movement in another post-Soviet state - the one in Ukraine in 2013-2014 - which the West quickly mobilized to support. Grigoryan found the liberal arguments about the motives for supporting the Ukrainian movement, which were all about supporting a force for democracy, suspect, given what he had observed in Armenia.

"What made the desire to examine this puzzle even more urgent were the realist criticisms of the policy as driven by 'liberal delusions,'" says Grigoryan. "Realism is the theory which has traditionally dismissed claims about the causal relevance of states' regime types and ideological commitments. Yet now even realists were arguing that the policy was driven by ideology, even if they were criticizing it as wrong-headed. But was it ideology?"

A systematic comparison of the West's reactions to the movements in Ukraine and Armenia provided an opportunity to answer that question. The two cases were very similar on most dimensions, yet the outcomes could not have been more different. If ideology drove the policy in Ukraine, why did it not do the same in Armenia? Grigoryan wondered.

Grigoryan focuses on another motive of the West's behavior to answer the question: the rollback of Russian influence in the post-Soviet space.

In Ukraine the purported liberal motive and the motive to pull Ukraine out of Russia's strategic orbit pulled in the same direction, because the Ukrainian movement was intensely hostile to Russia. Support for the Ukrainian movement, in other words, was not particularly informative as far as liberal motives were concerned.

"In Armenia these two motives pushed in opposite directions, because the Armenian mass movement did not have an anti-Russian or any other kind of geopolitical coloring," says Grigoryan. "The lack of Western solidarity with the Armenian movement, therefore, was much more informative."

Grigoryan anticipates an important skeptical question in the article: could the finding from the comparative analysis of the West's reactions to the mass movements in Ukraine and Armenia be nothing more than a strange anomaly, a deviation from an otherwise strict pattern? He dedicates a part of the paper to the examination of the West's overall record in order to answer that question.

He argues that what happened in Ukraine and Armenia is very much in line with the overall pattern.

"Democracy has been supported when such support has dovetailed with certain material interests - geopolitical, economic, or corporate - and never when such interests have diverged from the liberal preference for democracy," says Grigoryan.

Credit: 
Lehigh University

Journal of Dental Research study: Fluoridation is not associated with increase in osteosarcoma

May 11, 2020, Alexandria, Va., USA--The Journal of Dental Research published today the results of a study that demonstrated that community water fluoridation is not associated with increased risk of osteosarcoma.

More than sixty percent of the U.S. population has access to community water fluoridation, considered to be one of the most important public health policies of the twentieth century due to its reduction of tooth decay at the population level. Fluoride ingestion has been suggested as a possible risk factor for osteosarcoma based on a 1990 animal study. Six of the seven subsequent case-control studies in humans reported that fluoride in drinking water was not associated with osteosarcoma.

This study assessed whether living in a fluoridated community was a risk factor for osteosarcoma by performing a secondary data analysis using data collected from two separate, but linked studies. Patients for both Phase 1 and Phase 2 were selected from U.S. hospitals using a hospital-based matched case-control study design. For both phases, cases were patients diagnosed with osteosarcoma and controls were patients diagnosed with other bone tumors or non-neoplastic conditions.

In Phase 1, cases (N=209) and controls (N=440) were patients of record in the participating orthopedic departments from 1989-1993. In Phase 2, cases (N=108) and controls (N=296) were incident patients who were identified and treated by orthopedic physicians from 1994-2000. The analysis included all patients who met eligibility criteria and had complete data on covariates, exposures, and outcome. Conditional logistic regression was used to estimate odds ratios (OR) and 95% confidence intervals (CI) for the association of community water fluoridation with osteosarcoma.

The adjusted OR for osteosarcoma and ever having lived in a fluoridated area was 0.51 (95% CI 0.31-0.84; p = 0.008) for non-bottled-water drinkers. The corresponding adjusted OR for bottled-water drinkers was 1.86 (95% CI 0.54-6.41; p = 0.326).
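The study itself estimated these values with conditional logistic regression on matched data, which adjusts for the covariates listed below. As a simpler, unadjusted illustration of what an odds ratio and its 95% confidence interval mean, here is a sketch using a plain 2x2 table with invented counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Invented counts for illustration only.
or_, lo, hi = odds_ratio_ci(a=40, b=120, c=60, d=90)
# or_ is 0.50; the interval works out to roughly (0.31, 0.81)
```

An OR below 1 with a confidence interval that excludes 1, as in the non-bottled-water group above, indicates the exposure is associated with lower odds of the disease.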

"These results indicate that residence in a fluoridated community is not related to an increase in risk for osteosarcoma after adjusting for race, ethnicity, income, distance from the hospital, urban/rural living status, and drinking bottled water. This should not be surprising given that ingestion of fluoridated water is a common exposure and osteosarcoma remains a rare disease," said Chester Douglass of the Harvard School of Dental Medicine's Department of Oral Health Policy and Epidemiology.

Credit: 
International Association for Dental, Oral, and Craniofacial Research

Little skates could hold the key to cartilage therapy in humans

image: A little skate (Leucoraja erinacea) hatchling (left) with a skeletal preparation of a little skate hatchling (right), with cartilage stained blue and mineralized tissues stained pink.

Image: 
Andrew Gillis

WOODS HOLE, Mass. -- Nearly a quarter of Americans suffer from arthritis, most commonly due to the wear and tear of the cartilage that protects the joints. As we age, or get injured, we have no way to grow new cartilage. Unlike humans and other mammals, the skeletons of sharks, skates, and rays are made entirely of cartilage and they continue to grow that cartilage throughout adulthood.

And new research published this week in eLife finds that adult skates go one step further than cartilage growth: They can also spontaneously repair injured cartilage. This is the first known example of adult cartilage repair in a research organism. The team also found that newly healed skate cartilage did not form scar tissue.

"Skates and humans use a lot of the same genes to make cartilage. Conceivably, if skates are able to make cartilage as adults, we should be able to also," says Andrew Gillis, senior author on the study and a Marine Biological Laboratory Whitman Center Scientist from the University of Cambridge, U.K.

The researchers carried out a series of experiments on little skates (Leucoraja erinacea) and found that adult skates have a specialized type of progenitor cell to create new cartilage. They were able to label these cells, trace their descendants, and show that they give rise to new cartilage in an adult skeleton.

Why is this important? There are few therapies for repairing cartilage in humans and those that exist have severe limitations. As humans develop, almost all of our cartilage eventually turns into bone. The stem cell therapies used in cartilage repair face the same issue--the cells often continue to differentiate until they become bone. They do not stop as cartilage. But in skates, the stem cells do not create cartilage as a steppingstone; it is the end result.

"We're looking at the genetics of how they make cartilage, not as an intermediate point on the way to bone, but as a final product," says Gillis.

The research is in its early stages, but Gillis and his team hope that by understanding what genes are active in adult skates during cartilage repair, they could better understand how to stop human stem-cell therapies from differentiating to bone.

Note: There is no scientific evidence that "shark cartilage tablets" currently marketed as supplements confer any health benefits, including relief of joint pain.

Credit: 
Marine Biological Laboratory

Early experiences determine how birds build their first nest

image: A male zebra finch constructing a nest. A new study is examining how early life experiences have a big impact on the birds' construction of their first homes. Photo courtesy of Alexis Breen.

Image: 
Alexis Breen.

Early life experiences of zebra finches have a big effect on the construction of their first homes, according to a new study by researchers at the University of Alberta's Faculty of Science and the University of St Andrews' School of Biology.

The study shows that the presence of an adult bird as well as the types of materials available in early adolescence influence two key aspects of first-time nest building: material preference and construction speed.

"Interestingly, we noted that the preference for different materials, differentiated by colour in our study, is shaped by the juvenile experience of this material--but only in the presence of an adult," said Lauren Guillette, assistant professor in the Department of Psychology and project lead.

"This work is important because it debunks the long-held myth that birds build nests that look like the nest in which they hatched--making nest-building a useful model system to experimentally test how animals learn about physical properties of the world."

In this study, the researchers controlled the environment in which zebra finches grew up. Each bird hatched into a nest of a specific colour - pink or orange. As the birds grew up, they were paired with another bird of the same age. Some pairs were then grouped with an adult bird in an environment that contained nest material of a different colour than the one in which they hatched. Other young pairs experienced only an adult as company, or only nest material and no adult; still other pairs had just each other.

Using these methods the researchers could determine if birds build their first nest with a colour that matches their natal nest, or the colour they experienced while growing up.

The results show that as juvenile zebra finches embarked on building their first nest, most birds preferred to use materials to which they'd had access while growing up - but only if an adult had also been present during that time. Further, birds that had no juvenile access to an adult or to material were between three and four times slower at nest building.

"Together, these results show that juvenile zebra finches combine relevant social and ecological cues--here, adult presence and material colour--when developing their material preference," explained Alexis Breen, lead author on the study who recently obtained a PhD at the University of St Andrews in Scotland.

The study, "Juvenile socio-ecological environment shapes material technology in nest-building birds," was published in Behavioral Ecology (doi: 10.1093/beheco/araa027).

Credit: 
University of Alberta

Acute stress may slow down the spread of fears

New psychology research from the University of Konstanz reveals that stress changes the way we deal with risky information - results that shed light on how stressful events, such as a global crisis, can influence how information and misinformation about health risks spreads in social networks.

"The global coronavirus crisis, and the pandemic of misinformation that has spread in its wake, underscores the importance of understanding how people process and share information about health risks under stressful times," says Professor Wolfgang Gaissmaier, Professor in Social Psychology at the University of Konstanz, and senior author on the study. "Our results uncovered a complex web in which various strands of endocrine stress, subjective stress, risk perception, and the sharing of information are interwoven."

The study, which appears in the journal Scientific Reports, brings together psychologists from the DFG Cluster of Excellence "Centre for the Advanced Study of Collective Behaviour" at the University of Konstanz: Gaissmaier, an expert in risk dynamics, and Professor Jens Pruessner, who studies the effects of stress on the brain. The study also includes Nathalie Popovic, first author on the study and a former graduate student at the University of Konstanz, Ulrike Bentele, also a Konstanz graduate student, and Mehdi Moussaïd from the Max Planck Institute for Human Development in Berlin.

In our hyper-connected world, information flows rapidly from person to person. The COVID-19 pandemic has demonstrated how risk information - such as about dangers to our health - can spread through social networks and influence people's perception of the threat, with severe repercussions on public health efforts. However, whether stress influences this process has never been studied.

"Since we are often under acute stress even in normal times and particularly so during the current health pandemic, it seems highly relevant not only to understand how sober minds process this kind of information and share it in their social networks, but also how stressed minds do," says Pruessner, a Professor in Clinical Neuropsychology working at the Reichenau Centre of Psychiatry, which is also an academic teaching hospital of the University of Konstanz.

To do this, researchers had participants read articles about a controversial chemical substance, then report their risk perception of the substance before and after reading the articles, and say what information they would pass on to others. Just prior to this task, half of the group was exposed to acute social stress, which involved public speaking and mental arithmetic in front of an audience, while the other half completed a control task.

The results showed that experiencing a stressful event drastically changes how we process and share risk information. Stressed participants were less influenced by the articles and chose to share concerning information to a significantly smaller degree. Notably, this dampened amplification of risk was a direct function of elevated cortisol levels indicative of an endocrine-level stress response. In contrast, participants who reported subjective feelings of stress did show higher concern and more alarming risk communication.

"On the one hand, the endocrine stress reaction may thus contribute to underestimating risks when risk information is exchanged in social contexts, whereas feeling stressed may contribute to overestimating risks, and both effects can be harmful," says Popovic. "Underestimating risks can increase incautious actions such as risky driving or practising unsafe sex. Overestimating risks can lead to unnecessary anxieties and dangerous behaviours, such as not getting vaccinated."

By revealing the differential effects of stress on the social dynamics of risk perception, the Konstanz study shines light on the relevance of such work not only from an individual, but also from a policy perspective. "Coming back to the ongoing COVID-19 pandemic, it highlights that we do not only need to understand its virology and epidemiology, but also the psychological mechanisms that determine how we feel and think about the virus, and how we spread those feelings and thoughts in our social networks," says Gaissmaier.

Credit: 
University of Konstanz

Study suggests remnants of human migration paths exist underwater at 'choke points'

image: The sea level in the Bering Strait at the Last Glacial Maximum (20,000 years ago) versus today. Note the intricate archipelago that was present in the past but not today. Its islands (outlined in red) might have served as stepping stones for the first settlers crossing from Asia to North America.

Image: 
Dobson, et al.

LAWRENCE -- Today, sea-level rise is a great concern of humanity as climate change warms the planet and melts ice sheets in Greenland and Antarctica. Indeed, great coastal cities around the world like Miami and New Orleans could be underwater later in this century.

But oceans have been rising for thousands of years, and this isn't the first time they have claimed land once settled by people. A new paper published in Geographical Review shows evidence vital to understanding human prehistory beneath the seas in places that were dry during the Last Glacial Maximum. Indeed, this paper informs one of the "hottest mysteries" in science: the debate over when the first Asians peopled North America.

The researchers behind the paper studied "choke points" -- narrow land corridors, called isthmuses but often better known for the canals that cross them, or constricted ocean passages, called straits. Typically isthmuses would have been wider 20,000 years ago due to lower sea levels, and some straits did not even exist back then.

"We looked at nine global choke points -- Bering Strait, Isthmus of Panama, Bosporus and Dardanelles, Strait of Gibraltar, straits of Sicily and Messina, Isthmus of Suez, Bab al Mandab, Strait of Hormuz and Strait of Malacca -- to see what each was like 20,000 years ago when more water was tied up in ice sheets and glaciers," said lead author Jerry Dobson, professor emeritus of geography at the University of Kansas and president emeritus of the American Geographical Society. "During the Last Glacial Maximum, the ocean surface was 410 feet lower than today. So, worldwide the amount of land that has been lost since the glaciers melted is equivalent to South America."

Dobson has urged dedicated study of this land lost to the sea -- an area of archeological interest he dubs "aquaterra" -- and he thinks global choke points are the best places to begin.

"Look at these same choke points today -- watch the nightly news," he said. "They're centers of ongoing conflict. Notice how the Strait of Hormuz controls the international flow of oil and sparks conflicts. The United States almost went to war a few months ago in a faceoff with Iran over shipping through that choke point. Or, look at the Suez Canal and the role it played in the Suez Crisis of 1956 and Six-Day War of 1967. Choke points, particularly straits, are pivotal to conflicts."

Startling revelations confronted the three authors in all nine regions. In the Bering Strait between Asia and Alaska, for instance, their data led to a "totally new hypothesis" about how people likely migrated across from Siberia to North America. Science writer Fen Montaigne calls it "one of the greatest mysteries of our time . . . when humans made the first bold journey to the Americas." The new study found many unknown, transitory islands that would have acted like stepping stones luring travelers eastward.

"In the Bering Strait only a handful of islands exist today -- but there were literally scores of them at the Last Glacial Maximum," Dobson said. "They started appearing at least 30,000 years ago, and Siberia probably had people about 30,000 to 40,000 years ago. They formed from west to east and then inundated from west to east, which would have pushed them all the way to Alaska. The first islands appeared close enough that Asians could have seen some of them from shore. People might have been lured out to them. Then, more islands kept appearing to their east, so they moved farther step by step. Eventually, even the newest islands were lost to inundation -- so people were forced ultimately to North America."

Three of the study's global choke points surround the Mediterranean Sea. Here, too, draining the ocean uncovers new possibilities for archeological exploration.

At the Isthmus of Suez, where the Suez Canal lies today, the portage between the Red Sea and the Mediterranean Sea would have been 3.5 times as long at the Last Glacial Maximum as it was just prior to construction of the canal. The crossing likely would have been displaced to a western route from Foul Bay, Egypt, to the first cataract of the Nile, thence downriver to the Mediterranean Sea.

The Black Sea was cut off from the world ocean as the sea level dropped below the Bosporus and Dardanelles. Rather than today's 300-kilometer saltwater channel, there was an overland route of 220 miles (about 350 kilometers), one-third of which was a deep lake now submerged beneath the Sea of Marmara. As a result, now-submerged settlements may have existed west of the current mouth of the Dardanelles, offshore near the eastern end of the Gulf of Saros, and beside the eastern and western ends of the Sea of Marmara.

The straits of Sicily and Messina almost severed the Mediterranean Sea into two separate seas, with a dividing gap of 32 miles then versus 88 miles today. The LGM map shows additional islands and coastal plains in an area already known for early settlement. For instance, a 39-foot-long carved monolith, recently discovered by underwater archeologists at a depth of 130 feet, indicates that humans occupied the area about 10,000 years ago.

The KU researcher co-wrote the new study with Giorgio Spada and Gaia Galassi of Urbino University, ocean scientists who applied glacial isostatic adjustment (GIA) models, accounting for deformation and gravity variations in the sea floor caused by glacial melting and sea-level rise, in order to reconstruct the variation in paleo-topography for the past 30,000 years. Their work yielded much more accurate spatial and temporal resolution of where land was exposed during the Last Glacial Maximum.
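The core idea behind reconstructing exposed land can be illustrated with a toy calculation. This is a deliberately simplified sketch with made-up depth values, not the authors' GIA model, which accounts for crustal deformation and gravity changes rather than applying a single uniform offset:

```python
import numpy as np

# Toy bathymetry grid: depth below present sea level, in meters
# (positive = underwater today). Values are illustrative only.
depth_today_m = np.array([
    [10.0,  60.0, 200.0],
    [90.0, 130.0,  40.0],
])

# Approximate global sea-level drop at the Last Glacial Maximum
# (~410 ft, roughly 125 m). A real reconstruction replaces this
# constant with a GIA model that varies in space and time.
LGM_DROP_M = 125.0

# Cells shallower than the drop would have been dry land at the LGM.
exposed_at_lgm = depth_today_m < LGM_DROP_M
print(exposed_at_lgm)
```

Summing the exposed cell areas over a real global grid is what yields estimates like the South America-sized total quoted below.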

"We have lost an area equivalent to South America in size," Dobson said. "That is an enormous amount of land, and it's even better on average than any continent today. It was all coastal, all flat, and mostly tropical. We have a much better estimate of the size now than we did a few years ago. The difference is because of this new way we calculate sea level. The new model considers how the ocean bottom shifts in response to the weight of the water."

Coastal areas during the Last Glacial Maximum likely would have attracted people, as coastal lands do today. Dobson said archeological exploration is needed to search for boats, ports and settlements - evidence that could revolutionize conceptions of human migration and know-how at that time.

"How much technology was there?" he said. "Were there boats? No boats have ever been found that were that old, but we know people made it from Southeast Asia to Australia 65,000 years ago. So, anthropologists surmise they must have had boats. Even when sea level was at its lowest, the individual hops they had to make were long enough that it would seem likely they had boats. In the new article, we study the history of boats of all kinds based on research published in reputable scientific journals. Maritime travel goes surprisingly far back. So now, what kind of evidence can we find of ports? No one has ever claimed evidence of ports that far back. Of course, ports on coasts 400 feet lower than today would be hard to find, and precious little underwater archaeology has been conducted at that depth. We need to treat boats and ports as unknown and look for the evidence rather than proclaiming whether it did or did not happen."

The KU researcher said choke points should be of interest to geographers, ocean scientists, underwater archeologists, anthropologists and oceanographers because they provide "strategic insights on where to search for submerged evidence of human settlement."

"It's a matter of efficiency," Dobson said. "To understand maritime travel and associated settlements long ago, we can search whole oceans. Underwater searches are expensive, however, so little territory gets searched. Finds are rare because artifacts are few and far between. Choke points funnel travel into narrow corridors, and logically that concentrates the artifacts as well. If there is any evidence, that's where we most likely will find it."

Credit: 
University of Kansas

Genes may play a role in weight gain from birth control

AURORA, Colo. (May 12, 2020) - A woman's genetic make-up may cause her to gain weight when using a popular form of birth control, according to a study from researchers at the University of Colorado Anschutz Medical Campus.

"For years, women have said that birth control causes them to gain weight but many doctors failed to take them seriously," said the study's lead author Aaron Lazorwitz, MD, assistant professor of Obstetrics/Gynecology and Family Planning at the University of Colorado School of Medicine. "Now we have looked at the genetics and found that the way genes interact with some hormones in birth control could help explain why some women gain more weight than others."

The study, published online today in the journal Contraception, specifically looked at the etonogestrel contraceptive implant. The rod-like implant, considered one of the most effective birth control methods, is inserted under the skin and contains etonogestrel, a progestin that inhibits ovulation.

The researchers reviewed medical records to calculate weight changes from the insertion of the implant to the time when the women enrolled in the study. Among 276 ethnically diverse subjects, they found a median weight change of +3.2 kg, or about 7 pounds, over an average of 27 months of use. The majority of subjects (73.9%) experienced weight gain.
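Those figures can be sanity-checked with quick back-of-the-envelope arithmetic (illustrative only, not part of the study's analysis):

```python
# Convert the reported median weight change of 3.2 kg to pounds.
KG_TO_LB = 2.20462  # standard conversion factor
median_gain_lb = 3.2 * KG_TO_LB
print(round(median_gain_lb, 1))  # ~7.1 lb, i.e. "about 7 pounds"

# 73.9% of the 276 subjects gained weight -- roughly 204 participants.
gainers = round(0.739 * 276)
print(gainers)
```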

Drawing on pharmacogenomics, the study of how genes affect a person's response to drugs, Lazorwitz and his colleagues investigated the genetics of the participants and how they might interact with the birth control drug within the implant.

They hypothesized that variants in the genes encoding proteins that break down and interact with progestin and estrogen hormones might be the key. Ultimately, they found that genetic variants in estrogen receptor 1 (ESR1) among some study participants were associated with clinically significant weight gain.

ESR1 is found on chromosome six. It encodes an estrogen receptor involved in cellular hormone binding and DNA transcription when activated. Previous studies also found associations between ESR1 genetic variants and how well other medications work.

Women who had two copies of the ESR1 rs9340799 variant on average gained over 30 pounds more while using the contraceptive implant when compared to all other women in the study.

The study focused on the etonogestrel contraceptive implant, but it is possible that other birth control drugs could have similar interactions with genes that lead to weight gain.

"It is imperative to better understand how individual genetic variation may influence a woman's risk of adverse weight gain while using exogenous steroid hormone medications," Lazorwitz said.

For now, there is no way to predict who might be impacted.

Health care providers can offer counseling about potential weight gain or suggest other forms of birth control, such as copper IUDs, which contain no hormones.

"As our understanding of pharmacogenomics in women's health expands, we can develop individualized counseling that may reduce the incidence of hormone-related adverse effects, improve patient satisfaction, and help prevent future health risks associated with weight gain," Lazorwitz said.

Credit: 
University of Colorado Anschutz Medical Campus

Growing mountains or shifting ground: What is going on in Earth's inner core?

image: A new study of Earth's inner core used seismic data from repeating earthquakes, called doublets, to find that refracted waves, blue, rather than reflected waves, purple, change over time -- providing the best evidence yet that Earth's inner core is rotating.

Image: 
Graphic by Michael Vincent

CHAMPAIGN, Ill. -- Exhaustive seismic data from repeating earthquakes and new data-processing methods have yielded the best evidence yet that the Earth's inner core is rotating, offering a better understanding of the hotly debated processes that control the planet's magnetic field.

The new study by researchers from the University of Illinois at Urbana-Champaign is published in the journal Earth and Planetary Science Letters.

Geologists do not fully understand how the Earth's magnetic field generator works, but suspect it is closely linked to dynamic processes near the inner core-outer core boundary area, the researchers said. Shifts in the location of the magnetic poles, changes in field strength and anomalous seismic data have prompted researchers to take a closer look.

"In 1996, a small but systematic change of seismic waves passing through the inner core was first detected by our group, which we interpreted as evidence for differential rotation of the inner core relative to the Earth's surface," said geology professor and study co-author Xiaodong Song, who is now at Peking University. "However, some studies believe that what we interpret as movement is instead the result of seismic waves reflecting off an alternately enlarging and shrinking inner core boundary, like growing mountains and cutting canyons."

The researchers present seismic data from a range of geographic locations and repeating earthquakes, called doublets, that occur in the same spot over time. "Having data from the same location but different times allows us to differentiate between seismic signals that change due to localized variation in relief from those that change due to movement and rotation," said Yi Yang, a graduate student and lead author of the study.
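The doublet logic can be sketched with a toy example (illustrative only, not the authors' processing pipeline): if the inner core moves between two otherwise identical earthquakes, a wave that samples it arrives slightly shifted in time, and cross-correlating the two records recovers that shift.

```python
import numpy as np

# Synthetic "doublet": the same waveform recorded twice, with the
# second record shifted by a few samples to mimic a travel-time
# change caused by inner-core motion. Purely illustrative.
t = np.arange(400)
pulse = np.exp(-((t - 200) ** 2) / 50.0)   # first event's arrival
true_shift = 3                             # shift, in samples
pulse_later = np.roll(pulse, true_shift)   # second event's arrival

# Cross-correlate the two records; the lag of the correlation peak
# estimates the travel-time change between the doublet events.
lags = np.arange(-len(t) + 1, len(t))
xcorr = np.correlate(pulse_later, pulse, mode="full")
estimated_shift = lags[np.argmax(xcorr)]
print(estimated_shift)  # recovers the 3-sample shift
```

In real data the shift is a small fraction of a second and, as the authors describe, must be measured against a reference wave that bypasses the inner core to remove clock ambiguity.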

The team found that some of the earthquake-generated seismic waves penetrate through the iron body below the inner core boundary and change over time, which would not happen if the inner core were stationary, the researchers said. "Importantly, we are seeing that these refracted waves change before the reflected waves bounce off the inner core boundary, implying that the changes are coming from inside the inner core," Song said.

The basis of the debate lies in the fact that prior studies looked at a relatively small pool of somewhat ambiguous data generated from a method that is highly dependent on accurate clock time, the researchers said.

"What makes our analysis different is our precise method for determining exactly when the changes in seismic signals occur and arrive at the various seismic stations across the globe," Yang said. "We use a seismic wave that did not reach inner core as a reference wave in our calculations, which eliminates a lot of the ambiguity."

This precise arrival time analysis, an extensive collection of the best quality data and careful statistical analysis performed by Yang, are what give this study its power, Song said. "This work confirms that the temporal changes come mostly, if not all, from the body of the inner core, and the idea that inner core surface changes are the sole source of the signal changes can now be ruled out," he said.

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Trouble getting a doctor's appointment may drive Medicaid enrollees to opt for the ER

The expansion of Medi-Cal, California's Medicaid program, gave millions of low-income Californians access to health insurance, but this study conducted in Northern California found that new patients may have to wait up to a month for an appointment with a participating primary care provider, depending on their county of residence. It is not uncommon for Medi-Cal enrollees to visit emergency rooms if they require more immediate care.

This study looks at the variation between contiguous counties in the availability of new patient primary care appointments for Medi-Cal enrollees and at the correlation between primary care access and rates of Medi-Cal patients' emergency room usage. Researchers found that counties where it was more difficult to schedule new patient primary care appointments had higher rates of emergency room usage by Medi-Cal patients. This places a greater strain on already overburdened emergency departments and drives up health care costs overall.
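The county-level comparison the study describes amounts to correlating appointment availability with ER usage. A minimal sketch with hypothetical numbers (not the study's data):

```python
import numpy as np

# Hypothetical county-level figures (illustrative only): average wait
# in days for a new-patient primary care appointment, and ER visits
# per 1,000 Medi-Cal enrollees.
wait_days = np.array([5, 10, 14, 21, 28, 30])
er_visits_per_1000 = np.array([40, 44, 52, 60, 63, 70])

# Pearson correlation: a strong positive value is consistent with the
# study's finding that harder primary care access tracks higher ER use.
r = np.corrcoef(wait_days, er_visits_per_1000)[0, 1]
print(round(r, 2))
```

An r close to +1 means longer waits track higher ER usage in this made-up sample; the study's actual analysis compared contiguous counties.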

How California's challenges compare with those faced by other states that have expanded Medicaid eligibility under the Affordable Care Act is unclear, though the data suggest that "adequate access to primary care will begin to improve health outcomes and control costs among beneficiaries of Medicaid expansion."

Credit: 
American Academy of Family Physicians

Arthritis clinical trial shows support for dextrose injection to alleviate knee pain

A randomized controlled trial conducted by a research team at a primary care clinic at the Chinese University of Hong Kong indicates that intra-articular-only injection therapy with hypertonic dextrose is safe and effective for alleviating symptoms of knee osteoarthritis.

Over 52 weeks, the study followed 76 patients aged 45 to 75 who had been diagnosed with knee osteoarthritis and had suffered moderate to severe chronic knee pain for at least three months. One group of 38 patients received the hypertonic dextrose injection therapy, while the other received the same regimen using normal saline instead. While both groups reported some improvement, the hypertonic dextrose group reported significantly greater reductions in pain by the conclusion of the study. The researchers note that longer-term follow-up, direct comparison with other injection therapies, and cost-effectiveness analysis are all needed.

Credit: 
American Academy of Family Physicians