Fewer scars in the central nervous system

image: The blood clotting protein fibrinogen (red) is deposited in the stem cell niche and regulates the contribution of stem cells (green) to repair mechanisms in central nervous system diseases.

Image: 
Schachtrup lab / University of Freiburg

Neural stem and progenitor cells (NSPCs) from the so-called subventricular zone (SVZ) can help to repair a brain damaged by central nervous system disorders. It is known that the microenvironment within the SVZ directs the differentiation of NSPCs toward cells of the nerve tissue. However, researchers have not yet been able to explain why, after injuries to the central nervous system, NSPCs develop not into neurons but into astrocytes. Astrocytes play a major role in the formation of scars and thus interfere with the regeneration of nerves in the central nervous system. A team led by Prof. Dr. Christian Schachtrup and Lauriane Pous from the Faculty of Medicine at the University of Freiburg has now succeeded in analyzing a further step in these processes in the brain. The scientists present their results in the current issue of Nature Communications.

Schachtrup and his team show that, following damage to the cerebral cortex of mice, fibrinogen from the blood accumulates in the stem cell niche of the SVZ, which lies deeper inside the brain. Fibrinogen is a blood coagulation factor and the precursor of the protein fibrin, which coats and stabilizes the blood platelets that gather at the site of a vascular injury. The researchers discovered that fibrinogen inhibits the neuronal differentiation of NSPCs. At the same time, the accumulated fibrinogen leads to increased astrogenesis, i.e. the formation of new astrocytes, because fibrinogen activates the so-called BMP receptor signaling pathway. When the team experimentally depleted fibrinogen, for example with the snake venom enzyme ancrod, astrocyte formation from NSPCs was blocked and scarring was correspondingly reduced.

"The discovery that an important blood coagulation protein, fibrinogen, can induce an astrogenic milieu in the SVZ stem cell niche, which determines the contribution of NSPCs to repair mechanisms in CNS diseases, has potential implications for several processes in CNS diseases in different stem cell niches," says Schachtrup. With his research, the Freiburg researcher hopes to contribute to making neuronal regeneration processes more treatable through drugs or cell replacement therapies.

Credit: 
University of Freiburg

Novel coronavirus receptors show similarities to SARS-CoV, according to new analysis

Washington, DC - January 30, 2020 - The recent emergence of Wuhan coronavirus (2019-nCoV) has put the world on high alert for transcontinental transmission, reminiscent of the outbreak of SARS--also a coronavirus--in 2002-2003.

Decade-long structural studies by Fang Li of the University of Minnesota and colleagues have shown how the SARS virus (SARS-CoV) interacts with animal and human hosts in order to infect them. The mechanics of infection by the Wuhan coronavirus appear to be similar. These investigators used the knowledge they gleaned from multiple SARS-CoV strains--isolated from different hosts in different years--and from angiotensin-converting enzyme-2 (ACE2) receptors of different animal species to model predictions for the novel Wuhan coronavirus. (Both viruses use ACE2 to gain entry into cells; the receptor's normal role is to help regulate heart function.)

"Our structural analyses confidently predict that the Wuhan coronavirus uses ACE2 as its host receptor," the investigators wrote. That and several other structural details of the new virus are consistent with the ability of the Wuhan coronavirus to infect humans and with some capability to transmit among humans.

"Alarmingly, our data predict that a single mutation [at a specific spot in the genome] could significantly enhance [the Wuhan coronavirus's] ability to bind with human ACE2," the investigators write. For this reason, Wuhan coronavirus evolution in patients should be closely monitored for the emergence of novel mutations at the 501 position in its genome, and to a lesser extent, the 494 position, in order to predict the possibility of a more serious outbreak than has been seen so far.

Credit: 
American Society for Microbiology

Study provides first look at sperm microbiome using RNA sequencing

DETROIT - A new collaborative study published by a research team from the Wayne State University School of Medicine, the CReATe Fertility Centre and the University of Massachusetts Amherst provides the first in-depth look at the microbiome of human sperm utilizing RNA sequencing with sufficient sensitivity to identify contamination and pathogenic bacterial colonization.

"We show that non-targeted sequencing of human sperm RNA has the potential to provide a profile of micro-organisms (bacteria, viruses, archaea)," said Stephen Krawetz, Ph.D., associate director of the C.S. Mott Center for Human Growth and Development at WSU and the Charlotte B. Failing Professor of Fetal Therapy and Diagnosis in the Department of Obstetrics and Gynecology, and the Center for Molecular Medicine and Genetics. "This information was recovered from the data typically cast aside as part of routine nucleic acid sequencing. The enhanced sensitivity and specificity of the sequencing technology as compared to current approaches may prove useful as a diagnostic tool for microbial status as part of the routine assessment as we move toward personalized care."

The study, "What human sperm RNA-Seq tells us about the microbiome" published in the Journal of Assisted Reproduction and Genetics, sought to determine if human sperm RNA sequencing data could provide a sensitive method of detection of micro-organisms, including bacteria, viruses and archaea compared to current methods of targeted culturing. The researchers collected 85 semen samples, isolated the sperm RNA and subjected it to RNA sequencing.

Grace Swanson, Ph.D., a postdoctoral fellow working with Dr. Krawetz, discovered a sample with an abnormally high level of microbial sequences. A closer look showed that the sample contained a considerable amount of Streptococcus agalactiae. A leading cause of neonatal infection during pregnancy and post-delivery, linked to significant mortality in premature births, the bacterium can also be life-threatening in adults, particularly the elderly.

The current method for testing the male reproductive tract microbiome relies on culturing samples. This, the study reported, can be limiting because the majority of pathogens cannot be cultured. RNA sequencing, whose costs have dropped dramatically and continue to decrease, can provide a more complete picture of the human microbiome.

"Given the recent increase and severity of Streptococcus (agalactiae) infection, as well as others in adults, neonates and newborns, non-targeted human sperm RNA sequencing data may, in addition to providing fertility status, prove useful as a diagnostic for microbial status," Dr. Krawetz said.

Credit: 
Wayne State University - Office of the Vice President for Research

National survey: Students' feelings about high school are mostly negative

New Haven, Conn. -- Ask a high school student how he or she typically feels at school, and the answer you'll likely hear is "tired," closely followed by "stressed" and "bored."

In a nationwide survey of 21,678 U.S. high school students, researchers from the Yale Center for Emotional Intelligence and the Yale Child Study Center found that nearly 75% of the students' self-reported feelings related to school were negative.

The study, which appeared in the January edition of the journal Learning and Instruction, also involved a second, "experience sampling" study in which 472 high school students in Connecticut reported their feelings at distinct moments throughout the school day. These momentary assessments told the same story: High school students reported negative feelings 60% of the time.

"It was higher than we expected," said co-author and research scientist Zorana Ivcevic. "We know from talking to students that they are feeling tired, stressed, and bored, but were surprised by how overwhelming it was."

Students were recruited for the survey through email lists of partner schools and through social media channels from nonprofits like the Greater Good Science Center and the Born This Way Foundation. The students represent urban, suburban, and rural school districts across all 50 states and both public and private schools. The researchers found that all demographic groups reported mostly negative feelings about school, but girls were slightly more negative than boys.

"Overall," said co-author Marc Brackett, "students see school as a place where they experience negative emotions."

In the first online survey, students were asked to "think about the range of positive and negative feelings you have in school" and provide answers in three open text boxes. They were also asked to rate on a scale of 0 (never) to 100 (always) how often they felt each of 12 emotions: happy, proud, cheerful, joyful, lively, sad, mad, miserable, afraid, scared, stressed and bored.

In the open-ended responses, the most common emotion students reported was tired (58%). The next most-reported emotions -- all just under 50% -- were stressed, bored, calm, and happy. The ratings scale supported the findings, with students reporting feeling stressed (79.83%) and bored (69.51%) the most.

When those feelings are examined with more granularity, said Ivcevic, they reveal something interesting. The most-cited positive descriptions -- calm and happy -- are vague.

"They are on the positive side of zero," Ivcevic said, "but they are not energized or enthusiastic." Feeling "interested" or "curious," she noted, would reveal a high level of engagement that is predictive of deeper and more enduring learning.

She added that many of the negative feelings may be interrelated, with tiredness, for example, contributing to boredom or stress. "Boredom is in many ways similar to being tired," she said. "It's a feeling of being drained, low-energy. Physical states, such as being tired, can be at times misattributed as emotional states, such as boredom."

The researchers noted that the way students feel at school has important implications in their performance and their overall health and well-being. "Students spend a lot of their waking time at school," Ivcevic said. "Kids are at school to learn, and emotions have a substantial impact on their attention. If you're bored, do you hear what's being said around you?"

Public attention has turned recently to early start times for high schools in the U.S. and how that contributes to sleep deprivation among students, which is associated with a number of other health risks -- including weight gain, depression, and drug use -- and poor academic performance. The American Academy of Pediatrics has recommended that high schools start at 8:30 a.m. or later, but the vast majority start earlier.

"It is possible that being tired is making school more taxing," Ivcevic said, "so that it is more difficult for students to show curiosity and interest. It is like having an extra weight to carry."

Unfortunately, she added, decisions about school start times are often not made with students' health and wellbeing in mind. "There has been a movement in recent years to move school start times later," she said. "The reasons for not moving it have nothing to do with students' wellbeing or their ability to learn." Instead, these decisions are often driven by concerns about athletic programs, extracurricular activities, and transportation.

At the Yale Center for Emotional Intelligence, where Brackett is founding director and lead developer of RULER, an evidence-based approach to social and emotional learning, the goal is to give students and staff the tools to use their emotions wisely. RULER doesn't claim to prevent tiredness and boredom, but it is designed to help students to find an outlet for their feelings and to support teachers and students in developing emotion skills to promote greater engagement and enhance academic performance.

Credit: 
Yale University

Want to change your personality? It may not be easy to do alone

Most people have an aspect of their personality they'd like to change, but without help it may be difficult to do so, according to a study led by a University of Arizona researcher and published in the Journal of Research in Personality.

Contrary to the once-popular idea that people's personalities are more or less set in stone, research has shown that personalities do change throughout the lifespan, often in line with major life events. For example, there is evidence that people tend to be more agreeable and conscientious in college, less extroverted after they get married and more agreeable in their retirement years.

While it's well-established that personalities can change in response to life circumstances, researcher Erica Baranski wondered if people can actively and intentionally change aspects of their personalities at any given point simply because they desire to do so.

She and her colleagues studied two groups of people: approximately 500 members of the general population who ranged in age from 19 to 82 and participated in the research online; and approximately 360 college students.

Both groups completed the 44-item "Big Five Inventory," which measures five key personality traits: extroversion, conscientiousness, agreeableness, openness to experience and neuroticism, the inverse of which is often described as emotional stability. The participants were then asked whether they desired to change any aspect of their personality. If they answered yes, they were asked to write an open-ended description of what they wanted to change.
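For readers curious how such inventories turn item responses into trait scores, here is a minimal Python sketch. The item-to-trait assignments below are invented for illustration; the real BFI-44 uses its own published scoring key, with some items reverse-scored.

# Minimal sketch of Big Five-style inventory scoring.
# The item key below is hypothetical; the real BFI-44 key differs.
TRAIT_ITEMS = {
    # trait: [(item_index, reverse_scored), ...]
    "extroversion":      [(0, False), (5, True)],
    "agreeableness":     [(1, True),  (6, False)],
    "conscientiousness": [(2, False), (7, True)],
    "openness":          [(3, False), (8, False)],
    "neuroticism":       [(4, False), (9, True)],
}

def score_inventory(responses, scale_max=5):
    """Average each trait's items, flipping reverse-scored ones."""
    scores = {}
    for trait, items in TRAIT_ITEMS.items():
        values = [(scale_max + 1 - responses[i]) if reverse else responses[i]
                  for i, reverse in items]
        scores[trait] = sum(values) / len(values)
    return scores

print(score_inventory([4, 2, 5, 3, 4, 1, 5, 2, 3, 4]))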

Across both groups, most people said they desired to increase extroversion, conscientiousness and emotional stability.

The college students were surveyed again six months later, and the general population group was surveyed again a year later. Neither group had achieved the personality goals they set for themselves at the beginning of the study, and, in fact, some saw change in the opposite direction.

"In both samples, the desire to change at 'time one' did not predict actual change in the desired direction at all at 'time two,'" said Baranski, a postdoctoral psychology researcher in the University of Arizona Institute on Place, Wellbeing & Performance. "In the general population sample, we didn't find that personality change goals predicted any change in any direction."

College Students Saw More Change

While the general population group exhibited no change in personality traits between the first and second rounds of data collection, the college student group did show some changes; however, these were either in the opposite direction from the one desired or occurred in different personality traits than the one the person intended to change.

Specifically, college students who expressed the strongest desires to be more conscientious actually exhibited less conscientiousness six months later. That could be because those individuals exhibited low levels of conscientiousness to begin with, putting them at a disadvantage from the outset, Baranski said.

In addition, students who said they wanted to be more extroverted showed increases in agreeableness and emotional stability rather than extroversion in the follow-up. Baranski said that perhaps as part of their effort to become more social and extroverted, they actually focused on being friendlier and less socially anxious - behaviors more directly related to agreeableness and emotional stability, respectively.

Baranski said college students may have exhibited more change than the general population because they are in such a transformational period in their lives. Still, the changes they experienced didn't align with the goals they set for themselves.

"College students are thrown into this new environment, and they may be unhappy and may look within selves to become happier and change some aspect of their personality," Baranski said. "But, meanwhile, there is a bombardment of other things that they're told they need to achieve, like doing well in a class or choosing a major or getting an internship, and those goals might take precedence. Even though they know more sustained and introspective change might be better, the short-term effort is more attractive and more necessary in the moment."

Overall, Baranski's findings illustrate how difficult it can be for people to change aspects of their personality based on desire alone. That doesn't mean people can't make the changes they want. They just might need outside help doing so - from a professional, a friend or maybe even a mobile app reminding them of their goals, Baranski said.

Baranski intentionally did not interact with study participants between the first and second rounds of data collection. That approach differs from that of another researcher, Southern Methodist University's Nathan Hudson, who in several other separate studies assessed personality change goals over a 16-week period but followed up with participants along the way. In that research, which Baranski cites, experimenters assessed participants' personality traits and progress toward their goals every few weeks. With that kind of interaction, participants were more successful in making changes.

"There is evidence in clinical psychology that therapeutic coaching leads to change in personality and behavior, and there is recent evidence that suggests that when there's a lot of regular interaction with an experimenter, personality change is possible," Baranski said. "But when individuals are left to their own devices, change may not be as likely."

Future research, Baranski said, should look at how much intervention is needed to help people achieve their personality goals, and which types of strategies work best for different traits.

"Across all the studies that have been done on this topic over the last several years, it's clear that most people want to change an aspect of their personality," Baranski said. "If left unattended, those goals aren't achieved, so it would be helpful for people who have those goals to know what is necessary for them to accomplish them."

Credit: 
University of Arizona

Giving cryptocurrency users more bang for their buck

A new cryptocurrency-routing scheme co-invented by MIT researchers can boost the efficiency -- and, ultimately, profits -- of certain networks designed to speed up notoriously slow blockchain transactions.

Cryptocurrencies hold promise for peer-to-peer financial transactions, potentially making banks and credit cards obsolete. But there's a scalability issue: Bitcoin, for instance, processes only a handful of transactions per second, while major credit cards process hundreds or thousands. That's because the blockchain -- the digital ledger cryptocurrencies are built on -- takes a really long time to process transactions.

A new solution is "payment channel networks" (PCNs), where transactions are completed with minimal involvement from the blockchain. Pairs of PCN users form off-blockchain escrow accounts with a dedicated amount of money, forming a large, interconnected network of joint accounts. Users route payments through these accounts, only pinging the blockchain to establish and close the accounts, which speeds things up dramatically. Accounts can also collect a tiny fee when transactions get routed through them.

Inefficient routing schemes, however, slow down even these fast solutions. They frequently deplete users' balances in these accounts, forcing users to invest a lot of money in each account or to rebalance their accounts often on the blockchain. In a paper being presented next month at the USENIX Symposium on Networked Systems Design and Implementation, the researchers introduce "Spider," a more efficient routing scheme that lets users invest only a fraction of funds in each account and process roughly four times more transactions before rebalancing on the blockchain.

"It's important to have balanced, high-throughput routing in PCNs to ensure the money that users put into joint accounts is used efficiently," says first author Vibhaalakshmi Sivaraman, a graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). "This should be efficient and a lucrative business. That means routing as many transactions as possible, with as little funds as possible, to give PCNs the best bang for their buck."

Joining Sivaraman on the paper are former postdoc Shaileshh Bojja Venkatakrishnan, CSAIL graduate students Parimarjan Negi and Lei Yang, and Mohammad Alizadeh, an associate professor of electrical engineering and computer science and a CSAIL researcher; Radhika Mittal of the University of Illinois at Urbana-Champaign; and Kathleen Ruan and Giulia Fanti of Carnegie Mellon University.

Packet payments

PCNs rely heavily on bidirectional joint accounts -- where both parties can receive and send money -- so money can be routed between any users. User B can have a joint account with user A, while also linking separately to user C. Users A and C are not directly connected, but user A can send money to user C via the A-B and B-C joint accounts.

To exchange funds, each party must approve and update the balances in their joint accounts. Payments can be routed only through channels with sufficient funds to handle them, a constraint that causes major issues.
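To make the structure concrete, here is a toy model of joint accounts and multi-hop routing in Python. It is illustrative only; real PCNs such as Lightning add cryptographic escrow, signatures and routing fees.

# Toy payment channel: a bidirectional joint account with per-party balances.
class Channel:
    def __init__(self, party1, party2, balance1, balance2):
        self.balance = {party1: balance1, party2: balance2}

    def pay(self, sender, receiver, amount):
        if self.balance[sender] < amount:
            raise ValueError("insufficient balance on this channel")
        self.balance[sender] -= amount
        self.balance[receiver] += amount

# A-B and B-C joint accounts; A pays C by routing 3 units through B.
ab = Channel("A", "B", 5, 5)
bc = Channel("B", "C", 5, 5)
ab.pay("A", "B", 3)   # A's balance with B drops to 2 ...
bc.pay("B", "C", 3)   # ... and B forwards 3 units to C.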

Traditional schemes send transactions along the shortest path possible, without being aware of any given user's balance or the rate of sending on that account. This can cause one of the users in the joint account to handle too many transactions and drop to a zero balance, leaving that account unable to route further transactions. What's more, users can only send a payment in full. If a user wants to send, say, 10 bitcoins, current schemes try to push the full amount on the shortest path possible. If that path can't support all 10 bitcoins at once, they'll search for the next shortest path, and so on -- which can slow down the payment or cause it to fail outright.

Inspired by a technique for internet communications called packet switching, Spider splits each full transaction into smaller "packets" that are sent across different channels at different rates. This lets the scheme route chunks of these large payments through potentially low-funded accounts. Each packet is then far more likely to reach its destination without slowing down the network or being rejected in any given account for its size.
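A simplified sketch of that splitting step, as our illustration rather than Spider's actual algorithm, which also paces packets over time and spreads them across multiple paths:

# Split one payment into unit-sized "packets" plus a remainder.
def split_payment(total, unit):
    packets = [unit] * (total // unit)
    if total % unit:
        packets.append(total % unit)
    return packets

# A 10-coin payment becomes packets small enough for low-funded
# channels to carry along different routes.
print(split_payment(10, 4))   # -> [4, 4, 2]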

"Shortest-path routing can cause imbalances between accounts that deplete key payment channels and paralyze the system," Sivaraman says. "Routing money in a way that the funds of both users in each joint account are balanced allows us to reuse the same initial funds to support as many transactions as possible."

All queued up

Another innovation was creating queues at congested accounts. If an account can't handle incoming transactions that require it to send money, instead of rejecting them, it queues them up. It then waits, within a reasonable time frame, for incoming transactions that replenish its funds, and uses those funds to process the queued transactions.

"If you're waiting on a queue, but I send you funds within the next second, you can then use any of those funds to send your waiting transactions," Sivaraman says.

The researchers also adopted an algorithm -- built by Alizadeh and other researchers -- that monitors data center congestion to identify queueing delays at congested accounts. This helps control the rate of transactions. Say user A sends funds to user C through user B, which has a long queue. The receiver C sends the sender A, along with the payment confirmation, one bit of information representing the transaction's wait time at user B. If the wait is too long, user A routes fewer transactions through user B. As the queueing time decreases, user A routes more transactions through B. In this manner, by monitoring the queues alone, Spider is able to ensure that the rate of transactions is both balanced and as high as possible.
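In spirit, this works like delay-based congestion control on the internet. Below is a minimal sketch of such a rate controller; the parameters are our assumptions, and the paper's actual algorithm differs in detail.

# Additively raise the sending rate while queues are short; back off
# multiplicatively when the reported queueing delay exceeds a target.
def update_rate(rate, queue_delay, target_delay, increase=1.0, decrease=0.5):
    if queue_delay > target_delay:
        return rate * decrease
    return rate + increase

rate = 10.0
for delay in [2, 3, 9, 4, 2]:   # delays piggybacked on payment confirmations
    rate = update_rate(rate, delay, target_delay=5)
    print(f"delay={delay} -> rate={rate:.1f}")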

Ultimately, the more balanced the routing of PCNs, the smaller the capacity required -- meaning, overall funds across all joint accounts -- for high transaction throughput. In PCN simulations, Spider processed 95 percent of all transactions using only 25 percent of the capacity needed in traditional schemes.

The researchers also ran tests on tricky transaction patterns called DAGs (directed acyclic graphs) -- one-directional payments in which one user inevitably runs out of funds and needs to rebalance on the blockchain. A key metric for the performance of PCNs on DAG transactions is the number of off-chain transactions enabled for each transaction on the blockchain. In this regard, Spider is able to process eight times as many off-chain transactions for each transaction on-chain. In contrast, traditional schemes only support twice as many off-chain transactions.

"Even with extremely frequent rebalancing, traditional schemes can't process all DAG transactions. But with very low-frequency rebalancing, Spider can complete them all," Sivaraman says.

Next, the researchers are making Spider more robust to DAG transactions, which can cause bottlenecks. They're also exploring data privacy issues and ways to incentivize users to use Spider.

Credit: 
Massachusetts Institute of Technology

Modern Africans and Europeans may have more Neanderthal ancestry than previously thought

Neanderthal DNA sequences may be more common in modern Africans than previously thought, and different non-African populations have levels of Neanderthal ancestry surprisingly similar to each other, finds a study publishing January 30 in the journal Cell. Researchers arrived at these findings by developing a new statistical method, called IBDmix, to identify Neanderthal sequences in the genomes of modern humans. The results also suggest that African genomes contain Neanderthal sequences in part due to back-migration of ancestors of present-day Europeans.

"Our study is significant because it provides important new insights into human history and patterns of Neanderthal ancestry in globally diverse populations," says senior study author Joshua Akey of Princeton University. "Our results refine catalogs of genomic regions where Neanderthal sequence was deleterious and advantageous and demonstrate that remnants of Neanderthal genomes survive in every modern human population studied to date."

Past studies have suggested that East Asians have approximately 20% more Neanderthal ancestry compared to Europeans. But the new findings suggest that these estimates may have been biased due to methodological limitations. Previously developed approaches, such as S*, use a modern reference panel--usually an African population assumed to lack Neanderthal ancestry. But if the reference panel unexpectedly contains Neanderthal sequences, then the method will underestimate Neanderthal ancestry in modern humans.

To address this problem, Akey and his colleagues developed IBDmix as a new category of methods for detecting archaic ancestry. Instead of using a modern reference panel, the approach calculates the probability that an individual's genotype is shared identical by descent (IBD) with an archaic reference genome. Compared with S*, IBDmix is a less biased approach because it has higher statistical power for detecting shared archaic sequences and yields fewer false positives.
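To give a flavor of the reference-free idea, here is a toy log-odds calculation in Python. It is our sketch, not the published method, which models full genotype likelihoods, sequencing error and segment structure; the point is simply that rare alleles matching the archaic genome add strong evidence for IBD, while mismatches subtract it.

import math

def lod_score(sites, err=0.01):
    """Sum per-site log10 odds that a modern genome shares a segment
    IBD with the archaic reference rather than matching by chance."""
    lod = 0.0
    for matches_archaic, population_freq in sites:
        p_ibd = (1 - err) if matches_archaic else err
        p_chance = population_freq if matches_archaic else (1 - population_freq)
        lod += math.log10(p_ibd / p_chance)
    return lod

# Each site: (carries the archaic allele?, that allele's population frequency).
sites = [(True, 0.02), (True, 0.05), (False, 0.01), (True, 0.10)]
print(f"LOD = {lod_score(sites):.2f}")   # positive values favor IBD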

The researchers applied IBDmix to 2,504 modern individuals from the 1000 Genomes Project, which represents geographically diverse populations, and used the Altai Neanderthal reference to identify Neanderthal sequence in these individuals. They robustly identified regions of Neanderthal ancestry in Africans for the first time, finding on average 17 megabases (Mb) of Neanderthal sequence per individual in the African samples analyzed (corresponding to approximately 0.3% of the genome), compared with less than one megabase reported in previous studies. More than 94% of the Neanderthal sequence identified in African samples was shared with non-Africans.

The researchers also observed levels of Neanderthal ancestry in Europeans (51 Mb/individual), East Asians (55 Mb/individual), and South Asians (55 Mb/individual) that were surprisingly similar to each other. Strikingly, East Asians had only 8% more Neanderthal ancestry compared to Europeans, in contrast to previous reports of 20%. "This suggests that most of the Neanderthal ancestry that individuals have today can be traced back to a common hybridization event involving the population ancestral to all non-Africans, occurring shortly after the Out-of-Africa dispersal," Akey says.

To explore potential explanations for the unexpectedly high Neanderthal ancestry in Africans, the researchers then compared the actual data to simulated genotype data derived from different demographic models. This analysis took into account various sequence characteristics, such as the length of the shared archaic segments, the frequency of these segments in Africans, and the amount of sequence shared exclusively between African and non-African populations.

They found that Africans exclusively share 7.2% of Neanderthal sequence with Europeans, compared with only 2% with East Asians. Simulations showed that low levels of back-migration persisting over the past 20,000 years can replicate features of the data and could therefore be a possible explanation for the observed levels of ancestry among different modern populations. The results suggest that previously developed methods using an African reference population are biased toward underestimating Neanderthal ancestry to a greater extent in Europeans compared to East Asians. "Collectively, these results show that Neanderthal ancestry estimates in East Asians and Europeans were biased due to unaccounted-for back-migrations from European ancestors into Africa," Akey says.

But gene flow went in both directions. The data also suggest that there was a dispersal of modern humans out of Africa approximately 200,000 years ago, and this group hybridized with Neanderthals, introducing modern human DNA into the genomes of Neanderthals. According to the authors, both out-of-Africa and into-Africa dispersals must be accounted for when interpreting global patterns of genomic variation.

"I am struck by the fact that we often conceptualize human history in very simple terms," Akey says. "For example, we often imagine there was a single dispersal out of Africa that happened 60,000 to 80,000 years ago that led to the peopling of the world. However, our results show this history was much more interesting and there were many waves of dispersal out of Africa, some of which led to admixture between modern humans and Neanderthals that we see in the genomes of all living individuals today."

Using IBDmix, the researchers also identified 51 high-frequency Neanderthal haplotypes--sets of DNA variations that tend to be inherited together--in modern humans, including several that were undetected with previously developed methods. For the first time, they detected high-frequency Neanderthal haplotypes in Africans, and regions containing these haplotypes are enriched for genes involved in immune function and ultraviolet-radiation sensitivity. These haplotypes may reflect instances of beneficial Neanderthal sequences being rapidly driven to high frequency in modern humans through a process known as adaptive introgression. "These novel findings provide insight into the evolutionary history of these populations, the selective pressures they faced, and current variation in health and disease," Akey says.

The authors note several limitations of their approach. Because IBDmix requires an archaic reference genome, it is not suitable for discovering sequences shared between modern humans and unknown or unsequenced hominin lineages. In addition, the approach requires sequenced genomes from at least ten individuals for robust inferences. In future studies, Akey and his team plan to apply their approach to additional African populations, functionally characterize Neanderthal sequences that may be advantageous, and study the implications of these archaic sequences in modern human health and disease.

Credit: 
Cell Press

New study identifies Neanderthal ancestry in African populations and describes its origin

image: A team of Princeton researchers led by Joshua Akey found that African individuals have considerably more Neanderthal ancestry than previously thought, which was observable only through the development of new methods.

Image: 
Matilda Luk, Princeton University Office of Communications

When the first Neanderthal genome was sequenced, using DNA collected from ancient bones, it was accompanied by the discovery that modern humans in Asia, Europe and America inherited approximately 2% of their DNA from Neanderthals -- proving humans and Neanderthals had interbred after humans left Africa. Since that study, new methods have continued to catalogue Neanderthal ancestry in non-African populations, seeking to better understand human history and the effects of Neanderthal DNA on human health and disease. A comparable catalogue of Neanderthal ancestry in African populations, however, has remained an acknowledged blind spot for the field due to technical constraints and the assumption that Neanderthals and ancestral African populations were geographically isolated from each other.

In a paper published today in the journal Cell, a team of Princeton researchers detailed a new computational method for detecting Neanderthal ancestry in the human genome. Their method, called IBDmix, enabled them for the first time to search for Neanderthal ancestry in African populations as well as non-African ones. The project was led by Joshua Akey, a professor in Princeton's Lewis-Sigler Institute for Integrative Genomics (LSI).

"This is the first time we can detect the actual signal of Neanderthal ancestry in Africans," said co-first author Lu Chen, a postdoctoral research associate in LSI. "And it surprisingly showed a higher level than we previously thought," she said.

The method the Princeton researchers developed, IBDmix, draws its name from the genetic principle "identity by descent" (IBD), in which a section of DNA in two individuals is identical because those individuals once shared a common ancestor. The length of the IBD segment depends on how long ago those individuals shared a common ancestor. For example, siblings share long IBD segments because their shared ancestor (a parent) is only one generation removed. Alternatively, fourth cousins share shorter IBD segments because their shared ancestor (a third-great grandparent) is several generations removed.
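A standard population-genetics rule of thumb (ours, not a figure from the study) captures this: a segment inherited from a common ancestor g generations back has survived roughly 2g meioses, so its expected genetic length is

\[ \mathbb{E}[L] \approx \frac{100}{2g}\ \text{cM}. \]

For siblings (g = 1) that is about 50 cM; for fourth cousins (g ≈ 5), about 10 cM; and for admixture roughly 2,000 generations (~50,000 years) ago, only about 0.025 cM, with segments from ancestry shared ~500,000 years ago shorter still.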

The Princeton team leveraged the principle of IBD to identify Neanderthal DNA in the human genome by distinguishing sequences that look similar to Neanderthals because we once shared a common ancestor in the very distant past (~500,000 years ago), from those that look similar because we interbred in the more recent present (~50,000 years ago). Previous methods relied on "reference populations" to aid the distinction of shared ancestry from recent interbreeding, usually African populations believed to carry little or no Neanderthal DNA. However, this reliance could bias estimates of Neanderthal ancestry depending on which reference population was used. The Princeton researchers termed IBDmix a "reference free method" because it does not use an African reference population. Instead, IBDmix uses characteristics of the Neanderthal sequence itself, like the frequency of mutations or the length of the IBD segments, to distinguish shared ancestry from recent interbreeding. The researchers were therefore able to identify Neanderthal ancestry in Africans for the first time and make new estimates of Neanderthal ancestry in non-Africans, which showed Europeans and Asians to have more equal levels than previously described.

Kelley Harris, a population geneticist at the University of Washington who was not involved in the study, noted that the new estimates of Neanderthal ancestry using IBDmix highlight the technical problem in methods reliant on reference panels. "We might have to go back and revisit a bunch of results from the published literature and evaluate whether the same technical issue has been throwing off our understanding of gene flow in other species," she said.

In addition to identifying Neanderthal ancestry in African populations, the researchers described two revelations about the origin of the Neanderthal sequences. First, they determined that the Neanderthal ancestry in Africans was not due to an independent interbreeding event between Neanderthals and African populations. Based on features of the data, the research team concluded that migrations from ancient Europeans back into Africa introduced Neanderthal ancestry into African populations.

Second, by comparing data from simulations of human history to data from real people, the researchers determined that some of the detected Neanderthal ancestry in Africans was actually due to human DNA introduced into the Neanderthal genome. The authors emphasized that this human-to-Neanderthal gene flow involved an early dispersing group of humans out of Africa, occurring at least 100,000 years ago -- before the Out-of-Africa migration responsible for modern human colonization of Europe and Asia and before the interbreeding event that introduced Neanderthal DNA into modern humans. The finding reaffirmed that hybridization between humans and closely related species was a recurrent part of our evolutionary history.

While the Princeton researchers acknowledged the limited number of African populations they were able to analyze, they hope their new method and their findings will encourage more study of Neanderthal ancestry across Africa and other populations. Regarding the overall significance of the research, Chen said: "This demonstrates the remnants of Neanderthal genomes survive in every modern human population studied to date."

Credit: 
Princeton University

Schizophrenia genetics analyzed in South African Xhosa

image: Synaptic genes collectively carry a significantly greater burden of private damaging mutations in persons with schizophrenia compared to controls.

Image: 
Emily Willoughby/Genetics of Schizophrenia in the South African Xhosa

The first genetic analysis of schizophrenia in an ancestral African population, the South African Xhosa, will be published Jan. 31 in Science. An international group of scientists conducted the research.

The study of schizophrenia was carried out in the Xhosa population because Africa is the birthplace of all humans, yet ancestral African populations have rarely been the focus of genetics research. The Xhosa do not have an unusually high risk of schizophrenia.

The relative lack of genetics studies in Africa leaves a major gap in understanding human genetics. Almost 99% of human evolution took place in Africa, after the first modern humans originated and before humans migrated from Africa to Europe and Asia 50,000 to 100,000 years ago. Because of the lack of studies in Africa, many generations of human genetic history are missing from our understanding of human adaptation and of human disease.

The Xhosa trace their history to the migration of Bantu-speaking people from the Great Lakes region of eastern Africa to southern Africa centuries ago. The Xhosa now live throughout South Africa and are the largest population group of the Eastern Cape region.

Schizophrenia affects approximately 1% of people in all parts of the world and is one of the leading causes of disability worldwide. This study revealed that Xhosa individuals with schizophrenia are significantly more likely to carry rare, damaging genetic mutations compared to Xhosa individuals without severe mental illness.

Many of the genes disrupted by the rare damaging mutations of these patients are involved in the organization and function of brain synapses. Synapses coordinate the communication between brain nerve cells called neurons. The organization and firing of neuronal synapses are ultimately responsible for learning, memory and brain function.

The genes and pathways identified by this research inform the understanding of schizophrenia for all human populations, and suggest potential mechanisms for the design of more effective treatments.

Credit: 
University of Washington School of Medicine/UW Medicine

Bacteria engineered to protect bees from pests and pathogens

image: A Varroa mite, a common pest that can weaken bees and make them more susceptible to pathogens, feeds on a honey bee.

Image: 
Alex Wild/University of Texas at Austin

Scientists from The University of Texas at Austin report in the journal Science that they have developed a new strategy to protect honey bees from a deadly trend known as colony collapse: genetically engineered strains of bacteria.

An increasing number of honey bee colonies in the U.S. have seen their adult bee populations dwindle. According to a national survey, beekeepers lost nearly 40% of their honey bee colonies last winter, the highest rate reported since the survey began 13 years ago.

The engineered bacteria live in the guts of honey bees and act as biological factories, pumping out medicines that protect the bees against two major causes of colony collapse: Varroa mites and deformed wing virus. The researchers believe their method could one day be scaled up for agricultural use because the engineered bacteria are easy to grow, inoculating the bees is straightforward and the engineered bacteria are unlikely to spread beyond bees.

"It has direct implications for bee health," said Nancy Moran, a professor of integrative biology and the primary investigator on the study.

"This is the first time anyone has improved the health of bees by genetically engineering their microbiome," added Sean Leonard, a graduate student and first author of the study.

Varroa mites and deformed wing virus often come together; as the mites feed on bees, they can spread the virus, while also weakening the bees and making them more vulnerable to pathogens in the environment.

To address each problem, the team engineered one strain of bacteria to target the virus and another for the mites. Compared with control bees, the bees treated with the strain of bacteria targeting the virus were 36.5% more likely to survive to day 10. Meanwhile, Varroa mites feeding on another set of bees treated with the mite-targeting strain of bacteria were about 70% more likely to die by day 10 than mites feeding on control bees.

According to the American Beekeeping Federation, honey bees contribute nearly $20 billion each year to the value of U.S. crop production, and they play an enormous role in global food production. Without honey bees, dozens of crops, from almonds to berries to broccoli, would either vanish or produce significantly less food.

Like humans, honey bees have an ecosystem of bacteria in their guts called a microbiome and also an antiviral defense mechanism called RNA interference (RNAi) that helps the body fight off certain viruses, called RNA viruses. When an RNA virus is introduced, it produces molecules called double-stranded RNAs that a healthy cell detects, triggering an RNAi immune response.

"You usually only get signs of these molecules when an RNA virus is replicating," Moran said. "It's a signal that this might be an evil thing and you should attack it."

To promote a helpful RNAi response to viruses in bees -- and trigger a lethal RNAi response in the mites -- the team introduced modified bacteria to hundreds of bees in a laboratory setting. Sprayed with a sugar water solution containing the bacteria, the bees groomed one another and ingested the solution. The team found inoculating young worker bees with the engineered bacteria led the bees' immune systems to be primed to protect them against deformed wing virus, which is an RNA virus, and caused the mites' own immune systems to fight against and ultimately kill them.

While the experiments occurred under strict biocontainment protocols used with genetic engineering, Moran said, even absent such protocols, the risk of the engineered bacteria escaping into the wild and infecting other insects -- and thereby conferring some anti-pest or anti-pathogen superpowers -- is very low. The type of bacteria used are highly specialized to live in the bee gut, can't survive for long outside of it, and protect against a virus that strikes only bees. Still, further research will be needed to determine the effectiveness and safety of the treatments in agricultural settings.

Another benefit of the approach is its use as a tool for studying bee genetics. The engineered bacteria can knock down specific bee genes, enabling insights into the workings of the bee genome and possibly enabling new breeding strategies to produce more robust bee colonies.

Credit: 
University of Texas at Austin

Health: Daily smoking and drinking may be associated with advanced brain age

Daily drinking and smoking may be associated with modest increases in relative brain age, compared with drinking and smoking less frequently, according to a study published in Scientific Reports.

Research has shown that certain lifestyle habits, such as heavy smoking and alcohol consumption, are associated with adverse effects in specific brain regions. However, it is unclear how smoking and alcohol consumption may be associated with brain age, especially when the whole brain is considered.

Arthur W. Toga and colleagues used machine learning methods and MRI to identify relative brain age in 17,308 individuals aged 45 to 81 years whose data were included in the UK Biobank. Relative brain age is the difference between an individual's brain age, as estimated from MRI measurements, and the average brain age of peers of the same chronological age.

The authors found that in 11,651 individuals for whom information on smoking habits was collected, those who smoked on most or all days had a higher relative brain age than those who smoked less frequently or not at all. Each additional pack-year of smoking was associated with 0.03 years of increased relative brain age. A pack-year was defined as smoking a pack of cigarettes per day on average for a whole year. In 11,600 individuals for whom information on drinking behavior was collected, those who drank alcohol on most days had a higher relative brain age than those who drank less frequently or not at all. Each additional gram of alcohol consumption per day was associated with 0.02 years of increased relative brain age. The findings suggest that the detrimental effects of smoking and drinking on brain age occur mainly in those who smoke and drink at high frequencies, and that even then the associated increases in brain age are modest.
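To put those coefficients in perspective, here is a quick back-of-the-envelope calculation in Python. The coefficients come from the figures above; the pack-year and drink values are arbitrary examples of ours.

# Reported associations: +0.03 years of relative brain age per pack-year
# of smoking, +0.02 years per gram of alcohol per day.
YEARS_PER_PACK_YEAR = 0.03
YEARS_PER_GRAM_PER_DAY = 0.02

pack_years = 20        # e.g., a pack a day for 20 years
grams_per_day = 14     # about one standard US drink daily

print(f"smoking:  +{pack_years * YEARS_PER_PACK_YEAR:.2f} years")        # +0.60
print(f"drinking: +{grams_per_day * YEARS_PER_GRAM_PER_DAY:.2f} years")  # +0.28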

The authors caution that besides smoking and alcohol consumption, various other environmental and genetic factors may be associated with brain age. Studies in larger samples are needed to further clarify these associations.

Credit: 
Scientific Reports

Pulsar-white dwarf binary system confirms general relativistic frame-dragging

A century after it was first theorized, researchers have detected the effects of Lense-Thirring precession - an effect of relativistic frame-dragging - in the motion of a distant binary star system, a new study reports. The results of the twenty-year study confirm a prediction of Einstein's general theory of relativity. When a massive object rotates, general relativity predicts that it pulls the surrounding spacetime around with it, a phenomenon known as frame-dragging. This phenomenon causes precession of the orbital motion of gravitationally bound objects. While frame-dragging has been detected by satellite experiments in the gravitational field of the rotating Earth, its effect is tremendously small and challenging to measure. More massive objects, such as white dwarfs or neutron stars, provide a better opportunity to observe the phenomenon under much more intense gravitational fields. Vivek Venkatraman Krishnan and colleagues observed PSR J1141-6545, a young pulsar in a tight, fast orbit with a massive white dwarf. They measured the arrival times of the pulses to within 100 microseconds, over a period of nearly twenty years, which allowed them to identify a long-term drift in the orbital parameters. After eliminating other possible causes of this drift, Venkatraman Krishnan et al. conclude that it is the result of Lense-Thirring precession due to the rapidly rotating white dwarf companion. The findings confirm the prediction of general relativity and allowed the authors to constrain the white dwarf's rotational speed.
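For context, the textbook general-relativity expression for the rate at which frame-dragging precesses an orbit's plane (a standard result, not a figure taken from the paper) is

\[ \dot{\Omega}_{\mathrm{LT}} = \frac{2\,G S}{c^{2} a^{3} \left(1 - e^{2}\right)^{3/2}}, \]

where S is the companion's spin angular momentum, a the orbit's semi-major axis and e its eccentricity. The a cubed in the denominator and S in the numerator show why a tight orbit around a rapidly rotating, massive white dwarf makes the effect measurable, while for satellites orbiting the slowly spinning Earth it is tiny.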

Credit: 
American Association for the Advancement of Science (AAAS)

Brain drowns in its own fluid after a stroke

video: Spreading depolarization of cells followed by cerebrospinal fluid flowing into the brain post-stroke.

Image: 
University of Rochester Medical Center

Cerebral edema, swelling that occurs in the brain, is a severe and potentially fatal complication of stroke. New research, which was conducted in mice and appears in the journal Science, shows for the first time that the glymphatic system - normally associated with the beneficial task of waste removal - goes awry during a stroke and floods the brain, triggering edema and drowning brain cells.

"These findings show that the glymphatic system plays a central role in driving the acute tissue swelling in the brain after a stroke", said Maiken Nedergaard, M.D., D.M.Sc., co-director of the University of Rochester Medical Center (URMC) Center for Translational Neuromedicine and senior author of the article. "Understanding this dynamic - which is propelled by storms of electrical activity in the brain - point the way to potential new strategies that could improve stroke outcomes."

First discovered by the Nedergaard lab in 2012, the glymphatic system is a network that piggybacks on the brain's blood circulation and consists of layers of plumbing, with the inner blood vessel encased by a 'tube' that transports cerebrospinal fluid (CSF). The system pumps CSF through brain tissue, primarily while we sleep, washing away toxic proteins and other waste.

While edema is a well-known consequence of stroke, there are limited treatment options and the severity of swelling in the brain depends upon the extent and location of the stroke. Because the brain is trapped in the skull, it has little room to expand. If the swelling is severe, it can push in on important structures such as the brainstem, which regulates the cardiovascular and respiratory systems, resulting in death. In extreme cases and often as a last resort, surgeons will remove a part of the skull to relieve the pressure on the brain.

Before the new study, it had been assumed that the swelling was the result of fluid from the blood.

An electrical wave, then the flood

Ischemic stroke, the most common form of stroke, occurs when a vessel in the brain is blocked. Denied nutrients and oxygen, brain cells become compromised and depolarize - often within minutes of a stroke. As the cells release energy and fire, they trigger neighboring cells, creating a domino effect that results in an electrical wave that expands outward from the site of the stroke, called spreading depolarization.

As this occurs, vast amounts of potassium and neurotransmitters are released by neurons into the brain. This causes the smooth muscle cells that line the walls of blood vessels to seize up and contract, cutting off blood flow in a process known as spreading ischemia. CSF then flows into the ensuing vacuum, inundating brain tissue and causing edema. The already vulnerable brain cells in the path of the flood essentially drown in CSF, and the brain begins to swell. These depolarization waves can continue in the brain for days and even weeks after the stroke, compounding the damage.

"When you force every single cell, which is essentially a battery, to release its charge it represents the single largest disruption of brain function you can achieve - you basically discharge the entire brain surface in one fell swoop," said Humberto Mestre, M.D., a Ph.D. student in the Nedergaard lab and lead author of the study. "The double hit of the spreading depolarization and the ischemia makes the blood vessels cramp, resulting in a level of constriction that is completely abnormal and creating conditions for CSF to rapidly flow into the brain."

The study correlated the brain regions in mice vulnerable to this post-stroke glymphatic system dysfunction with edema found in the brains of humans who had sustained an ischemic stroke.

Pointing the way to new stroke therapies

The findings suggest potential new treatment strategies that could be used in combination with existing therapies focused on restoring blood flow to the brain quickly after a stroke. The study could also have implications for brain swelling observed in other conditions, such as subarachnoid hemorrhage and traumatic brain injury.

Approaches that block specific receptors on nerve cells could inhibit or slow the cycle of spreading depolarization. Additionally, a water channel called aquaporin-4 on astrocytes - an important support cell in the brain - regulates the flow of CSF. When the team conducted the stroke experiments in mice genetically modified to lack aquaporin-4, CSF flow into the brain slowed significantly. Aquaporin-4 inhibitors currently under development as a potential treatment for cardiac arrest and other diseases could eventually be candidates to treat stroke.

"Our hope is that this new finding will lead to novel interventions to reduce the severity of ischemic events, as well as other brain injuries to which Soldiers may be exposed," said Matthew Munson, Ph.D., program manager, fluid dynamics, Army Research Office, an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory. "What's equally exciting is that this new finding was not part of the original research proposal. That is the power of basic science research and working across disciplines. Scientists 'follow their nose' where the data and their hypotheses lead them - often to important unanticipated applications."

Credit: 
University of Rochester Medical Center

Researchers build a better lung model

(Boston)--Using a combination of pluripotent stem cells (cells that can potentially produce any cell or tissue type) and machine learning (artificial intelligence that allows computers to learn automatically), researchers have improved how they generate lung cells.

Using this technique, cells can be grown in a laboratory, stored for more than one year without losing their lung identity, and used to model lung diseases, with the aim of finding better treatments and cures for lung diseases in the future.

Induced pluripotent stem (iPS) cells are derived from the donated skin or blood cells of adults and, with the reactivation of four genes, are reprogrammed back to an embryonic stem cell-like state. iPS cells can be differentiated toward any cell type in the body and do not require the use of embryos.

Building on previous work from the Center for Regenerative Medicine (CReM) of Boston University and Boston Medical Center, researchers in the CReM, working together with investigators from Carnegie Mellon University (CMU), reprogrammed blood from adults into iPS cells. They then treated these stem cells with growth factors over a period of one month until they became cells which were very similar to adult lung cells.

According to the researchers, often when this type of experiment is performed the resulting cells are not a pure collection of the cell that they aimed to create (target cell) and they do not keep the characteristics of the target cell for prolonged periods of time.

"Therefore, we developed a combination of techniques that examines the gene expression of thousands of single cells combined with DNA barcoding of each individual cell and machine learning to build up a dynamic picture of what factors favor cells that go on to be lung cells in our system. Using this knowledge we were able to improve our methods for generating lung cells so that we can now create more relevant cells that keep their cell identity in a dish for more than one year," explained Killian Hurley, MD, PhD, researcher at the Royal College of Surgeons in Ireland, who co-authored the study with Jun Ding, PhD, a post-doctoral fellow at CMU.

The researchers believe this study will improve their ability to model lung disease and treatments in the laboratory for diseases including idiopathic pulmonary fibrosis, chronic obstructive pulmonary disease (COPD), alpha-1 antitrypsin deficiency and neonatal respiratory distress or early-onset interstitial lung disease.

Millions of people in the United States and around the world have severe lung diseases, often without good treatments or cures. Some of these diseases may even require lung transplantation, a complex and high-risk surgery for which the need for donor organs always exceeds the supply.

"The machine learning methods we developed for this study can also be applied to studies of other tissues and organs," said Ding. "We hope that our newly developed techniques for generating a pure, unlimited supply of cells using patients-derived stem cells can make possible new treatments or cures for diseases. These developments would prolong lives and improve the quality of those lives."

"The key hurdle to understanding what goes wrong with an individual patient's lung cells has been our inability to access those cells or to grow them in the laboratory. This approach allows us to now engineer from any individual patient those very finicky cells and to introduce bar codes into those cells that allow us to track and understand each cell and all their progeny over time in the laboratory dish. The result is an inexhaustible source of new lung cells that can be prepared from any patient of any age," added co-corresponding author Darrell Kotton, MD, David C. Seldin Professor of Medicine and Director, CReM, who led the work together with Ziv Bar-Joseph, PhD, the FORE Systems Professor of Computer Science at CMU.

Credit: 
Boston University School of Medicine

New clues into the genetic origins of schizophrenia

The first genetic analysis of schizophrenia in an ancestral African population, the South African Xhosa, appears in the Jan. 31 issue of the journal Science. An international group of scientists conducted the research, including investigators from Columbia University Mailman School of Public Health and New York State Psychiatric Institute, as well as the University of Cape Town and the University of Washington.

The study was carried out in the Xhosa population because Africa is the birthplace of all humans, yet ancestral African populations have rarely been the focus of genetics research. (There is no evidence that the Xhosa have an unusually high risk of schizophrenia). The researchers analyzed blood samples collected from 909 individuals diagnosed with schizophrenia and 917 controls living in South Africa. Their study revealed that participants with schizophrenia are significantly more likely to carry rare, damaging genetic mutations compared to participants without schizophrenia. These rare mutations were also more likely to affect brain and synaptic function. Synapses coordinate the communication between brain nerve cells called neurons; the organization and firing of neuronal synapses are ultimately responsible for learning, memory, and brain function.

The genes and pathways identified by this research inform the understanding of schizophrenia for all human populations, the researchers say. Further studies in African populations might also suggest potential mechanisms for the design of more effective treatments.

"The presence of only a few DNA variations damaging to synaptic function could have an outsized effect on schizophrenia," says co-author Ezra Susser, MD, DrPH, professor of epidemiology and psychiatry at the Columbia Mailman School, Columbia University Irving Medical Center, and New York State Psychiatric Institute. "While these variants differ from person to person, we believe they may disrupt neural pathways that elevate risk for schizophrenia."

The relative lack of genetics studies in Africa leaves a major gap in understanding human genetics. Almost 99 percent of human evolution took place in Africa after the first modern humans originated and before humans migrated from Africa to Europe and Asia 50,000 to 100,000 years ago. Because of the lack of studies in Africa, many generations of human genetic history are missing from our understanding of human adaptation and of human disease. Studies of ancestral African populations like the Xhosa have more diverse background DNA, which facilitates the identification of truly rare mutations.

Credit: 
Columbia University's Mailman School of Public Health