
Brain pressure controls eye pressure, revealing new avenues for glaucoma treatment

Researchers at the University of South Florida (USF) have discovered a novel feedback pathway from the brain to the eye that modulates eye pressure, a significant advance in the effort to diagnose and treat glaucoma. Glaucoma is associated with increased pressure in the eye due to a reduced ability of the eye to maintain proper fluid drainage. The heightened pressure applies mechanical strain to the optic nerve as the nerve exits the eye, resulting in vision loss and potential blindness.

It has long been hypothesized that brain pressure might also play a role in glaucoma because the amount of strain on the optic nerve depends not just on eye pressure, but the difference in pressure between the eye and brain. The groundbreaking study published in the Journal of Physiology shows, for the first time, that eye and brain pressure are physiologically connected. The neuroscientists came to this conclusion by altering brain pressure in animal models and noting changes in the fluid drainage properties of the eye that could be blocked by chemicals that eliminate feedback signals from the brain. Interestingly, the eye's ability to clear fluid changed in a manner that restored a healthy pressure difference across the optic nerve.

"The drainage control system may serve to protect the optic nerve from swings in eye or brain pressure," said Chris Passaglia, PhD, professor in the USF Department of Medical Engineering. "Its discovery offers a new target for glaucoma treatment, wherein the modulatory mechanisms of the system might be exploited to help lower eye pressure and impede disease progression in glaucoma patients."

Glaucoma is the leading cause of blindness in people over the age of 60. Since symptoms often don't arise until the condition has advanced, ophthalmologists check patients' eye pressure during routine exams by administering an "air puff test." However, Passaglia says there are more complex aspects of the disease that make diagnosis a challenge: some patients exhibit symptoms of glaucoma yet have normal eye pressure, while others with high eye pressure show no signs of the condition.

Researchers are now trying to pinpoint the location of the brain cells that are sending signals to the eye and find which nerve fibers in the eye are being mediated by the brain. This will help physicians better diagnose glaucoma and have a greater understanding of what's causing it to develop.

Credit: 
University of South Florida

Collective leadership groups maintain cohesion and act decisively

Members of collective leadership groups can maintain cohesion and act decisively when faced with a crisis, in spite of lacking the formal authority to do so, according to new research from Cass Business School.

The study 'Ambiguous Authority and Hidden Hierarchy: Collective Leadership in an Elite Professional Service Firm' examines the distinctive power dynamics revealed among peers in a professional service firm as they responded to an acute organisational crisis - the substantial reduction in the size of the partnership when the global banking crisis forced a decline in the firm's revenue.

As they dealt with the crisis, members in the firm's collective leadership group exercised considerable informal power under the cloak of ambiguity, highlighting the hidden hierarchy that existed within the collective. Their response also emphasised the significance of an individual 'heroic' leader within the collective - in this case, the Senior Partner.

The author Professor Laura Empson said professional service firms provided an ideal example of collective leadership.

"Professional service firms have a partnership form of governance, with an extended network of professional peers collectively owning the firm. As a result, peers expect considerable autonomy and consultation about how their business is managed," she said.

The firm is ranked in its sector's top five globally in terms of revenue, profitability, and number of staff. At the time of the study (2009-10), it generated revenue of US$1,500 million, had 500 partners and employed 5,000 staff.

In 2008, the collapse of Lehman Brothers and the subsequent banking crisis caused a significant threat to the firm's core business, with the firm facing a sharp decline in income and limited scope to reduce costs as partner remuneration formed a substantial part of the cost base.

"There was no facility within the Partnership Agreement to reduce a partner's profit share and as a co-owner of the firm, a partner could only be compelled to leave following a vote of the full partnership. In addition to this, a large reduction in partner numbers would threaten the social embeddedness of the partnership as a whole," said Professor Empson.

Using interviews, institutional archives and observation, Professor Empson's research focused on three specific aspects of the collective leadership group: composition, interaction and situation (the response to the financial crisis).

The composition and authority of the collective leadership group were ambiguous. It included the senior executive pair of Senior and Managing Partner, whose roles were deliberately left undistinguished and overlapped considerably. Power relations between the two were ambiguous but not contested. The group also included joint heads of practice, Board members and an unofficial management team.

"The prevailing pattern of interaction within the group was of an instinctive, intuitive mutual adjustment which was demonstrated by how the group built consensus, created collective responsibility, attempted to avoid conflict and worked to maintain harmony," said Professor Empson.

"This was facilitated by high levels of social interaction within the group which members suggested was associated with the characteristics of career tenure, with people building close relationships over time, as well as shared values and mutual trust."

Professor Empson found that when dealing with the crisis, the Senior Partner mobilised a hidden hierarchy within the group, recognising that the group's ambiguous authority, which had worked until this point, represented a major difficulty in dealing with the crisis.

"The Senior Partner first organised an initial highly confidential meeting to discuss the firm's response with a smaller group including some joint heads of practice and some Board members. He and the Managing Partner said they didn't have all the answers and wanted to get the 'best minds' round the table to share thoughts and ideas".

After this meeting, the Managing Partner and Finance Director commenced formal contingency planning to develop a list of partners for restructuring. Their list of selected partners was then rejected by the practice heads and the Board and was further refined by a subset of the leadership group, with practice heads challenging each other's lists. This caused conflict and disagreement but was ultimately resolved after forceful interventions from the Senior and Managing Partner.

The Senior and Managing Partner then sought to build support for decisive action within the wider collective leadership group and other influential colleagues within the partnership by establishing consensus for the decision and extending collective responsibility.

It was decided that 50 partners (10 per cent of the total) would be asked to leave and offered a substantial compensation package. A further 35 partners were asked to accept a reduction in their partnership equity. All 'restructured' partners accepted their packages.

Professor Empson said the interviewees praised the Senior Partner for his deliberate yet unobtrusive leadership which resolved the crisis.

"Echoing Winston Churchill's famous speech during a bleak period of World War II, a Board member told me: '(The Senior Partner) managed to bring a tricky group of people to unity. It was his finest hour.'"

Professor Empson said the study suggests that collective leadership requires effective individual leaders who can nurture the context in which collective leadership can flourish.

"Whilst they may avoid the rhetoric and personality cult of the conventional 'heroic' leader, their ability to mobilise and direct individuals can be seen as a subtle and nuanced version of heroism. It is perhaps ironic that this study has highlighted the extent to which collective leadership ultimately ends and begins with the individual."

Credit: 
City St George’s, University of London

Carnegie Mellon leverages AI to give voice to the voiceless

image: This word cloud depicts responses on social media to the question of where Rohingya refugees should go.

Image: 
Carnegie Mellon University

PITTSBURGH--Complete the following sentence: Rohingya refugees should go to ____.

A. Pakistan.
B. Bangladesh.
C. Hell.

These aren't good choices, but all are sentiments that have been expressed repeatedly on social media. The Rohingyas, who began fleeing Myanmar in 2017 to avoid ethnic cleansing, are ill-equipped to defend themselves from these online attacks, but innovations from Carnegie Mellon University's Language Technologies Institute (LTI) could help counter the hate speech directed at them and other voiceless groups.

The LTI researchers have developed a system that leverages artificial intelligence to rapidly analyze hundreds of thousands of comments on social media and identify the fraction that defend or sympathize with disenfranchised minorities such as the Rohingya community. Human social media moderators, who couldn't possibly manually sift through so many comments, would then have the option to highlight this "help speech" in comment sections.

"Even if there's lots of hateful content, we can still find positive comments," said Ashiqur R. KhudaBukhsh, a post-doctoral researcher in the LTI who conducted the research with alumnus Shriphani Palakodety. Finding and highlighting these positive comments, they suggest, might do as much to make the internet a safer, healthier place as would detecting and eliminating hostile content or banning the trolls responsible.

Left to themselves, the Rohingyas are largely defenseless against online hate speech. Many of them have limited proficiency in global languages such as English, and they have little access to the internet. Most are too busy trying to stay alive to spend much time posting their own content, KhudaBukhsh said.

To find relevant help speech, the researchers used their technique to search more than a quarter of a million comments from YouTube in what they believe is the first AI-focused analysis of the Rohingya refugee crisis. They will present their findings at the Association for the Advancement of Artificial Intelligence annual conference, Feb. 7-12, in New York City.

Similarly, in an as-yet unpublished study, they used the technology to search for antiwar "hope speech" among almost a million YouTube comments surrounding the February 2019 Pulwama terror attack in Kashmir, which inflamed the longstanding India-Pakistan dispute over the region.

The ability to analyze such large quantities of text for content and opinion is possible because of recent major improvements in language models, said Jaime Carbonell, LTI director and a co-author on the study. These models learn from examples so they can predict what words are likely to occur in a given sequence and help machines understand what speakers and writers are trying to say.

But the CMU researchers developed a further innovation that made it possible to apply these models to short social media texts in South Asia, he added. Short bits of text, often with spelling and grammar mistakes, are difficult for machines to interpret. It's even harder in South Asian countries, where people may speak several languages and tend to "code switch," combining bits of different languages and even different writing systems in the same statement.

Existing machine learning methods create representations of words, or word embeddings, so that all words with a similar meaning are represented in the same way. This technique makes it possible to compute the proximity of a word to others in a comment or post. To extend this technique to the challenging texts of South Asia, the CMU team obtained new embeddings that revealed language groupings or clusters. This language identification technique worked as well or better than commercially available solutions.
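The proximity computation described above can be illustrated with a toy example. The three-dimensional vectors and tiny vocabulary below are invented purely for illustration; real models, including the one built by the CMU team, learn vectors with hundreds of dimensions from large text corpora:

```python
import math

# Toy word embeddings (invented 3-d vectors, illustration only).
# In a trained model, words used in similar contexts end up with
# similar vectors, so related words cluster together.
embeddings = {
    "refugee": [0.9, 0.1, 0.0],
    "migrant": [0.85, 0.15, 0.05],
    "banana":  [0.0, 0.2, 0.95],
}

def cosine_similarity(u, v):
    """Proximity of two words: cosine of the angle between their vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words with similar meaning score much closer than unrelated ones.
sim_related = cosine_similarity(embeddings["refugee"], embeddings["migrant"])
sim_unrelated = cosine_similarity(embeddings["refugee"], embeddings["banana"])
```

The same geometric idea underlies the language clustering the CMU team used: comments whose embeddings sit near one another tend to share a language or writing system.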

This innovation has become an enabling technology for computational analyses of social media in that region, Carbonell noted.

Samplings of the YouTube comments showed that about 10% of the comments were positive. When the researchers used their method to search for help speech in the larger dataset, 88% of the results were positive, indicating that the method could substantially reduce the manual effort needed to find such comments, KhudaBukhsh said.

"No country is too small to take on refugees," said one text, while another argued "all the countries should take a stand for these people."

But detecting pro-Rohingya texts can be a double-edged sword: some texts can contain language that could be considered hate speech against their alleged persecutors, he added.

Antagonists of the Rohingya are "really kind of like animals not like human beings so that's why they genocide innocent people," said one such text. Though the method reduces manual efforts, comments such as this indicate the continuing need for human judgment and for further research, the scientists concluded.

Credit: 
Carnegie Mellon University

Broad support needed to maximize impact of cars designed for kids with mobility issues

image: Researcher Sam Logan high-fives a child sitting in a Go Baby Go car.

Image: 
Courtesy Sam Logan

CORVALLIS, Ore. - For the first month and a half after receiving a modified toy car designed for children with disabilities, the kids and their families seemed motivated to use driving as a means of exploration and socialization.

But in the month and a half after that, most kids' driving time fell off to almost nothing.

Sam Logan, an Oregon State University kinesiologist who conducts research using the cars in his lab, said families who use the "Go Baby Go" ride-on cars require more robust support to push past barriers and keep using the cars over time. Otherwise, instead of helping young children with mobility issues explore their world, the cars end up forgotten in a closet.

OSU is one of several Go Baby Go chapters around the country working with children ages 3 and younger who experience limited mobility. OSU students take the off-the-shelf cars that kids can ride in and modify them with large easy-to-press activation buttons and PVC pipe frames to keep children sitting safely upright inside the car.

In previous studies, researchers made frequent home visits and were able to troubleshoot any problems families faced in using the cars. Logan said their aim has always been to be as inclusive as possible, so that even children with complex medical needs are able to drive.

This latest study used an electronic tracking system developed by two former high school students to record participants' car usage. Now freshmen at Brown University, twins Benjamin and Joshua Phelps started working in Logan's lab when they were freshmen at South Eugene High School and used open-source coding to craft a system that logged every push of the ignition button and the length of each use.
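A usage tracker of the kind described can be sketched as a small state machine. The class and method names below are hypothetical, invented for illustration; they are not the Phelps brothers' actual open-source device code:

```python
import time

class UsageLogger:
    """Hypothetical sketch of a ride-on car usage tracker: it counts every
    press of the large activation button and logs how long each press lasts.
    (Illustration only; a real device would persist records to storage.)"""

    def __init__(self):
        self.presses = []        # duration in seconds of each button press
        self._pressed_at = None

    def button_down(self, now=None):
        self._pressed_at = time.time() if now is None else now

    def button_up(self, now=None):
        now = time.time() if now is None else now
        if self._pressed_at is not None:
            self.presses.append(now - self._pressed_at)
            self._pressed_at = None

    def press_count(self):
        return len(self.presses)

    def total_minutes(self):
        return sum(self.presses) / 60

# Example: two presses, 90 and 210 seconds of driving.
log = UsageLogger()
log.button_down(now=0)
log.button_up(now=90)
log.button_down(now=100)
log.button_up(now=310)
```

Aggregating such records per day is what let the researchers report, for example, total minutes of use over the three-month study period.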

"The reason behind this study was to figure out what happens when families get cars and they're not part of a research study and they're not getting systematic support," said Logan, a kinesiologist in OSU's College of Public Health and Human Sciences. "Turns out, when we gave the cars out for three months and tracked their use, the average was about seven days out of 90 that they were used."

The drop-off in usage was significant: In the more hands-on studies, kids used the cars for an average of 1,060 minutes over three months; in this one, they averaged only 171 minutes, most of which occurred in the first 45 days. Half of the 14 participants saw less than 100 total minutes of use, including one child who never used the car.

In a companion study, families said the two biggest barriers to usage were their environment and the kids' responses to the device. The cars' noise, speed and lack of steering finesse posed challenges for some kids, and limited space or bad weather kept some families from making the car part of their regular routine.

To overcome those barriers, as the cars become more widespread, Logan sees the need for a welcome kit to help families get started, telling them what to expect and how to approach specific problems, as well as offering ideas for fun ways to use the cars, like driving to the mailbox or setting up a play car wash.

Community-based support is also crucial. Families in similar situations could share their knowledge with newcomers, and public recreational spaces could better accommodate the Go Baby Go cars.

"What we haven't figured out yet is, what can that support look like in a way that's sustainable across the country," Logan said.

Credit: 
Oregon State University

Collision helped make the Milky Way -- and now we know when

Thanks to some astrophysical sleuthing, researchers have pinpointed an early galactic merger that helped shape the Milky Way.

The merger -- a collision, actually -- happened 11.5 billion years ago. That's when a small galaxy called Gaia-Enceladus slammed into what then existed of the Milky Way, Earth's home galaxy, which is about 13.5 billion years old.

"We know today that the Milky Way was formed by the merger of many small galaxies. This is the first time we have been able to determine when such a merger happened," said Sarbani Basu, professor and chair of astronomy at Yale and co-author of a new study reporting the discovery. "This is an important step in understanding when the Milky Way accreted, or collected, its mass."

The study appears Jan. 13 in the journal Nature Astronomy. Dozens of astronomers from around the world, led by the University of Birmingham in the U.K., conducted the work. Yale graduate student Joel Ong is also a co-author.

The research team followed the life story of a single, bright star in the Indus constellation, visible today from Earth's southern hemisphere. The scientists said this star, nu Indi, was already orbiting inside the Milky Way prior to the Gaia-Enceladus collision, which unfolded over millions of years. As the merger progressed, it altered nu Indi's orbit around the center of the Milky Way, providing a marker for when the merger happened. (Stars have orbits, just as planets do.)

"My role was to determine the age of the star (nu Indi) using seismic data," Basu said. "Like many low-mass stars, this star pulsates, or quakes, continuously. The quakes can be described as a series of tones and overtones."

Basu and her colleagues calculated "frequencies" from nu Indi's tones and overtones. Those frequencies, in turn, indicated the star's physical structure and properties. From there, the researchers were able to gauge nu Indi's stage of development, factor in its brightness, and estimate its age.
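The step of extracting frequencies from a pulsating star's brightness record can be illustrated with a synthetic example. The signal below, a naive periodogram over an invented two-mode "light curve," is vastly simplified relative to real TESS photometry and asteroseismic pipelines; all numbers are made up for illustration:

```python
import math

# Synthetic "light curve": two superposed pulsation modes (a tone and an
# overtone). Frequencies, in cycles per day, are invented for illustration.
n, dt = 1000, 0.01                       # 10 days sampled 100x per day
times = [i * dt for i in range(n)]
signal = [math.sin(2 * math.pi * 5.0 * t)
          + 0.5 * math.sin(2 * math.pi * 12.0 * t)
          for t in times]

def power_at(freq):
    """Naive periodogram: how strongly the series oscillates at `freq`."""
    c = sum(s * math.cos(2 * math.pi * freq * t) for s, t in zip(signal, times))
    q = sum(s * math.sin(2 * math.pi * freq * t) for s, t in zip(signal, times))
    return c * c + q * q

# Scan a frequency grid and keep the two strongest peaks; these recover
# the input oscillation frequencies.
grid = [k * 0.1 for k in range(1, 500)]   # 0.1 to 49.9 cycles per day
top_two = sorted(sorted(grid, key=power_at)[-2:])
```

In real asteroseismology, the spacing and values of many such peaks are compared against stellar models to infer the star's structure and, ultimately, its age.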

Knowing nu Indi's age provided a limit for when the merger could have taken place, the researchers said.

Some of the world's latest technology aided researchers in their detective work. They got data on nu Indi's quakes from NASA's Transiting Exoplanet Survey Satellite (TESS). Launched in 2018, TESS is surveying stars across most of the sky to search for planets orbiting those stars and to study the stars themselves. The researchers also used information collected from the European Space Agency (ESA) Gaia Mission.

University of Birmingham astrophysicist Bill Chaplin, lead author of the study, said determining the natural oscillations of stars -- called asteroseismology -- is a way to better understand the history of stars and the environments in which they formed.

"This study demonstrates the potential of asteroseismology with TESS, and what is possible when one has a variety of cutting-edge data available on a single, bright star," Chaplin said.

Credit: 
Yale University

Protein associated with ovarian cancer exacerbates neurodegeneration in Alzheimer's

image: In the brain of mice with Alzheimer's, areas near amyloid plaques (A) appear with fewer neural networks (B), dying neurons (C) and higher OCIAD1 (D). In cultured neuronal cells, the OCIAD1 proteins (E) appear in the mitochondria (F).

Image: 
Houston Methodist

Houston Methodist scientists identified a protein found in ovarian cancer that may contribute to declining brain function and Alzheimer's disease, by combining computational methods and lab research.

"Our finding suggests another known protein may be coming into play here, which could help us identify a new therapeutic target one day," said Stephen T.C. Wong, Ph.D., lead author and Associate Director of Bioinformatics and Biostatistics Cores at Houston Methodist Cancer Center. "These findings may suggest a different role of the protein amyloid beta in neurodegeneration. Many Alzheimer's researchers have focused on amyloid beta alone, or connections between amyloid beta and another protein, tau."

In a study published online in the journal EBioMedicine, Wong and his team at the Ting Tsung and Wei Fong Chao Center for BRAIN at Houston Methodist reported on a new role of OCIAD1 (ovarian cancer immune-reactive antigen domain containing 1). Originally discovered for its role in ovarian cancer metastasis and stem cell metabolism, the OCIAD1 protein was found by Wong's group in human brain cells, where it impairs neurons and damages synapses, contributing to neurodegeneration in Alzheimer's disease.

"Our research addresses a fundamental question of Alzheimer's disease--how, or if, amyloid beta accumulation that can be seen up to two decades prior to brain function decline is involved in progressive neurodegeneration," said Wong, who holds the John S. Dunn Sr. Presidential Distinguished Chair in Biomedical Engineering and is professor of computer science and bioengineering in oncology at Houston Methodist. "Examining factors that contribute to the progressive decline in people with Alzheimer's will help us develop diagnostic biomarkers and new therapeutics."

Blending computational methods with laboratory research, the scientists culled through archived bioinformatics data of brain tissue from deceased Alzheimer's patients, as well as from mouse models. They determined that OCIAD1 contributes to the disease's progressive neurodegeneration by impairing mitochondrial function. Mitochondria are known as the powerhouse of the cell; damage to them triggers a cascade of cell death in the brain, leading to neuron damage.

"We applied a system biology strategy to see if we could find a different mechanism of neurodegeneration in Alzheimer's disease. We identified OCIAD1 as a new neurodegeneration-relevant factor, predicted its function, and demonstrated it mediates the long-term impact of amyloid beta on cells and synaptic damages by impairing mitochondria function," said Xuping Li, Ph.D., co-corresponding author and an instructor in Wong's group.

Alzheimer's research has traditionally focused on a few major themes: the role of the amyloid protein in neuronal loss and how this toxic protein causes injury by interacting with tau. More recently, however, other research considers amyloid beta a bystander and questions whether it causes neuronal degeneration at all.

Wong's group next intends to examine whether OCIAD1 plays a role in the interplay between two known changes in Alzheimer's - amyloid beta and tau aggregates. If so, additional research would focus on the potential of OCIAD1 as a biomarker or therapeutic target.

The epidemic of Alzheimer's, a disease affecting more than 5.8 million Americans, is expected to grow as the aging population lives longer. According to the Alzheimer's Association and the Centers for Disease Control and Prevention, Alzheimer's is the most expensive disease in the United States, costing an estimated $290 billion in 2019.

Credit: 
Houston Methodist

Climate change unlikely to drive sugar maples north

Climate is an important factor in determining a plant species' growing zone. Some studies suggest that by the turn of the next century, climate change will have caused some species to spread several dozen kilometres north of their current distribution areas.

Such changes could have major consequences on how land-based ecosystems function.

But a northern migration isn't in the cards for sugar maples, according to Alexis Carteron, who recently published his doctoral research findings in the Journal of Ecology. His work is supervised by Professor Etienne Laliberté of Université de Montréal and co-supervised by Mark Vellend of Université de Sherbrooke.

Carteron and his colleagues at Université de Montréal's Department of Biological Sciences and the Institut de recherche en biologie végétale reached this conclusion after conducting experiments in greenhouses at the Jardin botanique de Montréal using soil samples harvested from Mont-Mégantic National Park.

The importance of soil composition
Climate - and the rising temperatures recorded in recent decades - contributes substantially to tree migration, but so does soil composition. However, we know much less about the effects of soil compared to climate.

That's why Carteron and his colleagues decided to study the effects that microorganisms and soil chemistry have on sugar maple (Acer saccharum) seedlings' performance (survival and biomass).

The researchers first collected soil samples from the eastern slope of Mont Saint-Joseph in Mont-Mégantic National Park in June 2016. The samples were taken at different altitudes to reflect the two types of forest that grow at the site.

"Mont Saint-Joseph has a substantial variation in altitude with a temperate forest of mostly sugar maple trees growing next to a boreal forest populated with conifers," said Carteron. "When you look at the mountain from a distance, it's easy to see where one forest starts and the other one ends."

Different soil experiments
The next step involved sprouting maple seeds, also known as samaras, planting them in greenhouses at the Jardin botanique and allowing them to grow over the summers of 2016 and 2017 (interrupted by a dormant phase in winter).

The researchers then applied various sterilization and inoculation treatments to the soil samples to better understand and differentiate the effects of biotic (microorganisms, fungi) and abiotic (acidity, nutrients) factors on sugar maple survival and growth.

Lower survival rates and biomass in boreal forests
At the end of summer 2017, Carteron and his colleagues assessed how well the young sugar maples had grown (based on survival rates and biomass) in different soil types.

They found that sugar maples grown in soil from the boreal forest had substantially poorer performance than those grown in the transition zone between temperate and boreal forests.

Likewise, sugar maple trees grown in boreal forest soil and inoculated boreal forest soil performed 37 per cent and 44 per cent worse, respectively, than those grown in temperate forest soil.

The researchers also noted that the pH of boreal forest soil might have negatively affected sugar maple survival rates. Meanwhile, soil from temperate forests--which is where sugar maples typically grow--allowed for better arbuscular mycorrhizal fungal colonization in the trees' roots, which can promote tree survival and growth.

"Due to the interaction of biotic and abiotic factors, boreal forest soil seems to offer a less hospitable environment for sugar maple trees than other soil types," said Carteron. "While global warming might have made it physiologically possible for sugar maple trees to grow in more northern areas, the soil conditions in these areas make a northern migration less likely."

But shouldn't soil composition also change as the climate heats up? "It's certainly possible that the soil's biotic and abiotic properties could change and allow for the sugar maple's growing zone to expand, but that type of change would take a very long time to occur," said Carteron, who's won many research awards in recent years.

Credit: 
University of Montreal

Chromatin organizes itself into 3D 'forests' in single cells

image: Northwestern University researchers have discovered how chromatin folds at the single-cell level. They found that it folds into a variety of tree-like domains spaced along a chromatin backbone.

Image: 
Northwestern University

A single cell contains the genetic instructions for an entire organism. This genomic information is managed and processed by the complex machinery of chromatin -- a mix of DNA and protein within chromosomes whose function and role in disease are of increasing interest to scientists.

A Northwestern University research team -- using mathematical modeling and optical imaging they developed themselves -- has discovered how chromatin folds at the single-cell level. The researchers found chromatin is folded into a variety of tree-like domains spaced along a chromatin backbone. These small and large areas are like a mixed forest of trees growing from the forest floor. The overall structure is a 3D forest at microscale.

Chromatin is responsible for packing DNA into the cell nucleus. In humans, that's about six feet of DNA in each cell. The new work suggests that chromatin is more structured and hierarchical in single cells than previously thought. Learning how chromatin correctly operates will help scientists understand what goes wrong with it in cancer and other diseases.

"By integrating theoretical and experimental work, we have produced a new chromatin folding picture that helps us see how the 3D genome is organized at the single-cell level," said Igal Szleifer, the Christina Enroth-Cugell Professor of Biomedical Engineering at Northwestern's McCormick School of Engineering. He co-led the research team with Vadim Backman.

Details of the interdisciplinary study will be published Jan. 10 in the journal Science Advances.

"If genes are the hardware, chromatin is the software," said Backman, the Walter Dill Scott Professor of Biomedical Engineering and director of the Center for Physical Genomics and Engineering. "If the structure of chromatin changes, it can alter the processing of the information stored in the genome, but it does not alter the genes themselves. Understanding chromatin folding holds the key to understanding how cells differentiate and how cancer happens."

Advances in genomic, imaging and information technologies are just beginning to enable scientists to better understand how chromatin works. The Northwestern researchers used a Partial Wave Spectroscopic (PWS) microscope, optical imaging developed by Backman and colleagues, to peer deep into live cells and "sense" alterations in chromatin packing. They also used electron imaging.

"Our paradigm-shifting picture of chromatin folding is an important missing piece in the holistic view of genomic structure," said Kai Huang, the study's first author. Huang is a postdoctoral fellow in Backman's research group. "The results should inspire new strategies to fight cancer."

Credit: 
Northwestern University

Medicaid expansion associated with fewer opioid overdose deaths across the US

The expansion of Medicaid coverage for low-income adults permitted by the Affordable Care Act (ACA) was associated with a six percent reduction in total opioid overdose deaths nationally, according to new research from NYU Grossman School of Medicine and University of California, Davis.

Published online January 10 in JAMA Network Open, the study is the first to look at whether the ACA-related Medicaid expansion is associated with county-level opioid overdose mortality. The researchers analyzed cause-of-death data from the National Vital Statistics System from 3,109 counties within 49 states and the District of Columbia between 2001 and 2017--looking at changes in opioid overdose rates in counties that expanded Medicaid and compared those to changes that occurred in the same time period in counties within states that did not expand Medicaid.
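The comparison described above follows a difference-in-differences logic: the change over time in expansion counties is measured against the change over the same period in non-expansion counties. The rates below are invented for illustration and are not the study's data:

```python
# Difference-in-differences sketch with invented county rates
# (opioid overdose deaths per 100,000; NOT the study's actual data).
expansion = {"before": 10.0, "after": 11.0}       # counties that expanded Medicaid
non_expansion = {"before": 10.0, "after": 12.0}   # comparison counties

# Change over time within each group...
change_exp = expansion["after"] - expansion["before"]          # +1.0
change_non = non_expansion["after"] - non_expansion["before"]  # +2.0

# ...and the difference between those changes is the estimated effect
# of expansion: here, one fewer death per 100,000 than expected had
# expansion counties followed the comparison counties' trend.
did_estimate = change_exp - change_non
```

The actual study estimates this contrast within a regression framework that adjusts for county and year effects, but the core comparison is the one sketched here.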

Drug overdose remains a leading cause of injury death in the United States and is responsible for more than 70,000 deaths annually. After examining the association of Medicaid expansion with county-by-year counts of opioid overdose deaths and by class of opioid, the researchers found that:

Medicaid expansion may have prevented between 1,678 and 8,132 opioid overdose deaths in 2015 to 2017 in the 32 states that expanded Medicaid between 2014 and 2016.

Adoption of Medicaid expansion was associated with a six percent lower rate of total opioid overdose deaths, 11 percent lower rate of death involving heroin, and a 10 percent lower rate of death involving synthetic opioids other than methadone (such as fentanyl).

Unexpectedly, an 11 percent increase in methadone overdose mortality was observed with Medicaid expansion.

"Our findings suggest that as states invest more resources in addressing the opioid overdose epidemic, policymakers should pay attention to the role that expanding Medicaid can play in reducing opioid overdose deaths by providing greater access to health care, and in particular, to treatment for opioid use disorder," said Magdalena Cerdá, DrPH, associate professor and director of the Center for Opioid Epidemiology and Policy in the Department of Population Health at NYU Langone Health, and the study's senior author. "At a broader level, the findings of this study suggest that providing expanded access to health care may be a key policy lever to address the opioid overdose crisis."

One concerning finding from the study was the association of Medicaid expansion with an 11 percent increase in overdose deaths involving methadone. According to Cerdá, Medicaid beneficiaries are more likely than the general population to receive prescriptions for methadone to treat pain. While the dispensing of methadone to treat opioid use disorder is highly effective and standardized, the use of methadone to treat pain is associated with a greater risk of overdose, reflecting in part wide variation in prescribing practices.

"Past research has found Medicaid expansion is associated with not only large decreases in the number of uninsured Americans, but also considerable increases in access to opioid use disorder treatment and the opioid overdose reversal medication naloxone," said Nicole Kravitz-Wirtz, PhD, MPH, assistant professor with the Violence Prevention Research Program in the Department of Emergency Medicine at UC Davis, and the study's lead author. "Ours was the first study to investigate the natural follow-up question: Is the expansion associated with reductions in local opioid overdose deaths? On balance, the answer appears to be yes."

Study Limitations

Cerdá and colleagues cite a number of study limitations. First, the research relies on coding of death certificate data. Since coding has changed over time, some deaths due to opioid overdoses may be misclassified. A second limitation is that the investigators looked at deaths among the whole population as opposed to just Medicaid beneficiaries. Any effect detected may be an underestimate of the effect that would be observed among Medicaid beneficiaries, says Cerdá.

Finally, although the researchers looked at the relationship between Medicaid expansion and overdose mortality, they did not directly examine pathways such as how Medicaid expansion affects access to treatment for opioid use disorder, how it affects opioid misuse and nonfatal overdoses, or how it affects access to naloxone, a medication used to counter the effects of opioid overdose. Each of these additional pathways is worthy of future study and is likely an important lever for addressing the opioid overdose crisis, says Cerdá.

Credit: 
NYU Langone Health / NYU Grossman School of Medicine

Malnutrition linked with increased risk of Zika birth defects

Congenital Zika Syndrome (CZS) refers to a collection of developmental malformations associated with congenital Zika virus (ZIKV) infection. The syndrome includes devastating conditions that have a huge impact on the rest of the lives of affected individuals and their families, such as abnormally small (microcephalic) and smooth, unfolded (lissencephalic) brains, retinal abnormalities, enlarged brain ventricles, a lack of inter-hemispheric connections, and calcifications in the brain.

Brazil has been widely affected by ZIKV, but ~75% of CZS cases have occurred in the socio-economically disadvantaged Northeast region.

In a new study, researchers from the University of Oxford and the Federal University of Rio de Janeiro have found that this rise in cases of CZS is linked to poor diet among the infants' mothers.

Professor Zoltán Molnár of the University of Oxford's Department of Physiology, Anatomy & Genetics participated in the study through a long-term collaboration, partially supported by the Medical Research Council and the Royal Society, with lead author Associate Professor Patricia Garcez of the Federal University of Rio de Janeiro, Brazil. He said: 'We knew that areas of Brazil with the lowest socioeconomic status had the highest level of developmental impairment in babies due to CZS, which is why we looked at the possible link between ZIKV and one of the potentially most important co-factors, nutrition.

'This study showed that developmental impairment caused by ZIKV congenital infection is made much worse by environmental co-factors, specifically diets poor in protein, which explains why the devastating effects of CZS vary across ZIKV endemic regions.'

The link between Zika virus infection and CZS has been demonstrated in previous studies, which helped researchers understand how the infection affects brain growth and the development of blood vessels. These showed that ZIKV infects the cells that develop into the brain and alters genes and proteins related to the normal cell cycle and blood vessel development.

The current study used a mouse model to replicate the effects of Zika infection under a low-protein diet, and found that several of the pathological signs seen in humans appeared in a similar way in the undernourished mice.

Professor Molnár added: 'When we replicated in mice the effects seen in humans with poor diets, we saw similar effects in the foetuses, such as placental damage as well as poor embryonic body growth and a reduction in brain size in newborns born to undernourished pregnant mice.

'The mouse mothers were clearly less able to fight against ZIKV, which was shown by a robust and persistent ZIKV infection in the spleens of undernourished mothers, in contrast to healthy mice. Our undernourished mouse model helped us to identify the cellular mechanisms that are responsible for the differences in humans.'

'Improving diet alone will not protect against ZIKV infections, but it can determine the severity of the CZS.

'While we need more work to translate these findings to human disease, our mouse model helped us to identify significant differences in the regulation patterns of key molecular pathways, and particular genes identified within developing brains reflect how a poor nutritional status increases the adverse effects of ZIKV infection.'

The study was partially funded by a joint MRC Grant between Professor Zoltán Molnár of the University of Oxford and Associate Professor Patricia Garcez of the Federal University of Rio de Janeiro.

Credit: 
University of Oxford

Scientists develop 'Twitter' for cells

Computational biologists led by Prof. Yvan Saeys (VIB-UGent Center for Inflammation Research) developed a new bioinformatics method to better study communication between cells. This method, called NicheNet, helps researchers to gain insight into how the gene expression of cells is regulated by interacting cells. NicheNet has a broad range of potential applications in fields like immunology and tumor biology, and was already successfully used by the collaborating group of Prof. Martin Guilliams (VIB-UGent Center for Inflammation Research).

Cell neighbors

In multicellular organisms, cells don't function on their own: they produce signaling molecules that influence gene expression in interacting cells. This intercellular communication plays an important role in many biological processes, such as the development and functioning of cells. Studying intercellular communication is not only important to understand fundamental biology, but also to gain insights into diseases like cancer. Interactions between cancer cells and other cells in the microenvironment of the tumor are crucial for its growth.

An example of a process in which intercellular communication is essential, is the differentiation of macrophages, a type of immune cell. This process is affected by other cell types in the environment, or 'niche', of the macrophage.

Researchers from the group of Martin Guilliams (VIB-Ghent University), who work in close collaboration with the Saeys lab, wanted to study this process for Kupffer cells, macrophages that reside in the blood vessels of the liver. They generated extensive gene expression data for all the cell types involved.

"But using this type of data to unravel how cells communicate is not a trivial task," says Yvan Saeys. "We needed to develop a new, sophisticated algorithm to tackle this problem."

Machine learning for talking cells

Guided by post-doc Wouter Saelens and Yvan Saeys, PhD student Robin Browaeys started developing such a new method to analyze how cells might signal each other.

Browaeys explains: "Our idea was to make use of the enormous amount of available knowledge on intercellular signaling that was acquired over the years, and use this knowledge to find out which intercellular communication processes are going on in the data we had. To do this, we had to apply several machine learning and statistical techniques, including network algorithms that are also used to analyze social networks, for example."
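The network algorithms Browaeys refers to can be illustrated with a toy example. The sketch below runs personalised PageRank, one common network-propagation technique, on a hypothetical five-node signalling graph; the graph, node names and parameters are invented for illustration, and this is not NicheNet's actual model:

```python
import numpy as np

# Toy directed signalling network: ligand -> receptor -> transcription factor -> targets.
nodes = ["ligand", "receptor", "tf", "targetA", "targetB"]
edges = [("ligand", "receptor"), ("receptor", "tf"),
         ("tf", "targetA"), ("tf", "targetB")]

idx = {n: i for i, n in enumerate(nodes)}
A = np.zeros((len(nodes), len(nodes)))
for src, dst in edges:
    A[idx[src], idx[dst]] = 1.0

# Row-normalise the adjacency matrix into a transition matrix;
# rows of dangling nodes (no outgoing edges) stay zero.
row_sums = A.sum(axis=1, keepdims=True)
P = np.divide(A, row_sums, out=np.zeros_like(A), where=row_sums > 0)

def personalized_pagerank(P, seed, alpha=0.85, iters=100):
    """Propagate influence from the seed node(s) through the network."""
    r = seed.copy()
    for _ in range(iters):
        r = alpha * (P.T @ r) + (1 - alpha) * seed
        r += (1 - r.sum()) * seed  # return mass lost at dangling nodes to the seed
    return r

seed = np.zeros(len(nodes))
seed[idx["ligand"]] = 1.0  # start propagation from the ligand
scores = personalized_pagerank(P, seed)
ranked = sorted(zip(nodes, scores), key=lambda t: -t[1])
print(ranked[0][0])  # the seed ligand scores highest; downstream targets get nonzero scores
```

In a NicheNet-style analysis, such propagated scores over a prior knowledge network are what link a sender cell's ligands to candidate target genes in the receiver cell.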

Saelens summarizes: "In essence, you can compare our newly developed method, NicheNet, with a hypothetical biologist who not only knows everything that is already published about intercellular communication, but who can also apply all this knowledge on complex, big datasets. Making reasonable predictions on intercellular communication is something that would have required weeks of literature study in the past, but this can now be done with the push of a button."

Testing NicheNet

A first test case for NicheNet was the Kupffer cell niche data generated by the Guilliams lab. Researchers from that lab were able to experimentally validate some of the signals that NicheNet predicted. "Thanks to NicheNet, we looked into factors that we would not have thought about ourselves", confirms Martin Guilliams. "For us, NicheNet was an essential tool to help unravel the Kupffer cell niche".

"In addition to the Kupffer cell story, we have also been applying NicheNet to investigate cell-cell communication in the tumor microenvironment", adds Saeys. "We used NicheNet on single-cell data that was published earlier, but we are now working on novel single-cell datasets generated by collaborating research groups. How different types of treatment affect the cellular interactions within the tumor microenvironment, and how this influences the tumor, are some of the questions we are trying to address with NicheNet."

These different applications illustrate the value of NicheNet for generating novel hypotheses about how cells communicate in fundamental biological processes and diseases.

Credit: 
VIB (the Flanders Institute for Biotechnology)

When David poses as Goliath

Stellar black holes form when massive stars end their life in a dramatic collapse. Observations have shown that stellar black holes typically have masses of about ten times that of the Sun, in accordance with the theory of stellar evolution. Recently, a Chinese team of astronomers claimed to have discovered a black hole as massive as 70 solar masses, which, if confirmed, would severely challenge the current view of stellar evolution. The publication immediately triggered theoretical investigations as well as additional observations by other astrophysicists. Among those to take a closer look at the object was a team of astronomers from the Universities of Erlangen-Nürnberg and Potsdam. They discovered that it may not necessarily be a black hole at all, but possibly a massive neutron star or even an 'ordinary' star. Their results have now been published as a highlight paper in the renowned journal Astronomy & Astrophysics.

The putative black hole was detected indirectly from the motion of a bright companion star, orbiting an invisible compact object over a period of about 80 days. From new observations, a Belgian team showed that the original measurements were misinterpreted and that the mass of the black hole is, in fact, very uncertain. The most important question, namely how the observed binary system was created, remains unanswered. A crucial aspect is the mass of the visible companion, the hot star LS V+22 25. The more massive this star is, the more massive the black hole has to be to induce the observed motion of the bright star. The latter was considered to be a normal star, eight times more massive than the Sun.

A team of astronomers from Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and the University of Potsdam had a closer look at the archival spectrum of LS V+22 25, taken by the Keck telescope at Mauna Kea, Hawaii. In particular, they were interested in studying the abundances of the chemical elements on the stellar surface. Interestingly, they detected deviations in the abundances of helium, carbon, nitrogen, and oxygen compared to the standard composition of a young massive star. The observed pattern on the surface showed ashes resulting from the nuclear fusion of hydrogen, a process that only happens deep in the core of young stars and would not be expected to be detected at its surface.

'At first glance, the spectrum did indeed look like one from a young massive star. However, several properties appeared rather suspicious. This motivated us to have a fresh look at the archival data', said Andreas Irrgang, the leading scientist of this study and a member of the Dr. Karl Remeis-Observatory in Bamberg, the Astronomical Institute of FAU.

The authors concluded that LS V+22 25 must have interacted with its compact companion in the past. During this episode of mass-transfer, the outer layers of the star were removed and now the stripped helium core is visible, enriched with the ashes from the burning of hydrogen.

However, stripped helium stars are much lighter than their normal counterparts. Combining their results with recent distance measurements from the Gaia space telescope, the authors determined a most likely stellar mass of only 1.1 (with an uncertainty of +/-0.5) times that of the Sun. This yields a minimum mass of only 2-3 solar masses for the compact companion, suggesting that it may not necessarily be a black hole at all, but possibly a massive neutron star or even an 'ordinary' star.
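The step from the visible star's mass to a minimum companion mass follows from the binary mass function. The sketch below reproduces that arithmetic with illustrative orbital parameters (the ~80-day period mentioned above and an assumed radial-velocity semi-amplitude of ~53 km/s; these are stand-in values, not the paper's exact numbers):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg

# Illustrative orbital parameters (assumed values for this sketch).
P = 79 * 86400.0   # orbital period, s
K = 53e3           # radial-velocity semi-amplitude of the visible star, m/s
M1 = 1.1           # stripped-star mass from this study, solar masses

# Binary mass function: f = P K^3 / (2 pi G) = (M2 sin i)^3 / (M1 + M2)^2
f = P * K**3 / (2 * math.pi * G) / M_SUN  # in solar masses

# Minimum companion mass: set sin i = 1 and solve M2^3 = f (M1 + M2)^2 by bisection.
lo, hi = 0.0, 100.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mid**3 < f * (M1 + mid) ** 2:
        lo = mid  # root lies above mid
    else:
        hi = mid
M2_min = 0.5 * (lo + hi)
print(f"mass function ~ {f:.2f} Msun, minimum companion ~ {M2_min:.1f} Msun")
```

With these assumed inputs the minimum companion mass lands in the 2-3 solar-mass range quoted in the text, which is why a neutron star or ordinary star cannot be excluded.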

The star LS V+22 25 has become famous for possibly having a massive black hole companion. However, a closer look at the star itself reveals that it is a very intriguing object in its own right: whilst stripped helium stars of intermediate mass have been predicted in theory, only very few have been discovered so far. They are key objects for understanding binary star interactions.

Credit: 
Friedrich-Alexander-Universität Erlangen-Nürnberg

Explosion or collapse?

image: Light from the stellar explosion that created this energized cosmic cloud was first seen on planet Earth in October 1604, a mere 400 years ago. The supernova produced a bright new star in early 17th century skies within the constellation Ophiuchus. It was studied by astronomer Johannes Kepler and his contemporaries. Recent data has shown relative elemental abundances typical of a Type Ia supernova, and further indicated that the progenitor was a white dwarf star that exploded when it accreted too much material from a companion. The explosions discussed in the publication would produce a remnant that looks like Kepler but with the presence of an oxygen-neon-iron white dwarf at the center.

Image: 
Picture: X-ray: NASA/CXC/NCSU/M. Burkey et al.; Optical: DSS

A group of scientists, among them several from GSI Helmholtzzentrum für Schwerionenforschung and from Technical University of Darmstadt, has succeeded in experimentally determining characteristics of nuclear processes in matter ten million times denser and 25 times hotter than the centre of our Sun. One result of the measurement is that intermediate-mass stars are very likely to explode, and not, as assumed until now, collapse. The findings are now published in the journal Physical Review Letters. They stress the fascinating opportunities offered by future accelerator facilities like FAIR for understanding the processes defining the evolution of the Universe.

Stars have different evolutionary paths depending on their mass. Low-mass stars such as the Sun will eventually become white dwarfs. Massive stars, on the other hand, finish with a spectacular explosion known as a supernova, leaving either a neutron star or a black hole behind. The fate of both low- and high-mass stars is well understood but the situation for intermediate-mass stars, which weigh between seven and eleven times as much as the Sun, has remained unclear. This is surprising since intermediate-mass stars are prevalent in our Galaxy.

"The final fate of intermediate-mass stars depends on a tiny detail, namely, how readily the isotope neon-20 captures electrons in the stellar core. Depending on this electron capture rate, the star will either be disrupted in a thermonuclear explosion or collapse to form a neutron star," explains Professor Gabriel Martínez-Pinedo of GSI's research department Theory and the Institut für Kernphysik, TU Darmstadt. Professor Karlheinz Langanke, Research Director of GSI and FAIR, adds: "This work started when we realized that a strongly suppressed, and hence previously ignored and experimentally unknown, transition between the ground states of neon-20 and fluorine-20 was a key piece of information needed to determine the electron capture rate in intermediate-mass stars."

By combining precise measurements of the beta-decay of fluorine-20 with theoretical calculations, an international collaboration of physicists with participation from GSI and TU Darmstadt has now succeeded in determining this important rate. The experiment took place under conditions far more peaceful than those found in stars, namely at the Accelerator Laboratory of the University of Jyväskylä. The measurements showed a surprisingly strong transition between the ground states of neon-20 and fluorine-20, which means that electron capture in neon-20 occurs at lower density than previously believed. For the star, this implies that, contrary to previous assumptions, it is more likely to be disrupted by a thermonuclear explosion than to collapse into a neutron star.

"It is amazing to find out that a single transition can have such a strong impact on the evolution of a big object like a star," says Dag Fahlin Strömberg, who, as a PhD student at TU Darmstadt, was responsible for large parts of the project's simulations.

Since thermonuclear explosions eject much more material than those triggered by gravitational collapse, the results have implications for galactic chemical evolution. The ejected material is rich in titanium-50, chromium-54, and iron-60. Therefore, the unusual titanium and chromium isotopic ratios found in some meteorites, and the discovery of iron-60 in deep-sea sediments could be produced by intermediate-mass stars and indicate that these have exploded in our galactic neighbourhood in the distant (billions of years) and not so distant (millions of years) past.

In the light of these new findings the most probable fate of intermediate-mass stars seems to be a thermonuclear explosion, producing a subluminous type Ia supernova and a special type of white dwarf star known as an oxygen-neon-iron white dwarf. The (non-)detection of such white dwarfs in the future would provide important insights into the explosion mechanism. Another open question is the role played by convection -- the bulk movement of material in the interior of the star -- in the explosion.

At existing and future accelerator centres like the international FAIR project (Facility for Antiproton and Ion Research), currently under construction at GSI, new and as-yet-uninvestigated isotopes and their properties can be studied. In this way, scientists continue to bring the universe into the laboratory to answer open questions about our cosmos.

Credit: 
Helmholtz Association

Long-term medication for schizophrenia is safe

image: Jari Tiihonen, professor at the Department of Clinical Neuroscience, Karolinska Institutet, Sweden. Photo: Stefan Zimmerman.

Image: 
Stefan Zimmerman

Researchers at Karolinska Institutet in Sweden and their colleagues in Germany, the USA and Finland have studied the safety of very long-term antipsychotic therapy for schizophrenia. According to the study, which is published in the scientific journal World Psychiatry, mortality was higher during periods when patients were not on medication than when they were.

People with schizophrenia have an average life expectancy ten to twenty years below the norm, and there has long been concern that long-term use of antipsychotic drugs is one of the causes. Earlier compilations (meta-analyses) of results from randomised studies, however, indicated that the mortality rate for people with schizophrenia on antipsychotic medication was 30 to 50 per cent lower than for those who received a placebo.

However, most of the studies done have been shorter than six months, which does not reflect the reality of treatment often being life-long. Researchers from Karolinska Institutet and their international colleagues have now done a long-term follow-up, substantiating previous results and demonstrating that antipsychotic drugs are not associated with increased risk of co-morbid complications, such as cardiovascular disease. The study is the largest conducted in the field to date.

"It's difficult to make comparisons between people on permanent medication and those who aren't, as these groups differ in many ways," says Heidi Taipale, assistant professor at the Department of Clinical Neuroscience at Karolinska Institutet. "One common method of dealing with this has been to try to take account of such differences when making comparisons. However, we chose another method, in which each person was their own control, making it possible for us to make individual comparisons of hospitalisation during periods of antipsychotic medication and periods of no treatment."

The researchers monitored just over 62,000 Finns who had received a schizophrenia diagnosis at some time between 1972 and 2014. They did this by accessing various Finnish registries up until 2015, giving an average follow-up period of over 14 years. They found that the likelihood of being hospitalised for a somatic disease was just as high during periods when the patients were on antipsychotic drugs as when they were not. The differences in mortality, however, were noticeable: the cumulative mortality rates over the follow-up period were 26 per cent during periods of medication and 46 per cent during periods of non-medication.

The researchers believe that there is overwhelming support for continual antipsychotic treatment for schizophrenia being a safer option than no medication. At the same time, treatment brings the risk of adverse reactions, such as an increase in weight, which can raise the risk of cardiovascular disease. The finding that treatment with antipsychotic drugs does not increase the likelihood of hospitalisation for cardiovascular disease may be attributable, argue the researchers, to the fact that the drugs can also have an antihypertensive effect and can reduce anxiety and the risk of substance abuse. Antipsychotic treatment may also help patients adopt a healthier lifestyle and make them more likely to seek care when needed.

"Antipsychotics get something of a bad press, which can make it difficult to reach out to the patient group with information on how important they are," says Jari Tiihonen, professor of psychiatry at the Department of Clinical Neuroscience, Karolinska Institutet. "We know from previous studies that only half of those who have been discharged from hospital after their first psychotic episode with a schizophrenia diagnosis take antipsychotic drugs. Besides, there are many people with schizophrenia who are on long-term benzodiazepine medication, which is in breach of existing guidelines and is associated with increased mortality risk. Building trust and understanding towards the efficacy and safety of antipsychotic drugs is important, and we hope that this study can contribute to this end."

Credit: 
Karolinska Institutet

New function for potential tumor suppressor in brain development

image: With the MADM technique, researchers can remove a gene from single cells and visualize what happens to these cells.

Image: 
© IST Austria - Hippenmeyer group

The gene Cdkn1c could have been considered an open-and-shut case: Mice in which the gene is removed are larger and have bigger brains, so Cdkn1c should function to inhibit growth. This rationale has led to Cdkn1c being studied as a tumour suppressor gene. New research from the group of Simon Hippenmeyer, professor at the Institute of Science and Technology Austria (IST Austria), has now uncovered a novel, opposite role for Cdkn1c. When Cdkn1c is removed only in certain cells of the brain, these cells die, arguing for a new growth-promoting role of Cdkn1c. The new research is published today in the journal Nature Communications.

Simon Hippenmeyer and his research group, including co-first authors Susanne Laukoter (PhD student), Robert Beattie (postdoc) and Florian Pauler (senior technical assistant), removed Cdkn1c in a brain region called the cerebral cortex in mice and found a surprising result: Contrary to what had previously been thought, the cortex was smaller, not bigger, than in animals with a normal amount of Cdkn1c. To make sense of this seeming paradox, the researchers compared the effect of Cdkn1c loss in the whole animal with a loss of the gene in just a single tissue or even in single cells in the developing mouse.

Studying brain development and gene function at single cell level with MADM

Using a genetic technique called Mosaic Analysis with Double Markers (MADM) allowed the researchers to knock out a gene of interest in single cells and, at the same time, visualize the effect of gene deletion on these cells under the microscope. When they removed the gene Cdkn1c from cells across the whole cortex, the cortex was smaller. "When we take out the gene, cells die. In fact, we see massive death by apoptosis", Hippenmeyer explains.

In a cortex where Cdkn1c was removed, the researchers further modified single cells with MADM to observe their fate. They found that if a cell has two intact copies of Cdkn1c, the cell is protected against death. If a cell has just one intact copy of Cdkn1c, the cell dies. Intriguingly, it does not matter whether the DNA, the "instruction manual" in our cells that defines how products like proteins are made, is active and thus allows generation of proteins, or not. Just having two copies of the intact DNA, the intact instruction manual, is enough to protect a cell from death.

Implications for studies on brain malformations and tumour development

For Hippenmeyer, this study underlines the importance of studying both systemic effects of gene loss (i.e. gene loss in the whole animal) and the effect of gene loss in individual cells. "Our method reveals a new function of Cdkn1c, as taking the gene out in a single cell has a fundamentally different effect from taking it out in the whole animal. Systemic effects may mask the effect observed in individual cells. It is important to also study this in human conditions that lead to malformations of the brain, such as microcephaly."

As Cdkn1c and its role in the development of tumours has been studied extensively, the new research likely also has important implications for this field, says Florian Pauler. "There has been interest in Cdkn1c as it has been regarded as a tumour suppressor. Like the single cells and individual tissue we studied, tumours can also be seen as non-systemic. So, our findings change the way we should think about Cdkn1c, also in tumours."

In the future, Hippenmeyer and his research group will continue to explore the mechanisms and functions of Cdkn1c. "When this piece of DNA is missing, something fundamental is changed and death is triggered in a cell. Of course, we want to now know why and how this happens", Hippenmeyer asserts.

Credit: 
Institute of Science and Technology Austria