Culture

Outreach effective for opioid use disorder long-term treatment

Proactive outreach, including knocking on the doors of individuals who recently overdosed on opioids, can be an effective way to engage more people who have opioid use disorder in long-term care, according to researchers at The University of Texas Health Science Center at Houston (UTHealth).

In a pilot study, the UTHealth research team measured results of a clinical research program, HEROES, that they created to curb opioid use disorder (OUD) in Houston. Findings were recently published in the Journal of Substance Abuse Treatment. HEROES is one of only a handful of programs in the country engaged in large-scale, door-to-door outreach and tracking results.

According to data from the Harris County Institute of Forensic Sciences, there were 325 opioid-related deaths from January through November of 2019 in Houston, up from 295 in all of 2018. The risk of overdose death is rising due to the increased availability of fentanyl and other synthetic opioids. Federal data shows nearly 90% of people who have substance use disorder in the U.S. are not currently in treatment.

"People with OUD don't always voluntarily seek treatment like you would with any other disease," said James Langabeer, PhD, EdD, MBA, a professor at UTHealth who leads the HEROES program. "Even if they recognize they need help, there are few places for them to turn, especially if they don't have insurance. We want to remove all those barriers and bring help directly to them at a time when they are more likely to accept it."

Theories suggest that people have greater readiness for behavioral change during critical periods or life events, and surviving an overdose could represent such an event, said Langabeer, who was the lead author on the study.

Langabeer and his team at HEROES designed an intervention strategy to identify survivors of overdose through secured data from Memorial Hermann-Texas Medical Center and the Houston Fire Department Emergency Medical Services. A trained recovery support coach and a paramedic then knock on survivors' doors. If the survivors are home and willing to talk, the pair follow a motivational interview guide to encourage them to enroll in the HEROES program at no cost to them.

"This type of engagement is important because it helps to emphasize compassion and collaboration with each patient. We use it to have the person feel autonomy in their own decisions and not to be led or forced to a conclusion, but to reach a choice on their own and to feel accountable for it. We find that if they are engaged, they are more likely to follow through in the long run," Langabeer said.

The peer recovery coaches are people who are in long-term recovery from opioid use disorder themselves.

"We've been through the fire and came out with buckets of water to fight this disease," said Jessica Yeager, a coach with HEROES. "Now people can look at me and say 'You're just like me - you can help me!' If someone had knocked on my door when I was at my worst, my life would be different today."

Between April and December 2018, the team visited 103 people, and 33% chose to engage in the treatment program. After 30 days, 88% of those participants were still active in the program, and 56% were still active after 90 days.
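To make those percentages concrete, here is the approximate participant funnel they imply (a back-of-the-envelope sketch; the rounding to whole participants is our assumption, not a figure from the paper):

    # Approximate counts implied by the reported rates (illustrative only)
    visited = 103
    engaged = round(0.33 * visited)       # 33% enrolled at the doorstep -> ~34 people
    active_30 = round(0.88 * engaged)     # 88% of enrollees active at 30 days -> ~30
    active_90 = round(0.56 * engaged)     # 56% of enrollees active at 90 days -> ~19
    print(engaged, active_30, active_90)  # 34 30 19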

"Having one third of people we reached out to agree to a major step in their recovery while we are in their doorway is a huge accomplishment and much better than we expected, since other studies have shown far lower results," Langabeer said. "Recognizing that major life decisions require time, we are extremely satisfied with the results. Also, we've seen people choose to come into treatment weeks or even months later, so the information we provide during outreach can also help shape future choices."

Treatment through the HEROES program includes access to opioid overdose reversal medication, a recovery coach available 24/7, behavioral counseling with an addiction therapist, weekly support group meetings, and help connecting to county resources.

Another study Langabeer's team just published, in Substance Abuse Treatment, Prevention, and Policy, found evidence that nontraditional routes such as community outreach, the criminal justice system, and emergency departments can engage more people in medication-assisted treatment.

"We have to find more proactive ways to identify and locate these individuals in order to offer choices and paths for recovery that fit their unique situation," said Langabeer, who was the senior author of the study.

Credit: 
University of Texas Health Science Center at Houston

New discovery has important implications for treating common eye disease

image: The honeycomb structure of human retinal pigment epithelial (RPE) cells with oxidative stress-induced complement (C5b-C9) in red.

Image: 
Dr Sarah Doyle, Trinity College Dublin.

Scientists from Trinity College Dublin have made an important discovery with implications for those living with a common, debilitating eye disease (age-related macular degeneration, AMD) that can cause blindness.

They have discovered that the molecule TLR2, which recognises chemical patterns associated with infection in the body, also seems to play an important role in the development of retinal degeneration.

AMD is the most common form of central vision loss in adults, with approximately 70,000 Irish people living with the condition. People with AMD may have difficulty recognising faces, reading, watching television and driving as their central retina degenerates.

Aging is the greatest risk factor for the development of AMD, with one in four people over the age of 75 living with the condition. To date, no pharmaceutical interventions are available to prevent the progression of the disease. Patients living with dry AMD are generally advised to make lifestyle changes such as stopping smoking and improving diet and exercise regimes.

Dr Sarah Doyle, assistant professor of immunology at Trinity, who led the study which has just been published in leading journal Cell Reports, said:

"The lack of approved therapies for AMD is mainly because the factors involved in triggering the disease are not very well understood. Understanding and identifying early molecular events that may trigger dry AMD will allow us to develop a more targeted approach to therapy. In this case, we believe that regulating the activity of TLR2 may, over time, help to prevent the progression of dry AMD."

Two biological processes involved in AMD are the uncontrolled "oxidative stress" that results in the formation of bleach-like chemicals in the retina, and the laying down of a protein called complement, which "tags" whatever it touches for elimination.

In this study the scientists implicated TLR2 as a critical bridge between oxidative damage and complement-mediated retinal degeneration. TLR2, which is found on the surface of cells, is part of the immune system because it is known to sense infection through recognising chemical danger signals that are found on microorganisms like bacteria and yeast.

Once TLR2 is activated by a danger signal it triggers a signal cascade, which is a bit like a cellular assembly line, with information about the cell's immediate environment passed to our genes, which then mount an inflammatory response.

"In the case of the eye, TLR2 appears to act as a sensor of oxidative-stress, recognising a chemical pattern that is generated during oxidation, rather than infection, and triggering a signal cascade that ends in promoting the laying down of complement," said first author on the paper, Dr Kelly Mulfaul, from Trinity.

Dr Sarah Doyle added:

"A function for TLR2 has not previously been reported in retinal neurodegenerative disease pathology but it is likely to play an important role, because when we remove TLR2 from our experimental model systems we reduce the level of complement and this has the effect of protecting cells that are essential for vision from dying.

"With the continual increase in life expectancy outpacing the rate at which drugs for age-related conditions are developed new avenues of therapy are badly needed, so the fact that blocking this single protein can have such a protective effect in the eye is a particularly exciting discovery."

Credit: 
Trinity College Dublin

Sub-Neptune sized planet validated with the habitable-zone planet finder

image: A candidate planet outside our solar system was validated using data from the Habitable-zone Planet Finder Spectrograph, a Penn State-led near-infrared spectrograph recently installed on the 10m Hobby-Eberly Telescope at McDonald Observatory in Texas.

Image: 
Gudmundur Stefansson (left) and Ethan Tweedie Photography (right)

A signal originally detected by the Kepler spacecraft has been validated as an exoplanet using the Habitable-zone Planet Finder (HPF), an astronomical spectrograph built by a Penn State team and recently installed on the 10m Hobby-Eberly Telescope at McDonald Observatory in Texas. The HPF provides the highest precision measurements to date of infrared signals from nearby low-mass stars, and astronomers used it to validate the candidate planet by excluding all possibilities of contaminating signals to a very high level of probability. The details of the findings appear in the Astronomical Journal.

The planet, called G 9-40b, is about twice the size of the Earth, but likely closer in size to Neptune, and orbits its low-mass host star, an M dwarf star, only 100 light years from Earth. Kepler detected the planet by observing a dip in the host star's light as the planet crossed in front of--or transited--the star during its orbit, a trip completed every six Earth days. This signal was then validated using precision spectroscopic observations from the HPF, ruling out the possibility of a close stellar or substellar binary companion. Observations from other telescopes, including the 3.5m telescope at Apache Point Observatory and the 3m Shane Telescope at Lick Observatory, helped to confirm the identification.
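As a rough guide to how such a dip translates into a planet size, the fractional drop in starlight during a transit is set by the planet-to-star radius ratio. In the worked figure below, the 0.3-solar-radius stellar size is an assumed typical mid-M-dwarf value, not a number from the paper:

\[ \delta \approx \left(\frac{R_p}{R_\star}\right)^2 \approx \left(\frac{2\,R_\oplus}{0.3\,R_\odot}\right)^2 \approx \left(\frac{2 \times 0.0092}{0.3}\right)^2 \approx 0.4\% \]

A dip of a few tenths of a percent recurring every six days is the kind of signal Kepler could flag but not, on its own, confirm.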

"G 9-40b is amongst the top twenty closest transiting planets known, which makes this discovery really exciting," said Guðmundur Stefánsson, lead author of the paper, and a former PhD student at Penn State who is currently a postdoctoral fellow at Princeton University. "Further, due to its large transit depth, G 9-40b is an excellent candidate exoplanet to study its atmospheric composition with future space telescopes."

"The spectroscopic observations from HPF allowed us to place an upper bound of 12 Earth masses on the mass of the planet," said Caleb Cañas, a graduate student at Penn State and an author of the paper. "This demonstrates that a planet is causing the dips in light from the host star, rather than another astrophysical object such as a background star. We hope to obtain more observations with HPF to precisely measure its mass, which will allow us to constrain its bulk composition and differentiate between a predominantly rocky or gas-rich composition."

HPF was delivered to the 10m Hobby-Eberly Telescope at McDonald Observatory in late 2017 and started full science operations in late 2018. The instrument is designed to detect and characterize planets in the habitable-zone--the region around the star where a planet could sustain liquid water on its surface--around nearby low-mass stars. A unique feature of HPF is its precise spectral calibration with a laser frequency comb built by collaborators at the National Institute of Standards and Technology and the University of Colorado Boulder.

"Using HPF, we are currently surveying the nearest low-mass stars--also called M-dwarfs, which are the most common stars in the galaxy--with the goal of discovering exoplanets in our stellar neighborhood," said Suvrath Mahadevan, professor of astronomy and astrophysics at Penn State and principal investigator of the HPF spectrograph.

In addition to the data obtained with HPF, the scientists observed another transit of the planet with the 3.5m telescope at Apache Point Observatory in New Mexico, using a photometric technique and instrumentation developed as part of Stefánsson's doctoral thesis. These transit observations helped further resolve the "transit shape"--the curve that represents how much of the host star's light is blocked--resulting in more precise planetary parameters. In addition, high-contrast imaging observations using the 3m Shane Telescope at Lick Observatory showed that the host star was the true source of the transits.

"It is exciting to see this first result of the HPF survey coming out. HPF was built from the ground up to enable precision measurements to discover and confirm planets," said Larry Ramsey, emeritus professor of astronomy and astrophysics at Penn State.

Credit: 
Penn State

How newborn stars prepare for the birth of planets

image: This image shows the Orion Molecular Clouds, the target of the VANDAM survey. Yellow dots are the locations of the observed protostars on a blue background image made by Herschel. Side panels show nine young protostars imaged by ALMA (blue) and the VLA (orange).

Image: 
ALMA (ESO/NAOJ/NRAO), J. Tobin; NRAO/AUI/NSF, S. Dagnello; Herschel/ESA

An international team of astronomers used two of the most powerful radio telescopes in the world to create more than three hundred images of planet-forming disks around very young stars in the Orion Clouds. These images reveal new details about the birthplaces of planets and the earliest stages of star formation.

Most of the stars in the universe are accompanied by planets. These planets are born in rings of dust and gas, called protoplanetary disks. Even very young stars are surrounded by these disks. Astronomers want to know exactly when these disks start to form, and what they look like. But young stars are very faint, and there are dense clouds of dust and gas surrounding them in stellar nurseries. Only highly sensitive radio telescope arrays can spot the tiny disks around these infant stars amidst the densely packed material in these clouds.

For this new research, astronomers pointed both the National Science Foundation's Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/submillimeter Array (ALMA) to a region in space where many stars are born: the Orion Molecular Clouds. This survey, called VLA/ALMA Nascent Disk and Multiplicity (VANDAM), is the largest survey of young stars and their disks to date.

Very young stars, also called protostars, form in clouds of gas and dust in space. The first step in the formation of a star is when these dense clouds collapse due to gravity. As the cloud collapses, it begins to spin - forming a flattened disk around the protostar. Material from the disk continues to feed the star and make it grow. Eventually, the left-over material in the disk is expected to form planets.

Many aspects about these first stages of star formation, and how the disk forms, are still unclear. But this new survey provides some missing clues as the VLA and ALMA peered through the dense clouds and observed hundreds of protostars and their disks in various stages of their formation.

Young planet-forming disks

"This survey revealed the average mass and size of these very young protoplanetary disks," said John Tobin of the National Radio Astronomy Observatory (NRAO) in Charlottesville, Virginia, and leader of the survey team. "We can now compare them to older disks that have been studied intensively with ALMA as well."

What Tobin and his team found is that very young disks can be similar in size, but are on average much more massive than older disks. "When a star grows, it eats away more and more material from the disk. This means that younger disks have a lot more raw material from which planets could form. Possibly bigger planets already start to form around very young stars."

Four special protostars

Among hundreds of survey images, four protostars looked different than the rest and caught the scientists' attention. "These newborn stars looked very irregular and blobby," said team member Nicole Karnath of the University of Toledo, Ohio (now at SOFIA Science Center). "We think that they are in one of the earliest stages of star formation and some may not even have formed into protostars yet."

Finding four of these objects is remarkable. "We rarely find more than one such irregular object in one observation," added Karnath, who used these four infant stars to propose a schematic pathway for the earliest stages of star formation. "We are not entirely sure how old they are, but they are probably younger than ten thousand years."

To be defined as a typical (class 0) protostar, a star must not only have a flattened rotating disk surrounding it, but also an outflow - spewing away material in opposite directions - that clears the dense cloud surrounding the star and makes it optically visible. This outflow is important because it prevents the star from spinning out of control while it grows. But exactly when these outflows start to happen is an open question in astronomy.

One of the infant stars in this study, called HOPS 404, has an outflow of only two kilometers (1.2 miles) per second, whereas typical protostellar outflows reach 10-100 km/s (6-62 miles/s). "It is a big puffy sun that is still gathering a lot of mass, but just started its outflow to lose angular momentum to be able to keep growing," explained Karnath. "This is one of the smallest outflows that we have seen and it supports our theory of what the first step in forming a protostar looks like."

Combining ALMA and VLA

The exquisite resolution and sensitivity provided by both ALMA and the VLA were crucial to understand both the outer and inner regions of protostars and their disks in this survey. While ALMA can examine the dense dusty material around protostars in great detail, the images from the VLA made at longer wavelengths were essential to understand the inner structures of the youngest protostars at scales smaller than our solar system.

"The combined use of ALMA and the VLA has given us the best of both worlds," said Tobin. "Thanks to these telescopes, we start to understand how planet formation begins."

Credit: 
National Radio Astronomy Observatory

The climate and increased extreme weather affect our energy systems

image: This is Deliang Chen.

Image: 
University of Gothenburg

Climate change, with more and more storms and heat waves, also has consequences for our energy supply. An international research team has now developed a new method for calculating how extreme weather affects energy systems.

Climate change is often described in terms of average temperature changes. But it is mainly extreme weather events, like cold snaps, autumn storms and summer heat waves, that have the greatest impact on the economy and society.

And our energy systems - especially systems that include renewable energy sources - are highly dependent on the weather and climate. But to date, there have been no suitable methods for calculating how future extreme weather events will affect these energy systems.

Weather changes affect renewable energy sources

Renewable energy sources, such as solar and wind power, play a crucial role in reducing climate change by partially replacing fossil-fuel-based energy sources.

"But their capacity is highly dependent on weather conditions, which makes their share in the existing energy system something of a challenge when it comes to reliability and stability," says Deliang Chen, professor of physical meteorology at the University of Gothenburg and one of the five researchers on the international research team.

Designed a new method

The researchers have now developed a new method for predicting how energy systems that contain renewable energy technology may be affected in the future by a changing climate, with a focus on extreme weather. They have used the method to analyse 30 Swedish cities, including Gothenburg, Stockholm and Malmö.

The results show that future extreme climate events can have a considerable impact on energy systems.

"When we used the method in 30 Swedish cities and considered 13 scenarios for climate change, we saw uncertainty in the potential for renewable energy and energy demand."

The gap between energy supply and energy demand may, under certain future climate variations, be as much as 34 per cent.

"That means a reduction in power supply reliability of up to 16 per cent due to extreme weather events."

Collaboration necessary

Faced with upcoming climate changes, shaping and optimising an energy system that can coordinate renewable energy sources in the energy supply requires close collaboration between energy system experts and climate researchers, according to Deliang Chen.

"We need this kind of collaboration to handle the complexity of climate and energy systems and to be able to predict the multidimensional effects that await."

Credit: 
University of Gothenburg

Pill-sized 'heater' could increase accessibility in diagnosing infectious disease

image: The technology significantly reduces the footprint of heating. Left: typical plate heater used in the laboratory. Right: Miniature heater.

Image: 
Qin Dai / University of Toronto Engineering

Researchers at University of Toronto Engineering have developed a tiny "heater" - about the size of a pill - that could allow resource-limited regions around the world to test for infectious diseases without the need for specialized training or costly lab equipment.

The technology regulates the temperature of biological samples through different stages of diagnostic testing, which is crucial to the accuracy of test results.

"The precision and flexibility of our heater opens the door to a future of do-it-yourself diagnostic kits," says PhD candidate Pranav Kadhiresan, who developed the device alongside PhD candidate Buddhisha Udugama, under the supervision of Professor Warren Chan.

"We could combine the simplicity of a high school chemistry set with the precision of cutting-edge lab instruments," adds Kadhiresan. The technology behind the team's miniaturized heater invention is describe in a paper published in the journal of Proceedings of the National Academy of Sciences (PNAS).

In a typical diagnostic test for infectious pathogens, multiple temperature-regulation steps are involved. The ability to control temperature is especially important in areas where access to large research facilities is limited.

"The lack of electricity adds a layer of complexity," says Udugama. "Our miniature heater addresses that. It can be used in various settings to detect viruses without the need for electricity. If we were to summarize the benefits of our technology, it would be accessibility, portability and precision."

The outside of the heater tablet is composed of a non-reactive acrylic mould that encapsulates lithium, a reactive element that is commonly found in battery cells. When the tablet is placed in water, the lithium reacts with the solution to release heat and hydrogen gas, raising the temperature for an extended period of time.
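The underlying chemistry is the familiar, strongly exothermic lithium-water reaction:

\[ 2\,\mathrm{Li} + 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{LiOH} + \mathrm{H_2} \]

Heat is released for as long as unreacted lithium remains, which is what lets a single tablet hold a sample at an elevated temperature over an extended period.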

The researchers observed that the reproducibility of the temperature profile is controlled by constant gas release, which is dictated by the shape of the lithium mould. After testing multiple shapes of the lithium mould - from circles to triangles - they found the star shape, measuring just 8 millimetres in diameter, to be the best suited for precise heating.

Consolidating multiple steps into a single tablet also means specialized training is not required to run diagnostic tests, reducing the chance of human error and making the device accessible to the public.

"Tablets are conventionally used for medications such as aspirins. But we have now developed a series of tablets and pills that can diagnose diseases," says Chan.

"Combined with smartphone technology, everyone would have a portable system that can track, monitor and diagnose infections. This is critical for preventing the spread of diseases."

Credit: 
University of Toronto Faculty of Applied Science & Engineering

Study highlights potential need to standardize quality measurement for cardiovascular care

In a new study published today in JAMA Cardiology, a team of researchers led by Rishi Wadhera, MD, MPP, MPhil, an investigator in the Smith Center for Outcomes Research in Cardiology at Beth Israel Deaconess Medical Center (BIDMC), found that hospitals that received awards from the American Heart Association (AHA) and American College of Cardiology (ACC) for the delivery of high-quality care for acute myocardial infarction (AMI) and heart failure (HF) were more likely to be financially penalized under value-based programs than other hospitals.

"Our findings highlight that evaluations of hospital quality for acute myocardial infarction and heart failure care differ between the American Heart Association/American College of Cardiology national quality improvement initiatives and federal value-based programs," said Wadhera. "Hospitals recognized by the AHA/ACC for high quality care were more likely to be financially penalized by federal value-based programs than other hospitals, despite achieving similar and/or better outcomes."

Since the passage of the Affordable Care Act, the Centers for Medicare and Medicaid Services (CMS) have implemented national value-based programs that aim to incentivize the delivery of higher value care. The Hospital Readmissions Reduction Program (HRRP) imposes financial penalties on hospitals with higher-than-expected 30-day readmission rates. In addition, the Hospital Value-Based Purchasing Program (VBP) - a pay-for-performance initiative - rewards or penalizes hospitals based on their performance on multiple domains of care, including 30-day mortality. Both programs have focused on heart failure and acute myocardial infarction, in part due to their clinical and financial burden.

In this study of hospitals that received awards for high quality cardiovascular care from AHA/ACC national quality improvement initiatives, the researchers observed several findings about those hospitals' performance in national value-based programs:

1. Hospitals recognized for high quality care by the AHA/ACC ("award hospitals") were more likely to be penalized by the HRRP and VBP compared with other hospitals.

2. Award hospitals were less likely to receive financial rewards (payment increases) by the VBP.

3. Median payment reductions were higher for award hospitals than other hospitals under the VBP, and median payment increases were lower.

The team of researchers concluded that one potential explanation for the difference in evaluations of hospital quality may be that AHA/ACC award hospitals are disproportionately penalized by value-based programs for factors unrelated to the quality care they deliver. Award hospitals tended to be larger, urban, teaching hospitals - sites that often care for medically and socially complex populations. Because risk-adjustment models used for value-based programs do not include important clinical and social risk factors (e.g. poverty), award hospitals may be penalized for the patient populations and communities they serve rather than for poor quality of care.

"As the shift to value-based care continues in the United States and as multiple bodies simultaneously assess hospital systems, we need to prioritize efforts to promote fair, equitable and standardized measurement of cardiovascular care quality," said Wadhera, who is also an Instructor in Medicine at Harvard Medical School.

Credit: 
Beth Israel Deaconess Medical Center

Exploring a genome's 3D organization through a social network lens

image: Much as groups of sailors work together with ropes to accomplish tasks aboard ship, collections of transcription factor proteins work with sections of chromosomes as a community in the nucleus of a cell to carry out cell functions. Computational biologists at Carnegie Mellon University have developed algorithms for identifying these communities in cell nuclei.

Image: 
Ella Marushchenko

PITTSBURGH--Computational biologists at Carnegie Mellon University have taken an algorithm used to study social networks, such as Facebook communities, and adapted it to identify how DNA and proteins are interconnected into communities within the cell nucleus.

Jian Ma, associate professor in CMU's Computational Biology Department, said scientists have come to appreciate that DNA, proteins and other components within the nucleus appear to form structurally and functionally important communities. The behavior of these communities may prove key to understanding basic cellular processes and disease mechanisms, such as aging and cancer development.

Figuring out how to identify these communities among the tens of thousands of genes, proteins and other components of the cell is daunting, however. An important factor is proximity - both in terms of genes being controlled by the same regulatory proteins called transcription factors and in terms of spatial arrangement, with the complex folding and packing of DNA putting certain genes close to each other.

In many cases, the relationships are similar to many Facebook communities, with some members located near each other, while others who may be far apart are nevertheless drawn together through shared interests.

In a paper featured on the cover of the February issue of the journal Genome Research, lead authors Dechao Tian, a post-doctoral researcher, and Ruochi Zhang, a Ph.D. student in computational biology, explain how they developed a new algorithm, MOCHI, to subdivide the interwoven nuclear components into communities.

MOCHI was inspired by an algorithm originally developed by the laboratory of computer scientist Jure Leskovec. Beginning as a Ph.D. student at CMU and continuing as a faculty member at Stanford University, Leskovec has specialized in the analysis of large social and information networks.

The MOCHI algorithm looks at the spatial arrangement of all the genes and transcription factor proteins in a nucleus based on genome-wide chromosome interactions and global gene regulatory networks. Viewing this information as a 3D graph, the algorithm looks for certain subgraphs or "motifs," within it. A motif might be, say, a triangular shape, as is typical in social network analysis, or a four-node subgraph, which MOCHI uses for analyzing complex networks in the cell nucleus. The algorithm then clusters, or subdivides, the graph in a way that minimizes disruption of these motifs.
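To make the idea concrete, the sketch below is a minimal illustration of motif-based clustering in this spirit - it is not the authors' MOCHI code. It weights each edge of an undirected graph by the number of triangle motifs passing through it, then makes a spectral cut that avoids severing motif-rich connections; MOCHI works analogously, but on heterogeneous gene/transcription-factor networks with four-node motifs.

    # Minimal sketch of motif-based graph bipartitioning (illustrative only;
    # triangle motifs on an undirected networkx graph stand in for the
    # four-node subgraphs MOCHI uses).
    import networkx as nx
    import numpy as np

    def motif_weighted_partition(G):
        nodes = list(G.nodes)
        idx = {n: i for i, n in enumerate(nodes)}
        W = np.zeros((len(nodes), len(nodes)))
        for u, v in G.edges:
            # Each common neighbor of u and v closes one triangle on edge (u, v);
            # the small 0.01 copy of the plain adjacency keeps W connected so
            # the Fiedler vector below is well defined.
            W[idx[u], idx[v]] = W[idx[v], idx[u]] = len(set(G[u]) & set(G[v])) + 0.01
        # Spectral bipartition: the sign pattern of the second-smallest
        # Laplacian eigenvector splits the nodes while cutting through as
        # little motif weight as possible.
        laplacian = np.diag(W.sum(axis=1)) - W
        fiedler = np.linalg.eigh(laplacian)[1][:, 1]
        return {node: int(fiedler[idx[node]] > 0) for node in nodes}

    # Two dense cliques joined by one motif-poor edge fall into two communities.
    print(motif_weighted_partition(nx.barbell_graph(6, 0)))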

They tested MOCHI by applying it to five different cell types. Just as the original algorithm has proved adept at identifying communities within a large mass of social network data, MOCHI identified what appear to be hundreds of communities within the nuclei of these cell types.

As yet, the researchers don't know what each community might do, but they say they have reason to believe the subdivisions made by MOCHI are valid. For instance, Ma said that the algorithm identified communities that seem to be common to all of the cell types used in this study. It also identified some communities that appear to be unique to a particular cell type. In addition, Ma said they found "enrichment" of disease-related genes within the communities.

Much more work will be necessary to identify the function and behavior of each of these communities, Ma said, but the MOCHI algorithm gives researchers a starting point for study.

"There's a reason why these communities are formed in the nucleus," he said. "We just don't know the formation mechanisms of these communities yet." Understanding them might help researchers delineate fundamental cellular processes and suggest possible ways to better understand disease development.

The researchers also plan to incorporate additional cell nucleus components, such as RNAs and other types of proteins, into their analysis.

In addition to Ma, Tian and Zhang, authors of the paper include Yang Zhang and Xiaopeng Zhu, a research associate and a project scientist, respectively, in the Computational Biology Department. The National Institutes of Health, including its 4D Nucleome Program, and the National Science Foundation supported this research.

Credit: 
Carnegie Mellon University

Active droplets

image: Prof. Job Boekhoven and Caren Wanzke at the Technical University of Munich discovered a new material system that can release drugs with a constant release rate over a tunable period of time. The microscopic image (right screen) shows droplets of the hydrolyzable oil embedded in a hydrogel.

Image: 
Andreas Heddergott / TUM

Using a mixture of oil droplets and hydrogel, medical active agents can be not only precisely dosed, but also continuously administered over periods of up to several days. The active agents inside the droplets are released at a constant rate, decreasing the risk of over- or underdosage.

Originally, Prof. Job Boekhoven was studying the origins of life: together with his team at the Technical University of Munich (TUM), the chemist wanted to understand how molecules in the primordial ocean had managed to combine and form the precursors of the first living cells.

"In our research work we experimented with oil droplets, among other things. We were especially interested in mechanisms that protect molecules from degradation. We found that unstable molecules that form oil droplets would survive much longer than molecules that cannot form droplets. In a sense, the droplets protect the molecules inside."

However, the oily shield is not entirely impermeable: Some of the oil molecules react with the surrounding water. This hydrolysis causes the droplets to slowly but continuously lose mass and shrink until they eventually disappear. "The constant decay of these 'active droplets', led us to the idea of using them to dose drugs," recalls Boekhoven.

Safe from over- or underdosing

Pharmacologists have long sought methods for administering active agents at a constant rate. The ingredients in ointments or tablets are usually released quickly, increasing the risk of an overdose. Moreover, the fast rate of release shortens the duration of the intended effect. Methods for releasing drugs over extended periods of time at a constant rate are rare and often complicated to fabricate.

"We found that the droplets continuously release the drug while they get smaller and smaller. The consequence is that over the entire release period, the drug release rate remains constant", explains Boekhoven. "The power of this approach lies in its simplicity. You need only three components: droplets made of a hydrolyzable oil, a drug that partitions in the oil, and a hydrogel that stabilizes the position of the droplets."

Many fields of application

The new oil-hydrogel mix allows active agents to be administered not only continuously but also at a predetermined rate. The droplets can be loaded with larger or smaller doses of active substances. These are released as soon as the oil droplets come into contact with the water in blood or tissue. The hydrolysis proceeds at a constant rate until the droplets have dissipated completely.

These "active droplets" have many potential fields of application. They could, for example, be deployed in disinfectant or healing-promoting sore pads to treat poorly healing wounds. A patent application has already been filed for the oil-hydrogel material.

Credit: 
Technical University of Munich (TUM)

E-cigarette users are exposed to potentially harmful levels of metal linked to DNA damage

image: Prue Talbot (left), Shane Sakamaki-Ching (center) and Monique Williams.

Image: 
Talbot lab, UC Riverside.

RIVERSIDE, Calif. -- Researchers at the University of California, Riverside, have completed a cross-sectional human study that compares biomarkers and metal concentrations in the urine of e-cigarette users, nonsmokers, and cigarette smokers.

They found that the biomarkers, which reflect exposure, effect, and potential harm, are both elevated in e-cigarette users compared to the other groups and linked to metal exposure and oxidative DNA damage.

"Our study found e-cigarette users are exposed to increased concentrations of potentially harmful levels of metals -- especially zinc -- that are correlated to elevated oxidative DNA damage," said Prue Talbot, a professor of cell biology, who led the research team.

Zinc, a dietary nutrient, plays key roles in growth, immune function, and wound healing. Too little of this essential trace element can cause death; too much of it can cause disease. Its deficiency, as well as its excess, causes cellular oxidative stress, which, if unchecked, can lead to diseases such as atherosclerosis, coronary heart disease, pulmonary fibrosis, acute lymphoblastic leukemia, and lung cancer.

Electronic cigarettes consist of a battery, atomizing unit, and refill fluid. Metals in e-cigarette aerosols come mainly from the metal components in the atomizer -- nichrome wire, tin solder joints, brass clamps, insulating sheaths, and wicks -- as well as the e-fluids that the atomizers heat.

The study, which appears in BMJ Open Respiratory Research, marks the first time researchers have examined and quantified urinary biomarkers of effect and potential harm in relation to metals in e-cigarette users.

A biomarker is a quantifiable characteristic of a biological process. Biomarkers allow researchers and physicians to measure a biological or chemical substance that is indicative of a person's physiological state. Previous e-cigarette studies with humans have examined biomarkers of exposure -- for example, nicotine or nicotine metabolites -- but none have studied biomarkers of potential harm or shown how this harm correlates with metal exposure.

The biomarkers studied by the UC Riverside researchers were 8-hydroxydeoxyguanosine (8-OHdG), a biomarker of oxidative DNA damage; 8-isoprostane, an indicator of the oxidative degradation of lipids; and metallothionein, a metal response protein. All three biomarkers were significantly elevated in e-cigarette users compared to the concentrations in cigarette smokers.

"Our findings reaffirm that e-cigarette use is not harm free," said Shane Sakamaki-Ching, a graduate student in the Cell, Molecular and Developmental Biology Graduate Program and the research paper's first author. "Indeed, prolonged use may lead to disease progression."

The researchers advise physicians to exercise caution when recommending e-cigarettes to their patients. Electronic cigarette aerosols contain potentially harmful chemicals, cytotoxic flavor chemicals, metals, ultrafine particles, and reaction products. E-cigarette use has been linked to adverse health effects such as respiratory diseases, increased risk for cardiovascular disease, and impaired wound healing following surgery.

"Pregnant women, especially, should not be encouraged to use e-cigarettes," Talbot said. "Excess of zinc in their bodies can lead to nausea and diarrhea. Given the recent deaths and pulmonary illnesses related to e-cigarette usage, everyone should be made aware of the potential health risks linked to e-cigarette usage."

The study involved 53 participants from the Buffalo, New York, area. Talbot and Sakamaki-Ching were joined in the study by Monique Williams, My Hua, Jun Li, Steve M. Bates, Andrew N. Robinson, and Timothy W. Lyons of UCR; and Maciej L. Goniewicz of the Roswell Park Comprehensive Cancer Center, Buffalo, New York.

The study was supported by grants from the National Institutes of Health.


Credit: 
University of California - Riverside

Diversifying traditional forest management to protect forest arthropods

image: Forest arthropods and rivers: a connection yet to be discovered

Image: 
Antoni Serra, UNIVERSITY OF BARCELONA-CRBA

The structure of the vegetation and the distance to the nearest stream are important factors to consider in order to protect the biodiversity of forest arthropods, according to an article now published in the journal Forest Ecology and Management. The study concludes that the closer a forest stands to a river course, the better the conditions for its arthropod communities, which need a cool and wet microclimate.

The new study is led by experts from the Faculty of Biology, the Biodiversity Research Institute (IRBio) and the Animal Biodiversity Resource Center of the University of Barcelona (CRBA), with the participation of experts from CREAF-UAB, the Museum of Natural Sciences of Barcelona, and the universities of Canberra and Melbourne (Australia).

Forest arthropods and rivers: a connection yet to be discovered

The influence of a river's presence on the biodiversity of non-riparian forest arthropods is an ecological relationship that has remained almost unknown until now. This is the new ecological perspective the article brings to the study of arthropod populations in forest areas of O Incio (Lugo, Spain), in the northwest of the Iberian Peninsula.

The great diversity of arthropods analyzed in the study - twenty-one orders and sixty families - widens the frame of reference of previous studies, which mostly focused on specific groups such as butterflies. "It is known that rivers are vertebral axes of terrestrial ecosystems; they regulate the microclimate and provide many ecosystem services. However, their relationship with forest arthropods - beyond those of gallery forests - had not been quantified", notes the first author of the article, researcher Sergio Albacete, from the Department of Evolutionary Biology, Ecology and Environmental Sciences of the UB.

The new study also emphasizes the value of uncovering the biology of relatively unknown groups, "such as flies, dipteran insects with more than a hundred families in the Iberian Peninsula, which are important for pollination, decomposition and pest control", notes the researcher.

Chestnut woodlands: ecology, economy and landscape

The forest areas examined in the study are chestnut woodlands (Castanea sativa), "habitats with great ecological, economic and landscape value", says researcher Alberto Maceda (UB-IRBio). "From a biological perspective, we find centuries-old trees and a great biological diversity that has been preserved over the years thanks to traditional management, an activity that is being lost to rural abandonment. These trees have economic value as well, since they provide fruit and wood, subsistence resources for the rural world in the past. As a landscape, soutos - the local name for these chestnut woodlands - are protected by the EU Habitats Directive and are the dominant landscape of many valleys".

According to the conclusions, the distance between the trees and the river affects the richness, abundance and trophic guilds of the arthropods, "after considering other factors (cover of the understory, diversity and height of plants, density and diameter of the trunks, etc.) that affect the availability of food and shelter for these invertebrates", says Sergio Albacete. "Therefore, we have to take this distance into account when establishing ecological relationships between the arthropods and the rivers".

Everything points to social wasps being the group most affected by stream distance. "Without establishing potential cause-effect relationships, the statistical analysis of individual effects across the studied variables suggests that social wasps do not like to live far from rivers", reveals Alberto Maceda. "We are now studying this relationship in detail, since we have been collecting samples of the Asian hornet (Vespa velutina) since 2017. This invasive species is generating social alarm and can affect the conservation of the native biodiversity".

Traditional management for a rich understory

In order to protect the diversity of forest arthropods - some of which naturally control many agricultural and forest pests - we would need to preserve the traditional management of forests, favour a rich understory and keep the chestnut as the predominant tree in the area. It would also be important to avoid clearing paths - which eases chestnut gathering - before the end of the understory's growing season.

"The traditional management -regarded as a medium-intensity perturbation- prevents the understory from being monopolized by plants with an extensive growth (blackberries, ivy, fern, etc.). Moreover, this management model prevents soutos from densifying due to the growth of the chestnut trees and the germination of other trees in the forest", notes Maceda.

In this habitat, the presence of oaks near the chestnut trees "is essential to control the chestnut gall wasp (Dryocosmus kuriphilus), since oak galls - deformations of plant organs - harbour native parasitoids - other wasps - that reproduce there and keep in check this pest of the chestnut tree, which provokes similar deformations", highlights the lecturer Juli Pujadé Villar, from the Department of Evolutionary Biology, Ecology and Environmental Sciences of the UB.

Discovering the structural complexity of the forest

Arthropods are the most diverse animal taxon worldwide. However, many more studies of different groups in several habitats are needed to complete the general picture of the conservation status of these invertebrates. At the moment, habitat loss, climate-driven alterations in plant phenology and the use of pesticides are threatening the conservation of many groups of arthropods worldwide.

"It is important to improve the management of the understories in a moment in which society thinks a forest is dirty when it has understory and the fire prevention policies lead to random clearances to clean it. We need more pedagogy at school -since studying biodiversity is getting a lower value- so that society understands a forest has a structural complexity a garden does not have, and we need to preserve it. In a context of climate change, this is getting more and more important", conclude the authors.

Credit: 
University of Barcelona

Patients most at risk of overdose at the beginning and after end of methadone treatment

Thursday, 20 February 2020: A new study, led by RCSI researchers, has found that patients receiving methadone treatment are most at risk of overdosing in the month following the end of methadone treatment and during the first four weeks of treatment.

However, the study did not identify transfers between services as high-risk periods, with no deaths occurring following a transfer. This suggests that the current structures in Ireland promote a smooth transition of patients between services.

The study, published in the current edition of Addiction, was funded by the Health Research Board and was a collaboration between the School of Pharmacy and Biomolecular Sciences in RCSI, HRB Centre for Primary Care Research in RCSI, the HSE Addiction Services, Trinity College Dublin and the HSE National Social Inclusion Office.

People with opioid dependence have more than 10 times the risk of premature death of the general population. The most effective treatment is the prescription of legal substitution drugs, most commonly methadone.

The researchers analysed data from 2,899 people who were prescribed and dispensed methadone in addiction services between January 2010 and December 2015. They observed 154 deaths, and 55 (35.7%) of those were identified as drug-related poisonings.

The rate of drug-related poisoning deaths was more than four times higher in the month following the end of treatment and over three times higher in the first four weeks of treatment when compared to the remaining time in treatment. These findings are consistent with growing evidence from other international studies.

"Identifying a higher risk at the beginning and immediately after the end of treatment highlights that retaining patients in treatment for longer periods will save lives. People often cycle in and out of treatment, thereby increasing their exposure to repeated periods of high risk," said Dr Gráinne Cousins, senior lecturer at RCSI's School of Pharmacy and Biomolecular Sciences and the study's lead author.

"Close monitoring of opioid tolerance before starting treatment and more effective methods of preventing relapse during the induction period may reduce this risk. Additionally, increasing patient awareness of the risk of overdose and increasing the availability of take-home naloxone may mitigate the risk of overdose during the high risk periods, particularly following treatment cessation."

No deaths were observed in the first month following transfer between treatment providers. Transfers between addiction services and primary care are facilitated by GP Coordinators employed by the addiction services. The GP Coordinator provides all relevant clinical details on the patient being transferred to the new treatment provider. The provision of opioid substitution treatment is also available in Irish prisons; if a prisoner is in treatment prior to incarceration, their treatment is continued in prison.

"Any inferences regarding risk must be cautious as less than half our sample experienced a transfer, and among those who did, it was most frequently a transition to and from prison. Further investigation of the impact of transfers between services is warranted," said Louise Duran, an RCSI postdoctoral research Fellow in the School of Pharmacy and Biomolecular Sciences.

Credit: 
RCSI

Psychologists discover secret to achieving goals

Research led by scientists at Queen Mary University of London has provided new insights into why people often make unrealistic plans that are doomed to fail.

The study, published in the journal Behavioural Brain Research, analysed the complex relationship between reward and effort in achieving goals, and identified two critical stages in the decision-making process.

The researchers found that when people first decide what to do they are motivated by rewards. However, once they begin to put plans into action, their focus turns to the difficulty of the effort they need to put in.

They suggest the key to achievable aims is to consider the effort needed when deciding what to do, and then remembering to focus on the rewards once the time comes to put the effort in.

To investigate the relationship between effort and reward, the research team designed experiments involving two different forms of effort, physical and mental.

Physical effort was measured by squeezing a joystick, while mental effort was tested by asking participants to solve simple mathematical equations.

Study participants were presented with different options that combined either high or low effort with high or low financial reward, and asked to select which one to pursue.

The scientists found that when selecting options participants were guided by the level of financial reward offered, but on execution of the task their performance was determined by the actual amount of effort they needed to exert.

The team observed that the results were similar for both the physical and mental effort-based experiments.

Dr Agata Ludwiczak, a Research Fellow from Queen Mary University of London and lead author of the study, said: "Common sense suggests the amount of effort we put into a task directly relates to the level of reward we expect in return. However, a growing body of psychological and economic evidence indicates that often high rewards are not enough to ensure people put in the effort they need to achieve their targets.

"We have found that there isn't a direct relationship between the amount of reward that is at stake and the amount of effort people actually put in. This is because when we make choices about what effort to put in, we are motivated by the rewards we expect to get back. But at the point at which we come to actually do what we had said we would do, we focus on the level of effort we have to actually put in rather than the rewards we hoped we would get."

Dr Osman, Reader in Experimental Psychology at Queen Mary, said: "If we aren't careful our plans can be informed by unrealistic expectations because we pay too much attention to the rewards. Then when we face the reality of our choices, we realise the effort is too much and give up. For example, getting up early to exercise for a new healthy lifestyle might seem like a good choice when we decide on our new year's resolutions, but once your alarm goes off on a cold January morning, the rewards aren't enough to get you up and out of bed."

Credit: 
Queen Mary University of London

How transient invaders can transform an ecosystem

When a plant or animal species is introduced to a new environment with few natural predators, it can spread uncontrollably, transforming the ecosystem and crowding out existing populations. One well-known example is the cane toad, which was introduced into Australia in 1935 and whose population is now well into the millions.

A related but less-understood scenario occurs when an invader arrives, transforms the ecosystem, and then dies out. MIT physicists have now shown how this kind of "transient invasion" can occur in bacterial populations, provoking a shift from one stable community state to another, while the invader itself disappears.

"These results highlight one possible way in which even if a species does not survive long term, it could nonetheless have long-term effects on the community," says Jeff Gore, an MIT associate professor of physics and the senior author of the study.

The findings may also shed light on how transient invaders affect real-world ecosystems such as bacteria that alter the human gut microbiome before passing through the digestive tract, or reindeer whose grazing deforests an island so thoroughly that the population can no longer survive there.

MIT postdoc Daniel Amor is the lead author of the paper, which appears today in Science Advances. MIT research scientist Christoph Ratzke is also an author of the study.

Stable populations

The research team originally set out to explore factors that can drive ecosystems to switch between two stable states. Many ecosystems can exist in alternative stable states, but one may be more "desirable" than the other, such as a lake that can be healthy or eutrophic (excessively covered in algae). In one dramatic example, the Sahara region switched from a humid grassland to a desert about 5,000 years ago.

"The mechanisms that drive these transitions, and whether we can control them, are not very clear," Amor says.

Gore's lab studies the principles that govern these kinds of complex ecological shifts by creating simplified versions that can be analyzed in a laboratory. In this case, the researchers studied populations of two bacterial species -- Corynebacterium ammoniagenes and Lactobacillus plantarum -- that are known to inhibit each other's growth by changing the pH of the growth medium in their environment. This mutual inhibition leads to two alternative states of the community in which one or the other species dominates.

The researchers first allowed the bacterial populations to naturally reach a stable state in which one species dominated the overall community. Once the populations stabilized, the researchers introduced an invader and measured how it affected the previously stable populations.

Of the six invader species that the researchers studied, three performed a takeover that shifted the overall population dynamics of the ecosystem, but then died out. This phenomenon occurred via changes in the acidity of the environment, the researchers found.

In the original stable state, L. plantarum makes the environment acidic, which keeps growth of C. ammoniagenes in check. Invaders that thrive in an acidic environment grow rapidly once introduced. However, for some of these invaders, their rapid growth produces metabolic byproducts that raise the pH, making the environment less hospitable for themselves and for L. plantarum. As a result, C. ammoniagenes takes over, and the invader disappears.
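This feedback loop is easy to caricature in a toy model. The sketch below is an illustrative simulation with invented parameters - not the study's system or data - in which an acidifying resident A (loosely playing the role of L. plantarum) dominates until an acid-loving invader I booms, raises the pH, and hands the community to a neutral-preferring resident C (the C. ammoniagenes role) before dying out itself:

    # Toy simulation of a pH-mediated transient invasion (illustrative only).
    from scipy.integrate import solve_ivp

    def dynamics(t, y):
        A, C, I, pH = y
        crowd = 1.0 - (A + C + I)              # shared carrying capacity
        gA = (7.0 - pH) / 3.0                  # resident A prefers acid...
        gI = 1.5 * (7.0 - pH) / 3.0            # ...invader I likes acid even more
        gC = (pH - 4.0) / 3.0                  # resident C prefers neutral pH
        dA = A * (gA * crowd - 0.1)            # growth minus a fixed death rate
        dC = C * (gC * crowd - 0.1)
        dI = I * (gI * crowd - 0.1)
        dpH = -0.5 * A + 1.0 * I + 0.2 * (7.0 - pH)  # A acidifies, I alkalizes
        return [dA, dC, dI, dpH]

    # Start from the acidic, A-dominated stable state plus a pinch of invader.
    sol = solve_ivp(dynamics, (0, 400), [0.85, 0.01, 0.02, 4.9])
    A, C, I, pH = sol.y[:, -1]
    print(f"A={A:.2f}  C={C:.2f}  I={I:.2f}  pH={pH:.1f}")  # C dominates, I is gone

The invader's boom-and-bust leaves the community parked in the alternative stable state, mirroring the experiments described above.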

The researchers then explored whether this phenomenon could be seen in naturally occurring populations of bacteria. They took soil samples and grew the bacterial species they found, allowing these communities to reach a variety of alternative stable states in the new environment of the laboratory. After introducing the same invaders they used in the earlier experiments, they observed similar patterns of rapid growth and then disappearance of the invader, along with a shift in the composition of the original community.

"This suggests that it was not a rare effect that we only observe in hypothesis-guided experiments, but also in natural settings," Amor says.

Stefano Allesina, a professor of ecology and evolution at the University of Chicago, described the experiments as "elegant and robust."

"The work has clearly important implications for the manipulation of microbial communities. Being able to transition a microbial community from an unfavorable state to a favorable one is one of the most important challenges in the field, and the MIT team has shown how transient invaders may be the perfect 'light switch' -- they do their job and then disappear," says Allesina, who was not involved in the study.

Rapid extinction

While it may seem counterintuitive that a species would create conditions that lead to its own demise, this may happen often in nature, the researchers say. In a paper published in 2018, Gore and two colleagues described several species of bacteria that essentially commit "ecological suicide" by growing so fast that the local environment becomes too polluted with acidic waste for them to survive.

This occurs, in part, because genetic mutations that allow an individual to grow faster can spread rapidly through the population, even if that faster growth harms the environment.

"There are a lot of species that would modify the environment in such a way that it could lead to rapid population extinction," Gore says. "It could be the case that in some situations, the strategy of growing slowly and not polluting the environment may not be evolutionarily stable against a mutant that just takes advantage of the environment while it's good, growing quickly and leading to a lot of toxin production. The problem is that while it's rational in the short term for each individual to do that, it leads to a suboptimal outcome for the population."

Gore says he hopes the findings will encourage scientists who study more complicated ecosystems, such as lakes or the human gut microbiome, to look for these types of transient invasions and their after-effects.

"The nature of these complex systems is that they can be a little bit overwhelming," he says. "You don't know what are the things you should even be looking for in the data or what kind of experiments you should be doing. Our hope is that some of our work can motivate other people to look for this sort of phenomenon in their systems."

Credit: 
Massachusetts Institute of Technology

Beyond the brim, Sombrero Galaxy's halo suggests turbulent past

image: On the left is an image of the Sombrero galaxy (M104) that includes a portion of the much fainter halo far outside its bright disk and bulge. Hubble photographed two regions in the halo (one of which is shown by the white box). The images on the right zoom in to show the level of detail Hubble captured. The orange box, a small subset of Hubble's view, contains myriad halo stars. The stellar population increases in density closer to the galaxy's disk (bottom blue box). Each frame contains a bright globular cluster of stars, of which there are many in the galaxy's halo. The Sombrero's halo contained more metal-rich stars than expected, but even stranger was the near-absence of old, metal-poor stars typically found in the halos of massive galaxies. Many of the globular clusters, however, contain metal-poor stars. A possible explanation for the Sombrero's perplexing features is that it is the product of the merger of massive galaxies billions of years ago, even though the smooth appearance of the galaxy's disk and halo shows no signs of such a huge disruption.

Image: 
NASA/Digitized Sky Survey/P. Goudfrooij (STScI)/The Hubble Heritage Team (STScI/AURA)

Surprising new data from NASA's Hubble Space Telescope suggests the smooth, settled "brim" of the Sombrero galaxy's disk may be concealing a turbulent past. Hubble's sharpness and sensitivity resolve tens of thousands of individual stars in the Sombrero's vast, extended halo, the region beyond a galaxy's central portion that is typically made of older stars. These latest observations of the Sombrero are turning conventional theory on its head, showing only a tiny fraction of older, metal-poor stars in the halo, plus an unexpected abundance of metal-rich stars of the kind typically found only in a galaxy's disk and central bulge. Past major galaxy mergers are a possible explanation, though the stately Sombrero shows none of the messy evidence of a recent merger of massive galaxies.

"The Sombrero has always been a bit of a weird galaxy, which is what makes it so interesting," said Paul Goudfrooij of the Space Telescope Science Institute (STScI), Baltimore, Maryland. "Hubble's metallicity measurements (i.e., the abundance of heavy elements in the stars) are another indication that the Sombrero has a lot to teach us about galaxy assembly and evolution."

"Hubble's observations of the Sombrero's halo are turning our generally accepted understanding of galaxy makeup and metallicity on its head," added co-investigator Roger Cohen of STScI.

Long a favorite of astronomers and amateur sky watchers alike for its bright beauty and curious structure, the Sombrero galaxy (M104) now has a new chapter in its strange story -- an extended halo of metal-rich stars with barely a sign of the expected metal-poor stars that have been observed in the halos of other galaxies. Researchers, puzzling over the data from Hubble, turned to sophisticated computer models to suggest explanations for the perplexing inversion of conventional galactic theory. Those results suggest the equally surprising possibility of major mergers in the galaxy's past, though the Sombrero's majestic structure bears no evidence of recent disruption. The unusual findings and possible explanations are published in the Astrophysical Journal.

"The absence of metal-poor stars was a big surprise," said Goudfrooij, "and the abundance of metal-rich stars only added to the mystery."

In a galaxy's halo, astronomers expect to find earlier generations of stars containing fewer heavy elements, called metals, compared to the crowded stellar cities in the main disk of a galaxy. Elements are created through the stellar "lifecycle" process, and the longer a galaxy has had stars going through this cycle, the more element-rich the gas and the higher the metallicity of the stars that form from that gas. These younger, high-metallicity stars are typically found in the main disk of the galaxy, where the stellar population is denser -- or so goes the conventional wisdom.
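
The arithmetic behind that conventional wisdom can be illustrated with the textbook "closed-box" model of chemical enrichment, a standard approximation rather than a calculation from this study. If a fixed reservoir of gas is steadily turned into stars, the metal fraction Z of the remaining gas grows as

    Z = y * ln(M_total / M_gas)

where y is the average yield of metals returned per generation of stars and M_gas / M_total is the fraction of gas not yet locked into stars. Barely processed gas (M_gas close to M_total) stays metal-poor, which is why halo stars built from ancient, little-processed material are expected to have low metallicity -- and why a halo full of metal-rich stars, as in the Sombrero, points to material that was heavily processed somewhere else, such as in the disks of massive merging galaxies.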

Complicating the picture is the presence of many old, metal-poor globular clusters of stars. These older, metal-poor stars are expected to eventually move out of their clusters and become part of the general stellar halo, but that process seems to have been inefficient in the Sombrero galaxy. The team compared their results with recent computer simulations to see what could be the origin of such unexpected metallicity measurements in the galaxy's halo.

The results also defied expectations, indicating that the unperturbed Sombrero had undergone major accretion, or merger, events billions of years ago. Unlike our Milky Way galaxy, which is thought to have swallowed up many small satellite galaxies in so-called "minor" accretions over billions of years, a major accretion is the merger of two or more similarly massive galaxies that are rich in later-generation, higher-metallicity stars.

The satellite galaxies contained only low-metallicity stars, made largely of the hydrogen and helium left over from the big bang. Heavier elements had to be cooked up in stellar interiors through nucleosynthesis and incorporated into later-generation stars. This process was rather ineffective in dwarf galaxies such as those around our Milky Way, and more effective in larger, more evolved galaxies.

The results for the Sombrero are surprising because its smooth disk shows no signs of disruption. By comparison, numerous interacting galaxies, like the iconic Antennae galaxies, get their name from the distorted appearance of their spiral arms due to the tidal forces of their interaction. Mergers of similarly massive galaxies typically coalesce into large, smooth elliptical galaxies with extended halos -- a process that takes billions of years. But the Sombrero has never quite fit the traditional definition of either a spiral or an elliptical galaxy. It is somewhere in between -- a hybrid.

For this particular project, the team chose the Sombrero mainly for its unique morphology. They wanted to find out how such "hybrid" galaxies might have formed and assembled over time. Follow-up studies of halo metallicity distributions will be carried out with several galaxies at distances similar to that of the Sombrero.

The research team looks forward to future observatories continuing the investigation into the Sombrero's unexpected properties. The Wide Field Infrared Survey Telescope (WFIRST), with a field of view 100 times that of Hubble, will be capable of capturing a continuous image of the galaxy's halo while picking up more stars in infrared light. The James Webb Space Telescope will also be valuable for its Hubble-like resolution and deeper infrared sensitivity.

Credit: 
NASA/Goddard Space Flight Center