
New method allows minimally invasive cell sampling

image: Northwestern Engineering researchers have developed one of the first non-destructive methods of extracting multiple samples from a cell over time.

Image: 
Horacio D. Espinosa, Prithvijit Mukherjee, Eric Berns, and Milan Mrksich

At any given moment, a variety of dynamic processes occur inside a cell, with many developing over time. Because current research methods for gene profiling or protein analysis destroy the cell, study is confined to just that one moment in time, and researchers are unable to return to the cell to examine how things change beyond that snapshot.

A team led by Northwestern Engineering faculty has developed one of the first minimally invasive methods for sampling cells that can be applied to the same cells multiple times. The process, called localized electroporation, has implications for studying processes that evolve over time, such as cells' response to treatments for cancer and other diseases.

Horacio Espinosa, James N. and Nancy J. Farley Professor in Manufacturing and Entrepreneurship in the McCormick School of Engineering, led the team that created the live cell analysis device (LCAD), which can non-destructively sample the contents of a small number of cells many times.

When LCAD is coupled with SAMDI, a highly sensitive and label-free method for quantifying enzymatic activity using mass spectrometry, the intracellular contents sampled by LCAD can be analyzed for the presence of enzymes. SAMDI (Self-Assembled Monolayer Desorption Ionization) was developed in the lab of Milan Mrksich, Northwestern University vice president for research and Henry Wade Rogers Professor of Biomedical Engineering, Chemistry, and Cell and Molecular Biology.

"By exploiting advances in microfluidics and nanotechnology, localized electroporation can be employed to temporarily open small pores in the cell membrane enabling the transport of molecules into the cells or extraction of intracellular contents. Since the method is minimally invasive to the cells, it can be repeated multiple times without their disruption," Espinosa said.

"Certain enzymes may be linked to disease pathways, such as certain types of cancers, and they may be the target of therapeutics. Using this platform, it is now possible to study how enzymatic activity varies between healthy cells and cells from a tumor biopsy," Mrksich said.

The LCAD-SAMDI platform offers an opportunity for biologists and physicians to investigate how specific treatments may alter these enzymatic activities and the associated diseases over time.

"The platform is one of the world's first technologies allowing this type of research, a biopsy but performed on cells at the nanoscale," Espinosa said.

Said John A. Kessler, Ken and Ruthe Davee Professor of Stem Cell Biology at Northwestern's Feinberg School of Medicine and study coauthor, "Without disrupting the cell, it provides a window to processes inside cells and enables research that can determine the quantity of an active enzyme, how enzymatic activity in cells changes over time, and what changes in the activity occur in response to a treatment."

This method opens up the possibility to investigate time-dependent processes, like cell differentiation, disease progression, or drug response, at regular intervals.

"We envision that this technique can be used in scenarios such as screening drugs or designing and optimizing treatment courses that can arrest disease progression in cells," Espinosa said.

Most established methods require killing the cells being analyzed. Currently, complex computational methods are used to retrieve temporal information from single snapshots, but these rely on assumptions about the underlying dynamics and are limited in the time scales and scenarios they can address.

The LCAD also can be used to deliver proteins into cells. The combination of delivery and sampling could potentially be used in studies that deliver molecules, such as DNA and proteins, into cells and then sample the same cells to investigate the effect on the activity of other molecules.

"We have used the same concept of localized electroporation to do CRISPR gene editing and we are now using machine learning to automate the process," Espinosa said.

Overall, this method can provide complementary information regarding cellular dynamics, which may not be possible using traditional assays. In the future, as the technology improves and sensitivity increases, it may be possible to sample temporal information for several different types of proteins simultaneously from the same cell populations.

Credit: 
Northwestern University

Spirituality linked to higher quality of life for stroke survivors, caregivers

DALLAS, May 26, 2020 -- Higher spirituality among stroke survivors was strongly linked to better quality of life, both for the survivors and for their caregivers, even when the caregivers showed symptoms of depression, according to new research published today in Circulation: Cardiovascular Quality and Outcomes, an American Heart Association journal. May is American Stroke Month.

For many stroke survivors, a caregiver, often a family member or close friend, may help with daily tasks, making the survivor and the caregiver prone to depression. Depression can impact quality of life for both.

Roughly 200 stroke survivors in Italy, with low-to-medium disabilities and no other major health issues, and their caregivers completed questionnaires measuring spirituality, depression and quality of life between 2016 and 2018. Women and men were nearly equally represented among the stroke survivors, and their average age was 71 years. Among the caregivers, women comprised nearly two-thirds, and their average age was 52.

Spirituality is defined by the World Health Organization (WHO) as an individual's perception of life within the context of the culture and value systems of the society and in relation to the individual's goals, expectations, standards and concerns. "Research shows that spirituality may help some patients cope with illness, yet few studies have looked at its effects on quality of life among stroke survivors and their long-term care partners, who are at increased risk for depression," said lead study author Gianluca Pucciarelli, Ph.D., FAHA, research fellow at the University of Rome in Italy.

Quality of life was measured with a 26-item WHO questionnaire on physical, psychological, social and environmental aspects.

In this analysis, those who scored one standard deviation above average were considered to have "higher spirituality."

Researchers noted at baseline:

A strong relationship between the degree of spirituality and quality of life even if caregivers were depressed.

Stroke survivors who scored above average on the spirituality questionnaire reported higher psychological quality of life even when their caregivers reported symptoms of depression.

Similarly, the caregivers with above-average spirituality scores reported better physical and psychological quality of life.

In contrast, stroke survivors who scored below average on the spirituality questionnaire had lower quality of life, overall, as did their caregivers with depression symptoms.

"In summary, when care partners feel depressed, something that is common for stroke caregivers, the survivor's spirituality made the difference in whether this was associated with better or worse quality of life. This demonstrates the important protective role of spirituality in illness, and why we must study it more," Pucciarelli said.

He noted that the findings call for greater awareness of the importance of spirituality among health professionals.

"Our study emphasizes the importance of viewing stroke survivors holistically, as a patient with symptoms and disabilities, and as an individual with emotional needs and part of an interdependent unit with their care partner," Pucciarelli said.

The predominant religion in Italy is Roman Catholicism, which could have affected the results. Also, the study included only stroke survivors with low-to-medium disabilities and no other major health issues, so the study's findings may not apply to survivors with more severe disabilities or other underlying illnesses.

Credit: 
American Heart Association

A child's brain activity reveals their memory ability

image: Frontoparietal activation reflects individual working memory abilities.

Image: 
Rosenberg et al., JNeurosci 2020

A child's unique brain activity reveals how good their memories are, according to research recently published in JNeurosci.

When you scramble to remember a phone number as you enter it into your phone, you rely on your working memory to keep the number at the front of your mind. Briefly holding and manipulating information relies on the activity of the frontoparietal network, a group of brain regions dubbed the "cognition core." Working memory performance changes throughout development, but can an individual's memory ability be determined from their brain activity?

Rosenberg et al. analyzed fMRI data from the Adolescent Brain Cognitive Development (ABCD) data set, a repository of scans and behavioral tests from over 11,000 children aged nine and ten. Children with better working memory performed better on a range of cognitive, language, and problem-solving tasks. Activity in the frontoparietal network during a memory task reflected the individual working memory capabilities of the children, with an activity pattern unique to working memory. The ABCD study will continue to follow the children for ten years, allowing future studies to explore how the neural signature of working memory evolves across development.
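
The press release does not describe the analysis pipeline, but the general idea of relating brain activation to individual memory scores can be illustrated with a small cross-validated regression. The Python sketch below uses synthetic data and assumed variable names; it is not the method used by Rosenberg et al. or the ABCD consortium.

    # Toy sketch: predict working-memory scores from frontoparietal
    # activation features. Synthetic data only; NOT the authors' pipeline.
    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_children, n_regions = 500, 40            # hypothetical sample and ROI counts
    activation = rng.normal(size=(n_children, n_regions))
    true_weights = rng.normal(size=n_regions)
    # Simulated working-memory score: partly explained by activation, plus noise
    wm_score = activation @ true_weights + rng.normal(scale=3.0, size=n_children)

    model = RidgeCV(alphas=np.logspace(-2, 3, 20))
    r2 = cross_val_score(model, activation, wm_score, cv=5, scoring="r2")
    print(f"cross-validated R^2: {r2.mean():.2f}")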

Credit: 
Society for Neuroscience

7,000 years of demographic history in France

image: Samantha Brunel examining a skull in Institut Jacques Monod's high containment laboratory (CNRS/Université de Paris)

Image: 
© Eva-Maria Geigl and Thierry Grange, Institut Jacques Monod (CNRS/Université de Paris)

A team led by scientists from the Institut Jacques Monod (CNRS/Université de Paris) has shown that French prehistory was punctuated by two waves of migration: the first during the Neolithic period, about 6,300 years ago, and the second during the Bronze Age, about 4,200 years ago. This study, published in PNAS on May 25, which looked at the genomes of 243 ancient individuals spanning 7,000 years, demonstrates how admixture between native hunter-gatherers and the first Anatolian Neolithic migrants, who brought with them a lifestyle based on agriculture, persists to this day in the genomes of French people. Admixture of the Neolithic populations with those from the Pontic steppes, who arrived in what is now France 4,200 years ago, also left a lasting imprint, with the Y chromosome of the majority of French men still bearing the signature of men from the steppes.

Credit: 
CNRS

New sex hormone in zebrafish

image: A zebrafish

Image: 
Pixabay

When University of Ottawa biologists Kim Mitchell and Vance Trudeau began studying the effects of gene mutations in zebrafish, they uncovered new functions that regulate how males and females interact while mating. We sat down with senior author Professor Trudeau, Research Chair in Neuroendocrinology at the Faculty of Science, to learn more.

Please tell us about this research project.

Kim and I were working with international collaborators from the Institute of Hydrobiology at the Chinese Academy of Sciences in Wuhan. Using gene editing technology set up by our Chinese colleagues, we mutated two related genes and studied the effects on sexual function in zebrafish. Zebrafish are freshwater fish belonging to the carp and minnow family, and they are now a widely used model organism in biomedical research.

What did you discover?

We changed the secretogranin-2 genes through specific mutation and found that it affected the ability of females and males to breed. It severely reduced their sexual behaviour.

The fish look normal, but when both sexes are put together, they almost ignore each other!

Normally, within a few minutes after a male and female are introduced for the first time, the male chases the female in a courtship ritual, and shortly thereafter they spawn - that is to say, the female releases her eggs into the water, and the male instantly fertilizes them. But we found that only 1 in 10 of the couples with mutated genes could spawn.

The couples carrying the introduced mutations produce eggs and sperm, but they are simply terrible at mating with each other.

This is the first evidence that mutation of these genes leads to disruption of sexual behaviour in any animal.

What role does secretogranin-2 play?

Secretogranin-2 is a large protein that is important for the normal functioning of brain cells and other cells that secrete hormones to control body functions such as growth and reproduction. However, this protein can get chopped up by special enzymes and we found that one small fragment called the secretoneurin peptide is important for stimulating sexual function.

In the genetically altered fish, we can partially restore sexual function by a single injection of the secretoneurin peptide into the body. We believe the peptide acts on cells in the brain and pituitary gland to increase hormone release thereby enhancing the ability of the female to ovulate and lay her eggs.

Why is this important?

We have uncovered new genes that can regulate reproduction, and the secretoneurin peptide is therefore itself a new hormone. The secretoneurin produced in fish is remarkably similar to that found in other animals, including humans. We can now use our genetically modified fish to look for other factors that could enhance sexual function, be it for increased spawning in cultured fish species, or to help with the search for new human infertility treatments.

This is just the beginning of the possibilities. The large secretogranin-2 genes may produce many other hormone-like peptides with unknown functions. It will be exciting to explore this in future research projects.

Credit: 
University of Ottawa

Heart failure patients with limited health literacy may have higher risk of death

Patients with heart failure who experience low health literacy are at an increased risk of hospitalization and mortality. This finding has significant clinical and public health implications and suggests that assessing and intervening upon an individual's understanding of their own health could improve heart failure outcomes, according to research published in JACC: Heart Failure.

Heart failure is a chronic condition that requires patients to engage in complex self-management skills: monitoring weight and blood pressure, controlling glycaemia, sticking to drug and diet guidelines, and, in some cases, losing weight and exercising. Therefore, greater attention has recently been given to health literacy, which is defined by the authors of this study as "the degree to which individuals have the capacity to obtain, process and understand basic health information and services needed to make appropriate health decisions."

Previous studies have suggested that low health literacy among patients with heart failure could be associated with higher risk of mortality, hospitalizations and emergency department visits, but results have been inconsistent. Researchers in this study sought to determine the effect of health literacy on mortality, hospitalizations and emergency department visits among heart failure patients while adjusting for important potential confounders; it is the first meta-analysis of its kind.

Researchers, with the assistance of a medical librarian, conducted a systematic review across the EMBASE, MEDLINE, PsycInfo and EBSCO CINAHL databases from inception to Jan. 1, 2019. Observational studies evaluated the impact of health literacy on all-cause mortality, hospitalizations and emergency department visits among patients 18 years or older with heart failure, while interventional studies evaluated interventions among heart failure patients with low health literacy. Across the observational studies, 9,171 heart failure patients were included, of whom 2,207 (24%) had inadequate or marginal health literacy.

In the studies reviewed, health literacy was assessed using objective or subjective measures--objective health literacy measurement tools evaluate how much the patient comprehends medical information and subjective measurement tools evaluate how much the patients think they understand.

The researchers found that low health literacy was associated with a higher unadjusted risk of mortality (RR: 1.67; 95% CI: 1.18-2.36), hospitalizations and emergency department visits. In adjusted analyses, low health literacy remained significantly associated with mortality and hospitalizations, but no association was found for emergency department visits. Among the four interventional studies, two effectively improved outcomes for heart failure patients with low health literacy.
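
For readers who want to sanity-check the reported unadjusted risk ratio, the short Python sketch below back-calculates the standard error of the log risk ratio from the published 95% confidence interval and derives an approximate z statistic and p-value. This is a standard textbook back-calculation, not an analysis taken from the paper.

    # Back-calculate the standard error and p-value of the reported
    # unadjusted mortality risk ratio (RR 1.67, 95% CI 1.18-2.36).
    import math
    from statistics import NormalDist

    rr, ci_low, ci_high = 1.67, 1.18, 2.36
    log_rr = math.log(rr)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)  # CI width on the log scale
    z = log_rr / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"log RR = {log_rr:.3f}, SE = {se:.3f}, z = {z:.2f}, p = {p:.4f}")
    # Expected output: z around 2.9, p around 0.004, i.e. a significant association.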

"Our findings showed that an inadequate level of health literacy is associated with increased risks in mortality and hospitalization among patients with Heart Failure," said Lila J. Finney Rutten, PhD, an author of the study and professor of health services research in the Department of Health Sciences at Mayo Clinic in Rochester, Minnesota. "Identifying health literacy as a factor that affects health outcomes and measuring its effect on patients with Heart Failure is essential to allocate more resources for, and research on, interventions to improve health literacy."

Study limitations include the inability to evaluate publication bias due to the small number of studies for each outcome; the use of different health literacy assessment tools across studies, which could limit comparability; and uncertainty about whether health literacy was assessed in an outpatient or inpatient setting, which may have influenced measurement.

Credit: 
American College of Cardiology

Antibody designed to recognize pathogens of Alzheimer's disease

Researchers have found a way to design an antibody that can identify the toxic particles that destroy healthy brain cells - a potential advance in the fight against Alzheimer's disease.

Their method is able to recognise these toxic particles, known as amyloid-beta oligomers, which are the hallmark of the disease, leading to hope that new diagnostic methods can be developed for Alzheimer's disease and other forms of dementia.

The team, from the University of Cambridge, University College London and Lund University, designed an antibody which is highly accurate at detecting toxic oligomers and quantifying their numbers. Their results are reported in the Proceedings of the National Academy of Sciences (PNAS).

"There is an urgent unmet need for quantitative methods to recognise oligomers - which play a major role in Alzheimer's disease, but are too elusive for standard antibody discovery strategies," said Professor Michele Vendruscolo from Cambridge's Centre for Misfolding Diseases, who led the research. "Through our innovative design strategy, we have now discovered antibodies to recognise these toxic particles."

Dementia is one of the leading causes of death in the UK and costs more than £26 billion each year, a figure which is expected to more than double in the next 25 years. Estimates put the current cost to the global economy at nearly £1 trillion per year.

Alzheimer's disease, the most prevalent form of dementia, leads to the death of nerve cells and tissue loss throughout the brain, resulting in memory failure, personality changes and problems carrying out daily activities.

Abnormal clumps of proteins called oligomers have been identified by scientists as the most likely cause of dementia. Although proteins are normally responsible for important cell processes, according to the amyloid hypothesis, when people have Alzheimer's disease these proteins - including, specifically, amyloid-beta proteins - become rogue and kill healthy nerve cells.

Proteins need to be closely regulated to function properly. When this quality control process fails, the proteins misfold, starting a chain reaction that leads to the death of brain cells. Misfolded proteins form abnormal clusters called plaques which build up between brain cells, stopping them from signalling properly. Dying brain cells also contain tangles, twisted strands of proteins that destroy a vital cell transport system, meaning nutrients and other essential supplies can no longer move through the cells.

There have been over 400 clinical trials for Alzheimer's disease, but no drug that can modify the course of the disease has been approved. In the UK, dementia is the only condition in the top 10 causes of death without a treatment to prevent, stop, or slow its progression.

"While the amyloid hypothesis is a prevalent view, it has not been fully validated in part because amyloid-beta oligomers are so difficult to detect, so there are differing opinions on what causes Alzheimer's disease," said Vendruscolo. "The discovery of an antibody to accurately target oligomers is, therefore, an important step to monitor the progression of the disease, identify its cause, and eventually keep it under control."

The lack of methods to detect oligomers has been a major obstacle in the progress of Alzheimer's research. This has hampered the development of effective diagnostic and therapeutic interventions and led to uncertainty about the amyloid hypothesis.

"Oligomers are difficult to detect, isolate, and study," said Dr Francesco Aprile, the study's first author. "Our method allows the generation of antibody molecules able to target oligomers despite their heterogeneity, and we hope it could be a significant step towards new diagnostic approaches."

The method is based on an approach for antibody discovery developed over the last ten years at the Centre for Misfolding Diseases. Based on the computational assembly of antibody-antigen complexes, the method enables the design of antibodies for antigens that are highly challenging, such as those that exist only for a very short time.

By using a rational design strategy that makes it possible to target specific regions, or epitopes, of the oligomers, along with a wide range of in vitro and in vivo experiments, the researchers have designed an antibody with at least three orders of magnitude greater affinity for the oligomers than for other forms of amyloid-beta. This difference is the key feature that enables the antibody to specifically quantify oligomers in both in vitro and in vivo samples.
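
To see why an affinity difference of roughly three orders of magnitude can translate into selective detection, the Python sketch below computes fractional occupancy for a simple 1:1 binding model. The dissociation constants are assumed values chosen only to illustrate a 1,000-fold difference; they are not measurements from the study.

    # Illustrative 1:1 binding model: occupancy = [Ab] / ([Ab] + Kd).
    # Kd values below are assumed for illustration only.
    kd_oligomer = 1e-9          # assumed 1 nM affinity for oligomers
    kd_other    = 1e-6          # assumed 1 uM affinity for other amyloid-beta forms
    antibody_conc = 1e-9        # antibody used at roughly the oligomer Kd

    def occupancy(conc, kd):
        """Fraction of a given amyloid-beta species bound by the antibody."""
        return conc / (conc + kd)

    print(f"oligomers bound: {occupancy(antibody_conc, kd_oligomer):.1%}")   # ~50%
    print(f"other forms bound: {occupancy(antibody_conc, kd_other):.2%}")    # ~0.10%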

The team hopes that this tool will enable the discovery of better drug candidates and the design of better clinical trials for people affected by the debilitating disease. They also co-founded Wren Therapeutics, a spin-out biotechnology company based at the Chemistry of Health Incubator, in the recently opened Chemistry of Health building, whose mission it is to take the ideas developed at the University of Cambridge and translate them into finding new drugs to treat Alzheimer's disease and other protein misfolding disorders.

The antibody has been patented by Cambridge Enterprise, the University's commercialisation arm.

Credit: 
University of Cambridge

Solving the space junk problem

image: A computer-generated image representing space debris as could be seen from high Earth orbit. The two main debris fields are the ring of objects in geosynchronous Earth orbit and the cloud of objects in low Earth orbit.

Image: 
NASA

Space is getting crowded. Aging satellites and space debris crowd low-Earth orbit, and launching new satellites adds to the collision risk. The most effective way to solve the space junk problem, according to a new study, is not to capture debris or deorbit old satellites: it's an international agreement to charge operators "orbital-use fees" for every satellite put into orbit.

Orbital use fees would also increase the long-run value of the space industry, said economist Matthew Burgess, a CIRES Fellow and co-author of the new paper. By reducing future satellite and debris collision risk, an annual fee rising to about $235,000 per satellite would quadruple the value of the satellite industry by 2040, he and his colleagues concluded in a paper published today in the Proceedings of the National Academy of Sciences.

"Space is a common resource, but companies aren't accounting for the cost their satellites impose on other operators when they decide whether or not to launch," said Burgess, who is also an assistant professor in Environmental Studies and an affiliated faculty member in Economics at the University of Colorado Boulder. "We need a policy that lets satellite operators directly factor in the costs their launches impose on other operators."

Currently, an estimated 20,000 objects--including satellites and space debris--are crowding low-Earth orbit. It's the latest Tragedy of the Commons, the researchers said: Each operator launches more and more satellites until their private collision risk equals the value of the orbiting satellite.

So far, proposed solutions have been primarily technological or managerial, said Akhil Rao, assistant professor of economics at Middlebury College and the paper's lead author. Technological fixes include removing space debris from orbit with nets, harpoons, or lasers. Deorbiting a satellite at the end of its life is a managerial fix.

Ultimately, engineering or managerial solutions like these won't solve the debris problem because they don't change the incentives for operators. For example, removing space debris might motivate operators to launch more satellites--further crowding low-Earth orbit, increasing collision risk, and raising costs. "This is an incentive problem more than an engineering problem. What's key is getting the incentives right," Rao said.

A better approach to the space debris problem, Rao and his colleagues found, is to implement an orbital-use fee--a tax on orbiting satellites. "That's not the same as a launch fee," Rao said. "Launch fees by themselves can't induce operators to deorbit their satellites when necessary, and it's not the launch but the orbiting satellite that causes the damage."

Orbital-use fees could be straight-up fees or tradeable permits, and they could also be orbit-specific, since satellites in different orbits produce varying collision risks. Most important, the fee for each satellite would be calculated to reflect the cost to the industry of putting another satellite into orbit, including projected current and future costs of additional collision risk and space debris production--costs operators don't currently factor into their launches. "In our model, what matters is that satellite operators are paying the cost of the collision risk imposed on other operators," said Daniel Kaffine, professor of economics and RASEI Fellow at the University of Colorado Boulder and co-author on the paper.

And those fees would increase over time, to account for the rising value of cleaner orbits. In the researchers' model, the optimal fee would rise at a rate of 14 percent per year, reaching roughly $235,000 per satellite-year by 2040.
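
As a rough compound-growth check on those two figures, the Python sketch below projects a fee that rises 14 percent per year to roughly $235,000 in 2040. The 2020 start year, and therefore the implied starting fee, are assumptions made here for illustration; the paper's exact fee schedule may differ.

    # Rough compound-growth illustration of the proposed orbital-use fee.
    # Assumes the 14%/year schedule starts in 2020; the starting fee is
    # back-calculated from the ~$235,000 value quoted for 2040.
    growth_rate = 0.14
    target_fee_2040 = 235_000
    years = 2040 - 2020
    start_fee_2020 = target_fee_2040 / (1 + growth_rate) ** years  # roughly $17,000

    for year in (2020, 2025, 2030, 2035, 2040):
        fee = start_fee_2020 * (1 + growth_rate) ** (year - 2020)
        print(f"{year}: ~${fee:,.0f} per satellite-year")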

For an orbital-use fee approach to work, the researchers found, all countries launching satellites would need to participate--that's about a dozen that launch satellites on their own launch vehicles and more than 30 that own satellites. In addition, each country would need to charge the same fee per unit of collision risk for each satellite that goes into orbit, although each country could collect revenue separately. Countries use similar approaches already in carbon taxes and fisheries management.

In this study, Rao and his colleagues compared orbital-use fees to business as usual (that is, open access to space) and to technological fixes such as removing space debris. They found that orbital use fees forced operators to directly weigh the expected lifetime value of their satellites against the cost to industry of putting another satellite into orbit and creating additional risk. In other scenarios, operators still had incentive to race into space, hoping to extract some value before it got too crowded.

With orbital-use fees, the long-run value of the satellite industry would increase from around $600 billion under the business-as-usual scenario to around $3 trillion, researchers found. The increase in value comes from reducing collisions and collision-related costs, such as launching replacement satellites.

Orbital-use fees could also help satellite operators get ahead of the space junk problem. "In other sectors, addressing the Tragedy of the Commons has often been a game of catch-up with substantial social costs. But the relatively young space industry can avoid these costs before they escalate," Burgess said.

Credit: 
University of Colorado at Boulder

Unique insight into development of the human brain: Model of the early embryonic brain

We know a lot about the human brain, but very little about how it is formed. In particular, the stages from the second to the seventh week of embryonic development have so far been virtually unknown territory to brain researchers.

To learn more about this particular period, researchers from the Department of Neuroscience and the Novo Nordisk Foundation Center for Stem Cell Biology at the Faculty of Health and Medical Sciences have now developed a model that mimics these early stages of the human brain in the laboratory.

The model is based on embryonic stem cells grown in a microfluidic system developed in collaboration with bioengineers from Lund University in Sweden.

'We know that in the early embryonic stage the brain is exposed to various concentrations of growth factors, which induce the formation of different brain regions. By using microfluidic methods, we can - under extremely controlled conditions - recreate the environment found in the early embryo,' explains the first author on the study, Assistant Professor Pedro Rifes.

'When we expose stem cells to the controlled environment, we can create a tissue that resembles an embryonic brain at a very early stage, about 4-5 weeks after fertilisation of the egg - a stage that we have so far not been able to study'.

The Developmental Tree of the Human Brain

The researchers will use the new model to make a map of the development of the brain cells - a kind of 'Developmental tree' of the brain - thereby learning new things about how the enormous complexity of different nerve cells in the human brain is formed during the early embryonic stages.

'For the first time, we have access to a tissue that resembles the early embryonic brain, and this allows us to go in and analyse what happens to each individual cell at each stage of development', says the principal scientist behind the study, Associate Professor Agnete Kirkeby.

The idea is that brain researchers around the world will be able to use this 'Developmental tree' of the brain as a guide to produce different types of nerve cells for stem cell therapy. By studying the natural development of the nerve cells, the researchers will be able to speed up the creation of recipes for producing specific nerve cells in the laboratory.

A Recipe for Stem Cell Treatment

Agnete Kirkeby is well aware of the importance of a faster path to stem cell treatments. Together with colleagues from Lund and Cambridge, she has for several years worked on developing a stem cell therapy for Parkinson's disease. This project required Kirkeby and her colleagues to produce a very specific type of nerve cells, the dopaminergic nerve cells, which are the cells that are lost in Parkinson's Disease.

'We have come a long way in the project and will soon be able to test the stem cell treatment in humans for the first time. But it took us more than 10 years to get this far because we depended on a trial-and-error methodology to develop the right nerve cells from the stem cells'.

With knowledge from the new model, the researchers expect to be able to considerably shorten this process in the future.

'If we understand exactly how the brain develops in the early stages, we will become better at guiding the stem cells in the right direction when producing human nerve cells in the lab. This will allow us to more quickly and efficiently develop cell treatments for neurological diseases such as epilepsy, Parkinson's Disease and certain types of dementia', says Agnete Kirkeby.

New Options for testing Environmental Toxins

In addition to increasing our knowledge on brain development and easing the path to future stem cell treatments, Agnete Kirkeby believes that the embryonic brain model may serve other useful purposes as well.

'The model may be used to investigate how brain cells in the early embryonic stages react to certain chemicals surrounding us in our daily lives - these might be substances in our environment, in consumer products or in the medications that some pregnant women may require. So far, we have not had a good model to test precisely this'.

Credit: 
University of Copenhagen - The Faculty of Health and Medical Sciences

Astronomers see 'cosmic ring of fire,' 11 billion years ago

image: This is an artist's impression of the ring galaxy.

Image: 
James Josephides, Swinburne Astronomy Productions

Astronomers have captured an image of a super-rare type of galaxy - described as a "cosmic ring of fire" - as it existed 11 billion years ago.

The galaxy, which has roughly the mass of the Milky Way, is circular with a hole in the middle, rather like a titanic doughnut. Its discovery, announced in the journal Nature Astronomy, is set to shake up theories about the earliest formation of galactic structures and how they evolve.

"It is a very curious object that we've never seen before," said lead researcher Dr Tiantian Yuan, from Australia's ARC Centre of Excellence for All Sky Astrophysics in 3 Dimensions (ASTRO 3D). "It looks strange and familiar at the same time."

The galaxy, named R5519, is 11 billion light-years from the Solar System. The hole at its centre is truly massive, with a diameter two billion times longer than the distance between the Earth and the Sun. To put it another way, it is three million times bigger than the diameter of the supermassive black hole in the galaxy Messier 87, which in 2019 became the first ever to be directly imaged.
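
As a quick unit-conversion check on how large that hole is, the Python sketch below converts two billion Earth-Sun distances (astronomical units) into light-years; the factor of two billion is taken from the announcement, and the rest is routine arithmetic.

    # Convert the quoted hole diameter (2 billion Earth-Sun distances) to light-years.
    AU_IN_METERS = 1.496e11          # one astronomical unit
    LIGHT_YEAR_IN_METERS = 9.461e15  # one light-year

    diameter_m = 2e9 * AU_IN_METERS
    diameter_ly = diameter_m / LIGHT_YEAR_IN_METERS
    print(f"hole diameter: about {diameter_ly:,.0f} light-years")  # roughly 32,000 ly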

"It is making stars at a rate 50 times greater than the Milky Way," said Dr Yuan, who is an ASTRO 3D Fellow based at the Centre for Astrophysics and Supercomputing at Swinburne University of Technology, in the state of Victoria.

"Most of that activity is taking place on its ring - so it truly is a ring of fire."

Working with colleagues from Australia, the US, Canada, Belgium and Denmark, Dr Yuan used spectroscopic data gathered by the W. M. Keck Observatory in Hawaii and images recorded by NASA's Hubble Space Telescope to identify the unusual structure.

The evidence suggests it is a type known as a "collisional ring galaxy", making it the first one ever located in the early Universe.

There are two kinds of ring galaxies. The more common type forms because of internal processes. Collisional ones form - as the name suggests - as a result of immense and violent encounters with other galaxies.

In the nearby "local" Universe they are 1000 times rarer than the internally created type. Images of the much more distant R5519 stem from about 10.8 billion years ago, just three billion years after the Big Bang. They indicate that collisional ring galaxies have always been extremely uncommon.

ASTRO 3D co-author, Dr Ahmed Elagali, based at the International Centre for Radio Astronomy Research in Western Australia, said studying R5519 would help determine when spiral galaxies began to develop.

"Further, constraining the number density of ring galaxies through cosmic time can also be used to put constraints on the assembly and evolution of local-like galaxy groups," he added.

Another co-author, Professor Kenneth Freeman from the Australian National University, said the discovery had implications for understanding how galaxies like the Milky Way formed.

"The collisional formation of ring galaxies requires a thin disk to be present in the 'victim' galaxy before the collision occurs," he explained.

"The thin disk is the defining component of spiral galaxies: before it assembled, the galaxies were in a disorderly state, not yet recognisable as spiral galaxies."

"In the case of this ring galaxy, we are looking back into the early universe by 11 billion years, into a time when thin disks were only just assembling. For comparison, the thin disk of our Milky Way began to come together only about nine billion years ago. This discovery is an indication that disk assembly in spiral galaxies occurred over a more extended period than previously thought."

Drs Yuan and Elagali, and Professor Freeman, worked with colleagues from the University of New South Wales, Macquarie University, and University of Queensland, all in Australia, together with others at the Cosmic Dawn Centre (DAWN) in Denmark, Texas A&M University in the US, York University in Canada, and Ghent University in Belgium.

Credit: 
ARC Centre of Excellence for All Sky Astrophysics in 3D (ASTRO 3D)

Problems with alcohol? 29 gene variants may explain why

A genome-wide analysis of more than 435,000 people has identified 29 genetic variants linked to problematic drinking, researchers at Yale University School of Medicine and colleagues report May 25 in the journal Nature Neuroscience.

"The new data triple the number of known genetic risk loci associated with problematic alcohol use," said Yale's Joel Gelernter, the Foundations Fund Professor of Psychiatry and professor of genetics and of neuroscience, who is the senior author of the multi-institutional study.

The study includes genome-wide analysis of people of European ancestry contained in four separate biobanks or datasets. The researchers looked for shared genetic variants among those who met criteria for problematic alcohol use, including alcohol use disorder and alcohol use with medical consequences. These disorders are major contributors to a wide variety of medical problems worldwide.

The analysis found 19 previously unknown independent genetic risk factors for problematic alcohol use, and confirmed 10 previously identified risk factors.

The meta-analysis of biobank data also included information on genetic risk factors for several psychiatric disorders. This information allowed researchers to study shared genetic associations between problematic drinking and disorders such as depression and anxiety.

They also found genetic heritability of these variants was enriched in the brain and in evolutionarily conserved regulatory regions of the genome, attesting to their importance in biological function. Using a technique called Mendelian randomization, they were able to investigate how one genetically influenced trait affects another genetically linked trait.
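
For readers unfamiliar with Mendelian randomization, the Python sketch below illustrates the basic idea on made-up summary statistics: each variant's effect on the exposure and on an outcome trait is combined into a per-variant causal estimate (a Wald ratio), and the estimates are pooled by inverse-variance weighting. The numbers are hypothetical and are not drawn from the study.

    # Minimal Mendelian randomization illustration (Wald ratio plus inverse-variance
    # weighting) using made-up per-variant summary statistics.
    import numpy as np

    # Hypothetical effects of three variants on the exposure (problematic alcohol use)
    # and on an outcome trait (e.g. a risk-taking measure), with outcome standard errors.
    beta_exposure = np.array([0.10, 0.08, 0.12])
    beta_outcome  = np.array([0.020, 0.018, 0.030])
    se_outcome    = np.array([0.005, 0.006, 0.008])

    wald_ratios = beta_outcome / beta_exposure        # per-variant causal estimates
    weights = (beta_exposure / se_outcome) ** 2       # inverse-variance weights
    ivw_estimate = np.sum(wald_ratios * weights) / np.sum(weights)
    print(f"per-variant Wald ratios: {np.round(wald_ratios, 3)}")
    print(f"IVW causal estimate: {ivw_estimate:.3f}")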

"This gives us ways to understand causal relations between problematic alcohol use traits such as psychiatric states, risk-taking behavior, and cognitive performance," said Yale's Hang Zhou, associate research scientist in psychiatry and lead author of the study.

"With these results, we are also in a better position to evaluate individual-level risk for problematic alcohol use," Gelernter said.

Credit: 
Yale University

A new law in laser physics could make eye surgery simpler

image: This is Dr. Antoine Runge in a lab at the School of Physics at the University of Sydney.

Image: 
Louise Cooper/University of Sydney

Scientists have developed a new type of laser that can deliver high amounts of energy in very short bursts of time, with potential applications in eye and heart surgery or the engineering of delicate materials.

The Director of the University of Sydney Institute of Photonics and Optical Science, Professor Martijn de Sterke, said: "This laser has the property that as its pulse duration decreases to less than a trillionth of a second, its energy could go through the roof.

"This makes them ideal candidates for the processing of materials that require short, powerful pulses. One application could be in corneal surgery, which relies on gently removing material from the eye. This requires strong, short light pulses that do not heat and damage the surface."

The research is published today in Nature Photonics.

The scientists have achieved this remarkable result by returning to a simple laser technology that is common in telecommunications, metrology and spectroscopy. These lasers use an effect known as soliton waves, which are waves of light that maintain their shape over long distances.

Solitons were first identified in the early 19th century, not in light but in water waves in the industrial canals of England.

"The fact that soliton waves in light maintain their shape means they are excellent for a wide range of applications, including telecommunications and spectrometry," said lead author Dr Antoine Runge from the School of Physics.

"However, while lasers producing these solitons are simple to make, they do not pack much punch. A completely different - and expensive - physical system is required to produce the high-energy optical pulses used in manufacturing."

Co-author Dr Andrea Blanco-Redondo, Head of Silicon Photonics at Nokia Bell Labs in the US, said: "Soliton lasers are the most simple, cost-effective and robust way to achieve these short bursts. However, until now, conventional soliton lasers could not deliver enough energy.

"Our results have the potential to make soliton lasers useful for biomedical applications," said Dr Blanco-Redondo, who was previously at the University of Sydney Nano Institute.

This research builds on earlier work established by the team at the University of Sydney Institute for Photonics and Optical Science, which published its discovery of pure-quartic solitons in 2016.

A new law in laser physics

In a normal soliton laser, the energy of light is inversely proportional to its pulse duration, as described by the relation E = 1/τ. If you halve the pulse duration, you get twice the amount of energy.

With pure-quartic solitons, the energy of light is inversely proportional to the third power of the pulse duration, or E = 1/τ³. This means that if the pulse duration is halved, the energy delivered in that time is multiplied by a factor of eight.
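
A short numerical comparison makes the difference between the two scaling laws concrete. The Python sketch below uses arbitrary units; only the relative scaling matters.

    # Compare energy scaling for a conventional soliton (E ~ 1/tau) and a
    # pure-quartic soliton (E ~ 1/tau^3) when the pulse duration is halved.
    def energy_conventional(tau):
        return 1.0 / tau        # arbitrary units

    def energy_quartic(tau):
        return 1.0 / tau ** 3   # arbitrary units

    tau = 1.0  # reference pulse duration (arbitrary units)
    for label, e in (("conventional", energy_conventional), ("pure-quartic", energy_quartic)):
        gain = e(tau / 2) / e(tau)
        print(f"{label}: halving the pulse duration multiplies the energy by {gain:.0f}x")
    # conventional: 2x, pure-quartic: 8x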

"It is this demonstration of a new law in laser physics that is most important in our research," Dr Runge said. "We have shown that E = 1/τ3 and we hope this will change how lasers can be applied in the future."

Establishing this proof of principle will enable the team to make more powerful soliton lasers.

Dr Blanco-Redondo said: "In this research we produced pulses that are as short as a trillionth of a second, but we have plans to get much shorter than that."

"Our next goal is to produce femtosecond duration pulses - one quadrillionth of a second," Dr Runge said. "This will mean ultra-short laser pulses with hundreds of kilowatts of peak power."

Professor De Sterke said: "We hope this type of laser can open a new way to apply laser light when we need high peak energy but where the base material is not damaged."

Credit: 
University of Sydney

New double-contrast technique picks up small tumors on MRI

image: A new technique developed by researchers at UC Davis offers a significant advance in using magnetic resonance imaging to pick out even very small tumors from normal tissue. The team created a probe that generates two magnetic resonance signals that suppress each other until they reach the target, at which point they both increase contrast between the tumor and surrounding tissue.

Image: 
Xiandoing Xue, UC Davis

A new technique developed by researchers at the University of California, Davis, offers a significant advance in using magnetic resonance imaging to pick out even very small tumors from normal tissue. The work is published May 25 in the journal Nature Nanotechnology.

Chemical probes that produce a signal on magnetic resonance imaging (MRI) can be used to target and image tumors. The new research is based on a phenomenon called magnetic resonance tuning that occurs between two nanoscale magnetic elements. One acts to enhance the signal, and the other quenches it. Previous studies have shown that quenching depends on the distance between the magnetic elements. This opens new possibilities for non-invasive and sensitive investigation of a variety of biological processes by MRI.

The UC Davis team created a probe that generates two magnetic resonance signals that suppress each other until they reach the target, at which point they both increase contrast between the tumor and surrounding tissue. They call this two-way magnetic resonance tuning (TMRET).

Combined with specially developed imaging analysis software, the double signal enabled researchers to pick out brain tumors in a mouse model with greatly increased sensitivity.

"It's a significant advance," said senior author Yuanpei Li, Associate Professor of biochemistry and molecular medicine at the UC Davis School of Medicine and Comprehensive Cancer Center. "This could help detect very small early-stage tumors."

Two magnetic components

The probe developed by the UC Davis team contains two components: nanoparticles of superparamagnetic iron oxide (SPIO), and pheophorbide a-paramagnetic manganese (P-Mn), packaged together in a lipid envelope. SPIO and P-Mn both give strong, separate signals on MRI, but as long as they are physically close together those signals tend to cancel each other out, or quench. When the particles enter tumor tissue, the fatty envelope breaks down, SPIO and P-Mn separate, and both signals appear.

Li's laboratory focuses on the chemistry of MRI probes and developed a method to process the data and reconstruct images, which they call double-contrast enhanced subtraction imaging or DESI. But for expertise in the physical mechanisms, they reached out to Professors Kai Liu and Nicholas Curro at the UC Davis Department of Physics (Liu is now at Georgetown University). The physicists helped elucidate the mechanism of the TMRET method and refine the technique.
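
The release does not spell out the DESI reconstruction, but the general idea of keeping only signal that appears in both contrast channels can be conveyed with a toy example. The Python sketch below runs on synthetic images with an assumed combination rule (an elementwise minimum); it is a schematic illustration, not the published DESI algorithm.

    # Toy illustration of combining two contrast channels that switch on together
    # inside the tumor (synthetic data; NOT the published DESI algorithm).
    import numpy as np

    rng = np.random.default_rng(1)
    size = 64
    image_a = rng.normal(0.1, 0.05, (size, size))   # e.g. a T1-type enhancement map
    image_b = rng.normal(0.1, 0.05, (size, size))   # e.g. a T2-type signal-change map

    # Both channels light up only inside a small "tumor" region.
    tumor = np.zeros((size, size), dtype=bool)
    tumor[28:36, 28:36] = True
    image_a[tumor] += 1.0
    image_b[tumor] += 1.0

    # Keep only signal present in BOTH channels (elementwise minimum),
    # which suppresses background where only one channel fluctuates upward.
    combined = np.minimum(np.clip(image_a, 0, None), np.clip(image_b, 0, None))
    ratio = combined[tumor].mean() / combined[~tumor].mean()
    print(f"tumor-to-normal ratio in the combined map: {ratio:.1f}")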

The researchers tested the method in cultures of brain and prostate cancer cells and in mice. For most MRI probes, the signal from the tumor is up to twice as strong as from normal tissue - a "tumor to normal ratio" of 2 or less. Using the new dual-contrast nanoprobe, Li and colleagues could get a tumor-to-normal ratio as high as 10.

Li said the team is interested in translating the research into clinical use, although that will require extensive work including toxicology testing and scaling up production before they could apply for investigational new drug approval.

Credit: 
University of California - Davis

Study reveals first evidence inherited genetics can drive cancer's spread

image: When researchers analyzed the immune cells in the tumors and grouped them by type, they found more cancer-fighting cells in mice with ApoE4.

Image: 
Elizabeth and Vincent Meyer Laboratory of Systems Cancer Biology at The Rockefeller University

Sometimes cancer stays put, but often it metastasizes, spreading to new locations in the body. It has long been suspected that genetic mutations arising inside tumor cells drive this potentially devastating turn of events.

Now researchers have shown for the first time that our own pre-existing genetics can promote metastasis.

A new study, published May 25 in Nature Medicine, suggests that differences in a single gene, carried within someone's genome from birth, can alter progression of melanoma, a type of skin cancer. The researchers suspect these inherited variations may have the same effect on other types of cancer as well.

"Patients often ask 'Why am I so unlucky? Why did my cancer spread?' As doctors, we never had an answer," says lead investigator Sohail Tavazoie, Leon Hess Professor and senior attending physician. "This research provides an explanation."

The discovery may transform how scientists think about cancer metastasis and lead to a better understanding of patients' risks in order to inform treatment decisions, Tavazoie says.

The mystery of metastasis

Metastasis occurs when cancer cells escape the original tissue to establish new tumors elsewhere, a phenomenon that leads to the majority of cancer deaths. Scientists have suspected that cancer cells, which initially emerge due to mutations inside normal cells, gain their travelling ability following further mutations. But after decades of searching, they have yet to find such a genetic change that could be proven to encourage metastasis.

Previous research in Tavazoie's lab had identified a gene called APOE, present in the DNA of all of the body's cells before any cancer arises, that can impact the spread of melanoma. The gene produces a protein that appears to interfere with a number of processes used by cancer cells to metastasize, such as forming blood vessels, growing deeper into healthy tissue, and withstanding assault from tumor-fighting immune cells.

Humans, however, carry one of three different versions of ApoE: ApoE2, ApoE3, and ApoE4. Benjamin Ostendorf, a physician scientist in the lab, hypothesized that these variants could explain why melanoma progresses differently in different people.

In experiments with mice, each carrying one of the three versions of the gene, he and colleagues found that tumors in those with ApoE4 grew the smallest and spread the least.

A closer look revealed that ApoE4 is the most effective version of ApoE in terms of enhancing the immune response to tumor cells. Compared to animals with other variants, the mice carrying ApoE4 showed a greater abundance of tumor-fighting T cells recruited into the melanoma tumor, as well as reduced blood vessels.

"We think that a major impact of the variations in ApoE arises from differences in how they modulate the immune system's attack," Ostendorf says.

Toward better treatment

Genetic data from more than 300 human melanoma patients echoed the mouse experiments: On average, people with ApoE4 survived the longest, while those with ApoE2 lived the shortest. This connection to outcomes suggests that doctors could look at patients' genetics to assess the risk of their cancer progressing.

It could also influence the course of treatment. Melanoma patients are sometimes given therapy that encourages their own immune systems to better fight the cancer. The team's analysis of information from such patients, as well as experiments with mice, showed that those with ApoE4 respond best to immune-boosting therapies.

Likewise, the researchers showed that an experimental compound that increases production of ApoE, RGX-104, was effective at helping mice with ApoE4 fight off tumors. RGX-104 is currently in clinical trials. (Tavazoie is a scientific cofounder of Rgenix, the company that developed RGX-104.)

Further research is needed to determine how to optimize treatments for patients with other ApoE variants, Tavazoie says. ApoE2, for instance, was associated with an increased risk of metastasis. The researchers' evidence so far suggests that ApoE3's metastasis-suppressing ability falls between that of the other two. "We need to find those patients whose genetics put them at risk for poor survival and determine what therapies work best for them," Tavazoie says.

The implications may extend beyond cancer. Other studies have shown that variations in ApoE contribute to Alzheimer's disease: ApoE4 aggravates risk of this neurodegenerative disorder, in contrast to its suppression of cancer progression.

"It's not quite clear what ApoE does in Alzheimer's, but we believe our work in cancer can inform our understanding of this disease as well," Tavazoie says. His lab, normally focused on cancer, has begun investigating the connection to the neurodegenerative disorder.

Credit: 
Rockefeller University

Synthesis of prebiotic peptides gives clues to the origin of life on Earth

image: Time dependencies of the glycylglycine (a, c) and diglycylglycine (b, d) concentrations (data obtained by HPLC) and their yields for the glycine-sodium trimetaphosphate-imidazole (1:1:1) system at 75 °C with different pH values (a, b) and at different temperatures with pH 9.5 (c, d); initial concentrations c0 = 0.1 M, reactor residence time τr = 16 min, total volume Vt = 25 ml, reactor volume Vr = 16 ml

Image: 
Kazan Federal University

The Coordination Compounds Lab of Kazan Federal University started researching prebiotic peptide synthesis in 2013 with the use of the ASIA-330 flow chemistry system. Many lab projects are devoted to the problem of selectivity and specificity of processes in living nature. This problem is directly related to prebiotic chemistry, whose foundations were laid by the amino acid synthesis of the German chemist Adolph Friedrich Ludwig Strecker (the Strecker reaction, 1851) and the sugar synthesis of the Russian chemist and founder of the Kazan Chemical School, Alexander Butlerov (the formose reaction, 1861). Studies on the prebiotic synthesis of sugars and peptides are aimed at a deeper understanding of one of the most fundamental and intriguing problems of our time - the origin of life.

The kinetics of oligopeptide formation in the flow systems glycine - sodium trimetaphosphate - imidazole/N-methylimidazole under a thermocyclic regime was investigated by liquid chromatography (HPLC) and 31P NMR over the temperature range 45 to 90°C and pH range 8.5 to 11.5. Significant amounts of glycylglycine (yield up to 52%) and diglycylglycine were formed. The best results were obtained at 75°C under slightly alkaline conditions (pH 9.5-10.5), and yields of oligopeptides were higher in the presence of imidazole than without this heterocycle. Notably, the non-equilibrium regime used for the glycylglycine and diglycylglycine syntheses turns out to be one of the most effective among all prebiotic syntheses reported so far in the literature, both in the absence and in the presence of imidazole.

Earlier, H. Sawai and L.E. Orgel discovered that heterocycles such as imidazole can increase yields of peptides in the solid state (Sawai H., Orgel L.E. (1975) J. Mol. Evol. 6:185-197. doi:10.1007/bf01732355). However, their proposed explanation of the catalytic effect of imidazole, based on the initial formation of N-triphosphoryl imidazole as a key intermediate, does not apply to the reactions studied here, because no signals of imidazole N-phosphates appear in the 31P NMR spectra of the imidazole - trimetaphosphate mixture. Instead, a new imidazole catalysis mechanism, in which imidazole reacts with cyclic N,O-phosphoryl glycine to give N-imidazolyl-O-glycyl phosphate as a key intermediate, was proposed and validated in the present investigation. Detailed reaction mechanisms were proposed and justified by quantum chemical calculations using density functional theory (DFT) at a high level (CAM-B3LYP/TZVP), with solvent effects accounted for by a polarized continuum model.

It is emphasized that, in the absence of imidazoles, prebiotic activation of amino acids occurs at the N-terminus, whereas in the presence of imidazoles it shifts to the O-terminus. This means that in peptide elongation, N-imidazolyl-O-aminoacyl phosphates play an outstanding role in prebiotic systems, similar to that of the aminoacyl adenylates formed in biosystems in the presence of ATP and aminoacyl-tRNA synthetases. This appears to be a key pathway for prebiotic evolution in terms of peptide synthesis; in other words, a new crucial role of imidazoles in prebiotic evolution was discovered. The systems used and their modes of conversion can serve as good models for prebiotic peptide syntheses in a flow thermocyclic regime, including syntheses under the conditions of various hydrothermal systems, particularly in Kamchatka, where temperature and pressure fluctuations are observed and, in the hot springs, pH varies from 2.0 to 9.5 while temperature ranges from 55 to 98°C.
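
As a purely illustrative companion to these kinetic measurements, the short Python sketch below integrates a simple consecutive-reaction model, glycine to glycylglycine to diglycylglycine, with assumed effective rate constants. It treats each step as pseudo-first-order, ignores the stoichiometric consumption of a second glycine per peptide bond, and is not a fit to the reported data or to the thermocyclic flow regime.

    # Illustrative consecutive-reaction model for peptide growth:
    # glycine -> glycylglycine -> diglycylglycine.
    # Rate constants are assumed values, not fitted to the experiments.
    from scipy.integrate import solve_ivp

    k1, k2 = 0.05, 0.02  # assumed effective rate constants, 1/min

    def rhs(t, y):
        gly, gg, ggg = y
        return [-k1 * gly, k1 * gly - k2 * gg, k2 * gg]

    # Initial glycine concentration 0.1 M, as in the reported experiments.
    sol = solve_ivp(rhs, (0, 120), [0.1, 0.0, 0.0], t_eval=[0, 30, 60, 120])
    for t, gly, gg, ggg in zip(sol.t, *sol.y):
        print(f"t={t:5.0f} min  Gly={gly:.3f} M  GlyGly={gg:.3f} M  diGlyGly={ggg:.4f} M")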

The work is of importance for the development of the problem of prebiotic peptide synthesis. The results can be used in the synthesis of small oligopeptides. In addition, the experimental setup used, in combination with mathematical modeling and quantum-chemical calculations, can be applied to study other processes in the thermocyclic mode. The fact that N-methylimidazole has a catalytic effect on prebiotic peptide synthesis is especially important because similar heterocycles can be formed under shock exposure at prebiotic conditions (Shtyrlin V.G. et al. (2019) Orig. Life Evol. Biosph. 49:1-18. doi:10.1007/s11084-019-09575-8). In the cited work, it was found that upon impact on the water - formamide - bicarbonate - sodium hydroxide system, placed in a stainless steel preservation capsule, 7 imidazole derivatives (out of 21 products) are formed. It was established that the most effective syntheses proceed at pH ~9.5, and that ammonia and formaldehyde are formed among many intermediate products. Note that, based on the results of the cited study, a new hypothesis about the origin of life was proposed: life could have originated due to the impact of meteorites on alkaline water-formamide lakes located near volcanoes on the early Earth.

Further research can pertain to the synthesis of oligopeptides containing other amino acids and the synthesis of other biopolymers, primarily sugars. Particular attention will be paid to experimental and theoretical studies of mechanisms of heterogeneous catalysis in prebiotic syntheses of biopolymers. The focus will be on the role of coordination and complexation with metals in prebiotic syntheses, since metal complexes could control the stereoselectivity and specificity of many vital processes at the first stages of biochemical evolution.

Credit: 
Kazan Federal University