Culture

A small switch with a big impact

T cells play a key role in the human immune system. They are capable of distinguishing diseased or foreign tissue from the body's own, healthy tissue with great accuracy; they are capable of triggering the actions necessary to fight off the troublemakers. The details of this immune response are manifold and the individual steps are not yet fully understood.

Scientists of the universities of Würzburg and Mainz have now figured out new details of these processes, showing that tiny point mutations in a gene can modify T cells to be less aggressive. This could be an advantage after stem cell transplantation which includes T-cell transfusion in order to keep a number of severe side effects in check. The researchers have now published the results of their study in the Journal of Experimental Medicine. The study is led by Dr Friederike Berberich-Siebelt, head of the "Molecular and cellular immunology" research group at the Institute of Pathology of the University of Würzburg.

A protein family with multiple tasks

When T cells detect foreign or altered tissue, such as an infected or tumour tissue, this usually happens through the receptors on their cell surface. These T-cell receptors then send signals into the cell interior, initiating a response. In a first step, they activate a special family of transcription factors - scientifically called NFAT for nuclear factor of activated T-cells. The NFATs then bind to the DNA in the cell nucleus and trigger also the production of cytokines such as interleukin-2.

NFAT is composed of many family members which may have overlapping tasks or assume completely different functions. But that's not all: Like many other proteins in the cell, they can still be modified after their synthesis to customize their function. The recently published study focuses on one specific modification of the NFATc1 "family member" which is called sumoylation.

Advantageous point mutations

"Sumoylation plays a role in different cellular processes such as nuclear transport, programmed cell death or as an antiviral mechanism," Friederike Berberich-Siebelt explains. Sumoylation defects have also been observed in various diseases such as cancer and herpesvirus infections.

In the study now published, the scientists worked with laboratory animals that had two actually insignificant point mutations in the NFATc1 gene which, however, prevent sumoylation. This is not necessarily a disadvantage: "The offspring of these animals is perfectly healthy. The modified NFATc1 even mediates specific signals that reduce the clinical symptoms of multiple sclerosis at least in the animal model," Berberich-Siebelt explains. When using T cells that carry these mutations in stem cell transplantation, they are much less aggressive against the tissues of the host animals than "normal" cells.

Fascinating fundamental research

This effect is due to an increase in interleukin-2 at the beginning of the immune response at the biomolecular level. Interleukin-2 counteracts the differentiation into inflammatory T-cell subtypes and at the same time supports so-called regulatory T cells according to the authors of the study. It is quite possible that this discovery will have consequences for future stem cell transplantation which includes T-cell infusion. When using T cells in which NFATc1 is not sumoylated, this might prevent severe side effects, making the point mutation "a small modification with a big impact" according to Berberich-Siebelt.

To investigate this in more detail, Berberich-Siebelt and her team will continue to research the possibilities of therapeutic implementation within the framework of the Collaborative Research Center/Transregio "Control of graft-versus-host and graft-versus-leukaemia immune responses after allogeneic stem cell transplantation" funded by the German Research Foundation (DFG). "We want to find out whether CRISPR/Cas9 gene editing can be applied to human T cells to exhibit just the right amount of activity during hematopoietic stem cell transplantation," the scientist says.

But the new findings are also relevant independently of these potential consequences for therapeutic applications. "We are basically interested in understanding the fine regulation in cells, such as the T-cell receptor signalling and the function of NFAT family members and their isoforms in this context," says Berberich-Siebelt who finds the newly published results "fascinating". After all, the scientists did not have to switch off a gene or activate it excessively as is often the case in research. Instead, two actually harmless point mutations and subtle direct effects were sufficient to ultimately flip the switch from inflammation, autoimmunity and rejection to tolerance. A small shift of the focus at the beginning of the immune response had been sufficient to accomplish this.

Credit: 
University of Würzburg

Reducing the high social cost of death

image: An unexpected loss of a loved one can lead to a range of ramifications if not dealt with properly

Image: 
Kyoto University

Japan - How will you cope with the death of your mother or spouse? Their death may disturb your concentration, causing accidents or lowering your productivity. Some bereaved cannot sleep, and others cannot get out of bed. Some lose all appetite, while others binge eat constantly. Some grow alcoholic, and some suicidal. Our responses may depend on our family, culture, community, or belief-systems, but we all struggle to accept our loved ones' deaths.

The cost of grief is not confined to personal mental anguish. It reduces productivity, causes dependency on medicine and social services, and increases mortality risks for survivors. While this is well documented in Europe, we have little data for Japan, the world's most elderly country. To fill this gap, a research team led by Kyoto University is conducting a nationwide survey of bereavement.

"Japan's society is rapidly aging. By 2030, nearly everyone in Japan will suffer the death of a parent, elder relative, spouse, or close friend," explains lead author Carl Becker of the Center for the Promotion of Interdisciplinary Education and Research, who garnered the 2020 Educator Award from the international Association for Death Education and Counseling.

"Recent UK studies suggest that about 10% of bereaved individuals show significant decline in health, resulting in prolonged use of resources. If Japan faces the same percent, the impact will be catastrophic." The team decided to conduct similar surveys throughout Japan with additional questions focusing on economic and lifestyle changes.

Their pilot report -- published in the journal OMEGA -- shows that deeper grief correlates with an overarching decline in quality of life, seen in physical ailments, more down time, and higher rates of medical reliance. Interestingly, lower income families lost more productivity and pharmaceutical expenses, while lower satisfaction with funerals was linked to higher medical costs.

Results show that bereaved Japanese are similar to Europeans in their losses of everything from time, productivity, health, and medical expenses. Factors like the circumstances of death, the loss of income, lack of family or social support, and satisfaction with funeral proceedings can help predict who may need the most help in the future.

"By identifying key problems, we can begin to see what solutions are required to mitigate severe bereavement," states Tohoku University's Yozo Taniyama, second author of the study. "For example, better testing, medical care, and psychological treatment can help people handle unexpected death. More robust financial and social aid can help with the loss of income."

Tradition and rituals appear to facilitate better responses as well. Funeral services offer friends and relations a chance to reconnect and support the bereaved, reducing their loneliness and isolation. Moreover, rituals help the bereaved to come to terms with death.

The research team predicted that people with low or declining incomes would find funeral costs more burdensome. Although that group did lose more time and spend more on pharmaceuticals, they displayed little dissatisfaction with funeral costs. In fact, the people who expressed greater dissatisfaction were those who abbreviated funerals, who later tended to show higher rates of physical as well as psychological problems.

Becker concludes, "Japan has a tradition of ceremonies that bring people together to help the bereaved process their trauma. Much of the world is learning from Japan's traditions that value spiritual bonds with departed loved ones. It is healthier to revere our dead than to try to forget them."

Credit: 
Kyoto University

Statistical model improves analysis of skin conductance

image: With electrodes strapped to two fingers, researchers can read out a signal of electrodermal activity.

Image: 
Sandya Subramanian/MIT Picower Institute

Electrodermal activity - the sweat-induced fluctuations of skin conductance made famous in TV dramatizations of lie-detector tests - can be a truly strong indicator of subconscious, or "sympathetic," nervous system activity for all kinds of purposes, but only if it is analyzed optimally. In a new study in the Proceedings of the National Academy of Sciences, an MIT-based team of scientists provides a new, fast and accurate statistical model for analyzing EDA.

"Only so much of EDA is intuitive just by looking at the signal," said Sandya Subramanian, a graduate student in the Harvard-MIT Health Sciences and Technology program and the study's lead author. Meanwhile, existing mathematical methods of analysis either compute averages of the signal that obscure its instantaneous nature, or inefficiently force measurements into a fit with signal processing models that have nothing to do with what's going on in the body.

To make EDA analysis faster and more accurate for interpreting internal cognitive states (like anxiety) or physiological states (like sleep), the team instead sought a statistical model that matches with the actual physiology of sweat. When stimulated by the sympathetic nervous system, glands under the skin build up a reservoir of sweat and then release it when they are full. This kind of process, called "integrate-and-fire," is also characteristic of diverse natural phenomena like the electrical spiking of nerve cells and geyser eruptions, said senior author Emery N. Brown, Edward Hood Taplin Professor at The Picower Institute for Learning and Memory and the Institute for Medical Engineering & Science at MIT.

A key insight of the study was the recognition that there is a well-established statistical formula for describing integrate-and-fire systems called an "inverse Gaussian" that could provide a principled way to model EDA signals.

"There is a push away from modeling actual physiology to just using off-the-shelf machine learning," said Brown, who is also an anesthesiologist at Massachusetts General Hospital and a Professor at Harvard. "But we would have missed a very simple, straightforward and even elegant description that is a readout of the body's autonomic state."

Led by Subramanian, the study team, which also included MGH researcher Riccardo Barbieri, formulated an inverse Gaussian model of EDA, and then, put it to the test with 11 volunteers who wore skin conductance monitors for an hour as they sat quietly, read or watched videos. Even while "at rest" people's thoughts and feelings wander, creating ample variation in the EDA signal. Nevertheless, after analysis of all 11, the inverse Gaussian produced a tight fit with their actual readings.

The modeling was able to account for smaller peaks in EDA activity than other methods typically exclude and also the degree of "bumpiness" of the signal, as indicated by the length of the intervals between the pulses, Subramanian said.
In 9 of the 11 cases, adding one of a few related statistical models tightened the inverse Gaussian's fit a little further.

Subramanian said that in practical use, an EDA monitoring system based on an inverse Gaussian model alone could immediately be useful, but it could also be quickly fine-tuned by initial readings from a subject to apply the best combination of models to fit the raw data.

Even with a bit of blending of models, the new approach will be quicker, more computationally efficient and readily interpretable than less principled analysis methods, the authors said, because the tight coupling to physiology requires varying only a few parameters to maintain a good fit with the readings. That's important because if the job of an EDA monitoring system is to detect significant deviations in the signal from normal levels, such as when someone feels acute discomfort, that comparison can only be made based on an accurate, real-time model of what a subject's normal and significantly abnormal levels are.

Indeed among the next steps in the work are tests of the model in subjects under a wider range of conditions ranging from sleep, to emotional or physical stimulation, and even disease states such as depression.

"Our findings provide a principled, physiologically based approach for extending EDA analyses to these more complex and important applications," the authors concluded.

Credit: 
Picower Institute at MIT

The best of both worlds: A new take on metal-plastic hybrid 3D printing

image: An approach that extends the use of 3D printers to 3D electronics for future robotics and Internet-of-Things applications

Image: 
Waseda University

Three-dimensional (3D) printing technology has evolved tremendously over the last decade to the point where it is now viable for mass production in industrial settings. Also known as "additive manufacturing," 3D printing allows one to create arbitrarily complex 3D objects directly from their raw materials. In fused filament fabrication, the most popular 3D printing process, a plastic or metal is melted and extruded through a small nozzle by a printer head and then immediately solidifies and fuses with the rest of the piece. However, because the melting points of plastics and metals are very different, this technology has been limited to creating objects of either metal or plastic only--until now.

In a recent study published in Additive Manufacturing, scientists from Waseda University, Japan, developed a new hybrid technique that can produce 3D objects made of both metal and plastic. Professor Shinjiro Umezu, who led the study, explains their motivation: "Even though 3D printers let us create 3D structures from metal and plastic, most of the objects we see around us are a combination of both, including electronic devices. Thus, we thought we'd be able to expand the applications of conventional 3D printers if we managed to use them to create 3D objects made of both metal and plastic."

Their method is actually a major improvement over the conventional metallization process used to coat 3D plastic structures with metal. In the conventional approach, the plastic object is 3D-printed and then submerged in a solution containing palladium (Pd), which adheres to the object's surface. Afterwards, the piece is submerged in an electroless plating bath that, using the deposited Pd as a catalyst, causes dissolved metal ions to stick to the object. While technically sound, the conventional approach produces a metallic coating that is non-uniform and adheres poorly to the plastic structure.

In contrast, in the new hybrid method, a printer with a dual nozzle is used; one nozzle extrudes standard melted plastic (acrylonitrile butadiene styrene, or ABS) whereas the other extrudes ABS loaded with PdCl2. By selectively printing layers using one nozzle or the other, specific areas of the 3D object are loaded with Pd. Then, through electroless plating, one finally obtains a plastic structure with a metallic coating over selected areas only.

The scientists found the adhesion of the metal coating to be much higher when using their approach. What's more, because Pd is loaded in the raw material, their technique does not require any type of roughening or etching of the ABS structure to promote the deposition of the catalyst, unlike the conventional method. This is especially important when considering that these extra steps cause damage not only to the 3D object itself, but to the environment as well, owing to the use of toxic chemicals like chromic acid. Lastly, their approach is entirely compatible with existing fused filament fabrication 3D printers.

Umezu believes that metal-plastic hybrid 3D printing could become very relevant in the near future considering its potential use in 3D electronics, which is the focus of upcoming Internet-of-Things and artificial intelligence applications. In this regard, he adds: "Our hybrid 3D printing method has opened up the possibility of fabricating 3D electronics so that devices and robots used in healthcare and nursing care could become significantly better than what we have today."

This study hopefully paves the way for hybrid 3D printing technology that will enable us to get the best of both worlds--metal and plastic combined.

Credit: 
Waseda University

Mapping the chaos of movement

video: C. elegans are just a curve, which is relatively easy to describe mathematically.

Image: 
Dr. Tosif Ahamed / OIST

The behavior of living organisms might obey the same mathematical laws as physical phenomena, such as weather and the motion of planets, says new research from the Biological Physics Theory Unit at the Okinawa Institute of Science and Technology Graduate University (OIST).

Physics has a history of successfully predicting and modeling motion across vastly different scales, from molecules to colliding black holes. But when it comes to the behavior of living organisms, the concept is still very new. Recent OIST PhD graduate, Dr. Tosif Ahamed, is therefore part of a group of scientists that are pioneers in this field. His research, published in Nature Physics, used a species of a tiny worm, Caenorhabditis elegans, to propose a framework for capturing the mathematical structure underlying moving animals.

"Neuroscience tends to focus on what goes on inside the brain," Dr. Ahamed said. "But this is often expressed through an animal's movement and behavior. Therefore, understanding their behavior gives us a window into their brains. Recently, there has been an explosion of technology that can record the behavior of animals in high resolution."

Professor Greg Stephens, who leads the OIST Unit, added to this, "Remarkable technological progress has enabled new, precision measurements of living systems on all scales, from molecules of DNA to brain cells, to entire organisms. But we currently lack a fundamental framework for understanding the dynamics of these systems and the sequences of measurements over time. Our work reported here will help change that."

C. elegans have been an important species for many groundbreaking projects in biology and neuroscience but it's their simplicity that made them ideal for this study. As Dr. Ahamed explained, mathematically speaking, the shape of worms on a 2D plate is simply a curve, which is relatively easy to describe.

The research team, which included Dr. Antonio Costa from Vrije Universiteit Amsterdam, used high-video recordings of the worm, and converted the shape in each frame to a set of numbers. To do this, they divided the worm into 100 points and measured the tangent angles at these points. The researchers had previously found that a worm's posture could be represented by just four stereotyped shapes, for which they dubbed 'eigenworms'. Essentially, by mixing these eigenworms in different amounts, anyone can draw what a worm looks like at a given instant.

But in this study the researchers looked deeper. Instead of drawing the worm at a single instant, they sought to 'draw' the dynamics of its behavior, essentially to find the structure in a sequence of worm shapes.

The pendulum analogy

Show someone the instantaneous point of a swinging pendulum and they'll be able to imagine what it looks like at that moment in time, but this tells them nothing about what the pendulum is doing. But show someone the current point and an additional point at a previous time, and they'll know everything about both what the pendulum is doing now and what it will do in the future.

The research group took a similar approach when studying the animals, but this was much trickier than with the pendulum. At first the researchers had to develop a new metric of predictability. This measures the duration for which a system's future could be predicted better than just random guessing. They then collected shape sequences and used them to define a worm's current state. The researchers found seven stereotyped shape sequences, all of which were remarkably interpretable.

However, unlike with the pendulum, the researchers couldn't predict the worm's behavior indefinitely. "It's like with the weather," Dr. Ahamed said. "We're at a point where we can predict the weather with a high level of certainty for today and tomorrow but after that it becomes quite random. If I know what a worm is doing now, then I can tell you quite confidently what it's going to do in the next instant. But once we get to two or three seconds later, it becomes more difficult."

Dr. Ahamed wanted to explore why movement so unpredictable. Further analysis of their data hinted that chaotic dynamics might play a role.

Chaotic dynamics refers to systems in which small uncertainties in measurements can make long-term predictions impossible. This can happen even when a system isn't influenced by random fluctuations.

A classic example of this is a double pendulum. Even if multiple double pendulums are started from roughly the same position, the pendulums will make very different motions after a short period of time.

The research group explored these ideas with the worms. They found that if two worms start with similar behaviors, they will continue to act similarly for a short time (around one second) before their behavior diverges. Remarkably, the time it takes for this divergence to occur is determined by a mathematical quantity, which is a fundamental measure of predictability in chaotic systems.

They also looked at the motion through a more geometry-based lens, by mapping all the points that a worm had been to form a shape. Surprisingly, their results showed that the mathematical structure underlying worm behavior is closely related to one which governs energy-conserving phenomena. This was unexpected as worms, like all biological systems, lose energy through environmental friction and muscle use.

"We never expected to find this structure underlying the behavior" explained Dr. Ahamed. "It was definitely the most surprising part of this research."

Although this study looked specifically at C. elegans, the framework developed should be usable across the biological world.

"People generally don't think that living organisms can be mathematically modeled," said Dr. Ahamed. "But there's a finite number of movements that any animal can make and there's a measurable probability that they'll make certain movements over others. We're now at the stage where we can find mathematical frameworks. Next, we'll develop equations and models to explain these frameworks."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

First rehoming of laboratory dogs in Finland successful but required a great deal of work

image: Researchers at the Faculty of Veterinary Medicine of the University of Helsinki monitored the success of rehoming 16 laboratory beagles in 2015-2018.

Image: 
Andreas Arbelaez/Unsplash

The rehoming of laboratory dogs was the first of its kind in Finland. The rehoming process was started with months of practising basic pet dog skills with the dogs and by familiarising them with the world outside the laboratory.

The practice period lasted from four to six months, depending on the dog.

"However, we found out that the socialisation time was not quite sufficient for all dogs; owners reported that some dogs continued to be timid and suffer from separation anxiety. The laboratory dog rehoming process would be smoother if in the future laboratory dog facilities separated out the defaecation and rest areas, gave dogs access to an outside area and walked them outside on a leash," says Docent Marianna Norring from the Faculty of Veterinary Medicine at the University of Helsinki.

The dogs had been living in packs of eight dogs for two to eight years in the University's laboratory animal facilities, from where they had daily access to an enclosed outside space. They spent the nights in smaller groups of dogs.

At the University, the dogs had participated in both animal cognition and veterinary medical studies. The cognition research provided basic information on canine minds, and a new tranquillising agent suitable for dogs was developed in the veterinary medical study. The University of Helsinki does not currently have laboratory dogs.

The rehoming of laboratory dogs was implemented as a collaboration between SEY Animal Welfare Finland and the University of Helsinki. A large group of individuals participated in socialising the dogs and acquainting them with life outside the facility: animal caretakers, researchers, animal-rights campaigners and dog trainers. The aim was to take into account the individual characteristics of each dog when searching for a new home for them. Whenever possible, dogs were rehomed in pairs. Generally speaking, the new owners have been extremely happy about their new pets.

Credit: 
University of Helsinki

Hunger encourages risk-taking

image: The risk of being caught by a predator is one of the risks that wild animals face when searching for food. Here a shoal of fish meets a blacktip reef shark.

Image: 
Photo: Oliver Krüger

The lives of animals in the wild are full of risky situations with uncertain outcomes. Whether they are exploring new habitats in unfamiliar terrain or searching for new food sources, they run the risk of being caught and killed by a predator. In many instances, their very survival depends on a single decision. Whether an animal decides to take a risk or prefers to avoid danger varies greatly from one individual to another.

"Just as there are humans who are more cautious and others who take more risks, among animals of a particular species there are also individuals that are more or less risk-averse," says population ecologist Prof. Holger Schielzeth of the University of Jena. These differences are to some degree innate, but to a considerable extent they also depend on an individual's development. As Prof. Schielzeth, his Bielefeld colleague Prof. Klaus Reinhold and their research teams have now shown in an extensive meta-analysis, an animal's risk appetite is decisively influenced by the nutritional conditions it experiences while growing up. The researchers report on their findings in the latest issue of the specialist journal Biological Reviews.

Study results involving over 100 animal species compared

The researchers, working with lead author Nicholas Moran, analysed more than 120 experimental studies involving over 100 animal species and the results. Species studied included spiders, insects, crustaceans, fish, amphibians and birds. Common to all the studies was the fact that the animals had experienced phases of good and bad nutrition, and that their risk appetite was measured later in life. There were two opposing hypotheses: "On the one hand, one could assume that animals that have always enjoyed good circumstances and are therefore in a better condition would have more to lose and would therefore be more risk-averse," says evolutionary biologist Reinhold. On the other hand, he adds, a better nutritional status could mean that an animal would escape more easily from a risky situation, and would therefore be more likely to take a risk.

The analysis of the results of all the studies has now made things clear. An insufficient food supply causes animals to engage in higher-risk behaviour: the willingness to take risks rises by an average of 26 per cent in animals that have experienced hunger earlier in their lives.

"We were surprised that this result was so clear and unambiguous," says Schielzeth. The correlation applied to virtually all the behavioural contexts studied, such as exploration behaviour, migration and risky searches for food. There were of course variations in the strength of the effect. Nevertheless, Schielzeth assumes that this correlation could also exist in humans, at least to some extent, as we are, after all, also an "animal species".

This meta-analysis was carried out within the framework of the Collaborative Research Centre Transregio 212, "A Novel Synthesis of Individualisation across Behaviour, Ecology and Evolution: Niche Choice, Niche Conformance, Niche Construction" (NC3), which is based at the universities of Bielefeld and Münster, and in which the University of Jena is also involved. Dr Nicholas Moran is currently an MSCA Research Fellow at the National Institute of Aquatic Resources, Technical University of Denmark.

Credit: 
Friedrich-Schiller-Universitaet Jena

Caring for others is a key driver in getting people to use chatbots for mental health

A new study from North Carolina State University and Syracuse University assessed what would motivate people to use chatbots for mental health services in the wake of a mass shooting. The researchers found that users' desire to help others with mental health problems was a more powerful driver than seeking help for their own problems.

"We saw a sharp increase in mass shootings in the U.S. in recent years, and that can cause increases in the need for mental health services," says Yang Cheng, first author of the study and an assistant professor of communication at NC State. "And automated online chatbots are an increasingly common tool for providing mental health services - such as providing information or an online version of talk therapy. But there has been little work done on the use of chatbots to provide mental health services in the wake of a mass shooting. We wanted to begin exploring this area, and started with an assessment of what variables would encourage people to use chatbots under those circumstances."

The researchers conducted a survey of 1,114 U.S. adults who had used chatbots to seek mental health services at some point prior to the study. Study participants were given a scenario in which there had been a mass shooting, and were then asked a series of questions pertaining to the use of chatbots to seek mental health services in the wake of the shooting. The survey was nationally representative and the researchers controlled for whether study participants had personal experiences with mass shootings.

The researchers found a number of variables that were important in driving people to chatbots to address their own mental health needs. For example, people liked the fact that chatbots were fast and easy to access, and they thought chatbots would be good sources of information. The study also found that people felt it was important for chatbots to be humanlike, because they would want the chatbots to provide emotional support.

But researchers were surprised to learn that a bigger reason for people to use chatbots was to help other people who were struggling with mental health issues.

"We found that the motivation of helping others was twice as powerful as the motivation of helping yourself," Cheng says.

Helping others, in this context, would include talking to a chatbot in order to help a loved one experiencing mental illness from getting worse; finding ways to encourage the loved one to access the chatbot services; or to demonstrate to the loved one that the services are easy to use.

"Our study offers detailed insights into what is driving people to access mental health information on chatbot platforms after a disaster, as well as how they are using that information," Cheng says. "Among other applications, these findings should be valuable for the programmers and mental healthcare providers who are responsible for developing and deploying these chatbots."

Credit: 
North Carolina State University

The Lancet: Lopinavir-ritonavir is not an effective treatment for patients hospitalised with COVID-19

Findings from the RECOVERY trial do not support use of lopinavir-ritonavir to treat patients admitted to hospital with COVID-19

Lopinavir-ritonavir treatment does not significantly reduce deaths, length of hospital stay, or the risk of needing to be placed on a ventilator

The authors recommend that clinical guidelines be updated based on findings from the RECOVERY trial

The drug combination lopinavir-ritonavir is not an effective treatment for patients admitted to hospital with COVID-19, according to the results of a randomised controlled trial published in The Lancet.

Many clinical care guidelines have recommended lopinavir-ritonavir - an antiviral medication approved to treat HIV/AIDS - for the treatment of patients hospitalised with COVID-19. However, these guidelines should now be updated, authors say.

The Randomised Evaluation of COVid-19 thERapY (RECOVERY) trial, under way at 176 UK hospitals, is the first large-scale randomised clinical trial to report the effects of lopinavir-ritonavir in patients admitted to hospital with COVID-19.

Professor Martin Landray from the Nuffield Department of Population Health at the University of Oxford, UK, who co-leads the RECOVERY trial, said: "Treatment of COVID-19 with the drug combination lopinavir-ritonavir has been recommended in many countries. However, results from this trial show that it is not an effective treatment for patients admitted to hospital with COVID-19." [1]

"Since our preliminary results were made public on June 29, 2020, the World Health Organization has halted lopinavir-ritonavir treatment groups involved in its SOLIDARITY trial and reported that their interim results are in line with those presented here." [1]

Between March 19 and June 29, 2020, 1,616 patients in the RECOVERY trial were randomised to receive lopinavir-ritonavir while 3,424 patients received usual care alone. Those on lopinavir-ritonavir received 400 mg of lopinavir and 100 mg of ritonavir by mouth every 12 hours for 10 days or until discharge, if sooner. The primary outcome was 28-day all-cause mortality.

Findings from the trial indicate that using lopinavir-ritonavir to treat patients hospitalised with COVID-19 does not reduce deaths within 28 days of treatment beginning: 23% (374/1,616) of patients who received lopinavir-ritonavir and 22% (767/3,424) of those allocated to usual care died within 28 days.
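As a quick sanity check, the quoted percentages follow directly from the raw counts; a back-of-the-envelope calculation (not the trial's formal statistical analysis, which adjusts for other factors) looks like this:

```python
# Hypothetical recomputation of the 28-day mortality figures reported above.
deaths_lpv, n_lpv = 374, 1616      # lopinavir-ritonavir group
deaths_usual, n_usual = 767, 3424  # usual-care group

rate_lpv = deaths_lpv / n_lpv        # ~0.231 -> reported as 23%
rate_usual = deaths_usual / n_usual  # ~0.224 -> reported as 22%

# A crude risk ratio near 1 is consistent with "no mortality benefit".
risk_ratio = rate_lpv / rate_usual

print(f"{rate_lpv:.1%} vs {rate_usual:.1%}, crude risk ratio {risk_ratio:.2f}")
```

The point is only that the headline percentages are arithmetic on the raw counts; the trial's published rate ratio comes from a proper survival analysis.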

Professor Peter Horby, from the Nuffield Department of Medicine at the University of Oxford, UK, co-Chief Investigator of the RECOVERY trial, said: "The result from the RECOVERY trial is clear. When combined with findings from an earlier, smaller trial and with the WHO interim results, this provides strong evidence that lopinavir-ritonavir is not an effective treatment for patients hospitalised with COVID-19." [1]

"Whilst it is disappointing that there was no significant benefit from lopinavir-ritonavir for patients in hospital, these findings have allowed us to focus our efforts on other promising treatments, and have informed the way in which individual patients are treated." [1]

The authors also found that lopinavir-ritonavir did not reduce the length of patients' hospital stay, with 69% (1,113/1,616 patients) in the lopinavir-ritonavir group leaving hospital within 28 days, compared with 70% (2,382/3,424 patients) of those receiving usual care. Both groups had a median stay of 11 days.

No significant difference was observed in the risk of needing to be placed on a ventilator, with 10% (152/1,556 patients) of those in the lopinavir-ritonavir group needing ventilation, compared with 9% (279/3,280 patients) in those receiving usual care.

Results were consistent across all patient subgroups - including age, sex and ethnicity - with no evidence of any benefit from lopinavir-ritonavir treatment.

The authors note that few patients who had undergone intubation - insertion of a tube into the airway to aid breathing using a ventilator - took part in the trial due to difficulties in giving the treatment to patients who could not swallow, and so it is not possible to draw conclusions about the effectiveness of lopinavir-ritonavir for mechanically ventilated patients.

Results from the large-scale RECOVERY clinical trial - combined with findings from an earlier, smaller trial and with the WHO interim results - provide strong evidence that lopinavir-ritonavir is not an effective treatment for patients hospitalised with COVID-19, and clinical care guidelines should be updated accordingly.

Writing in a linked Comment, lead authors Bin Cao and Frederick G Hayden (who were not involved in the study), from the China-Japan Friendship Hospital and Capital Medical University, China, and the University of Virginia School of Medicine, USA, say: "Compared with the first randomised trial to investigate lopinavir-ritonavir in patients with COVID-19 by Cao and colleagues (including the authors of this Comment), the size of the lopinavir-ritonavir group in the RECOVERY trial was much larger and hence provides a more solid evidence base regarding possible lopinavir-ritonavir treatment effects. [...] The findings of these two open-label studies support each other and conclude that lopinavir-ritonavir is not effective in improving outcomes for patients admitted to hospital with COVID-19."

Credit: 
The Lancet

New tools improve care for cancers that spread to the brain

image: "The proposed set of measurements serves as the framework to gauge the performance of multidisciplinary programs, with the goal of providing optimal consistent and coordinated care to patients with brain metastasis," said Camilo E. Fadul, MD, a neuro-oncologist at UVA Health and UVA Cancer Center.

Image: 
UVA Health

Ambitious efforts at UVA Cancer Center to improve care delivered to patients with cancer that has spread to the brain have yielded important insights and tools that can benefit other hospitals, a new publication reports.

The tools include the first set of metrics to assess care provided for these secondary tumors, known as brain metastasis. The UVA team says its findings will help doctors and patients make better-informed treatment decisions, enhance the care of brain metastases, and enable hospitals to improve the coordination and effectiveness of their interdisciplinary treatment programs.

"The proposed set of measurements serves as the framework to gauge the performance of multidisciplinary programs, with the goal of providing optimal consistent and coordinated care to patients with brain metastasis," said Camilo E. Fadul, MD, a neuro-oncologist at UVA Health and UVA Cancer Center.

Quality Care for Brain Metastases

Brain metastases are the most common tumors affecting the central nervous system. Although any type of cancer can spread to the brain, the most frequent are lung cancer, breast cancer and melanoma.

As patients with cancer are living longer, the frequency of brain metastasis is on the rise. So doctors are seeking to personalize treatment from a menu of options that include surgery, radiation (either targeted stereotactic radiosurgery or whole brain radiation), chemotherapy, targeted oral therapies and immunotherapy. Because of the complexity of treatment decisions and the need to provide safe and consistent patient-centered care, UVA takes an interdisciplinary approach, which brings together providers with expertise in radiation oncology, neuro-oncology, medical oncology, neurosurgery and palliative care.

While developing this program, the UVA group led by Dr. Fadul identified relevant outcome measurements to inform treatment decisions, evaluate the care process and implement interventions that will improve value. They tested the measurements by looking at hundreds of patients' data to identify strategies that will improve outcomes, and they reviewed every detail of UVA's care for patients, beginning with their brain metastasis diagnosis and ending with the patients' discharge.

They identified relevant outcome measurements for patients receiving brain metastasis-directed treatment by looking at survival - specifically, survival more than 90 days after the important milestones of diagnosis, surgery, stereotactic radiosurgery and whole-brain radiation therapy. Other measurements examined processes that may translate into better outcomes, such as whether patients had advance directives outlining the type of care they wish to receive documented in their electronic medical record, whether they had a palliative care consultation, and whether the drug memantine, which may protect cognitive function from possible radiation injury, was prescribed to patients receiving whole-brain radiation therapy.

"The proposed set of performance measurements serves as the foundation for improving healthcare delivery, for characterizing the structure, processes and functions of the optimal interdisciplinary teams, and for filling the gap associated with disparities in the quality of the care and outcomes after diagnosis of brain metastases," Dr. Fadul said.

He noted the measurements need to incorporate the input and perspective from patients and caregivers -- what are called "patient-reported outcomes." In addition, the metrics have to be refined and validated by other experts in the field. Several projects are already underway to build upon these results and further improve the care provided to patients with brain metastasis at UVA.

"Identification of areas for quality improvement, sharing among hospitals of data and strategies that improve outcomes, and development of consistent treatment guidelines based on continuous performance measurements, like the ones we proposed, will enhance the quality and value of the care provided to the growing population of patients afflicted with brain metastasis," Dr. Fadul said.

Credit: 
University of Virginia Health System

Study identifies characteristics of infused CAR T cells associated with efficacy and toxicity in patients with large B-cell lymphoma

HOUSTON -- Researchers at The University of Texas MD Anderson Cancer Center have identified molecular and cellular characteristics of anti-CD19 CAR T cell infusion products associated with how patients with large B-cell lymphoma (LBCL) respond to treatment and develop side effects.

The research team also found that early changes in circulating tumor DNA one week after CAR T cell therapy may be predictive of treatment response in a particular patient. The paper was published online today in Nature Medicine.

"CAR T cell therapy is highly effective against LBCL," said corresponding author Michael Green, Ph.D., associate professor of Lymphoma and Myeloma. "However, we experience two main clinical challenges: achieving long-term remission and managing treatment-associated adverse events."

This study suggests that, within the first week of therapy, clinicians may be able to identify a subset of patients who may experience poorer outcomes or adverse treatment reactions, said Green. This would allow the care team to adjust therapy to improve efficacy or to act to mitigate toxicity.

CAR T cell signature, early molecular response may predict long-term outcomes

For this study, researchers performed single-cell analysis on CAR T cells to study gene expression profiles in the infused cells. CAR T cells were collected from those remaining in infusion bags following treatment of 24 patients with LBCL. These genetic profiles were compared to treatment responses, determined at three months post-infusion by PET/CT scan.

"When we looked at the characteristics of the infused CAR T cells, we found that samples from patients who were less responsive to treatment had exhausted T cells, whereas those who experienced complete responses had T cells expressing 'memory' signatures," said co-corresponding author Sattva Neelapu, M.D., professor of Lymphoma and Myeloma. "Additionally, one cellular signature of T cell exhaustion was more commonly found in patients who exhibited a poor molecular response, and poor molecular response is generally associated with less-positive, long-term outcomes."

Further, the researchers analyzed early molecular responses in the patients by monitoring changes in circulating tumor DNA from treatment to one week post-infusion. The magnitude of change in tumor-associated DNA corresponded with response, suggesting that patients who displayed an early molecular response were more likely to experience a clinical response to treatment.

CAR T cell features predict likelihood of severe side effects

Adverse side effects of CAR T cell therapy can include cytokine release syndrome and immune effector cell-associated neurotoxicity syndrome (ICANS). These adverse events can delay patients' recovery and can lead to increased need for hospitalization and intensive care.

"When we examined the infusion product, we found that a cell population with characteristics similar to myeloid cells, with a monocyte-like transcriptional signature, was associated with development of high-grade neurotoxicity," said Green. "Detecting these cells may subsequently lead us to identify patients who would be at higher risk of developing neurotoxicity, allowing us to provide prophylactic treatment with agents that target the specific cellular features."

Further examination may lead to insights into the types and attributes of the cells present within the CAR T infusion product.

"This study also tells us that some rare and unexpected cells identified by single-cell analysis could be biologically important," said co-corresponding author Linghua Wang, M.D., assistant professor of Genomic Medicine. "Going forward, we plan to functionally characterize these monocyte-like cells to better understand their specific biological mechanisms driving these clinical results."

These findings will help researchers develop clinical interventions that can block or target these cells. They also plan to validate the capacity of circulating tumor DNA to accurately predict patients' long-term outcomes.

Credit: 
University of Texas M. D. Anderson Cancer Center

Women and men executives have differing perceptions of healthcare workplaces according to a survey report in the Journal of Healthcare Management

October 5, 2020 - Healthcare organizations that can attract and retain talented women executives have the advantage over their peers, finds a special report in the September/October issue of the Journal of Healthcare Management, an official publication of the American College of Healthcare Executives (ACHE). The report presents results from a 2018 survey, the sixth in a series, conducted by the ACHE of its members to compare the career attainments, attitudes, and workplace experiences of men and women healthcare executives.

Data from the survey shows that women with five to 20 years of experience in the healthcare field are significantly less likely than their male peers to see employers as gender-neutral in hiring, promotion, evaluation, and compensation. This special report on the data finds little to no improvement in these outcomes since the ACHE began measuring them in 1990 with the first survey in the series.

Credit: 
Wolters Kluwer Health

NIST innovation could improve detection of COVID-19 infections

A multidisciplinary research team at the National Institute of Standards and Technology (NIST) has developed a way to increase the sensitivity of the primary test used to detect the SARS-CoV-2 virus, which causes COVID-19. Applying their findings to computerized test equipment could improve our ability to identify people who are infected but do not exhibit symptoms.

The team's results, published in the scientific journal Analytical and Bioanalytical Chemistry, describe a mathematical technique for perceiving comparatively faint signals in diagnostic test data that indicate the presence of the virus. These signals can escape detection when the number of viral particles found in a patient's nasal swab test sample is low. The team's method helps a modest signal stand out more clearly.

"Applying our technique could make the swab test up to 10 times more sensitive," said Paul Patrone, a NIST physicist and a co-author on the team's paper. "It could potentially spot more people who are carrying the virus but whose viral count is too low for the current test to give a positive result."

The researchers' findings show that the data from a positive test, when expressed in graphical form, take on a recognizable shape that is always the same. Just as a fingerprint identifies a person, the shape is unique to this type of test. Only the shape's position and, importantly, its size differ when graphed, varying with the quantity of viral particles in the sample.

While it was known previously that the shape's position could vary, the team learned that its size can vary as well. Reprogramming test equipment to recognize this shape, regardless of size or location, is the key to improving test sensitivity.

The swab test employs a lab technique called quantitative polymerase chain reaction, or qPCR, to detect the genetic material carried by the SARS-CoV-2 virus. The qPCR technique takes any strands of viral RNA that exist in a patient's swab sample and then multiplies them into a far larger quantity of genetic material. Each time a new fragment of this material is made, the reaction releases a fluorescent marker that glows when exposed to light. It is this fluorescence that indicates the presence of the virus.

While the test method usually works well in practice, it can lack sensitivity to low viral particle counts. The test starts with the genetic material that is present and doubles it, then doubles it again, up to 40 times over, so that the fluorescent markers generate enough light to trigger a detector. Doubling, as anyone familiar with compound interest knows, is a powerful amplifier, growing slowly at first and then spiking to high numbers. The doublings produce a graph that is initially flat apart from bumps of systemic background noise, until eventually a telltale spike rises from it.
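The amplify-and-threshold mechanism described above can be sketched in a few lines. This is a toy model, not qPCR chemistry or NIST's code; the starting copy number, 40-cycle limit, threshold, and noise level are all illustrative assumptions:

```python
import random

random.seed(0)

def qpcr_curve(initial_copies, cycles=40, noise=5e6):
    """Toy qPCR signal: material roughly doubles each cycle,
    sitting on top of a flat background-noise floor."""
    copies = initial_copies
    curve = []
    for _ in range(cycles):
        copies *= 2  # each PCR cycle roughly doubles the genetic material
        curve.append(copies + random.uniform(0, noise))  # add noise floor
    return curve

THRESHOLD = 1e9  # arbitrary detection threshold for this sketch

curve = qpcr_curve(initial_copies=10)
# With 10 starting copies, 10 * 2**c first exceeds 1e9 at cycle 27,
# well inside the 40-cycle budget; with far fewer starting copies,
# the spike may never clear the threshold at all.
crossed = [i for i, s in enumerate(curve, start=1) if s > THRESHOLD]
print("first cycle above threshold:", crossed[0] if crossed else None)
```

The sketch shows why doubling is forgiving for moderate viral loads but can strand very low ones below the threshold, which is the failure mode the NIST method targets.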

However, when the initial viral count is low, there may be false starts in the first few cycles. In these cases, even 40 doublings may not build a spike tall enough -- or a fluorescence bright enough -- to rise above the detection threshold. This issue can cause problems like inconclusive tests or "false negatives," meaning a person carries the virus but the test does not reveal it.

Preliminary studies indicate that the rate of false negatives may be as high as 30% in qPCR testing for COVID-19, including one study in which chest CT scans indicated positive cases where swab tests had not. Another study shows that asymptomatic and early-disease states are associated with up to 60 times fewer virus particles in patient samples. A JAMA study published in August supports the idea that asymptomatic carriers can spread the virus.

The NIST researchers found that the shape of a positive test graph -- a flat, noisy beginning followed by a spike -- is found even in data that currently does not trigger a positive test result. Their paper offers a formal proof that the shapes are mathematically "similar," akin to triangles that have the same angles and proportions despite being larger or smaller than one another. They apply this theoretical evidence in a routine that a computer can use to recognize the reference shape in the data.
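The shape-recognition idea can be illustrated with a simple stand-in for the paper's method. Here a logistic curve plays the role of the reference shape, and a brute-force grid search over shift and amplitude replaces the authors' formal similarity analysis; all names and numbers are illustrative assumptions:

```python
import math

def reference_shape(t, t0=0.0, width=2.0):
    """Idealized amplification shape: flat baseline, then a spike (logistic)."""
    return 1.0 / (1.0 + math.exp(-(t - t0) / width))

def best_fit_residual(curve, shifts, scales):
    """Grid-search shift/scale of the reference shape; return the smallest
    mean-squared residual. A small residual means the curve 'has the
    right shape' even if its spike is faint."""
    n = len(curve)
    best = float("inf")
    for t0 in shifts:
        for a in scales:
            ref = [a * reference_shape(i, t0=t0) for i in range(n)]
            mse = sum((c - r) ** 2 for c, r in zip(curve, ref)) / n
            best = min(best, mse)
    return best

# A faint but correctly shaped curve matches the reference far better
# than flat noise of comparable amplitude.
faint_positive = [0.05 * reference_shape(i, t0=30) for i in range(40)]
flat_noise = [0.025] * 40

shifts = range(10, 40)
scales = [0.01 * k for k in range(1, 11)]
r_pos = best_fit_residual(faint_positive, shifts, scales)
r_neg = best_fit_residual(flat_noise, shifts, scales)
print(f"faint positive residual: {r_pos:.2e}, flat noise residual: {r_neg:.2e}")
```

The design point matches the quote that follows: the decision is based on whether the curve fits the reference shape, not on whether it crosses an absolute height.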

"We're no longer constrained by having to pass a high detection threshold," Patrone said. "The spikes don't need to be large. They need to have the right shape."

Incorporating their findings into tests would immediately help the pandemic response, Patrone said, as it would help determine the number of asymptomatic and presymptomatic cases more accurately.

"In essence, lowering false negatives should help doctors and scientists get a better handle on the actual spread of the virus," he said. "There is a good chance that we're missing asymptomatic cases with the testing. The reduction we project in the number of viral RNA copies needed for detection could pick up a significant number of asymptomatic cases."

The new test also would be unlikely to generate false positives because it would check that the curve was consistent with a reference shape, not merely that it crossed a detection threshold.

"In standard testing protocols, it is possible to get false positives -- for example, if background effects rise to the detection threshold and no one manually checks the result," Patrone said. "The likelihood of that happening in our analysis is very small because the math automatically rules out such signals."

Pandemic response workers would not need to do anything differently when collecting samples. Because the team's approach uses a mathematical algorithm applied after data collection, programmers could apply it by updating the lab equipment software with a few lines of computer code.

"Our work is a potentially easy fix because it's an advance in the data analysis," Patrone said. "It can easily be incorporated into the protocol of any lab or testing instrument, so it could have an immediate impact on the trajectory of the health crisis."

Credit: 
National Institute of Standards and Technology (NIST)

6,500-year-old copper workshop uncovered in the Negev Desert's Beer Sheva

image: Work on the dig in Beer Sheva.

Image: 
Anat Rasiuk, Israel Antiquities Authority

A new study by Tel Aviv University and the Israel Antiquities Authority indicates that a workshop for smelting copper ore once operated in the Neveh Noy neighborhood of Beer Sheva, the capital of the Negev Desert. The study, conducted over several years, began in 2017 in Beer Sheva when the workshop was first uncovered during an Israel Antiquities Authority emergency archeological excavation to safeguard threatened antiquities.

The new study also shows that the site may have made the first use in the world of a revolutionary apparatus: the furnace.

The study was conducted by Prof. Erez Ben-Yosef, Dana Ackerfeld, and Omri Yagel of the Jacob M. Alkow Department of Archeology and Ancient Near Eastern Civilizations at Tel Aviv University, in conjunction with Dr. Yael Abadi-Reiss, Talia Abulafia, and Dmitry Yegorov of the Israel Antiquities Authority and Dr. Yehudit Harlavan of the Geological Survey of Israel. The results of the study were published online on September 25, 2020, in the Journal of Archaeological Science: Reports.

According to Ms. Abulafia, Director of the excavation on behalf of the Israel Antiquities Authority, "The excavation revealed evidence for domestic production from the Chalcolithic period, about 6,500 years ago. The surprising finds include a small workshop for smelting copper with shards of a furnace -- a small installation made of clay in which copper ore was smelted -- as well as a lot of copper slag."

Although metalworking was already in evidence in the Chalcolithic period, the tools used were still made of stone. (The word "chalcolithic" itself is a combination of the Greek words for "copper" and "stone.") An analysis of the isotopes of ore remnants in the furnace shards shows that the raw ore was brought to the Neveh Noy neighborhood from Wadi Faynan, located in present-day Jordan, a distance of more than 100 kilometers from Beer Sheva.

During the Chalcolithic period, when copper was first refined, smelting was carried out far from the mines, in contrast to the prevalent historical model in which furnaces were built near the mines for both practical and economic reasons. The scientists hypothesize that the reason was to preserve the technological secret.

"It's important to understand that the refining of copper was the high-tech of that period. There was no technology more sophisticated than that in the whole of the ancient world," Prof. Ben-Yosef says. "Tossing lumps of ore into a fire will get you nowhere. You need certain knowledge for building special furnaces that can reach very high temperatures while maintaining low levels of oxygen."

Prof. Ben-Yosef notes that the archeology of the land of Israel shows evidence of the Ghassulian culture. The culture was named for Tulaylât al-Ghassûl, the archeological site in Jordan where the culture was first identified. This culture, which spanned the region from the Beer Sheva Valley to present-day southern Lebanon, was unusual for its artistic achievements and ritual objects, as evidenced by the copper objects discovered at Nahal Mishmar and now on display at the Israel Museum in Jerusalem.

According to Prof. Ben-Yosef, the people who lived in the area of the copper mines traded with members of the Ghassulian culture from Beer Sheva and sold them the ore, but they were themselves incapable of reproducing the technology. Even among the Ghassulian settlements along Wadi Beer Sheva, copper was refined by experts in special workshops. A chemical analysis of remnants indicates that every workshop had its own special "recipe" which it did not share with its competitors. It would seem that, in that period, Wadi Beer Sheva was filled with water year-round, making the location convenient for smelting copper, since the furnaces and other apparatus were made of clay.

Prof. Ben-Yosef further notes that, even within Chalcolithic settlements that possessed both stone and copper implements, the secret of the gleaming metal was held by the very few members of an elite. "At the beginning of the metallurgical revolution, the secret of metalworking was kept by guilds of experts. All over the world, we see metalworkers' quarters within Chalcolithic settlements, like the neighborhood we found in Beer Sheva."

The study discusses the question of the extent to which this society was hierarchical or socially stratified, as society was not yet urbanized. The scientists feel that the findings from Neveh Noy strengthen the hypothesis of social stratification. Society seems to have consisted of a clearly defined elite possessing expertise and professional secrets, which preserved its power by being the exclusive source for the shiny copper. The copper objects were not made to be used, instead serving some ritual purpose and thus possessing symbolic value. The copper axe, for example, wasn't used as an axe. It was an artistic and/or cultic object modeled along the lines of a stone axe. The copper objects were probably used in rituals while the everyday objects in use continued to be of stone.

"At the first stage of humankind's copper production, crucibles rather than furnaces were used," says Prof. Ben-Yosef. "This small pottery vessel, which looks like a flower pot, is made of clay. It was a type of charcoal-based mobile furnace. Here, at the Neveh Noy workshop that the Israel Antiquities Authority uncovered, we show that the technology was based on real furnaces. This provides very early evidence for the use of furnaces in metallurgy and it raises the possibility that the furnace was invented in this region.

"It's also possible that the furnace was invented elsewhere, directly from crucible-based metallurgy, because some scientists view early furnaces as no more than large crucibles buried in the ground," Prof. Ben-Yosef continues. "The debate will only be settled by future discoveries, but there is no doubt that ancient Beer Sheva played an important role in advancing the global metal revolution and that in the fifth millennium BCE the city was a technological powerhouse for this whole region."

Credit: 
American Friends of Tel Aviv University

Gemini South's high-def version of 'A Star is Born'

image: Two near-infrared composite images showing a 33 trillion-mile section of the Western Wall, a cloud of gas and dust in a star-forming region of the Carina Nebula. Each image was taken by Rice University astronomer Patrick Hartigan and colleagues from telescopes at the National Science Foundation's NOIRLab observatory in Chile and shows hydrogen molecules at the cloud's surface (red) and hydrogen atoms evaporating from the surface (green). The left-hand image was taken with the four-meter Blanco telescope's Wide-Field Infrared Imager in 2015. The right-hand image was taken with the 8.1-meter Gemini South telescope's wide-field adaptive optics imager in January 2018 and has about 10 times finer resolution thanks to a mirror that changes shape to correct for atmospheric distortion.

Image: 
Images courtesy of Patrick Hartigan/Rice University

HOUSTON -- (Oct. 5, 2020) -- NASA's James Webb Space Telescope is still more than a year from launching, but the Gemini South telescope in Chile has provided astronomers a glimpse of what the orbiting observatory should deliver.

Using a wide-field adaptive optics camera that corrects for distortion from Earth's atmosphere, Rice University's Patrick Hartigan and Andrea Isella and Dublin City University's Turlough Downes used the 8.1-meter telescope to capture near-infrared images of the Carina Nebula with the same resolution that's expected of the Webb Telescope.

Hartigan, Isella and Downes describe their work in a study published online this week in Astrophysical Journal Letters. Their images, gathered over 10 hours in January 2018 at the international Gemini Observatory, a program of the National Science Foundation's NOIRLab, show part of a molecular cloud about 7,500 light years from Earth. All stars, including Earth's sun, are thought to form within molecular clouds.

"The results are stunning," Hartigan said. "We see a wealth of detail never observed before along the edge of the cloud, including a long series of parallel ridges that may be produced by a magnetic field, a remarkable almost perfectly smooth sine wave and fragments at the top that appear to be in the process of being sheared off the cloud by a strong wind."

The images show a cloud of dust and gas in the Carina Nebula known as the Western Wall. The cloud's surface is slowly evaporating in the intense glow of radiation from a nearby cluster of massive young stars. The radiation causes hydrogen to glow with near-infrared light, and specially designed filters allowed the astronomers to capture separate images of hydrogen at the cloud's surface and hydrogen that was evaporating.

An additional filter captured starlight reflected from dust, and combining the images allowed Hartigan, Isella and Downes to visualize how the cloud and cluster are interacting. Hartigan has previously observed the Western Wall with other NOIRLab telescopes and said it was a prime choice to follow up with Gemini's adaptive optics system.

"This region is probably the best example in the sky of an irradiated interface," he said. "The new images of it are so much sharper than anything we've previously seen. They provide the clearest view to date of how massive young stars affect their surroundings and influence star and planet formation."

Images of star-forming regions taken from Earth are usually blurred by turbulence in the atmosphere. Placing telescopes in orbit eliminates that problem. And one of the Hubble Space Telescope's most iconic photographs, 1995's "Pillars of Creation," captured the grandeur of dust columns in a star-forming region. But the beauty of the image belied Hubble's weakness for studying molecular clouds.

"Hubble operates at optical and ultraviolet wavelengths that are blocked by dust in star-forming regions like these," Hartigan said.

Because near-infrared light penetrates the outer layers of dust in molecular clouds, near-infrared cameras like the Gemini South Adaptive Optics Imager can see what lies beneath. Unlike traditional infrared cameras, Gemini South's imager uses "a mirror that changes its shape to correct for shimmering in our atmosphere," Hartigan said. The result: photos with roughly 10 times the resolution of images taken from ground-based telescopes that don't use adaptive optics.

But the atmosphere causes more than blur. Water vapor, carbon dioxide and other atmospheric gases absorb some parts of the near-infrared spectrum before it reaches the ground.

"Many near-infrared wavelengths will only be visible from a space telescope like the Webb," Hartigan said. "But for near-infrared wavelengths that reach Earth's surface, adaptive optics can produce images as sharp as those acquired from space."

The advantages of each technique bode well for the study of star formation, he said.

"Structures like the Western Wall are going to be rich hunting grounds for both Webb and ground-based telescopes with adaptive optics like Gemini South," Hartigan said. "Each will pierce the dust shrouds and reveal new information about the birth of stars."

Credit: 
Rice University