
VISTA unveils a new image of the Large Magellanic Cloud

image: ESO's VISTA telescope reveals a remarkable image of the Large Magellanic Cloud, one of our nearest galactic neighbours. VISTA has been surveying this galaxy and its sibling the Small Magellanic Cloud, as well as their surroundings, in unprecedented detail. This survey allows astronomers to observe a large number of stars, opening up new opportunities to study stellar evolution, galactic dynamics, and variable stars.

Image: 
ESO/VMC Survey

The Large Magellanic Cloud, or LMC, is one of our nearest galactic neighbours, at only 163 000 light-years from Earth. Together with its sibling, the Small Magellanic Cloud, it is among the nearest dwarf satellite galaxies to the Milky Way. The LMC is also home to various stellar conglomerates and is an ideal laboratory for astronomers to study the processes that shape galaxies.

ESO's VISTA telescope has been observing these two galaxies for the last decade. The image presented today is the result of one of the many surveys that astronomers have performed with this telescope. The main goal of the VISTA Magellanic Clouds (VMC) Survey has been to map the star formation history of the Large and Small Magellanic Clouds, as well as their three-dimensional structures.

VISTA was key to this image because it observes the sky in near-infrared wavelengths of light. This allows it to see through clouds of dust that obscure parts of the galaxy. These clouds block a large portion of visible light but are transparent at the longer wavelengths VISTA was built to observe. As a result, many more of the individual stars populating the centre of the galaxy are clearly visible. Astronomers analysed about 10 million individual stars in the Large Magellanic Cloud in detail and determined their ages using cutting-edge stellar models[1]. They found that younger stars trace multiple spiral arms in this galaxy.

For millennia, the Magellanic Clouds have fascinated people in the Southern Hemisphere, but they were largely unknown to Europeans until the Age of Discovery. The name we use today harks back to the explorer Ferdinand Magellan, who 500 years ago began the first circumnavigation of the Earth. The records the expedition brought back to Europe revealed many places and things to Europeans for the first time. That spirit of exploration and discovery is ever more alive today in the work of astronomers around the world, including the VMC Survey team whose observations led to this stunning image of the LMC.

Credit: 
ESO

Disabled people marginalised by paperwork and programmes which aim to help them

Disabled people face being marginalised by the very programmes that are designed to help them.

Rather than taking their differences and particular preferences into account, projects and welfare systems established to provide support are normalising disabled people, and unintentionally contributing to their further marginalisation.

Research from Lancaster University Management School (LUMS), published in Organization Studies, investigated a programme that allocated computers to disabled people. Its aim was to help people improve their sociability through electronic interactions. The research focuses on the part played by an assessment form designed to establish whether or not a person qualified for a computer.

The study found that the scheme's assessors did not apply a strict interpretation of the questions and answers on the form, and sometimes ignored responses or shaped answers to better suit the programme's requirements. This allowed some of those involved to receive a computer even though they did not comply with the allocation criteria, but had the unintended effect of glossing over their views and wishes in favour of the pre-set organisational goals of promoting sociability.

Dr Yvonne Latham, of the LUMS Department of Organisation, Work and Technology, who conducted the research, observed that carers, family members and project staff applied their own views and perceptions of what was important for disabled people, while often ignoring their actual preferences.

"The assumptions of those who organised the project were that disabled people are lacking something that can be 'fixed' so as to make their lives similar to those of the able-bodied," said Dr Latham. "Forms will often have yes or no answers to questions which demand more complicated responses. Consequently, welfare workers treat issues such as whether individuals are able to wash, dress or use the toilet by themselves - capabilities that are forever changing, often on a daily basis - with limited importance as they try to render impaired bodies more predictable than is plausible.

"In our case, while the form itself had implications for disabled people, the filling in of the questions and responses, and the results thereof, were also affected by the assumptions of those carrying out the questioning. Everyone has pre-conceived ideas, and these were evident in how they would violate both the spirit and the letter of the form - often making normalising assumptions about the needs and desires of those people with whom they were speaking."

For example, among the disabled people interviewed was Ron. During his interview, Ron revealed that he did not want to use the computer for which he was being assessed to increase his social connectivity, but rather for activities such as buying and selling shares. He answered: 'I don't want to increase my social interactions because I'm miserable, like my brother'. He felt that people looking to fill out the form in a certain way were not listening to him.

The interviewer eventually decided that Ron would benefit from using the computer with internet access and would see a boost to his independence as a result; Ron was thus allocated a computer despite not fitting the organisation's prior criteria of a suitable user.

Other examples included Chloe, a 25-year-old wheelchair user, whose mother was adamant she would not let her use the computer for online shopping (one of the criteria of 'fit' for the programme), as she wanted to continue to take her out shopping. Chloe was not seen as socially isolated so much as lacking independence as a result of her mother's control over her life.

Polly, a woman in her 60s with lupus, angina and arthritis, gave the expected responses, allowing a straightforward and positive form-filling process. Polly later returned the allocated computer, citing the cost of broadband, the discomfort she experienced while trying to sit and use it, the stress it was causing her, and the fact that she had not really wanted it in the first place.

Co-author Professor David Knights added: "These examples show how the responses on the form can be shaped by the interviewer to gain the expected response, but also how the interviewee can give the responses they feel are expected, even if the result is not what they desire.

"The form and the project were designed to help overcome the marginalisation of disabled people through increasing their sociability, but these assumptions and the form's usage were reconfigured by those involved, glossing over the actual discussions that took place during the interviews and, on occasion, leading the interviewees to feel their views were being ignored.

"Imposing norms on disabled people and expecting them to fit in with preconceived ideas can have the unintended consequence of marking them out as being in need of special attention. There is a fine line in welfare between care and patronising power."

Credit: 
Lancaster University

Strategies to connect with barricaded buyers

Researchers from Clemson University and the University of Kentucky have published a new paper in the Journal of Marketing that examines several means by which suppliers can enhance their competitiveness when selling to barricaded buyers.

The study, forthcoming in the November issue of the Journal of Marketing, is titled "Selling to Barricaded Buyers" and authored by Kevin Chase and Brian Murtha.

To ensure fair and competitive purchase processes, most states and many leading organizations limit the amount of contact suppliers have with buying team members once requests for proposals (RFPs) are released. Accordingly, the researchers refer to these buyers as "barricaded buyers" because it is difficult for suppliers to communicate with them. How can suppliers effectively sell to buyers who are sheltered by these barricades?

In the typical RFP process, there are fewer restrictions placed on communicating with buyers during the development of the RFP (i.e., the "pre-RFP" phase). In this phase, buyers involve suppliers to help clarify their needs so they can develop more appropriate RFP specifications. One frequent and efficient way to involve suppliers is commonly referred to as a "pre-RFP" meeting. These meetings (some of which require attendance) bring together competing suppliers who participate in facilities walk-throughs, inspections, and clarifying Q&A sessions. Chase explains: "We discovered that while some suppliers remained silent during these meetings, other suppliers took the opportunity to engage in what we call 'peacocking' behavior. Peacocking is when a supplier signals or 'shows off' the strength of its knowledge about, or its connections to individuals within, the buying firm. Our findings indicate that doing so can substantially demotivate one's competitors from responding to the buyer's RFP."

In addition to demotivating competitors through peacocking, suppliers can demotivate competitors by embedding their unique capabilities and language into buyers' RFPs during the pre-RFP phase. To do so, shrewd suppliers often shared facilitating documentation (e.g., marketing materials or sample RFPs) that included their unique capabilities and language. Buyers, some of whom had less experience developing RFPs, appreciated the help and used suppliers' facilitating documentation to help develop RFPs of their own. Other suppliers were quick to notice the unique capabilities and language of their competitors and lamented about the biased nature of the RFPs (whether the bias was intended or not).

In the post-RFP phase (i.e., once the formal RFP is released), buyer-supplier interactions are much more restricted. A common approach to developing an RFP response was to meet requirements exactly as requested. Buyers, however, often mentioned that the RFP requirements are really just the minimum requirements. "We discovered that more competitive suppliers not only met the requirements of the RFPs, but also went above and beyond the requirements in two critical ways--by also providing innovative solutions and/or by providing solutions buyers hadn't thought of. Doing so signals to buyers a sincere effort by the supplier to claim their business and that the supplier may provide additional value going forward in the relationship," explains Murtha.

In addition to the solutions offered, the study shows that buyers lean heavily on the subtle signals suppliers convey in their RFP response documents. For instance, several suppliers were surprisingly inattentive to the poor tone conveyed in their RFP responses and some used canned responses not tailored to the RFP's buyer. Buyers quickly removed such suppliers from their decision sets. Further, buyers paid particular attention to the similarity of the references suppliers provided. Similar references signaled the requisite experience and ability to handle the buyer's account; dissimilar references either triggered fears of supplier inability to handle their account (when references were smaller or less prestigious) or fears of inattentiveness to their account (when references were larger or more prestigious).

Although price is certainly important when selling to barricaded buyers, this research suggests its importance is largely situational, ranging from zero percent (for some services) to 75 percent (for some commodity products), with the average around 30 percent. As a result, suppliers engaging in the barricaded buying process should acknowledge the important role of price, but also the importance of these other strategies as ways to enhance their competitiveness.
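To see how a situational price weight plays out in practice, the sketch below shows a simple weighted-criteria evaluation of the kind buyers commonly apply to RFP responses. The criteria, weights (using the roughly 30 percent average price weight quoted above) and supplier scores are all invented for illustration; the paper does not prescribe this scheme.

```python
# Hypothetical weighted RFP scoring: price is one criterion among several.
# Weights and per-supplier scores below are invented for illustration only.
weights = {"price": 0.30, "solution_quality": 0.40, "references": 0.30}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-100) into one weighted total."""
    return sum(weights[c] * scores[c] for c in weights)

# Supplier A submits the cheapest bid but only meets the stated requirements;
# Supplier B charges more but goes 'above and beyond' with similar references.
supplier_a = {"price": 90, "solution_quality": 60, "references": 70}
supplier_b = {"price": 70, "solution_quality": 85, "references": 90}

print(weighted_score(supplier_a))  # 72.0
print(weighted_score(supplier_b))  # 82.0
```

Under these invented weights, the pricier supplier wins, which mirrors the study's point that price matters but is rarely decisive on its own.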

Credit: 
American Marketing Association

The rare molecule weighing in on the birth of planets

image: As young planets grow within these discs they carve out gaps, leading to a structure of concentric rings.

Image: 
ESO/L. Calçada

Astronomers using one of the most advanced radio telescopes have discovered a rare molecule in the dust and gas disc around a young star - and it may provide an answer to one of the conundrums facing astronomers.

The star, named HD 163296, is located 330 light years from Earth and formed over the last six million years.

It is surrounded by a disc of dust and gas - a so-called protoplanetary disc. It is within these discs that young planets are born. Using a radio telescope in the Atacama Desert in Chile, researchers were able to detect an extremely faint signal showing the existence of a rare form of carbon monoxide - known as an isotopologue (13C17O).

The detection has allowed an international collaboration of scientists, led by the University of Leeds, to measure the mass of the gas in the disc more accurately than ever before. The results show that the disc is much heavier - or more 'massive' - than previously thought.

Alice Booth, a PhD researcher at Leeds who led the study, said: "Our new observations showed there was between two and six times more mass hiding in the disc than previous observations could measure.

"This is an important finding in terms of the birth of planetary systems in discs - if they contain more gas, then they have more building material to form more massive planets."

The study - The first detection of 13C17O in a protoplanetary disk: a robust tracer of disk gas mass - is published today (12/09/2019) in Astrophysical Journal Letters.

The scientists' conclusions are well timed. Recent observations of protoplanetary discs have perplexed astronomers because they did not seem to contain enough gas and dust to create the planets observed.

Dr John Ilee, a researcher at Leeds who was also involved in the study, added: "The disc-exoplanet mass discrepancy raises serious questions about how and when planets are formed. However, if other discs are hiding similar amounts of mass as HD 163296, then we may just have underestimated their masses until now."

"We can measure disc masses by looking at how much light is given off by molecules like carbon monoxide. If the discs are sufficiently dense, then they can block the light given off by more common forms of carbon monoxide - and that could result in scientists underestimating the mass of the gas present.

"This study has used a technique to observe the much rarer 13C17O molecule - and that's allowed us to peer deep inside the disc and find a previously hidden reservoir of gas."

The researchers made use of one of the most sophisticated radio telescopes in the world - the Atacama Large Millimetre/submillimetre Array (ALMA) - high in the Atacama Desert.

ALMA is able to observe light that is invisible to the naked eye, allowing astronomers to view what is known as the 'cold universe' - those parts of space not visible using optical telescopes.

Booth said: "Our work shows the amazing contribution that ALMA is making to our understanding of the Universe. It is helping build a more accurate picture of the physics leading to the formation of new planets. This of course then helps us understand how the Solar System and Earth came to be."

The researchers are already planning the next steps in their work.

Booth added: "We suspect that ALMA will allow us to observe this rare form of CO in many other discs. By doing that, we can more accurately measure their mass, and determine whether scientists have systematically been underestimating how much matter they contain."

Credit: 
University of Leeds

Few people with peanut allergy tolerate peanut after stopping oral immunotherapy

image: These are peanuts spilling off a plate.

Image: 
NIAID

WHAT:
Allergy to peanut, which is often severe, is one of the most common food allergies in the United States. Although previous studies have shown that peanut oral immunotherapy (OIT)--ingesting small, controlled amounts of peanut protein--can desensitize adults and children and prevent life-threatening allergic reactions, the optimal duration and dose are unknown. In a study that followed participants after OIT successfully desensitized them to peanut, discontinuing OIT or continuing it at a reduced dose led to a decline in its protective effects. The study, published online today in The Lancet, also found that several blood tests administered before OIT could predict the success of therapy. The Phase 2 study was supported by the National Institute of Allergy and Infectious Diseases (NIAID), part of the NIH, and may inform who might benefit from peanut OIT and what changes to this experimental treatment should be implemented.

Investigators at Stanford University enrolled 120 people aged 7 to 55 with diagnosed peanut allergy in the Peanut Oral Immunotherapy Study: Safety Efficacy and Discovery, or POISED. While otherwise avoiding peanut throughout the trial, 95 participants received gradually increasing daily doses of peanut protein up to 4 grams, and 25 participants received daily placebo oat flour OIT. After 24 months, participants were given gradually increasing amounts of peanut in a controlled environment, to assess their tolerance. Of those participants who received peanut OIT, 83% passed the peanut challenge without an allergic reaction, while only 4% on placebo OIT did so.

Those on OIT who passed the challenge were then randomized either to receive placebo OIT or to switch to a 300-mg daily dose of peanut protein. One year later, more participants on 300-mg peanut OIT (37%) passed the challenge than those on placebo OIT (13%), confirming insights from smaller trials that desensitization is maintained in only a minority of participants after OIT is discontinued or reduced. Participants who passed food challenges also had lower initial levels of allergic antibodies to peanut protein and other indicators of allergic activity in the blood. Future research will focus on identifying optimal OIT regimens that maintain protection after therapy and will allow for regular food consumption without allergic symptoms.

Credit: 
NIH/National Institute of Allergy and Infectious Diseases

Innovative treatment to prevent common brain infection could save NHS £7 million per year

An innovative solution used to prevent common brain infections in patients having surgery for hydrocephalus has been found to significantly reduce infection rates according to a report published in The Lancet today (12 September 2019).

Hydrocephalus is a build-up of fluid on the brain. The excess fluid puts pressure on the brain, which can damage it. Approximately one out of every 500 babies is born with hydrocephalus, making it the most common reason for brain surgery in children.

Babies born with hydrocephalus (congenital) and adults or children who develop it (acquired) usually need prompt treatment to reduce the pressure on their brain. This is usually done with a shunt.

During surgery, a thin tube called a shunt is implanted in the brain. The excess cerebrospinal fluid (CSF) in the brain flows through the shunt to drain into the abdominal cavity, where it is absorbed. Approximately 1300 new shunts (UK shunt registry) are inserted in the UK each year.

Unfortunately, shunt infection affects up to 15% of patients having shunt surgery, and is more common in children and neonates. Shunt infection is a serious complication that can lead to meningitis, weeks in hospital, prolonged antibiotics, the need for further surgery and irreversible brain injury.

A team of scientists from the University of Nottingham, led by Professor of Surgical Infection Roger Bayston, developed a novel process (Bactiseal®) that allows brain shunts to be impregnated with antibiotics during manufacture.

A team of researchers from the University of Liverpool, Alder Hey Children's NHS Foundation Trust and The Walton Centre NHS Foundation Trust conducted the largest-ever clinical trial for hydrocephalus to test the infection-reducing properties of Bactiseal®. The BASICS trial (British Antibiotic and Silver Impregnated Catheters for ventriculoperitoneal Shunts) was funded by the National Institute for Health Research (NIHR) and cost £2.3M. Over 1600 patients with hydrocephalus took part, across 21 UK neurosurgery centres. The study took seven years (2012-2019) to complete.

The BASICS trial compared antibiotic and silver shunts to standard shunts (without antibiotic or silver coating). The results showed that antibiotic shunts reduce the infection rate from 6% to 2%, saving the NHS approximately £130K per infection averted. If antibiotic shunts were used in all new patients, this would save the NHS approximately £7 million per year.
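The headline savings figure follows from simple arithmetic on the numbers quoted above. The sketch below is purely illustrative, combining the roughly 1300 new shunts inserted in the UK each year with the 6%-to-2% infection-rate reduction and the approximately £130K saved per infection averted:

```python
# Back-of-the-envelope check of the quoted NHS savings figure,
# using only the numbers given in the press release.
shunts_per_year = 1300            # new shunts inserted in the UK each year
standard_infection_rate = 0.06    # infection rate with standard shunts
antibiotic_infection_rate = 0.02  # infection rate with antibiotic shunts
saving_per_infection = 130_000    # pounds saved per infection averted

infections_averted = shunts_per_year * (standard_infection_rate - antibiotic_infection_rate)
annual_saving = infections_averted * saving_per_infection

print(f"Infections averted per year: {infections_averted:.0f}")   # 52
print(f"Estimated annual saving: £{annual_saving / 1e6:.1f}M")    # £6.8M
```

The result, about £6.8 million, rounds to the "approximately £7 million per year" quoted in the release.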

The Chief investigators were Professor Conor Mallucci, Paediatric Neurosurgeon at Alder Hey Children's Hospital Trust, and Michael Jenkinson, Reader in Neurosurgery at the University of Liverpool and Consultant Neurosurgeon at The Walton Centre. The study was co-ordinated by Professor Carrol Gamble and run by the University of Liverpool's Clinical Trials Research Centre.

Conor Mallucci said: "The results of our trial will have an impact on national and international hydrocephalus guidelines and policy. Using these antibiotic shunts will not only reduce potential harm to our patients but is also cost effective, and should save healthcare providers millions of pounds by avoiding countless unnecessary days in hospital."

Michael Jenkinson said: "The BASICS study shows that antibiotic shunts reduce the risk of infection for all patients having shunt surgery for hydrocephalus. If we use antibiotic shunts routinely we really can 'get it right first time' by avoiding harm and delivering better outcomes for all our patients."

Roger Bayston said: "The antibiotic shunts have now been shown in a well-designed randomised controlled trial to significantly reduce infection in hydrocephalus shunts. This is a major step forward in treatment of this condition, which can affect newborn babies and adults alike, and will reduce the need for surgery and for antibiotic treatment and will save healthcare costs."

The full paper, entitled 'Antibiotic or silver versus standard ventriculoperitoneal shunts (BASICS): a multicentre, single-blinded, randomised trial and economic evaluation', can be found on The Lancet website once the embargo has lifted.

Credit: 
University of Liverpool

UMass Amherst researchers release new findings in groundbreaking gambling study

image: The UMass Amherst epidemiologist is lead investigator of the MAGIC study.

Image: 
UMass Amherst

New findings released Sept. 12 from a groundbreaking gambling study by a University of Massachusetts Amherst research team show that out-of-state casino gambling among Massachusetts residents decreased significantly after the Commonwealth's first slot parlor, Plainridge Park Casino, opened in Plainville in the summer of 2015.

"That suggests that the slot parlor was successful at recapturing people who had been gambling in casinos out of state," says UMass Amherst epidemiologist Rachel Volberg, lead investigator of the Massachusetts Gambling Impact Cohort (MAGIC) study.

Conducted by the Social and Economic Impacts of Gambling in Massachusetts (SEIGMA) research team at UMass Amherst's School of Public Health and Health Sciences, MAGIC is the first major adult cohort study of gambling in the U.S., examining gambling behaviors by surveying the same individuals over time. The study aims to uncover and understand populations at higher risk of experiencing problem gambling and gambling harm, and to support the development of effective and efficient treatment and prevention programs in Massachusetts.

The new report, prepared by lead author Alissa Mazar and presented to the Massachusetts Gaming Commission at its meeting in Boston, covers "Wave 3," a period from 2015 to 2016, before the state's two large resort-casinos opened in August 2018 and June 2019.

"Although this report focuses on results from before the opening of MGM Springfield and Encore Boston Harbor, we have already learned a great deal about how gambling problems among Massachusetts adults develop, progress and remit - information that will assist the Gaming Commission and the Department of Public Health in crafting the right mix of prevention, intervention, treatment and recovery services to effectively minimize and mitigate gambling harm in the Commonwealth," Volberg says.

In another "very interesting finding," Volberg notes that people who gambled were unlikely to stop gambling over the three years surveyed so far. "That has implications for both prevention and treatment," she says. "On the prevention side, it suggests that it's important to provide people with tools to manage their gambling so it doesn't become problematic."

Other findings:

From 2015 to 2016, the incidence rate of problem gambling, which refers to the proportion of people who newly experience problem gambling over a 12-month period, was 1.2 percent, which is similar to other jurisdictions.

From Wave 2 to Wave 3, the remission rate, which refers to the proportion of people who were experiencing problem gambling 12 months prior but no longer are, was 44 percent. Slightly more individuals remitted than were newly identified as problem gamblers.

"When you have people both developing a problem and remitting within a given period, that suggests problem gambling can be reduced by using some resources for treatment and some for preventing people from progressing to a more serious situation in the first place," Volberg says.

Conventional substance abuse and gambling treatment programs, which typically require people to abstain from their problem behavior, may not be the most effective treatment model, the study's data suggest. "Only 3 to 10 percent of problem gamblers ever seek professional treatment," Volberg says. "If you put up that abstinence barrier, it makes it very unattractive for someone who could benefit from the help."

The next MAGIC report, to be released in 2020, will examine the predictors of problem gambling over years and whether racial/ethnic, income, gender and/or regional differences exist in these predictors.

Credit: 
University of Massachusetts Amherst

Breaking the 'stalemate' in the most common soft tissue sarcoma in children

A phase 2 clinical trial has found that combining a molecular targeted drug called temsirolimus with chemotherapy shows promise in the treatment of rhabdomyosarcoma, the most common soft tissue sarcoma in childhood. The Children's Oncology Group trial was led by Leo Mascarenhas, MD, MS, Deputy Director of the Children's Center for Cancer and Blood Diseases at Children's Hospital Los Angeles. Results were recently published online in the Journal of Clinical Oncology.

"Since the early 1990s, there's been no change in the overall survival or risk of recurrence of this disease," explains Dr. Mascarenhas, Section Head, Oncology, in the Division of Oncology, Hematology and Blood and Marrow Transplantation at CHLA. "This trial was pivotal in finding a path forward to potentially break the stalemate."

Rhabdomyosarcoma is a rare childhood cancer that arises in the body's soft tissues, such as muscle. A small group of patients--those whose tumors can be surgically removed at the time of diagnosis--have an over 90% chance of being cured. But for others, the outlook is far less certain. About half are considered "intermediate-risk," with a 60% to 70% chance of long-term survival. Roughly 25% of patients are diagnosed with disease that's already spread; these children have a poor prognosis. In addition, once rhabdomyosarcoma relapses in any patient, long-term survival plummets to under 20%.

The goal of the clinical trial was to see if a targeted drug could be paired with chemotherapy to improve patient outcomes. It was the first-ever randomized trial in rhabdomyosarcoma to test targeted agents in combination with chemotherapy in both treatment groups.

Researchers compared two targeted drugs against each other: bevacizumab, which inhibits the growth of blood vessels that feed tumors, and temsirolimus, which inhibits a pathway often active in rhabdomyosarcoma called mammalian target of rapamycin (mTOR). Both drugs are approved by the Food and Drug Administration for use in other cancers.

The multicenter trial enrolled 86 rhabdomyosarcoma patients who had relapsed for the first time. About half received bevacizumab with chemotherapy; the other half received temsirolimus with chemotherapy. The chemotherapy agents used were vinorelbine and cyclophosphamide.

Enrollment was stopped early because an interim analysis showed that the temsirolimus combination was clearly superior. After six months, the event-free survival rate of patients receiving the bevacizumab treatment was 54.6%--comparable to results expected at this point in treatment with chemotherapy alone. For patients receiving temsirolimus, it was 69.1%.

The goal of this trial was to determine which molecularly targeted agent warranted further investigation. Because the patients studied had already relapsed, most did not survive long-term on either treatment. However, the Children's Oncology Group is now conducting a multicenter, phase 3 clinical trial to study the effectiveness of the temsirolimus-chemotherapy combination in newly diagnosed, intermediate-risk patients.

Researchers are trying to see if giving this therapy early on--when the cancer is most sensitive to treatment--will improve long-term outcomes.

"Prior to these results, there were no compelling ideas on how to improve survival of newly diagnosed patients," says Dr. Mascarenhas, who directs the Sarcoma and Solid Tumor Program at CHLA and is also Associate Professor of Pediatrics at the Keck School of Medicine of USC. "There is a lot more work to be done. But we now may have a way forward."

Credit: 
Children's Hospital Los Angeles

'Ringing' black hole validates Einstein's general relativity 10 years ahead of schedule

image: An illustration of a supersized black hole resulting from the merger of two smaller black holes. The collision has caused the black hole to ring and radiate gravitational waves (white).

Image: 
Maximiliano Isi/MIT

For the first time, astrophysicists have heard a black hole ringing like a bell. By reanalyzing the first black hole merger ever detected, the astrophysicists measured the gravitational wave 'tones' emitted following the event. The breakthrough comes 10 years earlier than expected and confirms that the properties of black holes are just as Einstein predicted in his theory of general relativity in 1915.

"Previously it was believed these tones were too faint to be detected, yet now we are able to," says study co-author Will Farr. "Just like the measurement of atomic spectra in the late 1800s opened the era of stellar astrophysics and classifying and understanding stars, this is the opening of the era of black hole spectra and understanding black holes and the general relativity that sits behind them."

Farr is an associate professor at Stony Brook University in New York and group leader for gravitational wave astronomy at the Flatiron Institute's Center for Computational Astrophysics in New York City. He and his colleagues present their findings September 12 in Physical Review Letters.

When two black holes merge into one, the resulting supersized black hole wobbles like a struck bell. The reverberations emit gravitational waves at characteristic tones that fade away as the black hole settles. The so-called 'no-hair theorem' states that these tones -- and all other external properties of a black hole -- depend only on the black hole's mass and rotation, just as Einstein's general relativity predicts. Some scientists, however, propose that reality is hairier and that effects like quantum mechanics play a role as well.

Scientists knew that detecting a black hole's tones could settle the debate. But the tones were thought to be too quiet to be detected by the current-generation gravitational wave detectors LIGO and Virgo.

In the new study, the astrophysicists combined simulations of black hole mergers with a reanalysis of the first gravitational waves ever detected. Those waves came from the merger of two black holes. The analysis led to the identification of two independent tones emitted by the newly combined black hole. The pitch and decay rates of these tones lined up with Einstein's general relativity. The no-hair theorem stood triumphant.
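The ringdown signal that carries these tones is commonly modeled as a sum of exponentially damped sinusoids (quasinormal modes), each with a frequency and a decay time that, under the no-hair theorem, depend only on the final black hole's mass and spin. A minimal sketch of that model, with purely illustrative numbers (roughly the scale of the first detected merger, not the values measured in this study):

```python
import numpy as np

def ringdown(t, modes):
    """Sum of damped sinusoids: each mode is (amplitude, freq_hz, tau_s, phase)."""
    h = np.zeros_like(t)
    for amp, f, tau, phi in modes:
        h += amp * np.exp(-t / tau) * np.cos(2 * np.pi * f * t + phi)
    return h

# Illustrative values only -- not the measured parameters from the study.
modes = [
    (1.0, 250.0, 0.004, 0.0),   # fundamental tone
    (0.5, 240.0, 0.0013, 1.0),  # overtone: similar pitch, much faster decay
]
t = np.linspace(0, 0.02, 2000)  # 20 ms after the merger
h = ringdown(t, modes)
```

Fitting two such modes to the data and checking that their frequencies and decay rates are mutually consistent with a single mass and spin is, in outline, the kind of test the no-hair theorem invites.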

Farr says that with new data analysis and with LIGO and Virgo continuing to observe black hole mergers, tests from the observatories will become more precise. The added precision will likely lead to additional detections of black hole tones and an improved understanding of the exotic objects.

Farr collaborated on the study with Maximiliano Isi of the Massachusetts Institute of Technology and the astrophysicists Matt Giesler, Mark Scheel and Saul Teukolsky of the California Institute of Technology.

Credit: 
Simons Foundation

How breast cancer uses exosomes to metastasize to the brain

image: Extracellular vesicles, also known as exosomes, cross the blood-brain barrier via a transcytosis process.

Image: 
Kristin Johnson/Vascular Biology Program, Boston Children's Hospital for ACS Nano 2019; doi.org/10.1021/acsnano.9b04397

Metastasizing breast cancers typically seek out the bones, lung, and brain. Brain metastases are especially dangerous; many women survive for less than a year after diagnosis. How is the cancer able to get past the blood-brain barrier? And can it be blocked?

Those questions led PhD candidate Golnaz Morad, DDS, and her mentor Marsha Moses, PhD, to conduct an in-depth investigation of exosomes, also known as extracellular vesicles or EVs, and their role in breast-to-brain metastasis. Their surprising findings appear in the journal ACS Nano.

"Golnaz was able to identify the mechanism by which EVs pass through the blood-brain barrier and provide a 'niche' so that breast cancer cells can metastasize to the brain," says Moses, who directs the Vascular Biology Program at Boston Children's Hospital and whose lab is interested in women's cancers.

Now that they know the mechanism, Moses and Morad hope to identify therapeutic targets that could stop brain metastases from happening.

EVs and cancer

Simply put, EVs are tiny bubbles released by cells, encapsulating chemical messages they wish to convey. In the case of cancer cells, EVs carry factors that help create a more hospitable environment for both the primary tumor and its metastases, as Moses and Morad detailed recently in a review article. Primary tumors can secrete EVs into the circulation, allowing them to travel to distant organs and help spread the cancer.

"The main question we had was, how can EVs reach the brain in the first place?" says Morad. "The blood-brain barrier doesn't allow anything larger than 400 daltons to passively enter brain tissue. Exosomes are more than two thousand times larger than that cutoff."

Hijacking transcytosis

The blood-brain barrier (BBB) is a complex structure made up of three kinds of cells. EVs traveling in the blood first encounter tightly joined brain endothelial cells. Pericytes comprise the next layer, followed by astrocytes, whose "endfeet" extend from their cell bodies to contact the barrier. Crosstalk between the cells circles the wagons even tighter: astrocytes and pericytes send cues to the endothelial cells to tighten up the junctions.

Morad, Moses, and colleagues worked with several models of the BBB, including in vitro microfluidic models of the BBB ("BBB on a chip"), as well as static in vitro models and live zebrafish and mouse models. The labs of Donald Ingber, MD, PhD, and Leonard Zon, MD, collaborated in the development of the BBB-on-a-chip and zebrafish models, respectively. The Moses Lab also collaborated with the lab of Christopher Carman, PhD, of Harvard's T.H. Chan School of Public Health for the high-resolution imaging conducted in this study.

Each model and technique provided its own insights. Bottom line, Morad and Moses showed, for the first time, that rather than squeezing in between the cells in the BBB to get into the brain, the EVs trick the endothelial cells into taking them up. Using a standard biological pathway called transcytosis, the cells simply engulfed the EVs, bringing them inside and releasing them into brain tissue like so many Trojan horses.

"EVs can also manipulate endothelial cells to facilitate their own transport across the BBB," says Morad. "They hijack the pathways involved in the uptake and sorting of molecules and change regulation of the pathways."

Ongoing work in the Moses Lab shows that once the EVs have breached the barrier, they trick astrocytes into sending signals to the surrounding environment, making it more receptive to tumor growth. An inventory of these signals, using mass spectrometry and other techniques, identified some that are known to degrade the network of proteins providing structural and biochemical support to brain cells.

EVs as anticancer delivery vehicles?

For the study, Morad had to develop special methods to harvest the EVs in quantity and to verify their identity.

"They are remarkable little vehicles, but it is very challenging to get enough of them," says Moses.

Having discovered what EVs do to help breast cancer metastasize to the brain, the Moses Lab hopes to turn the tables -- and use EVs (or synthetic versions) to deliver anticancer drugs that home to the metastatic site. That work is ongoing.

The work, for which the investigators have filed patents, was supported by the Breast Cancer Research Foundation, the National Institutes of Health (R01 CA185530, K01 DK111790), and the Advanced Medical Foundation. See the paper for a full list of authors.

Credit: 
Boston Children's Hospital

FASEB Journal: Anesthetic drug sevoflurane improves sepsis outcomes, animal study reveals

Patients with sepsis often require surgery or imaging procedures under general anesthesia, yet there is no standard regimen for anesthetizing septic patients. Of volatile (inhaled) anesthetics, sevoflurane and isoflurane are the most commonly used drugs, despite their undetermined mechanisms of action. A novel study in The FASEB Journal suggests that the type of drug used in general anesthesia could be critical to the survival of patients with sepsis.

To conduct the experiment, researchers induced sepsis in a mouse model. They then separated the mice into three groups: the first received sevoflurane, the second received isoflurane, and the third acted as a control, receiving no anesthetic. Compared with the control, the group exposed to sevoflurane displayed improved survival rates, fewer bacteria in their organs, and less splenic neutrophil apoptosis (i.e., the process through which immune cells die). The group exposed to isoflurane, on the other hand, displayed worse sepsis outcomes than the control.

"With the prevalence of sepsis on the rise and the mortality rates of severe sepsis already extremely high, it is crucial that our findings are validated in a human model," said Koichi Yuki, MD, an associate professor of anesthesia at Boston Children's Hospital, Department of Anesthesiology, Critical Care and Pain Medicine, Cardiac Anesthesia Division. "We are hopeful that one day, the findings from this study will improve the outcomes of patients with sepsis."

"The clinical importance of this study cannot be overstated," said Thoru Pederson, PhD, Editor-in-Chief of The FASEB Journal.

Credit: 
Federation of American Societies for Experimental Biology

Neonicotinoid insecticides cause rapid weight loss and travel delays in migrating songbirds

Songbirds exposed to imidacloprid, a widely used neonicotinoid insecticide, exhibit anorexic behavior, reduced body weight and delays in their migratory itinerary, according to a new study. This is perhaps the first direct evidence of a mechanistic link between the pesticide and declining migratory bird populations. The results suggest that, even in tiny sublethal doses, the presence of these neurotoxic compounds at critical stopover sites where birds refuel on their cross-continental springtime journeys could be contributing to the overall population declines observed among many migratory species. Neonicotinoids are the most widely used class of agricultural pesticide. While these controversial neurotoxic insecticides are advertised to pose a low risk to vertebrates, a growing body of evidence has shown that they may have significant negative impacts on a number of species, including birds. This research has suggested that birds that use agricultural environments as habitats or for foraging stopovers during migration are routinely exposed to neonicotinoid pesticides. Many of these species are also undergoing precipitous population declines. Despite this association, the overall influence of neonicotinoids on wild migratory songbirds remains virtually unknown. Building on previous research, Margaret Eng and colleagues used automated telemetry to track individual white-crowned sparrows they had experimentally exposed to sublethal yet field-realistic doses of imidacloprid during migration. The authors found that the pesticide acted as an anorexic agent, causing rapid losses in body weight and fat. This resulted in extended stays at stopover sites, as the birds had to forage longer to restore their depleted fuel stores and to recover from imidacloprid's neurotoxic effects before moving on.
According to Eng et al., the use of neonicotinoids along migratory routes throughout North America means that birds may suffer repeated exposure at successive stopover sites, amplifying migration delays and their consequences.

Credit: 
American Association for the Advancement of Science (AAAS)

Researchers and rats play 'hide and seek,' illuminating playful behavior in animals

Rats can be taught to play hide and seek with humans and can become quite skilled at the game, according to a new study, which presents a novel paradigm for studying the neurobiology of playful behavior in animals. The inherent characteristics of animal play behavior (it is free and provides no benefits beyond the game itself) make it difficult to evaluate using the traditional methods of neuroscience, which often rely on strict control and conditioning. As a result, very little is known about the prevalence or neural basis of playful behaviors in animals. Annika Reinhold and colleagues taught rats to play a simplified, rat-versus-human version of hide and seek, a game played by humans across cultures worldwide. After a few weeks, the rats were not only able to play the game but had learned to alternate between hiding and seeking roles, performing each at a highly proficient level. According to Reinhold et al., when seeking, the animals learned to look for a hidden human and to keep looking until they found them. When hiding, the rats remained in place until discovered by the human player. Rather than food, the authors rewarded successful hiding and seeking with playful social interactions, such as tickling, petting or rough-and-tumble play. The results show that the animals became more strategic players over time, employing systematic searches, using visual cues and investigating the past hiding places of their human counterparts. When hiding, they remained silent and changed their locations, preferring to be concealed in opaque cardboard boxes rather than transparent ones. The authors also observed rat vocalizations unique to each role, and associated neuronal recordings revealed intense activity in the prefrontal cortex that varied with game events.

Credit: 
American Association for the Advancement of Science (AAAS)

JILA's novel atomic clock design offers 'tweezer' control

image: JILA/NIST physicist Adam Kaufman adjusts the set-up for a laser that controls and cools the strontium atoms in the optical tweezer clock. The atoms are trapped individually by 10 tweezers -- laser light focused into tiny spots -- inside the square orange container behind Kaufman's hand.

Image: 
Burrus/NIST

JILA physicists have demonstrated a novel atomic clock design that combines near-continuous operation with strong signals and high stability, features not previously found together in a single type of next-generation atomic clock. The new clock, which uses laser "tweezers" to trap, control and isolate the atoms, also offers unique possibilities for enhancing clock performance using the tricks of quantum physics.

Described in a paper to be published online Sept. 12 by the journal Science, the new clock platform is an array of up to 10 strontium atoms confined individually by 10 optical tweezers, which are created by an infrared laser beam aimed through a microscope and deflected into 10 spots.

JILA is a joint research and training institute operated by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

While JILA researchers have yet to fully evaluate the new clock's performance, preliminary data suggest the design is promising. The tweezer clock is "on duty," verifying its own performance, 96% of the time because it needs little downtime to prepare new atoms, and the atoms are well isolated, so they are less likely to interfere with one another. Both of these strengths are shared with one of the world's leading clocks, a clock based on a single ion (electrically charged atom). The tweezer clock can also provide the strong signals and stability of a multi-atom lattice clock, which traps atoms in a grid of laser light.

"The tweezer design's long-term promise as a competitive clock is rooted in its unique balancing of these capabilities," JILA/NIST physicist and project leader Adam Kaufman said.

Next-generation atomic clocks stabilize the color, or frequency, of a laser to atoms "ticking" between two energy levels. The tweezer clock traps and controls atoms individually to maintain ticking stability, and it detects this ticking without losing the atoms, so it can reuse the same atoms many times without constantly reloading new ones.

"The tweezer design addresses various issues with other atomic clocks," Kaufman said. "Using our technique, we can hold onto atoms and reuse them for as long as 16 seconds, which improves the duty cycle--the fraction of time spent using the atoms' ticking to correct the laser frequency--and precision. The tweezer clock can also get a single atom very rapidly into a trap site, which means there is less interference and you get a more stable signal for a longer time."
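The two figures quoted here, 16 seconds of atom reuse and a 96% duty cycle, together imply roughly how little dead time each cycle can afford. A back-of-the-envelope check (the per-cycle dead time is inferred from those two numbers, not stated in the release):

```python
# duty cycle = interrogation time / (interrogation time + dead time)
t_use = 16.0  # seconds the same atoms are reused (from the text)
duty = 0.96   # fraction of time "on duty" (from the text)

# Solve duty = t_use / (t_use + t_dead) for the implied dead time:
t_dead = t_use * (1 - duty) / duty
print(f"implied dead time per cycle: {t_dead:.2f} s")  # about 0.67 s
```

That sub-second reload budget is consistent with the description below of traps being refilled "every few seconds" from a pre-chilled atom cloud.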

NIST and JILA researchers have been building next-generation atomic clocks for many years. These clocks operate at optical frequencies, which are much higher than current time standards based on microwave frequencies. The research is helping to prepare for the future international redefinition of the second, which has been based on the cesium atom since 1967. Optical clocks also have applications beyond timekeeping such as measuring Earth's shape based on gravity measurements (called geodesy), searching for the elusive dark matter thought to make up most of the matter in the universe, and enhancing quantum information sciences.

To create the tweezer clock, an infrared laser beam is aimed into a microscope and focused to a small spot. Radio waves at 10 different frequencies are applied sequentially to a special deflector to create 10 spots of light for trapping individual atoms. The traps are refilled every few seconds from a pre-chilled cloud of atoms overlapped with the tweezer light.

The atoms held by the tweezers are excited by a laser stabilized by a silicon crystal cavity, in which light bounces back and forth at a specific frequency. This "clock laser" light--provided by co-author and NIST/JILA Fellow Jun Ye's lab--is applied perpendicular to the tweezer light, along with an applied magnetic field. Non-destructive imaging reveals whether the atoms are ticking properly; the atoms only emit light, or fluoresce, when in the lower-energy state.

Too many atoms in the system can lead to collisions that destabilize the clock, so to get rid of extra atoms, the researchers apply a pulse of light to create weakly bound molecules, which then break apart and escape the trap. Tweezer sites are left either with one atom or empty; with each run of the experiment, each tweezer has about a 50% chance of being empty or containing a single atom. Having at most one atom per site keeps the ticking stable for longer time periods.
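With each of the 10 tweezers independently about 50% likely to end up holding a single atom, the expected occupancy per run follows directly from the binomial distribution. A quick sketch (the independence of the sites is an assumption on my part, not a claim from the release):

```python
from math import comb

n, p = 10, 0.5  # 10 tweezer sites, ~50% single-atom loading each (from the text)

expected_atoms = n * p          # 5 atoms on average per run
p_all_empty = (1 - p) ** n      # chance that no site loads at all

def p_at_least(k):
    """Probability of loading at least k atoms in one run."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

Under these assumptions a run almost never comes up completely empty (about 1 chance in 1024), which helps explain why the clock can stay on duty so consistently.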

Like ordinary metal tweezers, the laser tweezers offer pinpoint control, which enables researchers to vary the spacing between atoms and tweak their quantum properties. Kaufman has previously used optical tweezers to "entangle" two atoms, a quantum phenomenon that links their properties even at a distance. The tweezers are used to excite the atoms so their electrons are more weakly bound to the nucleus. This "fluffy" state makes it easier to trap the atoms in opposing internal magnetic states called spin up and spin down. Then a process called spin exchange entangles the atoms. Special quantum states like entanglement can improve measurement sensitivity and thus may enhance clock precision.

The research team now plans to build a larger clock and formally evaluate its performance. Specifically, the researchers plan to use more tweezers and atoms, with a target of about 150 atoms. Kaufman also plans to add entanglement, which could improve clock sensitivity and performance and, in a separate application, perhaps provide a new platform for quantum computing and simulation.

Credit: 
National Institute of Standards and Technology (NIST)

Stem cell researchers reactivate 'back-up genes' in the lab

image: "The first step towards developing new treatments is figuring out how X chromosome reactivation actually works." PhD researchers Adrian Janiszewski (left) and Irene Talon (middle) with Assistant Professor Vincent Pasque (right) from KU Leuven.

Image: 
KU Leuven

Vincent Pasque and his team at KU Leuven have unravelled parts of a mechanism that may one day help to treat Rett syndrome and other genetic disorders linked to the X chromosome.

Women, like most female mammals, have two X chromosomes, but only one of these is active in any given cell. The active X chromosome is selected through a flip-of-the-coin process in the very early stages of embryonic development: each chromosome has a 50/50 chance of remaining active and expressing its genes, or of being silenced through a process called X chromosome inactivation.

X chromosome inactivation is a perfectly normal process, but the consequences can be devastating when one of the X chromosomes carries a defective gene. This is the case in female patients with Rett syndrome: after one chromosome in each cell becomes inactive, about half of the patient's cells will use the defective gene. Once born, the girls suffer a progressive loss of motor skills and speech. Male patients with Rett syndrome have only one X chromosome and therefore no healthy copy of the gene to compensate for the defective one; these patients usually die before birth.
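The "about half of the patient's cells" figure follows directly from the coin-flip nature of X inactivation: each cell independently silences one of its two X chromosomes. A minimal simulation of that picture (purely illustrative, not a model from the study):

```python
import random

random.seed(0)  # fixed seed for reproducibility

def fraction_using_defective_gene(n_cells):
    """Each cell independently inactivates one X at random; count the cells
    left expressing the defective copy (i.e., the healthy X was silenced)."""
    defective = sum(random.random() < 0.5 for _ in range(n_cells))
    return defective / n_cells

fraction = fraction_using_defective_gene(100_000)
# fraction comes out close to 0.5, matching the "about half" figure
```

In reality the split can drift away from 50/50 (a phenomenon known as skewed X inactivation), which is one reason severity varies between patients.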

So how can we treat Rett syndrome and other X-linked disorders? In theory, the answer is simple: in cells that use the defective gene, we reactivate the healthy copy on the inactive X chromosome. In practice, however, that's easier said than done.

Stem cell researchers from the Vincent Pasque Lab at KU Leuven, together with researchers from the Jean-Christophe Marine lab (VIB/KU Leuven) and the Edith Heard lab (EMBL, Germany) have now solved part of the puzzle. In a paper published in Genome Research, they present new findings on the underlying mechanism of X chromosome reactivation.

Reactivating genes in the lab

The first step towards developing new treatments is figuring out how X chromosome reactivation actually works, explains Adrian Janiszewski (KU Leuven), a co-lead author of the study. "Under normal circumstances, inactive X chromosomes only become active again during one of the very early stages of embryonic development. Rather than studying embryos, we used a technique known as cell reprogramming: we took adult cells from female mice and reprogrammed them in the culture dish into so-called induced pluripotent stem cells or iPS cells, which resemble embryonic stem cells but are not derived from early embryos."

Assistant Professor Vincent Pasque, the senior author of this study, continues: "Working with iPS cells has numerous advantages. Most importantly, when you reprogramme female adult cells into iPS cells, both X chromosomes become active again. In other words: X chromosome reactivation starts happening right under your microscope."

Irene Talon (KU Leuven), the second co-lead author of the study, continues: "We monitored almost 200 different X-linked genes throughout the X chromosome reactivation process. What we found is that reactivation happens gradually: different genes require different amounts of time to become active again. Our findings suggest that this speed difference is explained by a combination of the gene's location in 3D space on the X chromosome and the role of particular proteins (transcription factors) and enzymes (histone deacetylases)."

Long-term therapeutic potential

While this is one part of the puzzle, a lot of work remains to be done, Vincent Pasque concludes. "It's important to remember that we're talking about very fundamental research here. Contributing to the development of a cure for Rett syndrome and similar disorders is our long-term goal, but it will take us a while to get there, and there are many hurdles to overcome."

"We still need to figure out how to use the mechanism for a single gene, how to do it safely in patients, and how to target the right cells in the brain. We do not yet know how to overcome these formidable challenges but we do know that gaining a fundamental understanding of how things work is the crucial first step. That's how science works: it's a slow process."

"Now that we have pinpointed three factors involved in X chromosome reactivation, we can start experimenting to find out more about their precise role. We need to know the ins and outs of X chromosome reactivation before we can try to use the mechanism for therapeutic purposes. This is why fundamental research is so important."

Credit: 
KU Leuven