
New NASA visualization probes the light-bending dance of binary black holes

image: In this frame from the new visualization, a supermassive black hole weighing 200 million solar masses lies in the foreground. Its gravity distorts light from the accretion disk of a smaller companion black hole almost directly behind it, creating this surreal view. Different colors for the accretion disks make it easier to track the contributions of each one.

Image: 
NASA's Goddard Space Flight Center/Jeremy Schnittman and Brian P. Powell

A pair of orbiting black holes millions of times the Sun's mass perform a hypnotic pas de deux in a new NASA visualization. The movie traces how the black holes distort and redirect light emanating from the maelstrom of hot gas - called an accretion disk - that surrounds each one.

Viewed from near the orbital plane, each accretion disk takes on a characteristic double-humped look. But as one passes in front of the other, the gravity of the foreground black hole transforms its partner into a rapidly changing sequence of arcs. These distortions play out as light from both disks navigates the tangled fabric of space and time near the black holes.

"We're seeing two supermassive black holes, a larger one with 200 million solar masses and a smaller companion weighing half as much," said Jeremy Schnittman, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Maryland, who created the visualization. "These are the kinds of black hole binary systems where we think both members could maintain accretion disks lasting millions of years."

The accretion disks have different colors, red and blue, to make it easier to track the light sources, but the choice also reflects reality. Hotter gas gives off light closer to the blue end of the spectrum, and material orbiting smaller black holes experiences stronger gravitational effects that produce higher temperatures. For these masses, both accretion disks would actually emit most of their light in the UV, with the blue disk reaching a slightly higher temperature.
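The temperature-color relationship follows Wien's displacement law: a hotter blackbody peaks at a shorter, bluer wavelength. A quick sketch (the disk temperatures below are illustrative assumptions, not figures from the visualization):

```python
# Wien's displacement law: a hotter blackbody peaks at a shorter wavelength.
WIEN_B_M_K = 2.8977719e-3  # Wien's constant, meters * kelvin

def peak_wavelength_nm(temp_k):
    """Wavelength (in nm) at which a blackbody at temp_k emits most strongly."""
    return WIEN_B_M_K / temp_k * 1e9

# Illustrative disk temperatures (assumed for this sketch):
larger_disk = peak_wavelength_nm(30_000)   # cooler disk around the larger hole
smaller_disk = peak_wavelength_nm(60_000)  # hotter disk around the smaller hole
# Both peaks land in the ultraviolet (below ~400 nm), the hotter disk bluer.
```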

Visualizations like this help scientists picture the consequences of extreme gravity's funhouse-mirror effects. The new video builds on an earlier one Schnittman produced showing a solitary black hole from various angles.

Seen nearly edgewise, the accretion disks look noticeably brighter on one side. Gravitational distortion alters the paths of light coming from different parts of the disks, producing the warped image. The rapid motion of gas near the black hole modifies the disk's luminosity through a phenomenon called Doppler boosting - an effect of Einstein's relativity theory that brightens the side rotating toward the viewer and dims the side spinning away.
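The brightening and dimming can be sketched with the special-relativistic Doppler factor. The gas speed and the fourth-power intensity scaling below are common textbook assumptions, not values from the visualization:

```python
import math

def doppler_factor(beta, theta_rad):
    """Doppler factor for gas moving at speed beta = v/c, where theta_rad is
    the angle between the gas velocity and the line of sight."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return 1.0 / (gamma * (1.0 - beta * math.cos(theta_rad)))

def boosted_intensity(beta, theta_rad):
    # Bolometric intensity scales roughly as the 4th power of the Doppler
    # factor (the exact exponent depends on the emission spectrum).
    return doppler_factor(beta, theta_rad) ** 4

toward = boosted_intensity(0.5, 0.0)    # side rotating toward the viewer
away = boosted_intensity(0.5, math.pi)  # side rotating away
# toward > 1 (brightened) and away < 1 (dimmed), relative to an unboosted disk.
```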

The visualization also shows a more subtle phenomenon called relativistic aberration. The black holes appear smaller as they approach the viewer and larger when moving away.

These effects disappear when viewing the system from above, but new features emerge. Both black holes produce small images of their partners that circle around them each orbit. Looking closer, it's clear that these images are actually edge-on views. To produce them, light from the black holes must be redirected by 90 degrees, which means we're observing the black holes from two different perspectives - face on and edge on - at the same time.

"A striking aspect of this new visualization is the self-similar nature of the images produced by gravitational lensing," Schnittman explained. "Zooming into each black hole reveals multiple, increasingly distorted images of its partner."

Schnittman created the visualization by computing the path taken by light rays from the accretion disks as they made their way through the warped space-time around the black holes. On a modern desktop computer, the calculations needed to make the movie frames would have taken about a decade. So Schnittman teamed up with Goddard data scientist Brian P. Powell to use the Discover supercomputer at the NASA Center for Climate Simulation. Using just 2% of Discover's 129,000 processors, these computations took about a day.
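The quoted speedup is consistent with ray tracing being an almost perfectly parallel workload. A back-of-envelope check (the decade and core counts are the article's figures; the perfect-scaling assumption is ours):

```python
desktop_days = 10 * 365           # "about a decade" of single-machine compute
cores_used = int(0.02 * 129_000)  # 2% of Discover's 129,000 processors

# If the ray tracing parallelizes nearly perfectly, the wall-clock time is:
parallel_days = desktop_days / cores_used
# cores_used works out to 2,580 and parallel_days to roughly 1.4 -- "about a day".
```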

Astronomers expect that, in the not-too-distant future, they'll be able to detect gravitational waves - ripples in space-time - produced when two supermassive black holes in a system much like the one Schnittman depicted spiral together and merge.


Credit: 
NASA/Goddard Space Flight Center

FSU College of Medicine research links Parkinson's disease and neuroticism

New research from the Florida State University College of Medicine has found that the personality trait neuroticism is consistently associated with a higher risk of developing the brain disorder Parkinson's disease.

The research by Professor of Geriatrics Antonio Terracciano and team, published in Movement Disorders, found that adults in the study who scored in the top quartile of neuroticism had more than 80% greater risk of Parkinson's, compared to those who scored lower on neuroticism.

"Some clinicians think that the anxiety and depression is just the result of Parkinson's," Terracciano said. "However, our findings suggest that some emotional vulnerability is present early in life, years before the development of Parkinson's disease."

The effects were similar for women and men and across socioeconomic strata. Furthermore, the association was virtually unchanged in models that excluded incident cases within the first five years of follow-up and remained significant in models that accounted for demographic variables and other risk factors, including smoking, physical activity, anxiety and depression.

Three similar studies with smaller sample sizes have published results consistent with Terracciano's findings. Collectively, they provide a "pretty robust and replicable" assessment of the link between neuroticism and Parkinson's, Terracciano said.

"It kind of gives you a better understanding of the risk factors for the disease and what could be a contributing cause," he said. "This is one of many [factors], but the evidence is convincing."

Globally, an estimated six million people suffer from Parkinson's disease -- about 1% of all older adults -- making it the second most common neurodegenerative disease after Alzheimer's. The causes of Parkinson's disease are not well understood, but scientists believe genetic and environmental factors contribute to its onset.

Neuroticism is a personality trait that measures individual differences in the tendency to experience negative emotions, vulnerability to stress, inability to resist urges and self-consciousness. It is one of the five major personality traits known as the "Big Five" or five-factor model of personality and is one of the most studied psychological dispositions for its relevance spanning normal to abnormal emotional functioning.

Neuroticism has been linked to mood disorders and Alzheimer's, but there have been fewer studies on its prospective connection with Parkinson's.

"Individuals who score high in neuroticism are at higher risk for poor health outcomes across the lifespan, particularly in the domain of mental health and neurodegenerative diseases, including Alzheimer's disease and related dementias," Terracciano said.

Central to Terracciano's research was a large-scale study by the UK Biobank, which recruited nearly a half-million individuals ages 40-69 between 2006 and 2010 and collected data over nearly 12 years of follow-up. A baseline assessment measured neuroticism. There were 1,142 cases of Parkinson's, ascertained through UK National Health Service electronic health records or death records up to 2018.

"Anxiety and depression are comorbid with Parkinson's disease," Terracciano said. "Many people with Parkinson's tend to be anxious or tend to get depressed. Part of that could be due to the disease and how it alters the brain and can have an influence on emotions. Part could be a psychological reaction of having a diagnosis of the disease."

Parkinson's is a long-term degenerative brain disorder that causes progressive decline of motor and physical functions. As the disease progresses, nerve cell damage in the brain causes dopamine levels to drop, leading to symptoms such as tremors, slow movement, stiffness and loss of balance. Dopamine is known as a "feel-good" hormone involved in reward, motivation, memory and attention in addition to regulating body movements.

Terracciano led the research team, which included Damaris Aschwanden, a post-doctoral researcher in the FSU Department of Geriatrics, and Angelina Sutin, professor in the FSU Department of Behavioral Sciences and Social Medicine. Researchers from the University of Montpellier in France; the National Research Council, Sant'Anna Institute and Tor Vergata University of Rome in Italy; and the University of Cambridge in the United Kingdom contributed to this study.

Credit: 
Florida State University

Small physician offices are seeing negative effects from virtual health care models

In a newly released study, researchers found that remote and virtual care models can negatively impact small physician offices. Three researchers from University of Colorado Denver conducted the study, which was published in the National Library of Medicine.

The COVID-19 pandemic has accelerated the adoption of remote and virtual care models in both small and large health care facilities around the world. CU Denver researchers Jiban Khuntia, PhD, Rulon Stacey, PhD, and Madhavan Parthasarathy, PhD, initiated this study to assess small health care businesses' perceptions of the impact of remote and virtual care on their business sustainability during the pandemic.

The team analyzed how well perceptions of virtual care aligned with the current business scenarios of those surveyed. In total, researchers spoke with three different groups of health care facilities across Colorado--82 clinics, 99 small physician offices, and 89 pharmacies.

The study found that the perception of virtual health models varied among the three groups. Clinics surveyed believe their virtual care directly affects their current business scenario. Overall, their attitude toward the virtual model was positive. Small physician offices reacted differently to the survey. They believe that while virtual care models significantly affect their business scenarios, the impact is mostly negative, such as a decrease in revenue. Pharmacies saw no significant effect on their current business scenarios from the virtual model.

"We see small physician offices struggle to keep up with the changes COVID-19 has brought," said Khuntia "While they are ready to adapt to the changes virtual care models bring, the revenue stream isn't what it used to be with in-person visits."

Why is this important?

"If small health care firms cannot compete with the virtual model, they will become nonoperational," said Stacey. "This will damage traditional health services, particularly for critical care delivery and other services that cannot be done virtually."

The current perception small health care businesses have toward remote care will give policy makers a better understanding of the pros and cons of rapidly adopting the remote virtual care model. For small physician offices to survive, there must be a balance between in-person and virtual care.

Credit: 
University of Colorado Denver

Protein linked to ALS/Ataxia could play key role in other neurodegenerative disorders

Neurological disorders are the number one cause of disability in the world, leading to seven million deaths each year. Yet few treatments exist for these diseases, which progressively diminish a person's ability to move and think.

Now, a new study suggests that some of these neurological disorders share a common underlying thread. Staufen1, a protein that accumulates in the brains of patients with certain neurological conditions, is linked to amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease, along with other neurological disorders, including Alzheimer's, Parkinson's, and Huntington's disease, according to University of Utah Health scientists.

The findings connect Staufen1 to the emerging concept that neurodegenerative diseases are linked to malfunctions in the way cells cope with cellular stress. These results, based on laboratory studies of human tissue and mouse models, suggest that targeting Staufen1 could eventually lead to therapeutic interventions for a number of these disorders.

The study appears in Annals of Neurology.

"Neurodegenerative diseases are a major cause of morbidity and mortality," says Stefan Pulst, M.D., Dr. Med, chair of the Department of Neurology at the University of Utah School of Medicine and senior researcher on the study. "Unfortunately, at this time, we have few, if any, disease-modifying therapies. This finding provides new insight into the pathogenesis of these disorders and potentially provides us with a new target for treatment."

In previous research, the scientists found that Staufen1 accumulates in cells of patients with ALS and cerebellar ataxia, a rare condition that causes patients to lose control of movement. They found that Staufen1 binds to a protein that is both a risk factor for ataxia and a risk factor for ALS. Together, along with other proteins, they form dense disease-related clusters called stress granules that can disrupt normal cellular function. However, when Staufen1 was lowered in the brains of mice, it not only improved the pathology of disease but also rid cells of stress granules.

In their new study, Pulst and colleagues sought to determine if Staufen1 overabundance was a factor in the development of other neurological disorders. To do it, they conducted laboratory experiments on skin cells and spinal cord tissues collected from 12 patients with several different neurodegenerative diseases. They also examined the effects of Staufen1 on neurodegeneration in two animal models.

"We found that Staufen1 protein levels were vastly increased in all of the disease models we examined," Pulst says. "In our laboratory animals, levels of this protein were three- to five-fold higher than in control animals. That's not subtle. If a protein is changed that much, it probably isn't good for any cell, particularly a neuron."

Digging deeper, the researchers found that Staufen1 has an important interaction with another protein called mTOR, a master regulator of many functions in the body that plays a key role in a process called autophagy. Autophagy, or "self-digestion," is a housekeeping mechanism that cells use to remove damaged or dysfunctional components.

The new study suggests that the complex relationship between Staufen1, mTOR, and autophagy could be a driving factor in the onset of several neurodegenerative diseases, according to Daniel Scoles, Ph.D., study co-author and associate professor of neurology at U of U Health.

"When Staufen1 is increased, it actually impairs autophagy," Scoles says. "But we also know that autophagy can degrade Staufen1. It's a vicious cycle that can have a bad outcome for patients."

Based on these findings, Pulst and Scoles are hopeful that they can develop a medication to reduce Staufen1 levels in people at risk for sporadic ALS, the most common form of ALS, in which the causes of the disease are unknown.

If lowering Staufen1 is effective for ALS, it could eventually lead to new therapeutic approaches for the treatment of Alzheimer's disease and other Staufen1-related disorders, the researchers say.

Credit: 
University of Utah Health

A method to assess COVID-19 transmission risks in indoor settings

Two MIT professors have proposed a new approach to estimating the risks of exposure to Covid-19 under different indoor settings. The guideline they developed suggests a limit for exposure time, based on the number of people, the size of the space, the kinds of activity, whether masks are worn, and the ventilation and filtration rates. Their model offers a detailed, physics-based guideline for policymakers, businesses, schools, and individuals trying to gauge their own risks.

The guideline, appearing this week in the journal PNAS, was developed by Martin Z. Bazant, professor of chemical engineering and applied mathematics, and John W. M. Bush, professor of applied mathematics. They stress that one key feature of their model, which has received less attention in existing public-health policies, is providing a specific limit for the amount of time a person spends in a given setting.

Their analysis is based on the fact that in enclosed spaces, tiny airborne pathogen-bearing droplets emitted by people as they talk, cough, sneeze, sing, or eat will tend to float in the air for long periods and to be well-mixed throughout the space by air currents. There is now overwhelming evidence, they say, that such airborne transmission plays a major role in the spread of Covid-19. Bush says the study was initially motivated early last year by their concern that many decisions about policies were being guided primarily by the "6-foot rule," which doesn't adequately address airborne transmission in indoor spaces.

Using a strictly quantitative approach based on the best available data, the model produces an estimate of how long, on average, it would take for one person to become infected with the SARS-CoV-2 virus if an infected person entered the space, based on the key set of variables defining a given indoor situation. Rather than a simple yes or no answer about whether a given setting or activity is safe, it provides a guide as to just how long a person could safely expect to engage in that activity, whether it be a few minutes in a store, an hour in a restaurant, or several hours a day in an office or classroom, for example.

"As scientists, we've tried to be very thoughtful and only go with what we see as hard data," Bazant says. "We've really tried to just stick to things we can carefully justify. We think our study is the most rigorous study of this type to date." While new data are appearing every day, and many uncertainties remain about the SARS-CoV-2 virus' transmission, he says, "We feel confident that we've made conservative choices at every point."

Bush adds: "It's a quickly moving field. We submit a paper and the next day a dozen relevant papers come out, so we scramble to incorporate them. It's been like shooting at a moving target." For example, while their model was initially based on the transmissibility of the original strain of SARS-CoV-2 from epidemiological data on the best characterized early spreading events, they have since added a transmissibility parameter, which can be adjusted to account for the higher spreading rates of the new emerging variants. This adjustment is based on how any new strain's transmissibility compares to the original strain; for example, for the U.K. strain, which has been estimated to be 60 percent more transmissible than the original, this parameter would be set at 1.6.

One thing that's clear, they say, is that simple rules, based on distance or capacity limits on certain types of businesses, don't reflect the full picture of the risk in a given setting. In some cases that risk may be higher than those simple rules convey; in others it may be lower. To help people, whether policymakers or individuals, to make more comprehensive evaluations, the researchers teamed with app developer Kasim Khan to put together an open-access mobile app and website where users can enter specific details about a situation -- size of the space, number of people, type of ventilation, type of activity, mask wearing, and the transmissibility factor for the predominant strain in the area at the time -- and receive an estimate of how long it would take, under those circumstances, for one new person to catch the virus if an infected person enters the space.
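The kind of estimate the app produces can be sketched with a toy, well-mixed-room model. This is not the authors' published model, and every parameter value below is an illustrative assumption:

```python
def safe_exposure_hours(volume_m3, occupants, air_changes_per_hour,
                        quanta_per_hour=10.0, breathing_m3_per_hour=0.5,
                        mask_penetration=1.0, transmissibility=1.0,
                        risk_tolerance=0.1):
    """Hours until the expected number of new infections from one infected
    occupant reaches risk_tolerance, assuming steady-state well-mixed air."""
    # Steady-state infectious-dose ("quanta") concentration from one infector:
    concentration = (quanta_per_hour * mask_penetration * transmissibility) / (
        air_changes_per_hour * volume_m3)
    # Expected infections per hour, summed over the susceptible occupants:
    dose_rate = (concentration * breathing_m3_per_hour * mask_penetration
                 * (occupants - 1))
    return risk_tolerance / dose_rate

base = safe_exposure_hours(volume_m3=300, occupants=20, air_changes_per_hour=3)
masked = safe_exposure_hours(volume_m3=300, occupants=20, air_changes_per_hour=3,
                             mask_penetration=0.3)   # masks filter both directions
variant = safe_exposure_hours(volume_m3=300, occupants=20, air_changes_per_hour=3,
                              transmissibility=1.6)  # 60%-more-transmissible strain
```

In this sketch, doubling the ventilation rate doubles the allowed time, masks lengthen it quadratically (they cut both exhaled and inhaled doses), and a 1.6x-transmissible variant shortens it by the same factor.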

The calculations were based on inferences made from various mass-spreading events, where detailed data were available about numbers of people and their age range, sizes of the enclosed spaces, kinds of activities (singing, eating, exercising, etc.), ventilation systems, mask wearing, the amount of time spent, and the resulting rates of infections. Events they studied included, for example, the Skagit Valley Chorale in Washington state, where 86 percent of the seniors present became infected at a two-hour choir practice.

While their guideline is based on well-mixed air within a given space, the risk would be higher if someone is positioned directly within a focused jet of particles emitted by a sneeze or a shout, for example. But in general the assumption of well-mixed air indoors seems to be consistent with the data from actual spreading events, they say.

"When you look at this guideline for limiting cumulative exposure time, it takes in all of the parameters that you think should be there -- the number of people, the time spent in the space, the volume of the space, the air conditioning rate and so on," Bush says. "All of these things are kind of intuitive, but it's nice to see them appear in a single equation."

While the data on the crucial importance of airborne transmission has now become clear, Bazant says, public health organizations initially placed much more emphasis on handwashing and the cleaning of surfaces. Early in the pandemic, there was less appreciation for the importance of ventilation systems and the use of face masks, which can dramatically affect the safe levels of occupancy, he says.

"I'd like to use this work to establish the science of airborne transmission specifically for Covid-19, by just taking into account all factors, the available data, and the distribution of droplets for different kinds of activities," Bazant says. He hopes the information will help people make informed decisions for their own lives: "If you understand the science, you can do things differently in your own home and your own business and your own school."

Bush offers an example: "My mother is over 90 and lives in an elder care facility. Our model makes it clear that it's useful to wear a mask and open a window -- this is what you have in your control." He was alarmed that his mother was planning to attend an exercise class in the facility, thinking it would be OK because people would be 6 feet apart. As the new study shows, because of the number of people and the activity level, that would actually be a highly risky activity, he says.

Already, since they made the app available in October, Bazant says, they have had about half a million users. Their feedback helped the researchers refine the model further, he says. And it has already helped to influence some decisions about reopening of businesses, he adds. For example, the owner of an indoor tennis facility in Washington state that had been shut down due to Covid restrictions says he was allowed to reopen in January, along with certain other low-occupancy sports facilities, after an appeal grounded in large part in this guideline and in information from his participation in Bazant's online course on the physics of Covid-19 transmission.

Bazant says that in addition to recommending guidelines for specific spaces, the new tools also provide a way to assess the relative merits of different intervention strategies. For example, they found that while improved ventilation systems and face mask use make a big difference, air filtration systems have a relatively smaller effect on disease spread. And their study can provide guidance on just how much ventilation is needed to reach a particular level of safety, he says.

"Bazant and Bush have provided a valuable tool for estimating (among other things) the upper limit on time spent sharing the air space with others," says Howard Stone, a professor of mechanical and aerospace engineering at Princeton University who was not connected to this work. While such an analysis can only provide a rough estimate, he says the authors "describe this kind of order of magnitude of estimate as a means for helping others judge the situation they might be in and how to minimize their risk. This is particularly helpful since a detailed calculation for every possible space and set of parameters is not possible."

Credit: 
Massachusetts Institute of Technology

How the humble woodchip is cleaning up water worldwide

URBANA, Ill. - Australian pineapple, Danish trout, and Midwestern U.S. corn farmers are not often lumped together under the same agricultural umbrella. But they and many others who raise crops and animals face a common problem: excess nitrogen in drainage water. Whether it flows out to the Great Barrier Reef or the Gulf of Mexico, the nutrient contributes to harmful algal blooms that starve fish and other organisms of oxygen.

But there's a simple solution that significantly reduces the amount of nitrogen in drainage water, regardless of the production system or location: denitrifying bioreactors.

"Nitrogen pollution from farms is relevant around the world, from corn and bean farms here in Illinois to sugarcane and pineapple farms in Australia to diverse farms bordered by ditches in Belgium. We're all dealing with this issue. It's really exciting that bioreactors are bringing us together around a potential solution," says Laura Christianson, assistant professor in the Department of Crop Sciences at the University of Illinois and lead author on a new synthesis article accepted for publication in Transactions of the American Society of Agricultural and Biological Engineers (ASABE).

Denitrifying bioreactors come in many shapes and sizes, but in their simplest form, they're trenches filled with wood chips. Water from fields or aquaculture facilities flows through the trench, where bacteria living in wood chip crevices turn nitrate into a harmless gas that escapes into the air.

This edge-of-field conservation practice has been studied for at least a dozen years, but most of what scientists know about nitrogen removal rates is based on laboratory replicas and smaller-scale experimental setups. The USDA's Natural Resources Conservation Service published a set of standardized bioreactor guidelines in 2015, based in part on Christianson's early field-scale work, and now more and more U.S. farmers are adding bioreactors. They're catching on in other countries, too.

The ASABE article is the first to synthesize the available data from full-size bioreactors on working farms across the world.

"After gathering all the data, the message is bioreactors work. We've shown a 20-40% reduction in nitrate from bioreactors in the Midwest, and now we can say bioreactors around the world are pretty consistent with that," Christianson says.

She adds bioreactors, like all conservation practices, have their limitations, but nitrous oxide emissions aren't one of them.

"People are worried we're just transferring nitrate in water for nitrous oxide, which is a greenhouse gas. We don't know the full story on nitrous oxide with bioreactors yet, but we can say with good confidence they're not creating a huge nitrous oxide problem," she says. "They're just not."

Christianson says farmers frequently ask her about monitoring the water in bioreactors, so she and her co-authors detail the process in the ASABE article. She also partnered with the Illinois Farm Bureau to create a series of step-by-step videos explaining how to test the water.

"For monitoring, there are two parts. You have to know how much water is flowing through the bioreactor and how much nitrogen is in the water," she says.

The short videos, which are aimed at non-researchers such as farmers and water quality volunteers, break the process down into five steps. Christianson notes her students, postdoctoral researchers, and lab staff all pulled together to create the series.

The videos are available at https://go.aces.illinois.edu/MonitoringMagic.

Christianson, who may just be the world's biggest cheerleader for bioreactors, admits the monitoring guidelines and video series are a little self-serving.

"We included recommended monitoring approaches so that more people will build them, and then more people will monitor them. And then we'll have more data to show how well bioreactors work and how we can make them work better."

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Study reveals how some antibodies can broadly neutralize ebolaviruses

LA JOLLA, CA--Some survivors of ebolavirus outbreaks make antibodies that can broadly neutralize these viruses--and now, scientists at Scripps Research have illuminated how these antibodies can disable the viruses so effectively. The insights may be helpful for developing effective therapies.

Ebolavirus is a genus of often-deadly viruses that includes Ebola virus and many lesser-known viruses such as Bundibugyo virus, Sudan virus and Reston virus.

Structural biologists at Scripps Research used electron microscopy techniques to visualize a set of antibodies that target a key site on these viruses called the "glycan cap." Their research showed that the antibodies work against ebolaviruses using the same three mechanisms to prevent the virus from infecting host cells.

The research, published in Cell Reports, is a step toward the creation of an antibody-based treatment that will be useful against a broad range of ebolaviruses.

"We now understand the molecular basis for these antibodies' abilities to neutralize ebolviruses with broad reactivity against different viral species," says the study's first author Daniel Murin, PhD, a staff scientist in the laboratory of Andrew Ward, PhD.

Ward, a professor in the Department of Integrative Structural and Computational Biology at Scripps Research, says he hopes the work will contribute to the development of a "cocktail" of therapeutic antibodies that can save lives by treating many forms of the Ebola virus.

"The goal is to provide doctors in Ebola-prone regions their best weapon yet against these deadly outbreaks," Ward says. "The insights we have gained through our structural studies of the virus show how this may be possible."

Ever-emergent Ebola

The first known ebolavirus, now called Zaire ebolavirus or simply Ebola virus, was identified in 1976, named for the site of an outbreak that year near the Ebola river in what was then Zaire and is now the Democratic Republic of Congo.

Other species have since been added to the genus, including Sudan ebolavirus and Bundibugyo ebolavirus. Ebola viruses colonize African fruit bats, often cause disease in chimpanzees and other non-human primates, and trigger outbreaks in humans every few years, on average. Infected people develop a hemorrhagic syndrome that is fatal in roughly half of untreated cases.

Vaccines against Ebola have been developed recently but have not yet been widely used. And although antibody-based treatments also have been developed, none has been shown effective against a broad range of ebolavirus species.

Nevertheless, studies in recent years have shown that some survivors of Ebola infections carry antibodies that, in lab-dish tests, can neutralize multiple ebolavirus species. A surprisingly high proportion of these broadly neutralizing antibodies target the glycan cap, a sugar-slathered site on a stalk-like protein--called the glycoprotein--that enables Ebola viruses to enter host cells.

In the new study, Murin and Ward, along with their colleagues in the James Crowe Lab at Vanderbilt University where the antibodies were isolated, used electron microscopy to analyze a set of glycan cap-targeting antibodies from survivors of various ebolaviruses. Their aim was to understand better how these antibodies target the virus so effectively.

Three ways to defeat the virus

Their analysis suggested that the most broadly effective of these glycan cap-targeting antibodies hit the same vulnerable site on the glycan cap, allowing them to thwart viral infectivity in three ways.

First, the antibody displaces a long viral structure near the glycan cap in a way that destabilizes the entire viral glycoprotein structure, sometimes causing it to fall apart.

Second, the glycan cap antibody--when it binds to its target site--can block a key event in the infection process, in which an enzyme called a cathepsin cleaves off the glycan cap. Blocking this cleavage event prevents the glycoprotein from mediating entry into host cells.

Finally, the glycan cap antibody, by displacing the loose structure near the glycan cap, enables another type of neutralizing antibody to bind to a separate vulnerable site on the virus. Thus, a glycan cap antibody can "synergize" with another antibody to hit the virus significantly harder than either antibody does alone.

The scientists also determined the key genetic elements that allow glycan-cap antibodies to thwart ebolaviruses in these three ways.

Now that they have illuminated how these broadly neutralizing antibodies work, Ward, Murin and colleagues are testing them as parts of a next-generation antibody cocktail that they hope will be able to treat the Zaire, Sudan and Bundibugyo ebolaviruses.

Credit: 
Scripps Research Institute

Measuring neutron star squeezability

image: A neutron star begins its life as a star between about 7 and 20 times the mass of the sun. When this type of star runs out of fuel, it collapses under its own weight, crushing its core and triggering a supernova explosion. What remains is an ultra-dense sphere only about the size of a city across, but with up to twice the mass of the sun squeezed inside.

Image: 
NASA / Walt Feimer

A team of scientists used a telescope on the International Space Station to measure the size of PSR J0740+6620 (J0740, for short), the most massive known neutron star. NASA’s Neutron star Interior Composition Explorer (NICER) has captured unprecedented detail from this stellar remnant to learn more about matter in its core, which is on the threshold of collapsing into a black hole.

The NICER team will present their groundbreaking findings at a press conference during the 2021 APS April Meeting. NASA astronaut Christina Koch will join them to talk about how researchers use the space station as a science platform.

From pencils to pulsars

“Matter makes up everything we can see in the universe, from pencils to people to planets. In the heart of a neutron star, it’s on the verge of collapsing into a black hole,” said Cole Miller, an astronomy professor at the University of Maryland, College Park, and a member of the NICER team.

“We can’t recreate those conditions on Earth, but scientists can study them from a distance by measuring the masses and sizes of neutron stars and calculating how dense they are.”

Miller will share the results from one of two groups that independently calculated the size of J0740.

Neutron star squeezability

“Our measurements of J0740 help constrain the squeezability of matter in neutron star cores, or at what density neutrons, one of the atomic building blocks, break down into smaller particles,” said Anna Watts, a professor of astrophysics at the University of Amsterdam.

Watts will share the results from the other independent team that calculated J0740’s size.

Combined with data from other neutron stars and multimessenger observations, the J0740 results herald a new age of neutron star science.

FEATURED TALKS

NICER Constraints on the Neutron Star Equation of State (E03.3, E03.4)

4:39 p.m. - 5:06 p.m. CDT, Saturday, April 17, 2021
Thomas Riley
Abstract: http://meetings.aps.org/Meeting/APR21/Session/E03.3

5:06 p.m. - 5:33 p.m. CDT, Saturday, April 17, 2021
Cole Miller
Abstract: http://meetings.aps.org/Meeting/APR21/Session/E03.4

PRESS CONFERENCE

Register for the press conference to be held on Zoom at 10:00 a.m. CDT, Saturday, April 17, 2021.

Speakers:

Zaven Arzoumanian (NICER Science Lead, NASA’s Goddard Space Flight Center)

Cole Miller (Professor of Astronomy, University of Maryland)

Anna Watts (Professor of Astrophysics, University of Amsterdam)

Sanjay Reddy (Professor of Physics, University of Washington)

Christina Koch (NASA Astronaut)

Media Contacts:

Claire Andreoli
Public Affairs Officer
NASA’s Goddard Space Flight Center, Greenbelt, Md.
claire.andreoli@nasa.gov
(301) 286-1940

Jeanette Kazmierczak
Science Writer
NASA’s Goddard Space Flight Center
jeanette.a.kazmierczak@nasa.gov

Credit: 
American Physical Society

Tell me who your friends are: Neural network uses data on banking transactions for credit scoring

Researchers from Skoltech and a major European bank have developed a neural network that outperforms existing state-of-the-art solutions in using transactional banking data for customer credit scoring. The research was published in the proceedings of the 2020 IEEE International Conference on Data Mining (ICDM).

Machine learning algorithms are already extensively used in risk management, helping banks assess clients and their finances. "A modern human, in particular a bank client, continually leaves traces in the digital world. For instance, the client may add information about transferring money to another person in a payment system. Therefore, every person acquires a large number of connections that can be represented as a directed graph. Such a graph gives additional information for assessing a client. Efficient processing and usage of the rich, heterogeneous information about the connections between clients is the main idea behind our study," the authors write.

Maxim Panov, who heads the Statistical Machine Learning group at Skoltech, and his colleague Kirill Fedyanin, together with collaborators, were able to show that using data about money transfers between clients significantly improves the quality of credit scoring compared with algorithms that use only the target client's own data. That would help banks make better offers to trustworthy clients while lowering the negative effect of fraudulent activity.

"One of the defining properties of a particular bank client is his or her social and financial interactions with other people. It motivated us to look at bank clients as a network of interconnected agents. Thus, the goal of the study was to find out whether the famous proverb "Tell me who your friends are and I will tell you who you are" applies to financial agents," Panov says.

Their edge weight-shared graph convolutional network (EWS-GCN) operates on graphs in which nodes correspond to anonymized identifiers of bank clients and edges represent interactions between them, aggregating this information to predict the credit rating of a target client. The main feature of the new approach is its ability to process the large-scale temporal graphs that arise in banking data as is, that is, without any preprocessing, which is usually complex and leads to partial loss of the information contained in the data.
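The core idea of aggregating neighbor information through weighted edges can be illustrated with a minimal numpy sketch. This is not the authors' EWS-GCN (which handles large temporal graphs and has a specially designed training procedure); it is a single edge-weighted graph-convolution layer on a toy client graph, with all names, matrices, and parameters hypothetical:

```python
import numpy as np

# Toy sketch of one edge-weighted graph-convolution layer.
# Node features might encode a client's own transaction statistics;
# edge weights might encode normalized transfer volumes between clients.
rng = np.random.default_rng(0)

n_clients, n_feat = 4, 3
X = rng.normal(size=(n_clients, n_feat))        # per-client feature vectors

# Directed, weighted adjacency: A[i, j] = transfer volume from client j to i
A = np.array([[0.0, 0.5, 0.0, 0.2],
              [0.1, 0.0, 0.3, 0.0],
              [0.0, 0.4, 0.0, 0.0],
              [0.3, 0.0, 0.0, 0.0]])

W_self  = rng.normal(size=(n_feat, n_feat))     # transform for a client's own features
W_neigh = rng.normal(size=(n_feat, n_feat))     # shared transform for neighbor features

def relu(z):
    return np.maximum(z, 0.0)

# One layer: combine each client's own features with an edge-weighted
# aggregate of its neighbors' features. Sharing W_neigh across all edges
# is what lets the same layer apply to graphs of any size.
H = relu(X @ W_self + A @ X @ W_neigh)

# A logistic read-out turns each node embedding into a credit-risk score.
w_out = rng.normal(size=n_feat)
scores = 1.0 / (1.0 + np.exp(-(H @ w_out)))     # one score in (0, 1) per client
```

A real implementation would stack several such layers, learn the weight matrices from labeled outcomes, and handle the temporal ordering of transactions, but the aggregation step above is the essential mechanism.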

The researchers ran an extensive experimental comparison of six models and the EWS-GCN model outperformed all its competitors. "The success of the model can be explained by the combination of three factors. First, the model processes rich transactional data directly and thus minimizes the loss of information contained in it. Second, the structure of the model is carefully designed to make the model expressive and efficiently parametrized, and finally, we have proposed a special training procedure for the whole pipeline," Panov notes.

He also says that for the model to be used in banking practice, it has to be very reliable. "Complex neural network models are under the threat of adversarial attacks, and due to the lack of knowledge of this phenomenon in relation to our model, we cannot use it in the production process at the moment, leaving it for further research," Panov concludes.

Credit: 
Skolkovo Institute of Science and Technology (Skoltech)

Reliable COVID-19 short-term forecasting

A new study by Texas A&M University researchers published in PLOS ONE details a new model for making short-term projections of daily COVID-19 cases that is accurate, reliable and easily used by public health officials and other organizations.

Led by Hongwei Zhao, professor of biostatistics at the Texas A&M School of Public Health, researchers used a method based on the SEIR (susceptible, exposed, infected and recovered states) framework to project COVID-19 incidence in the upcoming two to three weeks based on observed incidence cases only. This model assumes a constant or small change in the transmission rate of the virus that causes COVID-19 over a short period.
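The short-term projection idea described above can be sketched with a minimal SEIR simulation. The parameter values below (transmission rate, incubation and infectious periods, population sizes) are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def project_seir(beta, days=14, N=1e6, E0=200.0, I0=100.0,
                 sigma=1/5.2, gamma=1/10):
    """Euler-step the SEIR equations; return projected daily new cases.

    beta  : transmission rate (assumed constant over the short horizon)
    sigma : 1 / incubation period (illustrative default)
    gamma : 1 / infectious period (illustrative default)
    """
    S, E, I, R = N - E0 - I0, E0, I0, 0.0
    new_cases = []
    for _ in range(days):
        infections = beta * S * I / N   # S -> E: new exposures today
        onsets = sigma * E              # E -> I: counted as incident cases
        recoveries = gamma * I          # I -> R
        S, E = S - infections, E + infections - onsets
        I, R = I + onsets - recoveries, R + recoveries
        new_cases.append(onsets)
    return np.array(new_cases)

# The study's three scenarios: constant transmission, 5% higher, 5% lower.
base = project_seir(beta=0.25)
high = project_seir(beta=0.25 * 1.05)
low  = project_seir(beta=0.25 * 0.95)
```

The assumption of a constant (or nearly constant) transmission rate is what keeps the model simple enough for organizations with few resources, at the cost of accuracy further into the future.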

The model uses publicly available data on new reported cases of COVID-19 in Texas from the COVID-19 Data Repository by the Center for Systems Science and Engineering at Johns Hopkins University. Texas A&M researchers used this data on disease incidence for Texas and a selection of counties that included the Texas A&M campus to estimate the COVID-19 transmission rate.

"The results indicate that this model can be used to reasonably predict COVID-19 cases two to three weeks in advance using only current incidence numbers," Zhao said. "The simplicity of this model is one of its greatest strengths as it can be easily implemented by organizations with few resources. Forecasts from this model can help health care organizations prepare for surges and help public health officials determine whether mask mandates or other policies will be needed."

They forecast future infections under three possible scenarios: a sustained, constant rate of transmission; one in which the transmission rate is five percent higher than current levels, reflecting a decrease in practices that prevent transmission or an increase in conditions that promote it; and one in which transmission is five percent lower.

Estimating the current effective transmission rate can be tricky, since day-to-day variations in both infections and reporting can dramatically influence this estimate. Thus, the researchers smoothed daily reporting variations using a three-day weighted average and performed additional smoothing to account for data anomalies such as counties reporting several months of cases all at once.

The researchers compared their projections with reported incidence in Texas through four periods in 2020: April 15, June 15, August 15 and October 15. The number of new daily COVID-19 cases reported were relatively low in mid-April, when many businesses were shut down, and then started to increase in early May after phased re-openings began in Texas. The numbers increased sharply after Memorial Day, and then trended downward after a statewide mask mandate was enacted during the summer. Infections increased again after Labor Day, but then seemed to plateau until the middle of October, when the transmission rate was observed again to increase dramatically.

The statewide application of the model showed that it performed reasonably well, with only the second period's forecast deviating from the actual recorded incidence, perhaps because case numbers were changing dramatically during the large wave of COVID-19 that occurred around the Memorial Day holiday. The model performed similarly well at the county level, though the smaller population and changes in population, such as students moving in and out of the area during the school year, influenced reporting of new cases.

However, the model is limited by the data it uses. Local testing and reporting policies and resources can affect data accuracy, and assumptions about transmission rate based on current incidence are less likely to be accurate further into the future. And as more people contract COVID-19 and recover, or are vaccinated, the susceptible population will change, possibly affecting transmission.

Despite these limitations, the researchers said the model can be a valuable tool for health care facilities and public health officials, especially when combined with other sources of information. The COVID-19 pandemic is not yet over, so having a tool that can determine when and where another surge might occur is important. Similarly, researchers hope to use these new tools at their disposal for future infectious disease needs.

Additionally, the model has been used to create a dashboard that provides real-time data on the spread of COVID-19 state-wide. It has been used locally by university administrators and public health officials.

Credit: 
Texas A&M University

Two distinct types of COVID-19-associated acute respiratory distress syndrome identified

BOSTON - Approximately one in four patients hospitalized for the acute respiratory distress syndrome (ARDS) associated with severe COVID-19 may have a distinct phenotype (disease presentation), or biochemical profile, associated with organ dysfunction, blood-clotting abnormalities and a greater risk of death than patients with other, seemingly similar forms of the disease, researchers at Massachusetts General Hospital (MGH) have found.

Among 263 patients admitted to intensive care units (ICUs) at MGH for respiratory failure due to severe COVID-19 infection, 70 (26.6%) had increased levels of biomarkers in the bloodstream indicating disordered blood clotting, higher inflammation, and organ dysfunction compared with the other patients, report Sylvia Ranjeva MD, PhD, and Lorenzo Berra, MD, investigators in the Department of Anesthesia, Critical Care and Pain Medicine, and their colleagues in that department and Pulmonary and Critical Care Medicine and Respiratory Care at MGH.

Patients with this more severe phenotype had twice the risk of death, despite minimal differences in respiratory mechanics or in the severity of ARDS between the two phenotypes.

Their findings are published in EClinicalMedicine, an open-access journal of The Lancet group.

ARDS is a catch-all term for lung injury that can arise from many different conditions such as pneumonia, severe influenza, trauma, blood infections or inflammation of the pancreas.

The syndrome can be life-threatening and may require patients to be put on mechanical ventilation, but it has been difficult for clinicians to develop effective therapies because of the wide spectrum of causes associated with it.

Pulmonary care experts typically base treatment decisions for patients with ARDS on how their bodies and immune systems respond to disease. Prior studies have suggested that some patients have what is called a "hyperinflammatory" phenotype, because their bodies respond to disease or injury by unleashing a flood of cytokines (proteins released in response to inflammation) and other substances to fight the disease. Several studies have shown that patients with the hyperinflammatory phenotype have about a 20% greater risk for death compared with other patients.

But until now, it has been unclear whether patient responses to ARDS from other causes are the same as responses to ARDS associated with COVID-19, Ranjeva explains.

"The motivation for our work is that if we can identify subsets of patients with different biochemical characteristics, and then those patients respond differently to treatment or have different clinical outcomes, we would be one step closer to a more mechanism-based understanding of ARDS," she says.

The researchers identified two distinct phenotypes of COVID-19-associated ARDS that had substantial differences in their responses to disease and in risk for death, despite having minimal differences in respiratory function and oxygenation levels.

Patients with the less common but more serious phenotype could be identified by increases in markers of organ dysfunction (for example, kidney function and cardiac biomarkers) and by increased evidence of blood-clotting dysfunction (coagulopathy).

Their results suggest that disruption of the normal regulation of blood vessels and circulation could be a key feature of critical illness, severe symptoms, and death related to COVID-19 infections, the researchers write.

Credit: 
Massachusetts General Hospital

UZH researchers find new measure to predict stress resilience

Researchers at the University of Zurich show that increased sensitivity in a specific region of the brain contributes to the development of anxiety and depression in response to real-life stress. Their study establishes an objective neurobiological measure for stress resilience in humans.

Some people don't seem to be too bothered when it comes to handling stress. For others, however, prolonged exposure to stress can lead to symptoms of anxiety and depression. While stress resilience is a widely discussed concept, it is still very challenging to predict people's individual response to increased levels of stress. Lab experiments can only go so far in replicating the chronic stress many people experience in their day-to-day lives, as stress simulated in the lab is always limited in exposure time and intensity.

It is possible, however, to observe a group of medical students who are all about to face real-life stress for an extended period - during their six-month internship in the emergency room. This is precisely the real-life situation on which a team of researchers involving Marcus Grueschow and Christian Ruff from the UZH Zurich Center for Neuroeconomics and Birgit Kleim from the Department of Psychology and the University Hospital of Psychiatry Zurich based their study.

Stress as a response to cognitive conflict and loss of control

Before starting their internship, the subjects were given a task that required them to process conflicting information. This conflict task activates the locus coeruleus-norepinephrine (LC-NE) system, a region of the brain associated with regulating our response to stress and resolving conflict. However, the intensity of LC-NE activation - often referred to as the "firing rate" - varies from one person to the next.

Subjects with a higher LC-NE responsivity showed more symptoms of anxiety and depression following their emergency room internships. "The more responsive the LC-NE system, the more likely a person will develop symptoms of anxiety and depression when they're exposed to prolonged stress," Marcus Grueschow summarizes their findings.

Objective measure predicting stress resilience

With their study, the scientists have identified an objective neurobiological measure that can predict a person's stress response. This is the first demonstration that in humans, differences in LC-NE responsivity can be used as an indicator for stress resilience. "Having an objective measure of a person's ability to cope with stress can be very helpful, for example when it comes to choosing a profession. Or it could be applied in stress resilience training with neuro-feedback," Marcus Grueschow explains.

This does not mean that aspiring doctors or future police officers will all have to have their brain scanned. "There might be an even more accessible indicator for stress resilience," Christian Ruff says. Research with animals suggests that stimulation of the LC-NE system correlates with pupil dilation. "If we could establish the same causal link between pupil dilation and the LC-NE system in humans, it would open up another avenue," he adds.

Credit: 
University of Zurich

Discovery of epigenetic factors predicting the severity of COVID-19

COVID-19, the disease caused by infection with the SARS-CoV-2 virus, has changed the behavior patterns of humanity by becoming a pandemic of international scope. To date, more than 136 million people have suffered from the disease and more than 2.9 million of them have lost their lives. Importantly, the symptoms of infection vary widely across the population, from individuals who present no symptoms at all to those who need admission to intensive care units with emergency assisted ventilation. The factors responsible for this range of very different clinical pictures remain largely unknown. Today, an article published in EBioMedicine, The Lancet's sister journal for laboratory findings, by the group of Manel Esteller, professor at the University of Barcelona and ICREA Research Professor, and Dr. Aurora Pujol, also ICREA Professor and head of the Neurometabolic Diseases Group of the Bellvitge biomedical campus, shows that each person's epigenetic endowment influences the severity of COVID-19.

"Given the high number of people infected by the virus that have saturated all health systems in the world, it would be nice to have ways to predict in advance whether the virus infection in a given individual will require hospitalization or can simply be controlled on an outpatient basis. It is known that advanced age and the co-existence of other pathologies (cardiovascular, obesity, diabetes, immune defects) are associated with a greater severity of the infection, but what happens to the rest of the population?" -Dr. Esteller wondered in the article in the EBiomedicine journal and adds - "We decided to study more than 400 people who had tested positive for COVID-19 who did not belong to any of these risk groups and study their genetic material depending on whether they had not had symptoms, or they were very mild, or instead they had been admitted to the hospital requiring respiratory assistance" adds Esteller.

"We found that there were epigenetic variations, the chemical switches that regulate DNA activity, in the individuals positive for the virus who developed a severe COVID-19. These modifications occurred mainly in genes associated with an excessive inflammatory response and in genes that reflect an overall worse state of health. Interestingly, 13% of the world population presents this epigenetic signature (EPICOVID), thus, this is the group at maximum risk that we must take special care of. " - concludes the researcher.

Credit: 
University of Barcelona

Potential ways to improve survival for cancer patients who receive fragmented care

video: Key findings are summarized by the study's lead investigator, David G. Brauer, MD, MPHS.

Image: 
American College of Surgeons

Key takeaways

Pancreatic, liver, bile duct, and stomach cancer operations are inherently complex and initially often take place at large cancer centers where surgical teams perform a large volume of procedures.
Readmission to a different hospital from where patients had these operations initially performed markedly increases death risk.
There are ways to address care fragmentation with newly identified risk factors for readmission; cancer hospitals should seek to determine safe sites of care for readmissions after these types of operations.

CHICAGO: New research reveals that 28 percent of patients who are readmitted to the hospital with complications after surgical removal of pancreatic, liver, or stomach cancer go to a different hospital for follow-up care. This fragmentation of health care is associated with a 50 percent increased odds of dying, according to a study published online by the Journal of the American College of Surgeons ahead of print.

The researchers from Washington University School of Medicine, St. Louis, sought to identify patient and hospital characteristics that raise the death risk during readmission to an outside hospital--one other than the original hospital where the operation was performed, referred to as the index hospital. Using the state inpatient databases from the federal Healthcare Cost and Utilization Project, the investigators evaluated data from adults undergoing surgical removal of liver, pancreatic, bile duct, and gastric cancers beginning in 2006. Patients lived in California, Florida, or New York.

A total of 31,256 patients were discharged from the hospital postoperatively, and 7,536, or 24 percent, were readmitted to any hospital in the first three months after discharge. Among readmitted patients, 28 percent, or 2,123, went to an outside hospital. As in prior research,1 the study findings showed a higher death rate for those patients versus patients returning to the index hospital: 8 percent versus 5.4 percent.

Centralized cancer care not always aligned with readmissions

Most operations to remove liver, pancreatic, bile duct or stomach cancers take place at regional medical centers where surgical teams perform a large volume of these complex procedures.2 These operations have high rates of complications and readmissions, and large surgical volume is associated with improved outcomes.

"This centralization of cancer surgical care means many patients travel great distances to undergo their operations and therefore may need to present instead to a hospital closer to home if they experience complications. That hospital may not have access to the patient's surgical records or even the specialists to care for patients with such complex medical problems," said the study's lead investigator, David G. Brauer, MD, MPHS. "Patients who experience this type of care fragmentation die more often than patients who don't experience care fragmentation. Convenience of care should not come at the expense of getting the appropriate care."

Readmission risk factors

The researchers identified the following risk factors for readmission to an outside hospital:

Longer time since the first discharge: a median, or middle value, of 25 days versus 12 days for patients readmitted to the index hospital
Living much farther from the index hospital than other readmitted patients: 24 miles versus 10 miles
Living in a rural area, older age, and white race*

(*The reason for a racial disparity will be the subject of future study but could be because white patients make up a larger proportion of rural populations compared with more metropolitan populations in the states studied.)

Ways to address fragmented care

Based on identified risk factors from this study, Dr. Brauer recommended the following ways to reduce care fragmentation for these patients:

Index hospitals should identify patients who have the "clear" risk factors found in this study, such as living far away and older age, and should determine safe sites of care for hospital readmission.
Surgeons might consider more frequent follow-up and telehealth visits with at-risk patients.
At-risk patients should ask their surgeons where they should go if they have a complication.
When patients choose an outside hospital for readmission, they should ask the treating physicians to communicate with their surgeons.
If there is any concern regarding the patient, the readmission hospital should divert care back to the surgeon or another higher-acuity hospital.

Care fragmentation is complex

The investigators noted finding characteristics of readmission hospitals that resulted in fewer in-hospital deaths, even at outside hospitals. These factors included teaching hospitals and hospitals that performed at least 100 of these cancer operations annually. However, Dr. Brauer said multivariable statistical analyses found no significant mortality association with these hospital characteristics, indicating the reasons for care fragmentation are complex.

In addition, Dr. Brauer noted geographical distance was a barrier to care access. However, neither greater distance from the index hospital nor longer time between discharge and readmission had an association with more deaths, suggesting other factors contributed to the mortality difference at outside hospitals. "Our analysis shows that returning to the original surgeon and hospital to manage complications, while preferable, may not always be an absolute necessity," Dr. Brauer said.

He concluded: "There is a suggestion that certain hospital-level characteristics, such as high surgical volume, may make an outside facility safer for postoperative readmission care. It also depends on the specifics of the patient and severity of postoperative complication, barriers to access care, and the treating physicians, including their familiarity with the patient's cancer operation."

Dr. Brauer led the research team while a surgical resident at Washington University School of Medicine, St. Louis. He is now a clinical fellow in surgical oncology at Memorial Sloan-Kettering Cancer Center, New York City. Other study authors are Ningying Wu, PhD; Matthew R. Keller, MS; Sarah A. Humble, MS; Ryan C. Fields, MD, FACS; Chet W. Hammill, MD, MCR, FACS; William G. Hawkins, MD, FACS; Graham A. Colditz, MD, DrPH; and Dominic E. Sanford, MD, MPHS, all from Washington University School of Medicine, St. Louis.

Credit: 
American College of Surgeons

McMaster scientists discover trained immune cells are highly effective against cancer

image: Sophie Poznanski, lead author of the paper and Ali Ashkar, professor of medicine

Image: 
McMaster University

Hamilton, ON (April 14, 2021) - Modified immune cells that ruthlessly kill cancerous tumours may prove a game-changer for people living with late-stage cancer.

McMaster University researchers Ali Ashkar and Sophie Poznanski have uncovered that changing the metabolism of natural killer (NK) immune cells allows these cells to overcome the hostile conditions found inside tumours and destroy advanced ovarian and lung cancer.

In the past decade, cancer immunotherapy has achieved tremendous therapeutic effects in patients with blood cancers. However, the immunosuppressive conditions found inside solid tumours, whose aggressive growth starves surrounding healthy tissues of energy, have until now remained a formidable barrier for immune cell therapies.

"In this study, we discovered that the metabolism, or energy "hub", of NK cells is paralyzed by tumours, causing the NK cells to undergo an energy crisis and lose their tumour killing functions," said Poznanski, a McMaster PhD student.

"With that understanding, we were able to reverse the dysfunction of NK cells by repurposing a pre-existing metabolism drug that restored their energy production," she added.

Poznanski, a Vanier Scholar, is lead author of the paper published this week in the journal Cell Metabolism.

While these findings answered the decades-old question of how NK cells are suppressed by tumours, the study yielded another major discovery.

"Drawing on the old adage 'To defeat your enemy, you have to think like your enemy', we additionally discovered that NK cells can be modified to mimic the metabolism of tumours," Poznanski said.

She added that the modified NK cells proved to be far better adapted for the hostile tumour environment.

"We were just hoping that the modified NK cells would better resist suppression in tumours. We were astounded to see that not only did they show no suppression, but they paradoxically functioned better inside of the tumour than outside of it."

"This is the first report of an anti-tumour immune cell that exploits the hostility of tumours for their own advantage" said senior author Ashkar, professor of medicine and the Canada Research Chair in Natural Immunity and NK Cell Function.

"Generating cytotoxic immune cells to have tumour-like metabolism is key for their anti-tumour functions in a very hostile environment of a solid tumour. This could be a paradigm shift for immune cell-based cancer immunotherapy."

So far, NK cells have only proven effective against blood cancers, he said. "The re-programmed and trained NK cells could mean patients with otherwise terminal cancers may have a safe and effective treatment option. Lung and ovarian cancer are two examples of cancers whose survival rates have remained low over the last 30 or so years."

He added that immunotherapy with NK cells has already proven safe with few, if any, side effects.

Hal Hirte agrees. He is an associate professor of oncology at McMaster, an author of the study and a medical oncologist for Hamilton Health Sciences.

"This could have real potential for the treatment of ovarian cancer, lung cancer, and other poor prognosis tumours.

"Ovarian cancer, in particular, is one of the most immunosuppressive tumour types which is a major reason survival rates have not improved. The therapeutic effects observed in preclinical models in this study presents a major breakthrough for the field.

"Certainly, the next step is to move this promising therapy to clinical trials in patients, and we plan to have the trials underway soon to test this approach in patients with recurrent ovarian cancer."

Credit: 
McMaster University