Measuring the wind speed on a brown dwarf

Strong winds blow high in the atmosphere of the brown dwarf 2MASS J1047+21, according to a new study, which presents a simple method to deduce the wind speed in other brown dwarf atmospheres, too. By monitoring the brown dwarf's infrared and radio emissions, the researchers were able to derive the distant world's powerful winds - which whip eastward at an average of 660 meters per second, or roughly 2,400 kilometers per hour. The results demonstrate a technique that could be used to characterize the atmospheres of exoplanets. Brown dwarfs - bodies with masses between large planets and small stars - share many of the same rotational and atmospheric characteristics as gas giant planets. For gas giants within the Solar System like Jupiter, it's easy to observe the latitudinal wind patterns that dominate their atmospheres. The speed of those winds can be derived by comparing the movement of clouds in Jupiter's atmosphere to the radio emissions caused by the rotation of the planet's interior. Now, Katelyn Allers and colleagues show how this approach can be adapted to measure the winds on gas giants and brown dwarfs far outside our Solar System. Allers et al. observed 2MASS J1047+21, a nearby brown dwarf, and determined its rotational periods at infrared (rotation of the atmosphere) and radio (rotation of the interior) wavelengths. The difference between these measurements allowed the authors to derive the brown dwarf's average wind speed and direction - about 660 meters per second in a west-to-east direction. "Our method for determining the wind speed can in principle also be applied to exoplanets," say the authors, "which have similar rotation rates and periodic variability to brown dwarfs."

Credit: 
American Association for the Advancement of Science (AAAS)

Astronomers measure wind speed on a brown dwarf

Astronomers have used the National Science Foundation's Karl G. Jansky Very Large Array (VLA) and NASA's Spitzer Space Telescope to make the first measurement of wind speed on a brown dwarf -- an object intermediate in mass between a planet and a star.

Based on facts known about the giant planets Jupiter and Saturn in our own Solar System, a team of scientists led by Katelyn Allers of Bucknell University realized that they possibly could measure a brown dwarf's wind speed by combining radio observations from the VLA and infrared observations from Spitzer.

"When we realized this, we were surprised that no one else had already done it," Allers said.

The astronomers studied a brown dwarf called 2MASS J10475385+2124234, an object roughly the same size as Jupiter but about 40 times more massive, located about 34 light-years from Earth. Brown dwarfs, sometimes called "failed stars," are more massive than planets, but not massive enough to cause the thermonuclear reactions at their cores that power stars.

"We noted that the rotation period of Jupiter as determined by radio observations is different from the rotation period determined by observations at visible and infrared wavelengths," Allers said.

That difference, she explained, is because the radio emission is caused by electrons interacting with the planet's magnetic field, which is rooted deep in the planet's interior, while the infrared emission comes from the top of the atmosphere. The atmosphere is rotating more quickly than the interior of the planet, and the corresponding difference in velocities is due to atmospheric winds.

"Because we expect the same mechanisms to be at work in the brown dwarf, we decided to measure its rotation speeds with both radio and infrared telescopes," said Johanna Vos, of the American Museum of Natural History.

They observed 2MASS J10475385+2124234 with Spitzer in 2017 and 2018, and found that its infrared brightness varied regularly, likely because of the rotation of some long-lived feature in its upper atmosphere. The team did VLA observations in 2018 to measure the rotation period of the object's interior.

Just as with Jupiter, they found that the brown dwarf's atmosphere is rotating faster than its interior, with a calculated wind speed of about 1425 miles per hour. This is significantly faster than Jupiter's wind speed, about 230 mph.
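The arithmetic behind this comparison is simple: the equatorial wind speed is the object's circumference multiplied by the difference between the atmospheric and interior rotation rates. A minimal sketch, assuming a Jupiter-sized radius (the object is described as roughly Jupiter-sized) and using illustrative rotation periods chosen only to land near the reported speed; the actual measured periods are given in the team's paper:

```python
import math

R_JUP = 7.1492e7  # Jupiter's equatorial radius in meters

def wind_speed(p_atmosphere_h, p_interior_h, radius_m=R_JUP):
    """Equatorial wind speed, in m/s, from the difference between the
    atmospheric (infrared) and interior (radio) rotation periods:

        v = 2 * pi * R * (1/P_atm - 1/P_int)

    A positive value means the atmosphere rotates faster than the
    interior, i.e. an eastward wind.
    """
    p_atm_s = p_atmosphere_h * 3600.0  # hours -> seconds
    p_int_s = p_interior_h * 3600.0
    return 2.0 * math.pi * radius_m * (1.0 / p_atm_s - 1.0 / p_int_s)

# Illustrative periods (hypothetical values, roughly reproducing a
# wind speed of several hundred m/s):
v = wind_speed(1.741, 1.758)
print(f"{v:.0f} m/s eastward")
```

Note how small the measurable effect is: a period difference of about a minute out of a roughly two-hour rotation translates into winds of hundreds of meters per second, which is why precise timing at both wavelengths is the crux of the method.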

"This agrees with theory and simulations that predict higher wind speeds in brown dwarfs," Allers said.

The astronomers said their technique can be used to measure winds not only on other brown dwarfs, but also on extrasolar planets.

"Because the magnetic fields of giant exoplanets are weaker than those of brown dwarfs, the radio measurements will need to be done at lower frequencies than those used for 2MASS J10475385+2124234," said Peter Williams of the Center for Astrophysics, Harvard & Smithsonian, and the American Astronomical Society.

"We're excited that our method can now be used to help us better understand the atmospheric dynamics of brown dwarfs and extrasolar planets," Allers said.

Credit: 
National Radio Astronomy Observatory

Ancient teeth from Peru hint now-extinct monkeys crossed Atlantic from Africa

image: Tiny molar teeth of the parapithecid monkey Ucayalipithecus from the Oligocene of Perú

Image: 
Erik Seiffert

Four fossilized monkey teeth discovered deep in the Peruvian Amazon provide new evidence that more than one group of ancient primates journeyed across the Atlantic Ocean from Africa, according to new USC research just published in the journal Science.

The teeth are from a newly discovered species belonging to an extinct family of African primates known as parapithecids. Fossils discovered at the same site in Peru had earlier offered the first proof that South American monkeys evolved from African primates.

The monkeys are believed to have made the more than 900-mile trip on floating rafts of vegetation that broke off from coastlines, possibly during a storm.

"This is a completely unique discovery," said Erik Seiffert, the study's lead author and Professor of Clinical Integrative Anatomical Sciences at Keck School of Medicine of USC. "It shows that in addition to the New World monkeys and a group of rodents known as caviomorphs - there is this third lineage of mammals that somehow made this very improbable transatlantic journey to get from Africa to South America."

Researchers have named the extinct monkey Ucayalipithecus perdita. The name comes from Ucayali, the area of the Peruvian Amazon where the teeth were found; pithikos, the Greek word for monkey; and perdita, the Latin word for lost.

Ucayalipithecus perdita would have been very small, similar in size to a modern-day marmoset.

Dating the migration

Researchers believe the site in Ucayali where the teeth were found is from a geological epoch known as the Oligocene, which extended from about 34 million to 23 million years ago.

Based on the age of the site and the closeness of Ucayalipithecus to its fossil relatives from Egypt, researchers estimate the migration might have occurred around 34 million years ago.

"We're suggesting that this group might have made it over to South America right around what we call the Eocene-Oligocene Boundary, a time period between two geological epochs, when the Antarctic ice sheet started to build up and the sea level fell," said Seiffert. "That might have played a role in making it a bit easier for these primates to actually get across the Atlantic Ocean."

An improbable discovery

Two of the Ucayalipithecus perdita teeth were identified by Argentinean co-authors of the study in 2015, showing that New World monkeys had African forebears. When Seiffert was asked to help describe these specimens in 2016, he noticed the similarity of the two broken upper molars to those of an extinct 32 million-year-old parapithecid monkey species from Egypt he had studied previously.

An expedition to the Peruvian fossil site in 2016 led to the discovery of two more teeth belonging to this new species. The resemblance of these additional lower teeth to those of the Egyptian monkey teeth confirmed to Seiffert that Ucayalipithecus was descended from African ancestors.

"The thing that strikes me about this study more than any other I've been involved in is just how improbable all of it is," said Seiffert. "The fact that it's this remote site in the middle of nowhere, that the chances of finding these pieces is extremely small, to the fact that we're revealing this very improbable journey that was made by these early monkeys, it's all quite remarkable."

Credit: 
Keck School of Medicine of USC

New study finds EPA mercury analysis is 'seriously flawed'

WASHINGTON, DC--An article published in Science magazine finds deep flaws in the US Environmental Protection Agency's (EPA) benefit-cost analysis in support of a proposed rule related to the regulation of hazardous air pollution from coal-burning power plants. The analysis forms part of the foundation for a regulatory proposal to roll back the legal underpinnings of the agency's Mercury and Air Toxics Standards (MATS), which power plants have been complying with since 2016, leaving the standards vulnerable to legal challenges.

Researchers from Harvard, Yale, Claremont McKenna College, UC Berkeley, Georgetown, and Resources for the Future (RFF), claim that EPA "ignores scientific evidence, economic best practice, and its own guidance" in the new analysis. The authors assert that EPA "can and should do better."

"The EPA's new analysis of the cost and benefits of the MATS rule is clearly insufficient. It fails to account for advances in our understanding of the negative health impacts of mercury and changes in electricity generation since 2011, which have led to much lower compliance costs than were originally projected," says RFF Senior Fellow Karen Palmer, a coauthor on the paper. "And, it dismisses an entire category of benefits."

The authors highlight the following flaws in EPA's analysis:

It disregards economically significant but indirect public health benefits, or "co-benefits," in a manner inconsistent with economic fundamentals. The expected benefits of reducing particulate matter pollution of $33-90 billion per year easily exceed the expected costs of $9.6 billion under EPA's original 2011 analysis of the MATS rule.

It fails to account for recent science that identifies important sources of direct health benefits from reducing mercury emissions, such as fewer heart attacks.

It ignores transformative changes in the structure and operations of the electricity sector over the last decade. Shifts from coal to natural gas and renewable sources, including wind and solar power, for electricity generation have decreased the number of power plants that must install pollution control equipment. The investment in pollution control has been about half of what was projected in 2011.

"If finalized, the new rule will undermine continued implementation of MATS and set a concerning precedent for use of similarly inappropriate analyses in the evaluation of other regulations," the authors state.

Credit: 
Resources for the Future (RFF)

Immunotherapy treatment after chemotherapy significantly slows metastatic bladder cancer

Using immunotherapy immediately after chemotherapy treatment in patients with metastatic bladder cancer significantly slowed the progression of the cancer, according to results of a clinical trial led by Mount Sinai researchers published in the Journal of Clinical Oncology in April.

The study is the first to show that this approach to therapy, called switch maintenance immunotherapy, significantly slows the worsening of a type of bladder cancer called urothelial cancer. The randomized Phase 2 trial tested this treatment in 108 patients.

The trial tested an immunotherapy known as pembrolizumab after patients were treated with platinum-based chemotherapy in one group of patients and used a placebo after the same type of chemotherapy in a second group. The time until the cancer progressed was approximately 60 percent longer for the pembrolizumab group compared with the control group.

"This trial, along with another recent study testing a similar approach, bolster the use of switch maintenance treatment, which will likely become a standard of care for metastatic urothelial cancer, a disease characterized by a paucity of advances in decades," said lead author Matthew Galsky, MD, Co-Director of the Center of Excellence for Bladder Cancer at Mount Sinai.

Credit: 
The Mount Sinai Hospital / Mount Sinai School of Medicine

Making sense of scents: 3D videos reveal how the nose detects odor combinations

video: 3D volumetric rendering of SCAPE data acquired from the olfactory epithelium at the resting level without odor stimulus, showing a 1600μm×1200μm×350μm field of view.

Image: 
Wenze Li and Lu Xu/Hillman and Firestein labs/Columbia University's Zuckerman Institute and Columbia University

NEW YORK -- Every moment of the day we are surrounded by smells. Odors can bring back memories, or quickly warn us that food has gone bad. But how does our brain identify so many different odors? And how easily can we untangle the ingredients of a mixture of odors? In a new study in mice published today in Science, Columbia scientists have taken an important step toward answering these questions, and the secret lies inside the nose.

"From garbage to cologne, the scents we encounter every day are comprised of hundreds or even thousands of individual odors," said Stuart Firestein, PhD, a Columbia professor of biological sciences and the co-senior author of today's study. "Your morning cup of coffee can contain more than 800 different types of odor molecules. Although much work has been done to understand how the nose and brain work together to identify individual odors, scientists have long struggled to explain how this system works when multiple odors are mixed together."

Using a cutting-edge 3D imaging method called SCAPE microscopy, the Columbia team monitored how thousands of different cells in the nose of a mouse responded to different odors -- and mixtures of those odors. They found that the information that the nose sends to the brain about a mixture of scents is more than just the sum of its parts.

The cells in the nose that detect smells each have one of a wide range of different sensors, or receptors; humans, for example, have up to 400 different types of these receptors. For a pure, single odor, only the cells whose receptors are sensitive to that odor will become active, sending a code to the brain that it can identify as that odor. But for more complex mixtures of odors, this code would become increasingly complex to interpret.

The researchers expected to see that the cells activated by mixtures of odors would be equivalent to adding together responses to individual odors. In fact, they found that in some cases an odor can actually turn off a cell's response to another odor in a mixture; in other cases, a first odor could amplify a cell's response to a second odor.

Although we often perceive one odor dominating another, it was previously assumed that this processing occurred in the brain. These results show that signals being sent to the brain get shaped by these interactions within the nose.

The team's data challenged the traditional view that the brain makes sense of a mixture of scents by figuring out all of its individual components. It confirmed what perfumers have long known: combining different scents can essentially create an entirely new scent, one that provides a completely different experience.

"We were excited to find that these changes in the code happened in the nose, before signals even get to the brain," said Lu Xu, a doctoral candidate in the Firestein lab and the co-first author of today's study. "We think that these effects could help us to detect and identify a much larger range of odors and mixtures than a simple additive code could convey."

To reveal these inner workings of the olfactory system, the researchers harnessed the power of SCAPE microscopy, a technique developed by Elizabeth Hillman, PhD, a Zuckerman Institute principal investigator and the co-senior author of today's Science paper. SCAPE microscopy creates high-speed, 3D images of living tissues in real time. It sweeps an angled sheet of light back and forth to create high-speed 3D movies of living cells and tissues in action.

The Firestein and Hillman labs customized SCAPE to illuminate and view tissues in the noses of mice. The researchers examined neuronal cells within the animals' noses that were labeled with fluorescence that flashed under the microscope when these cells were activated. They then exposed the animals' nasal tissues to a range of different scent combinations: one with a woody bouquet, and the other a mix of almond, floral and citrus scents.

"SCAPE allowed us to simultaneously analyze the activity in any of the tens of thousands of single cells over long periods of time," said Wenze Li, PhD, a postdoctoral research scientist in the Hillman lab and the paper's co-first author. "Using conventional microscopes, we could only image a few hundred cells in a thin layer for a short time. SCAPE made it possible to image many more cells within the intact 3D nasal structures without damaging the tissue. This enabled us to track each cell's response over time to a long series of different odor combinations."

To process the immense amount of data collected -- more than 300 gigabytes per tissue sample -- the team had to build their own powerful data processing server, and worked with algorithms developed by Columbia's Department of Statistics and the Simons Foundation.

For almost twenty years, scent experts have known that certain smells can mask or even enhance others. With today's study, the researchers uncovered a potential mechanism for this phenomenon.

"Our results showed that scent molecules can both activate and deactivate receptors, masking other scents not by overpowering them, but by changing the way cells respond to them," said Dr. Hillman, who is professor of biomedical engineering at Columbia School of Engineering and Applied Science. "These findings could actually be very useful, for example to make better air fresheners that actually block out any unwanted smells."

"These results are also exciting because we didn't expect that this kind of receptor could be enhanced or suppressed in this way," added Dr. Firestein. "Being able to change the way a receptor responds to one substance is very important for drug development. Our studies in the nose actually shed new light on possible ways to modulate the response of other cell types that might be involved in disease."

"This study was a true marriage of the expertise of two different labs, with state-of-the-art microscopy technology and big-data analysis," added Dr. Hillman, who is also a professor of radiology at Columbia's Vagelos College of Physicians and Surgeons. Dr. Hillman credits funding and encouragement from the NIH BRAIN Initiative for making this parallel science and technology work possible. "We discovered something that was almost impossible to see before, because we had a new way to see it."

Credit: 
The Zuckerman Institute at Columbia University

TAILORx dispels chemo-brain notion: Women on hormone therapy also report cognitive decline

image: A comparison of cognitive function in two groups of TAILORx women with early breast cancer revealed that the group given chemotherapy plus hormone therapy had more significant cognitive impairment at three and six months than the group on hormone therapy alone. However, at 12 and 36 months, the impairment was not significantly different between groups. The similarity was not because the women who had chemotherapy improved, but rather because women on hormone therapy were also reporting cognitive impairment, although the pace of decline was slower and more gradual.

Image: 
ECOG-ACRIN Cancer Research Group

The Journal of Clinical Oncology reports that a subgroup of women on chemotherapy in the landmark TAILORx breast cancer treatment trial had an early and abrupt cognitive decline at three and six months following treatment, which leveled off at 12 and 36 months. The women received chemotherapy plus hormone therapy following surgery, to prevent the disease from returning. The publication also reports the loss of cognition in women who received hormone therapy alone. The ECOG-ACRIN Cancer Research Group (ECOG-ACRIN) conducted the study with funding from the National Cancer Institute, part of the National Institutes of Health.

"We found that chemotherapy produced early cognitive impairment that leveled off by one year and did not get worse over time," said lead author Lynne I. Wagner, PhD, a professor of social sciences and health policy at Wake Forest University. "This patient-reported information will hopefully reassure women who get diagnosed with early breast cancer in the future and learn they need chemotherapy because they are at high risk of recurrence."

Cancer-related cognitive impairment is common during chemotherapy treatment. The term 'chemo-brain' refers to cognitive impairments assumed to be from chemotherapy. The randomized design of TAILORx allowed Dr. Wagner and colleagues to evaluate the accuracy of this term by quantifying the unique contribution of chemotherapy to acute and long-term changes in cognition.

TAILORx is the first trial to quantify cognitive impairment and health-related quality of life from the patient's perspective. Dr. Wagner and colleagues compared survey responses from 579 TAILORx women randomized to one of two treatment groups: hormone therapy alone or hormone therapy plus chemotherapy. Their use of questionnaires to capture data directly from patients gives confidence that the results accurately characterize women's perspectives on their symptoms and functioning.

Before TAILORx, it was impossible to quantify the extent to which chemotherapy contributed to cognitive impairment because, in earlier studies, everyone received chemotherapy.

Over one-third of the women in the study reported a significant decrease in cognition compared to pre-treatment. Dr. Wagner and colleagues calculated a minimum change score to determine what is likely to be a clinically significant level of change. In the group on hormone therapy alone, 34% of participants had change scores from baseline to 12 months that exceeded the level considered to be clinically relevant. In the chemotherapy and hormone therapy group, 38% of participants had change scores exceeding the benchmark.

Women on Hormone Therapy Also Reported Significant Cognitive Impairment

A comparison of the two groups revealed that the group with chemotherapy lost more cognitive function at three and six months than the group on hormone therapy alone. However, at 12 and 36 months, the impairment was not significantly different between the groups. This was not because the women who had chemotherapy improved, but rather because women on hormone therapy were also reporting cognitive impairment, although the pace of decline was slower and more gradual.

"I think we've generally assumed that cognitive impairment is due to chemotherapy," said Dr. Wagner. "Our findings tell us that hormone therapy may also play a role. Future research is needed to understand better how hormone therapy affects cognition in the context of cancer treatment and also how to treat this symptom."

Cognitive Function Did Not Return to Pre-Treatment Levels in Either Group

Another key finding from TAILORx is that women's cognitive function did not return to pre-treatment levels regardless of the treatment they received.

"We were surprised by the finding that women's cognitive function did not return to pre-treatment levels after finishing chemotherapy, but neither did the group with hormone therapy alone," said Dr. Wagner. "These results do not suggest that women should skip hormone therapy. Rather, the results should alert women and their doctors to continually discuss cognitive function even if they've been on hormone therapy for a few years."

This patient-reported data validates the experiences of women with breast cancer who noticed a decline in their cognition but questioned if it was related to treatment or due to aging, stress, or another nonspecific cause.

"Finding that long-term cognitive impairments were comparable between groups confirms we can retire the term 'chemo-brain' as it does not accurately describe the whole picture," said Dr. Wagner.

About TAILORx and Genetic Testing

The groundbreaking TAILORx trial found no benefit from chemotherapy for 70% of women with the most common type of breast cancer: hormone receptor (HR)-positive, HER2-negative, axillary lymph node-negative. The first results in 2018, published in the New England Journal of Medicine, give clinicians high-quality data to inform personalized treatment recommendations for women. The trial used the Oncotype DX Recurrence Score, a molecular test that measures a woman's risk of recurrence by assigning a score (0 - 100) for the presence of 21 tumor genes linked to breast cancer. The higher the score, the more tumor genes present, and thus, the higher the risk of recurrence. The TAILORx results suggest a potential benefit from chemotherapy for women of any age with a recurrence score of 26 - 100, and for women 50 years old and younger with a recurrence score of 16 - 25.
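The score-and-age thresholds described above amount to a simple decision rule. A minimal sketch (the function name is hypothetical, and this is an illustration of the published thresholds only, not clinical guidance):

```python
def chemo_benefit_suggested(recurrence_score, age):
    """Whether the TAILORx results, as summarized above, suggest a
    potential benefit from adding chemotherapy.

    Illustrative encoding of the reported thresholds; not clinical
    guidance.
    """
    if not 0 <= recurrence_score <= 100:
        raise ValueError("Oncotype DX recurrence scores run from 0 to 100")
    if recurrence_score >= 26:
        return True  # scores of 26 - 100 suggest benefit at any age
    # scores of 16 - 25 suggest benefit only for women 50 and younger
    return 16 <= recurrence_score <= 25 and age <= 50

print(chemo_benefit_suggested(30, 62))  # True: high score, any age
print(chemo_benefit_suggested(20, 45))  # True: mid-range score, age <= 50
print(chemo_benefit_suggested(20, 62))  # False: mid-range score, over 50
```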

Before TAILORx, there was uncertainty about the best treatment for women with a mid-range score of 11 - 25. The trial was designed to answer this question. Women with a score of 11 - 25 were randomly assigned to chemotherapy plus hormone therapy or hormone therapy alone.

About the Cognition Study Design

The analysis assessed cognitive impairment among a subgroup of 579 randomized TAILORx women able to be evaluated. It used the 37-item Functional Assessment of Cancer Therapy-Cognitive Function (FACT-Cog) questionnaire, administered at baseline, 3, 6, 12, 24, and 36 months. The FACT-Cog included the 20-item Perceived Cognitive Impairment (PCI) scale, the primary endpoint. Clinically meaningful changes were defined a priori, and linear regression was used to model PCI scores on baseline PCI, treatment, and other factors.

Other Findings

The proportion of women who reported improved cognition from baseline to 12 months was 14% in the group on hormone therapy alone and 8% in the group that received chemotherapy plus hormone therapy.

Many have questioned whether cancer-related cognitive impairment is due to aging, menopausal changes, or a manifestation of anxiety and depression. These findings did not differ based on age. Researchers observed the same magnitude of cognitive changes among women under 50 years of age, 51-64 years, and 65 years and older. They saw the same pattern of results for women who were pre-menopausal as those who were post-menopausal. Neither anxiety nor depression accounted for the results.

Implications

This patient-reported data from TAILORx has important implications for women with early breast cancer and the clinicians who treat them. Finding that women on chemotherapy experienced an early and abrupt decrease in cognition underscores the value of precision-guided treatment to identify women who will not benefit from chemotherapy to spare them this toxicity. The persistent cognitive impairment reported in both groups identifies the need for research to understand this distressing symptom better.

Credit: 
ECOG-ACRIN Cancer Research Group

Rethinking biosecurity governance

Perhaps the most important lesson we can learn from the current coronavirus pandemic is how to learn future lessons without having to experience a pandemic, whether natural in origin or made by humans. To do so, we need to change how we think about the governance of biology.

In a Policy Forum article in this week's issue of Science, lead author Dr. Sam Weiss Evans, a Fellow at Harvard Kennedy School's Program on Science, Technology and Society, joins more than a dozen stakeholders in calling for a new approach to biosecurity governance, grounded in experimentation.

The authors argue that no capability exists today for systematic learning about the effectiveness and limitations of governing biosecurity. The current approach usually relies on traditional risk management aimed at what we already know we should worry about, such as risks from pathogens. Instead, we need to learn to experiment with new sets of assumptions about biology, security and governance. As our current crisis is demonstrating with deadly efficiency, the sooner we recognize that governance requires experiment-based learning, the better our ability to learn what works and doesn't work, and to move past sporadic ad-hoc changes implemented after major events occur.

The stakes are particularly high in biosecurity governance, which must prevent or deter the misuse of biological science and technology.

"It should not take hundreds of thousands of corpses around the world and a recession to get us to assess and address the limitations of our current systems of governing health security and biosecurity," Evans said. "We can do that by taking a more experimental approach to biosecurity and health security governance, periodically testing and reassessing basic assumptions we are making about science, security, and society."

An experimental approach to governance recognizes that we don't have perfect knowledge about how biology will be used to harm; the more we can test the assumptions on which we base security governance, the more likely we are to understand the shortcomings and strengths of the system prior to catastrophic events. Experimentation promotes an open discussion about the reasons behind decisions to govern one way and not another.

Evans et al. represent a wide range of stakeholders in biosecurity governance, from laboratory scientists and biosafety officers to coordinators and analysts of national and international biosecurity efforts. All of them agree that thinking of biosecurity as an experiment can enable systematic learning about the effectiveness and limitations of current approaches. They point out three initial lessons that are not routinely taken into account in biosecurity governance:

1. In designing a governance experiment, consideration should be given to framing the proposed set of actions in terms of hypotheses, which in turn are based on a set of assumptions about the science, security concerns, and the governing authorities.

2. Developing greater capacity to quickly identify difficult or unanticipated cases allows for governing processes to adapt and account for them. Sharing near misses and failures in a timely manner can aid future biosecurity efforts since similar attack strategies are likely to be applied across disparate sites.

3. Experimental learning needs to take place across biology communities, from commercial and industrial firms to philanthropies and governments. This learning requires a new willingness to think beyond the current crisis, to rethink basic assumptions about the origin of threats, and to embrace an iterative approach to improving systems of governance.

Credit: 
Harvard Kennedy School

Mayo Clinic offers guidance on treating COVID-19 patients with signs of acute heart attack

ROCHESTER, Minn. -- Much remains unknown about COVID-19, but many studies already have indicated that people with cardiovascular disease are at greater risk from COVID-19. There also have been reports of ST-segment elevation (STE), a signal of obstructive coronary artery disease, in patients with COVID-19 who, after invasive coronary angiography, show no sign of coronary artery disease. This false signal of coronary artery disease may cause patients to undergo procedures that present unnecessary risks, especially in the COVID-19 environment, according to a special article published in Mayo Clinic Proceedings.

The article, written by a team of Mayo Clinic cardiologists and radiologists, proposes algorithms for evaluating patients and determining a course of treatment.

"The impact of false activation of the catheterization laboratory includes inherent risks, beginning with the invasive arterial procedure itself and related care for these patients," says J. Wells Askew, M.D., a Mayo Clinic cardiologist. In cases where patients test positive for COVID-19, the risks include respiratory failure, and potential exposure of medical staff and the downstream effects on cardiac catheterization laboratories and cardiac imaging services.

"Nonetheless, it's critically important for patients who are experiencing a heart attack due to coronary occlusion to receive immediate and appropriate treatment," says Dr. Askew. "There is an urgent need for an algorithm that guides triage of patients with suspected or proven COVID-19 patients with STE to determine initial invasive or noninvasive pathways."

The article notes that acute myocardial injury, arrhythmia and shock are common in patients with acute respiratory infections such as COVID-19. Myocardial injury is defined by an elevated cardiac troponin level; when myocardial injury is acute and occurs in the setting of acute myocardial ischemia, it can signal a heart attack.

The article proposes algorithms, based on expert consensus, for responding to patients with STE and acute myocardial injury. It also provides guidance on decision-making regarding the use of an echocardiogram or a coronary CT angiogram for patients with suspected or confirmed COVID-19.

"The reported experiences from countries in which significant exposure to COVID-19 has occurred highlight the challenges we have in treating patients with COVID-19 and STE on the electrocardiogram," says Dr. Askew. "Health care facilities need to rapidly prepare for this, so they can appropriately triage these patients with invasive or noninvasive pathways. This is critically important to minimize risks for the patient as well as risk of COVID-19 exposure to medical personnel."

Credit: 
Mayo Clinic

False-negative COVID-19 test results may lead to false sense of security

ROCHESTER, Minn. -- As COVID-19 testing becomes more widely available, it's vital that health care providers and public health officials understand its limits and the impact false results can have on efforts to curb the pandemic.

A special article published in Mayo Clinic Proceedings calls attention to the risk posed by overreliance on COVID-19 testing to make clinical and public health decisions. The sensitivity of reverse transcriptase-polymerase chain reaction (RT-PCR) testing and overall test performance characteristics have not been reported clearly or consistently in medical literature, the article says.

As a result, health care officials should expect a "less visible second wave of infection from people with false-negative test results," says Priya Sampathkumar, M.D., an infectious diseases specialist at Mayo Clinic and a study co-author.

"RT-PCR testing is most useful when it is positive," says Dr. Sampathkumar. "It is less useful in ruling out COVID-19. A negative test often does not mean the person does not have the disease, and test results need to be considered in the context of patient characteristics and exposure."

Even with test sensitivity values as high as 90%, the magnitude of risk from false test results will be substantial as the number of people tested grows. "In California, estimates say the rate of COVID-19 infection may exceed 50% by mid-May 2020," she says. "With a population of 40 million people, 2 million false-negative results would be expected in California with comprehensive testing. Even if only 1% of the population was tested, 20,000 false-negative results would be expected."

The authors also cite the effects on health care personnel. If the COVID-19 infection rate among the more than 4 million people providing direct patient care in the U.S. were 10% -- far below most predictions -- more than 40,000 false-negative results would be expected if every provider were tested.
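The arithmetic behind these estimates is simply population × fraction tested × infection rate × false-negative rate. A short sketch reproducing the article's figures (the function name is illustrative):

```python
def expected_false_negatives(population, infection_rate, sensitivity,
                             tested_fraction=1.0):
    """Infected people expected to receive a (false) negative result."""
    return round(population * tested_fraction * infection_rate * (1 - sensitivity))

# California: 40 million people, 50% infection rate, 90% test sensitivity.
print(expected_false_negatives(40_000_000, 0.50, 0.90))        # 2000000 (all tested)
print(expected_false_negatives(40_000_000, 0.50, 0.90, 0.01))  # 20000 (1% tested)
# U.S. direct-care workforce: 4 million providers, 10% infection rate.
print(expected_false_negatives(4_000_000, 0.10, 0.90))         # 40000
```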

This poses risks for the health care system at a critical time. "Currently, CDC (Centers for Disease Control and Prevention) guidelines for asymptomatic health care workers with negative testing could lead to their immediate return to work in routine clinical care, which risks spreading disease," says Colin West, M.D., Ph.D., a Mayo Clinic physician and the study's first author. Victor Montori, M.D., a Mayo Clinic endocrinologist, also is a co-author.

While dealing with the enormity of the growing COVID-19 pandemic, public health officials must stick to principles of evidence-based reasoning about diagnostic test results and false negatives. Four recommendations are outlined in the Mayo Clinic article:

1. Continued strict adherence to physical distancing, hand-washing, surface disinfection and other preventive measures, regardless of risk level, symptoms or COVID-19 test results. Universal masking of both health care workers and patients may be necessary.

2. Development of highly sensitive tests or combinations of tests is needed urgently to minimize the risk of false-negative results. Improved RT-PCR testing and serological assays -- blood tests that identify antibodies or proteins present when the body is responding to infections such as COVID-19 -- are needed.

3. Risk levels must be carefully assessed prior to testing, and negative test results should be viewed cautiously, especially for people in higher-risk groups and in areas where widespread COVID-19 infection has been confirmed.

4. Risk-stratified protocols to manage negative COVID-19 test results are needed, and they must evolve as more statistics become available.

"For truly low-risk individuals, negative test results may be sufficiently reassuring," says Dr. West. "For higher-risk individuals, even those without symptoms, the risk of false-negative test results requires additional measures to protect against the spread of disease, such as extended self-isolation."

At Mayo Clinic, RT-PCR testing is "one of many factors we take into account in deciding whether the patient meets criteria for COVID-19," Dr. Sampathkumar says. If the RT-PCR test is negative but chest X-ray or CT scan results are abnormal, or there has been close contact with a person who has confirmed COVID-19, the recommendation is to continue caring for the patient as if he or she has COVID-19.

"We need to continue to refine protocols for asymptomatic patients and exposed health care workers," says Dr. Sampathkumar.

Credit: 
Mayo Clinic

New study shows how oxygen transfer is altered in diseased lung tissue

image: Professor Cecilia Leal, center, and graduate students Mijung Kim, left, and Marilyn Porras-Gomez.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- A multidisciplinary team of researchers at the University of Illinois at Urbana-Champaign has developed tiny sensors that measure oxygen transport in bovine lung tissue. The study - which establishes a new framework for observing the elusive connection between lung membranes, oxygen flow and related disease - is published in the journal Nature Communications.

"The membranes that encase lungs and add the elasticity needed for inhaling and exhaling appear to also play a critical role in supplying oxygen to the bloodstream," said materials science and engineering professor Cecilia Leal, who led the study with graduate students Mijung Kim and Marilyn Porras-Gomez.

For lung tissue to perform effectively, it must be able to transfer oxygen and other gases through its membranes, the researchers said. One way this happens is through a substance - called a surfactant - that reduces lung liquid surface tension to allow this exchange to occur. However, a surfactant called cardiolipin is known to be overly abundant in tissues infected with bacterial pneumonia, the study reports.

The new sensors are thin silicon- and graphene-based films that contain tiny transistors that measure oxygen permeation between biological surfaces. "A thin film of lung membranes is spread out over many tiny sensors at the device surface, giving us a better picture of what is going on over a relatively large area rather than just a spot," Leal said.

The team used the sensors to compare oxygen transfer between healthy and diseased membranes. The samples consisted of a bovine lipid-protein extract commonly used to treat premature infants suffering respiratory distress, with a portion of the samples combined with cardiolipin.

"We found that more oxygen passes through the tissue diseased by cardiolipin," Leal said. "Which may help explain previous observations of there being an off-balance of oxygen in the blood of pneumonia patients. Even though an increase in oxygen flow could be perceived as positive, it is important to keep the natural exchange that occurs in the lung - transferring oxygen more rapidly into the bloodstream disrupts this healthy equilibrium."

The researchers also compared the structure of healthy and diseased tissue using microscopic and X-ray imaging. They found that the tissue combined with cardiolipin showed damaged spots, which they posit may be responsible for increased oxygen transfer and subsequent off-balance oxygen levels in pneumonia patients.

The next stage of this research will be to study lung membranes extracted from healthy and diseased mammalian lungs, Leal said. "Our results raise important insights on lung membrane function, indicating that changes in structure and composition directly relate to oxygen permeation. This work can potentially enable clinical research examining the role of unbalanced oxygen diffusion through lung membranes in a pathological context."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau

Researchers reveal important genetic mechanism behind inflammatory bowel disease

Philadelphia, April 9, 2020 - Researchers at Children's Hospital of Philadelphia (CHOP) have pinpointed a genetic variation responsible for driving the development of inflammatory bowel disease (IBD). The genetic pathway associated with this variation is involved in other immune disorders, suggesting the mechanism they identified could serve as an important therapeutic target. The findings were published today by the Journal of Crohn's and Colitis.

The two main forms of IBD, Crohn's disease and ulcerative colitis, have an important genetic component. More than 240 genetic regions have been identified that are associated with IBD, more than for any other pathological condition studied. However, because each genetic region contains multiple markers, it's important to identify those that are causally involved.

Using a variety of assays and state-of-the-art sequencing methods, the study team sought to characterize the single nucleotide polymorphism (SNP) rs1887428, which is located in the promoter region of the JAK2 gene. The protein encoded by this gene helps control the production of blood cells. Mutations of JAK2 have been linked to various blood cancers. Multiple drugs have been developed to target the JAK pathway to treat autoimmune disorders and malignancies; however, genetic and biological markers are needed to determine whether patients will respond to them.

"We chose this SNP for in-depth study because of its high probability for modifying JAK2 expression," said Christopher Cardinale, MD, PhD, a scientist in the Center for Applied Genomics at CHOP and first author of the study. "The model we established in this study may help us learn more about other causal SNPs associated with other autoimmune diseases."

The study identified proteins known as transcription factors, which regulate gene expression, that were associated with this particular SNP. Two transcription factors in particular -- RBPJ and CUX1 -- can recognize the DNA sequence altered by the rs1887428 SNP. Additionally, the researchers found that while the SNP itself has only a very modest influence on JAK2 expression, the effect was amplified by other proteins in the JAK2 pathway.

"Using this method, we believe we have added an important tool to our arsenal of SNP-to-gene assignment methods, allowing us to pinpoint disease-driving genetic mutations that have previously been difficult to properly assign risk," said Hakon Hakonarson, MD, PhD, Director of the Center for Applied Genomics at CHOP and senior author of the paper. "This study in particular also provides evidence that drugs targeting JAK2 may provide some benefit for those patients suffering from IBD who carry mutations that upregulate the JAK2 pathway, though such precision-based approaches would need to be validated in clinical studies."

Credit: 
Children's Hospital of Philadelphia

Bayreuth geneticists discover regulatory mechanism of chromosome inheritance

image: Susanne Hellmuth M. Sc, PhD student at the Chair of Genetics at the University of Bayreuth, here loading samples for the separation of proteins and their subsequent immunological detection.

Image: 
Images: Olaf Stemmann.

In the course of every single cell division, the genetic information on the chromosomes must be distributed equally between the newly developing daughter cells. The enzyme separase plays a decisive role in this process. Susanne Hellmuth and Olaf Stemmann from the Chair of Genetics at the University of Bayreuth have now discovered a previously unknown mechanism that regulates the activity of the separase. These fundamental findings add a new aspect to our current understanding of chromosome inheritance. The scientists have presented their study in the journal "Nature".

Crucial for healthy cell development: the regulation of separase

Cell division is essential for human growth and reproduction. Before a cell begins to divide, the genetic information stored on the chromosomes is duplicated. When this process is complete, each chromosome consists of two identical DNA threads, the sister chromatids. Cohesin, a ring consisting of several proteins, encloses each chromosome and holds the pair of chromatids together. Already during preparation for cell division, cohesin is removed from the arms of the chromosomes. However, the complete separation of the sister chromatids can only take place when the cohesin remaining in the middle of the chromosomes is cut by the enzyme separase. The chromatids then migrate to the two opposite ends of the spindle apparatus, where they form the genetic basis of the forming daughter cells.

Healthy development of the daughter cells is only guaranteed if they do not contain genetic defects. In order for this condition to be fulfilled, the separase must become active at exactly the right time. If the sister chromatids are separated too early, they can only be distributed randomly. The resulting daughter cells then contain the wrong chromosome number and die, or they can develop into tumour cells. Only strict regulation of the separase prevents these genetic malfunctions.

A "guardian spirit" suppresses premature sister chromatid separation

The Bayreuth researchers Susanne Hellmuth and Olaf Stemmann, in cooperation with geneticists from the University of Salamanca, Spain, have now discovered that the protein shugoshin (Japanese for "guardian spirit") has exactly this regulating function. Shugoshin causes the separase to remain inactive until the right time for cohesin splitting has come. With this discovery, scientists have succeeded in solving an important puzzle of genetics: Until now, only the protein securin was known to suppress premature activity of the separase. It was therefore believed that the separase was exclusively regulated by securin. However, this view contradicted the observation that separase remains properly regulated even when securin is not present. The study now published in "Nature" provides the explanation: Shugoshin and securin both prevent separase from initiating the process of chromosome segregation at the wrong time. And if the securin fails, even shugoshin alone is able to regulate the activity of separase in human cells.

"We are dealing with a type of redundancy that is not at all uncommon in the cell cycle: In order for a vital process to proceed in a well-ordered manner, nature has safeguarded it by controlling it simultaneously in two or more different ways. This makes the process particularly robust, but also difficult to study, because individual disturbances have no visible effect," said Susanne Hellmuth, first author of the study.

Dual control through the spindle checkpoint

Indeed, Hellmuth and Stemmann made a further discovery: It is the spindle assembly checkpoint (SAC) that controls the regulating influence of shugoshin as well as that of securin. This finding confirms the well-established assumption in the research that the SAC has, as it were, sovereignty over all processes involved in chromosome inheritance. It had been known for some time that the SAC first stabilizes the securin and does not allow its degradation until the time has come for cohesin splitting by separase. The "Nature" publication now shows how the checkpoint causes shugoshin to suppress the premature activity of separase: namely by associating shugoshin with the SAC component Mad2.

"I was particularly pleased to hear a remark on our publication by one referee that the textbooks will now have to be rewritten," says Olaf Stemmann. "Our further research will show how our fundamental findings could also find their way into cancer therapy." This follow-up study by the Bayreuth research duo will also soon be published in "Nature".

Credit: 
Universität Bayreuth

Call to action: Traditional, complementary and integrative health COVID-19 support registry

image: Dedicated to research on paradigm, practice, and policy advancing integrative health.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, April 9, 2020--The new, global Traditional, Complementary and Integrative Health and Medicine (TCIHM) COVID-19 Support Registry aims to capture key information on the case, treatment/supportive care, and outcome variables related to the use of integrative health products and practices in patients in response to the COVID-19 pandemic. A Call to Action describing the need for, purpose of, and intended use of the Registry is published in JACM, The Journal of Alternative and Complementary Medicine, a peer-reviewed publication from Mary Ann Liebert, Inc., publishers, dedicated to paradigm, practice, and policy advancing integrative health.

JACM Editor-in-Chief John Weeks issued the "Call to Action: Announcing the Traditional, Complementary and Integrative Health and Medicine COVID-19 Support Registry" to help launch the resource that was created by a global network of researchers. The registry is already backed by over a dozen practitioner organizations. While there remains no high-quality evidence to support integrative practices and natural agents against the virus, practitioners and consumers are experimenting with multiple natural health products and practices that existing evidence suggests might have preventive, supportive, complementary, or rehabilitative value.

The Registry is housed at the Portland, OR-based Helfgott Research Institute. Led by multiple NIH grant awardee Ryan Bradley, ND, MPH, Helfgott's Director and an Associate Professor in the University of Washington College of Pharmacy, the Registry is anticipated to help characterize such care, report indications of potential value or harm, and serve as the basis of hypotheses for potentially promising treatments and protocols for COVID-19 management.

JACM Editor-in-Chief John Weeks states: "Non-biomedical strategies are widely in use relative to COVID-19. Governments in India, China, and elsewhere are promoting traditional methods for COVID-19. Governments in the West are silent or antagonistic, yet millions of their citizens and their practitioners are experimenting. In the midst of this, the Chinese government is crediting the apparently relatively quick turn-around in that country to the integration of traditional Chinese medicine with conventional biomedicine in 90% of their patients. If widely utilized, the Registry will cast needed light on strategies for COVID-19 and may prove useful for managing future health issues. We urge all traditional and integrative practitioners to participate. Why leave this stone unturned?"

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Money can't buy love -- or friendship

BUFFALO, N.Y. - While researchers have suggested that individuals who base their self-worth on their financial success often feel lonely in everyday life, a newly published study by the University at Buffalo and Harvard Business School has taken initial steps to better understand why this link exists.

"When people base their self-worth on financial success, they experience feelings of pressure and a lack of autonomy, which are associated with negative social outcomes," says Lora Park, an associate professor of psychology at UB and one of the paper's co-authors.

"Feeling that pressure to achieve financial goals means we're putting ourselves to work at the cost of spending time with loved ones, and it's that lack of time spent with people close to us that's associated with feeling lonely and disconnected," says Deborah Ward, a UB graduate student and adjunct faculty member at the UB's psychology department who led the research on a team that also included Ashley Whillans, an assistant professor at Harvard Business School, Kristin Naragon-Gainey, at the University of Western Australia, and Han Young Jung, a former UB graduate student.

The findings, published in the journal Personality and Social Psychology Bulletin, emphasize the role of social networks and personal relationships in maintaining good mental health and why people should preserve those connections, even in the face of obstacles or pursuing challenging goals.

"Depression and anxiety are tied to isolation, and we're certainly seeing this now with the difficulties we have connecting with friends during the COVID-19 pandemic," says Ward. "These social connections are important. We need them as humans in order to feel secure, to feel mentally healthy and happy. But much of what's required to achieve success in the financial domain comes at the expense of spending time with family and friends."

Ward says it's neither financial success itself nor the desire for money that's problematic; it's basing one's self-worth on financial success that drives these associations.

At the center of this research is a concept psychologists identify as Financial Contingency of Self-Worth. When people's self-worth is contingent on money, they view their financial success as being tied to the core of who they are as a person. The degree to which they succeed financially relates to how they feel about themselves -- feeling good when they think they're doing well financially, but feeling worthless if they're feeling financially insecure.

The research involved more than 2,500 participants across five studies that examined relationships between financial contingency of self-worth and key variables such as time spent with others, loneliness and social disconnection. These included a daily diary study that followed participants over a two-week period, assessing their day-to-day feelings about the importance of money and the time they spent in various social activities.

"We saw consistent associations between valuing money in terms of who you are and experiencing negative social outcomes in previous work, so this led us to ask the question of why these associations are present," says Ward. "We see these findings as further evidence that people who base their self-worth on money are likely to feel pressured to achieve financial success, which is tied to the quality of their relationships with others."

Ward says the current study represents the beginning of efforts to uncover the processes at work with Financial Contingency of Self-Worth.

"I hope this is part of what becomes a longer line of research looking at the mechanisms between valuing money and social-related variables," says Ward. "We don't have the final answer, but there is a lot of evidence that pressures are largely playing a role."

Credit: 
University at Buffalo