Study shows experimental drug can encourage bone growth in children with dwarfism

image: Chondrocytes. (Image: Getty Images)

Researchers at Johns Hopkins Medicine, the Murdoch Children's Research Institute in Australia and seven other medical institutions report that an experimental drug called vosoritide, which interferes with certain proteins that block bone growth, increased the average annual growth rate in a study of 35 children and teenagers with achondroplasia, a form of dwarfism. The patients' growth rate rose to about 6 centimeters (2.4 inches) per year, close to growth rates among children of average stature, and the side effects of the drug were mostly mild, according to the researchers.

Results of the four-year study are summarized online June 18 in the New England Journal of Medicine.

"An increase in the annual growth rate alone may have a positive effect on some patients' quality of life. For other patients, now and in the future, our hope is that the altered bone growth throughout the body could ease such problems as sleep apnea, neurological and leg and back problems, and improve their quality of life," says Julie Hoover-Fong, M.D., Ph.D., associate professor and director of the Greenberg Center for Skeletal Dysplasias at the Johns Hopkins McKusick-Nathans Institute of Genetic Medicine. "Right now, the results of the study show an impact on growth, and this effect is sustained, at least over nearly four years in this trial. The potential long-term benefit will take more time to observe."

Achondroplasia, although rare, is the most common form of dwarfism worldwide, affecting an estimated 1 in 15,000-40,000 live births. The condition is caused by mutations in a gene called FGFR3 that result in the excess production of proteins that slow bone growth. The disorder is marked by disproportionate short stature with shortened limbs, near normal-sized torsos and enlarged heads.

About 20% of people with achondroplasia inherit the mutations, meaning most children born with it have parents of average height. People with achondroplasia are prone to develop sleep apnea, chronic ear infections, neurological problems, spinal stenosis and bowed legs, frequently requiring surgical treatments to relieve pain and other symptoms.

"About half of these children will need spinal or other surgery, and this can mean a lot of time away from school as the child recovers and rehabilitates after surgery, which can affect important social connections," says Ravi Savarirayan, M.B.B.S., M.D., clinical geneticist and group leader of skeletal biology and disease at Murdoch Children's Research Institute.

No treatment can reverse achondroplasia or address the genetic culprit itself. Growth hormone has been approved to treat the condition in Japan and is occasionally used off label elsewhere, but it is not considered very effective in achondroplasia.

Vosoritide is a synthetic version of a protein present in humans called C-type natriuretic peptide. It is designed to bind to a specific receptor on the surface of chondrocytes, a type of cartilage cell found in the growth plates of bones. Once joined, the vosoritide-receptor connection sends a signal inside the chondrocyte to stanch the flow of growth-blocking signals triggered by the mutation in the FGFR3 gene.

"This is the first therapeutic option that targets the molecular cause of the condition," says Hoover-Fong.

For the currently reported study, conducted between January 2014 and July 2018, investigators enrolled 35 children ages 5-14 and followed them for a median of 42 months. There were 19 girls and 16 boys, including two Hispanic, seven Asian and two African American participants.

To enroll in the study, participants had previously been monitored for six months to determine their baseline growth rate. Then, for six months, participants received a daily vosoritide injection under the skin at doses ranging from 2.5-30 micrograms/kilogram. Over the four-year study, five participants left the trial: two discontinued because of anxiety about the daily injections, one left because their growth plates closed, and another withdrew for unknown reasons. The fifth was withdrawn when a routine electrocardiogram revealed Wolff-Parkinson-White syndrome, a disorder of the heart's electrical circuitry that causes rapid heartbeat.

At the end of six months, 30 participants opted to continue taking vosoritide. Participants who had earlier received the lowest doses of the drug received increased dosages, up to 15 micrograms/kg, and the other participants continued to receive their initial doses of 15 and 30 micrograms/kg. One of the 30 patients withdrew from the study to undergo limb-lengthening surgery.

Overall, the researchers found that, on average, the children's annual growth rate increased from below 4 centimeters per year to just below 6 centimeters per year. "They grew nearly 2 centimeters more, on average, per year, and this rate comes close to the annual growth of average stature people," says Hoover-Fong.

The research team also found that the growth rate increase was sustained over the nearly four-year study. Moving forward, Hoover-Fong says, all the remaining study participants will receive vosoritide until they reach their final adult height or choose to withdraw from the study. The sweet spot for vosoritide dosing appears to be 15 micrograms/kilogram, she says, as participants who received 30 micrograms/kilogram gained no additional growth velocity over those who received 15.

All 35 study participants had at least one side effect, considered mild and reversible, including injection site pain, swelling, headache, cough and low-grade fever. One child was admitted to a hospital for surgery to remove their tonsils and adenoids, which is common in achondroplasia to treat sleep apnea, and another had enlarged tonsils. One child had a congenital cyst in the neck and another had a fluid-filled cyst in the spinal cord.

"Importantly, none of the children experienced an anaphylactic reaction to the drug and none developed a low blood pressure problem that required medical intervention, which was a concern with this type of drug," says Hoover-Fong.

Some of the participants were averse to needle sticks, but this became less of a problem as the study went on, she says.

There are ongoing clinical trials of vosoritide, including a randomized, double-blind, placebo-controlled study in more than 100 children and teenagers with achondroplasia worldwide. It is expected to end in 2019, and a global study of the drug in 70 children younger than 5 years is ongoing.

At least four other experimental drugs that target different molecular bone growth receptors or pathways are currently being tested in children and teenagers with achondroplasia at Johns Hopkins and elsewhere.

The cost of vosoritide has not been determined.

Credit: 
Johns Hopkins Medicine

CNIO researchers describe new functions of protein that plays key role in some tumors and rare diseases

image: Two mouse embryonic stem cells in which two Polycomb regions from different chromosomes (HoxC, green; HoxD, red) appear next to each other (arrows). Cohesin-SA2 promotes this type of contact between distant Polycomb regions, while cohesin-SA1 does the opposite. (Image: Molecular Cytogenetics Unit, CNIO)

Cohesin is a protein complex that is essential for chromosome segregation in dividing cells. Recent evidence suggests that it also plays an important role in 3D genome architecture, which folds like an origami and regulates essential cellular processes that include gene expression, DNA replication and DNA repair. Cohesin mutations have been identified in some types of cancer and in rare diseases referred to as cohesinopathies. In a paper published in Cell Reports, the Chromosome Dynamics Group at the Spanish National Cancer Research Centre (CNIO), led by Ana Losada, describes new functions of cohesin in mouse embryonic stem cells that might help understand and address the causes of these disorders.

Cohesins and pluripotency

Cohesin has two variants, containing either SA1 or SA2. While the gene that encodes SA2 shows high mutation rates in certain types of cancer, such as bladder cancer, acute myeloid leukaemia and Ewing sarcoma, mutation rates of SA1 are much lower.

In 2018, a study conducted by the Losada-led Group and the team headed by Marc Marti-Renom at the National Centre for Genomic Analysis-Centre for Genomic Regulation (CNAG-CRG) revealed the distinct roles of each variant in human epithelial cells. Now, the researchers at CNIO have taken a further step to determine what role each variant plays in the peculiar genome architecture of embryonic stem cells, that is, the pluripotent cells that can produce all the cell types of the adult organism.

Studying mouse embryonic stem cells, which are similar to human embryonic stem cells, Losada and her team at CNIO now report in Cell Reports that while cohesin-SA1 contributes to delimiting the distinct regions (TADs) into which the genome is organised, cohesin-SA2 helps to regulate the expression of genes that maintain the pluripotency of stem cells.

"Confirming that what we observed in human cells also occurs in such different cell types as mouse embryonic stem cells was really important to us," says Losada, corresponding author of the research paper. Moreover, the study provides evidence for a novel role of cohesin in the structure of embryonic stem cells. "We have shown for the first time how cohesin contributes to the 3D organization of Polycomb domains," she adds.

In the spotlight of cancer research

Polycomb domains are 3D genome structures that exist only in embryonic stem cells, like an extra fold in the genome's origami that is missing in differentiated tissue cells. These regions, characterised by the presence of Polycomb complexes, are essential to prevent the expression of genes encoding proteins with specific functions in embryonic development. Polycomb complexes, then, are involved in the repression of genes that would otherwise help initiate cell differentiation. The Cell Reports study revealed that cohesin-SA2 is present in these regions too, where it retains one of the proteins of the Polycomb complex. "When the levels of cohesin-SA2 in cells decrease, Polycomb protein levels in relevant chromatin domains decrease as well," says Ana Cuadrado, first co-author and co-corresponding author of the study. "Consequently, chromosome folding loosens and the expression of tissue-specific genes is not properly silenced. Stem cells are no longer pluripotent and cannot work properly."

In addition, Polycomb domains in distant chromosome regions, even in different chromosomes, interact with each other and in so doing facilitate gene repression. In retaining Polycomb proteins, cohesin-SA2 promotes such interactions. "We did not expect cohesin-SA1 to have just the opposite function, preventing those contacts," says Daniel Giménez, bioinformatics expert at the Chromosome Dynamics Group and first co-author of the research paper.

Further study is required to understand the link between cohesin and Polycomb complexes in differentiated cells and its probable contribution to the emergence of cancer cells. "We believe that these lines of research will help us understand the role of cohesin mutations in cancer development and cohesinopathies, like Cornelia de Lange syndrome," Losada concludes.

This study has been performed in collaboration with the group of Marc Marti-Renom at CNAG-CRG and the CNIO Bioinformatics Unit. It has been funded by the Ministry of Science, Innovation and Universities, the National Institute of Health Carlos III, FEDER, the Community of Madrid, the European Research Council, Horizon 2020 and AGAUR.

Reference article: Ana Cuadrado et al., "Specific contributions of cohesin-SA1 and cohesin-SA2 to TADs and Polycomb domains in embryonic stem cells," Cell Reports (2019). DOI: 10.1016/j.celrep.2019.05.078

Credit: 
Centro Nacional de Investigaciones Oncológicas (CNIO)

Antidepressants can reduce empathy for pain

Depression is a disorder that often comes along with strong impairments of social functioning. Until recently, researchers assumed that acute episodes of depression also impair empathy, an essential skill for successful social interactions and for understanding others. However, previous research had mostly been carried out in groups of patients who were on antidepressant medication. New findings from an interdisciplinary collaboration involving social neuroscientists, neuroimaging experts and psychiatrists from the University of Vienna and the Medical University of Vienna show that it is antidepressant treatment, rather than the state of depression itself, that can impair empathy for the pain of others. The results of the study have been published in the scientific journal Translational Psychiatry.

An interdisciplinary research team jointly led by Prof. Claus Lamm (Department of Basic Psychological Research and Research Methods, University of Vienna), Prof. Rupert Lanzenberger (Department of Psychiatry and Psychotherapy, Medical University of Vienna) and Prof. Christian Windischberger (Center for Medical Physics and Bioengineering, Medical University of Vienna) set out to disentangle the effects of acute depressive episodes and antidepressant treatment on empathy. The research was performed within the research cluster "Multimodal Neuroimaging in Clinical Neurosciences", an intramural research initiative aimed at translational collaborations between researchers at the University of Vienna and the Medical University of Vienna. The researchers recruited unmedicated patients with acute depression and tested their empathic responses to the pain of others twice: first during an acute depressive episode, before they had received any medication, and again after three months of psychopharmacological treatment with antidepressants (mostly selective serotonin reuptake inhibitors).

In both sessions, patients underwent functional magnetic resonance imaging while watching videos of people undergoing painful medical procedures. Their brain activity and self-reported empathy were compared to those of a group of healthy controls. Before treatment, patients and controls responded in a comparable way. After three months of antidepressant treatment, the research revealed relevant differences: patients reported their level of empathy to be lower, and brain activation was reduced in areas previously associated with empathy.

First author Markus Rütgen underlines that reduced empathic responses were not caused by a general dampening of negative emotions: "The lowered emotional impact of negative events in a social context possibly allows patients to recover more easily. Nevertheless, the actual impact of reduced empathy on patients' social behavior remains to be explored."

Credit: 
University of Vienna

New methods from materials science find their way into cancer research

image: Illustration of the two perspectives: the biological view on the left, focusing on organelles, membranes, genes and proteins; and the materials science view on the right, where hydrogen atoms and the properties of water dominate. Combining both offers opportunities for developing new, dedicated treatments. (Image: Murillo Longo Martins)

A new study on the behavior of water in cancer cells shows how methods usually limited to physics can be of great use in cancer research. The researchers, Murillo Longo Martins and Heloisa N. Bordallo at the Niels Bohr Institute, University of Copenhagen, have shown how advanced methods in materials analysis - a combination of neutron scattering and thermal analysis - can be used to map the properties of water in breast cancer cells. This pilot work shows how the mobility of water molecules confined in cancer cells changes when the cells are treated with a chemotherapy drug. The proposed methodology holds potential for improving disease diagnosis and could guide refinements in cancer treatment, one of the biggest challenges in medical research. The results are published in Scientific Reports.

Comparing cancer cells before and after treatment

When cancer is treated with chemotherapy, the drug is usually introduced into the body via the bloodstream, from where it spreads through the entire system and makes its way to the cancer cells. The effect of the drug depends on many factors; for example, the action of the drug alters the properties of intracellular water. The role of water in the development or remission of tumors is likely bigger than considered so far, and this new perspective could prove instrumental in mapping a tumor's precise development by comparing analyses before and after treatment.

Understanding water and its properties - a common denominator for all cancer cells - is vital

Because water is the main component of the cell, understanding how its properties change during cancer treatment is vital. Cancer cells respond differently to different kinds of treatment, so an unorthodox analysis of the cell's main component - its composition and behavior - using techniques from materials science could be a common denominator in developing new treatments for individual patients. Murillo Longo Martins, who has been working in this field during his PhD and postdoc at the Niels Bohr Institute, explains: "Our findings indicate that, in the future, drugs can be developed focusing on modifying the properties of cellular water to achieve specific outcomes. In a shorter term, understanding the dynamics of cellular water may provide complementary knowledge about, for example, why some types of cancers respond differently to certain treatments than others".

Unorthodoxy as a method

While physicians and biologists perceive cells as an ensemble of membranes, organelles, genes and other biological components, physicists combining sophisticated neutron scattering techniques with thermal analysis are able to characterize water dynamics in the cell very precisely. The researchers at the Niels Bohr Institute have now shown the value of building a communication interface between these two distinct visions. Because of its unorthodox approach, their work can open new areas of inquiry, and the result is expected to stimulate future collaborations between distinct scientific communities and further incentivize the use of materials science approaches in investigating biological matter.

Credit: 
University of Copenhagen

IDIBELL researchers relate the amplification of a chromosomal region with resistance to a chemotherapeutic drug in breast cancer

image: Emergence of docetaxel resistance. (Image: Eva González Suárez)

Researchers at the Bellvitge Biomedical Research Institute (IDIBELL), with the participation of collaborators from the Baylor College of Medicine (Houston) and the University Institute of Oncology of Asturias (IUOPA), publish today in Cancer Research a study where they relate the high number of copies of a chromosomal region with the appearance of resistance to a chemotherapeutic drug. The research was led by Dr. Eva González-Suárez, head of the Transformation and Metastasis group at IDIBELL.

Nowadays, chemotherapy, despite its side effects, remains the most effective treatment to fight cancer. One of the most widely used chemotherapy drugs, and the subject of this study, is docetaxel, a chemical compound that acts on tumor cells by preventing their proper division. The research project focused on the most aggressive subtype of breast cancer: triple-negative breast cancer (TNBC). This subgroup is quite heterogeneous (which is why there are no targeted therapies to combat it) and is usually associated with poor prognosis. Despite promising initial responses to chemotherapy, resistance to the drug often develops during treatment. One of the challenges oncologists face is selecting the chemotherapy drug that will benefit a given patient with triple-negative breast cancer; in most cases, this selection is made arbitrarily.

To carry out this study, the scientists worked with patient-derived xenografts (PDXs). These PDXs are animal models (mice) in which tumor cells from a patient have been implanted, so that the tumor sample is much more representative than a conventional cell culture. These models can be used to test the efficacy of drugs as well as to study how chemoresistance emerges, that is, to understand how tumor cells become unresponsive to treatments.

Analyzing the effect of docetaxel on triple-negative breast cancer in PDXs, the researchers observed that, as in patients, resistance to docetaxel emerges during treatment. They compared the genomes of matched docetaxel-sensitive tumors and their counterparts that developed resistance upon continuous drug exposure, and identified an increase in the number of copies of a region of chromosome 12, called chr12p, in docetaxel-resistant tumors and even after short treatments with the drug. These results imply that a subpopulation of docetaxel-resistant tumor cells is present in the tumors and survives the drug, unlike the other cells, which die during treatment.

In addition, the researchers discovered that this docetaxel-resistant subpopulation with multiple copies of chr12p is very vulnerable to treatment with another chemotherapeutic drug, carboplatin, which is why Dr. González-Suárez proposes "applying a sequential treatment that combines first docetaxel and then carboplatin, instead of using both drugs individually or simultaneously as is currently done."

"We have associated the presence of this amplified chr12p chromosome region to emergence of docetaxel resistance and carboplatin vulnerability," says Dr. Eva González-Suárez. "We propose that the copy number of chr12p is considered as a biomarker to predict whether patients' tumors will develop, or not, docetaxel resistance; and what is even more important, once docetaxel resistance emerges, to have an alternative drug to treat the patients, carboplatin.

This discovery could represent the first description of a biomarker for the selection of the chemotherapy drug and the sequence of treatments that could benefit patients with triple negative breast cancer.

Credit: 
IDIBELL-Bellvitge Biomedical Research Institute

Changing how we predict coral bleaching

image: A healthy reef (left) compared to a dead coral reef, which has undergone bleaching. (Image: © 2019 Charlotte Young)

Coral bleaching events may occur more frequently in the Red Sea than previously thought, according to an algorithm developed by KAUST researchers. Their findings also indicate that the northern part of the Red Sea might not remain a thermal refuge for coral ecosystems for long.

Ocean modeling expert Ibrahim Hoteit and colleagues used more than 30 years of satellite data on Red Sea surface temperatures to develop an algorithm that successfully isolated every extreme warming event that led to documented coral bleaching in the Red Sea. Their approach suggests that coral bleaching in the Red Sea may be greatly underestimated.

When exposed to unusually high sea surface temperatures for prolonged periods, corals expel the marine algae that live within them. Because these algae serve as the corals' main energy source, in their absence, coral colonies turn a deadly looking white, a phenomenon known as coral bleaching. If the adverse conditions continue, it is difficult for the corals to regain the algae and so they tend to die, in turn affecting the coral reef ecosystem that depends on them for survival.

Red Sea surface temperatures are among the warmest in the world, and its corals are thought to be among the most heat tolerant. But Red Sea corals are poorly monitored, and therefore, little is known about the true extent of their damage due to rising temperatures.

"It is important to detect bleaching-prone regions in the Red Sea because this allows us to optimize the sustainable management of the coastline by identifying the areas most in need of mitigation plans to reduce the stress on corals," says Ph.D. student Lily Genevier.

Currently, scientists use a measure called degree heating weeks to assess the accumulation of heat stress: the duration and extent to which temperatures exceed the bleaching threshold, defined as 1 degree Celsius above the highest summertime mean of sea surface temperatures. But this method has both overestimated and missed Red Sea bleaching events. "This may be because this method is not efficient at detecting unusual warming in cooler periods," Genevier explains.
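
To make the two metrics concrete, here is a rough sketch of a degree-heating-weeks calculation following the description above. It is an illustration of the general metric, not the KAUST code; the daily temperature series and the maximum summertime mean (mmm) are assumed inputs.

```python
# Rough sketch of degree heating weeks (DHW): accumulate daily exceedances
# of the bleaching threshold (1 degree C above the summertime maximum mean,
# `mmm`, a hypothetical constant here) over a trailing 12-week window.
import numpy as np

def degree_heating_weeks(sst, mmm, window_days=84):
    # daily excess above the bleaching threshold, in degrees C
    hotspot = np.clip(sst - (mmm + 1.0), 0.0, None)
    # trailing 84-day sums, converted to degree C-weeks
    accum = np.convolve(hotspot, np.ones(window_days))[:len(sst)]
    return accum / 7.0
```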

Marine heatwaves, on the other hand, are calculated by pooling sea surface temperatures around each day of the year. KAUST researchers fine-tuned this approach by adapting it to the environmental conditions that have led to documented Red Sea coral bleaching events.

They found that bleaching occurred during summer marine heatwaves where sea surface temperatures remained in the top five percent for at least a week. "Since the marine heatwave threshold is based on a percentile, it follows seasonality, meaning it can detect extreme anomalous heating during cooler summer periods," Genevier explains. They also found that all documented bleaching events happened during marine heatwaves with a mean sea surface temperature of 30 degrees Celsius or higher.
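
A percentile-based detector of this kind can be sketched in a few lines. The following illustration (again, not the team's actual algorithm) flags runs of at least a week in which daily sea surface temperature exceeds its day-of-year 95th percentile; the array names and window sizes are assumptions for the example.

```python
# Minimal sketch of percentile-based marine heatwave detection. `sst` is a
# 1-D daily temperature array and `doy` gives each entry's day of year.
import numpy as np

def climatological_threshold(sst, doy, pct=95, half_window=5):
    """95th-percentile threshold for each calendar day, pooling values in
    an 11-day window around that day across all years."""
    thresh = np.empty(366)
    for d in range(1, 367):
        # circular day-of-year window around day d
        offsets = (np.arange(d - half_window, d + half_window + 1) - 1) % 366 + 1
        pool = sst[np.isin(doy, offsets)]
        thresh[d - 1] = np.percentile(pool, pct) if pool.size else np.nan
    return thresh

def heatwave_events(sst, doy, min_days=7):
    """Runs of at least `min_days` consecutive days above the threshold."""
    hot = sst > climatological_threshold(sst, doy)[doy - 1]
    events, start = [], None
    for i, flag in enumerate(np.append(hot, False)):  # sentinel ends last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_days:
                events.append((start, i - 1))
            start = None
    return events
```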

The findings suggest that Red Sea coral bleaching may have been greatly underestimated. They also indicate an emerging pattern of extreme warming events in the northern region, which was previously thought to act as a thermal refuge for corals.

"Because this study was able to detect bleaching-prone areas using only the few known bleaching events in the Red Sea, we think it should be applied to other data-poor regions," says Genevier. The team is now working on implementing their methodology on a global scale by tuning marine heatwaves to bleaching conditions in other tropical marine ecosystems. They plan to upload their results onto the interactive online Red Sea Atlas being developed at KAUST.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Molecular switch for 'exhaustion mode' of immune cells discovered

image: Prof. Dietmar Zehn (right) with Francesca Alfei, first author of the new study on chronic immune responses, and staff member Markus Flosbach. (Image: D. Zehn / Technical University of Munich)

Tumors and certain viral infections pose a challenge that the human body's immune system typically fails to handle. In these diseases, it switches to a hypofunctional state that prevents adequate protection. A research team from the Technical University of Munich (TUM) has achieved a major success: they were able to identify the crucial molecular switch that triggers such dysfunctional immune responses. This could make it possible in the future to switch off or prevent this state.

Normally, the immune system goes into a state of maximum alert following a viral infection. It triggers the activation of a variety of immune cells such as T and B cells, which proliferate in large numbers and aggressively combat the infected cells. However, if the immune system does not manage to defeat the virus, immune cells with highly inhibited functions appear. This "exhaustion" of immune cells is triggered by the ongoing immune cell activation due to the virus. Yet this attenuation of immune responses is also a protective move for the body, as a persisting strong immune response would be a significant burden and a major cause of damage to cells and tissue. The downside is that the deactivation of immune responses may also permit massive growth of tumors.

The search for a mechanism

The declared objective of tumor and infection research is to control or prevent the switching from a normal to a dysfunctional immune response. Dietmar Zehn, Professor for Animal Physiology and Immunology at the TUM School of Life Sciences Weihenstephan, has for years been interested in this kind of chronic immune response and the underlying molecular deactivation mechanisms.

"They represent the body's compromise between the damage that would be caused by an ongoing strong immune response and the actual illness itself. We're fascinated by these mechanisms for several reasons: Beside chronical infections, they also occur with tumors, have been mechanistically poorly understood in the past, and in 2018 the Nobel Prize for Medicine went to exactly this topic," says Zehn, describing the significance of the field. "In particular the demonstration by colleagues at the university hospital in Freiburg that Tox correlates with T cell dysfunction also in patients with chronic hepatitis C infections underlines the medical relevance of the observations we made."

Protein Tox switches on "exhaustion mode"

Until now, it was only vaguely understood how the body switches on and regulates these immune responses. Zehn and his team identified the deciding factor at the same time as two groups in the USA. The study was published in the scientific journal Nature.

The protein Tox is the decisive molecular switch. Using mouse and cell culture models together with patient samples, the scientists found out that the protein has an effect inside the cell nucleus, where it activates a genetic program that alters immune cell function. As a result, inhibitory surface receptors appear on the surface of the immune cells. The cells are thus open to inhibitory signals, ensuring that the cells "tire", functioning less effectively or even dying off.

Useful in a variety of different therapies

"It is enormously important that we have finally decoded these molecular processes: This is the absolute prerequisite for targeted modification of the processes. Controlling Tox could make it possible to reactivate weak immune responses, which would be interesting for example in fighting tumors, or to slow excessive immune responses like those found in autoimmune diseases," says Dietmar Zehn.

Credit: 
Technical University of Munich (TUM)

New study shows how environmental disruptions affected ancient societies

LSU College of the Coast & Environment Distinguished Professor Emeritus John Day has collaborated with archeologists on a new analysis of societal development. They report that over the past 10,000 years, humanity has experienced a number of foundational transitions, or "bottlenecks." During these periods of transition, the advance or decline of societies was related to energy availability in the form of a benign climate and other factors.

"Studying the factors that led to the advancement and contraction of past societies provides insight into how our globalized society might become more or less sustainable," Day said.

Day's collaborators include Joel Gunn of the University of North Carolina at Greensboro, William Folan of the Universidad Autonoma de Campeche in Mexico and Matthew Moerschbaecher of the Louisiana Oil Spill Coordinators Office. Gunn and Folan are Mayan archeologists and Moerschbaecher is a graduate of LSU's oceanography program.

With the human population having exceeded the capacity of Earth's resources, this analysis suggests that a transition toward sustainability for the current energy-dense, globalized industrial society will be very difficult if not impossible without dramatic changes.

The authors say that these past transitions were caused by a combination of social, astronomical and biogeophysical events such as volcanic eruptions, changes in solar emissions, sea-level rise and ice volume, biogeochemical and ecological changes, and major social and technological innovations. One example is the worldwide crisis that began in 536 AD, which was caused by three major volcanic eruptions within a decade. This event led to the loss of half the population of Europe through plague (the Justinianic plague), starvation and wars. In China and the Mayan region, it led to crop failures, famine and plagues.

They found that when energy was abundant, societies expanded and prospered. Conversely, when energy sources declined, societies contracted and collapsed. This history suggests that future changes are more likely to arise from planetary-scale disturbances and constraints, whether societal or environmental, and will likely bring strong societal disruptions.

However, in the past, major changes sometimes moved toward a more sustainable social organization. For example, after one disruption, the Mayans switched to a more efficient use of energy and marine transportation and, at the time of European contact, they were leading a sustainable lifestyle.

Credit: 
Louisiana State University

Automated cryptocode generator is helping secure the web

Nearly every time you open up a secure Google Chrome browser, a new MIT-developed cryptographic system is helping better protect your data.

In a paper presented at the recent IEEE Symposium on Security and Privacy, MIT researchers detail a system that, for the first time, automatically generates optimized cryptography code that's usually written by hand. Deployed in early 2018, the system is now being widely used by Google and other tech firms.

The paper now demonstrates for other researchers in the field how automated methods can be implemented to prevent human-made errors in generating cryptocode, and how key adjustments to components of the system can help achieve higher performance.

To secure online communications, cryptographic protocols run mathematical algorithms that perform complex arithmetic on large numbers. Behind the scenes, however, a small group of experts write and rewrite those algorithms by hand. For each algorithm, they must weigh various mathematical techniques and chip architectures to optimize for performance. When the underlying math or architecture changes, they essentially start over from scratch. Apart from being labor-intensive, this manual process can produce nonoptimal algorithms and often introduces bugs that are later caught and fixed.

Researchers from the Computer Science and Artificial Intelligence Laboratory (CSAIL) instead designed "Fiat Cryptography," a system that automatically generates - and simultaneously verifies - optimized cryptographic algorithms for all hardware platforms. In tests, the researchers found their system can generate algorithms that match the performance of the best handwritten code, but can produce them much faster.

The researchers' automatically generated code has populated Google's BoringSSL, an open-source cryptographic library. Google Chrome, Android apps, and other programs use BoringSSL to generate the various keys and certificates used to encrypt and decrypt data. According to the researchers, about 90 percent of secure Chrome communications currently run their code.

"Cryptography is implemented by doing arithmetic on large numbers. [Fiat Cryptography] makes it more straightforward to implement the mathematical algorithms ... because we automate the construction of the code and provide proofs that the code is correct," says paper co-author Adam Chlipala, a CSAIL researcher and associate professor of electrical engineering and computer science and head of the Programming Languages and Verification group. "It's basically like taking a process that ran in human brains and understanding it well enough to write code that mimics that process."

Joining Chlipala on the paper are: first author Andres Erbsen and co-authors Jade Philipoom and Jason Gross, who are all CSAIL graduate students; as well as Robert Sloan MEng '17.

Splitting the bits

Cryptography protocols use mathematical algorithms to generate public and private keys, which are essentially long strings of bits. Algorithms use these keys to provide secure communication channels between a browser and a server. One of the most popular efficient and secure families of cryptographic algorithms is elliptic curve cryptography (ECC). Basically, it generates keys of various sizes for users by choosing numerical points at random along a curve on a graph.

Most chips can't store such large numbers in one place, so they briefly split them into smaller digits that are stored on units called registers. But the number of registers and the amount of storage they provide varies from one chip to another. "You have to split the bits across a bunch of different places, but it turns out that how you split the bits has different performance consequences," Chlipala says.
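
To illustrate what bit-splitting means in practice, here is a small sketch (not Fiat Cryptography's generated code) of one common choice: representing a 255-bit number as five 51-bit "limbs", so that each limb fits in a 64-bit register with headroom for carries during arithmetic.

```python
# Illustrative bit-splitting: a 255-bit integer stored as five 51-bit limbs,
# little-endian. The radix and limb count are one choice among many; other
# architectures favor different splits.
RADIX = 51
LIMBS = 5
MASK = (1 << RADIX) - 1

def to_limbs(x: int) -> list[int]:
    """Split a non-negative integer < 2**255 into 51-bit limbs."""
    return [(x >> (RADIX * i)) & MASK for i in range(LIMBS)]

def from_limbs(limbs: list[int]) -> int:
    """Recombine limbs into a single integer."""
    return sum(v << (RADIX * i) for i, v in enumerate(limbs))

x = (1 << 254) + 12345
assert from_limbs(to_limbs(x)) == x  # round-trips exactly
```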

Traditionally, experts writing ECC algorithms manually implement those bit-splitting decisions in their code. In their work, the MIT researchers leveraged those human decisions to automatically generate a library of optimized ECC algorithms for any hardware.

The researchers first explored existing handwritten implementations of ECC algorithms, in the C programming language and in assembly, and transferred those techniques into their code library, generating a list of best-performing algorithms for each architecture. The system then uses a compiler - a program that converts programming languages into code computers understand - that has been proven correct with the proof assistant Coq, so all code produced by that compiler is mathematically verified. It then simulates each algorithm and selects the best-performing one for each chip architecture.

Next, the researchers are working on ways to make their compiler run even faster in searching for optimized algorithms.

Optimized compiling

There's one additional innovation that ensures the system quickly selects the best bit-splitting implementations. The researchers equipped their Coq-based compiler with an optimization technique, called "partial evaluation," which basically precomputes certain variables to speed things up during computation.
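
The idea can be conveyed with a toy example outside cryptography (an analogy, not the researchers' compiler): by specializing exponentiation to an exponent known in advance, the bit decomposition of the exponent is computed once rather than on every call.

```python
# Toy illustration of partial evaluation: the exponent n is known ahead of
# time, so its square-and-multiply schedule is precomputed once, and the
# returned function only executes that schedule.
def make_pow(n: int):
    bits = bin(n)[3:]          # bits of n after the leading 1 (n >= 1)
    def pow_n(x: int) -> int:
        acc = x
        for b in bits:
            acc *= acc         # square for every bit
            if b == "1":
                acc *= x       # multiply when the bit is set
        return acc
    return pow_n

cube = make_pow(3)             # specialized once...
assert cube(7) == 343          # ...then reused cheaply
```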

In the researchers' system, it precomputes all the bit-splitting methods. When matching them to a given chip architecture, it immediately discards all algorithms that just won't work for that architecture. This dramatically reduces the time it takes to search the library. After the system zeroes in on the optimal algorithm, it finalizes the code compiling.

From that, the researchers then amassed a library of best ways to split ECC algorithms for a variety of chip architectures. It's now implemented in BoringSSL, so users are mostly drawing from the researchers' code. The library can be automatically updated similarly for new architectures and new types of math.

"We've essentially written a library that, once and for all, is correct for every way you can possibly split numbers," Chlipala says. "You can automatically explore the space of possible representations of the large numbers, compile each representation to measure the performance, and take whichever one runs fastest for a given scenario."

Credit: 
Massachusetts Institute of Technology

Scientists reveal reversible super-glue inspired by snail mucus

Snails secrete a mucus that acts like super-glue, allowing them to adhere to rough surfaces like rocks.

Inspired by this aspect of snail biology, scientists at University of Pennsylvania, Lehigh University and the Korea Institute of Science and Technology have created a super-glue-like material that is "intrinsically reversible." In other words, it can easily come unglued.

Adhesives are everywhere in daily life and in industrial applications. Achieving both strong adhesion and reversibility (the ability to reverse the adhesion) is challenging. According to Anand Jagota, professor and founding chair of Lehigh University's Department of Bioengineering, this is especially true of hydrogels, which are 90% water.

He says that adhesives usually fall into one of two classes: strong but irreversible, like superglues, or reversible and reusable but weak.

The team has managed to overcome these limitations. They have reported their findings in a paper published today in Proceedings of the National Academy of Sciences called "Intrinsically reversible superglues via shape adaptation inspired by snail epiphragm."

"We report a hydrogel-based, reversible, superglue-like adhesive by combining the benefits of both liquid and dry adhesives in a single material," says Jagota.

The team reports that, when hydrated, the softened gel they created conformally adapts to the target surface by low-energy deformation, which is then locked in upon drying, in a manner similar to the action of the epiphragm of snails. An epiphragm is a temporary structure created by snails and mollusks. Made of dried mucus, it holds in moisture during periods of inactivity and enables snails to adhere to surfaces, such as rocks.

The scientists show that reversible super-strong adhesion can be achieved from a non-structured material when the criterion of shape adaption is met, with minimal residual strain energy stored in the system. According to the researchers, the new material can be applied to both flat and rough target surfaces.

"We demonstrate that in this system adhesion strength is based on the material's intrinsic, especially near-surface, properties and not on any near surface structure, providing reversibility and ease of scaling up for practical applications," adds Shu Yang, Professor of Materials Science and Engineering and Chemical and Biomolecular Engineering at the University of Pennsylvania and lead author.

Credit: 
Lehigh University

Cancer-sniffing dogs 97% accurate in identifying lung cancer, according to study in JAOA

CHICAGO, June 17, 2019 -- Three beagles successfully showed they are capable of identifying lung cancer by scent, a first step in identifying specific biomarkers for the disease. Researchers say the dogs' abilities may lead to development of effective, safe and inexpensive means for mass cancer screening.

After eight weeks of training, the beagles--chosen for their superior olfactory receptor genes--were able to distinguish between blood serum samples taken from patients with malignant lung cancer and healthy controls with 97% accuracy. The double-blind study is published in the July edition of The Journal of the American Osteopathic Association.

"We're using the dogs to sort through the layers of scent until we identify the tell-tale biomarkers," says Thomas Quinn, DO, professor at Lake Erie College of Osteopathic Medicine and lead author on this study. "There is still a great deal of work ahead, but we're making good progress."

The dogs were led into a room with blood serum samples at nose level. Some samples came from patients with non-small cell lung cancer; others were drawn from healthy controls. After thoroughly sniffing a sample, the dogs sat down to indicate a positive finding for cancer or moved on if none was detected.

Dr. Quinn and his team are nearing completion of a second iteration of the study. This time the dogs are working to identify lung, breast and colorectal cancer using samples of patients' breath, collected by having the patient breathe into a face mask. The researchers say findings suggest the dogs are just as effective at detecting cancer using this method.

The next step will be to further fractionate the samples based on chemical and physical properties, presenting them back to the dogs until the specific biomarkers for each cancer are identified. The goal is to develop an over-the-counter screening product, similar to a pregnancy test, in terms of cost, simplicity and availability. Dr. Quinn envisions a device that someone can breathe into and see a color change to indicate a positive or negative finding.

Early detection key

Lung cancer is the leading cause of cancer death worldwide for both women and men, and more than 200,000 people annually in the United States receive a diagnosis of lung cancer. The five-year survival rate for stage IA non-small cell lung cancer (NSCLC) is 92%. That drops to 13% in stage IIIC NSCLC, and after metastasis, the five-year survival rates range from 10% to less than 1%, depending on the stage.

Additionally, screening and imaging for lung cancer is costly and not always reliable. Chest X-rays have a high false-negative rate, while CT scans with computer-aided diagnosis have a high false-positive rate. Previous studies indicated that 90% of missed lung cancers occur when using chest X-rays, and CT scans have difficulty identifying small, central, juxtavascular lung cancers.

Dr. Quinn believes his research can lead to better screening and diagnosis solutions, potentially creating a change in cancer detection.

"Right now it appears dogs have a better natural ability to screen for cancer than our most advanced technology," says Dr. Quinn. "Once we figure out what they know and how, we may be able to catch up."

Credit: 
American Osteopathic Association

Columbia researcher studies how climate change affects crops in India

Kyle Davis is an environmental data scientist whose research seeks to increase food supplies in developing countries. He combines techniques from environmental science and data science to understand patterns in the global food system and develop strategies that make food-supply chains more nutritious and sustainable.

Since joining the Data Science Institute as a postdoctoral fellow in September 2018, Davis has co-authored four papers, all of which detail how developing countries can sustainably improve their crop production. For his latest study, he focuses on India, home to 1.3 billion people, where he led a team that studied the effects of climate on five major crops: finger millet, maize, pearl millet, sorghum and rice. These crops make up the vast majority of grain production during the June-to-September monsoon season - India's main growing period - with rice contributing three-quarters of the grain supply for the season. Taken together, the five grains are essential for meeting India's nutritional needs.

And in a paper published in Environmental Research Letters, Davis found that the yields of grains such as millet, sorghum and maize are more resilient to extreme weather: their yields vary significantly less with year-to-year changes in climate and generally experience smaller declines during droughts. Yields of rice, India's main crop, experience larger declines during extreme weather conditions. "By relying more and more on a single crop - rice - India's food supply is potentially vulnerable to the effects of varying climate," said Davis, the lead author of the paper, "Sensitivity of Grain Yields to Historical Climate Variability in India," written with four co-authors who all collaborated on the research.

"Expanding the area planted with these four alternative grains can reduce variations in Indian grain production caused by extreme climate, especially in the many places where their yields are comparable to rice," Davis added. "Doing so will mean that the food supply for the country's massive and growing population is less in jeopardy during times of drought or extreme weather."

Temperatures and rainfall amounts in India vary from year to year and influence the amount of crops that farmers can produce. And with episodes of extreme climate such as droughts and storms becoming more frequent, it's essential to find ways to protect India's crop production from these shocks, according to Davis.

The authors combined historical data on crop yields, temperature and rainfall. Data on the yields of each crop came from state agricultural ministries across India and covered 46 years (1966-2011) and 593 of India's 707 districts. The authors also used modelled data on temperature (from the University of East Anglia's Climatic Research Unit) and precipitation (derived from a network of rain gauges maintained by the Indian Meteorological Department). Using these climate variables as predictors of yield, they then employed a linear mixed effects modelling approach - similar to a multiple regression - to estimate whether there was a significant relationship between year-to-year variations in climate and crop yields.
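
For readers who want a feel for this setup, the sketch below shows a model of this general form in Python's statsmodels; the data file and column names are hypothetical stand-ins, not the authors' data or code.

```python
# Minimal sketch of a linear mixed-effects yield model using statsmodels.
# The file and column names (district, temp_anom, precip_anom, yield_t_ha)
# are hypothetical stand-ins for a district-level panel like the one above.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("district_yields.csv")  # hypothetical data file

# Fixed effects: year-to-year temperature and rainfall anomalies.
# Random intercepts per district absorb time-invariant local factors
# (soils, irrigation, management).
model = smf.mixedlm("yield_t_ha ~ temp_anom + precip_anom",
                    data=df, groups=df["district"])
result = model.fit()
print(result.summary())  # coefficients estimate yield sensitivity to climate
```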

"This study shows that diversifying the crops that a country grows can be an effective way to adapt its food-production systems to the growing influence of climate change," said Davis. "And it adds to the evidence that increasing the production of alternative grains in India can offer benefits for improving nutrition, for saving water, and for reducing energy demand and greenhouse gas emissions from agriculture."

Credit: 
Data Science Institute at Columbia

3D reconstruction of craniums elucidates the evolution of New World monkeys

image: Computed tomography scans of fossils from two extinct species point to evolutionary adaptations and kinship with extant howler, spider and woolly monkeys. (Image: André Menezes Strauss)

Cranial fossils belonging to two extinct species of monkey - Caipora bambuiorum and Cartelles coimbrafilhoi - were examined by computed tomography (CT) scan and reconstructed with three-dimensional imaging by a group of scientists from various countries.

The fossils were found almost 30 years ago in a cave complex in Bahia, Brazil, located in the Caatinga, a semiarid biome that occupies part of Brazil's Northeast Region.

The images were compared with those of craniums from 14 extant Central and South American primate species, enabling the researchers to identify adaptations and infer previously unknown relationships between the extinct and extant species.

"This is the first ever study of endocranial morphology involving fossils of New World monkeys, or platyrrhines," said André Menezes Strauss, a professor at the University of São Paulo's Archeology and Ethnology Museum (MAE) and an associate researcher affiliated with the Laboratory of Archeology and Environmental/Evolutionary Anthropology (LAAAE) at the university's Institute of Biosciences (IB-USP) in Brazil.

Using a cast taken from the inside of the cranium (braincase), paleoanthropologists analyzed endocranial morphology to estimate the shape and size of the brains of the fossil primates.

The results of the study, which was supported by São Paulo Research Foundation - FAPESP, are published in the American Journal of Physical Anthropology.

The researchers described cranial and endocranial shape variations in 14 species belonging to the four extant genera in the family Atelidae - Alouatta (howler monkeys), Ateles (spider monkeys), Brachyteles (woolly spider monkeys or muriquis) and Lagothrix (woolly monkeys) - as well as the extinct species C. bambuiorum and C. coimbrafilhoi. There are approximately 350 primate species in the world today. More than 200 are platyrrhines.

The study was led by Ivan Perez, an anthropologist at Argentina's La Plata Museum. His collaborators included Brazilian scientists affiliated with the University of São Paulo (USP) and the University of Campinas (UNICAMP), as well as researchers at institutions in Belgium, France, Germany and the United States. Cástor Cartelle, a paleontologist at the Pontifical Catholic University of Minas Gerais state (PUC-MG) for whom Cartelles coimbrafilhoi is named, was also a member of the research team.

In addition to FAPESP, Brazil's National Council for Scientific and Technological Development (CNPq) and Argentina's Scientific and Technological Research Fund (FONCYT) and National Scientific and Technical Research Council (CONICET) also supported the study.

The fossil specimens of C. bambuiorum and C. coimbrafilhoi are deposited at PUC-MG's Natural History Museum in Belo Horizonte, Minas Gerais. The 14 crania of the extant platyrrhines came from collections held by Argentina's La Plata Museum, Brazil's National Museum in Rio de Janeiro, the Argentinian Natural Science Museum in Buenos Aires, and the US National Museum of Natural History in Washington, DC.

"All 16 specimens were digitized using a medical CT scanner. A virtual 3D model of the endocranium was generated for each sample, and the 3D models of the cranial surfaces were extracted from the CT scan data," Strauss told.

The fossil specimens were damaged, particularly in the region of the zygomatic arches (cheekbones), so the researchers opted for two strategies to analyze them.

According to Strauss, in the case of C. bambuiorum, the right zygomatic arch was absent, but the left arch was intact.

"We reflected the undamaged arch to the damaged side in the 3D model, taking advantage of bilateral symmetry, and by means of this virtual repair, obtained a complete specimen," he said.

"In C. coimbrafilhoi, both sides were absent, so we used an imputation method to estimate the positions of the missing parts."

Perez digitized 26 anatomical landmarks and 373 semilandmarks along the curves and surfaces of each endocranium, as well as 64 landmarks and 196 semilandmarks on each cranium. In geometrical morphometrics, a landmark is a 2D or 3D point of evolutionary significance. Semilandmarks are defined by locations relative to other landmarks, e.g., midway between landmarks X and Y.

"The data served as a basis for multivariate analysis to compare all the characteristics of the 16 specimens and to look for similarities and differences that indicated morphological [and hence] adaptive patterns," Strauss said.

In other words, because the specimens of extant species of Atelidae, the largest New World monkeys, included crania of Alouatta, Ateles, Brachyteles and Lagothrix, they were compared with the specimens of the extinct species in order to find out whether the two fossils resembled and might be closely related to any of them.

3D craniums

Strauss said the data clearly showed that C. bambuiorum should be grouped with Ateles, Brachyteles and Lagothrix, all of which are distant from Alouatta.

This means that the genus Alouatta shares a common ancestor with the other genera of Atelidae and with C. bambuiorum and that it is older than the common ancestor shared by Ateles, Brachyteles and Lagothrix.

"However, when the position of C. bambuiorum is analyzed solely in relation to Ateles, Brachyteles and Lagothrix, the conclusion is that the fossil is clearly closest to Brachyteles," Strauss said.

"The hypothesis that C. bambuiorum was similar to a giant spider monkey (Ateles), initially posited by Cástor Cartelle 20 years ago, was refuted by our data, which showed that the extinct monkey was actually much more similar to a 'giant' muriqui (Brachyteles)."

In the case of C. coimbrafilhoi, the multivariate analysis produced some surprises. The first was that the data did not clearly group it with any of the four extant genera of Atelidae; instead, the specimen consistently filled the previously empty morphospace between Alouatta and the other three genera.

"With regard to the fossil species, we show that C. bambuiorum is positioned within the range of variation observed for Brachyteles, whereas C. coimbrafilhoi presents an endocranial shape that does not overlap with the range of variation observed for any of the extant Atelidae. Of the four genera, C. coimbrafilhoi is closest to Alouatta in endocranial morphospace but closest to Lagothrix in cranial terms," Strauss said.

"We found that when the size factor was removed, the characteristics of C. coimbrafilhoi were intermediate between Alouatta on one side, and Ateles, Brachyteles and Lagothrix on the other," Strauss said.

"Our results suggest that within the atelid clade, the extinction of C. bambuiorum and C. coimbrafilhoi led to a significant loss of biological variation that could not have been imagined with the discovery of these fossils," the article concludes.

Peter Lund and Cástor Cartelle

The entire lineage of New World primates, distributed from northern Argentina to Central America, the Caribbean and Mexico, descends from a single band of founders comprising small African monkeys believed to have crossed the former South Atlantic (then a third of its current size) some 45 million years ago on rafts of floating vegetation.

Research on monkey fossils in the Americas began in Brazil 183 years ago in 1836, when Danish naturalist Peter Wilhelm Lund discovered the remains of a much larger primate than all extant platyrrhines in a cave in the Lagoa Santa area of Minas Gerais.

Lund named this monkey Protopithecus brasiliensis, which means "Brazilian ancestral monkey". It became extinct over 10,000 years ago and was a close relative of Brachyteles, the largest extant platyrrhine.

Fossils found since then (mostly single teeth and fragments of mandibles, plus a few craniums) belonged to very ancient platyrrhines that became extinct millions or tens of millions of years ago in Argentinian Patagonia, high up in the Chilean Andes and Bolivian Altiplano, in the Amazon rainforest of Peru, and on several Caribbean islands.

Although more than 30 extinct platyrrhine species have now been described, P. brasiliensis has never lost its status as a giant much larger than the rest, with two exceptions.

Almost complete skeletons of two other giant platyrrhines were found in 1992 in an inner chamber of Toca da Boa Vista, Brazil's largest cave complex (in Bahia). They lived at the end of the Ice Age over 10,000 years ago.

Both fossils were studied at PUC-MG by Cartelle, who described one of the specimens as a giant spider monkey belonging to a new species he called Caipora bambuiorum. He identified the second specimen as a member of the species described by Lund (P. brasiliensis).

C. bambuiorum and P. brasiliensis belong to the family Atelidae. The largest extant atelid genus is Brachyteles, whose members can reach 15 kg, followed by Alouatta, which are typically 10 kg or less. C. bambuiorum and P. brasiliensis were some 60% larger than the biggest living atelids. When alive, they must have weighed over 25 kg, about as much as an African baboon.
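
As a rough back-of-the-envelope check on those figures (arithmetic added here, not taken from the article), scaling the 15 kg Brachyteles maximum by the quoted 60% gives

\[
15\,\mathrm{kg} \times 1.6 = 24\,\mathrm{kg},
\]

close to the "over 25 kg" estimate; mass reconstructions from skeletal remains need not scale linearly, so the two figures are compatible.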

In 2013, American anthropologists re-examined the fossil found in Toca da Boa Vista that had initially been assigned to Protopithecus and concluded it represented a new species, named Cartelles coimbrafilhoi in homage to Cástor Cartelle.

The research group led by Perez concluded that C. coimbrafilhoi is related to Alouatta and not, as Cartelle thought, to Brachyteles.

New scientific techniques have now supplied a novel interpretation of these two fossils, especially C. coimbrafilhoi.

The 3D model of C. coimbrafilhoi's cranium constructed by the researchers showed that its morphological adaptations were different from those of any other New World primate, extant or extinct.

"Computed tomography enabled us to produce a new analysis of C. coimbrafilhoi, with novel conclusions. We now believe this giant monkey from the Pleistocene found in Bahia was neither Alouatta nor Brachyteles. Its unique cranial characteristics are no longer seen in any other New World primate," Strauss said.

The Protopithecus described by Lund was not included in the study by Perez, Aristide, Strauss and colleagues, for lack of a cranium that could be scanned. The fossils found so far are highly fragmented.

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo

Boaty McBoatface mission gives new insight into warming ocean abyss

video: Footage of the start and end of Boaty McBoatface's mission.

Image: 
Povl Abrahamsen, British Antarctic Survey

The first mission involving the autonomous submarine Autosub Long Range (better known as "Boaty McBoatface") has shed light for the first time on a key process linking increasing Antarctic winds to rising sea temperatures. Data collected on the expedition, published today in the journal PNAS, will help climate scientists build more accurate predictions of the effects of climate change on rising sea levels.

The research, which took place in April 2017, studied the changing temperatures at the bottom of the Southern Ocean.

During the three-day mission, Boaty travelled 180 kilometres through mountainous underwater valleys, measuring the temperature, saltiness and turbulence of the water at the bottom of the ocean. Navigating by echo sounder, Boaty completed the perilous route, reaching depths of up to 4,000 metres, and re-united with the rest of the project team at the programmed rendezvous location, where the sub was recovered and the measurements collected along its route were downloaded.

In recent decades, winds blowing over the Southern Ocean have been getting stronger due to the hole in the ozone layer above Antarctica and increasing greenhouse gases. The data collected by Boaty, along with other ocean measurements collected from research vessel RRS James Clark Ross, have revealed a mechanism that enables these winds to increase turbulence deep in the Southern Ocean, causing warm water at mid depths to mix with cold, dense water in the abyss.

The resulting warming of the water on the sea bed is a significant contributor to rising sea levels. However, the mechanism uncovered by Boaty is not built into current models for predicting the impact of increasing global temperatures on our oceans.
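
The PNAS paper quantifies this with direct turbulence measurements, but the basic physics can be seen in the textbook parameterization of vertical turbulent heat flux (a standard formulation, not necessarily the one used in the study):

\[
Q = -\rho \, c_p \, \kappa \, \frac{\partial T}{\partial z}
\]

where ρ is seawater density, c_p its specific heat capacity, κ the turbulent (eddy) diffusivity and ∂T/∂z the vertical temperature gradient. Because temperature decreases toward the sea bed, the flux is directed downward; stronger winds energize the flow over rough valley topography, raising κ and thus the rate at which heat from warmer mid-depth water mixes into the cold abyssal layer.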

Boaty's mission was part of a joint project involving the University of Southampton, the National Oceanography Centre, the British Antarctic Survey, Woods Hole Oceanographic Institution and Princeton University.

Professor Alberto Naveira Garabato from the University of Southampton, who led the project, said: "Our study is an important step in understanding how the climate change happening in the remote and inhospitable Antarctic waters will impact the warming of the oceans as a whole and future sea level rise."

Dr. Eleanor Frajka-Williams of the National Oceanography Centre said: "The data from Boaty McBoatface gave us a completely new way of looking at the deep ocean--the path taken by Boaty created a spatial view of the turbulence near the seafloor."

Dr. Povl Abrahamsen of the British Antarctic Survey said: "This study is a great example of how exciting new technology such as the unmanned submarine 'Boaty McBoatface' can be used along with ship-based measurements and cutting-edge ocean models to discover and explain previously unknown processes affecting heat transport within the ocean."

Credit: 
University of Southampton

Study finds bleeding after minimally invasive PAD treatments can increase risk of death

Major bleeding occurs in about 4 percent of minimally invasive procedures to treat blockages in the arteries of the lower leg and leads to an increased risk of in-hospital death, according to a new study published in JACC: Cardiovascular Interventions. The study identified several risk factors that increase the chance of bleeding, which researchers said can help guide future efforts to reduce bleeding complications. It is the first published research using data from the NCDR PVI Registry, part of the American College of Cardiology's National Cardiovascular Data Registry.

"This is the first large-scale study to describe the frequency of bleeding in patients undergoing lower extremity peripheral vascular interventions (PVI)," said Adam C. Salisbury, MD, MSc, a cardiologist with St. Luke's Health System in Kansas City, Missouri, and the study's lead author. "Bleeding has been well studied in coronary artery procedures but not in vascular procedures involving the lower extremities."

Peripheral vascular intervention (PVI) is a minimally invasive outpatient procedure used to treat peripheral artery disease (PAD), in which plaque builds up in the arteries leading to the intestines, head, arms and, most commonly, the legs. PAD affects approximately 8 million Americans.

The goal of PVI is to restore blood flow to the lower extremities, relieving pain and numbness and avoiding the need for amputation. The procedure uses a balloon-tipped catheter and/or stents to open blockages from inside the vessel. Use of PVI to treat PAD has grown rapidly over the past two decades, but there are limited data on its safety outcomes in routine clinical practice, Salisbury said.

The researchers studied patients undergoing PVI at 76 hospitals in the NCDR PVI Registry from 2014 to 2016. Among 18,289 PVI procedures, major bleeding occurred in 744 (4.1 percent). The in-hospital death rate was higher in patients who experienced bleeding compared with those who did not (6.6 percent vs. 0.3 percent).
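
The headline rates can be reproduced directly from the published counts. A quick check in Python (the death counts are back-calculated from the reported percentages, so they are approximations rather than raw registry data):

```python
# Sanity check of the published registry numbers. The death counts are
# back-calculated from the reported percentages (6.6% and 0.3%), so they
# are approximations rather than raw registry data.
total = 18289
bleeds = 744
no_bleeds = total - bleeds

print(f"Major bleeding rate: {bleeds / total:.1%}")   # ~4.1%, as reported

deaths_bleed = round(0.066 * bleeds)        # ~49 in-hospital deaths
deaths_no_bleed = round(0.003 * no_bleeds)  # ~53 in-hospital deaths

risk_bleed = deaths_bleed / bleeds
risk_no_bleed = deaths_no_bleed / no_bleeds
print(f"Death risk with bleeding:    {risk_bleed:.1%}")
print(f"Death risk without bleeding: {risk_no_bleed:.1%}")
print(f"Unadjusted risk ratio: {risk_bleed / risk_no_bleed:.0f}x")  # ~22x
```

The unadjusted ratio works out to roughly a twenty-fold difference in in-hospital death risk, consistent with the comparison reported above.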

The study found patient characteristics associated with bleeding included age, female sex, heart failure and anemia. Patients with resting leg pain or ulcerations due to poor blood flow were also at higher risk. The researchers found that certain procedural strategies were associated with bleeding, such as placing the catheter in an artery other than the femoral artery. The use of thrombolytic ("clot-busting") therapy was also associated with an increased risk of bleeding.

"The findings suggest we can use different procedural strategies, such as using different access points for the catheter, alternative blood thinners or different sizes of equipment, to reduce the risk of bleeding," Salisbury said. "We can use the findings to identify factors and create models to predict who is at higher risk of bleeding. In these patients, we need to be especially careful to avoid doing anything that could increase the risk of bleeding."

In an editorial accompanying the study, Douglas E. Drachman, MD, of Massachusetts General Hospital and Beau M. Hawkins, MD, of the University of Oklahoma Health Sciences Center, wrote that the study "demonstrates that bleeding is a common complication of PVI and that bleeding confers significant clinical risk. For clinicians engaged in the care of patients with lower extremity PAD, this represents an opportunity to establish best practices and improve patient outcomes: it is time to stop the bleeding."

Credit: 
American College of Cardiology