Tech

Blood lipid profile predicts risk of type 2 diabetes better than obesity

Using lipidomics, a technique that measures the composition of blood lipids at a molecular level, together with machine learning, researchers at Lund University in Sweden have identified a blood lipid profile that improves the ability to assess, several years in advance, the risk of developing type 2 diabetes. The blood lipid profile can also be linked to a certain diet and degree of physical activity.

Blood contains hundreds of different lipid molecules that are divided into different classes, such as cholesterol and triglycerides.

"In healthcare, the total amounts of cholesterol and triglycerides are measured, not the exact composition of the classes. One class consists of several molecules and in our study we can see that it is good to have more of certain blood lipid molecules and less of others, and that these can be linked to lifestyle", says Céline Fernandez, associate professor of Integrative Molecular Medicine at Lund University, who carried out the study in cooperation with the company Lipotype and the National Bioinformatics Infrastructure Sweden (NBIS).

"Lipidomics links imbalances in lipid metabolism at a molecular level to physiological differences - helping us to predict type 2 diabetes", says Dr Christian Klose, head of the R&D department at Lipotype.

The study involved the analysis of 178 lipids in blood from 3,668 healthy participants in the Malmö Diet and Cancer study. The participants were divided at random into two equally large groups, A and B. In addition to blood samples, the study is based on self-reported data about physical activity and diet. In a follow-up just over 20 years later, around 250 participants in each group had developed type 2 diabetes.

Using machine learning, the researchers could draw up a blood lipid profile in group A based on the concentration of 77 lipid molecules linked to the risk of developing type 2 diabetes later in life. The blood lipid profile's capability to distinguish individuals who develop type 2 diabetes from those who remain healthy was examined and later confirmed in group B.

A risk analysis based on the known risk factors (age, gender, weight, blood sugar, smoking and blood pressure) was carried out to ascertain whether the blood lipid profile could improve the risk assessment. Adding the total amounts of cholesterol and triglycerides did not improve the risk assessment. However, when the researchers added the specific blood lipid profile instead, the risk prediction improved in a statistically significant way.
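As a rough illustration of this kind of workflow (a minimal sketch only, not the authors' actual pipeline), one could derive a lipid-based risk score in group A with a penalized regression model and then test, in group B, whether adding that score to the classical risk factors improves discrimination. The file names, column names and model choice below are assumptions made for the example.

```python
# Illustrative sketch only: derive a lipid-profile score in group A, then
# compare risk discrimination in group B with and without it. File names,
# column names and the model choice are assumptions, not the study's code.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

lipid_cols = [f"lipid_{i}" for i in range(1, 78)]                   # the 77 lipid species (hypothetical names)
clinical_cols = ["age", "sex", "bmi", "glucose", "smoking", "sbp"]  # classical risk factors, numerically encoded

df_a = pd.read_csv("group_a.csv")   # discovery half of the cohort (hypothetical file)
df_b = pd.read_csv("group_b.csv")   # validation half

# 1) Derive a lipid-profile risk score in group A.
lipid_model = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=1000))
lipid_model.fit(df_a[lipid_cols], df_a["t2d"])
df_a["lipid_score"] = lipid_model.predict_proba(df_a[lipid_cols])[:, 1]
df_b["lipid_score"] = lipid_model.predict_proba(df_b[lipid_cols])[:, 1]

# 2) Compare discrimination in group B: classical risk factors alone
#    versus classical risk factors plus the lipid-profile score.
base = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
base.fit(df_a[clinical_cols], df_a["t2d"])
auc_base = roc_auc_score(df_b["t2d"], base.predict_proba(df_b[clinical_cols])[:, 1])

full_cols = clinical_cols + ["lipid_score"]
full = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
full.fit(df_a[full_cols], df_a["t2d"])
auc_full = roc_auc_score(df_b["t2d"], full.predict_proba(df_b[full_cols])[:, 1])

print(f"AUC, risk factors only:            {auc_base:.3f}")
print(f"AUC, risk factors + lipid profile: {auc_full:.3f}")
```

Keeping the derivation confined to group A and the evaluation confined to group B mirrors the discovery/validation split described above, so any improvement in the second AUC cannot come from the model having already seen the validation samples.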

Furthermore, the blood lipid profile in itself could better predict the risk of developing type 2 diabetes than obesity, which is considered to be the most important risk factor for type 2 diabetes.

"This means that we could produce a better assessment of who had a high risk of developing type 2 diabetes", says Celine Fernandez.

Nikolay Oskolkov, bioinformatics specialist at NBIS adds: "This shows how machine learning can be used to improve clinical diagnosis".

The findings also revealed that the blood lipid profile can be linked to lifestyle. The more the participants exercised in their free time, the less of the harmful blood lipid profile they had. According to the study, coffee was also linked to a reduced amount of the harmful blood lipid profile. Dairy products, sugar-sweetened drinks and processed meat, on the other hand, were linked to larger amounts of the harmful blood lipid profile.

This indicates that it could be possible to modify your blood lipid profile and thereby your risk of developing type 2 diabetes through lifestyle changes, comments Céline Fernandez. However, further research is needed to confirm this.

One strength of the study is that the researchers could repeat the group A results in group B and thereby validate and confirm the results in other individuals. This was possible because the Malmö Diet and Cancer study has many participants and the blood lipid analysis is so robust.

"One weakness, of course, is that all individuals in the cohort have a similar backgroundorigin. We don't know what the results would have been in another cohort, from another part of the world, for example", concludes Céline Fernandez.

Credit: 
Lund University

Gastric cancer susceptibility marker discovered

image: The nuclei are shown in blue (Hoechst stain), with USF1 in green and p53 in red.

Image: 
© Lionel Costa

Gastric cancer, the third most common cause of cancer-related deaths, is often associated with a poor prognosis because it tends to be diagnosed at an advanced stage and is therefore difficult to treat. To reduce the death rate, it is essential to identify a biomarker enabling early diagnosis of this cancer. In pursuit of this goal, scientists from the Institut Pasteur, CNRS, and University of Rennes 1 working in partnership with the IMSS in Mexico and the University of Florence in Italy, analyzed the mechanisms involved in the development of gastric cancer during infection by the bacterial pathogen Helicobacter pylori. As a result, they were able to identify a potential susceptibility marker. Their findings were published in the journal Gut on December 10, 2019.

Helicobacter pylori is a bacterial pathogen that colonizes the stomachs of almost half the world's population. H. pylori infection is acquired in childhood and lasts for decades. While remaining asymptomatic in most individuals, in some cases the infection develops into gastric cancer. H. pylori is currently thought to be responsible for approximately 90% of gastric cancer cases throughout the world, with an estimated death rate of approximately 800,000 per year.

The first steps have been taken toward deciphering the sequence of events triggered by bacterial infection that ultimately leads to gastric cancer, a mechanism in which the DNA instability of infected cells plays a key role. Previous studies have demonstrated that H. pylori causes DNA breaks and impairs DNA repair systems, thereby promoting the accumulation of mutations that can target p53, a protein known as the "guardian of the genome".

The p53 protein is essential for proper cell function: if significant damage occurs within the genome, it temporarily halts the cell cycle long enough for the DNA to be repaired. If p53 is inactivated, the genome is more likely to accumulate instabilities, and normal cells have a greater chance of transforming into cancer cells. Understanding the cellular transformation caused by H. pylori, which stimulates cancer development, is therefore important for identifying a susceptibility marker. Such a marker would enable early treatment of patients, thus preventing the development of gastric cancer.

In vitro, in vivo, and clinical testing

The group led by Marie-Dominique Galibert, a scientist at the Rennes Institute of Genetics and Development (University of Rennes 1/CNRS), previously conducted studies revealing that, in response to DNA damage, p53 is stabilized by its interaction with the transcription factor USF1, thus enabling p53 to play its role in DNA repair. In the new study, in vitro results obtained in cell lines demonstrate that H. pylori not only reduces nuclear levels of USF1 but also relocates the factor, leading to its accumulation at the periphery of cells and preventing the formation of USF1/p53 complexes in the cell nucleus. If destabilized, p53 loses its function, resulting in the accumulation of oncogenic changes in infected cells and promoting their transformation into cancer cells. The loss of USF1 from the nucleus is therefore a key factor in inhibiting p53 activity, which stimulates the development of gastric cancer.

These findings have been confirmed by in vivo studies. Scientists demonstrated that gastric inflammatory lesions caused by H. pylori infection were more severe in a mouse model deficient in factor USF1. These results are also supported by clinical findings, since the prognosis is worse for gastric cancer patients with low levels of USF1 combined with low levels of p53.

These data provide a new conceptual basis for improving patient management. Indeed, variations in USF1 levels in gastric tumor tissue could be an indicator of poor prognosis in cases of gastric cancer, thus enabling subsets of patients at higher risk or with more severe forms of cancer to be identified.

Eliette Touati, joint last author of the article and scientist within the Helicobacter Pathogenesis Unit (Institut Pasteur/CNRS), concludes: "For the first time, we have demonstrated that the loss of transcription factor USF1 accelerates carcinogenesis caused by Helicobacter pylori. This makes USF1 a potential biomarker for gastric cancer susceptibility and a new therapeutic target in the treatment of this cancer."

Credit: 
Institut Pasteur

Zebrafish 'avatars' can help decide who should receive radiotherapy treatment

image: Human rectal-tumor cells (pink) are tested for radio-sensitivity using a zebrafish avatar.

Image: 
Bruna Costa & Rita Fior. Champalimaud Centre for the Unknown.

Radiotherapy can effectively reduce or even eliminate some tumours; others, however, show enduring resistance. Considering the potentially harmful side effects of radiotherapy, clinicians agree that it is paramount to be able to determine if a patient will benefit from radiotherapy before exposing them to any of the associated risks.

Despite significant efforts to develop biomarkers that can assess the potential efficacy of radiotherapy treatment for individual patients, there is currently no established diagnostic test that can provide a clear answer.

To address this urgent need, a multidisciplinary team working at the Champalimaud Centre for the Unknown in Lisbon, Portugal, developed a novel assay for quick radio-sensitivity diagnosis. Depending on the success of upcoming clinical trials, this assay may become a standard personalized medicine tool within a few years. Their results were published today (December 17th) in EBioMedicine, an open-access biomedical journal published by The Lancet.

The four-day assay

The team is already known for having established a successful chemotherapy-sensitivity assay, based on transplanting tumor cells into zebrafish and using these avatars for treatment testing. They now report an 85% success rate in predicting how tumors will respond to specific drugs.

Moving into radiotherapy response testing has presented the team with a new set of challenges, which they met by combining the expertise of clinicians, physicists and biologists. The group decided to establish the assay focusing on colorectal cancer, which is the third most common cancer worldwide. "Radiotherapy sensitivity is particularly important for rectal cancer,'' says Rita Fior, one of the leading authors of this study. "In the majority of cases with locally advanced disease, the standard approach is first to administer chemo-radiotherapy and then perform surgery to remove the tumor."

According to Nuno Figueiredo, Head of the Champalimaud Surgical Center, the new assay has the potential to have a tremendously positive impact on patients' lives. "Some tumours may be highly radio-sensitive, leading to a reduction of tumour size or even elimination of the tumour altogether. This allows for a more conservative 'watch and wait' approach in which radiotherapy may effectively postpone or even prevent invasive surgery. On the other hand, if the tumour is radio-resistant, the optimal solution, which is surgery, can be implemented without further delay."

To establish the radiotherapy assay, the team first transplanted tumor cells from human colorectal cancer cell-lines into zebrafish, thereby creating the zebrafish "avatars". Specifically, they focused on two types of cells - radio-sensitive and radio-resistant. Radiation was delivered to the avatars using the same equipment (linear accelerator) applied for the treatment of cancer patients.

The results were promising: "We were able to decisively say which tumor cells had responded to the treatment after just four days. The full set of analyses took an additional eight days. This was very exciting, as it is a very fast timescale that will work well for clinical application", points out Susana Ferreira, a biologist who co-authored the study.

From the lab to the clinic

After establishing the radio-sensitivity assay, the team conducted a "proof of concept" experiment, where they tested it in a clinical setting. They ran the assay on cells derived from biopsies of two patients who had recently been diagnosed with rectal cancer and were about to undergo treatment. The results were robust: "the response of the zebrafish avatars perfectly mirrored the response of the patients," says Bruna Costa, another co-author of the study.

Oriol Parés, a radiation oncologist who participated in the project, stresses the importance of establishing a radio-sensitivity assay. "There is a global investment towards finding biomarkers that could predict the response of patients to chemo and radiotherapy. This preliminary clinical trial yielded promising results, supporting the prospect of using this assay as a cost-efficient test with timely results. It also corresponds with a modern approach in medicine, which we apply here at the Champalimaud Clinical Centre, to personalize treatment with the goal of optimizing results for each individual patient."

"This is a great example of what can be achieved in cancer clinical research", adds Miguel Godinho-Ferreira, a researcher involved in the study. "At the Champalimaud Centre for the Unknown, not only do we gather patients, clinicians and scientists under the same roof, we also have the financial support to bring the most recent advances of research to cancer patients. I cannot wait to have our assay helping people suffering from this disease."

The team is now working on expanding their data set by using the assay to test radio-sensitivity in more patients. In addition to "in-house" patients of the Champalimaud Clinical Centre, they are hoping to establish a multi-center clinical trial. Fior points out that the way they designed the assay facilitates external collaboration. They can process samples that have been cryogenically preserved, which is necessary for transporting samples between facilities. "We hope that the clinical testing will prove successful and that in some years this assay could be used as a standard tool for designing personalized treatment strategies for cancer patients." she concludes.

Credit: 
Champalimaud Centre for the Unknown

Donkeys are natural heat lovers and prefer Bethlehem to Britain

We might associate donkeys with Christmas, but new research from the University of Portsmouth shows the animals are keener on hotter periods of the year.

Donkeys, it seems, love sun and warmth. That's the finding of the first study to examine the conditions under which healthy (non-working) donkeys and mules seek shelter in hot, dry climates.

It found that whilst mules would seek shelter from the heat and insects, donkeys enjoyed the sunshine and warmth for longer.

The research by equine behaviour expert Dr Leanne Proops, at the University of Portsmouth's Department of Psychology, is published in the Journal of Applied Animal Behaviour Science.

Dr Proops said: "We found that donkeys are less likely to seek shelter from the heat and light than mules. The sensitivity of mules to higher temperatures and sunlight may be due to the geographically different evolution of horses and donkeys and their adaptations to different climates. Donkeys are better adapted to arid, hot climates and hence higher sunlight levels."

"In contrast, horses are more adapted to cold conditions, and our previous research has shown that donkeys seek shelter far more often than horses in cold, wet conditions. As a hybrid, mules often display attributes that are a mixture of both species, such as their winter hair coat growth. Therefore, it might be expected that mules are less adapted to conditions of high temperatures and sunlight levels than donkeys, as we found in this study."

It is known that the effect of heat in the environment becomes physically challenging for animals once the ambient temperature surpasses their thermal neutral zone (TNZ). The TNZ is different for every species. An important method of controlling heat stress from solar radiation is for an animal to seek shade.

A total of 130 donkeys and mules were studied at two locations in southern Spain over a seven-week period during the summer. In both locations, researchers recorded the animals' need for shade.

All the animals in the study were healthy, had free access to shelter and were regularly monitored by vets from The Donkey Sanctuary. Temperatures during the study period ranged from 14 to 37°C, and data was collected between 8am and 4:15pm. For each location, outside temperature, wind speed, light levels, rainfall, insect density and harassment levels were recorded.

Emily Haddy, PhD student on the project, said: "It has been very interesting to see the results from this study. Despite what equid owners may think, it is clear that different equid species have specific needs and so should be given free access to shelter - there is no 'one size fits all'."

Dr Faith Burden, Director of Research and Operational Support at The Donkey Sanctuary and co-author on the paper, points out the importance of these findings, "The majority of working equids worldwide are exposed to hot climates and as a consequence may suffer from issues such as dehydration and heat stress. By establishing the natural shelter seeking behaviour of healthy donkeys and mules across climates we hope to be able to inform welfare guidelines and encourage good management of these animals."

Anyone with domestic donkeys this Christmas is urged to keep them warm and dry.

Credit: 
University of Portsmouth

Plant-eating insects disrupt ecosystems and contribute to climate change

A new study from Lund University in Sweden shows that plant-eating insects affect forest ecosystems considerably more than previously thought. Among other things, the insects are a factor in the leaching of nutrients from soil and increased emissions of carbon dioxide. The researchers also conclude that temperatures may rise as a result of an increase in the number of plant-eating insects in some regions.

Using extensive meta-analysis, a research team at Lund University has for the first time examined how plant-eating insects affect soil processes in forest ecosystems globally. The study, which is published in the Journal of Ecology, examines biological and biogeochemical reactions in the soil. When damaged plants, carcasses and secretion substances from insects fall to the ground, the turnover of carbon and nutrients increases. This leads to leaching from the forest floor and the release of more carbon dioxide.

"The number of plant-eating insects may increase due to climate change, especially in cold areas where a lot of carbon is sequestered in the ground. This will affect the forest ecosystems and lead to an increased release of greenhouse gases and a potential rise in temperature", says Dan Metcalfe, physical geography researcher at Lund University.

In the new study, researchers have established that insects and large mammals affect soil processes in a similar way, even though they have very different population patterns and feeding habits.

"Insects are more specialised in terms of food sources and can also increase their population by 50 to 100 times from one season to another. This means that plant-eating insects can sometimes disrupt forest ecosystems much more than plant-eating mammals", says Dan Metcalfe.

Tropical and northern forests account for 80 per cent of the world's total forested land area, but are very underrepresented in research literature. The researchers hope that the new results will be of practical use by being incorporated in climate models.

"Understanding how ecosystems work is crucial for being able to predict and combat climate change. Mammals are decreasing, whereas there is a lot to indicate that the number of insects will increase in some regions in a warmer world", concludes Dan Metcalfe.

Credit: 
Lund University

In ancient Scottish tree rings, a cautionary tale on climate, politics and survival

image: Winching a long-dead tree from Loch Gamnha, in the Cairngorm Mountains. Scientists have found trees as old as 8,000 years preserved in cold, oxygen-poor lakebed mud.

Image: 
Courtesy Tree Ring Lab, University of St. Andrews.

Using old tree rings and archival documents, historians and climate scientists have detailed an extreme cold period in Scotland in the 1690s that caused immense suffering. It decimated agriculture, killed as much as 15 percent of the population and sparked a fatal attempt to establish a Scottish colony in southern Panama. The researchers say the episode--shown in their study to have been during the coldest decade of the past 750 years--was probably caused by faraway volcanic eruptions. But it was not just bad weather that brought disaster. Among other things, Scotland was politically isolated from England, its bigger, more prosperous neighbor that might have otherwise helped. Propelled in part by the catastrophe, the two nations merged in 1707 to become part of what is now the United Kingdom. Such a famine-related tragedy was never repeated, despite later climate swings.

With Brexit now threatening to isolate the UK from the European Union, the researchers think politicians should take this as a cautionary tale. "By joining England, Scotland became more resilient," said lead author Rosanne D'Arrigo, a tree-ring scientist at Columbia University's Lamont-Doherty Earth Observatory. "The bigger message for today is arguably that as the climate changes, nations will be stronger if they stick together and not try to go it alone." The study appears in the early online edition of the Journal of Volcanology and Geothermal Research.

The "Scottish Ills" have long been noted in history books. In some years, snow from the winter persisted on the ground well into summer, and frosts struck every summer night for weeks. The planting season was cut short, and crops were struck down before they could be harvested. Livestock had nothing to eat. The study quotes Mary Caithness, Countess of Breadalbane, describing "cold misty weather such as the oldest person alive hath not seen." Other regions including France, England and the Netherlands also suffered unusually cold weather, but generally with less drastic results. In Scandinavia, however, tens of thousands died. It was "likely the worst era of crop failure, food shortage and mortality ever documented in Scottish history," the researchers write.

Based on the width and density of tree rings the researchers collected, they showed that 1695-1704 was Scotland's coldest decade in 750 years. This came on top of the fact that much of the northern hemisphere was already in the grip of the so-called Little Ice Age, when cold temperatures were the norm for centuries, until the 1800s. "Before this, we knew it was cold. Now we have an understanding of exactly how cold," said coauthor Rob Wilson of Scotland's University of St. Andrews, and an adjunct researcher at Lamont-Doherty. "The whole 17th century must have been a horrible time to live in Scotland, but this was the worst part."
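For readers curious how a "coldest decade" is pulled out of an annual reconstruction, the operation is essentially a ten-year running mean over the temperature series; the short sketch below illustrates the idea with a hypothetical file and column names, not the authors' data or code.

```python
# Minimal sketch: locate the coldest 10-year window in an annual
# temperature reconstruction. File and column names are hypothetical.
import pandas as pd

recon = pd.read_csv("scotland_temperature_reconstruction.csv")  # columns: year, temp_anomaly
recon = recon.sort_values("year").set_index("year")

decadal = recon["temp_anomaly"].rolling(window=10).mean()  # mean of each 10-year window

coldest_end = decadal.idxmin()   # final year of the coldest window
print(f"Coldest decade: {coldest_end - 9}-{coldest_end}, "
      f"mean anomaly {decadal.min():.2f} degrees C")
```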

The researchers say that the Ills coincided closely with multiple large volcanic eruptions. Previous researchers have identified particles in ice cores that traveled long distances from eruptions that probably took place somewhere in the tropics in 1693 and 1695. And Iceland's Mount Hekla darkened the skies for seven months in 1693. Scientists already know that large-scale volcanism throws sulfate particles into the atmosphere; these deflect sunlight and can lower temperatures far from the eruption itself for years. Thus, the researchers believe the eruptions would explain the chilly weather that hit Scotland and other northern hemisphere nations all at the same time. (Unsurprisingly, the tree rings also show that the warmest century of the record was 1911-2010, almost certainly due to human greenhouse-gas emissions.)

The findings are an outgrowth of the Scottish Pine Project, in which Wilson and his colleagues have been gathering tree-ring samples for the past 10 years in northern Scotland. In the desolate Highlands region of Cairngorms, they have drilled out cores from living trees going back to the 1400s. To extend the record back further, they have snorkeled along the nearshore bottoms of icy lochs, searching for long-dead trees that have fallen in and been preserved over the centuries in cold, oxygen-poor mud. Once they find specimens, they winch them out by hand and take out cross sections with chain saws. The team has also studied buildings whose timbers went back to the 1100s, though these were not included in the climate reconstruction. Their initial chronology was published in 2017. In the course of their work, the team has found trees in the lochs as old as 8,000 years. They are still collecting samples and working to construct a continuous climate record predating the Middle Ages.

In the new study, the researchers say that climate was not the only factor in the Scottish Ills. "The connection seems simple--volcanic cooling triggered famine--but the drivers toward famine are far more complex," they write. They cite Scotland's economic circumstances and political isolation from England as major factors. England had more good farmland and, at the time, better agricultural technology and organization for delivering relief to the poor. While also hit with cool weather, England did not suffer a famine, and probably would have come to the aid of Scotland had the nations been united. Scotland also unwisely encouraged the export of crops at a time when they were needed at home.

At the height of the Ills, the Scots developed an intricate venture to send colonists to the Darien region of Panama. Driven in part by the desperation of the famine, the idea caught on as a national mania, and people of all social and economic classes invested much of their assets--in all, as much as half the nation's entire liquid capital. Starting in 1698, a total of 2,500 colonists began sailing to this malarial jungle coast. They were quickly cut down by disease, malnutrition (Scotland could ill afford to resupply the colony) and conflicts with Spanish forces, which already controlled much of South and Central America. The colony was abandoned after just 16 months; only a few hundred colonists survived; and Scotland was financially ruined. The inhospitable Darien region remains barely inhabited even today.

"At the time, the Scots saw the colony as a kind of Exodus, where they would start over somewhere new," said D'Arrigo. "In the end, they couldn't escape."

Repeated proposals to unite England and Scotland had come up during the 1600s, but the Scots had resisted. As the famine came to a close, they finally gave in; apparently, many of the gentry making the decision figured that hitching themselves to a greater power would buffer them from further misfortunes. The Acts of Union, passed by the parliaments of Scotland and England, took effect in 1707. Scotland suffered other climate extremes in succeeding centuries, but never again collapsed in this way.

In 2014, more than 300 years after the union, the Scots took a referendum on whether to once again become an independent state; 55 percent voted to stay with the UK. Then came the 2016 UK-wide referendum that set Brexit in motion--deeply unpopular in Scotland, where 62 percent voted to remain in the EU. In last week's UK parliamentary elections, pro-Brexit forces won overall, but lost resoundingly in Scotland. Many Scots now seem to be reconsidering independence--not because they want to stand alone again, but because independence might allow them to rejoin the larger community of the EU, and leave the isolationist English to fend for themselves. Calls for another independence referendum are already circulating.

"Scotland became more resilient when it became part of a union," said Wilson. "It's a cautionary tale from history."

Credit: 
Columbia Climate School

Mass General team detects Alzheimer's early using electronic health records

BOSTON - A team of scientists from Massachusetts General Hospital (MGH) has developed a software-based method of scanning electronic health records (EHRs) to estimate the risk that a healthy person will receive a dementia diagnosis in the future. Their algorithm uses machine learning to first build a list of key clinical terms associated with cognitive symptoms, identified by clinical experts. Next, it uses natural language processing (NLP) to comb through EHRs looking for those terms. Finally, it uses those results to estimate patients' risk of developing dementia.
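In outline (a hypothetical sketch, not the MGH team's actual software), the approach amounts to counting expert-curated cognitive-symptom terms in each patient's notes and feeding the resulting symptom-burden score into a risk model; the term list, file names and model choice below are illustrative assumptions.

```python
# Hypothetical sketch of the general approach: count expert-curated
# cognitive-symptom terms in clinical notes, then use the resulting
# symptom-burden score in a simple risk model. Terms, file names and the
# model choice are assumptions for illustration, not the MGH tool.
import re
import pandas as pd
from sklearn.linear_model import LogisticRegression

# 1) Expert-curated term list (illustrative examples).
cognitive_terms = ["memory loss", "confusion", "disorientation",
                   "word-finding difficulty", "forgetful"]
pattern = re.compile("|".join(re.escape(t) for t in cognitive_terms), re.IGNORECASE)

def symptom_burden(note_text: str) -> int:
    """Number of cognitive-symptom term mentions in one clinical note."""
    return len(pattern.findall(note_text))

# 2) Score each patient's notes and fit a simple risk model.
notes = pd.read_csv("clinical_notes.csv")        # columns: patient_id, text
outcomes = pd.read_csv("dementia_outcomes.csv")  # columns: patient_id, dementia_dx_8yr

scores = (notes.groupby("patient_id")["text"]
               .apply(lambda texts: sum(symptom_burden(t) for t in texts))
               .rename("symptom_burden")
               .reset_index())
data = outcomes.merge(scores, on="patient_id", how="left").fillna({"symptom_burden": 0})

model = LogisticRegression(max_iter=1000)
model.fit(data[["symptom_burden"]], data["dementia_dx_8yr"])
data["predicted_risk"] = model.predict_proba(data[["symptom_burden"]])[:, 1]
```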

"The most exciting thing is that we are able to predict risk of new dementia diagnosis up to eight years in advance ," says Thomas McCoy, Jr., MD, first author of the paper. The team included members of MGH's Center for Quantitative Health, the Harvard T.H. Chan School of Public Health, and the Harvard Brain Tissue Resource Center. Their paper was published this week in Alzheimer's & Dementia. The study included data on 267,855 patients admitted to one of two hospital systems. It found that 2.4% of patients developed dementia over the 8 years of follow up.

Early diagnosis of dementia could be one of the most important steps toward improving care and finding truly effective treatments for it. Alzheimer's affects more than 5.5 million Americans at present, and as the population ages that number is expected to balloon. Current early detection tools require additional, potentially costly, data collection. The tool developed at MGH is based entirely on software, to make better use of data already generated during routine clinical care. This software-based approach to early risk detection has the potential to accelerate research efforts aimed at slowing progression or reversing early disease.

McCoy notes that "This method was originally developed as a general 'cognitive symptom' assessment tool. But we were able to apply it to answer particular questions about dementia." In other words, a general cognitive symptom detector proved useful for the task of dementia risk stratification. He explained, "This study contributes to a growing body of work on the usefulness of calculating broad symptom burden scores across neuropsychiatric conditions."

Earlier studies by McCoy and his colleagues used these sorts of tools to predict risk of suicide and accidental death, as well as the likelihood of admission and length of stay among children with psychiatric symptoms in emergency rooms. This latest study suggests that properly tailored, the tool can be applied to more specific questions about other brain diseases as well.

"We need to detect dementia as early as possible to have the best opportunity to bend the curve," says Roy Perlis, MD, senior author of the study and director of the MGH Center for Quantitative Health. "With this approach we are using clinical data that is already in the health record, that doesn't require anything but a willingness to make use of the data."

The researchers are hopeful this tool can be used to accelerate research.

"This approach could be duplicated around the world, giving us more data and more evidence for trials looking at potential treatments," says Rudolph Tanzi, PhD, a member of the research team, vice-chair of Neurology, and Co-Director of the MGH McCance Center for Brain Health at the MGH Institute for Neurodegenerative Diseases.

Credit: 
Massachusetts General Hospital

Moffitt researchers develop more efficient approach to create mouse models

TAMPA, Fla. - Genetically engineered mouse models are often used by scientists to study how the addition, deletion or mutation of genes affects the development of disease and effects of drugs. The process of creating these genetically modified mice is extremely time consuming and expensive, which limits the ability of scientists to use their models to perform important research. Moffitt Cancer Center researchers have developed a new platform for creating genetically engineered mice to study melanoma that is significantly faster than a normal mouse model approach. Their work was published in Cancer Research.

Mouse models have enabled numerous advances in our understanding of cancer and potential treatment approaches. Scientists can add or delete genes (alleles) from particular types of cells in mice, such as melanocytes in the skin, with tissue-specific DNA-targeting approaches. Additionally, scientists often make multiple changes in different genes at the same time to determine how multiple genetic alterations affect outcomes, or they may turn genes on or off during predetermined periods of time during the mouse's life to assess how time-dependent alterations impact disease and drug treatments.

Despite the knowledge gained from mouse models, "creating mouse alleles and breeding multi-allelic melanoma-prone experimental mice is expensive, slow, and cumbersome, rendering conventional mouse modeling an inefficient method to study gene functions in vivo," explained Florian Karreth, Ph.D., assistant member of the Department of Molecular Oncology at Moffitt.

Karreth and his team wanted to develop a more efficient method of creating mice with multiple genetic modifications in melanocytes. They began by isolating embryonic stem cells from previously created genetically modified mice that had alterations in genes known to contribute to melanoma (BRAF, NRAS, PTEN and CDKN2A). Next, the researchers modified genes of these embryonic stem cells even further in laboratory cultures and injected them into embryos from the original mouse strain. These embryos were then implanted into female mice and eventually were born as chimeric mice with multiple genetic alterations.

The researchers determined that the chimeric mice were able to develop melanomas to a similar extent as mice created through the normal approach. They used their new platform to show how modulation of PTEN gene expression could affect the development and progression of melanoma, and also created melanoma cell lines from the chimera mice that could be used in laboratory experiments.

Karreth hopes that the new platform will benefit the scientific community as a whole, and his team has made both the embryonic stem cell lines and the melanoma cell lines available to the entire melanoma research community. "Given that it takes less than 2.5 months from embryonic stem cell targeting to inducing melanomagenesis in experimental chimeras, we anticipate that our platform has the potential to dramatically accelerate melanoma studies in mice," he said.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute

In breakthrough method of creating solar material, NREL scientists prove the impossible really isn't

Scientists at the National Renewable Energy Laboratory (NREL) achieved a technological breakthrough for solar cells previously thought impossible.

The scientists successfully integrated an aluminum source into their hydride vapor phase epitaxy (HVPE) reactor, then demonstrated the growth of the semiconductors aluminum indium phosphide (AlInP) and aluminum gallium indium phosphide (AlGaInP) for the first time by this technique.

"There's a decent body of literature that suggests that people would never be able to grow these compounds with hydride vapor phase epitaxy," said Kevin Schulte, a scientist in NREL's Materials Applications & Performance Center and lead author of a new paper highlighting the research. "That's one of the reasons a lot of the III-V industry has gone with metalorganic vapor phase epitaxy (MOVPE), which is the dominant III-V growth technique. This innovation changes things."

The article, "Growth of AlGaAs, AlInP, and AlGaInP by Hydride Vapor Phase Epitaxy," appears in the journal ACS Applied Energy Materials.

III-V solar cells--so named because of the position the materials fall on the periodic table--are commonly used in space applications. Notable for high efficiency, these types of cells are too expensive for terrestrial use, but researchers are developing techniques to reduce those costs.

One method pioneered at NREL relies on a new growth technique called dynamic hydride vapor phase epitaxy, or D-HVPE. Traditional HVPE, which for decades was considered the best technique for production of light-emitting diodes and photodetectors for the telecommunications industry, fell out of favor in the 1980s with the emergence of MOVPE. Both processes involve depositing chemical vapors onto a substrate, but the advantage belonged to MOVPE because of its ability to form abrupt heterointerfaces between two different semiconductor materials, a place where HVPE traditionally struggled.

That's changed with the advent of D-HVPE.

image: Sample aluminum III-V solar cells, grown using HVPE, shown as Alx(Ga1-x)0.5In0.5P thin films after removing the GaAs substrate bonded to a glass handle for transmission measurements. The difference in color is due to the difference in the composition of Al and Ga: the yellow samples are AlInP (no Ga) and the orange samples are AlGaInP. Photo by Dennis Schroeder, NREL

The earlier version of HVPE used a single chamber where one chemical was deposited on a substrate, which was then removed. The growth chemistry was then swapped for another, and the substrate returned to the chamber for the next chemical application. D-HVPE relies on a multi-chamber reactor. The substrate moves back and forth between chambers, greatly reducing the time to make a solar cell. A single-junction solar cell that takes an hour or two to make using MOVPE can potentially be produced in under a minute by D-HVPE. Despite these advances, MOVPE still held another advantage: the ability to deposit wide band gap aluminum-containing materials that enable the highest solar cell efficiencies. HVPE has long struggled with the growth of these materials due to difficulties with the chemical nature of the usual aluminum-containing precursor, aluminum monochloride.

The researchers always planned on introducing aluminum into D-HVPE, but first focused their efforts on validating the growth technique.

"We've tried to move the technology forward in steps instead of trying to do it all at once," Schulte said. "We validated that we can grow high-quality materials. We validated that we can grow more complex devices. The next step now for the technology to move forward is aluminum."

Schulte's co-authors from NREL are Wondwosen Metaferia, John Simon, David Guiling, and Aaron J. Ptak. They also include three scientists from a North Carolina company, Kyma Technologies. The company developed a method to produce a unique aluminum-containing molecule, which could then be flowed into the D-HVPE chamber.

The scientists used an aluminum trichloride generator, which was heated to 400 degrees Celsius to generate aluminum trichloride from solid aluminum and hydrogen chloride gas. Aluminum trichloride is much more stable in the HVPE reactor environment than the monochloride form. The other components--gallium chloride and indium chloride--were vaporized at 800 degrees Celsius. The three elements were combined and deposited on a substrate at 650 degrees Celsius.

Using D-HVPE, NREL scientists previously were able to make solar cells from gallium arsenide (GaAs) and gallium indium phosphide (GaInP). In these cells, the GaInP is used as the "window layer," which passivates the front surface and permits sunlight to reach the GaAs absorber layer below where the photons are converted to electricity. This layer must be as transparent as possible, but GaInP is not as transparent as the aluminum indium phosphide (AlInP) used in MOVPE-grown solar cells. The current world efficiency record for MOVPE-grown GaAs solar cells that incorporate AlInP window layers is 29.1%. With only GaInP, the maximum efficiency for HVPE-grown solar cells is estimated to be only 27%.

Now that aluminum has been added to the mix of D-HVPE, the scientists said they should be able to reach parity with solar cells made via MOVPE.

"The HVPE process is a cheaper process," said Ptak, a senior scientist in NREL's National Center for Photovoltaics. "Now we've shown a pathway to the same efficiency that's the same as the other guys, but with a cheaper technique. Before, we were somewhat less efficient but cheaper. Now there's the possibility of being exactly as efficient and cheaper."

Credit: 
DOE/National Renewable Energy Laboratory

Applying physics principle yields grim prediction on hurricane destruction in an era of global warming

BROOKLYN, New York, Tuesday, December 17, 2019 – Global warming could lead to hurricanes even more powerful than meteorologists currently forecast. That warning came from a physicist researching the behavior of tropical cyclones who noticed that one of the principles of physics — phase transition — did not appear in the scientific literature of meteorology.

Edward Wolf, professor emeritus at the NYU Tandon School of Engineering, examined the most robust data sets on tropical hurricanes — compiled by noted atmospheric scientist Kerry Emanuel in 2006 on Atlantic storms dating as far back as the 1930s off the coast of Africa. In a paper published recently in the journal Theoretical and Applied Climatology, Wolf demonstrated that the destructive power of these tropical hurricanes increased linearly and rapidly as water temperature increased — in contrast to most meteorological calculations, which lead to more optimistic outcomes.

“This approach indicates the destructive power of Atlantic hurricanes off Africa could reach three times their current level if water temperatures rise by 2 degrees Celsius — well within the range that scientists predict is likely by the year 2100,” Wolf said. “The same calculations would apply to any tropical basin on Earth, and I am working with Dr. Emanuel now to explore this new concept in the hope that it will advance scientists’ predictive ability.”

The journal paper showed how Wolf's calculations aligned with what has become accepted science: Hurricanes require a surface water temperature above 26.5 degrees Celsius (79.7 degrees Fahrenheit). And every point on Emanuel's graph of power dissipation index values versus ocean temperature substantiated Wolf's initial suspicion that phase transitions, such as the transition from water to vapor, indicate just how much kinetic energy is released as the water that was turned to vapor by a hurricane then cools and falls back to Earth as liquid.
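The scale of the energy involved can be illustrated with the standard latent heat of vaporization of water; the back-of-the-envelope figure below is included for orientation only and is not taken from Wolf's paper.

```latex
% Illustrative order-of-magnitude estimate (not a result from the paper):
% energy released when a mass m of water vapor condenses back to liquid.
\[
E = m\,L_v, \qquad L_v \approx 2.26 \times 10^{6}\ \mathrm{J\,kg^{-1}}
\]
```

Each kilogram of vapor that condenses thus releases roughly two megajoules of latent heat, part of which ends up as the kinetic energy of the storm's winds.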

Credit: 
NYU Tandon School of Engineering

NREL, Co-Optima research yields potential bioblendstock for diesel fuel

The NREL scientists, along with colleagues at Yale University, Argonne National Laboratory, and Oak Ridge National Laboratory, are part of the Department of Energy's Co-Optimization of Fuels & Engines (Co-Optima) initiative. Co-Optima's research focuses on improving fuel economy and vehicle performance while also reducing emissions.

"If you look at biomass, 30% of it is oxygen," said Derek Vardon, a senior research engineer at NREL and corresponding author of a new paper detailing the Co-Optima research project. "If we can figure out clever ways to keep it around and tailor how it's incorporated in the fuel, you can get a lot more out of biomass and improve the performance of diesel fuel." The molecule, 4-butoxyheptane, contains oxygen while conventional petroleum-derived diesel fuel is comprised of hydrocarbons. The presence of oxygen significantly reduces the intrinsic sooting tendency of the fuel upon burning.

The paper, "Performance-Advantaged Ether Diesel Bioblendstock Production by a priori Design," appears in the journal Proceedings of the National Academy of Sciences. Vardon's co-authors from NREL are Nabila Huq as the first author, with co-authors Xiangchen Huo, Glenn Hafenstine, Stephen Tifft, Jim Stunkel, Earl Christensen, Gina Fioroni, Lisa Fouts, Robert McCormick, Matthew Wiatrowski, Mary Biddy, Teresa Alleman, Peter St. John, and Seonah Kim.

Researchers used corn stover-derived molecules as the starting point for an array of potential fuel candidates. From here, they relied on predictive models to determine which molecules would be best to blend with and improve traditional diesel. The molecules were prescreened based on attributes with implications spanning health and safety to performance.
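To give a sense of what such prescreening looks like in practice, here is a minimal, hypothetical filter over candidate molecules; the property names, values and cutoffs are illustrative placeholders, not the Co-Optima team's actual criteria or data.

```python
# Hypothetical prescreening of fuel-candidate molecules against
# illustrative diesel-blendstock thresholds. Property values and cutoffs
# are placeholders, not the Co-Optima team's actual criteria or data.
candidates = [
    {"name": "4-butoxyheptane", "cetane_number": 75, "flash_point_c": 65,
     "melting_point_c": -80, "water_solubility_g_per_l": 0.4},
    {"name": "candidate_B", "cetane_number": 35, "flash_point_c": 40,
     "melting_point_c": 5, "water_solubility_g_per_l": 12.0},
]

def passes_screen(mol: dict) -> bool:
    """Keep molecules meeting illustrative health, safety and performance cutoffs."""
    return (mol["cetane_number"] >= 40                   # ignition quality
            and mol["flash_point_c"] >= 52               # handling safety
            and mol["melting_point_c"] <= -10            # cold-weather operability
            and mol["water_solubility_g_per_l"] <= 2.0)  # groundwater/health proxy

shortlist = [m["name"] for m in candidates if passes_screen(m)]
print(shortlist)  # with these placeholder numbers -> ['4-butoxyheptane']
```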

"With the goal of developing drop-in biofuels that work with our existing infrastructure," Vardon said, "there are a lot of rules and regulations out there that a fuel has to meet. That eliminates a lot of promising molecules because they may be great in certain properties but fail in others. As we're doing this process, it started to become clear which molecules could be successful fuels."

The intention is to blend the 4-butoxyheptane molecule into diesel fuel at a mixture of 20%-30%. Initial results suggest the potential to improve ignition quality, reduce sooting, and improve fuel economy of the base diesel at these blend levels.

Further research is needed, Huq said, including testing the bioblendstock in an actual engine and producing the fuel in an integrated process directly from biomass.

"That first step was just seeing what could rise to the top as far as fuel properties go," she said. "Then it was asking, can we make any of these? The molecule that looked most promising was 4-butoxyheptane, and we were able to successfully produce and characterize it." The molecule didn't exactly match the predicted fuel properties but came close enough to meet the desired performance improvements.

An economic and life-cycle analysis revealed the oxygenate fuel could be cost-competitive with petroleum diesel and result in significant greenhouse gas reductions if the process also yields a high-value co-product such as adipic acid, which is used in the manufacture of nylon.

Credit: 
DOE/National Renewable Energy Laboratory

First US study shows strong results for procedure to treat knee pain from OA

image: Dr. Ari Isaacson.

Image: 
UNC School of Medicine

CHAPEL HILL, NC - A new study published in the Journal of Vascular and Interventional Radiology is the first of its kind in the U.S. to examine the use of genicular artery embolization (GAE) for extended treatment of knee pain caused by osteoarthritis (OA). Principal investigator of the study, Ari Isaacson, MD, clinical associate professor of vascular and interventional radiology in the UNC School of Medicine, says the results are positive.

"In this study we showed that GAE can be performed safely and that it demonstrates potential efficacy," Isaacson said.

An estimated 30 million Americans have knee pain as a result of OA. Current treatments for the pain include steroid injections, pain medications and physical therapy. Injections sometimes do not work, and when they do, tend to last only several months, at most. Pain medications are needed daily and can lead to dependence on opioids and toxicity from nonsteroidal anti-inflammatory drugs (NSAIDs, such as ibuprofen). Physical therapy can be taxing and painful. Total knee replacements are an option as well, but because they have a limited lifetime of 15-20 years, they are not performed until a patient reaches an optimal age.

GAE is a minimally invasive procedure that blocks blood flow to certain parts of the knee that can be the source of OA-related pain. More than 80 percent of patients with chronic OA also have chronic inflammation, which leads to synovial angiogenesis - the formation of new arteries in the portion of the knee called the synovium. Blood flow to the new arteries can irritate nerves in the synovium, causing pain. Therefore, blocking blood flow to these arteries relieves that irritation and reduces pain. Past studies from researchers in Japan and Korea have shown the procedure has the potential to provide pain relief for up to a year or more.

During the procedure, interventional radiologists block blood flow by inserting a spaghetti-sized catheter into the arteries through a very small incision. The catheter is then directed through the arteries to the knee using x-ray and iodinated contrast. Once the appropriate knee arteries are identified, spherical particles are injected to create a blockage. Because the catheter is used to select only the tiny abnormal arteries, the majority of the blood flow to the knee is preserved. Patients were under moderate sedation during the short procedure and were discharged the same day. No physical therapy was required and patients felt relief from their knee pain within three days.

The two-site trial, funded by Boston Scientific, included 20 patients - 9 men and 11 women - aged 49 to 84 with moderate to severe knee pain from OA. After the procedure all patients had follow-up visits at one, three, and six months. The decrease in pain reported by participants was significant, so much so that the improvement scores surpassed researchers' benchmark goal for proving efficacy of the procedure. Nearly all patients still had an improvement in pain after one month. Around 80 percent of patients still had less pain after six months. In addition to less reported knee pain, 65 percent of participants reported a decrease in use of daily pain relief medication.

"We've seen that most patients experience a rapid decrease in pain and disability after GAE for OA-related knee pain," Isaacson said. "This procedure has promise to be a replacement for injections and the daily use of pain medications, but more research is needed."

Isaacson and fellow researchers are in the process of conducting a randomized control trial, funded by a grant from Medtronic, comparing GAE to a sham procedure in order to determine how much of the improvement observed in these patients may be attributed to placebo. The procedure is currently only available in clinical trials.

Credit: 
University of North Carolina Health Care

And then there was light

image: Research from the laboratory of Richard D. Vierstra, the George and Charmaine Mallinckrodt Professor of Biology in Arts & Sciences at Washington University in St. Louis, provides new insights on the photoconversion mechanism of phytochromes.

Image: 
Whitney Curtis, for Washington University in St. Louis

Light provides the energy that plants and other photosynthetic organisms need to grow, which ultimately yields the metabolites that feed all other organisms on the planet. Plants also rely on light cues for developing their photosynthetic machinery and to sync their life cycles around daily and seasonal rhythms.

For example, photoreceptor pathways in plants allow them to determine how deep a seed is in the soil, to "measure" the waning daylight hours and to alter a plant's development to prepare it for the onset of summer or the beginnings of winter.

New research from Washington University in St. Louis provides insight into how proteins called phytochromes sense light and contribute to how plants grow. The paper is published this week in the Proceedings of the National Academy of Sciences.

"Phytochromes are unique among photoreceptors because they exist in two stable yet interconvertible states: an inactive form that is synthesized in the dark and another that requires light for activation," said Richard D. Vierstra, the George and Charmaine Mallinckrodt Professor of Biology in Arts & Sciences.

"By measuring the proportions of these two forms as they flip back and forth, phytochromes can sense light intensity, duration, light color and even day length. How these dark and light forms differ has remained enigmatic despite 60 years of research on photoreceptors."

Vierstra and his collaborators overcame a major hurdle toward defining the sequence of events that support the transition between light- and dark-adapted states.

They discovered and characterized a crystal form of the photoreceptor PixJ from the cyanobacterium Thermosynechococcus elongatus -- one that allows reversible photoconversion between the active and inactive forms. Remarkably, the crystals retain their integrity during the photoconversion process. Sethe Burgie, research scientist in biology in Arts & Sciences and first author of the paper, was able to collect the high resolution X-ray diffraction data necessary for identifying intermediates of the reaction pathway, using a sophisticated technique called X-ray crystallography.

Researchers should now be able to use newly developed X-ray free-electron lasers to acquire structural snapshots of this phytochrome crystal as it initially absorbs light through its inactive photoreceptor to when it acquires its fully mature active state -- a process that is complete within a millisecond.

In a preliminary test, the Vierstra group was able to see the first twitch of the photoreceptor as the part of its chromophore that captures the light energy rotated upon photoactivation.

"In other words, it should now be possible to make an atomic-resolution molecular movie that outlines the structural transitions of the photoreceptor," Burgie said. "We are now at the cusp of defining the internal events and sequence of physical changes that happen within phytochromes as they move between biologically inactive and active states, which will ultimately help researchers to tinker with plants to improve their agricultural yield and sustainability."

Understanding the structural underpinnings of the photoconversion cycle is an important step toward developing modified phytochromes that endow crop plants with beneficial light-sensing properties.

"Additionally, as phytochromes sense both light and temperature, altering phytochrome function has great potential for tailoring crops better fit to specific environments and might help to expand the range of these crops," Vierstra said.

Credit: 
Washington University in St. Louis

A new playbook for interference

image: This is a schematic of an interference experiment in which two photons are produced in different buildings, are generated by different sources and have different colors.

Image: 
S. Kelley/NIST

Particles can sometimes act like waves, and photons (particles of light) are no exception. Just as waves create an interference pattern, like ripples on a pond, so do photons. Physicists from the National Institute of Standards and Technology (NIST) and their colleagues have achieved a major new feat -- creating a bizarre "quantum" interference between two photons of markedly different colors, originating from different buildings on the University of Maryland campus.

The experiment is an important step for future quantum communications and quantum computing, which could potentially do things that classical computers can't, such as break powerful encryption codes and simulate the behavior of complex new drugs in the body. The interference between two photons could connect distant quantum processors, enabling an internet-like quantum computer network.

Using photons that originally had different colors (wavelengths) is important because it mimics the way a quantum computer would operate. For instance, visible-light photons can interact with trapped atoms, ions or other systems that serve as quantum versions of computer memory while longer-wavelength (near-infrared) photons are able to propagate over long distances through optical fibers.

Just as classical computers needed reliable ways to transmit, store and process electrons before complex, networked computing was possible, the NIST result brings the exchange of quantum computing information an important step closer to reality.

In their study, a collaboration between NIST and the Army Research Laboratory, physicists and engineers in adjacent buildings at the University of Maryland created two different and separate sources of individual photons. In one building, a group of rubidium atoms was prompted to emit single photons with a wavelength of 780 nanometers, at the red end of the spectrum of visible light. In the other building, 150 meters away, a trapped ion of barium was induced to emit photons with a wavelength of 493 nanometers -- nearly 40 percent shorter -- toward the blue end of the spectrum.

Then the researchers had to make the blue photons dead ringers for the red ones. To do this, Alexander Craddock, Trey Porto and Steven Rolston of the Joint Quantum Institute, a partnership between NIST and the University of Maryland, and their colleagues mixed the blue photons with infrared light in a special crystal. The crystal used the infrared light to convert the blue photons to a wavelength matching the red ones in the other building while otherwise preserving their original properties. Only then did the team send the photons through a 150-meter optical fiber to meet up with the nearly identical red photons in the other building.
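Energy conservation dictates the infrared wavelength needed for this color conversion; the arithmetic below is an illustrative check assuming simple difference-frequency mixing, not a figure quoted from the paper.

```latex
% Photon-energy conservation for difference-frequency conversion
% (illustrative; assumes the crystal performs simple difference-frequency mixing).
\[
\frac{1}{\lambda_{\text{pump}}}
  = \frac{1}{\lambda_{\text{blue}}} - \frac{1}{\lambda_{\text{red}}}
  = \frac{1}{493\ \mathrm{nm}} - \frac{1}{780\ \mathrm{nm}}
  \;\approx\; \frac{1}{1340\ \mathrm{nm}}
\]
```

So an infrared pump near 1.34 micrometers can shift the 493-nanometer photons to 780 nanometers while otherwise leaving their quantum properties intact.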

The photons were so similar that it was not possible to tell them apart in the experimental setup. Individual photons ordinarily act independently of one another. But due to the peculiar quantum nature of light, when two indistinguishable photons interfere with each other, their paths can become correlated, or dependent upon one another. Such quantum correlation can be used as a powerful tool for computing.

Sure enough, the researchers observed this correlation when pairs of the separately produced photons intersected. The pairs of photons passed through an optical component known as a beamsplitter, which could send them in one of two paths. Acting alone, each photon would do its own thing and would have a 50-50 chance of going through either path. But the two indistinguishable photons overlapped like waves. Because of their bizarre quantum interference, they stayed together and always went on the same path. By joining these once-independent photons at the hip, this interference effect can potentially perform many useful tasks in the processing of quantum information.
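This "always on the same path" behavior is the Hong-Ou-Mandel effect, and its experimental signature is a drop in coincidences (one photon detected in each output) as the photons become indistinguishable. The sketch below is a minimal textbook model, not the team's analysis code: it reduces the photons' similarity to a single overlap parameter and computes the coincidence probability at an ideal 50:50 beamsplitter.

    def coincidence_probability(overlap):
        """Probability that two single photons leave an ideal 50:50
        beamsplitter through different output ports, given the modulus of
        their wave-packet overlap (1 = indistinguishable, 0 = fully
        distinguishable). Perfectly matched photons never coincide."""
        return 0.5 * (1.0 - abs(overlap) ** 2)

    for overlap in (0.0, 0.5, 1.0):
        print(f"overlap = {overlap:.1f} -> P(coincidence) = {coincidence_probability(overlap):.2f}")

Fully distinguishable photons split half the time; perfectly overlapping photons bunch and always exit together, which is the correlation the researchers observed.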

The researchers reported their findings online in a recent issue of Physical Review Letters.

A direct connection to quantum computing would come if the interference pattern is linked to another bizarre property of quantum mechanics known as entanglement. This phenomenon occurs when two or more photons or other particles are prepared in such a way that a measurement of a particular property -- for instance, momentum -- of one automatically determines the same property of the other, even if the particles are far apart. Entanglement lies at the heart of many quantum information schemes, including quantum computing and encryption.

In the team's experiment, the two photons were not entangled with the systems that generated them. But in future studies, said Porto, it should be relatively easy to entangle the red photons with the group of rubidium atoms that produced them. Similarly, the blue photons could be entangled with the trapped ion that produced them. When the two photons interfere, those connections would be transferred, turning the red photon-rubidium and blue photon-ion entanglement into entanglement between the rubidium atoms and the trapped ion.
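In textbook terms this is entanglement swapping. The schematic below uses an illustrative qubit encoding (the specific states are assumptions, not taken from the study): atom A is entangled with photon p, ion B with photon q, and a joint measurement on the two interfering photons leaves A and B entangled even though they never met.

\[
|\Psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_A|0\rangle_p + |{\downarrow}\rangle_A|1\rangle_p\bigr) \otimes \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_B|0\rangle_q + |{\downarrow}\rangle_B|1\rangle_q\bigr)
\]

Projecting the photon pair onto the Bell state \(|\Psi^-\rangle_{pq} = (|0\rangle_p|1\rangle_q - |1\rangle_p|0\rangle_q)/\sqrt{2}\) leaves the matter qubits in

\[
\tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}\rangle_A|{\downarrow}\rangle_B - |{\downarrow}\rangle_A|{\uparrow}\rangle_B\bigr),
\]

an entangled state shared by the rubidium atoms and the trapped ion.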

It's this transfer of entanglement -- this transfer of information -- that underlies the potentially vast power of quantum computers, Porto noted.

Credit: 
National Institute of Standards and Technology (NIST)

Novel genetic signature that can predict some kinds of breast cancer is identified

image: The image shows blood vessels from the retina of a mouse with retinopathy, normal and pathological vessels stained green and red respectively.

Image: 
Ricardo Giordano

Researchers have identified a genetic signature with prognostic value for certain kinds of breast cancer. The discovery also contributes to a better understanding of the molecular mechanisms of pathological angiogenesis, the aberrant proliferation of blood vessels that occurs during cancer and other diseases.

The research, published in the journal PLOS Genetics, combined a study of the genes involved in retinopathy, as a model of angiogenesis, with analysis of transcriptomic gene expression profiles from public breast cancer databases.

Conducted by researchers at the University of São Paulo's Chemistry Institute (IQ-USP), in collaboration with the Ontario Institute for Cancer Research (OICR) in Toronto, Canada, the study was supported by the São Paulo Research Foundation (FAPESP).

"We identified a set of genes whose expression in breast cancer correlates with the degree of pathological angiogenesis in the tumor, so that it serves as a genetic signature of angiogenesis that is prognostic and more robust than the signatures identified previously, given the correlation found between angiogenesis and tumors generally," said Ricardo Giordano, a professor at IQ-USP, head of its Vascular Biology Laboratory and a co-author of the study.

In the study, the Brazilian researchers identified 153 genes whose expression differed between healthy and diseased retinas in mice. From this list they identified 149 equivalent human genes. The result served as the basis for a genetic signature study in partnership with the Canadian team, using a database with information on breast cancer patients. The conclusion was that a set of 11 key genes involved in pathological angiogenesis performed best in terms of prognostic value.
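As a rough illustration of how the prognostic value of such a signature can be tested (a hypothetical sketch, not the pipeline used in the study; the lifelines library, the placeholder gene names and the column names are all assumptions), one can score each patient by the average expression of the signature genes, split the cohort at the median score, and compare the survival of the two groups:

    import pandas as pd
    from lifelines.statistics import logrank_test

    # Assumed inputs: 'expr' is a patients-by-genes expression table and
    # 'clinical' holds follow-up time plus an event flag (1 = relapse/death observed).
    signature_genes = ["GENE_A", "GENE_B", "GENE_C"]  # placeholders for the signature genes

    def signature_prognostic_pvalue(expr: pd.DataFrame, clinical: pd.DataFrame) -> float:
        # Score each patient by the mean z-scored expression of the signature genes.
        z = (expr[signature_genes] - expr[signature_genes].mean()) / expr[signature_genes].std()
        score = z.mean(axis=1)
        high = score >= score.median()

        # Log-rank test: do high-score and low-score patients have different outcomes?
        result = logrank_test(
            clinical.loc[high, "time"], clinical.loc[~high, "time"],
            event_observed_A=clinical.loc[high, "event"],
            event_observed_B=clinical.loc[~high, "event"],
        )
        return result.p_value

A small p-value would indicate that the signature score separates patients with different outcomes, which is the sense in which a gene-expression signature is said to be prognostic.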

Pathological angiogenesis is common to breast cancer and retinopathy. "The fact that these two diseases share this process and that angiogenesis is fundamental to the development of cancer in general led us to try to build a bridge between retinopathy and breast cancer," said João Carlos Setubal, head of USP's Bioinformatics Laboratory and also a co-author of the article.

According to the researchers, the study focused on breast cancer because of the large amount of data available on the disease. "We had to have access to a vast quantity of public data, given the considerable variation between one patient and another. This is the case for breast cancer. Genetic profiles are available for some 2,000 patients," Setubal said.

Bioinformatics was crucial to finding the genetic signature, he added. The data generated in the laboratory was submitted to sophisticated computational processing in partnership with researchers at OICR in Canada. Another co-author of the study, Rodrigo Guarischi de Sousa, then a PhD student, wrote the program that tested the 149 human genes as possible components of the signature for breast cancer. In this part of the study he was supported by a scholarship from FAPESP for a research internship abroad supervised by Paul C. Boutros at OICR.

Boutros is currently a researcher in the Human Genetics Department at the University of California, Los Angeles (UCLA). Guarischi de Sousa now works at DASA, Brazil's largest medical diagnostics company.

The researchers plan to find applications for this signature, especially in treatment for breast cancer. "Our next goal is to continue studying angiogenesis in cancer," Giordano said. "We're interested in identifying genes on this list that can be targets for the development of new drugs or new applications of existing drugs."

From retinopathy to breast cancer

The discovery of the prognostic gene signature for breast cancer is the result of a long study supported by a research scholarship from the National Council for Scientific and Technological Development (CNPq) starting in 2010, when Giordano began studying the transcriptome (RNA expression) and proteome (protein expression) of pathological retinal angiogenesis.

"As a result of this study, our lab implemented a mouse model to research retinopathy. The murine model is important as it's difficult to study blood vessel formation outside living tissue. This model enables us to induce retinopathy by modulating oxygen levels and study angiogenesis in the lab," Giordano said.

The researchers took blood samples from mice to investigate the differences between physiological angiogenesis, which occurs in healthy individuals (in wound healing, ovulation and placental growth, for example), and pathological angiogenesis, which is part of disease (e.g. cancer or arthritis).

"We observed which genes were expressed by endothelial cells [the inner lining of blood vessels] in both kinds of angiogenesis, always looking for the genes that were more expressed in one kind than the other," Giordano said.

A key element of the experiment was oxygen variation in the chambers containing newborn mouse pups. In chambers with oxygen at 75%, the mice became retinopathic, whereas in ambient air (oxygen at 20%) the retina developed normally.

The relationship between oxygen and cells has been in the news lately. William Kaelin, Peter Ratcliffe and Gregg Semenza won the 2019 Nobel Prize in Physiology or Medicine for discovering how cells sense and adapt to changing oxygen availability.

Gene expression-based prognostic signature

It is important to stress that clinical applications of the genetic signature for angiogenesis may differ from applications deriving from the marker gene BRCA1. The BRCA1 gene mutation became world-famous in 2013 when US actor Angelina Jolie underwent a preventive double mastectomy after genetic testing showed that she carried the mutation and hence ran an 87% risk of developing breast cancer.

"BRCA1 is a genomic gene," Setubal said. "A woman with mutations in this gene faces a higher risk of developing breast cancer but won't necessarily do so. The presence of mutations in this gene serves to help predict the appearance of the disease. The signature we describe in our study proved promising to predict the development of breast cancer after it actually appears."

Credit: 
Fundação de Amparo à Pesquisa do Estado de São Paulo