Treating DCIS with surgery and radiotherapy lowers cancer risk but benefits drop over time

A major study of women with ductal carcinoma in situ (DCIS) - a breast condition that can become invasive cancer - has shown that surgery to remove the tissue followed by radiotherapy offers better protection compared to surgery alone.

The study, presented at the 12th European Breast Cancer Conference, followed patients for up to 27 years. Although it shows that the benefit of radiotherapy and surgery over surgery alone persists, it also suggests that this benefit reduces over time.

Researchers say these new findings clarify the long-term risks for women with DCIS and may help women and their doctors to decide which treatment is right for them.

DCIS is a condition where cells lining the milk ducts have started to turn into cancer cells but have not spread into other parts of the breast. DCIS is often picked up by breast screening and affects tens of thousands of women in Europe each year. Only a proportion of DCIS cases will progress into invasive cancer and little is known about which cases will progress, so the treatments available to patients are very similar to treatments for invasive breast cancer.

The research was presented by Dr Maartje van Seijen from the Netherlands Cancer Institute (Amsterdam, The Netherlands). She said: "Most women who are diagnosed with DCIS are offered surgery to remove the abnormal breast tissue and they are often also offered radiotherapy, even though the majority would not go on to develop invasive breast cancer. We wanted to look at how this group of women get on in the long term, according to which treatment they received."

The study included 10,045 women diagnosed with DCIS in The Netherlands between 1989 and 2004. Researchers gathered data on whether the women were treated with breast-sparing surgery to remove the DCIS, or breast-sparing surgery followed by radiotherapy, or mastectomy (removing the whole breast).

They collected information on whether the women were subsequently diagnosed with DCIS in the same breast again or with an invasive breast cancer in the same breast.

In the first ten years after diagnosis, women who had breast-sparing surgery but not radiotherapy had a risk of 13.0% (130 out of 1000 women) of being diagnosed with DCIS again and their risk of invasive breast cancer was 13.9% (139 out of 1000). Women treated with breast-sparing surgery and radiotherapy had a risk of 4.6% (46 out of 1000 women) of DCIS in the first ten years and 5.2% (52 out of 1000 women) of invasive breast cancer.

But although women who had radiotherapy had lower risks in the first ten years, in the following years (ten or more years after diagnosis) their risks were closer to those for women who had surgery alone. More than ten years after diagnosis, women who had breast-sparing surgery but not radiotherapy had a risk of 1.2% (12 out of 1000 women) of being diagnosed with DCIS again and their risk of invasive breast cancer was 11.8% (118 out of 1000). In women treated with breast-sparing surgery and radiotherapy these figures were 2.8% (28 out of 1000 women) for DCIS and 13.2% (132 out of 1000 women) for invasive breast cancer.
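Restating those figures in absolute terms makes the narrowing gap easier to see. The short sketch below simply tabulates the reported per-1000 risks and the difference between the treatment groups; it adds no new data.

```python
# Per-1000 risks copied from the study as reported above; the only
# computation is the subtraction between treatment groups.
risks = {  # group: (DCIS again, invasive cancer), per 1000 women
    "surgery alone, years 0-10":              (130, 139),
    "surgery + radiotherapy, years 0-10":     (46, 52),
    "surgery alone, after 10 years":          (12, 118),
    "surgery + radiotherapy, after 10 years": (28, 132),
}
for group, (dcis, invasive) in risks.items():
    print(f"{group:40s} DCIS {dcis:3d}/1000, invasive {invasive:3d}/1000")

# Radiotherapy benefit for invasive cancer, years 0-10: 139 - 52 = 87
# fewer cases per 1000; after 10 years the gap reverses (118 vs 132).
```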

Dr van Seijen said: "The risk of DCIS or invasive cancer recurring in these women will diminish over time, whether they had just the breast-sparing surgery or breast-sparing surgery with radiotherapy. This study shows that, overall, the addition of radiotherapy gives women the best chances.

"However, there remains a chance of a new DCIS or invasive cancer developing that is not related to the initial diagnosis and we would expect this risk to be similar between the two types of treatment. In a very small number of women, radiotherapy itself might cause a new breast cancer, often many years after the radiotherapy was given."

The study also showed that women who had mastectomy to treat their DCIS had the lowest risks of invasive cancer. Dr van Seijen added: "Although patients who have a mastectomy have the lowest risk of recurrence, it's important to remember that, according to previous research, overall survival in patients who have a mastectomy is the same as in patients who have less aggressive treatments. For the majority of women with DCIS, whose condition will never become invasive, mastectomy would be considered over-treatment."

Professor Emiel Rutgers is President of the European Breast Cancer Council, a member of the 12th European Breast Cancer Conference scientific committee, and was not involved in the research. He said: "DCIS is a condition that affects thousands of women and a proportion of them go on to develop invasive breast cancer. Most of these women will have decades of life ahead of them so it's vital that we understand the long-term impact of the treatments we offer.

"We still need to know much more about DCIS and, in particular, which women will go on to develop invasive cancer and which will not. In the meantime, studies like this one provide patients and their doctors with more information about the benefits and costs of the different treatments available to them.

"Previous research shows that the risk of dying of cancer is only 1-2% in the 20 years following a DCIS diagnosis. So, it's important to remember that whether treated with breast conserving surgery alone or surgery with radiotherapy, the risk of dying from breast cancer in women who had DCIS remains very low."

Credit: 
European Organisation for Research and Treatment of Cancer

New clues about the link between stress and depression

image: Vasco Sousa, Per Svenningsson and Ioannis Mantas, researchers at the Department of Clinical Neuroscience, Karolinska Institutet, Sweden.

Image: 
Ulf Sirborn

Researchers at Karolinska Institutet in Sweden have identified a protein in the brain that is important both for the function of the mood-regulating substance serotonin and for the release of stress hormones, at least in mice. The findings, which are published in the journal Molecular Psychiatry, may have implications for the development of new drugs for depression and anxiety.

After experiencing trauma or severe stress, some people develop an abnormal stress response or chronic stress. This increases the risk of developing other diseases such as depression and anxiety, but it remains unknown what mechanisms are behind it or how the stress response is regulated.

The research group at Karolinska Institutet has previously shown that a protein called p11 plays an important role in the function of serotonin, a neurotransmitter in the brain that regulates mood. Depressed patients and suicide victims have lower levels of the p11 protein in their brain, and laboratory mice with reduced p11 levels show depression- and anxiety-like behaviour. The p11 levels in mice can also be raised by some antidepressants.

The new study shows that p11 affects the initial release of the stress hormone cortisol in mice by modulating the activity of specific neurons in the hypothalamus. Through a completely different signalling pathway originating in the brainstem, p11 also affects the release of two other stress hormones, adrenaline and noradrenaline. In addition, the tests showed that mice with p11 deficiency react more strongly to stress, with a higher heart rate and more signs of anxiety, than mice with normal p11 levels.

"We know that an abnormal stress response can precipitate or worsen a depression and cause anxiety disorder and cardiovascular disease," says first author Vasco Sousa, researcher at the Department of Clinical Neuroscience, Karolinska Institutet. "Therefore, it is important to find out whether the link between p11 deficiency and stress response that we see in mice can also be seen in patients."

The researchers believe that the findings may have implications for the development of new, more effective drugs. There is a great need for new treatments because current antidepressants are not effective enough in many patients.

"One promising approach involves administration of agents that enhance localised p11 expression, and several experiments are already being conducted in animal models of depression," says Per Svenningsson, professor at the Department of Clinical Neuroscience, Karolinska Institutet, who led the study. "Another interesting approach which needs further investigation involves developing drugs that block the initiation of the stress hormone response in the brain."

Credit: 
Karolinska Institutet

Caesarean birth, prolonged labour influence infant gut bacteria, risk of childhood obesity

Events at birth may affect the microbes living in a baby's gut during the first few months of life, leading to a higher risk of childhood obesity and allergies, according to a new study published in the journal Gastroenterology.

The researchers used data from the CHILD Cohort Study (CHILD) to look at the complex relationships between birth events, a baby's gut microbiome at three and 12 months of age, and health outcomes at ages one and three.

They linked factors such as caesarean section delivery and prolonged labour to changes in the gut microbes of infants. They then determined the pathways by which these alterations may lead to an increased risk of allergies and obesity later in childhood.

Senior author Anita Kozyrskyj, a CHILD investigator and professor in the Faculty of Medicine & Dentistry at the University of Alberta, said the findings highlight the importance of identifying multiple and common pathways of the gut microbiome during infancy.

"Much of what happens to us later in life is related to the exposures we encounter in infancy and early childhood," she said. "Understanding how disruptions to the gut microbiome affect health in later childhood means we may have several options for effective interventions to prevent these chronic conditions before they become established."

The study showed that infants born by caesarean section were more likely to have a high body-mass index score at ages one and three. When the researchers examined the children's microbiome profiles at three months of age, they found that an altered ratio of two types of bacteria--Enterobacteriaceae and Bacteroidaceae--was the dominant path to overweight.

At 12 months of age, a higher Enterobacteriaceae/Bacteroidaceae (E/B) ratio and colonization with Clostridioides difficile (C. difficile) were the main pathways leading to allergic sensitization.

"While caesarean birth was an initiating event for triggering over 100 gut microbial pathways leading to overweight and allergic sensitization, we found a higher E/B abundance ratio was the dominant compositional change," explained Kozyrskyj.

Infants born after prolonged labour associated with a first pregnancy were also found to be at higher risk for these health outcomes. The researchers found the E/B abundance ratio at three months was the most important microbiota mediator to overweight, and the E/B ratio at 12 months was the most important mediator to allergic sensitization. The abundance of Bifidobacterium, which was reduced with prolonged labour, also played a role in overweight development.
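To make the "mediator" language concrete, here is a minimal sketch of the product-of-coefficients logic that underlies mediation analysis (exposure -> mediator -> outcome). The data are simulated, and this single-mediator toy model is far simpler than the study's actual analysis of over 100 pathways.

```python
# Minimal mediation sketch: caesarean birth -> E/B ratio -> BMI score.
# All coefficients and data are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
cs = rng.integers(0, 2, n).astype(float)          # exposure: caesarean (0/1)
eb = 0.8 * cs + rng.normal(0, 1, n)               # mediator: E/B ratio ("a" path)
bmi = 0.5 * eb + 0.1 * cs + rng.normal(0, 1, n)   # outcome ("b" and "c'" paths)

a = np.polyfit(cs, eb, 1)[0]                      # a path: mediator ~ exposure
design = np.column_stack([np.ones(n), eb, cs])    # outcome ~ mediator + exposure
_, b, c_prime = np.linalg.lstsq(design, bmi, rcond=None)[0]

print(f"indirect (mediated) effect a*b ~= {a * b:.2f}")   # expect ~0.4
print(f"direct effect c' ~= {c_prime:.2f}")               # expect ~0.1
```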

To conduct the study, Kozyrskyj's team collected stool samples from the diapers of 1,667 infants who are part of CHILD, a national birth cohort study following nearly 3,500 Canadian children from before birth to adolescence with the goal of discovering the root causes of allergies, asthma, obesity and other chronic diseases. They then analyzed the samples for gut microbes and their metabolites.

At one and three years of age, the children underwent skin prick tests to check for allergic sensitization to 10 common allergens.

The study's first author and former post-doctoral fellow, Khanh Vu, now an analyst in the U of A's Quality Management in Clinical Research unit, said he believes the central role of the infant gut microbiota involves the production of small molecules or metabolites. "Our study identified key interactions between Bifidobacterium and the metabolite, formate," he commented.

The research also highlighted the critical influence of C. difficile in all microbiota interactions, said Kozyrskyj.

"The takeaway from our study is that exposures at birth can trigger multiple and common gut microbial pathways leading to child overweight and allergic sensitization," she noted.

"We may want to take steps to avoid unnecessary caesarean section deliveries, and possibly consider postnatal microbiota solutions that may help to prevent these two conditions."

Credit: 
University of Alberta Faculty of Medicine & Dentistry

Expert opinion: COVID-19 vaccine rollout unlikely before fall 2021

Experts working in the field of vaccine development tend to believe that an effective vaccine is not likely to be available for the general public before the fall of 2021. In a paper published this week in the Journal of General Internal Medicine, a McGill-led team reports the results of a recent survey of 28 experts working in vaccinology.

The survey was carried out in late June 2020. Most of those surveyed were Canadian or American academics with an average of 25 years of experience working in the field.

"Experts in our survey offered forecasts on vaccine development that were generally less optimistic than the timeline of early 2021 offered by US public officials. In general they seem to believe that a publicly available vaccine next summer is the best-case scenario with the possibility that it may take until 2022," said Jonathan Kimmelman, a James McGill professor and the director of the Biomedical Ethics Unit at McGill University and the senior author on the paper.

Many experts also believe that there may be some false starts before an effective vaccine is available. "The experts we surveyed believe that there is a 1 in 3 chance that the vaccine will receive a safety warning label after approval, and a 4 in 10 chance that the first large field study will not report efficacy," added Patrick Kane, the lead author, who is a decision scientist and postdoctoral fellow at McGill University.

Predicting timelines for vaccine development

Experts were asked to make timeline forecasts for three milestones in vaccine development.
More specifically, experts were asked for their best, soonest, and latest estimates for when each of the following milestones would occur:

1. When will a vaccine be available to the general public in the USA and/or Canada?

Best guess: September/October 2021 (average)
Soonest: June 2021 (average)
Latest: July 2022 (average)

2. When will a field study with at least 5000 participants report results?

Best guess: March 2021 (average)
Soonest: December 2020 (average)
Latest: July 2021 (average)

3. When will a vaccine be available to those at highest risk from the virus in the USA and/or Canada?

Best guess: March/April 2021 (average)
Soonest: February 2021 (average)
Latest: December 2021 (average)

The researchers believe that this kind of approach, where people are asked to suggest a range of responses, provides a more complete picture of the range of expert belief than media quotes from individuals.

Likelihood of setbacks

The study also showed that about a third of those surveyed believe that vaccine development may face the following setbacks:

1. that the first vaccine widely deployed in the USA and/or Canada will receive a boxed warning from the FDA to highlight serious or life-threatening adverse reactions; or

2. that the first large field trial in the USA and/or Canada will report a null or negative result in terms of efficacy.

"Our study finds that experts are largely in agreement about the timeline for a SARS-CoV-2 vaccine," says Stephen Broomell, an associate professor at the Dietrich College of Humanities and Social Sciences, at Carnegie Mellon University. "While this does not track with many overly optimistic government projections, it reflects a belief that researchers are indeed on a faster pace to development compared to previous vaccines."

Credit: 
McGill University

Study reveals element in blood is part of human--and hibernating squirrel--stress response

image: Hibernating arctic ground squirrel.

Image: 
Photo courtesy of Lesa Hollen / University of Alaska Fairbanks

A new study published in the journal Critical Care Explorations shows for the first time that part of the stress response in people and animals involves increasing the levels of a naturally circulating element in blood.

The discovery demonstrates a biological mechanism that rapidly responds to severe physiologic stress and potentially serves to protect us from further damage due to life-threatening conditions.

Analyzing hundreds of blood samples, the researchers found that iodide, a form of the element iodine, increased by 17 times in trauma patients within two hours of experiencing severe blunt force trauma and by 26 times in sepsis patients compared with healthy donors. Using an animal model of stress -- hibernation -- they also discovered that iodide increases when arctic ground squirrels hibernate.

"The squirrels are doing the same thing as patients with life-threatening injuries: play dead but not be dead," said senior author Dr. Mark Roth, professor in the Basic Sciences Division at Fred Hutchinson Cancer Research Center.

"Our study suggests that rapid increases of iodide in the blood could represent an ancient response to stress that is shared across animals. If we can harness this capability it could transform emergency medicine," he said.

Roth's field of research explores how to put animals into temporary suspended animation, like pressing pause between life and death. He was named a 2007 MacArthur fellow, receiving the so-called "genius grant" for his pioneering efforts.

Roth has been trying to understand how suspended animation might be used to survive near-death experiences. For example, he recalls one story of a 15-year-old who survived a trans-Atlantic flight in the wheel well of a jet dressed only in a T-shirt.

"These latest findings put us a step closer to figuring out what happened -- perhaps this kid raised his blood iodide fast enough to basically hibernate before the cold and low oxygen could kill him," said Roth.

It may be odd to compare hibernating squirrels with trauma and sepsis patients, as the current study did. But what unites them is they're experiencing significant physiological stress. The study showed that, in all these groups, stress activated the same chemical responses to prevent further damage to the body.

The researchers used a blood test to see how blood iodide rose in hospitalized trauma and sepsis patients and in the hibernating ground squirrels.

The scientists also found in a mouse model that giving iodide before injury improved recovery.

They believe the protective pathway works through iodide's relationship with stress signals. Under stressful situations, iodide is released to circulate freely in the bloodstream. It acts to shield against tissue damage during stress by turning harmful hydrogen peroxide into harmless molecules of oxygen and water and reducing inflammation.
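The description matches the classic iodide-catalysed decomposition of hydrogen peroxide; as a sketch (the steps below are standard textbook chemistry, not spelled out in the study):

H2O2 + I⁻ → H2O + OI⁻
OI⁻ + H2O2 → H2O + O2 + I⁻
Net: 2 H2O2 → 2 H2O + O2

Because the iodide emerges unchanged, it can act catalytically, which fits the "recyclable" description in the quote below.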

"We've known for many years that stress-induced inflammation makes injuries even worse," said co-author Dr. Ronald Maier, surgeon-in-chief at Harborview Medical Center, which is part of UW Medicine.

"In this study we found that iodide could provide a recyclable, effective and safe way to block damage from excessive inflammation caused by over production of oxygen radicals after injury and provides a potential therapeutic approach to enhance recovery, prevent complications and reduce mortality in severely injured patients. The use of iodide in the clinical setting should soon be moving to clinical trials," Maier added.

Faraday Pharmaceuticals, a company founded by Dr. Roth in 2014 and led by Dr. Stephen Hill, CEO, is investigating how administering a therapeutic containing iodide may improve recovery following heart attack. The company has completed a Phase 2 study in reperfusion injury following a heart attack and is about to initiate a Phase 2 study exploring the use of iodide in ICU-acquired weakness following trauma.

In another study, Roth's team is currently looking for changes in iodide in blood samples from patients with COVID-19 -- a disease known to have a strong inflammatory response, called a cytokine storm. They're hoping to have results from that study later in the fall.

Credit: 
Fred Hutchinson Cancer Center

Research: COVID-19 is echoed in dreams

The content of the nightmares of nearly a thousand individuals during the coronavirus pandemic was analysed in a study published in the journal Frontiers in Psychology. The study found that the pandemic had affected more than half of the bad dreams reported.

The study, which was based on the crowdsourcing of dreams, saw more than 4,000 people respond to a survey in the sixth week of the state of emergency caused by the coronavirus pandemic in Finland. The survey was published in connection with an article on dreams that appeared in the Helsingin Sanomat daily. Roughly 800 respondents also described their dreams.

"It was interesting to see recurring dream content, which echoed the apocalyptic atmosphere of the circumstances brought about by COVID-19," says Professor Anu-Katriina Pesonen, head of the Sleep and Mind research group at the University of Helsinki.

"The findings enabled us to speculate that dreaming in the middle of exceptional circumstances is a form of shared mindscape between individuals," she adds.

Themes related to the pandemic repeated in nightmares

Together with her team, Pesonen translated the dream content from Finnish into English word lists and analysed them with an AI-based approach that identifies combinations of words that frequently recur together. The computational analysis established 'dream clusters' on the basis of the statistical co-occurrence of recurring word associations and their networks. In other words, the dream associations served as individual dream-content particles, not comprehensive dream narratives.
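As an illustration of this kind of co-occurrence analysis, here is a minimal sketch in Python. The word lists, threshold and clustering rule are invented for demonstration and are much simpler than the study's actual pipeline.

```python
# Toy co-occurrence clustering: count word pairs that appear in the
# same dream, keep frequent pairs, and group them into clusters.
from collections import Counter
from itertools import combinations

dreams = [  # each entry stands for one dream, reduced to a word list
    ["hug", "mistake", "handshake", "crowd"],
    ["handshake", "distance", "crowd", "party"],
    ["mask", "shop", "forgotten"],
    ["mask", "infection", "hospital"],
]

pair_counts = Counter()
for words in dreams:
    for a, b in combinations(sorted(set(words)), 2):
        pair_counts[(a, b)] += 1

MIN_COOCCURRENCE = 2   # arbitrary threshold for this toy example
parent = {}

def find(w):           # union-find with path compression
    parent.setdefault(w, w)
    while parent[w] != w:
        parent[w] = parent[parent[w]]
        w = parent[w]
    return w

for (a, b), n in pair_counts.items():
    if n >= MIN_COOCCURRENCE:
        parent[find(a)] = find(b)   # merge words that co-occur often

clusters = {}
for w in parent:
    clusters.setdefault(find(w), set()).add(w)
print(list(clusters.values()))      # e.g. [{'crowd', 'handshake'}]
```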

Many of the dream clusters were thematic, and there was pandemic-related content in more than half of the nightmare clusters. Such content included failure to observe safe social distancing, contracting the coronavirus, masks and other protective equipment, dystopias and the apocalypse.

For example, the word associations in a dream cluster named 'Ignoring social distancing' included hugging by mistake, hug-handshakes, restrictions related to handshakes, handshaking distance, lapses in social distancing, restrictions related to gatherings and crowded parties.

"The computational analysis carried out in the study is new to dream research," Pesonen notes. "Indeed, we hope to see more AI-aided efforts in the field in the future."

New details pertaining to stress during the pandemic

The study also offered some insights into people's sleeping habits and stress levels during the pandemic. For instance, more than half of the respondents reported having slept more than before, although 10% of respondents found falling asleep more difficult and 25% had more nightmares than before.

That more than half of the study participants said their stress levels had increased is not surprising, and the rise was in turn connected to having nightmares. Those experiencing the most severe stress also had dreams with pandemic-related content.

Sleep is a key factor associated with mental health, with recurring powerful nightmares a potential indication of post-traumatic stress. The content of dreams is not entirely arbitrary, but it may be key to understanding what lies at the heart of stress, traumas and anxiety.

Credit: 
University of Helsinki

Ultrasensitive microwave detector developed

image: Microwave bolometer based on graphene Josephson junction.

Image: 
Sampson Wilcox from MIT

A joint international research team from POSTECH in South Korea, Raytheon BBN Technologies, Harvard University and the Massachusetts Institute of Technology in the U.S., the Barcelona Institute of Science and Technology in Spain, and the National Institute for Materials Science in Japan has developed ultrasensitive sensors that can detect microwaves with the highest theoretically possible sensitivity. The research findings, published in the journal Nature on October 1, are drawing attention as an enabling technology for commercializing next-generation technologies, including quantum computers.

Microwaves are used in a wide range of scientific and technological fields, including mobile communications, radar and astronomy. Recently, research has been actively conducted on detecting microwaves at extremely high sensitivity for next-generation quantum technologies such as quantum computing and quantum communication.

Currently, microwave power can be detected using a device called a bolometer. A bolometer usually consists of three components: a material that absorbs electromagnetic waves, a material that converts the absorbed waves into heat, and a material that converts the generated heat into electrical resistance. The amount of absorbed radiation is then calculated from the change in electrical resistance. With semiconductor diodes such as silicon or gallium arsenide as the absorber, state-of-the-art commercial bolometers operating at room temperature are limited to a sensitivity of about 1 nanowatt (one billionth of a watt) when averaging over one second.

The research team broke through this limit by rethinking both the materials and the structure of the device. First, the team used graphene as the material for absorbing electromagnetic waves. Graphene is made up of a single layer of carbon atoms and has a very small electronic heat capacity, meaning that even a small amount of absorbed energy produces a large temperature change. Microwave photons carry very little energy, but when absorbed by graphene they can cause a considerable temperature rise. The problem is that graphene also cools down very quickly, making the change difficult to measure.

To solve this problem, the research team adopted a device called a Josephson junction. This quantum device, composed of a superconductor-graphene-superconductor (SGS) sandwich, can detect temperature changes within 10 picoseconds (10 trillionths of a second) via an electrical readout, making it possible to capture the temperature change in graphene and the resulting change in electrical resistance.

Combining these key ingredients, the researchers reached a noise equivalent power of 1 aW/√Hz, which means the device can resolve 1 aW (one quintillionth of a watt, 10^-18 W) within a second.
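As a rough sanity check on that claim: for a detector with noise equivalent power NEP, the smallest resolvable power after integrating for a time t is approximately NEP/√t, up to factors of order one. The snippet below is illustrative and not taken from the paper.

```python
# Minimum resolvable power P_min ~ NEP / sqrt(t) for integration time t.
from math import sqrt

NEP = 1e-18   # noise equivalent power, W/sqrt(Hz)  (1 aW/sqrt(Hz))

for t in (0.01, 1.0, 100.0):      # integration times in seconds
    p_min = NEP / sqrt(t)
    print(f"t = {t:6g} s  ->  P_min ~ {p_min:.1e} W")
# t = 1 s gives P_min ~ 1e-18 W, i.e. the 1 aW quoted in the text.
```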

"This study is significant in that it has established a scalable technology to enable the next-generation quantum devices," remarked Professor Gil-Ho Lee of POSTECH, who led the study. He further explained, "This study developed a bolometer technology that measures how many microwave photons are absorbed per unit time. But currently, we are developing a single-photon detection technology that can distinguish each microwave photon." He concluded, "We expect this technology to maximize the measuring efficiency of quantum computing and drastically reduce the indirect resources to enable large-scale quantum computers that will be of great use. Dr. Kin Chung Fong of Raytheon BBN Technologies commented, "We are seeing an unexpected interest in this study from those researching the origins of the universe in the field of radio astronomy and those studying dark matter in particle physics." He added, "This is an example of how research on basic science can be applied to various fields."

Credit: 
Pohang University of Science & Technology (POSTECH)

Enforcement more effective than financial incentives in reducing harmful peat fires?

image: A builder by trade turns his hand to fishing and casts his net as firms close in Riau Province, Sumatra, due to the toxic haze from burning peatlands.

Image: 
Bjorn Vaughn

A new study looking at incentives to reduce globally harmful peatland fires suggests that fear of enforcement and public health concerns influence behaviour more than the promise of financial rewards.

The findings come as wildfires devastate the US West Coast and Russian Arctic, and fire season begins in Australia, Indonesia and Brazil.

Led by the University of East Anglia (UEA), the research examined the intervention mix within a leading peat fire prevention programme in Indonesia and found that the incentives had little impact. Instead, communities responded more strongly to the deterrents of sanctions, such as fines, and to raised awareness about the negative health impacts of toxic smoke, or 'haze'. Indeed, fear of sanctions most consistently related to fire-free outcomes.

Indonesian peatlands are globally important for the carbon they store and help protect Southeast Asian biodiversity. However, they are undergoing rapid land-use change. They have been drained and frequently cleared using fire, often to enable the expansion of oil palm and acacia plantations.

Increasing fires are a leading environmental challenge, with impacts ranging from local harm to public health, livelihoods and daily freedoms through the release of toxic haze, to regional economic losses and global burdens associated with climate change through carbon emissions.

With the fire season in Indonesia imminent, following a bad year in 2019, the authors say their findings have implications for future fire management interventions, including how to balance reward and sanction to ensure equitable and effective fire mitigation.

The study, published in the journal Global Environmental Change, involved researchers from UEA, Lancaster University and the University of Cambridge, together with scientists from the US, France and Indonesia.

Lead author Dr Rachel Carmenta, from the Tyndall Centre and School of International Development at UEA, said: "Uncontrolled fires are increasing globally and the trend is predicted to continue. Humid tropical forests that wouldn't normally burn are now sites of extensive mega-fires. These include the Brazilian Amazon, which last year hit record highs, this year the Brazilian wetland ecosystem the Pantanal, which is suffering extensively from uncontrolled fires, and Indonesia's peat swamp forests, where extensive fires are now annual events.

"Our results highlight that incentives were less important than deterrents in shaping environmental outcomes. However, there was also no single pathway to fire-free outcomes, and combinations of interventions were particularly important in high fire risk situations.

"Previous research shows supporting small-scale farmers is the least controversial fire mitigation policy in Indonesian peatlands. But as we find in this study, even a scheme considered to depend heavily on incentives, in practice hinges on deterrents. This raises important equity concerns. While sanctions are effective, they may cause more damage to those most vulnerable and with least alternatives to fire dependence."

Intentional fires to clear land can more easily escape on peatland and result in extensive uncontrolled peat fires. The resulting toxic smoke is responsible for outdoor air pollution, with atmospheric particulate matter concentrations exceeding those considered extremely hazardous to health, and is linked to hundreds of thousands of public health cases.

Many solutions have been proposed, such as forest protection measures, moratoriums on peat expansion, and agricultural support. However, numerous programmes have largely failed, and what policy interventions to combine and how to align these to local conditions remains unclear.

To help address this, the researchers compared 10 Indonesian villages that participated in the Fire Free Village programme in Riau Province, Sumatra. The scheme is operated by a pulp and paper company to incentivise small-scale farmers living in communities adjacent to their acacia tree concession areas to reduce fire, and therefore the prevalence of uncontrolled fires.

If villages prevent local fires, they are rewarded with US$7,000 to support community projects. The programme also includes interventions that focus on sanctions and deterrents as part of the policy mix towards fire-free outcomes.

The team found that effective combinations of interventions depend on the landscape context of the village. In villages with lower fire risk, a single intervention was enough to reduce fire, for example the threat of enforcement for illegal burning. In these villages people had more diverse livelihood options, most land was already being farmed - reducing the need to use fire - and people farmed on mineral soils, which do not burn.

In villages with far higher risks of fire escape, fire was reduced only where at least two methods were combined: feared enforcement and concern about the impacts of fire haze on their health. Again, incentives did not matter.

People in higher fire risk villages were primarily reliant on oil palm for their livelihood. Village areas were on larger extents of highly flammable peatland and much of the land area was not planted, so people were still clearing for agriculture.

Credit: 
University of East Anglia

Molecules responsible for radio-resistant glioblastoma identified

image: Glioblastoma Multiforme (GBM) cells in culture (Photo: Jin-Min Nam)

Image: 
Jin-Min Nam

Scientists have identified key molecules that mediate radioresistance in glioblastoma multiforme; these molecules are a potential target for the treatment of this brain cancer.

Glioblastoma multiforme (GBM) is the most aggressive type of brain cancer. It is treated by radiation therapy combined with chemotherapy. However, even with treatment, the five-year survival rate for GBM is less than 7%. One of the major causes for this is that GBM rapidly develops radioresistance (resistance to radiotherapy) by unknown mechanisms.

A team of scientists from Hokkaido University and Stanford University have revealed a mechanism by which GBM develops radioresistance. Their research, published in the journal Neuro-Oncology Advances, explains how two key molecules, Rab27b and epiregulin, interact to contribute to radioresistance in GBM.

The primary function of Rab27b is to regulate protein trafficking and secretion of molecules. Rab27b is also known to promote tumor progression and metastasis in several types of cancer. For these reasons, the scientists decided to investigate if Rab27b had any role to play in GBM.

Upon performing tests on human glioblastoma cell lines, the scientists showed that Rab27b expression was increased for at least seven days after exposure to radiation. Knockdown of Rab27b increased the sensitivity of glioblastoma cells to irradiation. These tests were replicated in an animal model: the glioblastoma cells were injected into mice, which were then subjected to radiation therapy. Rab27b knockdown combined with radiation therapy delayed tumor growth and prolonged mouse survival time.

As Rab27b is a regulator of protein trafficking, the scientists continued their work, looking for other molecules that contribute to radioresistance. They discovered that changes in the expression of Rab27b led to corresponding changes in the expression of epiregulin, a growth factor whose expression is known to increase in cancer cells; knocking down the expression of epiregulin increased the sensitivity to irradiation, as seen in the cells with Rab27b knockdown. Further, the scientists showed that increased expression of Rab27b and epiregulin in glioblastoma induced the proliferation of surrounding cancer cells, which could contribute to acquiring radioresistance. Finally, they analyzed gene expression data of GBM patients and found that upregulation of Rab27b and epiregulin correlated with poor prognosis of the patients.

By identifying the roles that Rab27b and epiregulin play in the development of radioresistance in GBM, the scientists have brought to light a novel target for drug development, and one that could significantly increase the survival rate for GBM.

Dr. Jin-Min Nam and Dr. Yasuhito Onodera are part of the Radiation Biology group at the Global Center for Biomedical Science and Engineering (GCB), a collaboration between Hokkaido University, Japan, and Stanford University, USA. The group specializes in molecular and cellular oncology, and radiation biology.

Credit: 
Hokkaido University

Dinosaur feather study debunked

image: Fossil discovered at the site of four Archaeopteryx skeletons.

Image: 
Museum für Naturkunde

A new study provides substantial evidence that the first fossil feather ever to be discovered does belong to the iconic Archaeopteryx, a bird-like dinosaur named in Germany on this day in 1861. This debunks a recent theory that the fossil feather originated from a different species.

The research published in Scientific Reports finds that the Jurassic fossil matches a type of wing feather called a primary covert. Primary coverts overlie the primary feathers and help propel birds through the air. The international team of scientists led by the University of South Florida analyzed nine attributes of the feather, particularly the long quill, along with data from modern birds. They also examined the 13 known skeletal fossils of Archaeopteryx, three of which contain well-preserved primary coverts. The researchers discovered that the top surface of an Archaeopteryx wing has primary coverts that are identical to the isolated feather in size and shape. The isolated feather also came from the same fossil site as four skeletons of Archaeopteryx, further supporting their findings.

"There's been debate for the past 159 years as to whether or not this feather belongs to the same species as the Archaeopteryx skeletons, as well as where on the body it came from and its original color," said lead author Ryan Carney, assistant professor of integrative biology at USF. "Through scientific detective work that combined new techniques with old fossils and literature, we were able to finally solve these centuries-old mysteries."

Using a specialized type of electron microscope, the researchers determined that the feather came from the left wing. They also detected melanosomes, which are microscopic pigment structures. After refining their color reconstruction, they found that the feather was entirely matte black, not black and white as another study has claimed.

Carney's expertise on Archaeopteryx and diseases led to the National Geographic Society naming him an "Emerging Explorer," an honor that comes with a $10,000 grant for research and exploration. He also teaches a course at USF, called "Digital Dinosaurs." Students digitize, animate and 3D-print fossils, providing valuable experience in paleontology and STEAM fields.

Credit: 
University of South Florida

New discovery helps researchers rethink organoid cultures

image: A look at an organoid sample, with different sizes based on their location in the culture.

Image: 
University of Texas at Austin

Organoids are stem cell-based tissue surrogates that can mimic the structure and function of organs, and they have become a key component of numerous types of medical research in recent years. But researchers from The University of Texas at Austin have uncovered problems with the conventional method for growing organoids for common experiments that may cause misleading results.

The researchers discovered that the size of organoids differs depending on where they are located within the hydrogel material, called extracellular matrix (ECM), that is commonly used in biomedical research. The team found that organoids on the edges of a dome-shaped ECM respond differently to chemical or biological stimuli compared to those in the center of the dome.

This observation means one organoid in the core might react positively to a new treatment or drug, while another one on the edge could have a negative reaction, potentially muddying the results of an experiment. Ideally, organoids would be consistent in size and reaction in preclinical experiments.

"There are hundreds of organoids in the hydrogel dome, and they're showing different sizes, different functions, and that can be problematic" said Woojung Shin, a postdoctoral fellow and recent Ph.D. graduate from the Cockrell School of Engineering's Department of Biomedical Engineering, who discovered the problem. "You may get very different results from what would actually happen in the human body as a result."

The findings were published recently in Cell Press' iScience. The team includes researchers from the Biomimetic Microengineering Laboratory in the Cockrell School and the Livestrong Cancer Institutes of UT's Dell Medical School.

The research began when Shin noticed that something felt off while examining organoids: they were slightly different sizes depending on their location in the sample.

They repeated the experiment and identified the same issues time after time with different organoid lines. The team found morphogens in the culture medium -- signaling molecules that are essential for organoid growth -- that can spread and create a "gradient" within the hydrogel domes. One of the representative morphogens, Wnt3a, was extremely unstable. A computational simulation confirmed that the size difference in organoids is likely explained by the morphogen gradient and its instability.
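To see why an unstable morphogen produces location-dependent exposure, consider the standard steady state of diffusion with first-order decay: C(x) = C0·exp(-x/λ), with decay length λ = √(D/k). When λ is comparable to the dome radius, organoids at the edge and at the core see very different concentrations. The parameters below are invented, order-of-magnitude assumptions (a hypothetical Wnt3a-like half-life and hydrogel diffusivity), not measurements from the paper.

```python
# Steady-state gradient of a decaying morphogen in a hydrogel dome.
import math

D = 1e-10                        # diffusivity in hydrogel, m^2/s (assumed)
half_life_s = 2 * 3600           # assumed 2-hour half-life (unstable protein)
k = math.log(2) / half_life_s    # first-order decay rate, 1/s

lam = math.sqrt(D / k)           # decay length of the gradient, m
print(f"decay length ~ {lam * 1e3:.2f} mm")   # ~1 mm, comparable to a dome

# Relative concentration from the dome edge (x = 0, facing the medium)
# toward the core:
for x_mm in (0.0, 0.5, 1.0, 2.0):
    c = math.exp(-x_mm * 1e-3 / lam)
    print(f"x = {x_mm:3.1f} mm -> C/C0 ~ {c:.2f}")
```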

The paper mainly focuses on the problem the researchers uncovered, but it also offers a roadmap for finding solutions. The key, the researchers say, is to stabilize the Wnt3a protein across the sample, reducing the size of the gradient created and, subsequently, the location-based differences in the organoids.

Shin is a member of biomedical engineering assistant professor Hyun Jung Kim's research group. She focuses on disease modeling and bioinspired organ mimicry.

Organoids are an important part of the ongoing research conducted by Kim and his group. The team uses nature's engineering principles, or biomimetic engineering, to solve the fundamental questions about human health and disease, most notably through its organ-on-a-chip technology.

Continuing to refine organoid research principles is key to the success of Kim's group as well as a host of different types of medical research. The paper mentions disease modeling, tissue engineering, patient-specific validation of new drug candidates and research into the relationship between demographics and disease as areas that have benefitted from organoid research.

"We really want to have reproducible and reliable experimental results," Kim said. "What we've found here is that we all need to be more cautious about how we interpret data, and then maybe we can decrease the risk of misinterpretation."

Credit: 
University of Texas at Austin

Welsh-medium school pupils underperform in tests despite more advantaged backgrounds

video: New research from Lancaster University Management School reveals that secondary schools in Wales that teach pupils through the medium of Welsh are outperformed by their English-speaking counterparts in maths, reading and science tests.

Image: 
Lancaster University Management School

Secondary schools in Wales that teach pupils through the medium of Welsh are outperformed by their English-speaking counterparts in maths, reading and science tests, according to a new study by Lancaster University.

The average results of pupils attending Welsh-language secondary schools are markedly lower than pupils in English-language schools. This is despite Welsh-medium school pupils having more books available at home, spending more time on their studies outside of school and far fewer qualifying for free school meals.

New research, published today in the Wales Journal of Education, uses Programme for International Student Assessment (PISA) data from 2015, which capture the results of standardised tests taken by 15-year-olds across 80 countries, to compare results in maths, reading and science.

Within Wales, PISA data reveal that the average score for Welsh-language school pupils in the maths test was 476, compared to 485 for English-language school pupils. In reading tests, Welsh-medium pupils scored an average of 469, compared to 494 for English-medium pupils. In science tests, Welsh-medium pupils averaged 484, compared to 499 for English-medium pupils.

Geraint Johnes, a Professor of Economics at Lancaster University Management School, authored the study. He said: "Recent concerns about the standards of education in Wales have prompted reforms but while there is a great deal of focus on the education system, there has been little attention paid to the comparative performance of English-language and Welsh-language schools.

"Despite Welsh-medium schools being regarded very highly and attracting wealthier families, data reveal that secondary school pupils achieve lower scores in reading, maths and science tests when compared with those in English-medium schools. Considering the pupils are coming from more privileged homes, you may expect them to achieve the same scores or perhaps even higher - but this is not the case."

Around 200,000 pupils are taught in just over 200 secondary schools in Wales; 24 per cent of these schools are Welsh-medium, catering for around 20 per cent of pupils. Of the 200 schools, 140 participated in the 2015 PISA tests, 18 of which were Welsh-medium.

The new study also looks at how advantaged or disadvantaged each pupil may be in terms of family background, including measures such as household wealth. Using the method of 'data envelopment analysis', the study establishes how 'efficiently' students transform these background factors into results.
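For readers unfamiliar with the method, below is a minimal sketch of efficiency scoring with the classic input-oriented CCR model of data envelopment analysis, solved as a linear program. The units and their inputs/outputs are invented stand-ins (background resources in, a test score out); the study's actual variables, sample and specification differ.

```python
# Input-oriented CCR data envelopment analysis via linear programming.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0],   # inputs per unit, e.g. books at home, study hours
              [3.0, 6.0],
              [4.0, 5.0]])
Y = np.array([[480.0],      # outputs per unit, e.g. a PISA-style score
              [500.0],
              [470.0]])

def ccr_efficiency(k):
    """Efficiency of unit k; 1.0 means it lies on the efficient frontier."""
    n_units, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s), input weights v (length m).
    c = np.concatenate([-Y[k], np.zeros(m)])          # maximise u.y_k
    A_ub = np.hstack([Y, -X])                         # u.y_j - v.x_j <= 0 for all j
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]  # normalisation: v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n_units),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun

for k in range(len(X)):
    print(f"unit {k}: efficiency = {ccr_efficiency(k):.3f}")
```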

Professor Johnes continues: "In addition to measuring test scores, I looked at how 'efficient' pupils were in terms of their performance, considering their background and socio-economic status. For example, pupils from poorer families with less time to study outside of school but who manage to achieve fantastic results are classed as highly efficient. I found that the best, most efficient pupils in Welsh-medium schools were still around 10 per cent less efficient than those attending English-medium schools.

"There are a few plausible explanations for the difference in results that we see. Welsh speaking schools face tougher challenges when recruiting teaching staff, resulting in them fishing in a more limited talent pool. There is also a chance that there are systematic differences in how schools approach these types of standardised tests - they may not taken as seriously by some teachers and pupils as they are by others. To eliminate any doubt, further advances need to be made in terms of releasing data sets about secondary schools in Wales so additional in-depth analysis can be done."

Credit: 
Lancaster University

How everyday speech could transmit viral droplets

video: Close-up of a high-speed video of filament break-up and resulting atomization between the lips of a speaker saying 'Pa' in 'PaPa'.

Image: 
Abkarian & Stone, Phys. Rev. Fluids / APS (2020)

It is well known that an individual infected with the coronavirus can spread it to others through respiratory droplets projected by violent expiratory events like coughing and sneezing. Evidence shows that the virus can also be transmitted before these symptoms arise. The airflow generated by everyday conversation is increasingly recognized as a potent route of transmission, especially as people spend more time indoors during the fall and winter. Using high-speed imaging of an individual producing common speech sounds, Abkarian and Stone report that the sudden bursts of airflow produced by the articulation of consonants like /p/ or /b/ carry salivary and mucus droplets for at least a meter in front of a speaker. In additional experiments, the researchers demonstrate that an ordinary lip balm reduces the droplets contained in speech-driven flows.

Credit: 
American Physical Society

A cancer shredder

image: Fighting cancer with a newly developed substance that shreds carcinogenic Aurora proteins: this is the aim of a new study by scientists at universities in Würzburg and Frankfurt.

Image: 
Dr. Sandy Pernitzsch

The villain in this drama has a pretty name: Aurora - Latin for dawn. In the world of biochemistry, however, Aurora (more precisely: Aurora-A kinase) stands for a protein that causes extensive damage. It has long been known that Aurora often causes cancer: it triggers the development of leukemias and many pediatric cancers, such as neuroblastomas.

Researchers at the universities of Würzburg and Frankfurt have now developed a drug that can disarm Aurora. Dr. Elmar Wolf, biochemist and research group leader at the Biocenter of Julius-Maximilians-Universität Würzburg (JMU), and Stefan Knapp, Professor of Pharmaceutical Chemistry at Goethe University Frankfurt, have played a leading role in this development. The results of their work have now been published in the latest issue of Nature Chemical Biology.

Making tumor-promoting proteins disappear

"Cancers are usually triggered by tumorigenic proteins," explains Elmar Wolf. Because cancer cells produce more of these proteins than normal cells, the dynamics are additionally increased. A common therapeutic approach is therefore to inhibit the function of these proteins with drugs. "Although the proteins are then still there, they no longer function as well. This makes it possible to combat the tumor cells," he says.

However, the development of these inhibitors is difficult and has so far not been successful for all tumor-promoting proteins. To date, none of the candidates that inhibit Aurora has shown the desired results in clinical practice. The dream of many scientists is therefore to develop a drug that not only inhibits the tumor-promoting proteins but makes them disappear completely. A promising approach along this path could be a new class of substances with the scientific name "PROTAC".

In vitro cancer cells die

"We have developed such a PROTAC for Aurora," says Elmar Wolf. Together with his team and especially his doctoral student Bikash Adhikari, he was able to show that this PROTAC completely degrades the Aurora protein in cancer cells. Such cells cultivated in the laboratory died as a result.

Wolf describes the mode of action of this substance as follows: "The tumor needs certain tumor-promoting proteins, which we can imagine as the pages of a book. Our PROTAC substance tears out the 'Aurora' pages and destroys them with the help of the machinery that every cell has to degrade old and broken proteins." PROTAC thus "shreds" the Aurora protein, as it were, until nothing of it remains.

Further work is required

Professor Stefan Knapp from the Institute of Pharmaceutical Chemistry at Goethe University explains: "Aurora-A kinase is present in much higher concentrations in many cancer tissues than in healthy tissue, and it also plays a key role in prostate cancer. Blocking the activity of Aurora-A kinase alone does not seem to be a promising approach, as none of the many clinically tested drug candidates has achieved clinical approval. With our PROTAC variant, we inhibit Aurora-A kinase via another, possibly more effective mechanism, which may open up new treatment options. That's why, in the next step, we'll test effectiveness and tolerance in animal models."

Credit: 
University of Würzburg

Geoscience: Cosmic diamonds formed during gigantic planetary collisions

It is estimated that over 10 million asteroids circle the Sun in the asteroid belt. They are relics from the early days of our solar system, when our planets formed out of a large cloud of gas and dust rotating around the sun. When asteroids are cast out of orbit, they sometimes plummet towards Earth as meteoroids. If they are big enough, they do not burn up completely when entering the atmosphere and can be found as meteorites. The geoscientific study of such meteorites makes it possible to draw conclusions not only about the evolution and development of planets in the solar system but also about their destruction.

A special type of meteorite is the ureilite. Ureilites are fragments of a larger celestial body - probably a minor planet - which was smashed to pieces through violent collisions with other minor planets or large asteroids. Ureilites often contain large quantities of carbon, among others in the form of graphite or nanodiamonds. The diamonds now discovered, measuring 0.1 millimetres and more, cannot have formed when the meteoroids hit the Earth: impact events with such vast energies would make the meteoroids evaporate completely. That is why it had so far been assumed that these larger diamonds - like those in the Earth's interior - must have formed under sustained pressure in the interior of planetary precursors the size of Mars or Mercury.

Together with scientists from Italy, the USA, Russia, Saudi Arabia, Switzerland and the Sudan, researchers from Goethe University have now found the largest diamonds ever discovered in ureilites from Morocco and the Sudan and analysed them in detail. Apart from diamonds of up to several hundred micrometres in size, numerous nests of nanometre-scale diamonds as well as nanographite were found in the ureilites. Closer analyses showed that the nanodiamonds contain what are known as lonsdaleite layers, a modification of diamond that only forms under sudden, very high pressure. Moreover, other minerals (silicates) in the ureilite rocks under examination displayed typical signs of shock pressure. In the end, it was the presence of these larger diamonds together with nanodiamonds and nanographite that led to the breakthrough.

Professor Frank Brenker from the Department of Geosciences at Goethe University explains:

"Our extensive new studies show that these unusual extraterrestrial diamonds formed through the immense shock pressure that occurred when a large asteroid or even minor planet smashed into the surface of the ureilite parent body. It's by all means possible that it was precisely this enormous impact that ultimately led to the complete destruction of the minor planet. This means - contrary to prior assumptions - that the larger ureilite diamonds are not a sign that protoplanets the size of Mars or Mercury existed in the early period of our solar system, but nonetheless of the immense, destructive forces that prevailed at that time."

Credit: 
Goethe University Frankfurt