
Hidden messages in protein blueprints

Scientists from the German Cancer Research Center (DKFZ), the Heidelberg Institute of Stem Cell Technology and Experimental Medicine (HI-STEM) and the Max Planck Institute in Freiburg have identified a new control mechanism that enables stem cells to adapt their activity in emergency situations. For this purpose, the stem cells simultaneously modify the blueprints for hundreds of proteins encoded in the gene transcripts. In this way, they control the amount of protein produced as well as the formation of certain protein isoforms. If this mechanism is inactivated, stem cells lose their self-renewal potential and can no longer react adequately to danger signals or inflammation.

Messenger RNA molecules (mRNAs), the transcripts of genes, serve as the blueprint for the construction of proteins. In all higher organisms, the cell attaches a long chain of adenine nucleotides, the so-called poly(A) tail, to the rear end of the transcripts in a process known as polyadenylation. The length and position of the chain varies from organism to organism and serves to stabilize the RNA molecule.

Specific signal motifs indicate to the participating enzymes where the poly(A) chain is to be attached to the transcript. This does not always happen at the same site on the mRNA. The differential use of these sites is known as "alternative polyadenylation". This mechanism affects the length of the so-called 3'-untranslated region of the mRNA, a region that contains information beyond the protein sequence. The 3'-untranslated region is particularly important for stability, localisation and the efficiency with which the transcripts are translated into proteins. "Only recently has it become clear that some cell types use this mechanism to control how much protein is produced per transcript and which isoform is to be expressed," says Pia Sommerkamp, the lead author of the study conducted by DKFZ and HI-STEM.

By applying a novel sequencing method, the scientists were able to identify numerous genes that are essential for stem cell development and are regulated via alternative polyadenylation during differentiation or in response to inflammation. These include the central metabolic enzyme glutaminase, which can be produced in two differently active isoforms. As the researchers found out, the activation of blood stem cells leads to a change from the less active to the highly active glutaminase isoform. This switch is coordinated by alternative polyadenylation.

"Only this isoform switch enables the stem cells to adapt all the necessary metabolic pathways according to their needs. This includes rapid increases in activity that are necessary in the case of infections or inflammations," explains Nina Cabezas-Wallscheid, who was co-supervisor of the study at the MPI in Freiburg. "With alternative polyadenylation, we have now discovered another control level with which stem cells regulate vital processes. We now want to investigate in more detail whether cancer stem cells also use this mechanism for their own purposes in leukaemias. We hope that this will provide us with new approaches for fighting the disease," explains Andreas Trumpp, Director of HI-STEM gGmbH at DKFZ and senior author of the study.

Credit: 
German Cancer Research Center (Deutsches Krebsforschungszentrum, DKFZ)

Drug used for liver disease also affects C. diff life cycle, reduces inflammation in mice

Researchers from North Carolina State University have found that a commonly used drug made from secondary bile acids can affect the life cycle of Clostridioides difficile (C. diff) in vitro and reduce the inflammatory response to C. diff in mice. The findings aid understanding of how this drug may be used in future treatment of C. diff infections in humans.

The drug in question - ursodiol, ursodeoxycholate or UDCA - is a secondary bile acid made by bacteria in the gut and is also FDA approved to treat inflammatory liver diseases. It is currently in Phase 4 clinical trials for use in treatment of C. diff infections.

"If UDCA proves effective against C. diff infection, it would give us an alternative to antibiotic treatments that further disrupt the gut microbiome and can lead to relapse, or to fecal transplants that may have unknown side effects," says Casey Theriot, assistant professor of population health and pathobiology at NC State and corresponding author of the work.

Theriot and a team of researchers from NC State, Pennsylvania State University and the University of North Carolina at Chapel Hill looked at UDCA treatment of C. diff in vitro and as a pre-treatment in a mouse model of the disease.

C. diff exists in the environment as a dormant spore. In humans, the spores colonize the large intestine by germinating, becoming bacteria that produce damaging toxins. Theriot and her team, led by former NC State graduate student Jenessa Winston, knew that UDCA would inhibit germination, growth and toxin production in vitro, and wanted to see if it would have the same effect in a mouse model.

In vitro, treatment with UDCA significantly decreased C. diff spore germination, growth and toxin activity. In the mouse model, pre-treatment with UDCA had some effect on bacterial growth, but the main effect of treatment was to suppress the inflammatory response of the immune system to bacterial growth and toxin.

"Mitigating the immune response means that pre-treatment with UDCA could significantly reduce tissue damage due to C. diff infection," Theriot says. "This work is the first to explore how UDCA works in vivo against C. diff infection, and demonstrates that the drug may be a viable pre-treatment to help patients avoid damaging effects of a C. diff infection. Our next steps will be to look at dosages and timing, in order to determine how to use it most effectively."

Credit: 
North Carolina State University

Extreme, high temperatures may double or triple heart-related deaths

DALLAS, March 30, 2020 -- When the average daily temperature reaches an extreme of 109 degrees Fahrenheit, the number of deaths from cardiovascular disease may double or triple. Researchers note that these findings raise concerns that traditionally hot regions may be especially vulnerable to heat-related cardiovascular deaths, according to new research published today in Circulation, the flagship journal of the American Heart Association.

The highest temperature on earth in the last 76 years, 129 degrees Fahrenheit, was recently recorded in Kuwait. Given the consistently high temperatures in Kuwait (average ambient temperature 82.2 degrees Fahrenheit), researchers examined the relationship between temperature and more than 15,000 cardiovascular-related deaths in the country. All death certificates in Kuwait from 2010 to 2016 that cited "any cardiovascular cause" for individuals ages 15 and older were reviewed for this study.

Compared with the number of deaths on days with the lowest-mortality temperature (an average daily temperature of 94.5 degrees Fahrenheit), the investigators found that when the 24-hour average temperature was extreme (109 degrees Fahrenheit or higher):

Overall, a 3-times greater risk of dying from any cardiovascular cause;

Men were more affected by the extreme temperatures, experiencing a 3.5 times higher death rate;

The death rate among women was nearly 2.5 times higher;

Working-age people (ages 15-64 years) had a death rate 3.8 times higher; and

The death rate was just over 2-times higher for people 65 and older.

To examine the effects of temperature on its own, the investigators adjusted for other environmental factors such as air pollution and humidity. Higher temperatures affected men and women, and people of different ages, to different degrees.
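The figures above are rate ratios: the daily death rate on extreme-heat days divided by the rate on reference days. A minimal sketch with hypothetical counts (the study reports only the ratios, not the raw numbers behind them):

```python
# Illustrative only: hypothetical daily death counts chosen to mirror
# the reported three-fold overall ratio; not data from the study.

def rate_ratio(deaths_extreme, days_extreme, deaths_reference, days_reference):
    """Ratio of the daily death rate on extreme-heat days to that on reference days."""
    return (deaths_extreme / days_extreme) / (deaths_reference / days_reference)

# E.g. 90 deaths over 10 extreme days vs. 150 deaths over 50 reference days.
rr = rate_ratio(90, 10, 150, 50)
print(round(rr, 1))  # 3.0 -- a three-fold higher cardiovascular death rate
```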

"While cardiologists and other medical doctors have rightly focused on traditional risk factors, such as diet, blood pressure and tobacco use, climate change may exacerbate the burden of cardiovascular mortality, especially in very hot regions of the world," said Barrak Alahmad, M.B.Ch.B, M.P.H., a mission scholar from Kuwait University and a Ph.D. candidate in environmental health at the Harvard T.H. Chan School of Public Health in Boston.

When core body temperature increases, the human body tries to cool itself by shifting blood from the organs to underneath the skin. This shift causes the heart to pump more blood, putting it under significantly more stress. A collaborative group of cardiologists, environmental health specialists and epidemiologists hypothesized that increasing temperatures in hotter regions of the world could lead to increased CVD death due to extreme heat's effects on the body.

"The warming of our planet is not evenly distributed. Regions that are inherently hot, like Kuwait and the Arabian Peninsula, are witnessing soaring temperatures unlike ever before. We are sounding the alarm that populations in this part of the world could be at higher risk of dying from cardiovascular causes due to heat," said Alahmad. "Although we cannot conclude it from this analysis, men and working-age people may have been at greater increased risk because of spending more time outside. We need to explore this relationship further and consider serious preventive strategies that could reduce the impact of rising temperatures on our health."

The study was limited by only having information on any cardiovascular cause of death, so it is not known whether any specific type of heart problem is more susceptible to the influence of extreme heat. Although the researchers found a strong association between extremely high temperatures and increased cardiovascular deaths, further research is needed to establish a cause and effect relationship.

Credit: 
American Heart Association

Advances in production of retinal cells for treating blindness

image: Fredrik Lanner, Sandra Petrus-Reurer, Alvaro Plaza Reyes and Anders Kvanta, researchers at Karolinska Institutet, Sweden. Photo: Paschalis Efstathopoulos

Image: 
Paschalis Efstathopoulos

Researchers at Karolinska Institutet and St Erik Eye Hospital in Sweden have discovered a way to refine the production of retinal cells from embryonic stem cells for treating blindness in the elderly. Using CRISPR/Cas9 gene editing, they have also managed to modify the cells so that they can hide from the immune system to prevent rejection. The studies are published in the scientific journals Nature Communications and Stem Cell Reports.

Age-related macular degeneration of the eye is the most common cause of blindness in the elderly. This loss of vision is caused by the death of the photoreceptors (the rods and cones) resulting from the degeneration and death of the underlying retinal pigment epithelium (RPE) cells, which provide the rods and cones with vital nourishment. A possible future treatment could be to transplant fresh RPE cells formed from embryonic stem cells.

Working with colleagues at St Erik Eye Hospital, researchers at Karolinska Institutet have now found specific markers on the surface of the RPE cells that can be used to isolate and purify these retinal cells. The results are published in Nature Communications.

"The finding has enabled us to develop a robust protocol that ensures that the differentiation of embryonic stem cells into RPE cells is effective and that there is no contamination of other cell types," says principal investigator Fredrik Lanner, researcher at the Department of Clinical Science, Intervention and Technology and the Ming Wai Lau Center for Reparative Medicine at Karolinska Institutet. "We've now begun the production of RPE cells in accordance with our new protocol for the first clinical study, which is planned for the coming years."

One obstacle when transplanting tissue generated from stem cells is the risk of rejection, which occurs if transplantation antigens of the donor and patient tissue differ. Research groups around the world are therefore working on creating what are known as universal cells, which ideally will not trigger an immune response.

In a study published in Stem Cell Reports the same group at Karolinska Institutet created embryonic stem cells able to hide from the immune system. Using CRISPR/Cas9 gene editing, they removed certain molecules, HLA class I and class II, which sit on the surface of the stem cells as a means by which the immune system can identify them as endogenous or not. The stem cells lacking these molecules were then differentiated into RPE cells.

The researchers have been able to show that the modified RPE cells retain their character, that no harmful mutations appear in the process and that the cells can avoid the immune system's T cells without activating other immune cells. The rejection response was also significantly less and more delayed than after the transplantation of regular RPE cells, the surfaces of which still possess HLA molecules.

"The research is still in an early stage, but this can be an important initial step towards creating universal RPE cells for the future treatment of age-related macular degeneration," says joint last author Anders Kvanta, adjunct professor at the Department of Clinical Neuroscience and consultant at St Erik Eye Hospital.

Credit: 
Karolinska Institutet

Microelectronics for birds

image: Portable device for local application of oscillating magnetic fields: 1 - generator; 2 - batteries; 3 - connecting cable; 4 - magnetic coil; 5 - loops for attaching the device on the back of the bird.

Image: 
SPbU

The scientists have discovered that magnetoreception in birds is unlikely to be associated with the light-sensitive protein cryptochrome in the retina of their eyes, though photochemical reactions in cryptochrome have been so far considered to be the primary biophysical mechanism behind the magnetic sense of birds.

At present, researchers are puzzled by the mechanism of avian magnetic orientation, which allows migratory birds to solve the complex tasks of moving between distant breeding and wintering grounds. The existence of a magnetic compass system in birds has been well documented, but the sensory mechanism and its location still remain unknown. What is known is that birds use multiple sources of directional information for their navigation: the sun, stars, landscape, smells, as well as parameters of the Earth's geomagnetic field. Birds can use the geomagnetic field to maintain the flight direction (magnetic compass) and, possibly, to determine their location (magnetic map). It was also established that the avian magnetic compass can be disrupted if the whole body of the bird is exposed to a weak oscillating magnetic field (OMF) in the megahertz range. Furthermore, in 2018, an international team of researchers, which included scientists from St Petersburg University, found that magnetic map information is transmitted to the brain of the bird through the ophthalmic branches of the trigeminal nerve. Thus, the functions and certain neural pathways are known, yet the receptor itself has still not been found.

According to one of the theories explaining the unique ability of migratory birds, the most probable magnetoreceptors are molecules of a photosensitive protein, the cryptochrome. These are located in the retina of the bird's eye. The outcome of the photochemical reactions inside the magnetically sensitive protein depends on the intensity and direction of the magnetic field. Some researchers suggest that the cryptochromes in birds' eyes give them an ability to 'see' the Earth's magnetic fields. To check whether the avian magnetic compass is really embedded in the visual system, the researchers from St Petersburg University decided to apply OMF locally to the bird's eyes. Theoretically, this should have disrupted the bird's orientation.

'We have developed miniaturised devices, each weighing just 0.95 grams and comprising a magnetic coil and a high-frequency generator fed from watch batteries,' explained Kirill Kavokin, Leading Scientist of the I.N. Uraltsev Spin Optics Laboratory at St Petersburg State University. 'Since the coils are very small, the magnetic field can be applied locally only to the bird's eyes. It was a challenge to construct such a device. Besides, we spent a lot of time selecting a non-toxic adhesive that would be safe for birds and provide reliable bonding. This turned out to be an eyelash glue.'

Twenty-two garden warblers were equipped with portable devices, with the magnetic coil attached to the bird's head. The garden warbler is a widespread small bird that breeds in most of Europe and winters in Africa. The experimental birds were captured during the autumn migration at the Biological Station Rybachy on the Courish Spit, Kaliningrad region, Russia. After a series of experiments, it was determined that the birds behaved the same way whether the attached devices were switched on or off. The birds were tested in round arenas placed inside a screened and grounded non-magnetic chamber with artificial nocturnal lighting by green LEDs. Under these conditions, the only orientation cue available to the experimental birds during the tests was the geomagnetic field. The birds, however, were not disoriented and showed the seasonally appropriate migratory direction - southwest. By contrast, when placed in an OMF generated by large stationary coils in control experiments, the birds demonstrated clear disorientation.

"Our findings cast doubt on the prevailing photochemical theory of the magnetic compass of birds. They point to the existence of other components of the avian magnetoreception system that are sensitive to OMFs. These findings challenge the existing ideas about the biophysics and neurophysiology of magnetoreception."
Julia Bojarinova, one of the leading authors of the research, Associate Professor of the Department of Vertebrate Zoology at St Petersburg University

In further experiments, the scientists seek to explore other places on the bird's body where the compass receptor may 'hide itself.' According to available data, such a receptor might be situated either in the upper beak, or in the lagena - part of the bird's inner ear. To test new theories, experimenters will be able to utilise the developed miniaturised devices - only slight modifications of the magnetic coils will be required.

'At present, our interest is mainly fundamental. We are driven by curiosity, as the magnetic compass of birds remains an unsolved puzzle,' noted Kirill Kavokin. 'Nonetheless, there is a chance that the solution of this fundamental problem will enable the creation of new satellite-free navigation systems.' After all, birds always know where to fly, and they do not need satellites for navigation.

Credit: 
St. Petersburg State University

Study helps to identify medications which are safe to use in treatment of COVID-19

A recent study has found that there is no evidence for or against the use of non-steroidal anti-inflammatory drugs such as ibuprofen for patients with COVID-19.

The study, led by researchers at King's College London, also found other types of drugs, such as TNF blockers and JAK inhibitors, to be safe to use.

The researchers analysed 89 existing studies on other coronavirus strains, such as MERS and SARS, as well as the limited literature on COVID-19, to find out whether certain pain medications, steroids, and other drugs used in people already suffering from diseases should be avoided if they catch COVID-19.

Some patients, for example those with cancer, are already given immunosuppressive drugs - therapies that can lower the body's immune system - or immunostimulant drugs - therapies that boost it. If these patients then catch COVID-19, doctors need to know what medication to stop.

Dr Mieke Van Hemelrijck, a cancer epidemiologist and an author on the paper, said "This pandemic has led to challenging decision-making about the treatment of COVID-19 patients who were already critically unwell. In parallel, doctors across multiple specialties are making clinical decisions about the appropriate continuation of treatments for patients with chronic illnesses requiring immune suppressive medication."

The article has been published in ecancermedicalscience, an open access oncology journal, and is authored by researchers from King's College London and Guy's and St. Thomas NHS Foundation Trust, London.

There had been some speculation that non-steroidal anti-inflammatory drugs (NSAIDs) such as ibuprofen might make things worse for some COVID-19 patients, but the researchers did not find evidence to support this statement.

Other types of drugs such as TNF blockers and JAK inhibitors, used to treat arthritis or other forms of inflammation, were also found to be safe to use. Another class of drug known as anti-interleukin-6 agents is being investigated for helping to fight COVID-19, although there is no conclusive proof yet.

The researchers found that low amounts of prednisolone or tacrolimus therapy may be helpful in treating COVID-19.

Co-author Dr Sophie Papa, a medical oncologist and immunologist, said: "Current evidence suggests that low dose prednisolone (a steroid used to treat allergies) and tacrolimus therapy (an immunosuppressive drug given to patients who have had an organ transplant) may have beneficial impact on the course of coronavirus infections. However further investigation is needed."

As more people catch the disease, researchers will continue to investigate how it interacts with commonly used medications and make further guidance recommendations.

Credit: 
King's College London

Without additional investment, 11 million children are expected to die from cancer between now and 2050

Improving care for children with cancer worldwide will bring a triple return on investment and prevent millions of needless deaths, according to a new Commission report published today by The Lancet Oncology entitled Sustainable Care for Children with Cancer.

Without additional investment in childhood cancer care, new estimates produced for the report reveal that over 11 million children aged 14 years and younger are expected to die from cancer over the next 30 years worldwide. The vast majority of those--more than 9 million deaths (84%)--will be in low-income and lower-middle-income countries.

Authored by 44 of the world's leading oncologists, paediatricians, global health experts, and economists, the landmark report synthesises existing evidence with new modelling and economic analyses to demonstrate that--with investment in expanding worldwide coverage of achievable cost-effective interventions and strengthening health systems--millions of children's lives could be saved, with huge economic benefits that far exceed the costs.

"For too long, there has been a widespread misconception that caring for children with cancer in low- and middle-income countries (LMICs) is expensive, unattainable, and inappropriate because of competing health priorities. Nothing could be further from the truth", says Professor Rifat Atun from Harvard T.H. Chan School of Public Health, USA, Co-chair of the Commission.

He continues: "This report provides compelling evidence that improving outcomes for children with cancer is both feasible and a highly cost-effective investment for all countries rich and poor alike. Expanding access to achievable diagnostics, treatment, and supportive care, alongside strengthening health systems more widely, could prevent more than 6 million child deaths and bring almost US$2 trillion in economic benefits over the next 30 years. The time is right for a global push to expand coverage of care for children with cancer." [1]

Misconceptions driving global childhood cancer divide

Although recent decades have seen unprecedented progress in diagnosis, treatment, and supportive care, and many childhood cancers are now curable, striking inequalities in financing and access to care have resulted in worldwide disparities in the survival of children with cancer. Around 80% of children diagnosed with cancer in high-income countries will live for more than five years, yet fewer than 30% of children with cancer in LMICs have the same chance of survival, falling to just 8% in eastern Africa.

"The stark reality is worldwide inequity and a bleak picture for children with cancer in low- and middle-income countries. So far investment targeted to childhood cancer in developing countries has been minuscule. Yet childhood cancer is no longer complex, expensive, difficult to diagnose, or complicated to treat", says Dr Ramandeep Arora from the Max Super-Speciality Hospital in India, Commission co-author. "Outcomes for children in low- and middle-income countries could be dramatically improved by addressing key issues such as delayed diagnosis and lack of access to essential medicines. Our report lays out an evidence-based medical framework that countries can use for implementing, integrating, and scaling up care pathways for childhood cancer." [1]

Modelling the true burden of childhood cancer

In the report, the authors use new modelling, which, for the first time, explicitly takes into account weaknesses in health systems that contribute to under-reporting and under-diagnosis as well as population growth, to predict national and global estimates of cancer incidence, survival, and mortality for children aged 0-14 years in 200 countries from 2020 to 2050 (appendix pp 16-19).

Without additional investment to improve access to health-care services and cancer treatment, around 13.7 million children are expected to develop cancer worldwide between 2020 and 2050--over 10 million of these cases will be in LMICs, and around 939,000 in high-income countries (table 5). Of these, 6.1 million cases (45%) will be left undiagnosed and untreated (table 4). This is particularly true in south Asia and sub-Saharan Africa, where one in two new cases of cancer will be missed.

"Health systems are ill-prepared to meet this unprecedented challenge, particularly in low- and middle-income countries where treatment for paediatric cancer is often managed through charitable partners", says Dr Agnes Binagwaho, Vice-Chancellor of the University of Global Health Equity in Rwanda and Commission co-author. "The situation is particularly dire in many countries in sub-Saharan Africa, where health systems are fragile and the population of children is expected to double from 320 million in 2015 to more than 720 million by 2050." [1]

Scaling up achievable interventions could save US$3 for every dollar spent

The Commission also analysed the potential return on investment of simultaneous comprehensive scale-up of three health and social interventions--access to primary care and referral to specialist care, treatment (eg, chemotherapy, radiotherapy, surgery), and supportive services to reduce treatment abandonment. They found that over 6 million child deaths from cancer could be prevented worldwide over the next 30 years--more than half (56%) of the total 11 million deaths otherwise projected.

This is equivalent to a gain of 318 million years of healthy life for children who are treated successfully and survive into adulthood, with the health benefits expected to be greatest in LMICs (figure 9; table 7).

The report also highlights a strong economic case for investing in expanding coverage of all three interventions globally. Cumulative treatment costs of US$594 billion will be offset, the report predicts, by global lifetime productivity gains of US$2,580 billion between 2020 and 2050--producing a net benefit of US$1,986 billion, a figure that includes countries of all income levels (figures 11 and 12).
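The report's headline numbers hang together arithmetically; a quick back-of-the-envelope check using the figures quoted in the text above:

```python
# Back-of-the-envelope check of the Commission's headline figures.
# All monetary values in billions of US$; numbers are those quoted in the report.

treatment_costs = 594        # cumulative global treatment costs, 2020-2050
productivity_gains = 2580    # cumulative lifetime productivity gains, 2020-2050

net_benefit = productivity_gains - treatment_costs
print(net_benefit)           # 1986 -> US$1,986 billion, i.e. roughly US$2 trillion

# The "US$3 for every dollar spent" figure: US$20 billion/year for 30 years.
total_investment = 20 * 30   # 600
print(round(net_benefit / total_investment, 1))  # 3.3 -> roughly a 3:1 return

# Share of cases projected to go undiagnosed and untreated: 6.1M of 13.7M cases.
print(round(6.1 / 13.7, 2))  # 0.45 -> the 45% quoted in the report
```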

"Our findings indicate that US$20 billion of funding per year over a 30-year period could bring a return of US$3 for every dollar spent. This should reassure policymakers that a sizeable return on investment is realistic and feasible", says Dr Carlos Rodriguez-Galindo from St Jude Children's Research Hospital, USA, Co-chair of the Commission. "Without this investment to halt millions of needless deaths from childhood cancer, we are unlikely to reach the Sustainable Development Goals." [1]

A global push for sustainable cancer care

Given the growing burden of childhood cancer, the Commission urges that six specific and deliberate actions at country, regional, and global levels are taken to tackle the unacceptable inequalities between cancer services in rich and poor countries and to achieve sustainable care for children with cancer wherever they live:

1) Make childhood cancer an integral part of essential benefits packages when expanding universal health coverage

2) Develop fully costed national cancer control plans for LMICs

3) End out-of-pocket costs for children with cancer to prevent treatment abandonment

4) Establish national and regional cancer networks to increase access to effective services

5) Expand the quality and quantity of population-based cancer registries that include data on childhood cancers

6) Invest in research and innovation in LMICs, supported by a new global coalition fund of US$100 million per year

"Improving childhood cancer outcomes will save millions of lives and generate US$2 trillion in economic benefits. But this will require strong political leadership, global solidarity, collective action, inclusive participation of all major stakeholders, and alignment of national and global efforts to expand access to effective and sustainable care for children with cancer", says Atun. "Only then will all children who are diagnosed with cancer be able to enjoy equitable access to optimal care, better health, the chance to reach a fulfilling and productive adulthood, and the dignity that they deserve."[1]

Credit: 
The Lancet

Tree rings could pin down Thera volcano eruption date

Charlotte Pearson's eyes scanned a palm-sized chunk of ancient tree. They settled on a ring that looked "unusually light," and she made a note without giving it a second thought. Three years later, and armed with new methodology and technology, she discovered that the light ring might mark the year that the Thera volcano on the Greek island of Santorini erupted over the ancient Minoan civilization. The date of the eruption, which is one of the largest humanity has ever witnessed, has been debated for decades.

Pearson, a University of Arizona assistant professor of dendrochronology and anthropology, is lead author of a paper, published in the Proceedings of the National Academy of Sciences, in which she and her colleagues have used a new hybrid approach to assign calendar dates to a sequence of tree rings, which spans the period during which Thera erupted, to within one year of a calendar date. This allows them to present new evidence that could support an eruption date around 1560 B.C.

Filling the Gaps

"In every tree ring, you have this time capsule that you can unpack," Pearson said.

Trees grow in accordance with the conditions of their local environment. Each year, trees produce a new layer of concentric growth, called a tree ring, which can record information about rainfall, temperature, wildfires, soil conditions and more. Trees can even record solar activity as it waxes and wanes.

When a sequence of rings from trees of various ages are overlapped and added together, they can span hundreds or thousands of years, providing insight about past climate conditions and context for concurrent civilizations.

"The longest chronology in the world stretches back 12,000 years. But in the Mediterranean, the problem is that we don't have a full, continuous record going back to the time of Thera," Pearson said. "We have recorded the last 2,000 years very well, but then there's a gap. We have tree rings from earlier periods, but we don't know exactly which dates the rings correspond to. This is what's called a 'floating chronology.'"

Filling this gap could help pin down the Thera eruption date and paint a climatic backdrop for the various civilizations that rose and fell during the Bronze and Iron ages, which together spanned between 5,000 and 2,500 years ago.

"Until you can put an exact year on events on a scale that makes sense to people - one year - it's not quite as powerful," Pearson said. "This study is really about taking (my co-author and tree ring lab research professor) Peter Kuniholm's chronology that he's put together over 45 years of work and dating it in a way not possible before. Most importantly, it is fixed in time, just as if we had filled our tree ring gap."

A Hybrid Approach

Since the inception of the UArizona Laboratory of Tree-Ring Research in 1937, an assortment of tree ring samples from all over the world accumulated in less-than-ideal conditions beneath Arizona Stadium. But since the completion of the university's upgraded Bryant Bannister Tree Ring Building in 2013, the curation team, led by Peter Brewer, has been relocating, organizing and preserving samples for future research.

"This is the collection that founded the field of tree ring research, and it's by far the world's largest," Brewer said. "Researchers come from all over to use our collection."

"It's just crammed full of the remains of ancient forests and archaeological sites, which no longer exist, and it contains wood samples that were fundamental in the growth of the discipline of dendrochronology," Pearson said.

The collection includes timbers from the Midas Mound Tumulus at Gordion in Turkey - a giant tomb of a man who was likely Midas' father or grandfather. From timbers like these, Kuniholm has been building a Mediterranean tree ring chronology for nearly half a century. In total, Kuniholm's record from the B.C. period spans over 2,000 years and includes trees growing downwind of the Thera eruption, making it key to the team's research.

Despite the length of this chronology, it remained undated. To pin it down, the team decided to try something new.

When cosmic rays from space enter Earth's atmosphere, neutrons collide with nitrogen atoms to create a radioactive form of carbon, called carbon-14, which spreads around the planet. All life on Earth, including trees, takes up this carbon-14, and because each tree ring locks away a measurement of the carbon-14 present during the year it grew, tree rings hold patterns showing how atmospheric carbon-14 changed over time. These patterns in tree rings around the world should match.

Pearson and her team used the patterns of carbon-14 captured in the Gordion tree rings to anchor the floating chronology to similar patterns from other calendar dated tree ring sequences.

"It's a new way to anchor floating tree ring chronologies that makes use of the annual precision of tree rings," Pearson said.
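The anchoring step Pearson describes can be illustrated with a toy "wiggle-matching" computation: slide the floating ring sequence along a calendar-dated reference series and keep the offset where the annual carbon-14 patterns agree best. This is only a schematic sketch on synthetic data (the series, lengths, and noise level here are invented), not the team's actual method or dataset.

```python
import numpy as np

# Synthetic stand-in for a calendar-dated carbon-14 reference record.
rng = np.random.default_rng(0)
reference = np.cumsum(rng.normal(0, 1.0, 500))

# A "floating" 120-year sequence: a noisy copy of the reference,
# starting at a (pretend-unknown) offset of 173 years.
true_offset = 173
floating = reference[true_offset:true_offset + 120] + rng.normal(0, 0.1, 120)

def best_alignment(reference, floating):
    """Return the offset (in years) minimizing mean squared mismatch."""
    n = len(floating)
    errors = [np.mean((reference[i:i + n] - floating) ** 2)
              for i in range(len(reference) - n + 1)]
    return int(np.argmin(errors))

print(best_alignment(reference, floating))  # recovers 173 for this toy data
```

Real wiggle-matching works on measured carbon-14 values with laboratory uncertainties, but the core idea is the same: only one alignment makes the year-by-year wiggles line up.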

To validate their findings, the team turned to the calendar-dated rings of high-elevation bristlecone pines from western North America that lived at the same time as the Gordion trees.

"When there are large volcanic eruptions, it often scars bristlecone by freezing during the growing season, creating a frost ring," said second author Matthew Salzer, research scientist at the tree ring laboratory. "Then we compared the dates of the frost rings with what was going on in the Mediterranean trees, which respond to volcanoes by growing wider rings. And it worked. It showed that the wide rings in the Mediterranean chronology occurred in the same years as the frost rings in the bristlecone. We took that to be confirmation that the dating was probably correct."

The team then used a new piece of technology in the lab, an X-ray fluorescence machine, to scan the wood for chemical changes.

"We scanned the entire period across when Thera is known to have happened," Pearson said, "and we detected a very slight depletion in calcium, right where I saw this lighter ring years ago."

While it's a slight fluctuation, it is significant and only occurs at one point in the years around 1560 B.C.

"We put that in the paper and tentatively suggest it's a possible date for Thera," Pearson said.

Something changed the chemistry of the environment in which the tree grew. Acid deposition from a volcano is one possibility; wildfire is another. But because the date coincides with other tree ring markers for a major eruption, Pearson says it's worthy of further exploration.

"I think to do good science you have to investigate everything and keep an open mind until sufficient data comes together," Pearson said. "This is another little piece of the puzzle."

Credit: 
University of Arizona

Tiny optical cavity could make quantum networks possible

image: A nanophotonic cavity created by the Faraon lab.

Image: 
Faraon lab/Caltech

Engineers at Caltech have shown that atoms in optical cavities--tiny boxes for light--could be foundational to the creation of a quantum internet. Their work was published on March 30 by the journal Nature.

Quantum networks would connect quantum computers through a system that also operates at a quantum, rather than classical, level. In theory, quantum computers will one day be able to perform certain functions faster than classical computers by taking advantage of the special properties of quantum mechanics, including superposition, which allows quantum bits to store information as a 1 and a 0 simultaneously.

As they can with classical computers, engineers would like to be able to connect multiple quantum computers to share data and work together--creating a "quantum internet." This would open the door to several applications, including solving computations that are too large to be handled by a single quantum computer and establishing unbreakably secure communications using quantum cryptography.

In order to work, a quantum network needs to be able to transmit information between two points without altering the quantum properties of the information being transmitted. One current model works like this: a single atom or ion acts as a quantum bit (or "qubit") storing information via one of its quantum properties, such as spin. To read that information and transmit it elsewhere, the atom is excited with a pulse of light, causing it to emit a photon whose spin is entangled with the spin of the atom. The photon can then transmit the information entangled with the atom over a long distance via fiber optic cable.

It is harder than it sounds, however. Finding atoms that you can control and measure, and that also aren't too sensitive to magnetic or electric field fluctuations that cause errors, or decoherence, is challenging.

"Solid-state emitters that interact well with light often fall victim to decoherence; that is, they stop storing information in a way that's useful from the perspective of quantum engineering," says Jon Kindem (MS '17, PhD '19), lead author of the Nature paper. Meanwhile, atoms of rare-earth elements--which have properties that make the elements useful as qubits--tend to interact poorly with light.

To overcome this challenge, researchers led by Caltech's Andrei Faraon (BS '04), professor of applied physics and electrical engineering, constructed a nanophotonic cavity, a beam that is about 10 microns in length with periodic nano-patterning, sculpted from a piece of crystal. They then identified a rare-earth ytterbium ion in the center of the beam. The optical cavity allows them to bounce light back and forth down the beam multiple times until it is finally absorbed by the ion.

In the Nature paper, the team showed that the cavity modifies the environment of the ion such that whenever it emits a photon, more than 99 percent of the time that photon remains in the cavity, where scientists can then efficiently collect and detect that photon to measure the state of the ion. This results in an increase in the rate at which the ion can emit photons, improving the overall effectiveness of the system.

In addition, the ytterbium ions are able to store information in their spin for 30 milliseconds, long enough for light to carry the information across the continental United States. "This checks most of the boxes. It's a rare-earth ion that absorbs and emits photons in exactly the way we'd need to create a quantum network," says Faraon. "This could form the backbone technology for the quantum internet."
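That distance claim holds up to a quick back-of-envelope check (ours, not the paper's): light in silica fiber travels at roughly two-thirds its vacuum speed, so in 30 milliseconds it covers several thousand kilometers.

```python
# Back-of-envelope estimate (our own, not from the Nature paper):
# distance light covers in optical fiber during the 30 ms spin storage time.
c_fiber_km_per_s = 200_000   # ~2/3 of c, a typical value for silica fiber
storage_time_s = 0.030       # ytterbium spin storage time from the study
distance_km = c_fiber_km_per_s * storage_time_s
print(distance_km)           # 6000.0 km; the continental US is ~4,500 km wide
```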

Currently, the team's focus is on creating the building blocks of a quantum network. Next, they hope to scale up their experiments and actually connect two quantum bits, Faraon says.

Credit: 
California Institute of Technology

Alcohol consumption by fathers before conception could negatively impact child development

image: Photo shows, from left to right, Riley Bottom, Kelly Huffman, and Kathleen Conner.

Image: 
Huffman lab, UC Riverside.

RIVERSIDE, Calif. -- Scientists at the University of California, Riverside, have explored the relationship between parental alcohol consumption -- before conception in the case of fathers and during pregnancy in the case of mothers -- and offspring development.

In a paper published in Alcoholism: Clinical and Experimental Research, the researchers report that when alcohol-exposed male mice mated with alcohol-naïve females, the offspring displayed significant deficits in brain development. Specifically, the neocortex, the most complex part of the mammalian brain and the one responsible for higher-order cognitive and behavioral function, had patterning deficits in which abnormal gene expression led to miswiring of connections. Although neither these mice nor their mothers had ever been exposed to alcohol, their brains showed changes consistent with a mouse model of Fetal Alcohol Spectrum Disorders, or FASD.

"People have known about the dangers of maternal drinking during pregnancy for years; however, the safety of paternal drinking while trying to conceive has barely been considered," said Kelly Huffman, an associate professor of psychology who led the study and whose lab generated the FASD mouse model. "Our research shows that fathers' exposure to alcohol leading up to conception can have deleterious effects on the child's brain and behavioral development."

In a second paper, published in Neuropharmacology, Huffman's team reports that when female mice were given choline, an essential nutrient, along with alcohol during their pregnancies, the negative outcomes associated with prenatal alcohol exposure, such as smaller body weight, brain weight, and abnormalities in the anatomy of the neocortex, were reduced in the offspring. This suggests choline supplementation could prevent the adverse outcomes associated with prenatal alcohol exposure.

"Our work shows that prenatal choline supplementation, when administered at the time of prenatal alcohol exposure, improves abnormal brain and behavioral development in offspring," Huffman said. "It rescues some of the phenotypes associated with FASD."

Sins of the father

In the first study, male mice consumed alcohol for approximately two to three weeks before mating with alcohol-naïve females. Huffman's team found this preconceptual paternal alcohol exposure altered neocortical gene expression and connectivity in their offspring. The offspring also demonstrated atypical features such as increased anxiety or hyperactivity and reduced motor function, consistent with some documented behavior patterns of children born to alcoholic fathers.

"Fathers who consistently consume moderate to high amounts of alcohol leading up to conception may negatively impact offspring development due to the exposure of the paternal sperm," Huffman said. "In our previous study, we described how the paternal germ line specifically can transmit heritable changes through multiple generations after a single prenatal alcohol exposure. Clearly, the paternal environment before conception is critical for healthy offspring development."

Additionally, the team found male offspring generally seem to be more adversely affected than female offspring by paternal alcohol exposure in terms of increased hyperactivity, impaired coordination, and impaired short-term motor learning abilities.

The study is the first to examine the effects of preconceptual paternal alcohol exposure on the gross anatomical development of the neocortex, including genetic patterning and circuit development, coupled with extensive behavioral analyses in the affected offspring. Huffman's team plans to extend the mouse study to investigate whether the effects of paternal alcohol consumption on the offspring are transmitted to subsequent generations.

Huffman was joined in the research by graduate students Kathleen E. Conner and Riley T. Bottom.

Nutrient to the rescue

Depending on maternal age, up to 18% of pregnant women in the United States report alcohol consumption during their pregnancies. Gestational or prenatal alcohol exposure can produce problematic deficits in offspring. In mice, prenatal alcohol exposure, via maternal drinking, results in gross developmental abnormalities, including decreased body weight, brain weight, and brain size. Also, the exposure causes profound abnormalities in the patterning of the offspring's neocortex and the resulting circuitry, or connections, necessary for precise function.

In the second study, Huffman's team exposed pregnant mice to 25% alcohol, the usual dose for the FASD model, as well as about 640 milligrams per liter of choline chloride supplement throughout the pregnancy. Her team's goal was to test potential rescue effects of choline supplementation on abnormal neocortical and behavioral development induced by prenatal alcohol exposure.

Choline, a vitamin-like essential nutrient, is a methyl group donor and is crucial for proper brain development as it generates the methyl group that attaches to DNA and affects gene expression. Given the transgenerational effects of prenatal alcohol consumption discovered by the Huffman lab, Huffman's team believed co-administration of choline with alcohol could mitigate the deleterious effects of the exposure.

"Our findings suggest that providing methyl group donors, such as choline, to alcoholic women during pregnancy could be effective in reducing the extent of the damage that prenatal alcohol exposure can cause," said Bottom, the first author of the research paper. "This could possibly reduce the multigenerational transmission of FASD in our prenatal alcohol exposure model."

Huffman and Bottom were joined in the study by Charles W. Abbott III, a former graduate student in Huffman's lab. This work is a major component of Bottom's dissertation research.

Credit: 
University of California - Riverside

Boosts in serum EPA levels from prescription fish oil drive heart benefits

Higher levels of the omega-3 fatty acid eicosapentaenoic acid (EPA) found in the blood--and not a decrease in triglyceride levels as originally thought--appear to explain the striking reductions in cardiovascular events and deaths seen among people taking 4 grams daily of the prescription fish oil, icosapent ethyl, according to findings from a REDUCE-IT substudy presented at the American College of Cardiology's Annual Scientific Session Together with World Congress of Cardiology (ACC.20/WCC).

"A major missing piece of the puzzle, and what many clinicians want to know, is how icosapent ethyl actually works to produce such dramatic cardiovascular risk lowering," said Deepak L. Bhatt, MD, MPH, executive director of interventional cardiovascular programs at Brigham and Women's Hospital, professor of medicine at Harvard Medical School and lead author of the current study. "On-treatment EPA levels achieved via the drug strongly correlated with lower rates of cardiovascular events, heart attack, stroke, coronary revascularization procedures, unstable angina, sudden cardiac arrest, new heart failure, or death for any reason."

REDUCE-IT enrolled 8,179 patients at 473 sites in 11 countries who had elevated cardiovascular risk and were already being treated with statins. The trial found that taking a high dose of icosapent ethyl cut the rate of a first nonfatal heart attack, stroke, cardiovascular death, procedure for coronary artery disease such as stenting, or hospitalization for unstable angina by 25%, and the combined rate of first and subsequent events by 30%, over a median of 4.9 years of follow-up. Most patients in the trial were already on antiplatelet therapy, ACE-inhibitors/ARBs, beta blockers, aspirin and statins, which, researchers said, provided reassurance that icosapent ethyl, by itself, offered separate and incremental benefit.

Prior to REDUCE-IT, icosapent ethyl was approved by the U.S. Food and Drug Administration (FDA) for people with triglycerides above 500 mg/dL, so Bhatt said many people understandably thought the study drug reduced cardiovascular events primarily by lowering triglycerides. However, Bhatt said the current study, which looked at the association between blood serum levels of EPA achieved on icosapent ethyl and cardiovascular outcomes, found the lion's share of the drug's remarkably large cardiovascular benefit is driven by achieved EPA levels.

"Changes in triglyceride levels and other cardiovascular risk markers, including LDL, HDL, apoB and CRP, appear to be responsible for a significantly lesser portion of the overall observed benefit," he said. "I think this finding is going to usher in a whole new era of cardiovascular therapies. We are, in a sense, where we were with statins when the first one came out."

As part of the analysis, Bhatt and his team first looked at EPA levels prior to randomization to determine whether the drug worked differently based on a person's initial baseline EPA level, which may reflect a diet high in fish consumption or genetics. But they found that regardless of patients' initial serum EPA levels, they derived a similar and large degree of cardiovascular benefit. Data on baseline EPA were missing for 14% of patients, but the baseline characteristics and outcomes were similar between patients with and without missing data.

The researchers then examined achieved EPA levels on the drug compared with placebo, grouping patients into thirds, or tertiles, ranging from the lowest to highest levels of EPA and averaged across visits. Achieved EPA within the icosapent ethyl group was strongly associated with cardiovascular events, and each of the tertiles showed a significant relative risk reduction in cardiovascular events.
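The mechanics of a tertile analysis like the one described above can be sketched as follows. Every number here is invented for illustration (none of REDUCE-IT's actual EPA levels or event rates are reproduced); the point is only how patients are split into thirds by achieved EPA and how event rates are compared across those thirds.

```python
import numpy as np

# Hypothetical cohort: achieved serum EPA levels and whether each
# patient had a cardiovascular event. Risk is made to fall as EPA rises.
rng = np.random.default_rng(1)
epa = rng.uniform(20, 200, 900)                 # achieved EPA (invented units)
events = rng.random(900) < (0.25 - epa / 1000)  # invented risk model
placebo_rate = 0.25                             # invented comparator rate

# Split patients into thirds (tertiles) by achieved EPA level.
cuts = np.quantile(epa, [1/3, 2/3])
tertile = np.digitize(epa, cuts)                # 0 = lowest, 2 = highest

for t in range(3):
    rate = events[tertile == t].mean()
    print(f"tertile {t}: event rate {rate:.3f}, "
          f"relative risk vs placebo {rate / placebo_rate:.2f}")
```

With this construction the highest-EPA tertile shows the lowest event rate, mirroring the qualitative pattern the substudy reports; a real analysis would add confidence intervals and adjust for covariates.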

They examined on-treatment EPA levels from lowest to highest and found significant associations with all measured cardiovascular outcomes. "The higher the EPA level in their blood, the lower the rates of the different cardiovascular events, cardiovascular deaths and even total mortality," Bhatt said.

Overall, the drug significantly increased serum EPA levels by 386% from baseline to one year compared with placebo. Levels of docosahexaenoic acid (DHA), another omega-3 fatty acid found in oily fish like salmon, decreased by 2.9%, which Bhatt said suggests the cardiovascular benefits are clearly from EPA and not DHA.

Researchers also examined the relationship between on-treatment EPA levels and several other cardiovascular outcomes, though analyses for bleeding and atrial fibrillation were not yet available. While there were no significant reductions in heart failure in REDUCE-IT, among patients with the highest on-treatment EPA levels, there was a significant reduction in hospitalizations for new heart failure with the drug versus placebo, which Bhatt said is quite remarkable. There were also significant associations between on-treatment EPA levels and lower risks of sudden cardiac death and cardiac arrest, further validating what was seen in the overall trial.

Experts don't know why some people are able to achieve higher serum EPA levels and others are not. Bhatt and his team accounted for whether patients took the drug, but there may be other influencing factors, such as how someone metabolizes EPA, their body size or their genetics--this needs further study.

Bhatt said the EPA levels attained on the drug are well beyond what can be achieved with diet or dietary supplements. He said that the study drug is a unique prescription medicine and that the results do not apply to other omega-3 products or to dietary supplement formulations, which are not approved or strictly regulated by the FDA and which, per the FDA, have not demonstrated reliable or consistent cardiovascular risk reduction.

Based on the data from REDUCE-IT, the FDA in December 2019 expanded the icosapent ethyl label to be used as an add-on to maximally tolerated statin therapy to help reduce the risk of heart attack, stroke, coronary revascularization and unstable angina requiring hospitalization in adult patients with triglyceride levels ≥150 mg/dL and either established cardiovascular disease or diabetes mellitus plus two or more additional risk factors for cardiovascular disease. Based on this, Bhatt said icosapent ethyl stands to benefit over 12 million patients in the U.S. alone.

Credit: 
American College of Cardiology

E-cigarettes more effective than counseling alone for smoking cessation

Smokers who received smoking cessation counseling and used electronic cigarettes (e-cigarettes) containing nicotine were more than twice as likely to successfully quit smoking compared to those who received counseling but did not use e-cigarettes, in a clinical trial presented at the American College of Cardiology's Annual Scientific Session Together with World Congress of Cardiology (ACC.20/WCC). Researchers cautioned, however, that the health effects of e-cigarettes are unknown, and they should not be used for any purpose other than smoking cessation.

At 12 weeks, 21.9% of participants given nicotine-containing e-cigarettes had quit smoking, 17.3% of participants given non-nicotine e-cigarettes had quit and 9.1% of participants given only counseling had quit. Overall, those using nicotine-containing e-cigarettes were 2.4 times more likely to quit than those who did not vape at all.
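The 2.4-fold figure follows directly from the quit rates reported above; a quick unadjusted check reproduces it (the trial's published estimate may include statistical adjustment, but the raw rates give the same number):

```python
# Risk ratio implied by the reported 12-week quit rates.
nicotine_quit = 0.219    # nicotine e-cigarettes plus counseling
counseling_quit = 0.091  # counseling alone
risk_ratio = nicotine_quit / counseling_quit
print(round(risk_ratio, 1))  # 2.4, matching the reported figure
```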

"These findings show that nicotine e-cigarettes are effective for smoking cessation in the short term," said Mark J. Eisenberg, MD, MPH, a cardiologist at the Jewish General Hospital, professor of medicine at McGill University and the study's lead author. "Vaping with counseling is more effective than counseling alone, although it's not a magic bullet for smoking cessation."

Vaping has rapidly become more popular in recent years. Eisenberg said that there is evidence that some e-cigarette users try vaping to cut down on smoking, though only a few clinical trials have assessed the use of e-cigarettes as a smoking cessation tool, with somewhat mixed results. Further research is needed to elucidate potential health impacts of vaping, but Eisenberg said that they should be avoided by youth and anyone who does not smoke.

The new trial was designed to reflect the real-world conditions lifelong smokers face when attempting to quit. Researchers enrolled 376 participants at 17 sites in Canada. Participants were an average of 53 years old, had smoked for an average of 35 years and smoked an average of 21 cigarettes a day before the study. All participants were motivated to quit. Ninety-one percent had previously attempted to quit but failed, and the majority had previously tried smoking cessation medications or behavioral therapy without success.

One-third of the participants were assigned to receive e-cigarettes containing nicotine, one-third received e-cigarettes without nicotine and one-third did not receive an e-cigarette. All participants received about 100 minutes of counseling over the course of the 12 weeks that included guidance on quitting smoking. Those receiving e-cigarettes were instructed to vape as much as they felt they needed.

Participants reported their progress via three phone calls and two clinic visits during the 12-week treatment period. At clinic visits, participants underwent a breath test for carbon monoxide to verify whether or not they had smoked. At 12 weeks, those who reported not smoking a cigarette (even one puff) in the past week, and passed the breath test, were counted as having quit. For those who did not quit, researchers considered whether or not participants had reduced the total number of cigarettes smoked per day compared to the number they were smoking at the start of the study.

Eisenberg said that the quit rates among e-cigarette users were relatively high in the broader context of smoking cessation research, especially given that participants were heavy smokers who had used tobacco for years.

"While these are not clinical outcomes like death or lung cancer rates, the results are still quite impressive," he said.

Compared to baseline smoking rates of 21 cigarettes per day, participants who received nicotine e-cigarettes reduced their daily cigarette consumption by 13 cigarettes per day compared to 11 among participants receiving non-nicotine e-cigarettes and seven among those receiving counseling alone.

The study saw few adverse events. One participant assigned to receive nicotine-containing e-cigarettes experienced an exacerbation of chronic obstructive pulmonary disease, though it is unclear whether this was attributable to vaping.

"We desperately need information on whether e-cigarettes are effective for smoking cessation, but [we] also [need] safety data, as well," Eisenberg said. He suggested it may be wise to limit access to e-cigarettes to people who are actively attempting to quit smoking under a physician's guidance.

Researchers stopped enrollment early (after reaching 77% of target enrollment) due to a prolonged delay in the manufacturing of e-cigarettes used in the study. The study was underway before reports of vaping-linked lung injuries, and no such lung injuries occurred among study participants.

Researchers will continue to collect data for one year, with the last follow-up to be completed in September 2020. While e-cigarettes appear to be effective for smoking cessation in the short term, the one-year follow-up will show whether these effects persist over time.

Credit: 
American College of Cardiology

The placebo effect and psychedelic drugs: tripping on nothing?

There has been a lot of recent interest in the use of psychedelic drugs to treat depression. A new study from McGill suggests that, in the right context, some people may experience psychedelic-like effects from placebos alone. The researchers reported some of the strongest placebo effects (effects produced by a "fake," inactive medication) on consciousness in the literature on psychedelic drugs. Indeed, 61% of the participants in the experiment reported some effect after consuming the placebo.

"The study reinforces the power of context in psychedelic settings. With the recent re-emergence of psychedelic therapy for disorders such as depression and anxiety, clinicians may be able to leverage these contextual factors to obtain similar therapeutic experiences from lower doses, which would further improve the safety of the drugs," said Jay Olson, a Ph.D. candidate in McGill's Department of Psychiatry and the lead author on the research paper that was recently published in Psychopharmacology.

Setting the mood

Participants, who were expecting to take part in a study of the effects of drugs on creativity, spent four hours together in a room that had been set up to resemble a psychedelic party, with paintings, coloured lights and a DJ. To make the context seem credible and hide the deception, the study also involved ten research assistants in white lab coats, psychiatrists, and a security guard.

The 33 participants had been told they were being given a drug which resembled the active ingredient in psychedelic mushrooms and that they would experience changes in consciousness over the 4-hour period. In reality, everyone consumed a placebo. Among the participants were several actors who had been trained to slowly act out the effects of the ostensible drug. The researchers thought that this would help convince the participants that everyone had consumed a psychedelic drug and might lead them to experience placebo effects.

Strong effects for a placebo

When asked near the end of the study, the majority (61%) of the participants reported some effect of the drug, ranging from mild changes to effects resembling taking a moderate or high dose of an actual drug, though there was considerable individual variation. For example, several participants stated that they saw the paintings on the walls "move" or "reshape" themselves. Others described themselves as feeling "heavy... as if gravity [had] a stronger hold", and one had a "come down" before another "wave" hit her. Several participants reported being certain that they had taken a psychedelic drug.

"These results may help explain 'contact highs' in which people experience the effects of a drug simply by being around others who have consumed it," says Samuel Veissière, a cognitive anthropologist who teaches in McGill's Department of Psychiatry and supervised the study. "More generally, our study helps shed light on the 'placebo boosting' component inherent in all medical and therapeutic intervention, and the social influences that modulate these enhancing effects. Placebo effects may have been under-estimated in psychedelic studies. The current trend towards 'micro-dosing' (consuming tiny amounts of psychedelic drugs to improve creativity), for example, may have a strong placebo component due to widespread cultural expectations that frame the response."

Credit: 
McGill University

New explanation for sudden heat collapses in plasmas can help create fusion energy

image: PPPL physicist Stephen Jardin with figure from paper.

Image: 
Photo and composite by Elle Starkman/PPPL Office of Communications.

Scientists seeking to bring the fusion that powers the sun and stars to Earth must deal with sawtooth instabilities -- up-and-down swings in the central pressure and temperature of the plasma that fuels fusion reactions, similar to the serrated blades of a saw. If these swings are large enough, they can lead to the sudden collapse of the entire discharge of the plasma. Such swings were first observed in 1974 and have so far eluded a widely accepted theoretical explanation of the experimental observations.

Consistent with observations

Researchers at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) have proposed a new theory to explain the swings that occur in doughnut-shaped tokamaks, or fusion facilities. The theory, created through high-fidelity computer simulations, appears consistent with observations made during tokamak experiments, the researchers said. Understanding the process could prove vital to next-generation fusion facilities such as ITER, the international experiment under construction in France to demonstrate the practicality of fusion power.

Fusion combines light elements in the form of plasma -- the hot, charged state of matter composed of free electrons and atomic nuclei -- generating massive amounts of energy. Scientists seeking to replicate fusion on Earth aim to harness it as a virtually inexhaustible supply of safe and clean power for generating electricity.

The recent findings demonstrate that when the pressure in the core of the plasma reaches a certain point, other instabilities can be excited that produce the sudden pressure and temperature drops. These instabilities create jumbled -- or stochastic -- magnetic fields in the core of the plasma that cause the collapse, said physicist Stephen Jardin, lead author of a paper describing the process in Physics of Plasmas and highlighted in a featured American Institute of Physics publication called "SciLight."

"Most tokamak discharges exhibit sawteeth," Jardin said, "and we're trying to provide the theory of the physics behind them."

The new findings depart sharply from a long-held theory that the swings are caused by an instability that leads to magnetic reconnection -- the breaking apart and snapping together of the magnetic field lines in plasma. "That theory has been around for over 40 years," Jardin said.

Motivating the new theory

Motivating the new theory is previous PPPL research that demonstrates how the instability that was thought to lead to magnetic reconnection can, in fact, self-stabilize the plasma. It does this by producing a localized voltage that prevents the current in the core of the plasma from peaking sufficiently to be subject to magnetic reconnection.

The new explanation holds that even though the magnetic reconnection is suppressed, an increase of heat in the core of the plasma can excite localized instabilities that act together to flatten the pressure and temperature during the sawtooth cycle. Simulations produced by codes developed by Jardin and PPPL physicist Nate Ferraro, a coauthor of the paper, demonstrate this process. The new instabilities can grow very fast, consistent with the rapid collapse of heat seen in experiments that the traditional theory cannot explain.

This advanced model provides a new way to understand sawtooth phenomena. Looking ahead, the scientists want to explore the applicability of the model to tasks such as describing the evolution of "monster sawteeth" and using high-powered radio-frequency antennas to control sawtooth swings. "We want to develop a simulation model of a whole tokamak plasma," Jardin said, "and this new theory of the sawteeth is an important part of the effort."

Credit: 
DOE/Princeton Plasma Physics Laboratory

Moffitt identifies novel therapeutic targets in cutaneous squamous cell carcinoma

TAMPA, Fla. - Cutaneous squamous cell carcinoma (cuSCC) is the second most commonly diagnosed malignancy in the United States, with approximately 700,000 new cases each year. Cumulative exposure to ultraviolet light is the primary environmental risk factor leading to cuSCC. While the majority of cuSCC cases are easily managed and treated, a small percentage of patients do not respond to treatment and have poor outcomes. Researchers at Moffitt Cancer Center want to devise better therapeutic strategies for this patient population by improving their understanding of how cuSCC develops. In a new article published online ahead of print in the journal Cancer Research, Moffitt scientists report on their identification of potential therapeutic targets for cuSCC.

The Flores Lab has been focused on identifying therapeutic targets for p53 mutant cancers. In this research, the team focused on the activity of the protein TAp63 - a tumor suppressor protein that is part of the p53 family of tumor suppressors. This family of proteins normally functions to inhibit tumor development and growth. TAp63 is also known to compensate for p53 loss and to play an important role in the creation of small segments of RNA called microRNAs that are present in the human genome. MicroRNAs contribute to many biological processes and are often deregulated in different diseases, including cancer.

The researchers wanted to determine whether the ability of TAp63 to function as a tumor suppressor and regulator of microRNAs could impact the development of cuSCC. They treated mice that were missing the TAp63 gene, along with normal mice, with ultraviolet light and discovered that mice without the TAp63 gene developed a significantly higher number of cuSCCs than normal mice. These observations demonstrate that TAp63 also acts as a tumor suppressor protein in the skin.

Flores's team then performed a series of comparative studies in both mouse and human cuSCC tumor samples and discovered that two microRNAs - miR-30c-2* and miR-497 - were expressed at much lower levels in tumors that lacked TAp63. When these microRNAs were reintroduced into tumor cells, the development of cuSCCs was significantly reduced. These observations suggest that miR-30c-2* and miR-497, similar to TAp63, may function by inhibiting the growth and development of cuSCC. The researchers verified this hypothesis by showing that miR-30c-2* contributes to cell death, while miR-497 inhibits cell growth and proliferation; therefore, when these miRNAs are missing, cuSCC growth can proceed unchecked.

The team also performed studies to assess how downstream targets of the microRNAs impacted cuSCC growth. They demonstrated that seven genes that are targets of the microRNAs are commonly deregulated in cuSCC tumors, including the AURKA gene. They confirmed the importance of AURKA by showing that inhibition of AURKA was able to reduce cuSCC growth in mouse models, suggesting that AURKA may be an appropriate therapeutic target.

"Our study establishes TAp63 as an essential suppressor of UV-induced cuSCC and reveals a previously undescribed functional network of microRNAs and targeted mRNAs," said Elsa Flores, Ph.D., chair of Moffitt's Department of Molecular Oncology and leader of the Cancer Biology and Evolution Program. "Given the lack of FDA-approved targeted therapies for advanced cuSCC, our study provides preclinical evidence for the use of miR-30c-2*/miR-497 delivery or AURKA inhibition as an effective treatment approach."

Flores hopes this work will lead to further investigations into these potential therapeutic targets.

Credit: 
H. Lee Moffitt Cancer Center & Research Institute