Detecting solar flares and more, in real time

image: This new technique transforms observations during the September 6th, 2017, solar flare into understandable, multi-colored maps. Different colors identify different solar phenomena.

Image: 
Dan Seaton and J. Marcus Hughes/CU Boulder, CIRES & NCEI

Computers can learn to find solar flares and other events in vast streams of solar images and help NOAA forecasters issue timely alerts, according to a new study. The machine-learning technique, developed by scientists at CIRES and NOAA's National Centers for Environmental Information (NCEI), searches massive amounts of satellite data to pick out features significant for space weather. Changing conditions on the Sun and in space can affect various technologies on Earth, blocking radio communications, damaging power grids, and diminishing navigation system accuracy.

"Being able to process solar data in real time is important because flares erupting on the Sun impact Earth over the course of minutes. These techniques provide a rapid, continuously updated overview of solar features and can point us to areas requiring more scrutiny," said Rob Steenburgh, a forecaster in the NOAA Space Weather Prediction Center (SWPC) in Boulder.

The research was published in October in the Journal of Space Weather and Space Climate.

To predict incoming space weather, forecasters summarize current conditions on the Sun twice daily. Today, they use hand-drawn maps labeled with various solar features--including active regions, filaments, and coronal hole boundaries. But solar imagers produce a new set of observations every few minutes. For example, the Solar Ultraviolet Imager (SUVI) on NOAA's GOES-R Series satellites runs on a 4-minute cycle, collecting data in six different wavelengths every cycle.

Just keeping up with all of that data could take up a lot of a forecaster's time. "We need tools to process solar data into digestible chunks," said Dan Seaton, a CIRES scientist working at NCEI and one of the paper's co-authors. CIRES is part of the University of Colorado Boulder.

So J. Marcus Hughes, a computer science graduate student at CU Boulder, a CIRES scientist at NCEI and lead author of the study, created a computer algorithm that can look at all the SUVI images simultaneously and spot patterns in the data. With his colleagues, Hughes created a database of expert-labeled maps of the Sun and used those images to teach a computer to identify solar features important for forecasting. "We didn't tell it how to identify those features, but what to look for--things like flares, coronal holes, bright regions, filaments, and prominences. The computer learns the how through the algorithm," Hughes said.

The algorithm identifies solar features using a decision-tree approach that follows a set of simple rules to distinguish between different traits. It examines an image one pixel at a time and decides, for example, whether that pixel is brighter or dimmer than a certain threshold before sending it down a branch of the tree. This repeats until, at the very bottom of the tree, each pixel fits only one category or feature--a flare, for example.

The algorithm learns hundreds of decision trees--and makes hundreds of decisions along each tree--to distinguish between different solar features and determine the "majority vote" for each pixel. Once the system is trained, it can classify millions of pixels in seconds, supporting forecasts that could be routine or require an alert or warning.
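A minimal sketch of the pixel-wise, majority-vote classification idea described above, using an ensemble of decision trees (a random forest); the six-channel inputs, class list, and training labels are illustrative placeholders, not the authors' actual SUVI pipeline:

```python
# Sketch of per-pixel classification by an ensemble of decision trees.
# Inputs, labels, and class names are made-up stand-ins for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_CHANNELS = 6          # e.g., six SUVI wavelengths per observation cycle
CLASSES = ["quiet", "flare", "coronal_hole", "filament", "prominence"]

# Toy training data: each row is one pixel's brightness in six channels,
# labeled according to an expert-drawn map (here, random placeholders).
rng = np.random.default_rng(0)
X_train = rng.random((1000, N_CHANNELS))
y_train = rng.integers(0, len(CLASSES), size=1000)

# Hundreds of trees; each pixel's class is the trees' majority vote.
forest = RandomForestClassifier(n_estimators=300, random_state=0)
forest.fit(X_train, y_train)

# Classify a full image: flatten (height, width, channels) into pixels,
# predict, and reshape the predictions back into a labeled map.
image = rng.random((64, 64, N_CHANNELS))
labels = forest.predict(image.reshape(-1, N_CHANNELS)).reshape(64, 64)
print(CLASSES[labels[0, 0]])
```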

"This technique is really good at using all the data simultaneously," Hughes said. "Because the algorithm learns so rapidly it can help forecasters understand what's happening on the Sun far more quickly than they currently do."

The technique also sees patterns humans can't. "It can sometimes find features we had difficulty identifying correctly ourselves. So machine learning can direct our scientific inquiry and identify important characteristics of features we didn't know to look for," Seaton said.

The algorithm's skill at finding patterns is not only useful for short-term forecasting, but also for helping scientists evaluate long-term solar data and improve models of the Sun. "Because the algorithm can look at 20 years' worth of images and find patterns in the data, we'll be able to answer questions and solve long-term problems that have been intractable," Seaton said.

NCEI and SWPC are still testing the tool for tracking changing solar conditions so forecasters can issue more accurate watches, warnings, and alerts. The tool could be made officially operational as early as the end of 2019.

Credit: 
University of Colorado at Boulder

Study reveals dynamics of crucial immune system proteins

image: In this schematic of peptide loading on class-I MHC proteins, the molecular chaperone TAPBPR (blue) catalyzes the loading of a small peptide antigen (purple), derived from a viral or tumor protein, on the peptide-binding groove of a class-I MHC protein (gray/green). The interplay between a highly polymorphic set of MHC-I alleles and molecular chaperones shapes the repertoire of peptide antigens displayed on the cell surface for T-cell surveillance.

Image: 
Nikolaos Sgourakis

Of the many marvels of the human immune system, the processing of antigens by the class I proteins of the major histocompatibility complex (MHC-I) is among the most mind-boggling. Exactly how these proteins carry out their crucial functions has not been well understood. Now, however, researchers at UC Santa Cruz have worked out the details of key molecular interactions involved in the selection and processing of antigens by MHC-I proteins.

The new findings, published December 3 in Proceedings of the National Academy of Sciences, help explain certain puzzling differences among MHC-I proteins, with implications for understanding autoimmune diseases and immune responses to infections and cancer. The results also suggest ways in which MHC-I proteins can be manipulated in the laboratory for use in diagnostic and therapeutic applications.

"Our discovery of these fundamental mechanisms enables us to develop technologies with tremendous potential for diagnostic and therapeutic purposes," said Nikolaos Sgourakis, assistant professor of chemistry and biochemistry at UC Santa Cruz and corresponding author of the paper.

The role of MHC-I proteins is to enable every cell in the body to display on its surface fragments of all the proteins being produced in that cell, typically about 10,000 different proteins. The protein fragments displayed by MHC-I proteins on the cell surface are scanned by specialized immune cells called cytotoxic T cells, which can recognize foreign proteins from an infection or mutated proteins from a tumor and launch an immune response.

"The cell has this barcoding system in place so it can show the immune system what's going on inside, and the T cells continuously surveil the surfaces of cells to sniff out the barcodes of aberrant proteins," Sgourakis explained.

Sgourakis and his team, working in close collaboration with coauthor Erik Procko's group at the University of Illinois, focused on the process of "antigen loading"--how the protein barcodes are selected and bound to MHC-I proteins so they can be displayed on the cell surface. Molecular "chaperones" play an important role in antigen loading and help determine which protein fragments get displayed. The new paper reveals how the interplay between MHC-I proteins and chaperones shapes the repertoire of displayed antigens.

There are thousands of different variants of the human MHC-I proteins, produced by different "alleles" of the MHC-I genes. The extreme variability of MHC-I proteins accounts for much of the individual variation in immune responses, including differences in susceptibility to autoimmune diseases, infections, and cancer. Each person has six main MHC-I alleles (three inherited from mom and three from dad), and each allele can display a unique subset of all possible barcodes.

"Our six MHC-I proteins sample a fraction of all the possible barcodes being generated in our cells. The ones they select become the displayed antigen repertoire, which is different for every person," Sgourakis said.

Sgourakis's team studied four different MHC-I alleles, examining their interactions with molecular chaperones and antigens. One function of the chaperones is to help MHC-I proteins fold into their active shapes and stabilize them to prevent misfolding and aggregation. But only some MHC-I alleles are dependent on chaperones for antigen loading. The new findings explain why that is and reveal important details of the antigen selection process.

The key to understanding these interactions was the use of nuclear magnetic resonance (NMR) techniques to reveal dynamic structural changes in the MHC-I proteins. "We've had static crystal structures of MHC proteins, but we could not figure out why some are chaperone-dependent and others are not," Sgourakis said. "It turns out to be a matter of protein dynamics."

The researchers found that if the three-dimensional structure of the MHC-I molecule is rigid, chaperones are not involved in antigen loading. If it has flexibility in the peptide binding groove, however, the chaperone will interact with it and help with the antigen loading process. The chaperone can eject antigens that have low affinity for the binding groove, ensuring that the MHC-I protein binds only high-affinity antigens that can be displayed at the cell surface in the proper conformation to activate a T cell response.

A flexible groove may enable the MHC-I molecule to accommodate a broader range of antigens, Sgourakis said. "The immune system has to cover all these possible barcodes with a limited number of MHC-I alleles. One way to do that is for the binding groove to adopt different shapes, but that flexibility comes at a price. You need to have a mechanism to stabilize these more flexible proteins--hence the chaperones," he said.

Sgourakis said his lab can now use chaperones in a high-throughput procedure to create libraries of barcoded MHC-I protein complexes encompassing hundreds of different peptides for use in screening T cells from patients and determining their antigen specificities. This procedure has potential applications in immunotherapy for cancer and other diseases. Sgourakis said his team is actively exploring this direction for cancer immunotherapy development in collaboration with clinical researchers at Children's Hospital of Philadelphia.

Credit: 
University of California - Santa Cruz

NASA finds second tropical system develops in Arabian Sea

image: On Dec. 3, 2019 at 12 p.m. EST (1700 UTC) the MODIS instrument that flies aboard NASA's Terra satellite showed the southeastern quadrant of newly formed Tropical Storm 07A where cloud top temperatures (in yellow) were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius).

Image: 
NASA/NRL

Tropical Storm 07A has developed in the eastern Arabian Sea, one day after Tropical Storm 06A developed in the western part of the sea. Infrared imagery from an instrument aboard Terra revealed that very high, powerful storms with very cold cloud top temperatures were southeast of the center.

Tropical Storm 07A developed from a low-pressure area designated System 91A, which formed on Dec. 2, consolidated into a tropical storm the next day, and was renamed 07A.

Tropical cyclones are made up of hundreds of thunderstorms, and infrared data can show where the strongest storms are located, because infrared data provides temperature information. The strongest thunderstorms that reach highest into the atmosphere have the coldest cloud top temperatures.

On Dec. 3 at 12 p.m. EST (1700 UTC) the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite showed the southeastern quadrant of newly formed Tropical Storm 07A where cloud top temperatures (in yellow) were as cold as or colder than minus 80 degrees Fahrenheit (minus 62.2 Celsius). NASA research has found that cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.
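A minimal sketch of the thresholding idea, flagging pixels whose cloud-top temperatures are at or below minus 80 degrees Fahrenheit; the temperature field below is a random placeholder, not real MODIS data:

```python
# Flag strong-storm pixels by cloud-top temperature: tops at or below
# -80 F (-62.2 C) indicate storms with heavy-rain potential, per the text.
# The brightness-temperature array here is invented for illustration.
import numpy as np

def fahrenheit_to_celsius(temp_f):
    return (temp_f - 32.0) * 5.0 / 9.0

THRESHOLD_C = fahrenheit_to_celsius(-80.0)  # about -62.2 C

# Toy cloud-top temperature field in Celsius (stand-in for MODIS IR data).
cloud_tops_c = np.random.default_rng(1).uniform(-90.0, -20.0, size=(180, 180))

strong_storms = cloud_tops_c <= THRESHOLD_C
print(f"{strong_storms.mean():.1%} of pixels at or below {THRESHOLD_C:.1f} C")
```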

On Dec. 3 at 10 a.m. EST (1500 UTC), Tropical Storm 07A had maximum sustained winds near 35 knots (40 mph/65 kph). It was located near latitude 13.4 degrees north and longitude 70.2 degrees east, about 367 nautical miles south-southwest of Mumbai, India.

Forecasters at the Joint Typhoon Warning Center expect 07A will move northwest. The system is not expected to intensify and is expected to dissipate in a day or so.

Credit: 
NASA/Goddard Space Flight Center

New screening method identifies potential anticancer compounds that reawaken T cells

LA JOLLA, CA - Scientists at Scripps Research have developed a method for rapidly discovering potential cancer-treating compounds that work by resurrecting anti-tumor activity in immune cells called T cells.

Cancerous tumors often thrive because they render T cells dysfunctional or "exhausted." The new method uncovers medicinal compounds that can restore the function of these T cells, making cancers vulnerable to them again.

The approach, described in a study published in Cell Reports, may also help restore T-cell responses to persistent infections from viruses or other pathogens. It therefore should speed the development of new cancer and infectious-disease immunotherapies, including those that can be combined with existing immunotherapy drugs to enhance their effects. The scientists demonstrated the potential utility of the approach by using it to rapidly screen a collection of more than 12,000 drug compounds--uncovering 19 that can reawaken exhausted T cells.

"This new screening method should be particularly useful because we can use it not only to identify compounds that restore needed function to exhausted T cells, but also to quickly analyze these T cells to determine how these compounds work on them," says senior author Michael Oldstone, MD, Professor Emeritus in the Department of Immunology and Microbiology at Scripps Research.

The new screening system--and to some extent, the wider field of cancer immunotherapy--is based in part on research over the past several decades by Oldstone's laboratory and several former lab members, including Rafi Ahmed, David Brooks, and John Teijaro, along with other scientists who have conducted animal-based research on how the immune system responds to lymphocytic choriomeningitis virus (LCMV).

A unique variant of LCMV known as "clone 13" establishes a persistent infection by exhausting the virus-specific T cells that are required to clear the infection. It does this by boosting inhibitory signals through molecules such as the receptor PD-1 and the cytokine IL-10. The discovery that LCMV clone 13 can survive by switching off anti-LCMV T cells was quickly followed by the recognition that cancers often persist using the same trick.

Immunotherapies that block signaling from PD-1 or similarly acting receptors to restore T cells' anti-cancer responses are among the most powerful cancer medicines available today. These therapies save many patients who in the past had seemingly untreatable tumors. But because treatment with these drugs typically works well for only a few cancers, such as melanoma, and less often on others, scientists suspect that cancers usually hijack multiple inhibitory T-cell pathways. This suggests that a combination of immunotherapies directed at different molecular pathways could be more effective than the current therapy.

"The idea now is to develop more immunotherapy drugs and find the best combinations of them," Oldstone says.

A promising hit

The new screening system is designed to enable scientists to swiftly find such drugs--in this case, pharmacologically active small-molecule compounds that might work better than, or augment, the current injectable antibody immunotherapies now available.

The system uses T cells that have been exhausted by LCMV clone 13 and detects signs of renewed activity in these cells when a tested compound works to reawaken them. An advantage of the new screening system is that it is specific and highly automated; thus, thousands of compounds can be tested within days, with the "hits" verified in experiments involving mice.

Oldstone and colleagues applied the new screening system to a drug repurposing library of more than 12,000 compounds that either are FDA-approved or have been tested as potential drugs. They quickly identified 19 hits--compounds that, at modest doses, can effectively resurrect the activity of exhausted T cells.

One of these compounds, ingenol mebutate, is a plant-derived molecule that is already used in gel form (Picato) to treat actinic keratosis, a pre-cancerous skin condition. The researchers employed elements of their screening system to study the reactivated T cells and determined that ingenol mebutate restores function to these cells largely by activating signaling enzymes of the protein kinase C family, a known pathway of activity for this compound.

Co-first authors of the study, postdoctoral fellows Brett Marro, PhD, and Jaroslav Zak, PhD, in the Department of Immunology and Microbiology, are currently collecting and exploring the therapeutic potential of other reported hits that may work in combination with treatments that block PD-1 and another T-cell-inhibitory receptor, CTLA-4. Indeed, one such hit in combination with an antibody to PD-L1 is already undergoing evaluation in patients.

Oldstone notes that the new screening approach is flexible enough to adapt for finding compounds that have other effects on T cells, such as reducing T-cell activity to treat autoimmune conditions.

Credit: 
Scripps Research Institute

New treatment for brain tumors uses electrospun fiber

image: The University of Cincinnati used coaxial electrospinning to create a fine fiber with a core of one material surrounded by a sheath of another to treat brain tumors.

Image: 
Joseph Fuqua II/UC Creative Services

A novel engineering process can deliver a safe and effective dose of medicine for brain tumors without exposing patients to toxic side effects from traditional chemotherapy.

University of Cincinnati professor Andrew Steckl, working with researchers from Johns Hopkins University, developed a new treatment for glioblastoma multiforme, or GBM, an aggressive form of brain cancer. Steckl's Nanoelectronics Laboratory applied an industrial fabrication process called coaxial electrospinning to form drug-containing membranes.

The treatment is implanted directly into the part of the brain from which the tumor has been surgically removed.

The study was published in the Nature journal Scientific Reports.

"Chemotherapy essentially is whole-body treatment. The treatment has to get through the blood-brain barrier, which means the whole-body dose you get must be much higher," Steckl said. "This can be dangerous and have toxic side-effects."

Steckl is an Ohio Eminent Scholar and professor of electrical engineering in UC's College of Engineering and Applied Science.

Coaxial electrospinning combines two or more materials into a fine fiber composed of a core of one material surrounded by a sheath of another. This fabrication process allows researchers to take advantage of the unique properties of each material to deliver a potent dose of medicine immediately or over time.

"By selecting the base materials of the fiber and the thickness of the sheath, we can control the rate at which these drugs are released," Steckl said.

The electrospun fibers can rapidly release one drug for short-term treatment such as pain relief or antibiotics while an additional drug or drugs such as chemotherapy is released over a longer period, he said.

"We can produce a very sophisticated drug-release profile," Steckl said.

The breakthrough is a continuation of work conducted by research partners and co-authors Dr. Henry Brem and Betty Tyler at Johns Hopkins University, who in 2003 developed a locally administered wafer treatment for brain tumors called Gliadel.

Unlike previous treatments, electrospun fibers provide a more uniform dose over time, said UC research associate Daewoo Han, the study's lead author.

"For the current treatment, most drugs release within a week, but our discs presented the release for up to 150 days," he said.

Glioblastoma multiforme is a common and extremely aggressive brain cancer and is responsible for more than half of all primary brain tumors, according to the American Cancer Society. Each year more than 240,000 people around the world die from brain cancer.

The electrospun fibers created for the study were formed into a tablet-like disk that increased the amount of medicine that could be applied, lowered the initial burst release and enhanced the sustainability of the drug release over time, the study found.

Chemotherapy using electrospun fiber improved survival rates in three separate animal trials that examined safety, toxicity, membrane degradation and efficacy.

"This represents a promising evolution for the current treatment of GBM," the study concluded.

While this study used a single drug, researchers noted that one advantage of electrospinning is the ability to dispense multiple drugs sequentially over a long-term release. The latest cancer treatments rely on a multiple-drug approach to prevent drug resistance and improve efficacy.

Steckl said the study holds promise for treatments of other types of cancer.

"Looking ahead, we are planning to investigate 'cocktail' therapy where multiple drugs for the combined treatment of difficult cancers are incorporated and released either simultaneously or sequentially from our fiber membranes," Steckl said.

Credit: 
University of Cincinnati

Researchers discover new way to split and sum photons with silicon

image: Silicon nanocrystals are formed from silane gas in a plasma process.

Image: 
Lorenzo Mangolini/UC Riverside

A team of researchers at The University of Texas at Austin and the University of California, Riverside has found a way to produce a long-hypothesized phenomenon--the transfer of energy between silicon and organic, carbon-based molecules--in a breakthrough that has implications for information storage in quantum computing, solar energy conversion and medical imaging. The research is described in a paper out today in the journal Nature Chemistry.

Silicon is one of the planet's most abundant materials and a critical component in everything from the semiconductors that power our computers to the cells used in nearly all solar energy panels. For all of its abilities, however, silicon has some problems when it comes to converting light into electricity. Different colors of light are composed of photons, particles that carry light's energy. Silicon can efficiently convert red photons into electricity, but with blue photons, which carry twice the energy of red photons, silicon loses most of the energy as heat.

The new discovery provides scientists with a way to boost silicon's efficiency by pairing it with a carbon-based material that converts blue photons into pairs of red photons that can be more efficiently used by silicon. This hybrid material can also be tweaked to operate in reverse, taking in red light and converting it into blue light, which has implications for medical treatments and quantum computing.
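As a back-of-the-envelope illustration of the energy bookkeeping (wavelengths chosen as round numbers, not values from the paper): a photon's energy is E = hc/λ, so one blue photon near 450 nm carries roughly the energy of two red photons near 900 nm, which is why one blue photon can in principle be split into two red ones.

```latex
% Photon-energy bookkeeping for splitting one blue photon into two red ones.
% Wavelengths are illustrative round numbers, not values from the paper.
\[
  E = \frac{hc}{\lambda}, \qquad hc \approx 1240~\text{eV}\cdot\text{nm}
\]
\[
  E_{\text{blue}} \approx \frac{1240}{450~\text{nm}} \approx 2.76~\text{eV}
  \approx 2 \times E_{\text{red}}, \qquad
  E_{\text{red}} \approx \frac{1240}{900~\text{nm}} \approx 1.38~\text{eV}
\]
```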

"The organic molecule we've paired silicon with is a type of carbon ash called anthracene. It's basically soot," said Sean Roberts, a UT Austin assistant professor of chemistry. The paper describes a method for chemically connecting silicon to anthracene, creating a molecular power line that allows energy to transfer between the silicon and ash-like substance. "We now can finely tune this material to react to different wavelengths of light. Imagine, for quantum computing, being able to tweak and optimize a material to turn one blue photon into two red photons or two red photons into one blue. It's perfect for information storage."

For four decades, scientists have hypothesized that pairing silicon with a type of organic material that absorbs blue and green light efficiently could be the key to improving silicon's ability to convert light into electricity. But simply layering the two materials never brought about the anticipated "spin-triplet exciton transfer," a particular type of energy transfer from the carbon-based material to silicon, needed to realize this goal. Roberts and materials scientists at UC Riverside describe how they broke through the impasse with tiny chemical wires that connect silicon nanocrystals to anthracene, producing the predicted energy transfer between them for the first time.

"The challenge has been getting pairs of excited electrons out of these organic materials and into silicon. It can't be done just by depositing one on top of the other," Roberts said. "It takes building a new type of chemical interface between the silicon and this material to allow them to electronically communicate."

Roberts and his graduate student Emily Raulerson measured the effect in a specially designed molecule that attaches to a silicon nanocrystal, the innovation of collaborators Ming Lee Tang, Lorenzo Mangolini and Pan Xia of UC Riverside. Using an ultrafast laser, Roberts and Raulerson found that the new molecular wire between the two materials was not only fast, resilient and efficient, it could effectively transfer about 90% of the energy from the nanocrystal to the molecule.

"We can use this chemistry to create materials that absorb and emit any color of light," said Raulerson, who says that, with further finetuning, similar silicon nanocrystals tethered to a molecule could generate a variety of applications, from battery-less night-vision goggles to new miniature electronics.

Other highly efficient processes of this sort, called photon up-conversion, previously relied on toxic materials. As the new approach uses exclusively nontoxic materials, it opens the door for applications in human medicine, bioimaging and environmentally sustainable technologies, something that Roberts and fellow UT Austin chemist Michael Rose are working towards.

At UC Riverside, Tang's lab pioneered how to attach the organic molecules to the silicon nanoparticles, and Mangolini's group engineered the silicon nanocrystals.

"The novelty is really how to get the two parts of this structure--the organic molecules and the quantum confined silicon nanocrystals--to work together," said Mangolini, an associate professor of mechanical engineering. "We are the first group to really put the two together."

Credit: 
University of Texas at Austin

A CERN for climate change

In a Perspective article appearing in this week's Proceedings of the National Academy of Sciences, Tim Palmer (Oxford University) and Bjorn Stevens (Max Planck Society) critically reflect on the present state of Earth system modelling.

They argue that it is a mistake to frame understanding of global warming as the product of sophisticated models, because this framing understates the contributions of physical principles and simple models, as well as observations, in establishing this understanding.

Such a framing also inevitably leads to a downplaying of deficiencies in the state of Earth-system modelling -- and this has implications for how the science develops. The contribution of Earth-system modelling to understanding of global warming has been important, but primarily to show that the theoretical frameworks for interpreting observations were, despite their many simplifications, on track. Now that the causes of global warming are settled, and the imperative that this places on reducing carbon emissions is clear, climate science is facing new challenges -- for instance, as Marotzke et al. (2017) point out, the need to understand the habitability of the planet and the ability of human populations to be resilient to the extremes of weather and climate that may accompany future warming.

To address these challenges and inform decision making about the rate of future warming and the risks of a warming world, a new modelling strategy is required. This strategy, Professors Palmer and Stevens argue, should exploit exascale computing and an emerging new generation of models: ones that aim to reduce biases by representing -- through known laws of physics rather than error-prone semi-empirical approaches -- important physical processes. Decades of experience in numerical weather prediction have, after all, shown that reducing biases leads to improved predictions. To develop this new generation of more physically based models, something that has been advocated before but is now becoming possible, Palmer and Stevens press for bold multi-national initiatives to bring together computational, computer and climate scientists to co-develop modelling systems that will fully exploit emerging technologies and exascale computing.

Asked whether he feared their critique of the present state of Earth system modelling might be exploited by those attempting to cast doubt on present understanding of global warming, Stevens replies: "It is important that scientists speak candidly. It shouldn't come as a surprise that we can understand some things (like the world is warming because of human activities) but not everything (like what this warming means for regional changes in weather, extremes, and the habitability of the planet). By not talking about the limits of our understanding we run the risk of failing to communicate the need for new scientific approaches, just when they are needed most."

When asked whether spending new money on such an international climate modelling initiative can be justified, Professor Palmer said: "By comparison with new particle colliders or space telescopes, the amount needed, maybe around $100 million per year, is very modest indeed. In addition, the benefit/cost ratio to society of having a much clearer picture of the dangers we are facing in the coming decades by our ongoing actions seems extraordinarily large. To be honest, all that is needed is the will to work together, across nations, on such a project. Then it will happen."

Credit: 
University of Oxford

Endometriosis could be treated with cancer drug, study suggests

The painful symptoms of endometriosis - a chronic condition which affects millions of women - could potentially be reduced with a drug that had previously been investigated as a cancer treatment.

Researchers found that using dichloroacetate to treat the cells of women with endometriosis lowered the production of lactate - a potentially harmful waste product - and stopped abnormal cell growth.

Endometriosis - which affects 176 million women worldwide - is caused by the growth of lesions made up of tissue similar to the lining of the womb in other parts of the body, such as the lining of the pelvis and ovaries.

Researchers from the University of Edinburgh found that cells from the pelvic wall of women with endometriosis have a different metabolism compared with those of women without the disease. The cells produced higher amounts of lactate, similar to the behavior of cancer cells.

When the cells from women with endometriosis were treated with dichloroacetate, they were found to return to normal metabolic behavior. The scientists also noted a reduction in lactate and an impact on the growth of endometrial cells grown together with the pelvic cells.

Further tests on a mouse model of endometriosis found, after seven days, a marked reduction in lactate concentrations and the size of lesions.

This research was funded by the charity Wellbeing of Women, and supported by PwC and the Medical Research Council UK.

Currently available treatments for endometriosis are either hormone-based, which can produce unpleasant side effects, or surgery, which, in half of cases, results in lesions returning within five years.

The researchers believe these new findings could help alleviate endometriosis in women who cannot - or do not wish to - take hormonal treatments or prevent recurrence after surgery. The team are conducting an early phase clinical trial to confirm their findings.

This research is published in the Proceedings of the National Academy of Sciences.

Wellbeing of Women aims to save and transform the lives of women and give babies the best start in life by finding cures and treatments across the breadth of female reproductive health, including overlooked areas like endometriosis.

Lead Researcher, Professor Andrew Horne, MRC Centre for Reproductive Health at University of Edinburgh, said: "Endometriosis can be a life-changing condition for so many women. Now that we better understand the metabolism of the cells in women who have endometriosis, we can work to develop a non-hormonal treatment. Through a clinical trial with dichloroacetate we should be able to see if the conditions we observed in the lab are replicated in women."

Janet Lindsay, CEO of Wellbeing of Women said: "More than 176 million women suffer from endometriosis yet few people have heard of it and treatment, which can impact fertility, has progressed very little for over 40 years. This is why we are so excited by the findings of this research that Wellbeing of Women has funded and which could lay the basis for the first new non-hormonal treatment offering women a life-changing option. We are delighted that Professor Andrew Horne's new treatment going to clinical trial could hugely impact so many women's lives."

Credit: 
University of Edinburgh

Svalbard reindeer populations rebounding from centuries of hunting

image: Reindeer grazing on an open patch of vegetation surrounded by ice and snow.

Image: 
Photo: Brage B. Hansen

Mathilde Le Moullec and her colleagues have walked more than 2000 kilometres over four field seasons in the high-Arctic Norwegian archipelago of Svalbard, all in a quest to count reindeer.

She can now quite confidently state that Svalbard is home to approximately 22000 of the animals.

It's a number that was hard won.

In some places, Le Moullec and her colleagues walked 30-40 kilometres a day, day after day after day, on a constant lookout for both reindeer and polar bears. During her second field season, she sailed on a small boat with three colleagues to difficult-to-reach research sites in the east and southwestern part of Svalbard.

"I don't think we got more than 5 hours of sleep a night that summer," she said. "The nautical charts aren't good enough to find anchorages for a small boat." That meant everyone on the boat had to help watch for shallow water and icebergs, and find places where the boat could spend the night safely.

Now, Le Moullec, who received her PhD from the Norwegian University of Science and Technology's (NTNU) Department of Biology this year, can describe the long, surprising history of this unusual subspecies of reindeer. It's a story that demonstrates how protecting a species enables their populations to recover from past overexploitation.

It's also a story about how climate change and other human-caused environmental problems might affect the animals in the near future.

Svalbard is a Norwegian territory, a collection of nine islands at 78 degrees N, halfway between mainland Norway and the North Pole.

As improbable as it seems, given the great distances from Svalbard to anywhere else, reindeer have lived here for thousands of years. During their four field seasons wandering inland Svalbard to count reindeer, Le Moullec and her colleagues documented where they found ancient bones and antlers from the animals -- hundreds of them.

You might be tempted to ignore these weather-beaten, moss-covered fragments of antlers and bones, buried among miniature arctic plants, but Le Moullec and her colleagues realized immediately that the old reindeer remains were a treasure trove of information -- because they could be dated using radiocarbon dating.

"I call them my treasures," she said. "Walking all those kilometres, you get to places you would otherwise never go. So we started finding and collecting these ancient bones."

Although it has been known since the 1950s that reindeer have inhabited Svalbard for centuries, the bones and antlers tell researchers exactly where on the different islands the animals have lived over the centuries -- and how long ago they lived there. Their oldest find is an antler (or bone) that is 3600 years old -- roughly from the time Europe was entering the Bronze Age.
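For context, radiocarbon dating recovers an age from the fraction of carbon-14 remaining in a sample, via the standard decay law; this is the generic relation, not a computation from the study:

```latex
% Radiocarbon age from the fraction of 14C remaining (half-life ~5,730 yr).
% Generic dating relation, not a calculation taken from the study.
\[
  N(t) = N_0 \, e^{-\lambda t}, \qquad
  \lambda = \frac{\ln 2}{t_{1/2}}, \qquad t_{1/2} \approx 5{,}730~\text{yr}
\]
\[
  t = \frac{t_{1/2}}{\ln 2}\,\ln\!\frac{N_0}{N(t)}
\]
```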

These dates are important because they tell the researchers which parts of the archipelago were capable of supporting reindeer populations. Le Moullec can compare that information to where Svalbard reindeer are found now, so she knows how extensively the animals are recolonizing areas where they once lived.

The big problems for Svalbard reindeer likely began after the Dutchman Willem Barents reported the existence of the archipelago in 1596. After his discovery, whalers, fisherfolk and explorers began to visit the islands and hunted reindeer.

Hunting pressures exploded with an influx of miners and trappers in the late 1800s, when coal was first discovered on Svalbard. Overwintering sailing expeditions also relied on reindeer for food. Svalbard reindeer are unlike their southern cousins in that they tend to be docile and extremely sedentary, making them easy targets.

The result was that, by 1900, the animals were more or less locally extirpated, Le Moullec said, although there were a few isolated areas where small populations persisted.

Those few reindeer were important, however, because they provided the animals that could slowly recolonize Svalbard after the Norwegian government extended full protection to the animals in 1925.

Now, almost one hundred years later, Le Moullec's research can tell us how well that protection worked.

With her thorough population count, and knowledge of where reindeer lived in centuries past, Le Moullec can say with confidence that Svalbard reindeer have recovered enough to recolonize virtually all non-glaciated areas in the archipelago. But because of their sedentary behaviour and the barriers posed by crossing glaciers, steep mountains and open fjords, this recolonization took a century.

"Reindeer have recolonized their ancient grazing areas, based on the information we have from the antlers and bones," she said. "And their densities are thirteen times higher than the minimum numbers we have subsequent to protection, from the 1950s."

The researchers also had information from digital maps about the quality of vegetation in areas on the island that could potentially support reindeer.

That allowed them to estimate how many reindeer ought to be able to live in these different areas -- because if the vegetation production in an area is high, that area is capable of supporting more reindeer than an area where the vegetation biomass is low.

The combination of all this information tells the scientists that although populations have grown enormously since the hunting ban was put in place, "we still see the effect of hunting from 100 years ago," Le Moullec said. "In the areas where they were extirpated, their numbers still have the potential to increase."

Le Moullec and her colleagues, Åshild Pedersen from the Norwegian Polar Institute, Jørgen Rosvold and Audun Stien from the Norwegian Institute for Nature Research, and her supervisor from NTNU's Centre for Biodiversity Dynamics, Brage Bremset Hansen, aren't just looking back in time. They're also interested in how their knowledge of reindeer recovery over time can help them evaluate how global warming will affect future populations of reindeer.

The current warming rates on Svalbard are the fastest on Earth. Other research conducted by Le Moullec's supervisor Hansen, who is also senior author on the new paper, shows that Svalbard reindeer are already being greatly affected by climate change. In some cases, reindeer are forced to eat seaweed during the winter, when ice, caused by rain-on-snow, covers their preferred foraging areas.

Currently, the researchers are seeing that reindeer populations inland are thriving more than their coastal brethren, because coastal areas are rainier and warmer during the winter, and more likely to experience rain-on-snow events.

Globally, however, Svalbard reindeer are in an enviable position. "This study represents a counter example to the many reindeer population status assessments reporting recent local or regional declines in abundance," Le Moullec and her co-authors wrote.

For example, the Rivière-George herd in northern Quebec (Canada), once the largest in the world, has declined by more than 99 percent, said Steeve D. Côté from Laval University in Quebec, Canada.

"Such large declines have never been reported since we developed the capacity to survey populations," Côté said. Although caribou and reindeer numbers have always fluctuated naturally, climate change may have contributed to this recent decline. For example, migratory caribou increased movements by nearly 30% due to changes in the freezing-thawing cycles of large water bodies in Nunavik (Canada), Côté said.

The Svalbard ecosystem contains just three overwintering creatures distributed across the archipelago: the rock ptarmigan (Lagopus muta hyperborea), Svalbard reindeer (Rangifer tarandus platyrhynchus) and the Arctic fox (Vulpes lagopus). What happens to one species, like the reindeer, has a ripple effect on all other species.

"Changes in reindeer abundance therefore have important top-down and bottom-up effects on the ecosystem," Le Moullec and her co-authors wrote.

For example, if there are more reindeer overall, that means there will be an increase in carcasses, which means more food for Arctic foxes, and eventually more Arctic foxes. If there are more foxes, they may eat the eggs and young of ground-nesting birds, like geese, which come to Svalbard to raise their young.

The extremely rapid climate change on Svalbard will continue to directly or indirectly affect all these animals through rainier winters, earlier springs and loss of sea ice as a travel route between islands. The long time it took Svalbard reindeer to recover from intense hunting suggests that future populations will need to be managed with great care, the researchers wrote.

"Bearing in mind that it took approximately a century for the subspecies to recover from overharvesting, the reindeer responses may be too slow to track the speed of future climate change," the researchers wrote.

Credit: 
Norwegian University of Science and Technology

Global levels of biodiversity could be lower than we think, new study warns

image: The progression of land change from 1983 to 2017 as seen from satellites near the Thailand-Cambodia border (Lat: 13.3606, Long: 102.354), with the former Roneam Daun Sam Wildlife Sanctuary on the right of the picture.

Image: 
Martin Jung, University of Sussex

Biodiversity across the globe could be in a worse state than previously thought as current biodiversity assessments fail to take into account the long-lasting impact of abrupt land changes, a new study has warned.

The study by PhD graduate Dr Martin Jung, Senior Lecturer in Geography Dr Pedram Rowhani and Professor of Conservation Science Jörn Scharlemann, all at the University of Sussex, shows that fewer species and fewer individuals are observed at sites that have been disturbed by an abrupt land change in past decades.

The authors warn that areas subjected to deforestation or intensification of agriculture can take at least ten years to recover, with reductions in species richness and abundance.

With current biodiversity assessments failing to take into account the impacts of past land changes, the researchers believe that the natural world could be in a far worse state than currently thought.

Lead author, Dr Martin Jung said: "These findings show that recent abrupt land changes, like deforestation or intensification through agriculture, can cause even more impactful and long-lasting damage to biodiversity than previously thought.

"Our study shows that it can take at least ten or more years for areas which have undergone recent abrupt land changes to recover to levels comparable to undisturbed sites. This only strengthens the argument to limit the impacts of land change on biodiversity with immediate haste."

The study combined global data on biodiversity from the PREDICTS database, one of the largest databases of terrestrial plants, fungi and animals across the world, with quantitative estimates of abrupt land change detected using images from NASA's Landsat satellites from 1982 to 2015.

Comparing numbers of plants, fungi and animals at 5,563 disturbed sites with those at 10,102 undisturbed sites across the world from Africa to Asia, the researchers found that biodiversity remains affected by a land change event for several years after it has occurred, due to a lag effect.

Species richness and abundance were found to be 4.2% and 2% lower, respectively, at sites where an abrupt land change had occurred.

In addition, the impacts on species were found to be greater if land changes had occurred more recently, and caused greater changes in vegetation cover. At sites that had land changes in the last five years, there were around 6.6% fewer species observed.

However, at sites where a land change had taken place 10 or more years ago, species richness and abundance were indistinguishable from sites without a past land change in the same period, indicating that biodiversity can recover after such disturbances.
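A schematic sketch of this disturbed-versus-undisturbed comparison, grouped by years since land change; all numbers below are invented placeholders, not PREDICTS or Landsat values:

```python
# Toy version of the comparison described above: species richness at
# recently disturbed sites vs. undisturbed sites. The data and the ~7%
# lag effect are fabricated placeholders for illustration only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 1000
years_since = rng.integers(0, 30, size=n)          # years since abrupt change
disturbed = rng.random(n) < 0.4                    # 40% of sites disturbed
richness = rng.poisson(20, size=n).astype(float)
richness[disturbed & (years_since < 10)] *= 0.93   # built-in toy lag effect

sites = pd.DataFrame({"disturbed": disturbed,
                      "years_since": years_since,
                      "richness": richness})
recent = sites[sites.disturbed & (sites.years_since < 10)].richness.mean()
baseline = sites[~sites.disturbed].richness.mean()
print(f"recent-change sites: {100 * (1 - recent / baseline):.1f}% fewer species")
```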

Dr Jung explained: "For us, the results clearly indicate that regional and global biodiversity assessments need to consider looking back at the past in order to have more accurate results in the present.

"We've shown that remotely-sensed satellite data can assist in doing this in a robust way globally. Our framework can also be applied to habitat restoration and conservation prioritization assessments."

Prof Jörn Scharlemann added: "Although the number of species and individuals appear to recover more than 10 years after a land change, we will still need to find out whether the original unique species recover or whether common widespread species, such as weeds, pigeons and rats, move into these disturbed areas."

Credit: 
University of Sussex

McGill-led research unravels mystery of how early animals survived ice age

image: Maxwell Lechte examines rock formations in the Flinders Ranges (South Australia).

Image: 
Brennan O'Connell

How did life survive the most severe ice age? A McGill University-led research team has found the first direct evidence that glacial meltwater provided a crucial lifeline to eukaryotes during Snowball Earth, when the oceans were cut off from life-giving oxygen, answering a question puzzling scientists for years.

In a new study published in the Proceedings of the National Academy of Sciences of the United States of America, researchers studied iron-rich rocks left behind by glacial deposits in Australia, Namibia, and California to get a window into the environmental conditions during the ice age. Using geological maps and clues from locals, they hiked to rock outcrops, navigating challenging trails to track down the rock formations.

By examining the chemistry of the iron formations in these rocks, the researchers were able to estimate the amount of oxygen in the oceans around 700 million years ago and better understand the effects this would have had on all oxygen-dependent marine life, including the earliest animals like simple sponges.

"The evidence suggests that although much of the oceans during the deep freeze would have been uninhabitable due to a lack of oxygen, in areas where the grounded ice sheet begins to float there was a critical supply of oxygenated meltwater. This trend can be explained by what we call a 'glacial oxygen pump'; air bubbles trapped in the glacial ice are released into the water as it melts, enriching it with oxygen," says Maxwell Lechte, a postdoctoral researcher in the Department of Earth and Planetary Sciences under the supervision of Galen Halverson at McGill University.

Around 700 million years ago, the Earth experienced the most severe ice age of its history, threatening the survival of much of the planet's life. Previous research has suggested that oxygen-dependent life may have been restricted to meltwater puddles on the surface of the ice, but this study provides new evidence of oxygenated marine environments.

"The fact that the global freeze occurred before the evolution of complex animals suggests a link between Snowball Earth and animal evolution. These harsh conditions could have stimulated their diversification into more complex forms," says Lechte, who is also the study's lead author.

Lechte points out that while the findings focus on the availability of oxygen, primitive eukaryotes would also have needed food to survive the harsh conditions of the ice age. Further research is needed to explore how these environments might have sustained a food web. A starting point might be modern ice environments that host complex ecosystems today.

"This study actually solves two mysteries about the Snowball Earth at once. It not only provides explanation for how early animals may have survived global glaciation, but also eloquently explains the return of iron deposits in the geological record after an absence of over a billion years," says Professor Galen Halverson.

Credit: 
McGill University

Molecular vibrations lead to high-performance laser

image: Molecular vibrations lead to a high-performance laser (illustration).

Image: 
Troan Tran

Lasers. They are used for everything from entertaining our cats to encrypting our communications. Unfortunately, lasers can be energy intensive and many are made using toxic materials like arsenic and gallium. To make lasers more sustainable, new materials and lasing mechanisms must be discovered.

Professor Andrea Armani and her team at the USC Viterbi School of Engineering have discovered a new phenomenon and used it to make a laser with over 40 percent efficiency--nearly 10 times higher than other similar lasers. The laser itself is made from a glass ring on a silicon wafer with only a monolayer coating of siloxane molecules anchored to the surface. Thus, it consumes less power and is fabricated from more sustainable materials than previous lasers.

The work from Armani and her co-authors Xiaoqin Shen and Hyungwoo Choi from USC's Mork Family Department of Chemical Engineering and Materials Science; Dongyu Chen from USC's Ming Hsieh Department of Electrical and Computer Engineering; and Wei Zhao, from the Department of Chemistry at the University of Arkansas at Little Rock, was published in Nature Photonics.

The surface Raman laser is based on an extension of the Raman effect, which describes how the interaction of light with a material can induce molecular vibrations that result in light emission. One unique feature of this type of laser is that the emitted wavelength is not defined by the electronic transitions of the material, but instead it is determined by the vibrational frequency of the material. In other words, the emitted laser light can be easily tuned by changing the incident light. In previous work, researchers have made Raman lasers leveraging the Raman effect in "bulk" material, like optical fiber and silicon.
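In equation form, this is the standard Stokes Raman relation (generic physics, not specific to this paper): the emitted frequency is the pump frequency shifted down by the molecular vibrational frequency, so tuning the pump tunes the output by the same amount.

```latex
% Standard Stokes Raman relation: the emitted (Stokes) frequency equals the
% pump frequency minus the molecular vibrational frequency. Generic physics,
% not a formula quoted from this paper.
\[
  \omega_{\text{Stokes}} = \omega_{\text{pump}} - \omega_{\text{vib}}
\]
```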

Raman lasers have a wide range of applications including military communications, microscopy and imaging, and in medicine for ablation therapy, a minimally invasive procedure to destroy abnormal tissue such as tumors.

Armani, USC's Ray Irani Chair in Chemical Engineering and Materials Science, said she realized that a different strategy might give even higher performing Raman lasers from sustainable materials like glass.

"The challenge was to create a laser where all of the incident light would be converted into emitted light," Armani said. "In a normal solid-state Raman laser, the molecules are all interacting with each other, reducing the performance. To overcome this, we needed to develop a system where these interactions were reduced."

Armani said that if conventional Raman lasers were thought of as the old energy-inefficient light bulbs many of us grew up with, this new technology would result in the laser equivalent of energy-efficient LED light bulbs: a brighter result requiring lower energy input.

Armani's interdisciplinary team, composed of chemists, materials scientists and electrical engineers, quickly realized that they could design this type of laser system. Combining surface chemistry and nanofabrication, they developed a method to precisely form a single monolayer of molecules on a nanodevice.

"Think of the molecule as looking like a tree," Armani said. "If you anchor the base of the molecule to the device, like a root to a surface, the molecule's motion is limited. Now, it can't just vibrate in any direction. We discovered that by constraining the motion, you actually increase the efficiency of its movement, and as a result, its ability to act as a laser."

The molecules are attached to the surface of an integrated photonic glass ring, which confines an initial light source. The light inside the ring excites the surface-constrained molecules, which subsequently emit the laser light. Notably, the efficiency is actually improved nearly 10 times, even though there is less material.

"The surface-constrained molecules enable a new process, called Surface Stimulated Raman, to happen,"said Xiaoqin Shen, the paper's co-lead author with Hyungwoo Choi, "This new surface process triggers the boost of the lasing efficiency."

Additionally, just like conventional Raman lasing, by simply changing the wavelength of light inside the ring, the emission wavelength from the molecules will change. This flexibility is one reason why Raman lasers - and now Surface Stimulated Raman lasers - are so popular across numerous fields including defense, diagnostics, and communications.

Armani said the team managed to bind the molecules to the surface of the glass ring by harnessing the hydroxyl (OH) groups on the surface--oxygen bonded to hydrogen--using a process called silanization surface chemistry. This reaction forms a single monolayer of precisely oriented individual molecules.

The discovery is a passion project for Armani, one that she has been pursuing since her days as a Ph.D. student.

"This is a question I've been wanting to look into for a while, but it just wasn't the right time and the right place and right team to be able to answer it," she said.

Armani said the research has the potential to significantly reduce the input power required to operate Raman lasers as well as impact numerous other applications.

"The Raman effect is a fundamental, Nobel-Prize winning science behavior originally discovered in the early 20th century," Armani said. "The idea of contributing something new to this rich field is very rewarding ."

Credit: 
University of Southern California

Australian GPs widely offering placebos, new study finds

image: Reasons Australian GPs offer placebos.

Image: 
University of Sydney

Most Australian GPs have used a placebo in practice at least once, with active placebos (active treatments used primarily to generate positive expectations) more commonly used than inert placebos, according to a new study.

International studies indicate that placebo use by general practitioners (GPs) is remarkably high, but until now usage in Australia was unknown.

A new survey by Associate Professor Ben Colagiuri, in the School of Psychology at the University of Sydney, and Dr Kate Faasse at the University of New South Wales, examined rates of use and beliefs about placebos in Australian general practice. The findings are published today in the Australian Journal of General Practice.

Key findings:

77% of GPs had offered an active placebo (such as antibiotics for a virus)

39% of GPs had offered an inert placebo (such as saline spray or a water-based cream)

GPs primarily used placebos because they believed they could provide genuine benefit and viewed themselves as having a strong role in shaping patients' expectations

53% of GPs felt that administering placebos deceptively was unethical, but most (>80%) believed openly providing placebos - i.e. with the patient's knowledge - is ethical

GPs felt that medical trainees would benefit from more education about placebos

"We already know that doctors and GPs use placebos regularly overseas," Associate Professor Ben Colagiuri said. "So, we wanted to see what was happening in Australia. We found that placebo use is also relatively common here. The good news is that Australian GPs are predominantly using placebos because they believe that there's some real benefit to them. They are simply trying to help their patients."

The more concerning news, Associate Professor Colagiuri said, is that in some cases GPs are also prescribing antibiotics - an active medication - for purposes other than their intended use.

"The most common case is when a GP prescribes antibiotics when they know or strongly suspect that the patient doesn't have a bacterial infection," he said. "In these cases, they are prescribing antibiotics as a type of placebo, often because a patient expects or demands treatment. But antibiotics can have side effects and there are problems with antibiotic resistance if we prescribe antibiotics too much."

According to Associate Professor Colagiuri, one of the most important findings of the study is that GPs felt medical trainees could benefit from more education about the placebo effect. "Currently, there are no guidelines on placebo use in clinical practice in Australia. As such, GPs and other medical professionals are left to make up their own minds as to whether - and how - to use placebos. It is really important for medical professionals and patients that we develop evidence-based guidelines for placebo use in Australia."

Co-author Dr Kate Faasse, from the School of Psychology at UNSW, said the rates of placebo use by Australian GPs found in the study were very much in line with international research.

"Now we need more focus on understanding the role of psychological and social factors in physical health outcomes," Dr Faasse said. "There is so much more than just the active ingredients of a medicine, for example, that can help to improve people's health."

"In terms of future research, I think the possibility that we - either as individuals, or in medical contexts - can be harnessing the placebo effect in our own lives by knowingly using 'open-label' placebos is fascinating," Dr Fasse said. "Figuring out the best way to do this, for example what information helps open-label placebos be most effective, in what dose, and for what outcomes, are really fascinating research questions that we're starting to explore."

What is a placebo?

A placebo is a treatment that works because the patient expects it to. Placebos have been found to produce genuine therapeutic benefit in conditions ranging from pain, nausea and sleep problems to hypertension, immune function and even Parkinson's disease.

An inert placebo treatment is one that has no active ingredients whatsoever, such as a sugar pill, saline nasal spray or a water-based cream. An active placebo treatment is one that contains active ingredients - such as an antibiotic or cough mixture - but is unlikely to have a specific physiological effect on the patient's current condition.

What is the placebo effect?

The placebo effect occurs when the patient believes a treatment will help them feel better. These beliefs trigger changes in the central nervous system - such as the release of neurotransmitters in the brain - that actually cause improvement. Usually, placebos involve deception; that is, the patient is led to believe they are receiving an active treatment. However, recent studies have shown that the placebo effect can occur even when the patient knows they are receiving a placebo. The vast majority of Australian GPs - more than 80 per cent - believe that giving a placebo openly, without deception, is ethical.

Credit: 
University of Sydney

The impact of molecular rotation on a peculiar isotope effect on water hydrogen bonds

image: Desorption of water isotopomers (H2O, HDO and D2O) from surfaces of isotope-mixed ice with various H/D compositions.

Image: 
NINS/IMS

The physicochemical and biological properties of hydrogen-bonded systems are significantly affected by nuclear quantum effects, including the zero-point energies of vibrational modes, proton delocalization, and tunneling. These originate from the extremely low nuclear mass of hydrogen; thus, hydrogen-bonded systems show remarkable isotope effects upon deuteration. In the 1930s, Ubbelohde first proposed that deuteration elongates and weakens hydrogen bonds in many hydrogen-bonded systems. Ever since, such an isotope effect has been widely confirmed and is nowadays well known as the "Ubbelohde effect." In contrast, deuterating water molecules in liquid water and ice elongates but strengthens hydrogen bonds. Despite intensive experimental and theoretical studies over more than three-quarters of a century, the molecular-level origin of this peculiar isotope effect on water hydrogen bonds has remained unclear.
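The mass dependence behind these effects can be seen in a standard textbook relation (shown here for orientation; it is not the paper's derivation). In the harmonic approximation, each mode of force constant k and reduced mass μ retains a zero-point energy even at absolute zero, and deuteration, which roughly doubles the effective mass of hydrogen motions, lowers that energy by a factor of about 1/√2:

```latex
% Harmonic-approximation zero-point energy: deuteration roughly doubles the
% effective mass of hydrogen motions, lowering the mode frequency and hence
% the zero-point energy by a factor of about 1/sqrt(2).
\[
  E_{\mathrm{ZPE}} = \tfrac{1}{2}\hbar\omega,
  \qquad
  \omega = \sqrt{k/\mu}
  \quad\Longrightarrow\quad
  \frac{E_{\mathrm{ZPE}}^{\mathrm{D}}}{E_{\mathrm{ZPE}}^{\mathrm{H}}}
  = \sqrt{\frac{\mu_{\mathrm{H}}}{\mu_{\mathrm{D}}}} \approx \frac{1}{\sqrt{2}}
\]
```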

Very recently, researchers led by Toshiki Sugimoto, Associate Professor at the Institute for Molecular Science, tackled the longstanding mystery: how do more expanded D2O aggregates form stronger hydrogen bonds than H2O aggregates, in contrast to hydrogen-bonded systems composed of bulky constituent molecules? By means of isotope-selective measurements of sublimation from isotope-mixed ice with various H/D compositions, the researchers made a discovery that unravels the mystery: the isotope effect on the strength of hydrogen bonds is governed by two deuteration effects: (1) a bond-strengthening effect derived from the zero-point energy of hindered rotational motion, and (2) a bond-weakening (and elongating) effect derived from quantum anharmonic coupling between inter- and intramolecular modes.

The most important concept is that deuteration effect (1), derived from rotational motion, plays a crucial role in the bond-breaking process of extremely small and light molecules. In the case of water aggregates, the huge isotopic difference in the zero-point energy of hindered rotation makes the bond-strengthening effect (1) overwhelm the bond-weakening effect (2), leading to the unique isotope effect: deuterated water molecules form longer but stronger hydrogen bonds than hydrogenated water molecules.

In contrast, in other typical hydrogen-bonded systems composed of larger and heavier constituent molecules - such as oxalic acid dihydrate, benzoic acid, succinic acid, and cyclohexane/Rh(111) - the isotopic differences in the zero-point energy of hindered rotation are negligibly small. Therefore, only the bond-weakening effect (2) contributes appreciably to the isotope effect on their binding energy, resulting in longer and weaker hydrogen bonds in deuterated systems than in hydrogenated ones. Thus, the isotopic differences in the strength of hydrogen bonds are determined by a delicate balance between the two competing deuteration effects (1) and (2), while those in hydrogen-bond length, i.e. the geometrical isotope effect, are basically dominated by deuteration effect (2). "These results and concepts provide a new basis for our fundamental understanding of the highly quantum water hydrogen bonds," says Sugimoto.
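In schematic terms (notation introduced here for illustration only, not taken from the paper), the net change in binding energy upon deuteration is the difference of the two competing contributions, and its sign decides whether deuteration strengthens or weakens the hydrogen bond:

```latex
% Delta E_(1): strengthening from rotational zero-point energy (dominant for
% small, light molecules like water); Delta E_(2): weakening from anharmonic
% inter-/intramolecular coupling (dominant for bulky molecules).
\[
  \Delta E_{\mathrm{bind}}^{\mathrm{H \to D}}
  \;=\;
  \Delta E_{(1)}^{\text{strengthen}} \;-\; \Delta E_{(2)}^{\text{weaken}},
  \qquad
  \Delta E_{\mathrm{bind}} > 0 \;\text{for water},
  \quad
  \Delta E_{\mathrm{bind}} < 0 \;\text{for bulky systems}
\]
```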

Credit: 
National Institutes of Natural Sciences

Oat pathogen defence discovery marks an important milestone

image: Researchers have identified the critical last pieces of a genetic defence system that gives oats resistance to soil pathogens.

Image: 
Andrew Davis

Researchers have identified the critical last pieces of a genetic defence system that gives oats resistance to soil pathogens.

The discovery opens significant opportunities for scientists and breeders to introduce versions of this defence mechanism into other crops.

It is an important milestone in research into avenacins, defensive antimicrobial compounds produced in the roots of oats. These compounds were first identified more than 70 years ago and belong to the triterpene glycoside family of natural products, which have diverse industrial and agricultural applications.

Avenacins give oats resistance to soil pathogens, including take-all, a notorious disease that causes major losses in wheat and barley.

Professor Anne Osbourn of the John Innes Centre and an author of the study said: "When we started 20 years ago, we didn't have any of the genetic pieces in this avenacin defence pathway. Now that we have found the last critical steps, we have the potential to engineer it into other crops.

"Wheat, other cereals and grasses are not good at making antimicrobial compounds. Oat, on the other hand, is a prime example of a super-fit plant that has not been extensively bred and has disease resistance which could benefit more cultivated crops."

Many plant natural products, such as avenacin, have sugars attached to them through a process called glycosylation. Glycosylation is important for the biological activity of natural products - in this case, the antifungal activity of avenacin.

In this study, which appears in the journal PNAS, researchers used a range of genomic analysis techniques to identify the enzymes that catalyse this process.

They discovered that the final sugar in this chain was added by an unexpected class of enzyme. Furthermore, the sugar was added in the cell vacuole, not in the cytosol, where most glycosylation steps in the avenacin pathway occur.

"This transglucosidase enzyme elucidated in this study belongs to a large family of enzymes not generally thought of as providing this function," explains first author Dr Anastasia Orme.

"Understanding the contribution of this family opens up a whole new sphere of carbohydrate biology that is relevant to natural products and hence to drugs and other valuable compounds."

The study was carried out in collaboration with Professor Bin Han as part of the Centre of Excellence for Plant and Microbial Science (CEPAMS), a partnership between the Chinese Academy of Sciences and the John Innes Centre.

One potential benefit of a greater understanding of this large enzyme family may be in the biosynthesis of traditional Chinese medicines, using resources such as the plant transient expression system technology developed by the John Innes Centre.

Credit: 
John Innes Centre