
Record-breaking laser link could help us test whether Einstein was right

image: UWA's rooftop observatory.

Image: 
ICRAR

Scientists from the International Centre for Radio Astronomy Research (ICRAR) and The University of Western Australia (UWA) have set a world record for the most stable transmission of a laser signal through the atmosphere.

In a study published today in the journal Nature Communications, Australian researchers teamed up with researchers from the French National Centre for Space Studies (CNES) and the French metrology lab Systèmes de Référence Temps-Espace (SYRTE) at Paris Observatory.

The team set the world record for the most stable laser transmission by combining the Aussies' 'phase stabilisation' technology with advanced self-guiding optical terminals.

Together, these technologies allowed laser signals to be sent from one point to another without interference from the atmosphere.

Lead author Benjamin Dix-Matthews, a PhD student at ICRAR and UWA, said the technique effectively eliminates atmospheric turbulence.

"We can correct for atmospheric turbulence in 3D, that is, left-right, up-down and, critically, along the line of flight," he said.

"It's as if the moving atmosphere has been removed and doesn't exist.

"It allows us to send highly-stable laser signals through the atmosphere while retaining the quality of the original signal."

The result is the world's most precise method for comparing the flow of time between two separate locations using a laser system transmitted through the atmosphere.

ICRAR-UWA senior researcher Dr Sascha Schediwy said the research has exciting applications.

"If you have one of these optical terminals on the ground and another on a satellite in space, then you can start to explore fundamental physics," he said.

"Everything from testing Einstein's theory of general relativity more precisely than ever before, to discovering if fundamental physical constants change over time."

The technology's precise measurements also have practical uses in Earth science and geophysics.

"For instance, this technology could improve satellite-based studies of how the water table changes over time, or to look for ore deposits underground," Dr Schediwy said.

There are further potential benefits for optical communications, an emerging field that uses light to carry information.

Optical communications can securely transmit data between satellites and Earth with much higher data rates than current radio communications.

"Our technology could help us increase the data rate from satellites to ground by orders of magnitude," Dr Schediwy said.

"The next generation of big data-gathering satellites would be able to get critical information to the ground faster."

The phase stabilisation technology behind the record-breaking link was originally developed to synchronise incoming signals for the Square Kilometre Array telescope.

The multi-billion-dollar telescope is set to be built in Western Australia and South Africa from 2021.

Credit: 
International Centre for Radio Astronomy Research

A quarter of known bee species haven't appeared in public records since the 1990s

image: This photo shows a giant Patagonian bumblebee (Bombus dahlbomii). Four decades ago, these bees were abundant in Chile and Argentina, but now they have become an uncommon sight.

Image: 
Eduardo E. Zattara

Researchers at the Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET) in Argentina have found that, since the 1990s, up to 25% of known bee species have stopped appearing in global records, despite a large increase in the number of records available. While this does not mean that these species are all extinct, it might indicate that they have become rare enough that no one is observing them in nature. The findings appear January 22 in the journal One Earth.

"With citizen science and the ability to share data, records are going up exponentially, but the number of species reported in these records is going down," says first author Eduardo Zattara (@ezattara), a biologist at the Pollination Ecology Group from the Institute for Research on Biodiversity and the Environment (CONICET-Universidad Nacional del Comahue). "It's not a bee cataclysm yet, but what we can say is that wild bees are not exactly thriving."

While there are many studies about declining bee populations, these are usually focused on a specific area or a specific type of bee. These researchers were interested in identifying more general, global trends in bee diversity.

"Figuring out which species are living where and how each population is doing using complex aggregated datasets can be very messy," says Zattara. "We wanted to ask a simpler question: what species have been recorded, anywhere in the world, in a given period?"

To find their answer, the researchers dove into the Global Biodiversity Information Facility (GBIF), an international network of databases, which contains over three centuries' worth of records from museums, universities, and private citizens, accounting for over 20,000 known bee species from around the world.

In addition to finding that a quarter of total bee species are no longer being recorded, the researchers observed that this decline is not evenly distributed among bee families. Records of halictid bees--the second most common family--have declined by 17% since the 1990s. Those for Melittidae--a much rarer family--have gone down by as much as 41%.

"It's important to remember that 'bee' doesn't just mean honeybees, even though honeybees are the most cultivated species," says Zattara. "Our society's footprint impacts wild bees as well, which provide ecosystem services we depend on."

While this study provides a close look at the global status of bee diversity, it is too general an analysis to make any certain claims about the current status of individual species.

"It's not really about how certain the numbers are here. It's more about the trend," says Zattara. "It's about confirming what's been shown to happen locally is going on globally. And also, about the fact that much better certainty will be achieved as more data are shared with public databases."

However, the researchers warn that this type of certainty may not come until it is too late to reverse the decline. Worse still, it may not be possible at all.

"Something is happening to the bees, and something needs to be done. We cannot wait until we have absolute certainty because we rarely get there in natural sciences," says Zattara. "The next step is prodding policymakers into action while we still have time. The bees cannot wait."

Credit: 
Cell Press

New variety of paintbrush lily developed by a novel plant tissue culture technique

image: The steps involved in generating triploid and hexaploid plants by endosperm culture (Photos: Arisa Nakano, Masahiro Mii, Yoichiro Hoshino).

Image: 
Arisa Nakano, Masahiro Mii, Yoichiro Hoshino

Scientists at Hokkaido University and Chiba University have simultaneously developed triploid and hexaploid varieties of Haemanthus albiflos by applying endosperm culture, thus extending the use of this technique.

In plants, the number of chromosome sets in cells (ploidy) affects a large number of desirable characteristics. In general, the greater the number of chromosome sets, the more likely the plant is to have larger flowers, larger fruits, greater disease resistance, and so on. Hence, particularly in agriculture and horticulture, the development of polyploid plants continues to receive much attention.

Scientists from Hokkaido University and Chiba University have successfully developed triploid (3 chromosome sets) and hexaploid (6 chromosome sets) plants of the ornamental plant Haemanthus albiflos, via plant tissue culture (PTC) techniques. In addition to increasing the ornamental value of this plant, this is one of the first studies that use "endosperm culture"--an application of PTC techniques--for non-cereal monocotyledonous plants. Their findings were published in the journal Plant Cell, Tissue and Organ Culture.

Triploid plants hold an unusual position among polyploid plants. Their most significant advantage is also their most significant disadvantage: because of the odd number of chromosome sets, the fruits are seedless, which boosts market value but also means that the plants can only be propagated by cuttings, not by seeds. This disadvantage can be overcome by generating hexaploid plants from triploid plants.

Triploid plants are found naturally, albeit in very small numbers. They can be produced by cross-breeding diploid (2 chromosome sets) and tetraploid (4 chromosome sets) plants, or by PTC techniques. The advantage of PTC over cross-breeding is that a wider variety of plants can potentially be generated over a shorter period of time. Additionally, it is far easier to convert triploid plants to hexaploid plants by PTC techniques.

The scientists isolated the endosperm of H. albiflos, a food reserve tissue inside the seeds that is naturally triploid. The endosperm was grown into a mass of cells called a callus using a PTC technique. A portion of this tissue was then directly subjected to another PTC technique, organogenesis, to generate triploid plantlets of H. albiflos. Another portion of the callus was first treated with colchicine before organogenesis. Colchicine is a chemical that causes a doubling of the number of chromosome sets; thus, during organogenesis, hexaploid plantlets are generated.
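The chromosome-set arithmetic behind these steps is simple enough to sketch. The helper functions below are purely illustrative bookkeeping, not code from the study:

```python
# Ploidy bookkeeping: each value counts chromosome sets (multiples of n).

def cross(parent_a: int, parent_b: int) -> int:
    """Offspring ploidy from a cross: each parent's gamete carries half its sets."""
    return parent_a // 2 + parent_b // 2

def colchicine_double(ploidy: int) -> int:
    """Colchicine blocks cell division after chromosome duplication,
    doubling the set count of the treated cells."""
    return 2 * ploidy

# Route 1: crossing a diploid (2n) with a tetraploid (4n) yields a triploid (3n).
triploid = cross(2, 4)                   # 1n + 2n = 3n
# Route 2 (this study): endosperm is naturally triploid, so culture starts at 3n.
hexaploid = colchicine_double(triploid)  # 3n -> 6n
print(triploid, hexaploid)               # 3 6
```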

The scientists chose H. albiflos for two reasons: it is a monocotyledonous plant (grasses, cereals, and their close relatives), and it is an ornamental plant. Historically, endosperm culture in monocots has focused on rice and barley, with very few examples in other plants. By achieving their goal, the scientists have not only extended the use of endosperm culture but have also developed a valuable variety of the ornamental paintbrush lily.

Yoichiro Hoshino is a Professor at the Field Science Center, Hokkaido University. His research focuses on plant breeding of horticultural crops by using biotechnology, and on analysis of the fertilization process in higher plants. He is interested in utilization of plant genetic resources in the Hokkaido area and in developing novel breeding methods.

Credit: 
Hokkaido University

Novel target identified that could improve safety of therapy for pancreatic cancer

Researchers from Queen Mary University of London have identified a protein that may represent a novel therapeutic target for the treatment of pancreatic cancer. Using this protein as a target, the team successfully created a CAR T cell therapy - a type of immunotherapy - that killed pancreatic cancer cells in a pre-clinical model.

CAR T cell therapy is an immunotherapy that has shown great promise for the treatment of some blood cancers; however, treating solid tumours with this therapy has proved very difficult. One barrier to success is toxicity in tissues other than the cancer: most of the proteins currently used to target CAR T cells to pancreatic cancer cells and other solid tumours are also present at low levels in normal tissues, leading to toxic side effects.

In this study, published today in Clinical Cancer Research and funded by the charity Pancreatic Cancer UK, the team identified a protein called CEACAM7 that may represent a safer treatment target for the development of therapies against pancreatic ductal adenocarcinoma (PDAC), the most common type of pancreatic cancer.

By using a specialised technique called immunostaining, the team examined a panel of human PDAC samples and normal tissues for the presence of CEACAM7. A large subset of PDAC samples tested expressed CEACAM7, but the protein was undetectable in a panel of normal tissues including tonsil, lung, liver, and prostate, suggesting that CEACAM7 may be an ideal target for CAR T cell development against pancreatic cancer.

To determine the potential of CEACAM7 as a treatment target, the team developed CAR T cells targeted to CEACAM7 and applied these to PDAC cell lines as well as a preclinical model of PDAC. The CAR T cells effectively targeted the CEACAM7-expressing cells in PDAC cell cultures, and eliminated cancer cells in a late-stage preclinical model of PDAC.

Professor John Marshall from Queen Mary University of London who led the study, said: "This is an exciting development. Finding that CEACAM7 allows us to kill pancreatic cancer cells specifically with CAR T cells while having no significant toxicity in non-tumour tissues, gives us hope that this strategy could be effective in the future. It is also possible that other types of immune-based therapies could be directed to CEACAM7 for the treatment of pancreatic cancer."

Dr Deepak Raj, postdoctoral researcher and first author of the study, said: "As CEACAM7 is a poorly studied protein thus far, we were excited to find that it appears to be a promising CAR T-cell target on pancreatic cancer. It would be important to assess a larger number of antibodies against CEACAM7, not only to generate and test a larger panel of CAR T cells that may have increased efficacy against pancreatic cancer, but also to more conclusively determine whether low levels of CEACAM7 are present in normal tissues."

How does CAR T cell therapy work?

CAR T cell therapy utilises immune cells (called killer T cells) from the patient's blood, which have a critical role in the immune response. Killer T cells are first isolated from the patient's blood and modified in the laboratory to express special protein receptors on their surface, called Chimeric Antigen Receptors (CAR), creating CAR T cells. The CAR protein allows the CAR T cells to recognise a specific protein on the surface of cancer cells. CAR T cells are multiplied in the laboratory and then re-injected into the patient, where they recognise and kill cancer cells that have the target protein on their surface.

In this study, the team made a new CAR using part of an antibody to CEACAM7 from collaborator Professor Brad Nelson (British Columbia, Canada). They then modified killer T cells to present this new CAR protein on their surface, where it recognises and binds to CEACAM7, directing the killer T cells to kill only cells carrying CEACAM7 - which appear to be only pancreatic cancer cells.

Challenges in the treatment of pancreatic cancer

Pancreatic ductal adenocarcinoma (PDAC) is the most common type of pancreatic cancer and has the lowest survival rate of all the common cancers, as only about 7% of those diagnosed with this cancer type in the UK survive their cancer for 5 years or more. Diagnosis often comes too late due to a lack of definitive symptoms, by which point surgery to remove the tumour - which offers the greatest chance of a cure - is not possible. There is an urgent requirement for new and more effective targeted therapies.

Chris Macdonald, Head of Research at Pancreatic Cancer UK said: "These findings are very encouraging and offer real hope that a new, innovative immunotherapy treatment for pancreatic cancer is on the horizon. For the first time a distinct and specific target protein for pancreatic cancer cells has been identified and, crucially, the brilliant team at Barts have shown that by focusing on it, they can destroy the cancer without damaging healthy tissue. This has never been done before in pancreatic cancer and marks an important step towards a desperately needed new treatment option, which could be both more effective and have fewer side-effects for patients.

Currently treatment options are limited and people affected by this devastating disease face incredibly low odds of survival. I look forward to seeing the results of targeting this protein in future clinical trials. I hope we'll see these findings, along with the other research funded by Pancreatic Cancer UK's Grand Challenge, benefit people with pancreatic cancer, the way we've seen new immunotherapy treatments benefit people with other types of cancer."

Credit: 
Queen Mary University of London

Reducing traps increases performance of organic photodetectors

image: Flexible organic photodetectors (OPDs) have a huge potential for applications in low-cost imaging, health monitoring and near infrared sensing.

Image: 
Christian Körner

Organic photodetectors (OPDs) have a huge potential for applications in low-cost imaging, health monitoring and near infrared sensing. Yet, before these applications can be realized industrially, the performance of these devices still needs to be improved.

Recent research on organic photodetectors based on donor-acceptor systems has resulted in narrow-band, flexible and biocompatible devices, the best of which reach external quantum efficiencies of close to 100%. However, the high noise these devices produce in the off state limits their specific detectivity, severely reducing their performance when measuring faint light, for example.
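The figure of merit here, specific detectivity, is commonly written D* = R·sqrt(A·Δf)/i_noise, with responsivity R, device area A, bandwidth Δf and noise current i_noise. A minimal sketch with invented numbers (not values from the study) shows how off-state noise directly caps D*:

```python
import math

def specific_detectivity(responsivity_a_per_w, area_cm2, bandwidth_hz, noise_current_a):
    """D* = R * sqrt(A * df) / i_noise, in Jones (cm * Hz^0.5 / W)."""
    return responsivity_a_per_w * math.sqrt(area_cm2 * bandwidth_hz) / noise_current_a

# Two hypothetical OPDs: identical optics, but one has 10x more off-state noise.
quiet = specific_detectivity(0.4, 0.01, 1.0, 1e-13)
noisy = specific_detectivity(0.4, 0.01, 1.0, 1e-12)
print(noisy < quiet)  # True: every factor of 10 in noise costs a factor of 10 in D*
```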

Jonas Kublitski and his colleagues at the Dresden Integrated Center for Applied Physics and Photonic Materials (IAPP) and the Institute of Applied Physics (IAP) at TU Dresden have now found that the high noise in the off state is a consequence of unwanted trap states distributed near the mid-gap of organic semiconductors. By measuring the density of these traps, the physicists drew a direct correlation between the characteristics of the trap states and the off-state behaviour of OPDs.

Building on these results, Mr. Kublitski was able to develop a model describing this relationship:

"By modelling the dark current of several donor-acceptor systems, we reveal the interplay between traps and charge-transfer states as a source of dark current and show that traps dominate the generation processes, thus being the main limiting factor of OPD detectivity.

"The newly discovered relation not only clarifies the operation of OPDs but also gives guidance for further research in the field. This work is the result of four years of research during my Ph.D. I am very happy to share these results, as they can refocus the attention of our field on understanding the origin of the limited performance of OPDs, which was so far unknown."

Credit: 
Technische Universität Dresden

Shift in caribou movements may be tied to human activity

image: University of Cincinnati assistant professor Joshua Miller holds a shed antler he collected from the Arctic National Wildlife Refuge to study caribou.

Image: 
Colleen Kelley/UC Creative

Human activities might have shifted the movement of caribou in and near the Arctic National Wildlife Refuge, according to scientists with the University of Cincinnati.

Each year caribou take on one of nature's longest land migrations, trekking hundreds of miles across Alaska and Canada to find food and give birth in their preferred calving grounds.

A UC study published today in the journal Frontiers in Ecology and Evolution identified a shift in one herd's movements after the 1970s that coincided with changes in herd size and climate, and the construction of new roads and other energy infrastructure.

Researchers used isotope analysis of antlers shed by female caribou to track their historical patterns of movement over the landscape. Female caribou are unique among deer for growing and shedding antlers each year like males.

The study is timely given the auction this year of oil and gas leases in the Arctic National Wildlife Refuge. Indigenous Gwich'in opposed the leases, arguing development could disrupt the migration of caribou they depend on for sustenance.

An international team of researchers led by UC geologist Joshua Miller focused on the antlers of female caribou, which are shed within a few days of giving birth each spring. The location where antlers drop marks their spring calving grounds.

Caribou then spend the summer growing a new pair of antlers.

Miller and his collaborators found that analyses of isotopes from the antlers could not only distinguish one caribou herd from another but also identify changes in their summer range over time.

Miller, an assistant professor in UC's College of Arts and Sciences, traveled extensively across the Arctic during five expeditions with collaborators from the U.S. Fish and Wildlife Service. Using inflatable boats, the team navigated rivers, avoiding bears and enduring mosquitoes to collect caribou antlers across the Arctic National Wildlife Refuge in northeast Alaska.

"It's one of the most remote places on the planet," Miller said. "So it poses all sorts of logistical challenges. It is a real adventure."

The refuge is home to grizzly and polar bears, musk ox and hundreds of thousands of caribou found in different herds. Caribou are an important staple food for Indigenous Alaskans who seasonally hunt them.

Two populations of caribou are found in the Arctic Refuge: the Central Arctic herd and the Porcupine caribou herd, which is named for the Porcupine River that flows in the heart of its range. While caribou numbers can fluctuate year to year, the Porcupine herd is home to about 200,000 caribou. The Central Arctic herd has approximately 60,000 more, though its numbers may be declining.

The collected antlers were shipped back to Miller's UC geology lab, where researchers, including UC graduate student Abigail Kelly, prepared them for isotopic analysis.

Strontium, which is found virtually everywhere on Earth, is absorbed up the food chain through plants that caribou and other herbivores eat. Strontium exists as different isotopes, whose relative abundance varies with the local geology, leaving an isotopic fingerprint. By comparing the ratios of strontium-87 and strontium-86, researchers could track where the antlers were grown.

Since new female antlers grow in just a few months each year, they make an ideal time capsule to identify where a caribou has been feeding.
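In code, the matching step amounts to comparing each antler's measured ratio against regional baselines. The region names and ratios below are invented for illustration; the study's actual isotope mapping is far more involved:

```python
# Hypothetical regional 87Sr/86Sr baselines - illustrative values only.
BASELINES = {
    "coastal_plain": 0.7092,
    "foothills": 0.7120,
    "interior_uplands": 0.7180,
}

def assign_region(sr_ratio: float) -> str:
    """Assign an antler to the region whose baseline ratio it matches most closely."""
    return min(BASELINES, key=lambda region: abs(BASELINES[region] - sr_ratio))

# An antler records the ratio of the plants eaten during the summer it grew.
print(assign_region(0.7125))  # foothills
```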

Miller said finding the antlers is straightforward on the flatter terraces of the Arctic Refuge away from the thick grass tussocks.

"On these flat areas, they can be everywhere - more than 1,000 antlers per square kilometer. In some places, you can find one every several steps," he said.

Some of the antlers had been lying on the tundra for hundreds of years. One was dated to the 1300s.

Researchers observed a shift in summer movements among caribou in the Central Arctic herd before and after the 1970s. This coincides with three factors known to alter caribou migration: population growth, climate change and increased human disturbances to their summer and calving ranges.

Human development in the 1970s included oil field expansion and construction of the Trans-Alaska Pipeline. Previous studies have found that pregnant caribou avoid pipelines and roads while calves born in the vicinity of roads and other development are underweight compared to those living farther from human development.

"An important future area of research will be to test this shift in preferred summer landscapes using an expanded sampling of antlers shed across each herd's calving grounds," researchers said.

Scientists only began studying caribou migration using radio telemetry in the 1970s and '80s, Miller said. With antlers, it is possible to track historical caribou landscape use long before that.

"The question is, how can we evaluate the effects of human impacts given that we only recently started paying close attention? Antlers provide opportunities to look at the past and fill in some of these gaps in our knowledge," he said.

"One thing we know about caribou is they often avoid human-modified landscapes: pipelines, roads, tourism lodges," Miller said. "They are surprisingly sensitive to these changes."

UC associate professor Brooke Crowley, a study co-author, has employed similar methods to identify critical hunting areas for endangered goshawks in Madagascar, track endangered jaguars in Belize and even follow the migrations of long-extinct animals like mammoths and mastodons.

"Strontium isotopes allow researchers to understand mobility of animals on temporal and spatial scales that complement other conservation tools," Crowley said. "It is particularly valuable to be able to reconstruct what a species or population did in the past because then we have some baseline data that we can compare to modern trends."

The other co-authors included Clément Bataille from the University of Ottawa, Eric Wald from the U.S. Fish and Wildlife Service, Volker Bahn from Wright State University and Patrick Druckenmiller from the University of Alaska Fairbanks.

UC doctoral student and paper co-author Madison Gaetano said tools developed by paleontologists to study long-extinct animals are helping researchers answer pressing questions about wildlife conservation.

"Bones lying on modern landscapes accumulate over many generations and record data that are applicable to a myriad of questions about the evolution and ecology of animals and their ecosystems," Gaetano said. "Our role is to develop methods to access, interpret and apply this information, which I think is nicely demonstrated by this research."

Credit: 
University of Cincinnati

UMD researcher expands plant genome editing with newly engineered variant of CRISPR-Cas9

Yiping Qi, associate professor in Plant Science at the University of Maryland (UMD), has been named to the Web of Science 2020 list of Highly Cited Researchers for the first time, representing the College of Agriculture & Natural Resources alongside Dennis vanEngelsdorp, associate professor in Entomology, who was named for the fifth year in a row for his work in honey bee and pollinator health. The list recognizes influential scientists based on the impact of their academic publications over the course of the year. In addition to this honor, Qi is already making waves in 2021 with a new high-profile publication in Nature Plants introducing SpRY, a newly engineered variant of the famed gene editing tool CRISPR-Cas9. SpRY essentially removes the barriers to what can and can't be targeted for gene editing, making it possible for the first time to target nearly any genomic sequence in plants for potential mutation. This discovery, the latest in Qi's long string of influential tools for genome editing in plants, cements his standing as a preeminent innovator in the field.

"It is an honor, an encouragement, and a recognition of my contribution to the science community," says Qi of his distinction as a 2020 Web of Science Highly Cited Researcher. "But we are not just making contributions to the academic literature. In my lab, we are constantly pushing new tools for improved gene editing out to scientists to make an impact."

With SpRY, Qi is especially excited for the limitless possibilities it opens up for genome editing in plants and crops. "We have largely overcome the major bottleneck in plant genome editing, which is the targeting scope restrictions associated with CRISPR-Cas9. With this new toolbox, we pretty much removed this restriction, and we can target almost anywhere in the plant genome."

The original CRISPR-Cas9 tool that kicked off the gene editing craze was tied to targeting a specific short sequence of DNA known as a PAM sequence. This short sequence is what CRISPR systems typically use to identify where to make their molecular cuts in DNA. However, the new SpRY variant introduced by Qi can move beyond these traditional PAM sequences in ways that were never possible before.
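To make the restriction concrete: classic SpCas9 only cuts where a 20-letter target is immediately followed by an "NGG" PAM, while SpRY is near-PAMless, accepting relaxed "NRN" PAMs (R = A or G) and, less efficiently, "NYN". A toy sketch (made-up sequence; real guide design weighs many more factors) counts candidate sites on one strand:

```python
def pam_sites(seq: str, pam_ok) -> int:
    """Count positions where a 20-nt protospacer is followed by an acceptable 3-nt PAM."""
    count = 0
    for i in range(len(seq) - 22):       # 20-nt protospacer + 3-nt PAM must fit
        pam = seq[i + 20 : i + 23]
        if pam_ok(pam):
            count += 1
    return count

ngg = lambda pam: pam[1:] == "GG"        # classic SpCas9 requirement
nrn = lambda pam: pam[1] in "AG"         # relaxed SpRY preference (R = A or G)

seq = "ATGCATTTACGTACGTTAACGGATATATATGGGTACGATCAGATTTAAACGT"
print(pam_sites(seq, ngg), pam_sites(seq, nrn))  # 2 18 - SpRY reaches far more sites
```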

"This unleashes the full potential of CRISPR-Cas9 genome editing for plant genetics and crop improvement," says an excited Qi. "Researchers will now be able to edit anywhere within their favorable genes, without questioning whether the sites are editable or not. The new tools make genome editing more powerful, more accessible, and more versatile so that many of the editing outcomes which were previously hard to achieve can now be all realized."

According to Qi, this will have a major impact on translational research in the gene editing field, as well as on crop breeding as a whole. "This new CRISPR-Cas9 technology will play an important role in food security, nutrition, and safety. CRISPR tools are already widely used for introducing tailored mutations into crops for enhanced yield, nutrition, biotic and abiotic stress resistance, and more. With this new tool in the toolbox, we can speed up evolution and the agricultural revolution. I expect many plant biologists and breeders will use the toolbox in different crops. The list of potential applications of this new toolbox is endless."

Credit: 
University of Maryland

Experts call for more pragmatic approach to higher education teaching

image: A university lecture. Could the standard of teaching be improved if educators took a different approach?

Image: 
Swansea University

Millions of students around the world could benefit if their educators adopted a more flexible and practical approach, say Swansea University experts.

After analysing the techniques currently being used in higher education, the researchers are calling for a pragmatic and evidence-based approach instead.

Professor Phil Newton, director of learning and teaching at Swansea University Medical School, said: "Higher education is how we train those who carry out important professional roles in our society. There are now more than 200 million students in HE worldwide and this number is likely to double again over the next decade.

"Given the size, impact, importance and cost of HE, it would be reasonable to assume that policies and practices employed are the best available, based upon rigorous evidence. However, this does not appear to be the case."

In a new paper, Professor Newton, Dr Ana Da Silva and Sam Berry argue that the findings of higher education research are not being used to develop and benefit educational practice.

They say beliefs in ineffective methods such as Learning Styles persist, that teaching quality and teacher performance are measured using subjective and potentially biased feedback, and that university educators have limited access to professional development.

Instead, the academics are proposing a pragmatic model of evidence-based higher education which they say could deliver results that are more obviously useful, focusing on practical teaching skills.

Prof Newton added: "The model is intended for educators and policymakers, to help them make the best use of existing education research evidence when making contextual decisions about local practice.

"It can also be used by learners to make decisions about how, when, why and what to study, and for the teaching of study skills to learners."

The model and how it can be applied in education settings is detailed in their research which has been published in the journal Frontiers in Education.

However, they say any decisions made using the model would need to be reviewed regularly, as the evidence base updates and the context shifts.

The need for this flexibility - and the benefits of adopting a pragmatic approach - have been highlighted by the pandemic, which has led the global HE sector to embrace online learning.

Prof Newton said: "There is an abundant evidence-base regarding learning online and at a distance, but much of this was developed to optimize learning under planned circumstances where students could choose to learn online, or in a structured blended way, very different to the situation we find ourselves in now.

"A pragmatic application of the existing evidence to the new context can help us with this rapid change and help us plan for what might become a 'new normal'."

Among the paper's recommendations, the researchers are calling for:

Faculty development programmes and credentials for HE educators to be practical and skills based;

Establishing pragmatic practical evidence summaries for use across international HE, allowing adjustment for context;

More syntheses of existing primary research that answer useful questions, such as: what works, for whom, in what circumstances, and why? How much does it cost, what is it compared to, and how practical is it to implement?

Increased funding for research into the effectiveness of learning and teaching approaches in HE.

Prof Newton added: "There is an abundance of academic literature on higher education, stretching back decades. We owe it to all involved in education to ensure that this can best inform innovation and improvement, in a way that allows for professional judgement and a consideration of context.

"This could be achieved by adopting principles of pragmatic, evidence-based higher education."

Credit: 
Swansea University

Crystal structures in super slow motion

image: At the heart of the imaging technique is a complex array of 72 circular apertures

Image: 
Dr Murat Sivis

Laser beams can be used to change the properties of materials in an extremely precise way. This principle is already widely used in technologies such as rewritable DVDs. However, the underlying processes generally take place at such unimaginably fast speeds and at such a small scale that they have so far eluded direct observation. Researchers at the University of Göttingen and the Max Planck Institute (MPI) for Biophysical Chemistry in Göttingen have now managed to film, for the first time, the laser transformation of a crystal structure with nanometre resolution and in slow motion in an electron microscope. The results have been published in the journal Science.

The team, which includes Thomas Danz and Professor Claus Ropers, took advantage of an unusual property of a material made up of atomically thin layers of sulphur and tantalum atoms. At room temperature, its crystal structure is distorted into tiny wavelike structures - a "charge-density wave" is formed. At higher temperatures, a phase transition occurs in which the original microscopic waves suddenly disappear. The electrical conductivity also changes drastically, an interesting effect for nano-electronics.

In their experiments, the researchers induced this phase transition with short laser pulses and recorded a film of the charge-density wave reaction. "What we observe is the rapid formation and growth of tiny regions where the material was switched to the next phase," explains first author Thomas Danz from Göttingen University. "The Ultrafast Transmission Electron Microscope developed in Göttingen offers the highest time resolution for such imaging in the world today." The special feature of the experiment lies in a newly developed imaging technique, which is particularly sensitive to the specific changes observed in this phase transition. The Göttingen physicists use it to take images that are composed exclusively of electrons that have been scattered by the crystal's waviness.

Their cutting-edge approach allows the researchers to gain fundamental insights into light-induced structural changes. "We are already in a position to transfer our imaging technique to other crystal structures," says Professor Claus Ropers, leader of Nano-Optics and Ultrafast Dynamics at Göttingen University and Director at the MPI for Biophysical Chemistry. "In this way, we not only answer fundamental questions in solid-state physics, but also open up new perspectives for optically switchable materials in future, intelligent nano-electronics."

Credit: 
University of Göttingen

UTMB team proves potential for reducing pre-term birth by treating fetus as patient

GALVESTON, Texas - The results of a study by researchers at the University of Texas Medical Branch may pave the way for a new medicine delivery system that could reduce the incidence of pre-term labor and premature birth by allowing physicians to treat the 'fetus as the patient'. The study has been published in Science Advances.

It has long been suspected that pre-term labor is triggered by inflammation caused by a sick fetus. A new study by scientists at UTMB has proved the hypothesis by studying several important assumptions about the relationship between the health of a mother and her unborn child.

According to Dr. Ramkumar Menon, a Professor in UTMB's Department of Obstetrics and Gynecology and Cell Biology, his team worked with ILIAS Biologics, Inc., a South Korean biotechnology company, to test their bioengineered exosomes as a delivery system for anti-inflammatory medicine directly to the fetus.

"Exosomes are natural nanoparticles or vesicles in our bodies, and we have trillions of them circulating through us at all times. By packaging the medicine inside a bioengineered exosome and injecting it into the mother intravenously, the exosomes travel through the blood system, cross the placental barrier and arrive in the fetus, where they deliver the medicine," explains Dr. Menon.

In laboratory tests with mice, there were several steps prior to testing the drug delivery. First, Menon said, it was important to prove that fetal cells, specifically immune cells, actually migrate through the mother's body to her uterine tissues, where they can cause inflammation, the leading cause of pre-term labor.

To prove migration of cells, female mice were mated with male mice that had been genetically engineered with a red fluorescent dye called tdTomato. The dye causes cells in the male to turn red, so once mating has occurred, cells in the developing fetus also turn red and can easily be tracked as they migrate through the mother. This model was developed by Dr Sheller-Miller, a post-doctoral fellow in the Menon lab and first author of the report. The development of this model, which showed fetal immune cells reaching maternal tissues, was a turning point in this research.

Once scientists had proof of cell migration, they next used the mouse model to determine if bioengineered exosomes could deliver a special anti-inflammatory medicine, an inhibitor of NF-kB, called super repressor (SR) IkB from the mother's bloodstream to the fetus.

The exosomes were created using an innovative approach developed by ILIAS Biologics, Inc. called EXPLOR®, or EXosomes engineering for Protein Loading via Optically Reversible protein-protein interaction.
The study proved that the exosomes effectively delivered medicine to the fetus, slowed the migration of fetal immune cells, and delayed pre-term labor.

In addition, the study found that:
* Sustained effects/delays in labor required repeated dosing
* Prolongation of gestation improved pup viability
* Mouse models provided valuable information to help understand the mechanisms often seen in humans
* Future studies, including human clinical trials are needed to confirm laboratory results

"Pre-term birth rates have not reduced in the past few decades, and this technology (the bioengineered exosomes) could lead the way to other treatments for the delivery of drugs to treat the underlying cause of inflammation in a fetus," said Dr. Menon. This technology can also be used to package other drugs in exosomes to treat other adverse pregnancy complications.

This study is the second proof of concept suggesting significant anti-inflammatory effects of these exosomes from ILIAS Biologics. In April 2020, researchers at the Korea Advanced Institute of Science and Technology (KAIST) and the ILIAS team published results in Science Advances showing the same exosomes' substantial efficacy in a septic mouse model. (Link: https://advances.sciencemag.org/content/6/15/eaaz6980)

Credit: 
University of Texas Medical Branch at Galveston

Highly functional membrane developed for producing freshwater from seawater

image: Scanning Electron Microscope image of the nanosheet-laminated membrane developed through this research.

Image: 
Membrane Engineering Group, Kobe University

Professor MATSUYAMA Hideto's research group at Kobe University's Research Center for Membrane and Film Technology has successfully developed a new desalination membrane. They achieved this by laminating a two-dimensional carbon material (*1) on to the surface of a porous polymer membrane (*2).

Desalination (*3) membranes are used to produce freshwater from seawater. In order to solve the worldwide issue of insufficient freshwater resources, researchers are striving to develop desalination membranes that are not only permeated by water faster than those currently in use but also remove salt efficiently, so that more effective, low-energy desalination systems can be implemented.

In this research study, graphene oxide (*4) nanosheets, which are a type of two-dimensional nanomaterial, were stacked upon the surface of a porous membrane after being given a chemical reduction treatment (*5), enabling a desalination membrane layer of approximately 50 nanometers (nm) to be developed (50 nm is 1/20,000th of a millimeter). The developed membrane has the potential to perform highly efficient desalination because it is possible to control the gaps between its nanosheets and the charge on the nanosheets' surfaces. It is hoped that this research will contribute towards the application and implementation of futuristic desalination membranes.

These research results were published in Journal of Materials Chemistry A on November 18, 2020.

Main Points

The researchers successfully developed a new desalination membrane using two-dimensional nanosheets.

The chemical reduction treatment of the graphene oxide nanosheets strengthened π-π stacking (*6) between the nanosheets.

The π-π stacking improved the stability of the nanosheet-laminated membrane and made it possible to manipulate the interlayer gap between each nanosheet.

Porphyrin-based planar molecules (*7) with charged groups and a conjugated π system (*8) were introduced between the nanosheets. This resulted in electrostatic repulsion (*9) between the graphene oxide and the planar compound's negative charge, enabling the researchers to control the anions' (*10) movement within the nanochannels (*11).

The nanosheet-laminated membrane developed through this research was able to reject sodium chloride (NaCl) permeation by 95%. In the future, these research results can contribute towards the creation of new, high performance membrane technologies for desalination.

Research Background

97.5% of the water on Earth is seawater and only 2.5% is freshwater. Of that freshwater, a mere 0.01% can be easily treated for use by humankind. Meanwhile, the human population continues to increase every year. Consequently, it has been predicted that in several years' time, two thirds of the world's population will have insufficient access to freshwater. A worldwide water shortage is one of the gravest issues facing humankind. Therefore, technologies that can obtain the necessary resources by converting the Earth's abundant seawater into freshwater are paramount.

Evaporation methods have been used to convert seawater to freshwater; however, they require large amounts of energy to evaporate the seawater and remove the salt (desalination). Membrane separation methods, on the other hand, provide a low-energy alternative: they enable freshwater to be produced by filtering water out of seawater and removing the salt. Membrane-based desalination has already been implemented, but with the desalination membranes developed so far there is always a trade-off between permeation speed and desalination ability. It is therefore vital to develop a revolutionary desalination membrane from new materials in order to resolve this trade-off and make it possible to desalinate seawater at a higher rate of efficiency.

Research Methodology

This research team developed a highly functional desalination membrane by laminating the membrane with a two-dimensional carbon material of the approximate thickness of a carbon atom. These 2D carbon materials were graphene oxide nanosheets that were chemically reduced to give them strengthened π-π interaction.

By applying nanosheet coatings with intercalation of porphyrin-based planar molecules (with charged groups and a conjugated π system) to the surface of a porous membrane, the research group was able to construct an ultrathin desalination membrane layer approximately 50nm thick (Figure 1).

This layer demonstrated high ion-blocking functionality because the size of the nanochannels (the gaps between each nanosheet) could be controlled to within 1 nm. Furthermore, the nanochannels in the nanosheet-laminated membrane remained stable in water thanks to the strong π-π stacking between the sheets, suggesting that the membrane could be utilized over long periods. In addition, there was no loss of desalination functionality even under a pressure of 20 bar.

The researchers revealed that the transfer of ions inside the developed nanosheet-laminated membrane was effectively suppressed by electrostatic repulsion on the nanosheet surface (Figure 2). This electrostatic repulsion was highly effective when the width of the nanochannels was appropriately controlled. For the nanosheet material used in this study, the width of the nanochannels could be confined by controlling the chemical reduction process and the intercalation ratio of porphyrin-based planar molecules.

NaCl is the main component of seawater ions and it is particularly difficult to prevent it from permeating the membrane. However, a nanosheet-laminated membrane produced under optimal conditions was able to block around 95% of NaCl.
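The ~95% figure can be read through the conventional definition of salt rejection, R = 1 - C_permeate / C_feed. A minimal sketch of the arithmetic, assuming a typical seawater feed salinity of about 35 g/L (a value not given in the article):

```python
# Salt rejection is conventionally defined as R = 1 - C_permeate / C_feed.
# The 35 g/L feed salinity below is a typical seawater value assumed for
# illustration; the article only reports the ~95% rejection figure.
feed_g_per_L = 35.0
rejection = 0.95

# Permeate concentration implied by the rejection figure.
permeate_g_per_L = feed_g_per_L * (1 - rejection)
print(f"permeate salinity = {permeate_g_per_L:.2f} g/L")
```

Under that assumed feed, 95% rejection corresponds to a permeate of roughly 1.75 g/L NaCl, still above drinking-water standards, which is why rejection rates are pushed as high as possible.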

Further Developments

The 2D nanosheet-laminated membrane developed through this research was produced by regulating the reduction of the oxidized graphene sheet and the intercalation ratio of planar molecules, which in turn enabled both the interlayer space between the nanosheets and the electrostatic repulsion effect to be controlled. In addition to desalination membranes, this technique can also be applied to the development of various electrolyte separation membranes.

Low-energy desalination technologies using separation membranes are indispensable for reducing water shortages. It is hoped that the technology will contribute towards resolving the issue of water resources drying up worldwide. Next, the research team will try to further improve the developed membrane's high functionality, so that it can be implemented.

Credit: 
Kobe University

Scientists discover how the potentially oldest coral reefs in the Mediterranean developed

image: In the present-day, the cold-water coral reefs of the Mediterranean are mainly formed by the species Madrepora oculata and partly Desmophyllum pertusum / Cabliers reef - SHAKE Cruise.

Image: 
ICM-CSIC, University of Southampton

A new study from the Institut de Ciències del Mar (ICM-CSIC, Spain) and the National Oceanography Centre brings unprecedented insights into the environmental constraints and climatic events that controlled the formation of these reefs.

The results of this research will help understand how cold-water coral reefs can react to the effects caused by the present-day climate change.

Similar to tropical coral reefs, cold-water coral reefs are incredible hotspots of biodiversity, with the difference that they do not rely on symbiosis with microscopic algae and can therefore be found in the dark and deep waters of our oceans. Despite their uniqueness and key functional role in the ocean, they remain partially unknown ecosystems that still lack thorough procedures to protect them from human-derived disturbances. In fact, they are considered vulnerable marine ecosystems by the United Nations, the OSPAR Commission and the General Fisheries Commission for the Mediterranean.

Now, an international team of scientists from the Institut de Ciències del Mar (ICM-CSIC) and the NOC has studied for the first time the main drivers that control the development of cold-water reefs in the Western Mediterranean during the last 400,000 years. In these reefs, the deeper you go, the older the corals will be, since new generations grow on top of the previous ones. The results of this research are collected in a paper published recently in the journal Quaternary Science Reviews and bring unprecedented insights into the environmental constraints and climatic events that controlled the cyclic development of these reefs.

To carry out the study, researchers made use of Laser Ablation U-series dating, a new technique consisting of ablating and ionising samples with an inductively coupled plasma mass spectrometer to determine the age of 110 cold-water coral skeletons. Combined with other analyses, these allowed them to describe when the main periods of reef formation occurred and which were the main environmental drivers of coral reef formation in this region.

According to this work, cold-water corals have been growing almost continuously in the Mediterranean for the last 400,000 years, even before the appearance of the first Neanderthals. Nonetheless, the reefs might have started to form much earlier, as only the shallowest 10 m of the entire 80-90 m reef height could be described in this study.

The analyses of the semi-fossil corals acquired showed that coral growth and reef formation was affected by major changes in climate over this time period. "Climate swings associated with ice ages, such as changes in sea surface productivity and sea-level variations appear to be the main factors controlling the development of these cold-water coral reefs", explains Guillem Corbera, PhD student from NOC and the University of Southampton.

"In addition, intense and prolonged monsoon events that mainly affected the Eastern Mediterranean Sea had a detrimental impact for the development of these reefs, located 1000s of kilometers away in the Westernmost Mediterranean", adds Corbera.

"Throughout the last 400,000 years, depending on the climate conditions, different species of corals dominated these reefs, which created impressive geo-forms in the deep ocean. This research helps us understand how cold-water coral reefs can react to the effects caused by the present-day climate change", states the ICM-CSIC researcher Claudio Lo Iacono, who discovered these reefs some years ago and has now led this study.

In the Mediterranean Sea the development of cold-water coral reefs has been studied before, and scientists have so far determined the age of coral samples from different locations. They have also attempted to link coral reef formation patterns to different environmental factors, but unlike this article, they have not been able to investigate cold-water coral reef development beyond the last ~15,000 years.

Credit: 
National Oceanography Centre, UK

NUI Galway contribute to significant breast cancer risk genes study

image: Professor Michael Kerin, Chair of Surgery at NUI Galway, Director of the Cancer Managed Clinical Academic Network for Saolta University Health Care Group, and Research Director of the National Breast Cancer Research Institute.

Image: 
NUI Galway

Breast cancer investigators in the Lambe Institute at NUI Galway have collaborated on a pivotal international study into breast cancer risk which was published in the New England Journal of Medicine today (Wednesday, 20 January). The results of the study have identified that there are nine specific genes associated with breast cancer risk.

Contributing authors Professor Michael Kerin, Chair of Surgery at NUI Galway, Director of the Cancer Managed Clinical Academic Network for Saolta University Health Care Group, along with Dr Nicola Miller, Lecturer in NUI Galway's School of Medicine, have directed the Breast Cancer in Galway Genetics Study (BIGGS) since 2008. DNA samples collected from 2,000 Irish patients and controls have contributed to the findings of this paper, and to numerous high-impact publications in the past decade.

Led from the University of Cambridge, the BRIDGES (Breast Cancer Risk after Diagnostic Gene Sequencing) study aimed to identify women at high risk of breast cancer and to develop sensitive and informative gene panel testing for the prediction of breast cancer risk. Gene panel testing is a technique in which a number of specific genes that are linked to a particular genetic condition are examined at the same time.

Gene panel testing for breast cancer susceptibility is widely used, but there is only weak evidence for cancer risk association with many genes. The BRIDGES study tested 34 potential "risk" genes in 60,466 breast cancer cases and 53,461 controls (women who did not have breast cancer) drawn from 44 international studies in the Breast Cancer Association Consortium. The study found that variants in nine genes were associated with breast cancer risk (ATM, BRCA1, BRCA2, CHEK2, PALB2, BARD1, RAD51C, RAD51D, TP53).
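Case-control associations of this kind are typically summarised as odds ratios: how much more common a variant is among cases than among controls. A minimal illustration of the calculation, using invented carrier counts rather than the actual BRIDGES data:

```python
import math

# Hypothetical 2x2 case-control table for one candidate gene. The carrier
# counts below are invented for illustration and are NOT BRIDGES results;
# only the overall cohort sizes (~60,466 cases, ~53,461 controls) come
# from the article.
carriers_cases, noncarriers_cases = 300, 60_166
carriers_controls, noncarriers_controls = 120, 53_341

# Odds ratio with a Woolf (log-scale) 95% confidence interval.
odds_ratio = (carriers_cases * noncarriers_controls) / (carriers_controls * noncarriers_cases)
se_log_or = math.sqrt(sum(1 / x for x in (carriers_cases, noncarriers_cases,
                                          carriers_controls, noncarriers_controls)))
ci_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_lo:.2f}, {ci_hi:.2f}]")
```

A confidence interval excluding 1.0, as here, is the kind of evidence that distinguishes the nine confirmed risk genes from the other 25 genes the panel tested.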

Professor Michael Kerin, who is also Research Director of the National Breast Cancer Research Institute, a voluntary national charity that funds a comprehensive research programme at the Lambe Institute in NUI Galway, said: "With this study we can identify the members within families who have abnormal genes that put them at a higher risk of getting breast cancer, and they can avail of strategies such as early screening and risk reduction surgery, in order to improve their life expectancy."

Professor Kerin said that the success of this research is testament to the power of bio-banking and the need to futureproof research: "Having a set of bio-banked samples and the ability to closely follow up with these patients has enabled us to add value to international research studies and improve the knowledge base around breast cancer risk.

"The BRIDGES study has revealed that changes which were thought to be unimportant in the well-known breast cancer genes, BRCA1 and BRCA2, are significant, and this allows us to manage the risk of developing breast cancer in people affected by these gene alterations."

Acknowledging the support of breast cancer research charity funding, Dr Nicola Miller said: "This work highlights the importance of collaboration in breast cancer research in the generation of data of global significance. It helps to better define the genes associated with breast cancer risk. While we can't change the genes we inherit, this knowledge will benefit patients undergoing genetic testing for breast-cancer susceptibility. We gratefully acknowledge the ongoing support of the National Breast Cancer Research Institute for funding the Irish contribution of this study."

Credit: 
University of Galway

COVID-19 is dangerous for middle-aged adults, not just the elderly

COVID-19 has been spreading rapidly over the past several months, and the U.S. death toll has now reached 400,000. As evident from the age distribution of those fatalities, COVID-19 is dangerous not only for the elderly but for middle-aged adults, according to a Dartmouth-led study published in the European Journal of Epidemiology.

"For a person who is middle-aged, the risk of dying from COVID-19 is about 100 times greater than dying from an automobile accident," explains lead author Andrew Levin, a professor of economics at Dartmouth College. "Generally speaking, very few children and young adults die of COVID-19. However, the risk is progressively greater for middle-aged and older adults. The odds that an infection becomes fatal are only 1:10,000 at age 25, whereas those odds are roughly 1:100 at age 60, 1:40 at age 70, and 1:10 at age 80."

These findings represent the culmination of a systematic review of all available studies of COVID-19 prevalence in countries with advanced economies; this review encompassed more than 1,000 research papers and government documents disseminated prior to September 18, 2020. The research team identified 27 studies where the survey design was representative of the general population, covering 34 geographical locations in the U.S., Canada, Asia, and Europe. Using those prevalence data, the researchers investigated the age-specific ratio of COVID-19 fatalities to infections and found a very clear exponential relationship.
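The "very clear exponential relationship" can be sanity-checked directly from the four odds quoted above: an exponential relationship means log fatality risk rises linearly with age. A minimal sketch using only those quoted figures (the paper's actual fitted parameters, based on all 34 locations, may differ):

```python
import math

# Age-specific infection fatality rates as quoted in the article:
# 1:10,000 at age 25, 1:100 at 60, 1:40 at 70, 1:10 at 80.
ifr = {25: 1 / 10_000, 60: 1 / 100, 70: 1 / 40, 80: 1 / 10}

# Ordinary least-squares fit of log10(IFR) against age; a good linear fit
# on the log scale is what "exponential relationship" means here.
ages = list(ifr)
ys = [math.log10(v) for v in ifr.values()]
n = len(ages)
mean_age, mean_y = sum(ages) / n, sum(ys) / n
slope = (sum((a - mean_age) * (y - mean_y) for a, y in zip(ages, ys))
         / sum((a - mean_age) ** 2 for a in ages))
doubling_years = math.log10(2) / slope

print(f"log10(IFR) rises by {slope:.3f} per year of age")
print(f"fatality risk roughly doubles every {doubling_years:.1f} years of age")
```

On these four points the fit implies fatality risk roughly doubling every five to six years of age, which is why the same virus is orders of magnitude more dangerous at 80 than at 25.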

An initial version of this study was posted online in July 2020 as an NBER Working Paper and was regularly updated on the medRxiv preprint server prior to being published as an open-access article in the European Journal of Epidemiology. The findings remain highly relevant as the total number of COVID-19 deaths in the U.S. continues to climb. "Our findings are consistent with the CDC's Weekly Updates by Select Demographic and Geographic Characteristics, which report on COVID-19 deaths by age group," says Levin. "Nearly 40 percent of U.S. COVID-19 deaths have occurred among those ages 45 to 74 years, while almost 60 percent have occurred among those over 75 years old. By contrast, children and young adults (less than 45 years old) account for less than 3 percent of U.S. COVID-19 deaths."

Levin also emphasized the urgent practical implications of his team's research findings. "While COVID-19 vaccines are now being distributed, several more months are likely to pass before these vaccines have been fully disseminated to the public," says Levin. "We need to get through this period as safely as possible. Taking basic precautions--including wearing a mask, practicing social distancing, and washing your hands often--is critical to protecting yourself, family, friends, and community members from this very deadly disease."

Credit: 
Dartmouth College

Dynamic, personalized treatment approach may improve outcomes in gastroesophageal cancers

A phase 2 clinical trial providing personalized treatments based on the genetic profile of metastatic tumors in gastroesophageal cancers has found that using customized treatment approaches, and adapting them over time as tumors become resistant, led to higher rates of survival compared to historical controls. The final results were published online on Jan. 21 in Cancer Discovery, a journal of the American Association for Cancer Research.

Advances in technology have made it possible for scientists and physicians to use information about the genetic makeup of a cancerous tumor to inform cancer treatment, but genetic heterogeneity -- the genetic variation between cancers in different patients and even between tumors within the same patient -- can make it difficult to determine the most effective targeting strategy for treating individual patients.

Seeking to overcome these limitations, oncologist Daniel Catenacci, MD, director of the gastrointestinal oncology program and associate professor of medicine at the University of Chicago Medicine, and his colleagues developed a detailed strategy to better assess tumor heterogeneity and use that information to determine the best strategy for treating a patient's cancer. Dubbed the Personalized ANtibodies for Gastro-Esophageal Adenocarcinoma, or PANGEA study, their approach, he says, could make it easier for physicians to strategically target treatments in the future.

"The genetic drivers between patients are often quite different, and it can be hard to run a classical clinical trial because only a fraction of patients has that specific genetic profile," said Catenacci. "On top of that, differences in the composition of a patient's primary tumor compared to metastatic tumors adds further treatment complications."

"We decided to choose a token metastatic disease site that would serve as the molecular profile to target," he continued. "Rather than looking at one genetic event at a time with one drug, we'd use a predefined algorithm to test multiple therapies for different genetic drivers at once, and using the same algorithm, adjust therapeutic approaches if a patient's cancer became resistant to their original treatment."

Using a novel statistical approach, the PANGEA team built an algorithm that would assign treatment based on a predefined, prioritized set of biomarkers. This allowed them to biopsy each patient's tumors (with a focus on metastatic tumors) and determine which genetic biomarkers were present. At each line of treatment, an optimal therapy for those specific biomarkers was given based on the biopsy results.

Patients were assigned to one of eight groups, with six monoclonal antibody treatment options to complement traditional chemotherapy. If patients' tumors progressed with treatment, their tumors were rebiopsied and reexamined. If the tumor had adapted to resist the current treatment, patients were reassigned to a new group and given treatment that more closely matched the tumor's new profile, up to two more times.
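The assign-rebiopsy-reassign loop described above amounts to a prioritized lookup applied at each line of treatment. A minimal sketch of that logic; the biomarker names, priority order, and treatment labels here are hypothetical placeholders, not the trial's actual algorithm or drug assignments:

```python
# Illustrative sketch in the spirit of the PANGEA design: a predefined,
# prioritized biomarker list determines therapy, and the same list is
# re-applied whenever a progressing tumor is rebiopsied. All biomarker
# and treatment names below are invented placeholders.
PRIORITY = [
    ("HER2_amplified",  "anti-HER2 antibody + chemo"),
    ("EGFR_amplified",  "anti-EGFR antibody + chemo"),
    ("FGFR2_amplified", "anti-FGFR2 antibody + chemo"),
    ("MSI_high",        "anti-PD-1 antibody + chemo"),
]
DEFAULT = "chemotherapy alone"
MAX_REASSIGNMENTS = 2  # per the article: reassigned "up to two more times"

def assign_treatment(biopsy_markers: set[str]) -> str:
    """Return the highest-priority treatment matching the biopsy profile."""
    for marker, treatment in PRIORITY:
        if marker in biopsy_markers:
            return treatment
    return DEFAULT

def treat_patient(biopsies: list[set[str]]) -> list[str]:
    """Apply the priority algorithm to the initial biopsy and to each
    rebiopsy taken at disease progression, up to the reassignment cap."""
    return [assign_treatment(profile)
            for profile in biopsies[: 1 + MAX_REASSIGNMENTS]]

# A tumor that starts HER2-amplified, then loses HER2 and gains MSI-high:
print(treat_patient([{"HER2_amplified"}, {"MSI_high"}]))
# -> ['anti-HER2 antibody + chemo', 'anti-PD-1 antibody + chemo']
```

The key design point is that the priority list is fixed in advance, so treatment changes reflect the tumor's evolution rather than ad hoc clinical judgment, which is what makes the strategy testable as a single trial arm.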

"During the study, we learned that not only is it common to see genetic heterogeneity between the primary and metastatic tumor -- about 40% of the time, the metastatic tumor differed substantially from the primary tumor -- but also that about 45 or 50% of the time, patients had their therapies changed as the disease evolved," said Catenacci. "By going after the metastatic site at the start of treatment, reassessing the tumor if and when there was clinical disease progression, and using the algorithm to prioritize therapy each time, the survival outcome was substantially higher than would be expected using standard therapies."

With most advanced gastroesophageal adenocarcinomas, only about 50% of patients are still alive after one year, and the median survival time is less than 12 months. Using the PANGEA approach, 66% of patients were alive one year after their initial diagnosis, with a median survival time of 15.7 months across all patients.

The trial was so successful, Catenacci said, that some of the participants are still receiving treatment under this paradigm several years from initiation of therapy.

Setting up the clinical trial was challenging, requiring immense coordination between investigators, pharmaceutical companies, and clinical staff to access and provide the wide variety of therapeutic options for the eight patient groups, and keep track of all of the biopsies taken at various timepoints during the trial.

This phase 2 study, conducted as a pilot and feasibility study with 68 patients, was not randomized. The team is now working to expand the program with additional collaborators and industry partners, and is in discussions with the FDA, seeking to set up a broader infrastructure that would make a confirmatory trial possible and allow the approach to be tested in a larger group of participants with randomized controls.

Catenacci hopes that in addition to expanding into larger clinical trials, these results can help inform treatment decisions for patients even now.

"Some of these biomarker groups and treatments we've identified are not yet the standard of care," he said. "At first there was only chemo for HER2-negative tumors in the first-line setting, but now other targeted therapies are coming out, including anti-PD1, anti-FGFR2 and anti-claudin. How does a physician decide which treatment to prescribe when a patient might be eligible for multiple options given known overlap in a tumor of the predictive biomarkers for each of these therapies? This study shows that using an algorithm such as ours could help with that prioritization to direct optimal care, and may potentially lead to better outcomes for patients."

Credit: 
University of Chicago Medical Center